In honor of Mathematics Awareness Month I will take an idiosyncratic look at mathematical aspects of complexity, in particular some history of mathematical approaches and insights into this subject...
Posted April 2011.
Are tax forms complex? Is the worldwide system of selling stocks a complex one? What about the Mandelbrot set?
Figure 1 (Mandelbrot set - Courtesy of Wikipedia)
What is a complex system? How can we measure complexity? Is complexity "good" or "bad?"
As the importance of a topic matures it is often useful to look back at the roots of the subject and how it evolved over time. It is not an accident that deterministic systems were studied before stochastic ones. For systems that mix deterministic and chance elements, it is common to take on the deterministic parts first. One source of complexity is that chance events are harder to sort out. Or so it seemed until relatively recently - more on this in a bit. Thus, algebra and geometry were developed in many sophisticated directions (in Egypt, China, Babylonia, India, etc.) before a systematic approach to probability theory started to emerge in the 17th century in the work of Pascal and Fermat. Another early pioneer in understanding the complexities that chance events cause in our world was Thomas Bayes.
Figure 2 (Reverend Thomas Bayes)
Whereas until relatively recently a frequentist approach to making decisions was the norm, there is now growing discussion of using ideas implicit in Bayes' work for making decisions and reaching conclusions. For example, suppose one wants to argue that a certain die, which looks very symmetrically manufactured, is a fair die. One might roll this die 60 times and record the relative frequency of one dot, two dots, ..., six dots. However, it is not very likely that each number of spots will occur equally often. If 6 occurs slightly more often than the other numbers of spots, does this mean a slight bias towards 6 appearing? A Bayesian approach might use the "data" from the 60 tosses as input for further work to try to answer the question of whether or not the die is fair.
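A minimal sketch of this Bayesian idea, with a uniform Dirichlet prior (one pseudo-count per face) and invented counts standing in for the 60 rolls; the counts, the prior, and the variable names are all illustrative assumptions, not data from the article.

```python
# Sketch of a Bayesian update for the die example.  The counts below
# are hypothetical tallies for faces 1..6 from 60 rolls; alpha is a
# uniform Dirichlet prior (pseudo-count 1 per face).
counts = [9, 10, 9, 10, 8, 14]   # invented for illustration
alpha = [1] * 6                  # uniform prior

posterior = [a + c for a, c in zip(alpha, counts)]
total = sum(posterior)
post_mean = [p / total for p in posterior]  # posterior mean probabilities

for face, p in enumerate(post_mean, start=1):
    print(f"face {face}: posterior mean probability {p:.3f}")

# Face 6 came up most often, but its posterior mean 15/66 is pulled
# slightly toward the fair value 1/6 relative to the raw frequency 14/60.
```

The prior acts as a mild pull toward fairness; with more rolls the data would dominate, and a noticeable bias would persist in the posterior.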
Figure 3 (Bifurcations of the logistic map - courtesy of Wikipedia)
The bifurcation diagram arises from iterating the logistic map: from a given input x, one computes rx(1-x) (where r is a fixed constant) as an "output" and then uses this output value as the next input. The fact that a simple quadratic function could show this level of complexity was quite a surprise. It was Robert May (now Lord May of Oxford) who in 1976 first called wide attention to the complexities of this map in a variety of simple models motivated by questions in biology.
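The iteration itself takes only a few lines. In this sketch the parameter value r = 3.9 and starting point x0 = 0.5 are illustrative choices (r = 3.9 lies in the chaotic regime visible at the right of the bifurcation diagram); they are not taken from the article.

```python
# Iterate the logistic map x -> r*x*(1-x), returning the orbit
# x0, f(x0), f(f(x0)), ...
def logistic_orbit(r, x0, steps):
    orbit = [x0]
    for _ in range(steps):
        x = orbit[-1]
        orbit.append(r * x * (1 - x))
    return orbit

# Illustrative run in the chaotic regime.
orbit = logistic_orbit(3.9, 0.5, 10)
print(orbit)
```

Sweeping r from about 2.5 to 4 and plotting the long-run values of each orbit is exactly how diagrams like Figure 3 are produced.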
Figure 4 (Lord May, Courtesy of Wikipedia)
The mathematician and meteorologist Edward Lorenz had in essence called attention to a similar phenomenon in the difference equations used to approximate differential equations numerically. It was Lorenz who coined the term "butterfly effect." This notion suggests that something as minute as a butterfly's flapping its wings can be responsible for much more dramatic effects separated from the original action in time and space. In the area of chaos people also speak of sensitive dependence on initial conditions. For example, if one runs a simulation of a system (based on a differential equation or difference equation), one needs initial conditions to indicate the starting state of the system. The difference equation can then be used to see what the future state of the system would be. It had been thought that a small difference in the starting conditions for running the simulation would lead to similar long-term outcomes. This turned out not to be true. Two simulations whose initial conditions differed only in the 10th decimal place might have nothing to do with each other after a very small number of time steps! Lorenz noted this phenomenon in the context of weather prediction.
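The 10th-decimal-place effect is easy to demonstrate. This sketch uses the logistic map with r = 4 (a standard chaotic example, chosen here for simplicity rather than Lorenz's own weather equations) and runs it from two starting values that differ by 10^-10.

```python
# Two runs of the same difference equation whose starting values differ
# only in the 10th decimal place.  The trajectories soon bear no
# resemblance to each other: sensitive dependence on initial conditions.
def iterate(x, r=4.0, steps=60):
    traj = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        traj.append(x)
    return traj

a = iterate(0.2)
b = iterate(0.2 + 1e-10)
diffs = [abs(u - v) for u, v in zip(a, b)]
print(f"initial gap: {diffs[0]:.1e}, largest gap: {max(diffs):.3f}")
```

The gap between the two trajectories roughly doubles each step until it is as large as the trajectories themselves, which is why long-range weather forecasts degrade so quickly.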
Computers and complexity
One of the major tools people use to deal with complex systems is the computer. It is often difficult to understand the relation of computer systems to humans these days because humans have come to depend so much on complex systems that can be run only by computers. For example, trades are now routinely carried out on stock markets much more quickly than humans could manage without computer systems. Historically people have taken comfort in the idea that computers do impressive things because of their speed, but that they don't seem capable of feats that involve "thinking" or "creativity." Traditional examples of seemingly unique human skills are playing checkers, playing chess, and other "games" that seem to require thinking and intelligence. First, a checkers-playing program was developed which played checkers better than any human opponent. Then IBM developed a computer system known as Deep Blue which beat Garry Kasparov in a well-publicized match. Although Deep Blue was "retired," modern chess-playing programs are amazingly good.
Figure 5 (Photo of John Von Neumann)
Von Neumann was a pioneer in developing the theory and implementation of early stored program digital computers. Turing's interest in computers was spurred by his World War II efforts to use computers to help break German codes.
Figure 6 (Photo of Alan Turing)
Turing, one of the pioneers of computer science and the complexity questions that became a part of computer science, invented a simplified "conceptual" computer now known as the Turing machine. The Turing machine is a conceptual device which marks symbols on an infinite tape which can be advanced in either direction from the cell that is currently being "looked at" by a "head" or reader. Turing was able to show that any problem that could be solved on a "standard" computer could also be solved on a Turing machine. Thus, this "standardized" device enabled one to talk about complexity without talking about the hardware of an actual machine. A supercomputer can solve problems more quickly than a Turing machine when they can be solved, but it cannot solve any problem that a Turing machine cannot solve!
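The Turing machine model described above can be captured in a few lines of code. This is an illustrative sketch, not Turing's original formulation: the tape is a dictionary from cell index to symbol (so it can grow in either direction), "_" stands for blank, and the sample machine simply sweeps right over a binary string flipping each bit until it reaches a blank.

```python
# Minimal Turing machine simulator.  rules maps (state, symbol) to
# (new_state, symbol_to_write, move), where move is "R" or "L".
def run_turing_machine(rules, tape_str, state="s", halt="halt"):
    tape = dict(enumerate(tape_str))
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read back the non-blank portion of the tape.
    cells = sorted(k for k, v in tape.items() if v != "_")
    return "".join(tape[k] for k in cells)

# Sample machine: flip every bit, halting at the first blank.
flip_bits = {
    ("s", "0"): ("s", "1", "R"),
    ("s", "1"): ("s", "0", "R"),
    ("s", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "0110"))  # -> 1001
```

The point of the model is exactly what the paragraph says: anything a real computer can compute can, in principle, be expressed as such a table of rules, so complexity can be discussed without reference to particular hardware.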
In discussing the complexity of different kinds of problems, the problems involved can be placed into different categories. Even for experts who are trying to stay abreast of developments in this rapidly emerging field, there is a complex alphabet soup of classes and types of problems considered. Here I will try merely to address some issues related to the best known of these classes and ideas. Among the kinds of questions that are considered are decision problems (ones that have a "yes" or "no" answer) and optimality problems (problems where one is seeking a "best" solution with respect to some objective function). When talking about complexity it is common to use the phrase "instance" of a problem to refer to a particular case of the problem involved. For example, if one's problem is to sort a list of n integers from smallest to largest, an instance of size 6 would be to sort the six numbers 5, 9, 23, 3, 1, 100 and an instance of size 10 would be to sort 1, 5, 2, 7, 90, 30, 200, 38, 2, 11. One might talk about the problem of factoring integers with n digits and an instance of this problem for n = 8 would be to factor 81,123,121.
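For the factoring problem, the most naive method is trial division, sketched below; for an n-digit number its running time grows roughly like 10^(n/2), which is one way to see why large instances are considered hard. (The code is a generic illustration of the problem, not an algorithm from the article.)

```python
# Trial division: the simplest factoring algorithm.
def trial_factor(n):
    """Return the prime factorization of n >= 2 as a list of primes."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# The size-8 instance mentioned above.
print(trial_factor(81123121))
```

Sorting, by contrast, can always be done in time proportional to n log n, which is part of why complexity theory sorts problems into classes by how their cost grows with instance size.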
The part of mathematics and computer science concerned with complexity theory has had many major contributors. However, two individuals responsible for a major breakthrough were Stephen Cook and Leonid Levin.
Figure 7 Photo of Leonid Levin (Courtesy of Wikipedia)
Figure 8 Photo of Stephen Cook (Courtesy of Wikipedia)
It was Cook and Levin who pioneered (independently) the notion of NP-complete problems. Though their work was first done in different countries and in different languages and the details of the approaches they used were also different, they got to the same place. In what is now called the Cook-Levin Theorem, they showed that the problem known as SAT belongs to the complexity class now known as the NP-complete problems. A Boolean expression is one that can be built up from the logical connectives "and," "not" and "or." For example: (x and not y) or (z and not x) is a Boolean expression involving the three variables x, y and z. A Boolean expression becomes true or false when the variables in the expression are assigned the truth values true or false. SAT is the question: Given a Boolean expression, is there a truth assignment to its variables that makes the whole expression true? Cook and Levin showed that this problem was in NP and that for an instance of any other problem Q in NP, there was a polynomial time algorithm for a (deterministic) Turing machine which would transform this instance of Q to an instance of SAT. It follows that any NP-complete problem can be transformed to any other in polynomial time, so if one of these problems were shown to be in P, then all of them would be in P. There still remains the major (million dollar!) question, already mentioned, as to whether or not P and NP are really the same class!
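The obvious way to decide SAT is to try all 2^n truth assignments, and this exponential search is exactly what a polynomial-time algorithm for SAT would have to avoid. A sketch for the example expression from the text (the function and variable names are illustrative):

```python
# Brute-force SAT: try every truth assignment until one works.
from itertools import product

def find_satisfying_assignment(expr, num_vars):
    """Return a tuple of truth values making expr true, or None."""
    for assignment in product([False, True], repeat=num_vars):
        if expr(*assignment):
            return assignment
    return None

# The expression (x and not y) or (z and not x) from the text.
expr = lambda x, y, z: (x and not y) or (z and not x)
print(find_satisfying_assignment(expr, 3))  # -> (False, False, True)
```

With 3 variables there are only 8 assignments to check, but each added variable doubles the search space, which is why nobody knows a fast general method.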
Figure 9 Photo of Oded Goldreich (Courtesy of Wikipedia)
Figure 10 Photo of Shafrira (Shafi) Goldwasser (Courtesy of Wikipedia)
Figure 11 Photo of Silvio Micali (Courtesy of Wikipedia)
We all know from personal experience that computers have much more storage and are much faster than they were in even the relatively recent past. We are also rightly impressed with the success of researchers at IBM in designing a computer system, Watson, that successfully defeated the best human opponents at Jeopardy!. The nature of playing Jeopardy! requires knowing a tremendous amount of very diverse "facts" as well as the ability to interpret ordinary language often given in terms of puns.
Figure 12 The "avatar" for IBM's "Watson" (Courtesy of Wikipedia)
Handling natural language is largely beyond the computer systems we use on a daily basis, so the accomplishment of Watson is all the more impressive. What seems to have made Watson's accomplishment possible was a combination of hardware--processors that could do computations with incredible speed and memory units to store tremendous amounts of data--put together with ideas from machine learning and artificial intelligence (AI). Watson was programmed so that the more it played Jeopardy!, the better it got. This is the goal of researchers in machine learning, an area which overlaps with artificial intelligence. Artificial intelligence has been hard to define, but loosely speaking the idea is to get computers to do things which seem to display "intelligence" or "creativity."
Figure 13 Photo of Leslie Valiant (Courtesy of Wikipedia)
Mathematics and computer science will continue to look at the issue of complexity, in attempts not only to sort out the relative difficulty of solving various problems, but also to use insights into complexity to understand our world better. April will be a good month to ponder all of these exciting issues!