## Complexity

In honor of Mathematics Awareness Month I will take an idiosyncratic look at mathematical aspects of complexity, in particular some history of mathematical approaches and insights into this subject...
Joseph Malkevitch
## Introduction
Are tax forms complex? Is the worldwide system of selling stocks a complex one? What about the Mandelbrot set?

Figure 1 (Mandelbrot set - Courtesy of Wikipedia)
What is a complex system? How can we measure complexity? Is complexity "good" or "bad"?

## History

As the importance of a topic matures it is often useful to look back and see the roots of the subject and how it evolved with time. It is not an accident that deterministic systems were studied before stochastic ones. For systems that have a mixture of deterministic and chance elements, it is common to take on the deterministic parts first. One source of complexity is that chance events are harder to sort out. Or so it seemed until relatively recently; more on this in a bit. Thus, algebra and geometry were developed in many sophisticated directions (in Egypt, China, Babylonia, India, etc.) before a systematic approach to probability theory started to emerge in the 17th century in the work of Pascal and Fermat. Another early pioneer in understanding the complexities that chance events introduce into our picture of the world was Thomas Bayes.
Figure 2 (Reverend Thomas Bayes)
Whereas until relatively recently a frequentist approach to making decisions was the norm, there is more and more discussion of using the ideas implicit in Bayes's work for making decisions and reaching conclusions. For example, suppose one wants to argue that a certain die, which looks very symmetrically manufactured, is a fair die. One might roll this die 60 times and record the relative frequency of one dot, two dots, ..., six dots. However, it is not very likely that each number of spots will occur equally often. If 6 occurs slightly more often than the other numbers of spots, does this mean a slight bias towards 6? A Bayesian approach might use the "data" from the 60 tosses as input for further work to try to answer the question of whether or not the die is fair.

Figure 3 (Bifurcations of the logistic map - courtesy of Wikipedia)
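One hedged way to make the Bayesian idea concrete is to compare two specific hypotheses about the die. In the sketch below the counts from the 60 rolls and the "slightly biased toward 6" hypothesis are made-up assumptions for illustration, not data from this article; starting from equal prior belief in the two hypotheses, Bayes' rule converts the observed counts into a posterior probability that the die is fair.

```python
from fractions import Fraction

# Hypothetical counts from 60 rolls (made up for illustration).
counts = {1: 9, 2: 10, 3: 9, 4: 10, 5: 9, 6: 13}

def likelihood(probs, counts):
    """Probability of observing these counts under the given face probabilities."""
    result = Fraction(1)
    for face, n in counts.items():
        result *= probs[face] ** n
    return result

# Hypothesis A: a fair die.  Hypothesis B: slightly biased toward 6.
fair = {face: Fraction(1, 6) for face in range(1, 7)}
biased = {face: Fraction(3, 20) for face in range(1, 6)}  # probability 0.15 each
biased[6] = Fraction(1, 4)                                # probability 0.25 for a six

# Equal prior belief in each hypothesis; Bayes' rule gives the posterior.
l_fair, l_biased = likelihood(fair, counts), likelihood(biased, counts)
posterior_fair = l_fair / (l_fair + l_biased)
print(f"Posterior probability the die is fair: {float(posterior_fair):.3f}")
```

With these invented counts the evidence only mildly favors the biased hypothesis; neither hypothesis is ruled out, which is exactly the kind of graded conclusion the Bayesian viewpoint delivers.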
This involves the iteration of the logistic map: from a given input x_n one computes the next value x_{n+1} = r x_n (1 - x_n), and as the parameter r varies, the long-term behavior of this innocent-looking rule ranges from a single steady state through cycles to chaos. Lord (Robert) May did pioneering work in calling attention to the very complicated dynamics of this simple equation in population biology.

Figure 4 (Lord May, Courtesy of Wikipedia)
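A short sketch of this iteration (the parameter values below are chosen purely for illustration): for some values of r the orbit settles down to a fixed point, while for others two starting points a billionth apart soon disagree wildly - the butterfly effect in miniature.

```python
def logistic_orbit(r, x0, steps):
    """Iterate the logistic map x -> r*x*(1 - x), returning the whole orbit."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(r * orbit[-1] * (1 - orbit[-1]))
    return orbit

# For r = 2.5 the orbit settles to the fixed point 1 - 1/r = 0.6 ...
settled = logistic_orbit(2.5, 0.2, 200)[-1]

# ... while for r = 3.9 two orbits starting 1e-9 apart separate dramatically.
a = logistic_orbit(3.9, 0.2, 60)
b = logistic_orbit(3.9, 0.2 + 1e-9, 60)
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(settled, max_gap)
```

The same ten-line rule produces both behaviors; only the parameter r changes.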
The mathematician and meteorologist Edward Lorenz had in essence called attention to a similar phenomenon in the difference equations used to approximate solutions of differential equations. It was Lorenz who coined the term "butterfly effect." This notion suggests that something as minute as a butterfly's flapping its wings can be responsible for much more dramatic effects separated from the original action in time and space. In the area of chaos people also speak of sensitive dependence on initial conditions: states that begin arbitrarily close together can end up far apart after enough time.

## Computers and complexity
One of the major tools people use to deal with complex systems is the computer. It is often difficult to understand the relation of computer systems to humans these days because humans have come to depend so much on complex systems that can be run only by computers. For example, trades are now routinely carried out on stock markets much more quickly than could be done by humans without computer systems. Historically, people took comfort in the fact that computers do impressive things because of their speed but did not seem capable of feats involving "thinking" or "creativity." Traditional examples of seemingly unique human skills are playing checkers, playing chess, and other "games" that seem to require thinking and intelligence. First, a checkers-playing program was developed which played checkers better than any human opponent. Then IBM developed a computer system known as Deep Blue which beat Garry Kasparov in a well-publicized match. Although Deep Blue was "retired," modern chess-playing programs are amazingly good.
Figure 5 (Photo of John Von Neumann)
Von Neumann was a pioneer in developing the theory and implementation of early stored-program digital computers. Turing's interest in computing was spurred by his World War II work helping to break German codes.
Figure 6 (Photo of Alan Turing)
Turing, one of the pioneers of computer science and of the complexity questions that became a part of computer science, invented a simplified "conceptual" computer now known as the Turing machine. The Turing machine is a conceptual device which reads and writes symbols on an infinite tape; the tape can be advanced in either direction from the cell that is currently being "looked at" by a "head" or reader. Turing was able to show that any problem that could be solved on a "standard" computer could also be solved on a Turing machine. Thus, this "standardized" device enabled one to talk about complexity without talking about the hardware of an actual machine. A supercomputer can solve a solvable problem more quickly than a Turing machine can, but it cannot solve any problem that a Turing machine cannot solve!

## Complexity classes
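Complexity classes are defined relative to just such a standardized model, and it is striking how little machinery the model needs. Here is a minimal sketch of a Turing machine simulator; the transition table (which flips the bits of a binary string and then halts) is a made-up example for illustration, not one from Turing's work.

```python
def run_turing_machine(table, tape, state="start", blank="_"):
    """Simulate a one-tape Turing machine; the tape is stored sparsely in a dict."""
    cells = dict(enumerate(tape))
    pos = 0
    while state != "halt":
        symbol = cells.get(pos, blank)                 # read the scanned cell
        write, move, state = table[(state, symbol)]    # look up the rule
        cells[pos] = write                             # write a symbol
        pos += 1 if move == "R" else -1                # move the head
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A toy machine: flip every bit, halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(flip_bits, "1011"))  # -> 0100
```

The entire "computer" is a lookup table plus a movable head, yet Turing showed that this suffices to capture anything a standard computer can do.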
In discussing the complexity of different kinds of problems, the problems involved can be placed into different categories. Even for experts trying to stay abreast of developments in this rapidly evolving field, there is a complex alphabet soup of classes and types of problems. Here I will merely address some issues related to the best known of these classes and ideas. Among the kinds of questions considered are decision problems (ones that have a "yes" or "no" answer) and optimization problems (ones where one seeks a "best" solution with respect to some objective function). When talking about complexity it is common to use the phrase "instance" of a problem to refer to a particular case of the problem involved. For example, if one's problem is to sort a list of numbers, then any particular list of numbers to be sorted is an instance of the problem.

## Evolving ideas
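The distinction drawn above between decision and optimization problems can be made concrete on a toy sorting instance (the numbers below are made up for illustration):

```python
# One underlying task, posed two ways, on a single toy instance.
instance = [7, 2, 9, 4]

# Decision problem: a question about the instance with a yes/no answer.
def is_sorted(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

# Optimization-style problem: produce a best object, here the sorted arrangement.
def sort_instance(xs):
    return sorted(xs)

print(is_sorted(instance))      # False: 7 > 2
print(sort_instance(instance))  # [2, 4, 7, 9]
```

Complexity theory often studies the decision version of a problem, since an efficient method for the optimization version typically yields one for the decision version as well.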
The part of mathematics and computer science concerned with complexity theory has had many major contributors. However, two individuals responsible for a major breakthrough were Stephen Cook and Leonid Levin.
Figure 7 Photo of Leonid Levin (Courtesy of Wikipedia)
Figure 8 Photo of Stephen Cook (Courtesy of Wikipedia)
It was Cook and Levin who pioneered (independently) the notion of NP-complete problems. Though their work was first done in different countries and in different languages, and the details of their approaches differed, they arrived at the same place. In what is now called the Cook-Levin Theorem, they showed that the problem known as SAT belongs to the complexity class now known as the NP-complete problems. A Boolean expression is one that can be built up from variables using the logical connectives "and," "not," and "or" - for example, (x or y) and ((not x) or z). SAT asks whether the variables of such an expression can be assigned truth values that make the whole expression true.
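A brute-force satisfiability check for a small formula such as (x or y) and ((not x) or z) - chosen here purely for illustration - makes SAT concrete, and also hints at why it is hard: the number of assignments to examine doubles with every added variable.

```python
from itertools import product

# An illustrative Boolean expression in three variables.
def formula(x, y, z):
    return (x or y) and ((not x) or z)

# SAT asks: is there any True/False assignment making the formula true?
# Brute force checks all 2^3 = 8 assignments; for n variables it is 2^n.
assignments = list(product([False, True], repeat=3))
satisfying = [bits for bits in assignments if formula(*bits)]
print(len(satisfying), satisfying[0])
```

For three variables this is instant, but at, say, 100 variables the 2^100 assignments are hopelessly many; no method essentially better than this exhaustive search is known, which is the heart of the P versus NP question.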
Figure 9 Photo of Oded Goldreich (Courtesy of Wikipedia)
Figure 10 Photo of Shafrira (Shafi) Goldwasser (Courtesy of Wikipedia)
Figure 11 Photo of Silvio Micali (Courtesy of Wikipedia)
## Recent news
We all know from personal experience that computers have much more storage and are much faster than they were in even the relatively recent past. We are also rightly impressed with the success of researchers at IBM in designing a computer system, Watson, that successfully defeated the best human opponents at the television quiz show Jeopardy!
Figure 12 The "avatar" for IBM's "Watson" (Courtesy of Wikipedia)
Interacting in natural language is still largely beyond the computer systems we deal with on a daily basis, so the accomplishment of Watson is all the more impressive. What seems to have made Watson's accomplishment possible was a combination of hardware--processors that could do computations with incredible speed and memory units to store tremendous amounts of data--put together with ideas from machine learning and artificial intelligence (AI). Watson was programmed so that the more it played, the better its performance became.
Figure 13 Photo of Leslie Valiant (Courtesy of Wikipedia)
Mathematics and computer science will continue to look at the issue of complexity, in attempts not only to sort out the relative difficulty of solving various problems but also to use insights into complexity to understand our world better. April will be a good month to ponder all of these exciting issues!