
Tony Phillips' Take on Math in the Media

*A monthly survey of math news*

This month's topics:

- How complex is mathematics?
- Quantifying the evolutionary dynamics of language
- Nash equilibria for bacteria

How complex is mathematics?

Richard Foote (University of Vermont) has a review article, "Mathematics and Complex Systems," in the October 19, 2007 *Science*. His goal is to analyze mathematics itself as a complex system. (There is in fact no exact and generally accepted definition of "complex systems," but they are usually characterized as a) made up of many interconnected elements and b) expressing *emergent* behaviors that require analysis at a higher level than that appropriate for the component elements. The standard example is the brain, with neurons as its component elements, and consciousness as emergent behavior.) Foote proposes "that areas of mathematics, even ones based on simple axiomatic foundations, have discernible layers, entirely unexpected 'macroscopic' outcomes, and both mathematical and physical ramifications profoundly beyond their historical beginnings."

The area he chooses to examine in detail is Finite Group Theory: he gives the axioms, defines a *simple group,* and studies the history of the classification problem for finite simple groups as one might study the evolution of a life-form, emphasizing the points where the theory underwent a transformation comparable to an emergent behavior. He distinguishes three epochs:

- From Galois to the early 1960s. It was understood how any finite group could be (essentially uniquely) decomposed into simple groups; the classification of simple groups was underway. There were 18 (infinite) families of finite simple groups and in addition 5 "sporadic" finite simple groups belonging to no family.
- The Feit-Thompson Odd Order Theorem (1962; the only odd-order simple groups are the cyclic groups of prime order) was, according to Foote, "a breakthrough to the next level of complexity." Their huge paper "spawned the first 'quantum jump' in technical virtuosity that practitioners would need in order to surmount problems in this arena." The road to classification was not smooth: a sixth sporadic group was discovered in 1965, and 20 more surfaced during the next few years, but by 1980 the enormous project was done.
- "The Monster and Moonshine." The Monster (the king of the sporadics, with some 10^{54} elements) is "the nexus of a new level of complexity." Starting in 1978, "striking coincidences," mysterious enough to merit the appellation Moonshine, were discovered between the structure of the Monster and the classical theory of modular functions. Finding a basis for this correspondence led to a Fields Medal for Richard Borcherds in 1998; the new level of complexity comes from the string theory methods used in Borcherds' work. These directly connect Moonshine to current research, often mathematically problematical, in theoretical physics.
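The Monster's "some 10^{54} elements" can be checked directly from its well-known prime factorization; the short sketch below simply multiplies the factorization out to confirm the order of magnitude quoted above.

```python
# Order of the Monster group from its standard prime factorization:
# |M| = 2^46 * 3^20 * 5^9 * 7^6 * 11^2 * 13^3 * 17 * 19 * 23 * 29 * 31 * 41 * 47 * 59 * 71
factorization = {2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3,
                 17: 1, 19: 1, 23: 1, 29: 1, 31: 1,
                 41: 1, 47: 1, 59: 1, 71: 1}

order = 1
for prime, exponent in factorization.items():
    order *= prime ** exponent

print(order)            # 808017424794512875886459904961710757005754368000000000
print(len(str(order)))  # 54 digits, i.e. roughly 8 x 10^53
```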

Foote concludes by remarking: "... the work of scientists is inherently incremental and precise. On the other hand, it is incumbent on us all to work toward enhancing the understanding of 'big picture' issues within our own disciplines and beyond."

Quantifying the evolutionary dynamics of language

That's the title of a "letter" published in *Nature* for October 11, 2007. Martin Nowak (Harvard) and his co-authors examine the rate at which grammatical rules change in time, using the *"-ed" suffix --> past tense* rule for verbs in English (e.g. walk, walked). As they remark, this rule has been expanding over the centuries as special ("irregular verb") rules have faded. Chaucer (1343-1400) used *holp* as the past tense of *help*; in Modern English the regular form *helped* has taken over. It is a common observation about languages with conjugation that the most common verbs tend to be irregular. In English, as this article reminds us, the ten most common verbs (be, have, do, go, say, can, will, see, take, get) are all irregular. The phenomenon is commonly attributed to frequent use reinforcing existing forms and making them less susceptible to change. These authors *quantify* the relation between a verb's frequency and its longevity:

**"The half-life of an irregular verb scales as the square root of its usage frequency."**

Or, "a verb that is 100 times less frequent regularizes 10 times as fast." [This law, while delightfully quantitative, is not on the same footing as "The squares of the periods of the planets scale as the cubes of their semimajor axes." It would be more correct for the authors to write "scales as a power, approximately 1/2" since for the two data sets they examined, the scaling was as the .51 power and as the .48 power respectively. -TP]
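The arithmetic behind "100 times less frequent regularizes 10 times as fast" can be spelled out in a few lines; the function and its constant of proportionality below are illustrative, not from the paper.

```python
# Half-life of an irregular verb scaling as the square root of its usage
# frequency:  t_half(f) = C * sqrt(f),  where C is an arbitrary constant.
import math

def half_life(freq, C=1.0):
    """Half-life (in arbitrary units) of an irregular verb with usage frequency freq."""
    return C * math.sqrt(freq)

# A verb 100 times more frequent has a half-life 10 times longer; equivalently,
# a verb 100 times less frequent regularizes 10 times as fast.
ratio = half_life(100.0) / half_life(1.0)
print(ratio)  # 10.0
```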

"Cooperation counts for math professor" is the *Boston Globe*'s report about Martin Nowak's work on this and other projects (October 15). Their correspondent Heather Wax describes Nowak as being "at the forefront of a new field called evolutionary dynamics, in which Darwin's idea of natural selection is formulated in terms of math equations." Most of the ideas Wax describes concern the emergence (or loss, in the case of cancer cells) of cooperative behavior. She mentions the prisoner's dilemma, four rival cyclists in a large race forming a peloton to share the wind burden, and the theology of cooperation. She quotes Nowak: "Molecules combined to form the first cells. Human societies are made up of individuals that cooperate. Whenever evolution is doing something amazingly new, cooperation is involved." "I study what makes cooperation a winning strategy."

More about cooperation in the next issue of *Nature*, which ran "The equilibria that allow bacterial persistence in human hosts" on October 18, 2007. The authors, Martin Blaser (NYU School of Medicine) and Denise Kirschner (U of Michigan Medical School), examine persistent infections: relationships "that have resulted from the pairing of a microbe and host that have survived the challenges of cohabitation." They focus on three notorious bacteria: *Helicobacter pylori* (stomach ulcers), *Salmonella typhi* (typhoid fever), and *Mycobacterium tuberculosis*. Unlike many diseases in which the host dies or the microbes are destroyed by the host's immune system, in these cases infections can persist at a low level, or even imperceptibly ("Typhoid Mary"), for long periods. The authors remark that "Relationships between persistent microbes and their hosts span many spatial scales and timescales" (at the level of cells, organs, individuals and populations) and propose "that microbial persistence represents a co-evolved series of nested equilibria, operating simultaneously on each of these multiple scales, to achieve an overall homeostasis."

Blaser is less technical in the interview he gave for the Authors page in that same issue: "In classic evolutionary theory, everything is based on competition because it's assumed that, in a cooperative situation, 'cheaters' would usurp resources and the system would fail. But cooperation exists, so I wanted to determine how it fits into the model. About six months ago, I came across game theory's Nash equilibrium concept, which can be summed up as 'if you cheat, you lose.' I realized that persistence relies on creating a system that would be disadvantageous to either a bacterial or a host cheater." In the case of *Helicobacter*, the model involves "five prototypic populations that are followed over time," represented by a system of coupled nonlinear first-order differential equations. "The model produced equilibrium solutions under a wide range of relevant biological variation."
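The authors' five-population system is not reproduced in the article, but the general idea of coupled nonlinear differential equations settling into a stable host-microbe equilibrium can be sketched with a minimal two-population toy model. Everything below (the equations, the populations B and H, and all parameter values) is an illustrative assumption, not the Blaser-Kirschner model.

```python
# Minimal toy model (NOT the authors' five-population system): a bacterial
# population B limited by logistic growth and by a host immune response H
# that the bacteria themselves stimulate.  Parameter values are illustrative.
#   dB/dt = r*B*(1 - B/K) - a*B*H
#   dH/dt = b*B*H - d*H
r, K, a, b, d = 1.0, 10.0, 0.5, 0.2, 0.4

def derivatives(B, H):
    dB = r * B * (1.0 - B / K) - a * B * H
    dH = b * B * H - d * H
    return dB, dH

# Forward-Euler integration from an arbitrary starting state.
B, H, dt = 1.0, 0.5, 0.01
for _ in range(20000):  # integrate out to t = 200
    dB, dH = derivatives(B, H)
    B += dt * dB
    H += dt * dH

# Analytic coexistence equilibrium: B* = d/b = 2.0, H* = (r/a)*(1 - B*/K) = 1.6
print(round(B, 3), round(H, 3))
```

The point of the sketch is the qualitative behavior the paper describes: a persistent infection corresponds to the system damping toward a nonzero equilibrium rather than the bacteria being eliminated or the host being overwhelmed.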

Tony Phillips

Stony Brook University

tony at math.sunysb.edu