

Tony Phillips' Take on Math in the Media

A monthly survey of math news

This month's topics:

- More Origami Mathematics
- Algorithmic complexity in evolution
- Relativity. Incompleteness. Uncertainty.
- Six degrees of self-similarity

More Origami Mathematics

A "cube" constructed by Demaine, Demaine and Lubiw out of …

Each Tuesday the "Science Times" features a "Scientist at Work." For February 15, 2005 the scientist was Erik Demaine of MIT. Margaret Wertheim wrote the piece; she tells us that Demaine "is the leading theoretician in the emerging field of origami mathematics, the formal study of what can be done with a folded sheet of paper." Some pretty amazing things can be done. For example, Demaine, working in 1998 with his father Martin Demaine and with Anna Lubiw, showed that *any* polygonal shape can be made out of a single piece of paper by folding it flat appropriately and making one complete straight cut. The *Times* shows a swan; Demaine's Fold-and-Cut webpage gives an angelfish, a butterfly, a jack-o'-lantern and more. Wertheim leads us through some of Demaine's other interests: linkages ("A linkage is a set of line elements hinged together like the classic carpenter's rule."), which are related to protein folding ("... molecular biologists would like to be able to predict from the chemical structure of a protein what shape it would fold into"), and graph theory ("known to be fiendishly difficult, but Dr. Demaine is confident he can make headway once he immerses himself in its arcane lore"). There is more in the article, and much, much more on Demaine's website. For earlier coverage of origami mathematics in the media, see this column's page for July 2004.

Algorithmic complexity in evolution

The idea of algorithmic complexity goes back, in some sense, to Leibniz (see Greg Chaitin's home page). The general concept is suggested by Chaitin's definition: *The (algorithmic) complexity of a sequence of 0's and 1's is the length of the shortest computer program that will generate the sequence.* An international team led by Ricardo Azevedo (University of Houston) has recently applied this concept to the study of the development of multicellular organisms. Their work appears as "The simplicity of metazoan cell lineages" (*Nature,* January 13, 2005). "Lineage" refers to the fact that all the cells in an organism descend from a single cell, the fertilized egg. But a typical metazoan has a large variety of different kinds of cells (brain, skin, bone, etc.), so in the family tree, traced back from any single cell in the complete organism, there must be one or more *nodes* where a mother cell divided into two dissimilar daughters. Algorithmically, each of these nodes corresponds to a division-and-differentiation rule. In the study, part of the tree (a *lineage*) is reduced by identifying functionally similar nodes; the number of reduced nodes divided by the original total is the algorithmic complexity of the lineage.
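Chaitin's complexity is uncomputable in general, but compressed length gives a crude, computable upper-bound proxy for "length of the shortest program." The following sketch is my own illustration (not from the article or the *Nature* paper), using Python's standard `zlib`:

```python
import random
import zlib

# A highly regular sequence: a short program ("repeat '01' 500 times")
# generates it, so its algorithmic complexity is low.
regular = ("01" * 500).encode()

# A pseudo-random 0/1 sequence of the same length: no short description
# is apparent, so its complexity should be close to the sequence length.
random.seed(1)
noisy = "".join(random.choice("01") for _ in range(1000)).encode()

# Compressed size serves as an upper-bound proxy for complexity.
size_regular = len(zlib.compress(regular))
size_noisy = len(zlib.compress(noisy))

print(size_regular, size_noisy)
```

The regular string compresses to a few dozen bytes while the pseudo-random one does not, mirroring the short-program/long-program distinction in Chaitin's definition.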

A schematic (non-biological) lineage illustrating the reduction process. Here nodes R4 and R5 are collapsed to RR4 in the reduced lineage: the algorithmic complexity of the original lineage is 4/5 = 80%.
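The reduction can be sketched computationally. In the toy model below (my own schematic, chosen to reproduce a 4/5 = 80% reduction like the caption's), a lineage is a nested pair of daughter subtrees with strings as terminal cell fates, and identical subtrees are assumed to share one division rule; the paper's notion of "functionally similar" nodes is broader than exact equality.

```python
def internal_nodes(tree):
    """Collect every internal (dividing) node of a lineage tree.

    A tree is either a terminal fate (a string) or a pair
    (left_daughter, right_daughter).
    """
    if isinstance(tree, str):
        return []
    left, right = tree
    return [tree] + internal_nodes(left) + internal_nodes(right)

def complexity(tree):
    """Distinct division rules divided by total divisions.

    Identical subtrees collapse into one rule -- a stand-in for the
    paper's collapsing of functionally similar nodes.
    """
    nodes = internal_nodes(tree)
    return len(set(nodes)) / len(nodes)

# Schematic lineage: five divisions, two of which (the two ("A", "B")
# subtrees) are identical and collapse to a single rule: 4/5 = 80%.
lineage = (("A", "B"), (("A", "B"), ("C", "D")))
print(complexity(lineage))  # → 0.8
```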

The team computed the complexity for lineages in four different multicellular organisms: three species of free-living nematodes (microscopic roundworms) and a sea squirt. The numbers worked out to 35%, 38%, 33% and 32% in the four cases. In a first analysis, the team "compared each real lineage to lineages with the same cell number and distribution of terminal cell fates but generated by random bifurcation." They found that real lineages were 26-45% simpler than the corresponding random lineages. Conclusion: evolution selects for simpler lineages. (Tentative explanation: "the specification of simpler cell lineages might require less genetic information, and thus be more efficient.") In a second analysis, they "used evolutionary simulations to search for lineages that had the same terminal cell number and fate distribution as the actual lineages but were simpler." They found that after 20,000 to 50,000 generations they "could evolve lineages that were 10-18% simpler than the ancestral, real lineages." One explanation is "developmental constraints imposed by the spatial organization of cells in the embryo." They added these constraints to their simulations and conclude that "the metazoan lineages studied here are almost as simple as the simplest evolvable under strong constraints on the spatial positions of cells."
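The first analysis can be sketched in miniature: generate random-bifurcation lineages over the same multiset of terminal fates and compare complexities. This toy version (a tiny schematic tree, and exact subtree equality standing in for functional similarity) only illustrates the comparison, and is in no way the team's code.

```python
import random

def internal_nodes(tree):
    # tree: a terminal fate (str) or a (left, right) pair of subtrees
    if isinstance(tree, str):
        return []
    return [tree] + internal_nodes(tree[0]) + internal_nodes(tree[1])

def complexity(tree):
    # distinct subtrees (shared rules) / total internal nodes
    nodes = internal_nodes(tree)
    return len(set(nodes)) / len(nodes)

def random_lineage(fates):
    # random bifurcation over a fixed list of terminal fates
    if len(fates) == 1:
        return fates[0]
    k = random.randint(1, len(fates) - 1)
    return (random_lineage(fates[:k]), random_lineage(fates[k:]))

real = (("A", "B"), (("A", "B"), ("C", "D")))  # schematic "real" lineage
fates = ["A", "B", "A", "B", "C", "D"]         # same terminal fates

random.seed(0)
trials = [complexity(random_lineage(random.sample(fates, len(fates))))
          for _ in range(2000)]
mean_random = sum(trials) / len(trials)
print(complexity(real), mean_random)
```

In this toy setting the schematic lineage comes out simpler than the random-bifurcation average, qualitatively echoing the paper's 26-45% finding.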

Relativity. Incompleteness. Uncertainty.

Thus runs the first paragraph of Edward Rothstein's February 14, 2005 "Connections" column (every other Monday, in the *New York Times*). The piece is a meditation on Einstein, Gödel and Heisenberg, occasioned by the publication of Rebecca Goldstein's new book "Incompleteness: The Proof and Paradox of Kurt Gödel" (Atlas Books; Norton). Rothstein contrasts Heisenberg, whose "allegiance to an absolute state, Nazi Germany, remained unquestioned even as his belief in absolute knowledge was quashed," with Einstein and Gödel, who "fled the politically absolute, but believed in its scientific possibility." Most of the column is saved for Gödel's Incompleteness Theorem. "Before ..., it was believed that not only was everything proven by mathematics true, but also that within its conceptual universe everything true could be proven. Gödel shattered that dream. He showed that there were true statements in certain mathematical systems that could not be proven. And he did this with astonishing sleight of hand, producing a mathematical assertion that was both true and unprovable." Rothstein, following Rebecca Goldstein, gives Gödel's result a positive twist: "But what if the theorem is interpreted to reveal something positive: not proving a limitation but disclosing a possibility? ... In this, Gödel was elevating the nature of the world, rather than celebrating powers of the mind. There were indeed timeless truths. The mind would discover them not by following the futile methodologies of formal systems, but by taking astonishing leaps, making unusual connections, revealing hidden meanings."

Six degrees of self-similarity

The authors' network renormalization, applied to a schematic network with 8 nodes. For each box length l …

The "six degrees of separation" phenomenon (so named for the network of acquaintance among people) is often observed in complex networks; the "six" becomes the average diameter of the network. A Letter in the January 27, 2005 *Nature* shows that many naturally occurring complex networks are also self-similar. The authors (Chaoming Song and Hernán Makse (CCNY), and Shlomo Havlin (Bar-Ilan)) focus on "connectivity between groups of interconnected nodes on different length scales," which they study by a renormalization procedure (see caption at left). They apply this analysis to the following networks: hyperlinks among the 325,729 web pages of a subset of the World Wide Web; 392,340 actors (linked if they have been cast together in at least one film); and various networks from molecular and cellular biology. All of these networks turn out to be self-similar, with fractal (box) dimension d_B = 4.1 for the WWW and 6.3 for the actors. Here is the authors' illustration of renormalization carried out on their WWW sub-network, with l_B = 3:

Box size 3 renormalization of a 325,729-page subnetwork of the World Wide Web.

Image from *Nature*
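The renormalization step can be sketched with a greedy box covering: every node in a box is within distance l_B - 1 of every other, each box collapses to a supernode, and two supernodes are linked whenever any edge crosses between their boxes. The 8-node graph below is a hypothetical path, not the schematic network in the authors' figure, and greedy tiling is only one of several box-covering algorithms in the literature.

```python
from collections import deque

def bfs_within(adj, src, max_d):
    """Set of nodes within distance max_d of src (breadth-first search)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == max_d:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return set(dist)

def box_cover(adj, l_box):
    """Greedy tiling of the graph into boxes of diameter < l_box."""
    boxes, covered = [], set()
    for seed in adj:
        if seed in covered:
            continue
        box = {seed}
        reach = {seed: bfs_within(adj, seed, l_box - 1)}
        for v in adj:
            if v in covered or v in box:
                continue
            # v joins only if it is close to every node already in the box
            if all(v in reach[u] for u in box):
                box.add(v)
                reach[v] = bfs_within(adj, v, l_box - 1)
        covered |= box
        boxes.append(box)
    return boxes

def renormalize(adj, boxes):
    """Collapse each box to a supernode; link boxes joined by any edge."""
    label = {v: i for i, box in enumerate(boxes) for v in box}
    new_adj = {i: set() for i in range(len(boxes))}
    for u in adj:
        for v in adj[u]:
            if label[u] != label[v]:
                new_adj[label[u]].add(label[v])
                new_adj[label[v]].add(label[u])
    return new_adj

# Hypothetical 8-node path graph 0-1-2-3-4-5-6-7
adj = {i: {j for j in (i - 1, i + 1) if 0 <= j < 8} for i in range(8)}
boxes = box_cover(adj, l_box=3)
print(boxes)                    # [{0, 1, 2}, {3, 4, 5}, {6, 7}]
print(renormalize(adj, boxes))  # the renormalized network is a 3-node path
```

Repeating the covering for several box lengths and fitting the number of boxes N_B against l_B in a power law N_B ∝ l_B^(-d_B) is what yields the box dimension d_B reported for each network.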

Tony Phillips

Stony Brook University

tony at math.sunysb.edu