"Sweet Nothings," by Ian Stewart. New Scientist, 26 January 2002, pages 27-29.
Since the time of Archimedes, mathematicians have used the concept of an "infinitesimal"---a number that is bigger than zero but smaller than every positive real number. If this sounds like a contradiction, within the ordinary real numbers it is, and during the 18th and 19th centuries mathematicians worried a great deal about whether infinitesimals could be established as a rigorous mathematical concept. Still, infinitesimals persisted because they are so useful in mathematics. It wasn't until the 1960s that the mathematician Abraham Robinson finally found a way to put infinitesimals on a solid theoretical footing. Essentially, Robinson treated them not as ordinary real numbers but as a new kind of beast, called a nonstandard number. Nonstandard analysis, the branch of mathematics that grew out of his work, has been used in a variety of ways, in particular in physics, to model gases or groups of particles that move randomly. There has also been speculation that nonstandard analysis could be useful in computer science, where one often studies extremely complicated but finite situations. "The limits of computing power may be related to the passage between the infinite and the finite," the article states. "[I]nfinitesimals may be the way to explore this borderline."
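A classic textbook illustration (not taken from the article itself) shows why infinitesimals are so handy: to differentiate $f(x) = x^2$, one perturbs $x$ by an infinitesimal $\varepsilon$ and discards the infinitesimal remainder at the end, exactly the step that nonstandard analysis makes rigorous.

```latex
\[
\frac{f(x+\varepsilon) - f(x)}{\varepsilon}
  = \frac{(x+\varepsilon)^2 - x^2}{\varepsilon}
  = \frac{2x\varepsilon + \varepsilon^2}{\varepsilon}
  = 2x + \varepsilon \approx 2x,
\]
\[
\text{so } f'(x) = 2x, \text{ since the infinitesimal } \varepsilon
\text{ is negligible next to any real number.}
\]
```

In Robinson's framework, the final step of "dropping" $\varepsilon$ is the standard-part operation, which maps each finite nonstandard number to the unique real number infinitely close to it.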
--- Allyn Jackson