Wednesday, August 11, 2010

Digital Thinking in an Analog World

The most fundamental difference between humans and computers is captured in the distinction between Analog and Digital. The analog world is continuously changing, without central coordination, massively parallel, and inherently asynchronous. The digital world, on the other hand, is made up of easily separable, discrete states, where even time itself is measured in countable clock ticks. Real-world digital machines are always analog at their base level, but they are designed to "round off" their imprecise measurements, cleaning up any noise and making everything as neat and tidy as a set of integers. Digital thinking is a descriptive phrase for what intelligent computer programs do, because computers are digital machines. But it also captures the essence of how humans (Westerners, at least) are taught to think about Systems, both to analyze how they work and to design them to perform particular functions. Philosophical schools refer to this kind of digital thinking using big-word labels such as Reductionism, Rationalism, and Logical Positivism. How people are taught to think about the world has big implications for how they think when they try to understand their own brains.

Digital machines have the advantage that they are much more likely to produce the same results when they repeat their performances, and they tolerate more error in their own manufacturing precision. You might find it interesting to realize that the first autonomous digital machines were not computers as we know them, but rather one of the key components of a computer, namely the clock. Long before electronic clocks, the invention of the escapement mechanism was the key to making clocks work mechanically. This is the clever device that in some sense digitizes time, snapping the turning gears forward in countable ticks. Even earlier clocks that worked by falling sand (hourglasses) or dripping water (water clocks) were also basically digital, however, since grains of sand or drops of water are themselves naturally discrete, countable things. Sundials, on the other hand, are analog devices whose continuously changing shadows eventually get "digitized" by an observer who uses the tick marks on the dial to read off the time value. Another ancient forerunner to the modern computer is the abacus, and as a counting and tracking device, it too relies on digital representations. Although the Chinese abacus is probably older than the Egyptian water clock, I don't consider an abacus to be a "machine" in the same sense as clocks, since it only records digital states as manipulated by a person and doesn't change them on its own.

Another more modern invention, one that rivals the mechanical clock escapement in its technological significance, is the transistor. This is the basic electronic switching element used to digitize an analog electrical signal, snapping the output voltage to essentially one of two very different (and therefore discrete) values. Before the transistor we used electromechanical relays or vacuum tubes, and such binary (two-valued) switching devices are at the heart of all modern digital electronics, which of course includes computers. Digital representations can use other number bases besides base 2, but binary is so primitive, generic, common, and useful that it is practically synonymous with the word "digital". The fact that the earliest mechanical and electromechanical computers used decimal (base-10) digitizations is now just an interesting historical artifact.
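To make the idea concrete, here is a toy sketch in Python (software standing in for silicon; the 0.5 threshold and the sample voltages are arbitrary choices for illustration). A noisy, continuously varying input gets snapped to one of two discrete output levels, which is the essence of what a binary switching element does.

def digitize(voltage, threshold=0.5, low=0.0, high=1.0):
    """Snap a continuously varying input to one of two discrete output levels,
    the way a binary switching element cleans up an analog signal."""
    return high if voltage >= threshold else low

# Noisy analog samples come out as crisp binary levels.
samples = [0.07, 0.48, 0.52, 0.93]
print([digitize(v) for v in samples])   # -> [0.0, 0.0, 1.0, 1.0]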

Software mimics its underlying hardware, so computer programming languages work with primitive data types that are encoded as various fixed-length bit-strings, that is, binary numbers. Working with digital models has proven to be very convenient, and all modern computer programming simply takes this as given. But it is also a slippery slope whose appeal has led many great thinkers on what I would call a wild goose chase, looking for an understanding of the world that is precise, logically pure, and complete. If you study the history of Artificial Intelligence, you will see that many researchers believed that knowledge was inherently a digital thing, and that logic should form the foundation of knowledge-based systems and thinking machines. An alternative approach, which I think will ultimately prove to be more capable, useful, and essentially "correct", is to view knowledge and reasoning from a more analog perspective, namely in terms of inexact or probabilistic representations and processes. We need to realize that digital thinking is not necessarily limiting, because just as digital machines are built on top of analog ones, we can go another level and simulate analog thinking on digital hardware. I will have much more to say on this topic in later postings.
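Both points can be illustrated with a small Python sketch (the particular values below are invented for the example, not taken from any real system). The first half shows that an integer and a floating-point number are just fixed-length bit-strings to the machine; the second half shows "analog-style" reasoning, a single probabilistic (Bayesian) update, carried out with ordinary digital arithmetic.

import struct

def bits(fmt, value):
    """Pack a value into its fixed-length binary encoding and return the bits."""
    return " ".join(f"{byte:08b}" for byte in struct.pack(fmt, value))

# A 32-bit integer and a 32-bit IEEE-754 float are both fixed-length bit-strings.
print(bits(">i", 42))    # 00000000 00000000 00000000 00101010
print(bits(">f", 0.1))   # note that 0.1 has no exact binary representation

# "Analog" reasoning simulated digitally: one Bayesian update for a
# hypothetical noisy sensor (all of these probabilities are made up).
prior = 0.01          # P(event)
hit_rate = 0.95       # P(signal | event)
false_alarm = 0.05    # P(signal | no event)
posterior = (hit_rate * prior) / (hit_rate * prior + false_alarm * (1 - prior))
print(f"P(event | signal) = {posterior:.3f}")   # about 0.161

The continuous-looking quantities in the second half are, of course, built out of exactly the same fixed-length bit patterns as the first half, which is the sense in which analog thinking can ride on top of digital hardware.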

For further reading about the fascinating history of the development of the first accurate shipborne clock, see "Longitude", by Dava Sobel.  For the story of the first mechanical computing device and its inventor, see "The Difference Engine: Charles Babbage and the Quest to Build the First Computer", by Doron Swade.
