Charles Babbage's Digital Computer

In the 1820s Charles Babbage, a British mathematician, developed the Difference Engine, a machine that performed mathematical computations using gears and wheels. In 1833 he designed the Analytical Engine and even built a partial prototype. Had it ever been completed, it would have been a true digital computer with memory, but the technology of the early nineteenth century was not advanced enough to make the invention a reality.

Lady Augusta Ada Byron was the world's first computer programmer. The Analytical Engine, had it actually been built, would have executed instructions and performed operations on data. Lady Ada worked out the logic for this early computer, writing instruction sets that could manipulate the contents of the machine's primitive memory. Ada, a modern programming language used on U.S. Government mainframe computers, was named in her honor.

Interestingly enough, the Difference and Analytical Engines planted a long and fondly held idea in the public mind that computers are primarily mathematical calculating tools. A particularly fast and powerful machine is often referred to as a "number cruncher," and people still ask whether a computer can calculate prime numbers. Indeed, the very word "computer" bespeaks its mathematical origins. Today, however, the computer is used as a communications tool, an information manager and server, and a creative tool for artists far more than it is used to grind away at complex mathematical problems. For even the most onerous mathematical tasks required of a computer by the average consumer, the Apple II of the late 1970s was massive overkill. If you find this hard to believe, try calculating by hand the bank balance of a one-dollar deposit at 5 percent interest after 599 years. Then obtain a copy of this author's simple Applesoft BASIC program on a 143K floppy disk and run it on an old Apple II with 16K of RAM. Which is faster, human or machine?
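The bank-balance challenge above is a simple compound-interest calculation. The author's original program was written in Applesoft BASIC; the sketch below is a hypothetical modern equivalent in Python, not the original code, showing how trivial the computation is for any machine:

```python
# Hypothetical stand-in for the author's Applesoft BASIC program:
# compound interest, compounded once per year.
def compound_balance(principal, rate, years):
    """Return the balance after `years` years at annual interest `rate`."""
    return principal * (1.0 + rate) ** years

if __name__ == "__main__":
    # One dollar at 5 percent annual interest for 599 years.
    balance = compound_balance(1.00, 0.05, 599)
    print(f"${balance:,.2f}")
```

Even on vintage hardware this loop of multiplications finishes in moments; by hand, the same 599 multiplications would take days.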


