
# What is an Analog Computer?

John Lister

An analog computer is one that can perform multiple calculations at once and can work with quantities that vary continuously rather than in fixed steps. The term analog does not relate to how the computer is powered, and it is possible for these computers to be electronic. The characteristics of an analog device can even make it better than a digital computer at particular tasks.

A computer is simply a machine that processes data in a set fashion or, to put it another way, calculates. Today, most computers are digital and work by reducing all data to binary numbers before processing. Analog computers go back thousands of years, but they differ from digital computers in only two fundamental ways.

The first is that this type of computer works in parallel, which means that it can carry out multiple tasks simultaneously. A digital computer, even though it may work considerably faster, can only perform one calculation at any one instant. The only way around this limitation is parallel computing, in which a single machine has multiple processors; even then, programs must often be rewritten to take advantage of them.
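To see what "rewriting a program to take advantage of parallelism" means in practice, here is a minimal Python sketch. The function name `square` and the input list are illustrative, not from the article; note that a plain loop had to be restructured into an explicit `map` before the work could be handed to a pool of workers (in CPython, true multi-core speedup for CPU-bound work would need processes rather than threads, but the programming model is the same):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # each call is an independent calculation that a separate worker can run
    return n * n

numbers = [1, 2, 3, 4]

# sequential: one calculation after another, as on a single processor
sequential = [square(n) for n in numbers]

# parallel: the same work expressed as a map over a pool of workers
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(square, numbers))

print(sequential == parallel)  # True: same results, different execution model
```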

The second difference is that an analog computer handles continuous variables, while a digital computer works with discrete numbers. The difference between these is that continuous variables can include every conceivable number, even irrational numbers, such as π (pi).

Discrete numbers are those that can be written down completely: whole numbers; numbers whose decimal fractions terminate, such as one-eighth being 0.125; and numbers whose digits repeat in a pattern, such as one-sixth being 0.1666 recurring. The infinite, non-repeating nature of irrational numbers means they cannot be represented exactly in the binary form a digital computer requires. This is why only analog computers can act as so-called “real computers” and, in principle, solve some of the most complicated problems in mathematics.
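The same three cases can be checked directly in Python, which stores floating-point numbers in binary. This is only a sketch of the idea: one-eighth has a finite binary expansion and is stored exactly, one-sixth repeats forever in binary so the stored value is an approximation, and any stored value of pi is necessarily a cutoff of an infinite expansion:

```python
from fractions import Fraction
import math

# 1/8 is 0.001 in binary, so the float stores it exactly
print(Fraction(0.125) == Fraction(1, 8))   # True

# 1/6 repeats forever in binary, so the float can only approximate it
print(Fraction(1/6) == Fraction(1, 6))     # False

# pi is irrational; math.pi is a finite binary approximation
print(math.pi)                             # 3.141592653589793
```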

This type of computer can work both mechanically and electronically. Mechanical computers have existed for thousands of years, with the oldest known example being the Antikythera mechanism. This is a Greek machine, thought to have been made around 100 BC, designed for calculating astronomical positions. A more recent and common version is the slide rule.

The electronic analog computer works on the same principles, but uses electrical components to replace the physical parts. The big advantage is that the properties of these components can often be varied, whereas the physical parts would have to be replaced to change their properties. The downside is that the electronics are subject to noise, a type of interference caused by external physical factors.
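The effect of noise on an analog computation can be illustrated with a toy simulation. The function below is hypothetical, not a real analog device: it models an integrator (a common analog building block) summing a voltage over time, with optional Gaussian noise added to each reading to stand in for electrical interference:

```python
import random

def analog_integrate(signal, dt, noise_sd=0.0):
    """Toy model of an analog integrator: accumulate a voltage over time,
    optionally corrupting each reading with Gaussian noise."""
    total = 0.0
    for v in signal:
        total += (v + random.gauss(0.0, noise_sd)) * dt
    return total

signal = [1.0] * 1000                 # a steady 1 V input sampled for 1 second
clean = analog_integrate(signal, dt=0.001)               # very close to 1.0
noisy = analog_integrate(signal, dt=0.001, noise_sd=0.05)
print(clean, noisy)  # the noisy result drifts slightly away from 1.0
```

The sketch shows why noise matters: unlike a digital machine, which snaps every value back to a discrete code, an analog machine carries any disturbance straight through into its answer.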

## Discussion Comments

anon1006688

Geared calculating machines are usually considered to be digital computers because the geared teeth provide discrete inputs and outputs that are repeatable. The famous Antikythera mechanism is often described as an analog computer, but why? It is a hand-cranked orrery which, once set up with the current positions of the bodies, can display the positions of the moon and major planets to within 1 degree, up to 500 years into the future. And if you reset the starting positions and turn the crank again the same number of times, you will get the same result. And you can convert the gear ratios into equations, program them into an Excel spreadsheet and get the same results. Sounds like a digital computer to me.

I think the term analog electric computer is very misleading but it does make sense in principle. You would think with all these smart, geeky scientists that they would create a more understandable term.

jeancastle00

The most famous analog computer is of course the calculating engine that Charles Babbage created from mechanical parts. His forethought and inspiration allowed our industry and technology leaders to move forward on a path to the incredible systems that we have today.

When most people think of computers, analog is a word almost opposite of their thoughts. The general public often assumes that the computer revolution began in the 20th century, and this simply is not the case.

The mechanical computer that Charles Babbage built was a forerunner in its time and one for the history books.