Friday, November 04, 2005

Why didn't Babbage use binary?

My mother asked me an interesting question today as research for her book on the history of textiles: "Why did Babbage use denary rather than binary in his difference engine?".

The short answer is that he should have used binary, but the reason why goes to the very heart of the digital computing revolution. Here is my answer:

The key thing about Babbage's "difference engine" and all modern computers is that they are digital (like an abacus) rather than analogue (like a slide rule). The reason this is important comes down to precision: if you want to make a slide rule 10 times more accurate, you need a finer, more precise machine to craft it with, and every improvement in precision is increasingly expensive: a saw that can cut to the nearest millimeter is 10 times more expensive than one that is accurate to a centimeter, and so on. In comparison, adding more accuracy to a digital computer means just adding another "cog". For instance, if Babbage had 4 cogs in his machine and could crank out numbers like 43.25, he could add a single new cog and crank out 43.248, which is 10 times more accurate. The extra cost is only ¼ of the amount he had already spent on the previous 4 cogs. So for the same precision improvement, the analogue computer becomes 10 times as expensive, whereas the digital one is only ¼ more expensive.
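
To make that concrete, here is a toy calculation in Python. The figures are invented purely for illustration (they are not Babbage's actual costings): I assume each extra digit of analogue accuracy multiplies the machining cost by 10, while each extra digit of digital accuracy just adds one more identical cog.

    # Toy model only: assumed numbers, not historical costs.
    def analogue_cost(digits, base_cost=1.0, factor=10.0):
        # Each additional digit of accuracy multiplies the machining cost by `factor`.
        return base_cost * factor ** (digits - 1)

    def digital_cost(digits, cost_per_cog=1.0):
        # Each additional digit is one more identical cog, so cost grows linearly.
        return cost_per_cog * digits

    for d in range(1, 7):
        print(d, "digits: analogue", analogue_cost(d), "vs digital", digital_cost(d))

The exact numbers don't matter; the point is the shape of the curves, exponential for analogue against linear for digital.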

So Babbage made the right decision choosing a digital design, but why denary and not binary? One disadvantage of binary was that it would have complicated the printer he designed: it would have had to translate 101011.01 into its decimal equivalent (43.25) when writing out the results. However, binary beats denary because of the precision needed to manufacture the "cogs", and it was the cost of achieving this precision that ultimately defeated Babbage.
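
For the curious, the translation that printer would have had to perform is easy to sketch in a few lines of Python (my own illustration, of course, not how Babbage's printing mechanism worked):

    def binary_fixed_point_to_decimal(s):
        # Split "101011.01" into whole and fractional parts.
        whole, _, frac = s.partition(".")
        value = int(whole, 2) if whole else 0
        # Each fractional bit contributes 1/2, 1/4, 1/8, ...
        for i, bit in enumerate(frac, start=1):
            value += int(bit) * 2 ** -i
        return value

    print(binary_fixed_point_to_decimal("101011.01"))  # prints 43.25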

To elucidate further: the fewer states each "cog" has to represent, the easier it is to distinguish between them. Therefore, the parts can be manufactured cheaply and with more generous tolerances, allowing for greater bang-for-buck. The machines of Thomas Fowler realised this, and employed just 3 states. Obviously the binary system is the logical conclusion of this argument, although why we don't use base-1 is an interesting topic for a separate post.
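
Here is a rough way to see the trade-off. Fewer states per cog means you need more cogs in total, but each cog is simpler; as a crude cost proxy (entirely my own assumption) take the number of cogs multiplied by the number of states each cog must hold, for an arbitrary target of a million distinct values:

    N = 10**6  # distinct values the machine must represent (arbitrary choice)

    for base in (2, 3, 10):
        cogs = 1
        while base ** cogs < N:   # keep adding cogs until we can count up to N
            cogs += 1
        # Crude cost proxy: number of cogs times the states per cog.
        print("base", base, ":", cogs, "cogs, crude cost", cogs * base)

On this admittedly simplistic measure, base 3 and base 2 come out cheapest and base 10 worst, which is consistent with Fowler's choice of a 3-state machine.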

There are some nice features of analogue computers, and (I was surprised to learn) this area is not completely dead, but in almost all cases digital computing is the best choice.

As an epilogue to this argument, consider the three things important in a modern computer: speed, speed, and speed (to quote Bill Walster). Computers are getting faster and faster because of the increasing density of components; the closer the components are, the faster electricity can travel between them. This leads to the ironic situation that greater component density relies on higher-precision chip fabrication, which, as we know, costs increasingly more to provide. So why are computers still getting cheaper? The answer is that production runs get longer and longer, so the initial cost of the foundry is spread more thinly. If the market for chips saturates, this will no longer hold true.

1 comment:

DU said...

Babbage was mainly defeated by perfectionism in design, not a failure of technology. If insufficient precision were the problem, the Science Museum in London would have failed, since they used period accuracy.

In other words, if he'd stopped screwing around and just DONE IT, it would have been done. (Also, his first chief technical guy was goldbricking.)