
A brief history of the computer

With computers now commonplace in every home, workplace and pocket, Simon Handby traces the development of the technology that changed the world

If you’re a typical Expert Reviews reader, the chances are that you use a computer at work, that you’ve got one or two at home, and that there are more than a handful between your television, games console, car and mobile phone. Computers and computer technology have become an indispensable part of modern life, and their widespread uptake is changing the way we live, but computing for all is still relatively new – and it’s something that many early pioneers didn’t foresee.

The first true computers were electromechanical giants, developed by governments and institutions driven on by the desperate circumstances of the Second World War. Computers remained in the hands of universities, governments and big business for decades after the war’s end, but as the technology improved they became smaller, more affordable and more accessible until they came into our homes and ultimately our pockets. Here we chart the history of computing, telling the story of how such powerful tools have ended up in so many hands.

EARLY BEGINNINGS

Most histories of the computer start with the English mathematician and engineer Charles Babbage, whose unfinished ‘analytical engine’ was undoubtedly the first design for what we now think of as a computer: a machine that takes an input, mathematically manipulates it according to a customisable program, and produces an output. Babbage was a true visionary; it’s a somewhat macabre indication of the esteem in which he was held that one half of his brain remains on display at the Hunterian Museum in the Royal College of Surgeons, and the other at the Science Museum. Still, even his work built on some existing fundamentals.

Mankind had been using machines to aid calculation since at least the appearance of the abacus, thought to date back to before 2300 BC; but it was in Renaissance Europe that engineers began to produce far more sophisticated calculating devices, some of which offered a degree of programmability. In 1801, as the Industrial Revolution gathered pace, Joseph Marie Jacquard invented a weaving loom that could be programmed with punched cards to produce different patterns – the first machine to be given instructions in this way.

The reconstruction of Babbage’s difference engine at the London Science Museum

Babbage sought a way to remove human errors from the mathematical tables available in the early 19th century, devising his mechanical ‘difference engine’ to calculate polynomial functions (a type of algebraic expression). The machine takes its name from the method of finite differences, which reduces the evaluation of a polynomial to repeated addition – an operation simple enough to carry out with gears and wheels. Though it was never finished, the first difference engine would have contained more than 25,000 parts and weighed over 13 tonnes. A revised design was completed in 1991 by the Science Museum, and found to work perfectly.
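To make that concrete, here is a minimal sketch in Python (a modern illustration only; the example polynomial is hypothetical) of the method of finite differences the engine mechanised. Once the initial differences of a polynomial are set up, every further value in the table follows by addition alone.

# The method of finite differences: tabulate a polynomial by addition alone.
def tabulate(initial_differences, steps):
    # initial_differences[0] is p(0); each later entry is the next
    # finite difference, the last of which is constant for a polynomial.
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Fold each difference into the one above it, as the engine's
        # stacked columns of number wheels did mechanically.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# For the hypothetical p(x) = 2x^2 + 3x + 1: p(0) = 1, the first
# difference p(1) - p(0) = 5, and the second difference is the constant 4.
print(tabulate([1, 5, 4], 6))  # [1, 6, 15, 28, 45, 66]

Every value after the first is produced without a single multiplication, which is what made a purely mechanical calculator feasible.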

More complex still, and also unfinished, Babbage’s analytical engine added features that define modern computers. It could be programmed with cards, but could also store the results of calculations and perform new calculations on those. Babbage intended it to support conditional branches and loops, constructs fundamental to all modern programming languages. His death in 1871 meant that he never finalised his designs for the engine, but his son Henry completed its core computing unit – ‘the mill’ – in 1888.
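As a trivially small modern analogue (the task – summing squares until a threshold is passed – is hypothetical and chosen only for illustration), the snippet below combines the three ingredients the analytical engine was designed to provide: a stored result, a loop and a conditional branch.

# A stored intermediate result, a loop and a conditional branch:
# the trio of features the analytical engine anticipated.
total = 0  # a stored result, reused by later calculations
n = 1
while True:  # a loop
    total += n * n  # a new calculation performed on the stored result
    if total > 100:  # a conditional branch
        break
    n += 1
print(n, total)  # 7 140: the first partial sum of squares above 100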

