Ever since the abacus was invented thousands of years ago, the human race has been using devices to help with computation, but it wasn't until the mechanical calculator was invented by Wilhelm Schickard in 1623 that the era of the computer truly began. His invention, the calculating clock, used cogs and gears and was a long way from today's mobile phones, tablets and laptops, but it marked a significant development in the use of calculating devices, and mechanical calculators remained in use well into the 20th century. In fact, the slide rule, which is a type of mechanical calculator, is still used today by some engineers, even though it was invented way back in the 1620s by William Oughtred.
The invention of the punched card in 1801 was another significant milestone in the history of the computer. In that year, Joseph-Marie Jacquard developed a loom in which punched cards controlled the pattern being woven. The series of cards could be changed without changing the mechanical design of the loom, a landmark point in programmability.
The defining feature of a computer is the ability to program it, and programmable machines gradually became more widespread from about 1835 onwards. A program enables a computer to emulate different calculating machines by using different stored sequences of instructions. In 1837, Charles Babbage described his Analytical Engine, a general-purpose programmable computer that used punched cards for input and a steam engine for power. Babbage never built his Analytical Engine, but a model of part of it is on display at the Science Museum in London, UK.
Electronic calculators didn't appear until the 1960s, with the Sumlock Comptometer Anita C/VIII quite possibly being the first. Its price tag, though, was a hefty $2200. Electronic digital computers themselves were invented in the 1940s, with the onset of the Second World War driving great advances in computer design and development. Electronic circuits, relays, capacitors and vacuum tubes replaced their mechanical equivalents, and digital calculations replaced analog ones.
The next major step in the history of the computer was the invention of the transistor in 1947, which replaced the fragile and power-hungry valves with a much smaller and more reliable component. This was the beginning of miniaturization. Through the 1950s and 60s computers became more widespread, and in 1959 IBM started selling the transistor-based IBM 1401. In total, 12,000 were shipped, making it the most successful computer of that period.
The explosion in computing really began with the invention of the integrated circuit (or microchip), which led to the invention of the microprocessor at Intel. The microprocessor led, in turn, to the development of the microcomputer. Microcomputers were affordable to small businesses and individuals alike, and they continued the unstoppable technological march that has brought us to where we are today.