
The History of the Computer

26 February 2011

34SP.com Staff

It’s a Website World: The History of the Computer

The computer’s development began with man’s desire to calculate ever more efficiently. The first calculating aid was a carved bone device from around 35,000 B.C. known as a ‘tally stick’. That simple beginning started a process of invention that has not stopped to this day.

The first known calculator was the abacus, which laid the foundation for later inventions. As the Chinese developed differential gearing and mathematicians in India began using zero in calculations around A.D. 500, the stage was set for numbers and technology to come together. It was around 300 B.C. that Pingala described the binary number system that would eventually underpin the design of nearly all modern computing equipment.

Around 125 B.C., technology took another jump with the Antikythera mechanism. Built in Corinth, it could track the known stars and planets in the sky, and most consider it the first analog computer. Centuries later, around A.D. 725, the Chinese inventor Liang Lingzan built the first mechanical water clock, a significant leap given that clockwork would underpin so many later computing machines. Around 820, al-Khwarizmi’s work established the use of algebra, and in 1206 the Arab engineer Al-Jazari built an astronomical castle clock that some consider the earliest programmable machine.

Over the following centuries, the geared calendar was invented, as was a notional machine for calculating answers to philosophical questions, and in 1588 Jost Bürgi discovered natural logarithms. All were advances towards computer technology. Then in 1774, Philipp Matthäus Hahn of Germany invented the first portable calculator, which could perform all four basic arithmetic operations. It wasn’t until the 1800s, however, that computing technology really took a leap forward.

It was 1801 when Joseph-Marie Jacquard came up with a loom that could be controlled by punched cards. This, together with the preceding decades’ advances in calculators, led to invention after invention, each expanding on earlier inventors’ discoveries. Charles Babbage was a major figure in all of this, conceiving the Analytical Engine and designing many programs for it through the mid-1800s. After his death, a committee concluded that completing the Analytical Engine wouldn’t be possible. Thankfully, men like Howard Aiken did not believe so and carried the work forward.

In 1884, Dorr Felt developed the first calculator that used keys rather than dials, and in 1889 he also invented the first desk calculator that could print. This inspired others in the field. Herman Hollerith’s invention, which recorded data onto punched cards that a machine could read, was used to tabulate the 1890 census; data had never been machine-readable before this time. Hollerith went on to found the Tabulating Machine Company (later to become part of IBM) in 1896. William S. Burroughs of St. Louis, Missouri improved upon Felt’s original machine eight years later, and his design became the platform for the mechanical calculator industry.

The 1900s were the century in which the computer age truly arrived. It began in 1924 with Walther Bothe’s Nobel Prize-winning work building an AND logic gate for physics experiments. Then came the differential analyzer, capable of solving differential equations, in 1930; the demonstration of a 1-bit binary adder using relays in 1937; and the ‘Z1’, the first mechanical binary programmable computer, a year later. In 1939, William Hewlett and David Packard started the company now known as HP, and many other companies were to follow in those footsteps. On April 1, 1940, Konrad Zuse of Berlin, Germany founded the very first computer start-up company, Zuse Apparatebau. He also presented the Z2 that year and the Z3 computer a year later.

Not to be outdone, Dr. Thomas Flowers of London built the Colossus to crack the cipher of the German Lorenz machine. Colossus contained 2,400 vacuum tubes for logic and applied a programmable logical function to a stream of input characters read from punched tape at 5,000 characters per second. The second program-controlled machine was the Harvard Mark I, created by Howard Aiken and his team in partnership with IBM. In 1945, Konrad Zuse, still hard at work, developed the first high-level programming language, and he launched the Z4 the following year. 1947 brought the Harvard Mark II, and in 1948 IBM finished the SSEC, which for the first time allowed a stored program to be modified.

1948 began what could be called the modern computer age. For the first time, a computer could store its data and programs in RAM, just as today’s machines do. The computer, nicknamed “Baby”, was built at the University of Manchester. Within a year, “Baby” was equipped with a magnetic drum for more permanent storage and was renamed the Manchester Mark I. In May 1949, Maurice Wilkes and his team at Cambridge University ran a stored program on the EDSAC computer, which had tape input-output, making May 6, 1949 the unofficial birthday of modern computing. It was also the year that Popular Mechanics predicted, “Computers in the future may weigh no more than 1.5 tons.”

Vacuum tubes were used in the 1950s, as were transistors, but it wasn’t until March 1951 that the first commercial general-purpose computer was created by J. Presper Eckert and John Mauchly. It was the UNIVAC, which was designed to handle textual as well as numeric information. In April of the same year, Jay Forrester and his team introduced the Whirlwind computer, created for the US air defense system and the first real-time computer. Small developments continued until 1957, when IBM introduced the dot matrix printer, followed in 1958 by the Texas Instruments integrated circuit.

Computers built from 1959 through 1964 are considered second generation, based on their printed circuits and transistors, which made them much smaller and more powerful. Newer machines could run compilers for scientific and business languages, making them more flexible to use. Through these years there was a global race to build better, faster computers; most participants made only small contributions, while others expanded on their ideas. This competition also led developers to think about individual users, and a personal computer prototype was created in 1961.

Advances continued with the mouse, conceived in 1963 but not widely used until Apple shipped it with the Macintosh in 1984. CDC developed the first supercomputer in 1964, and Hewlett-Packard entered the general-purpose computer business with its HP-2115. The industry was growing by leaps and bounds, and in 1973 alphanumeric information could be displayed on a television screen. That was also the year in which silicon chips and Ethernet were developed; Ethernet allowed PCs and other computers to be connected together to share data. The progression of computers was now focusing on ease of use: single-board designs, keyboards, better displays, mass storage and more. Microsoft was founded in 1975 and Apple Computer a year later, alongside new releases from IBM, Commodore and other companies.

It was still generally believed that business use was where the computer would stay, but as computer companies made advances, more and more individuals began to purchase them for personal use. This started another competition among computer makers, with newer and faster personal computers appearing every year or so through the 1980s. With the introduction of the World Wide Web, invented by Tim Berners-Lee in 1989, advances came even faster in the 1990s, sometimes hitting the market before a year had passed.

Software, graphics, color and processing speeds improved steadily as computers became smaller and smaller. The Internet was also expanding: what had started as a way to exchange simple information from place to place exploded into an international obsession once combined with the World Wide Web and simple navigation tools. People were now using computers to seek out information, play games, interact with others, advertise businesses and more.

Smaller portable machines like the Xerox NoteTaker, Sharp’s suitcase-sized computer and Kyocera’s popular Kyotronic 85 would be considered the first portables. 1989 brought Apple Computer’s Macintosh Portable, which was smaller still. Apple made further changes in 1991 with its PowerBook, introducing features that would become standard on future laptops, such as a palm rest and trackball. The following year IBM released the ThinkPad 700C with a similar design. As the laptop industry grew, more options appeared: color displays, touch pads, stereo audio and built-in Ethernet network adapters.

As with full-sized computers, the race continued to make laptops smaller, faster and more user friendly. Battery life was extended with power-saving processors and liquid crystal screens, while storage capacity, connectivity, internal modems and drop-safe shells all improved. There are now options for peripherals like cameras, video, fingerprint sensors and audio features, and the list keeps growing. Smartbooks, hybrid devices sitting between a laptop and a smartphone, were introduced in the 2000s, as were netbooks, lightweight computers roughly half the size of regular units.

What started as a tool to help humans calculate has developed into a sophisticated electronic marvel that few in the world today are unfamiliar with. These high-speed, low-cost digital computers have connected the world and become a widespread commodity.
