Learn why a computer error is called a “bug”

Irving Juárez
6 min read · May 23, 2021

Learning to program is hard. That’s why I am making it easy for you. This post is part of a series where I write about the exciting history of computer science. Go to the first post here to follow the thread of topics we are going to cover in this one.

In this post you are going to learn about:

  • The beginning of calculator optimization
  • The Harvard Mark I, a 5-ton computer
  • Why a computer error is called a “bug”
  • The first high-level programming language
  • The superiority of electronic computers
  • The micro-electronics revolution
  • The first general-purpose programmable digital computer

The beginning of calculator optimization

While IBM was developing mechanical calculators for the accounting industry, the U.S. military desired a mechanical calculator more optimized for scientific computation. By World War II the U.S. had battleships that could lob shells weighing as much as a small car over distances up to 25 miles.

A WW2 U.S. battleship

Physicists could write the equations describing how atmospheric drag, wind, gravity, muzzle velocity, and so on would determine the trajectory of a shell. But solving such equations was extremely laborious. This was the work performed by human “computers”. During World War II the U.S. military scoured the country looking for (generally female) math majors to hire for the job of computing these firing tables. But not enough humans could be found to keep up with the demand for new tables.
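To get a feel for why a single trajectory was so laborious, here is a hedged sketch: a crude Euler integration in Python, with made-up launch and drag parameters. A human computer did the equivalent of every loop iteration below by hand, with pencil, paper, and a desk calculator:

```python
import math

# Hypothetical launch parameters, for illustration only.
v0 = 760.0                  # muzzle velocity, m/s (assumed)
angle = math.radians(30.0)  # gun elevation (assumed)
vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
x = y = 0.0
g = 9.81                    # gravity, m/s^2
k = 1e-5                    # crude air-drag coefficient (assumed)
dt = 0.1                    # time step, s

steps = 0
while y >= 0.0:
    speed = math.hypot(vx, vy)
    # Drag opposes the motion and grows with speed.
    ax = -k * speed * vx
    ay = -g - k * speed * vy
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt
    steps += 1

print(f"Range ~{x / 1000:.1f} km after {steps} arithmetic steps")
```

A full firing table required many such trajectories, one for each combination of conditions, which is why whole rooms of human computers still could not keep up.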

Sometimes artillery pieces had to be delivered to the battlefield without the necessary firing tables, which made them close to useless because they couldn’t be aimed properly. Faced with this situation, the U.S. was willing to invest in any solution that might solve the problem.

The Harvard Mark I, a 5-ton computer

One early success was the Harvard Mark I computer which was built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S. But it was not a purely electronic computer. Instead the Mark I was constructed out of switches, relays, rotating shafts, and clutches.

The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50-foot rotating shaft running its length, turned by a 5-horsepower electric motor. The Mark I ran non-stop for 15 years, sounding like a roomful of ladies knitting.

The Harvard Mark I

Why a computer error is called a “bug”

One of the primary programmers of the Mark I was a woman, Grace Hopper. Hopper’s team found the first literal computer “bug”: a dead moth trapped in a relay of the Mark II (the Mark I’s successor), blocking its operation. The word “bug” had been used to describe a defect since at least 1889, but Hopper is credited with popularizing the term “debugging” for the work of eliminating program faults.

Today’s bug.
The first computer bug. Note that there is an insect in the photo.

The first high-level programming language

A high-level language is designed to be more understandable to humans than the binary language understood by the computing machinery. In the mid-1950s Grace Hopper created FLOW-MATIC, often cited as the first high-level language, and its design heavily influenced COBOL.

A high-level language is worthless without a program, known as a compiler, to translate it into the binary language of the computer. Hopper therefore also constructed the world’s first compiler, the A-0 system of 1952.
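To make “translating into binary” concrete, here is a minimal sketch. It is in Python rather than FLOW-MATIC (the function name and values are made up for illustration), and it uses Python’s built-in `dis` module to print the lower-level instructions that Python’s own compiler produces for one readable statement:

```python
# A small illustration of what a compiler does: Python's built-in
# `dis` module shows the low-level instructions that Python's own
# compiler generates from a human-readable, high-level statement.
import dis

def gross_pay(hours, rate):
    # One high-level statement a person can read at a glance...
    return hours * rate

# ...becomes several primitive, machine-like steps
# (load a value, multiply, return the result).
dis.dis(gross_pay)
```

The exact instruction names vary between Python versions, but the principle is the one Hopper pioneered: people write in a readable notation, and a program translates it into the primitive operations the machine actually executes.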

The superiority of electronic computers

The Mark I operated on numbers that were 23 digits long. It could add or subtract two of these numbers in three-tenths of a second, multiply them in four seconds, and divide them in ten seconds. Forty-five years later computers could perform an addition in a billionth of a second! Even though the Mark I had three quarters of a million components, it could only store 72 numbers!

Today, home computers can store 30 million numbers in RAM and another 10 billion numbers on their hard disk. Moreover, a number can be pulled from RAM after a delay of only a few billionths of a second, and from a hard disk after a delay of only a few thousandths of a second. This kind of speed is obviously impossible for a machine that must move a rotating shaft, which is why electronic computers are so superior to their mechanical predecessors.
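The figures quoted above make for a striking back-of-the-envelope comparison. A tiny sketch (Python, using only the timings already mentioned in this section) shows the scale of the speedup:

```python
# Back-of-the-envelope speedup, using the timings quoted above.
mark1_addition = 0.3        # Mark I: one addition took 3/10 of a second
electronic_addition = 1e-9  # later machines: about a billionth of a second

speedup = mark1_addition / electronic_addition
print(f"Electronic addition is roughly {speedup:,.0f} times faster")
# -> roughly 300,000,000 times faster
```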

Example of all the machinery inside a mechanical computer

As a curious note, the principal designer of the Mark I, Howard Aiken of Harvard, estimated in 1947 that six electronic digital computers would be sufficient to satisfy the computing needs of the entire United States. IBM had commissioned this study to determine whether it should develop this new invention into one of its standard products.

Aiken’s prediction wasn’t actually so bad: in 1947 only a few institutions (principally the government and the military) could afford the cost of what was then called a computer. He just didn’t foresee the microelectronics revolution, which would eventually shrink a machine like the 1959 IBM Stretch onto a single chip.

The micro-electronics revolution

Computers had been incredibly expensive because they required so much hand assembly, such as the wiring seen in this CDC 7600.

Hand-assembled wiring inside the CDC 7600

The microelectronics revolution allowed the hand-crafted wiring seen in the photo above to be mass-produced as an integrated circuit: a small sliver of silicon the size of your thumbnail.

An integrated circuit (the black rectangle)

We can see the integrated circuit as the Gutenberg printing press of computing, because it has two main advantages:

  • Circuit elements can be created and interconnected in a mass-production process. All the elements on an integrated circuit are fabricated simultaneously via a small number (maybe a dozen) of optical masks that define the geometry of each layer. This speeds up fabrication and hence reduces cost.
  • The elements are tiny, so the computer itself shrinks.

The IBM Stretch computer of 1959 needed its 33-foot length to hold the 150,000 transistors it contained. These transistors were tremendously smaller than the vacuum tubes they replaced, but they were still individual elements requiring individual assembly. By the early 1980s that many transistors could be fabricated simultaneously on a single integrated circuit, and the Pentium 4 microprocessor of 2000 packed 42,000,000 transistors into the same thumbnail-sized piece of silicon.

The first general-purpose programmable digital computer

The Z3, built in Nazi Germany in 1941 by Konrad Zuse, was probably the first operational, general-purpose, programmable (that is, software-controlled) digital computer. Working without knowledge of any calculating-machine inventor since Leibniz (who lived in the 1600s), Zuse reinvented Babbage’s concept of programming and decided on his own to employ binary representation for numbers (Babbage had advocated decimal).

The Z3 computer.
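To see what Zuse’s choice of binary means in practice, here is a small Python sketch (the year 1941 is just a convenient example value). The same number can be written in decimal or in binary; binary’s advantage is that each digit maps onto a simple two-state element, such as the relays the Z3 was built from, whereas a decimal digit would need a ten-state element:

```python
# The same value, written in decimal and in binary.
year = 1941
print(bin(year))               # '0b11110010101' -> eleven on/off relays
print(int("11110010101", 2))   # converts back to 1941
```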

The Z3 was destroyed by an Allied bombing raid. The Z1 and Z2 met the same fate, and the Z4 survived only because Zuse hid it in the mountains. Zuse’s accomplishments are all the more incredible given the material and manpower shortages in Germany during World War II. He couldn’t even obtain paper tape, so he had to make his own by punching holes in discarded movie film.

The history of computer science took off after the microelectronics revolution. Stay tuned to my Twitter and Medium accounts for “The modern history of computing”. Share this post if you liked it.
