Aardvark Daily
New Zealand's longest-running online daily news and commentary publication, now in its 25th year. The opinion pieces presented here are not purported to be fact but reasonable effort is made to ensure accuracy.
Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at aardvark.co.uk
I must admit that yesterday I spent far more time than I should have, strolling down memory lane and beyond.
I'm not talking about my early involvement with computers, something I've droned on about already. No, I'm talking about the *very* early days of computing, back in the 1930s and 1940s.
These were the days when computers were built from technologies such as relays, and memory was measured in bits rather than bytes.
This stuff is absolutely fascinating, and awe-inspiring.
It is awe-inspiring because of what they were able to achieve with the incredibly primitive technologies of the day.
Today we tend to think of a computer requiring millions or billions of transistors etched into a silicon chip of hundreds of square millimetres and supported by an army of other LSI devices on a single PCB.
Imagine instead an entire room filled with relays, motors, wiring and things as archaic as mercury delay-line memory units. Now THAT'S old-school computing!
Take, for example, the BTL computer invented by George Stibitz and built at Bell Labs back in 1939. This machine cost an insane $20,000 to build but ran for an entire decade, occupying a volume of almost 5 cubic metres into which masses of relays and wiring were crammed.
Its memory was sufficient to store only two values, and it was more like an advanced calculator than a true computer -- but it worked, and it laid the groundwork for bigger, better and more sophisticated computers to come.
Unsurprisingly, its performance was "lacklustre" when compared to today's computers, or even the cheapest four-function calculator. The best it could do was a rather paltry 0.2 instructions per second -- yes, that's right, 0.0000002 MIPS. Even that's a little misleading, however, because it didn't execute instructions as such; it simply used its hardwired logic to perform the same single arithmetic operation on the numbers fed into it.
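For the curious, the arithmetic behind that figure is trivial to check. This is just a sketch of the conversion; the "modern CPU" comparison figure at the end is a rough illustrative assumption, not a measured benchmark.

```python
# Sanity-check the column's arithmetic: converting the BTL machine's
# throughput to MIPS (millions of instructions per second).
ops_per_second = 0.2                    # roughly one operation every five seconds
mips = ops_per_second / 1_000_000
print(f"{mips:.7f} MIPS")               # prints "0.0000002 MIPS", as stated above

# For contrast, assume a (very roughly) 10,000 MIPS modern CPU core:
speedup = 10_000 / mips
print(f"about {speedup:.0e} times faster")
```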
By 1942, prices were falling and processing power was on the increase: the ABC machine cost a mere $7,000 and used the much newer and faster technologies of vacuum tubes for logic and capacitors for memory storage. It had an astounding 64 words of memory, each word containing 50 bits. It was still a "hardwired" computer, however, with no ability to control its function via software.
Perhaps the first truly programmable computer was the ASCC, built by IBM in 1943 at a cost of half a million dollars. This machine was not only far more expensive than those that came before it but also massive in size -- measuring some 16m x 2.5m x 2.5m, effectively occupying an entire room. Surprisingly, it too was a relay-based computer, but it had paper-tape as well as punched-card I/O and could be programmed via "plugboards" which effectively contained the code that controlled the sequencing of operations performed by the logic unit. With an astounding 132 words of 23 decimal digits each, this was a powerhouse (in more ways than one). At 3.3 operations per second it was already an order of magnitude faster than the BTL machine that preceded it by a mere four years.
This machine was also known as the Harvard Mark I, and a fantastic first-hand account of its use by none other than Grace Hopper can be seen in this video. In fact the whole video is well worth a watch if you have any interest in early computer history.
Perhaps few will share my passion for these very early years of computing, but again I must reiterate my respect for those who could build such capable machines from such primitive components. Imagine creating memory by sending ultrasonic sound waves through a tube filled with mercury -- the original form of dynamic RAM, where the stream of data had to be regenerated after each cycle through the tube. There's more fascinating insight into this technology on Wikipedia.
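If the delay-line idea seems abstract, here is a toy sketch of the principle: bits circulate through a fixed-length delay (a Python deque standing in for the mercury tube) and must be re-written at the far end on every pass or they are lost. The class name, sizes and bit pattern are purely illustrative, not historical.

```python
from collections import deque

class DelayLineMemory:
    """Toy model of delay-line storage: data only persists while it is
    actively regenerated at the end of each trip through the 'tube'."""

    def __init__(self, length_bits):
        self.line = deque([0] * length_bits)

    def write(self, bits):
        # Load a new pattern into the line.
        self.line = deque(bits)

    def tick(self, regenerate=True):
        # One bit emerges from the far end of the tube; if regenerated
        # it is fed back in, otherwise it decays to 0.
        bit = self.line.popleft()
        self.line.append(bit if regenerate else 0)
        return bit

mem = DelayLineMemory(8)
mem.write([1, 0, 1, 1, 0, 0, 1, 0])

# One full circulation with regeneration: the data survive intact.
print([mem.tick() for _ in range(8)])   # [1, 0, 1, 1, 0, 0, 1, 0]

# One circulation without regeneration: the stored bits are gone.
[mem.tick(regenerate=False) for _ in range(8)]
print(list(mem.line))                   # [0, 0, 0, 0, 0, 0, 0, 0]
```

The same "read, then immediately re-write" cycle is why the column calls this an ancestor of dynamic RAM, which likewise loses its contents unless refreshed.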
I hope this column may stimulate a few readers to take their own stroll back through the history of computing and be as wowed as I was.
Have your say in the Aardvark Forums.