Oh, how we easily take things for granted.
We sit smugly in our own little world, in front of a comfortably small machine that speedily and unobtrusively transports us to the vast frontiers of cyberspace in a matter of seconds. Or, we might say, in nanoseconds.
A machine that, within its tiny physical dimensions, boasts truly stupendous capabilities and powers: a processor running at over 3 GHz, random-access memory of over 1 GB for holding processed information, a hard drive of over 200 GB for permanently storing files of any format or flavor, and whatever new-fangled bells and whistles we have accessorized it with.
We do not, of course, gloat over this prized possession, because we know there are machines out there that can run circles around our own – machines with multiple processors, or even multi-core processors, with faster and larger memory chips, and with multiple, faster, and larger hard drives.
But we have reasons to take these for granted – because they are now quite commonplace, and relatively cheap for the typical consumer to hanker for and acquire.
But in the not-too-distant past, the world was a lot more primitive, coarse, and snail-like in motion. And we do not mean centuries or eons past. What about the early '80s?
In 1982, working for a first-class hotel in downtown San Francisco, one had the opportunity to work in EDP (Electronic Data Processing) and toy around with a mini-computer the size of a small refrigerator. It was the heart of a network of dumb terminals attached as nodes via coaxial cable. The computer itself (named Four Phase and made by a company named Datahost) had two spinning drives, one live and the other a back-up. Measured against current technology, it was so old one hardly remembers much about it.
Then by 1985 came the IBM System/3x line (the System/34, /36, and /38), technically marketed as mini-computers though larger than the previous machine, measuring at least like a big commercial-size freezer – and the tandem printer measured and weighed just a little bit less. The network was still coaxial cable with dumb terminals attached as nodes. But by this stage, signs of progress were already evident, with sub-systems on a token-ring model (rated at 4 to 16 Mbps) and data exchanged with the outside over circuit-switched phone lines.
In late 1994, hotel management decided to try IBM's newest, mysterious, and incredibly small and squat machine, labeled the AS/400 (Application System/400). Unfortunately, its introduction and marketing came at the cusp of the breakout of the personal computer as the machine of choice for business. Thus, we never really had time to get acquainted with the black console, which, when placed on the floor, required back-stretching motions to operate its buttons and view its tiny console screen.
But backtracking a bit: in 1981, IBM launched its first Personal Computer, coining the term "PC" in the process. The 286, 386, 486 – terms most probably now lost in current-day tech jargon. But it is worth remembering that a computer with a 486-class processor is said to have powered the original Mars rover, doing all that wonderful work for earthbound operators from many millions of miles away. Is this the same rover that to this date continues to receive commands from earth stations? I can't say.
By 1993, the first Intel Pentium chips had been launched, with initial processor speeds of 60 to 66 MHz, quickly progressing to 90, 100, and 166 MHz.
By 1995, the PC had firmly established itself as a business tool, and faster, more efficient network models were well into their seamless integration with the PC. Thus we had our auspicious introduction to the server/workstation model via Ethernet, using the then-popular server operating system Windows NT 3.1, in tandem with Windows 95 on the deployed workstations.
But even well into the start of the new millennium, our server PCs were still running single processors at speeds no faster than 400 MHz, and we had workstations running at 200 MHz, with a good number still chugging along at 166 MHz.
But in the years since, up to the current one, we have witnessed a remarkable explosion in processor speed that has had most everybody worshipping Moore's law as it plays out.
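Moore's law is often summarized as a doubling of transistor counts roughly every two years. As a rough illustration only – the function name and the fixed two-year doubling period are assumptions for the sketch, not figures from this account – the rule can be written as a one-liner:

```python
# A rough sketch of Moore's law as an exponential doubling rule.
# The two-year doubling period is an assumption for illustration.
def projected_count(initial, years, doubling_years=2.0):
    """Project a quantity forward under a fixed doubling period."""
    return initial * 2 ** (years / doubling_years)

# Under a two-year doubling, twenty years yields roughly a
# thousand-fold increase (2**10 = 1024).
print(projected_count(1, 20))  # 1024.0
```

The point of the sketch is simply the shape of the curve: a modest, steady doubling rhythm compounds into the explosive growth described above.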
Now, thinking and planning beyond the physical limits nature imposes on silicon, people in the industry are talking about nanotechnology – venturing into the molecular, even atomic, level of matter.
Are we now approaching what some scientists have projected and termed the Singularity – an exponentially expanding future born of an exponentially shrinking technology?
Who can say? Another projection for Brave New World Revisited 2x?