GreyMatter

Interesting Times

Many of us are familiar with Moore’s Law, but not many will be aware of the twelve-year cycle of evolution in computing. Rajesh Jain, in an insightful post on the subject, explains:

1945 saw the invention of the world’s first computer, the ENIAC (Electronic Numerical Integrator Analyzer and Computer). PBS.org has more: “ENIAC, with its 17,468 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays, and 6,000 manual switches, was a monument of engineering. The project was 200 percent over budget (total cost approximately $500,000). But it had achieved what it set out to do. A calculation like finding the cube root of 2589 to the 16th power could be done in a fraction of a second. In a whole second ENIAC could execute 5,000 additions, 357 multiplications, and 38 divisions. This was up to a thousand times faster than its predecessors…ENIAC’s main drawback was that programming it was a nightmare. In that sense it was not a general use computer. To change its program meant essentially rewiring it, with punchcards and switches in wiring plugboards. It could take a team two days to reprogram the machine.”

In the late 1950s, IBM switched from vacuum tubes to transistors. VisionEngineer.com writes: “Vacuum tubes are large, expensive to produce, and often burn out after several hundred hours of use. As electronic systems grew in complexity, increasing amounts of time had to be spent just to ensure that all the vacuum tubes were in working order. Transistors, in comparison, rarely fail and are much cheaper to operate.” In 1957, IBM also introduced Fortran (FORmula TRANslation), a programming language based on algebra, grammar and syntax rules, which went on to become one of the most widely used computer languages for technical work. These twin breakthroughs made computers reliable and easily programmable.

In 1969, IBM changed the way it sold technology. It unbundled the components of hardware, software and services, and offered them for sale individually. This is what gave birth to an independent software industry. 1969 also saw the setting up of the Arpanet, which later grew into the Internet. 1971 saw the birth of the first general-purpose microprocessor, the Intel 4004. Unix began its life around the same time, as did the programming language C. 1970 was also the year when Ted Codd at IBM introduced the theory of relational databases. Taken together, these developments in semiconductors, software and networks laid the foundation for modern-day computing.

1981 saw the launch of the IBM Personal Computer. From the IBM archives: “[It] was the smallest and — with a starting price of $1,565 — the lowest-priced IBM computer to date. The IBM PC brought together all of the most desirable features of a computer into one small machine. It offered 16 kilobytes of user memory (expandable to 256 kilobytes), one or two floppy disks and an optional color monitor. When designing the PC, IBM for the first time contracted the production of its components to outside companies. The processor chip came from Intel and the operating system, called DOS (Disk Operating System) came from a 32-person company called Microsoft.” The rest, as they say, is history. IBM’s decision to source the two key components from external suppliers led to the modularisation of the computer industry, and the emergence of Intel and Microsoft as its two superpowers. In 1982, Time magazine named the personal computer its “Machine of the Year”.

The period 1992-94 saw many key developments that have shaped our present. Microsoft launched Windows 3.1, which rapidly became the standard desktop interface for millions. Around the same time, Intel started shipping its Pentium processors. The duo’s dominance led to the coining of the phrase “Wintel”. SAP launched its enterprise software suite, R/3, which popularised the client-server paradigm in enterprise computing. The Internet’s commercialisation and proliferation got a major boost with the launch of Mosaic, a graphical web browser based on the HTTP and HTML standards, by Marc Andreessen and his team at the National Center for Supercomputing Applications in the US.

Computing has come a long way since the development of the first computer in 1945. Even though innovation has been almost continuous, my observation is that every twelve years or so a paradigm shift comes along that rings out the old and rings in the new.

So, the next computing Kumbh Mela should happen sometime soon (or is already underway). What is it going to be? Microsoft’s Longhorn? Google as the supercomputer? Cellphones as always-on, always-connected computers? Utility computing? Wearable computers? Something unseen as of today…?

While the Microsofts and IBMs of the world are betting billions on their chosen technologies, no one can really say what the next computing revolution will be. But, going by Rajesh’s calculations, it should be just around the corner.

As a wise man once said, “May you live in interesting times!”