Law Office Computing™
The Impact of Bad Information in the Information Age
The information age has been built upon sub-miniature transistors etched into the silicon chips that process information in computers. In 1965 Gordon Moore, who co-founded Intel with Robert Noyce, made a remarkable prediction: that transistor density on those chips would double every eighteen months to two years. Over the more than thirty years since, his prediction has proved amazingly accurate. Indeed, it has been so accurate that it is no longer called a prediction but rather "Moore's Law," a name that ranks it with Newton's laws of motion and the principles of thermodynamics. Moore's Law is founded in fundamental principles of physics and in the complexity of building the facilities, called "fabs," in which microchips are manufactured.
Today, the main potential limit to Moore's Law is the barrier created by the physical size of the circuits needed to make the transistors in the chip function. Under current technology, the smallest circuit possible is about 0.25 microns wide. IBM has recently announced the development of a copper circuit that can be created at 0.20 microns, and it, along with the other chip manufacturers, hopes soon to be able to build copper circuits at the 0.10-micron level, long the holy grail of chip fabrication. For comparison, a human hair is about 100 microns thick. We are talking about putting one thousand circuits side by side in the width of a human hair, which takes the concept of thin circuits close to the physical limit. If they can do that someday, IBM expects to be able to maintain Moore's Law for a few more generations of the current chip fabrication technology. IBM stock rose 4% on the news. Across the road at Intel, I have been told, people are working on ways of tricking a transistor into believing it is much bigger than it really is, much the way Lute Olson inspires his front line.
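The hair comparison is simple division. This little sketch, using only the figures quoted above (a 100-micron hair and the 0.25-, 0.20- and 0.10-micron circuit widths), shows where the "one thousand circuits" number comes from:

```python
# How many circuit lines fit side by side across a human hair?
# Figures from the text: a hair is roughly 100 microns wide; today's
# circuits are about 0.25 microns, IBM's copper circuits 0.20, and the
# hoped-for next step 0.10 microns.

HAIR_WIDTH_MICRONS = 100.0

for circuit_width in (0.25, 0.20, 0.10):
    lines_across = HAIR_WIDTH_MICRONS / circuit_width
    print(f"{circuit_width:.2f}-micron circuits: {lines_across:.0f} across one hair")
```

At 0.10 microns the answer is exactly the one thousand circuits the column describes.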
"If they can make a transistor believe it is really four times as powerful as it really is, Moore's Law may be broken," says an industry source. But in the short term, even the IBM advance is not expected to change Moore's Law.
The implications of Moore's Law are profound; if it holds for the next decade, all the computer development of the last decade is but a prelude to the information technology century. Continuation of the current trend may lead to chips containing a billion transistors by 2005. The magnitude of this exponential expansion of computing capacity is demonstrated by noting that Intel's 8080 chip of the mid-1970s contained fewer than ten thousand transistors. And if in fact the Moore paradigm is broken and we soon see multiples of the current doubling-every-eighteen-months progression, we are in for a very, very wild ride that could take us to a billion transistors per chip by 2001 or before. Moreover, for reasons beyond the scope of this column, those billion-transistor chips will necessarily be smaller than their predecessors. The implications of such a mass-produced expansion of world computing capacity are hard to imagine.
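The power of that doubling progression is easy to see with a back-of-the-envelope calculation. The sketch below is purely illustrative: it assumes a round ten-thousand-transistor chip in 1975 (the column's mid-1970s baseline) and a doubling every eighteen months, then projects forward:

```python
# Back-of-the-envelope Moore's Law projection.
# Assumptions (illustrative, not Intel's actual roadmap): a baseline of
# 10,000 transistors in 1975 and a doubling every 1.5 years.

def projected_transistors(year, base_year=1975, base_count=10_000,
                          doubling_years=1.5):
    """Project a transistor count forward from a baseline by pure doubling."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1975, 1985, 1995, 2005):
    print(f"{year}: about {projected_transistors(year):,.0f} transistors")
```

Under these assumptions the projection sits around one hundred million transistors in the mid-1990s and crosses the billion mark comfortably before 2005, which is why the billion-transistor chip is a plausible milestone rather than science fiction.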
A graph of the changes in the number of transistors on processing chips over the last thirty years is in fact a road map to the Information Highway and its rapidly increasing speed limits. Increases in computational speed are directly related to the intensity with which the information technology revolution impacts society. The promise of continued exponential growth of physical computational capacity into the next century is simply astounding. Indeed, change is already coming with such blinding speed that the rate of change itself challenges the capacity of the human mind and of social structures to adapt. We do adapt, however, continually integrating technological change into our lifestyles. Faxes, microwave ovens, cell phones, the World Wide Web and automated tellers are but a few of the many new technologies that have quickly become central to our day-to-day affairs. Other new technologies will continue to emerge, swelling the tides of change that engulf us.
Most of those technological changes are but peripheral tributaries to the mainstream. A few, however, like the World Wide Web, constitute what the techies call a "paradigm shift," in which the context of some fundamental activity is totally changed. Just as the development of electronic banking, credit cards and automated tellers has changed the context within which we conduct our day-to-day financial affairs, some technological innovations propel rapid and dramatic paradigm shifts. The development of the microprocessor chip itself is the most compelling example.
A few weeks ago, the New York Times announced another paradigm shift. This time it was an announcement that Moore's Law had been overturned by the introduction of a new Intel memory chip. The headlines screamed that Moore's Law was dead, and the lead to the story proclaimed: "The world is no longer flat. Earth is no longer at the center of the solar system. And Moore's Law, a longstanding axiom of the computer age, is no longer true." The author went on to say that the new technology meant that instead of doubling computing capacity every eighteen months, we should now expect it to double every nine months or less. A casual reader of the New York Times story might have been moved to buy Intel stock immediately, because the story, if true, would mean that Intel had made such a quantum leap in chip manufacturing technology that it would quickly become the only remaining significant player in the microprocessor industry. In other words, if Intel had in fact developed a new fabrication process that allowed it to double the computing capacity of its microprocessors every nine months, it would surely wipe out all of its serious competition.
The truth, however, was a far cry from the screaming headline. Intel stock stayed essentially flat because the Intel development related not to microprocessors, which are the primary subject of Moore's Law, but to so-called "flash" memory chips such as those used in cell phones and other consumer products where non-volatile memory is required. Intel has developed a way to put twice as much non-volatile memory on those chips, so your cell phone can remember more numbers and your microwave oven can do more complicated gastronomical tricks. While this is a wonderful and important development, it has nothing directly to do with processor speed and computational power. Intel actually introduced the flash memory chip called "StrataFlash" in 1994, predicting that it would be a usable product in 1995.
Its ultimate introduction in 1997 indicates that, far from a breakthrough, Intel was actually two years late on its promised product introduction. Indeed, during that period other manufacturers such as Toshiba and Samsung developed similar and competing flash memory technologies. Finally, as I noted above, the subsequent IBM announcement of the successful use of copper circuits is potentially a much more significant development--maybe even a micro-paradigm shift. There, the market did respond with a significant uptick in IBM stock. Moore's Law, however, is not in doubt--at least for the moment--and IBM itself has said as much.
There is an important lesson to be learned from this journalistic snake oil. Information technology (IT) is in reality no different from any other important segment of our existence. Truth is central to our common understanding and in regard to most things we develop a healthy skepticism about shocking announcements. It so happens, however, that IT paradigm shifts occur with an unsettling regularity that tends to dispel the normal skepticism we should have. When we lose that skepticism we leave ourselves vulnerable to the hype and shell games that are all too common in the IT industry. When the misinformation comes from a source as venerable as the New York Times we are doubly vulnerable to deception.
None of that is to say that Moore's Law will not someday be broken. History tells us that must certainly be so--someday. But the breakthrough will come not from continuing the present chip fabrication technology but from some dramatic "paradigm shift" to a radically different way of fabricating processing chips. The probability is that the next revolutionary development will come from adapting biological or organic processes to computer intelligence. We may someday grow microchips the way we grow chickens, catfish and salmon. That shift in technology will have staggering implications for humankind and for moral philosophy. As Rick Derringer sang around the time Gordon Moore first made his famous observation: Hang on Sloopy, hang on!