As of 2018, out of hundreds of thousands of IBM employees around the globe, only 295 have held the title of IBM Fellow – an exalted position within the company conferred on those with exceptional technical, scientific, engineering, or programming talent. The position was instituted in 1962 by Thomas Watson Jr, one of the most dynamic and charismatic leaders in American business history and son of Thomas Watson Sr, the founder of IBM. Since then, fewer than a dozen employees have been honored with the title each year. Once named a Fellow, an IBMer is relieved of routine duties and free to focus on areas of their choice. Just as society offers its saints the security of food and shelter so that they can continue to flower in their spirituality and spread the fragrance around, IBM Fellows are given all the funding and tools required to pursue their dreams without conditions, in the faith that these minds, left to ponder, brood, and experiment with their ideas, are capable of transforming our knowledge of the material world. Naturally talented, highly qualified, and intensely committed, the 295 Fellows have so far lived up to the promise of their brilliance. Among them, they have been awarded nearly ten thousand patents, secured five Nobel prizes, contributed thousands of seminal research papers on a variety of subjects, and sat on important committees shaping the direction of science and computer engineering. In short, the IBM Fellows are the cream of the crop. Last week, one such Fellow, now in his eighties, received in absentia (due to ill health) a prestigious accolade to crown his already overwhelming intellectual accomplishments. Dr. Robert H. Dennard, the man who pioneered the innovation behind the DRAM chip, was honored with the highest recognition in the computer chip industry – the Robert N. Noyce Award, named after the co-founder and first chief executive of Intel, the chip manufacturer.
Let’s understand Bob’s contribution and the context a little more.
Every once in a while, there appears an innovation or a discovery that changes the texture of knowledge, the direction of science, and the course of human progress. The names of Copernicus, Isaac Newton, Louis Pasteur, Charles Darwin, and Albert Einstein come to mind for their daring brilliance and originality of ideas. In the field of computers, those “Aha” moments have come within the last century and a half. The greatest challenge in the formative years of building computing machines was to find the right balance between size and processing capability. Old pictures of computers occupying whole floors of buildings, spewing out heat and unbearable noise, strike us as hilarious now, but when they were built that was the only architecture capable of achieving the purpose. From the heavyweight beginnings of Charles Babbage’s analytical engine, to the sophistication of powerful mainframes, to the plethora of miniature electronic devices of the modern day, it is amply clear that sizes have shrunk while capacity and processing power have increased exponentially. The memory chips at the heart of these devices provide the foundation for that progress. The central processor alone is useless if it does not have the legroom of memory. To fire a missile, launch a rocket, drive a car, load a Spotify song, watch a Netflix movie, or talk over the phone – any activity that involves dynamic loading and processing of data requires a segment of storage that can hold temporary data while it is worked on, and flush it out when the job is done. This seminal concept of volatile memory is known to everyone with even a rudimentary knowledge of computers as RAM, or Random Access Memory. When we buy a computer, we ask for more RAM. When we study a new mobile phone, we enquire about its RAM.
We may not understand the electronic underpinnings of how RAM works, but everybody who has ever worked with a computing device knows this truism for certain – that more RAM is synonymous with better speed. An elegant and simplified solution to the problem of building memory had to wait until the mid-1960s, when Robert Dennard, working quietly at the IBM research center in Yorktown Heights, New York, worked out the physics of the one-transistor memory cell. Scientists had long suspected that multiple transistors were not needed to hold and retrieve a bit of data – that if a memory circuit could be designed around just one transistor, the chip would shrink and processing speed would increase – but nobody had figured out how to build one. Bob was fascinated by this problem and spent long hours at his desk thinking about it. He intuitively knew that there had to be an elegant and simple solution, but it wasn’t manifesting itself clearly and with scientific certainty. That is how all great scientists feel when they encounter a problem that grips them. Bob awakened to his insight one evening after attending a lecture on microelectronics, when the speaker made a casual reference to a technique for compressing transistors into smaller chips. In Bob’s mind, the reference translated into “Why not just one?”. In a later interview he said, “It just fell into place that day.” After that moment came the painful process of unfurling the idea and laying it down systematically for the world to understand. Dennard’s discovery underpins the famous Moore’s law – the observation that the number of transistors on a chip, and with it computing power, doubles roughly every two years even as circuits shrink.
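Moore’s law is often summarized as transistor counts doubling roughly every two years. A back-of-the-envelope sketch makes the exponential concrete; the starting figure (the 2,300 transistors of Intel’s 1971 4004) is a widely cited historical data point, while the perfectly clean doubling cadence is, of course, an idealization:

```python
# Back-of-the-envelope Moore's law: transistor counts double every ~2 years.
# The base count (Intel 4004, 1971) is a historical data point; the exact
# doubling cadence is an idealization used only for illustration.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Idealized transistor count on a chip in a given year."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(round(transistors(1971)))  # 2300   -- the starting point
print(round(transistors(1981)))  # 73600  -- five doublings in a decade
```

A decade of doubling every two years multiplies the count thirty-two-fold, which is why shrinking the memory cell to a single transistor mattered so much: every transistor saved per bit compounds across the whole chip.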
Like every great discovery, Bob’s idea looks simple and self-evident once stated, but getting there required a deep understanding of microelectronics, mathematical rigor, the capacity to look at a problem afresh, and above all the ability to bring together several strands of thought to make a scientific breakthrough. Not easy to do.
In 1966, Bob Dennard distilled his discovery into an IBM patent notebook (a spiral-bound book given to key researchers), complete with the theoretical foundations and circuit diagrams that would eventually become the blueprint for DRAM – Dynamic Random Access Memory. For years, Bob’s invention was put on the back burner by his managers at IBM, who did not realize the far-reaching implications of the idea. In 1968, Bob was awarded the patent – one among the twenty-six patents he would acquire over the years. Despite the recognition, the patent continued to languish in the secure lockers of IBM for some more time before it eventually found its implementation. When it did, the idea transformed the computer industry: DRAM, which stores each bit of data as charge on a single capacitor gated by a single transistor, displaced the bulkier multi-transistor SRAM (Static RAM) cell and magnetic-core memory as the workhorse of main memory. The efficiency and simplicity of Bob Dennard’s solution to a nagging problem were staggering, and nothing short of genius. Little did anyone realize that this single innovation would open the door of computing to areas that would have seemed impossible. Every device that we hold in our hands and that processes data carries the creative imprint of Bob Dennard’s seminal work. DRAM alone is today nearly a seventy-eight-billion-dollar industry.
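The elegance of the one-transistor, one-capacitor cell – and the reason the “D” in DRAM stands for “dynamic” – can be sketched in a few lines of code. This is purely an illustrative toy model, not real circuit behavior: the class name, the leak rate, and the sense threshold are all invented for the example.

```python
# Toy model of a 1T1C DRAM cell: one bit stored as charge on a leaky
# capacitor, gated by a single access transistor. All constants here
# (leak rate, threshold) are illustrative, not physical values.

class DramCell:
    THRESHOLD = 0.5  # the sense amplifier decides 1 vs 0 at this level

    def __init__(self):
        self.charge = 0.0  # normalized capacitor charge, 0.0 .. 1.0

    def write(self, bit):
        # The access transistor connects the capacitor to the bit line,
        # charging it fully (1) or draining it (0).
        self.charge = 1.0 if bit else 0.0

    def leak(self, steps=1):
        # The capacitor loses charge over time: this is why DRAM is
        # "dynamic" and must be refreshed periodically, unlike SRAM.
        for _ in range(steps):
            self.charge *= 0.9

    def read(self):
        bit = 1 if self.charge > self.THRESHOLD else 0
        # A real DRAM read is destructive; the sense amplifier writes
        # the detected value back, which doubles as a refresh.
        self.write(bit)
        return bit

cell = DramCell()
cell.write(1)
cell.leak(steps=3)   # charge decays 1.0 -> 0.729, still above threshold
print(cell.read())   # 1  (and the read restores the charge to 1.0)
cell.leak(steps=10)  # 0.9**10 ~ 0.35, below threshold: the bit has decayed
print(cell.read())   # 0  -- without periodic refresh, the data is lost
```

The same economy is the design’s trade-off: because charge leaks away, every cell must be re-read and rewritten on a fixed schedule, a small price for replacing several transistors per bit with just one.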
Bob grew up in a small town in Texas, and his education was conventional. There is nothing extraordinary about his childhood to mark him out for such seminal creative work. After his doctorate from the Carnegie Institute of Technology (now Carnegie Mellon University), Bob joined the vibrant IBM community. The Watson family had pegged their vision to the blossoming computing business, and Watson Jr was perspicacious enough to steer his father’s company from typewriters and tabulating machines to computing machines. Watson Jr recognized that IBM needed solid research, and men and women with the capacity and acumen to drive such research. The Yorktown research center was the cynosure of IBM, and not surprisingly the first IBM Fellow was Frank Hamilton, the engineer who had built the IBM 650. Those were glorious years of revolutionary innovations and discoveries in the world of information technology, and the IBM research facility at Yorktown was the incubator for such ideas. It still is. So many great innovations have sprung from the glass-covered building, and so many brilliant minds have found the purpose of their lives within those walls, that IEEE, the international body of electrical and electronics engineers, has declared the period between 1960 and 1984 at the Yorktown center a milestone in human scientific endeavor. Bob joined IBM as a young researcher, and he spent his professional life within the hallowed walls of the Yorktown center, remaining an integral part of IBM’s commitment and passion for pure research. There are few awards, citations, or honors that Bob has not received. A Fellow of the prestigious IEEE, a recipient of the National Medal of Technology, and the holder of innumerable honors from virtually every academic and scientific body across the globe, Bob is one of the living embodiments of excellence in science and engineering. The Robert N. Noyce Award given last week is one more layer of icing on an already towering cake of honors.
The next time you open your phone, laptop, or any other device and watch your favorite apps load in no time – pause for a second, and send a wish and a bow of gratitude to Bob. His discovery made this comfort and progress possible.
God bless…
yours in mortality,
Bala
Very insightful and descriptive account of one of the most important inventions of the 20th century! Often the most genius ideas don’t come from an aspiration for fame and fortune, but just from a keen desire to change something for the better.
Thank you Ashvin. Yes, I agree. Also, pls accept my gratitude for getting this site up along with Aniruddh. It wouldn’t have happened without your initiative.
🙂 Well, after reading your articles on Facebook, I knew they needed a website that would hopefully broaden their reach to many other interested readers who’d find them engrossing.