How do some technologies get adopted and go on to become an infrastructure that the “community” depends on? What are some of the hallmarks of technologies that have gone on to widespread adoption, and conversely, of some that have fallen by the wayside despite technical brilliance? I will take a quick tour through the ages of the cell phone, the Internet, and now the Internet of Things, or IoT.
The idea for this blog post came to me in late April, from Bob Kahn’s talk and panel presentation at Purdue and my subsequent discussion with him.
Some common factors for widespread adoption
There appear to be some common ingredients in making this leap into becoming indispensable infrastructure. First, the technology has to be a good fit for the current times, i.e., it fits well within the larger technology ecosystem. But just as importantly, it has the ability to move with the times as the technology landscape around it improves. For example, the Internet worked well on a physical connection of a few kilobits per second and does so (with a little, but not much, adaptation) on the much higher network speeds of today. Second, a band of talented people (deep technical innovators as well as others) has to form, people who are willing to push the technology from the early exciting flush of new discovery through the grinding process of strengthening and standardizing it. It is only then that commercial interests take notice and provide the mighty push needed for widespread adoption. Third, the technology should not get walled off too early, such that only one innovator and her company gets to reap the benefits. Rather, there should be an open playing field, with technical expertise and grit being the price of admission. Finally, there is the major unknown of lady luck. Without her, an innovation that has all three of the other features can still fall by the wayside.
Another sign of our times is that the pace of technology adoption has sped up significantly over the years. This means that (some) technology goes on to become ubiquitous infrastructure faster, and that (most) technology fades into oblivion faster too. Take a look at the chart below by Nick Felton of the New York Times. As a telling instance, it took more than five decades for the telephone to reach 50% of households, beginning before 1900. It took less than five years for cell phones to accomplish the same penetration in the 1990s.
The cell phone

The cell phone and the cellular network are a classic case of an infrastructure that people rely on across the globe, and across the economic spectrum, from the rich elites of the developed world to the farmer in sub-Saharan Africa taking his wares to market.
The cell phone took a relatively long time to mature to the point where it could be widely adopted by the general population. The US military radios manufactured by Motorola during World War II can justly claim to be the forefathers of the cell phone, which Motorola again developed for the general population. However, at the steep price tag at which it was introduced, only the Joneses could afford it. It helped to have a steady paying customer in those early days who could keep driving the invention’s cost down to a point where you and I could venture out in droves to get it. That angel customer was the US military, whose steady stream of purchases of Motorola products (military radios, walkie-talkies, etc.) guided the path toward wide adoption.
The second important ingredient was the role of standards. Many of us are familiar with the terms 4G and 5G used with cell phones these days. The cavalcade of standards started early and stayed with us. The 1G analog telephony standard called AMPS was introduced by Bell Labs in 1983, and was followed by the 2G standard created by a broad coalition called the European Telecommunications Standards Institute (ETSI). The standards did not wall off the technology early: a few players came in initially (Motorola, Bell Labs, Nokia, Kyocera), tentatively, and then, as they showed the commercial potential, more players came in, and the standards laid out a reasonably even playing field.
This brings me to the third factor that helped with cell phone adoption: some good ol’ fashioned rivalry between two US players, Motorola and AT&T. They competed to reduce the size of the devices and their cost. In a bit of theatrical flourish, the reputed first cell phone call in a non-military setting, and not from a car phone, was made by Motorola Vice President Martin Cooper. And whom did he choose to call? In a brilliant demonstration of chutzpah, he called his counterpart at rival Bell Labs.
And in all this, surely we should not forget our debt to Star Trek, which, through its Communicator, inspired a long line of Motorola’s initial cell phone models.
The Internet

The ubiquitous Internet had humble beginnings, without aspirations of serving as the de facto means of communication across far-flung reaches of the globe. Most world-changing technology does. You can read a detailed, partly technical, and very approachable history of the Internet from its early days till the mid-1990s, as told by some of its inventors. I will not provide a blow-by-blow account, but will reflect on some of the factors that helped it become the universal infrastructure it is today.
First, the wonder that the Internet is did not happen in a day, or even a decade. It called for sustained public funding, and for imagination and perseverance from a group of smart technologists and program managers. Interestingly, the program managers were more often than not genuine members of the first category. Following the story of the Internet, it became clear to me that the urban legend of a visionary and proactive program manager making a profound difference to the story of technology is not an urban legend after all. In August 1962, J.C.R. Licklider of MIT discussed his “Galactic Network” concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA, starting in October 1962. While at DARPA, he convinced his successors there, Ivan Sutherland and Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept. Then there is the better-known DARPA Program Manager, Bob Kahn, who gave out three contracts to develop the TCP/IP protocol, the lingua franca of protocols for the Internet. He went on to author a paper on the protocol with one of his grantees, Vinton Cerf from Stanford. The paper appeared in IEEE Transactions on Communications in its May 1974 issue.
The public funding issue has given rise to some controversy over the ages — see for example [WWW] [WWW] [WWW] — but I come away convinced that it played a singularly important role in the development and maturation of the Internet. There were public universities front and center (UCLA, UCSB, Utah, UIUC), quasi-public entities (Stanford Research Institute), and public European entities (CERN). Where private companies played significant roles, and there certainly were several, such as BBN and Xerox PARC, they were funded quite significantly by public funding agencies (DARPA) or were in it seemingly without immediate profit motives (PARC).
Second, there was the belief that the walled-garden mode would not be appropriate for this technology. Growth from the niche ARPANET to the much more chaotic and widespread Internet was surprisingly painless. This happened in large part because the inventors of the ARPANET technology were willing to make it extensible, hence the “no-walled-garden-here” ethos. Whether it was noblesse oblige, an example of visionary thinking, or just plain chance, I do not know. But to speculate: I think the fact that most of the inventors did not have immediate commercial concerns would have helped them think long term, and they had the vision … so, ergo, visionary thinking.
This focus on extensibility had the desirable by-product of heading off damaging turf wars, though I am told, in private conversations with some of the authors of the ISOC article above who were there on day zero, that there were several. None, though, destructive enough. This level of kumbaya existed among university colleagues (today as well, we get along well for the most part), but happily it extended to funding agencies too. Thus DARPA’s ARPANET was adopted warmly by NSF when it started its investments in the NSFNET in 1985.
Third, there was a “killer app” identified early on, one that the technologists as well as the broad user base gravitated toward. The killer app was none other than the humble email. Early on, in 1972, when Bob Kahn gave a much-publicized demonstration of the ARPANET, an email program was unveiled by BBN. It allowed for people-to-people interaction, something that human beings have loved for all of their 150,000 years of existence (human existence, that is, not email’s). No matter what technology (or lack of it) is used to facilitate such interaction, we love it and cannot get enough of it. As an aside, email provides an example where famous first words can be quite handy. Ray Tomlinson of BBN, credited with writing the first email client software, sent a first message that he remembers was likely QWERTYUIOP. It was sent from Ray to Ray, across two machines that sat side by side, connected over the ARPANET.
The Internet of Things
This brings us to the current day, where technology is being shaped for the Internet of Things (IoT). It is a messy landscape today: little in the way of standards, usability ranging from bad to worse, and little regard for reliability, security, or privacy. Is there hope of moving ahead toward widespread adoption through this chaotic landscape?
Let us get back to the factors I identified near the beginning, which act as portents for a technology becoming widely adopted and cast as an “infrastructure,” and see how IoT measures up on them. First, IoT seems to be a good fit for the current technology ecosystem and is positioned to ride the curve of technology developments. Take, for example, the fact that sensors have dropped drastically in price, and that ubiquitous connectivity, even high-bandwidth connectivity, seems to be within reach. Second is the need for a band of talented people who want to create an open-garden ecosystem. I see that this is lacking in the IoT space. There is no cohesive band of visionary leaders, with participation from academic and commercial organizations, to drive the effort forward. Third is the trait that a technology must not get into the hands of one, or a few, monopolistic commercial organizations too early. In the IoT space, there seems little danger of that. In fact, the danger so far has been too much fragmentation. The issue is the lack of a starting framework from which to go from prototype to commercial production using the same set of tools (compilers, debuggers, etc.). Much of what is available consists of components that require highly specialized knowledge and skills to use, and they do not transfer easily across platforms. The DIY Maker community has its Arduino and Raspberry Pi boards for creating toy educational experiments, but it takes Herculean effort to turn them into something even nearing a stable product. Fourth is the killer app. The public at large has not warmed up to the idea of sensing everywhere and cognition drawn out of that ubiquitously sensed data, with privacy fears coming to the forefront. The mood has turned increasingly sour over the last year. So a killer app, or more likely a few killer apps, each appealing to a subset, but a sizable subset, of the population, is needed to make the transition to ubiquitous infrastructure.
The shape of things to come
We all hear of the peta, exa, zetta, … put another Greek prefix here … bytes of data being generated by IoT devices. We need someone to filter the data, since the overwhelming majority of it is useless, not worth the electric charge of the storage device it sits on. Then we need someone to organize it, and let the owners of the data make actionable decisions out of all that data. Rather than one Google organizing all the information of the world, I anticipate there will be one or a handful of players organizing the information, and several domain-specific winners, one (or a few at best) per domain, providing insights from that information.
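To make the filtering point concrete, here is a minimal sketch of one common approach, deadband (report-on-change) filtering, where a sensor reading is kept only if it differs from the last kept reading by more than a threshold. The function name, the trace, and the threshold are all illustrative assumptions, not from any particular IoT platform:

```python
def deadband_filter(readings, threshold):
    """Keep a reading only if it moved more than `threshold` from the last kept one."""
    kept = []
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            kept.append(value)
            last = value
    return kept

# A temperature trace that barely moves: most samples carry no new information.
trace = [20.0, 20.1, 20.05, 20.1, 23.0, 23.1, 23.05, 19.0]
print(deadband_filter(trace, threshold=0.5))  # -> [20.0, 23.0, 19.0]
```

Eight readings shrink to three while the shape of the signal survives, which is the kind of drastic, cheap reduction that would have to happen close to the sensor before the data ever reaches whoever organizes it.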
And then we come to the factor of lady luck. Does IoT have that to drive it from an inventor’s toy to widespread adoption? My crystal ball seems to be having an off day today and so I will have to wait for sunrise to share that answer.