bcr.com

How "Visionaries" Lead Us Astray
Volume 28, Number 10, October 1998, pp. 22-24
By Jim Sobczak
Only a few years ago, virtually everyone agreed that Asynchronous Transfer Mode (ATM) would replace existing local and wide area network architectures. Everyone was convinced high-speed data and video applications would emerge, and that they would require sizable increases in bandwidth.
So the "experts" predicted that ATM would become the ubiquitous end-to-end network solution we had apparently all been searching for. Not only did the trade press and consultants climb aboard the ATM bandwagon; telecom companies invested billions developing ATM products, and venture money spawned several startups.
But the ATM dream hasn't materialized--and it won't. While ATM's long-term role in wide area networks is still being debated, all but the most committed zealots concede that Ethernet has won the battle at the desktop. The seamless end-to-end networking architecture once touted as ATM's greatest strength is not going to happen. With ATM relegated to specialized situations, the absolutely "can't miss" market has missed.
And ATM isn't an aberration. Remember all the hype during the 1980s about Integrated Services Digital Network (ISDN) being the "next" communications panacea? Or the other sure bets from that era--satellite technology for business communication, voice/data PBXs, videoconferencing and the "Office of the Future"? Meanwhile, virtually none of the experts noticed the proliferation or impact of fiber optic communications and the Internet until they were upon us.
As an industry, we must ask ourselves how we could have been so wrong. Maybe more important, what should we do to prevent similar mistakes in the future?
Alvin Toffler Legacy

ATM and the other failed technologies mentioned above have one important theme in common: They're based on the conviction that an explosion in data, image and video applications is imminent. Many people share that conviction today, even though those applications, for the most part, still haven't arrived. Yet it's almost heresy to challenge what is becoming an article of faith within our industry: There is an insatiable demand for bandwidth.
I believe that visionaries such as Alvin Toffler are at least partially responsible for this predicament. They have convinced us that society is caught up in a firestorm of change, and that information technology in particular is the engine of that change. We're on the brink of a second industrial revolution, they claim--one based on the power of information, in which rapid change is the order of the day.
This philosophy permeates our industry, and it creates a mindset that easily accepts that applications requiring huge amounts of bandwidth will soon flourish. It also encourages businesspeople to act quickly so market opportunities will not be lost. The few dissenters are dismissed as not possessing the "vision" to grasp what is happening.
Market Studies Miss the Mark

But maybe, just maybe, the dissenters are right. Maybe the information "revolution" really isn't moving as fast as we're told. I'm convinced that the basic premise--that we are caught in a whirlwind of ever-expanding reliance on information, one that requires orders-of-magnitude increases in network capacity--is flawed.
Now, I realize that there are plenty of market studies supporting the need for higher-speed networks. There were, for example, numerous studies projecting that ATM would enjoy a rapidly expanding market.
But because almost everyone in the industry believed information requirements were exploding, how could those studies have turned out any differently? It is no wonder "hockey stick" forecasts have become a staple of our industry.
One reason market studies fail is that it's very difficult to obtain valid input from end users. Even in the largest companies, only a few individuals possess the blend of business and technology perspective needed to properly evaluate the impact of a new technology. These people must construct a business case by translating the technology's advantages into value to the organization and then securing approval from senior management. When market researchers attempt to contact these people, they're usually politely referred to someone further down the organizational chart--typically to a technologist who is as caught up in the hype as the researchers themselves. Talk about the blind leading the blind.
"Early Adopter" Fallacy Market studies also tend to focus on leading-edge users. To be fair, this makes some sense--after all, it's hard to conduct meaningful market research interviews with people who don't show any interest in the subject.
But the so-called progressive users may not be bellwethers of anything but themselves. There certainly were plenty of "early adopters" of ATM, but they can serve as market indicators only if their business-justification process resembles that of the buyers who follow.
That was not the case with ATM. Most early buyers were research companies, government agencies or universities--hardly "typical" corporate buyers. And while carriers were another important early market for ATM equipment, their purchases were based on the belief that a market would exist for public ATM services. Meanwhile, most network decision-makers in the business community have taken a wait-and-see stance toward ATM, perhaps because they have witnessed the failure of other surefire technologies.
House of Cards

So why did ATM fail to meet expectations? ATM is differentiated from other network architectures by its ability to provide quality-of-service guarantees. But applications that require such guarantees have not emerged--and that is the root cause of ATM's failure.
Videoconferencing, perhaps the application most sensitive to service quality, is a case in point. Though it has been commercially available in one form or another for almost 20 years, it hasn't caught on. While most major corporations use videoconferencing for specific applications, it has not achieved any mass appeal. Even in companies where those specific uses have been very successful, video has rarely spread beyond them--perhaps the most telling indication that videoconferencing will never have widespread appeal. Any suggestion that video will become commonplace reflects wishful thinking rather than discernible market forces.
Simply put, video's incremental value hasn't been enough to justify its associated costs; most of the time, the phone, facsimile and email work just fine. In the case of videoconferencing, the market has been sending a pretty loud message for some time now, but some folks refuse to listen.
Video's lack of success is crucial to understanding why ATM has failed. Videoconferencing, either as a standalone service or as part of multimedia applications, was to be a critical component in the new networking "paradigm." That is why ATM advocates still hold out hope for desktop videoconferencing and video servers.
Once you remove the necessity to support videoconferencing traffic, the underlying assumptions that have driven the need for ATM begin to fall like a house of cards. After all, it was video's high-bandwidth requirements, coupled with stringent delay constraints, that explained why we needed an ATM-like architecture capable of providing quality-of-service guarantees.
But when video is removed from the equation, the quality-of-service problem is greatly simplified. If multimedia applications that combine the use of voice and data do not materialize, and there is no indication that they will, the case for ATM becomes even weaker. All that remains is Voice over IP, which can probably be delivered by simpler service mechanisms than ATM provides. And, of course, there is some doubt whether Voice over IP will ever proliferate.
Guidelines for the Future

I think it is fair to ask why this industry has made so many mistakes; ATM, after all, is just the latest. Is it simply a matter of not understanding the market? That appears to be true, but it raises the question: What are we going to do differently to prevent this from happening again? The answer is to improve our ability to forecast market developments. But how?
End users are reluctant to express a negative view of a highly touted technology; nobody wants to be thought of as being "nonvisionary." So the research needs to become more sophisticated; it needs to understand and take into account the complexity involved when a user must create a business case to justify a technology and its associated risks.
An increasingly complex business case is accompanied by a decreasing level of certainty that the benefits can be achieved. The most complex justification occurs when benefits are derived by creating value to the organization. And this seems to be where we get into trouble, because many times the value won't be realized unless users adapt to a new way of doing things.
Put another way, the time it takes for a new technology to be accepted is directly proportional to the amount of change the end user must undergo. Mobile wireless has experienced rapid growth because it is relatively easy to adapt to a tetherless telephone. Multimedia and virtual collaboration applications are at the other end of the spectrum. They require users to implement fundamental changes in the way they conduct their business day.
In The Gorilla Game, a widely read book in the investment community on how to pick winners in high-technology stocks, the authors argue that some technologies "are all so discontinuous with the current technological infrastructure, require so much customization to be effective, and demand so much data administration, that it is virtually impossible for any of them to generate a mass market" (The Gorilla Game, HarperBusiness, 1998, ISBN 0-88730-887-2, p. 150).
The message is that we cannot intelligently make market projections without analyzing these issues. We must resist the temptation to join the "visionaries" and assume applications will flourish just because they are technically feasible.
Projecting a multibillion-dollar market is the same as saying customers will receive more than that in value. In the future, we need to devote more time to understanding the source of that value, as well as any obstacles that might stand in the way of its realization. Toffler also argued that the acceleration of change results in more uncertainty about future needs, but that lesson seems to have fallen on deaf ears so far.