Bandwidth Glut - Fearing the Worst
11/29/00
By Mark R. Langner, Managing Director, Senior Research Analyst; Todd M. Fernandez, Associate Analyst; Bert T. Bangayan, Associate
Epoch's Viewpoint
We do not foresee the long-term bandwidth oversupply, or "glut," now predicted by many industry analysts. Rather, we think investors' fears concerning an imminent bandwidth glut in the long-haul and metropolitan portions of the network are currently being fed by broader pessimism in the financial markets. We believe the fundamentals of large, well-capitalized data-network builders remain intact. However, not all communications services networks are built equally. Networks built around the needs of "yesterday's" market will be hard pressed to compete effectively against these new data-focused network builders. We continue to foresee a rational market for bandwidth pricing over the longer term that will justify network builders' current capital investments in this critical portion of the broadband and IP data services delivery chain. The return on investment (ROI) is likely to be highest for network builders that focus primarily on data and have architected their networks around open standards so that they can be continually upgraded.
Key Points
Bandwidth Glut Concerns Are Overblown: We believe a glut of bandwidth does not currently exist in either the long-haul or metropolitan portions of the network.
High-Capacity Bandwidth Is Not a Commodity: Data connectivity, by definition, lacks some of the key traits that define a commodity (liquidity, accessibility, price visibility, etc.). Market pundits have erroneously associated price compression with additional capacity, and have drawn the mistaken conclusion that bandwidth has achieved a state of "commoditization."
Economics Will Dictate a Rational Pricing Market: Deployed fiber does not equal operational bandwidth. The costs associated with making fiber operational are significant. The introduction of additional capacity in the network will be dictated by networkers'/users' return-on-investment thresholds. This economic fact necessitates a rational market for bandwidth pricing where supply and demand continually seesaw based on broader market forces.
Demand To Keep Pace with Supply: Evolving macro trends in networking systems and application development, as well as in the broadband wireless and broadband services markets, will continue to drive bandwidth consumption at a pace that will equal or surpass current capacity forecasts.
The Future of Bandwidth
We do not foresee a long-term bandwidth oversupply, or "glut," now predicted by many industry analysts and market pundits. Rather, we believe investors' fears concerning an imminent bandwidth glut in the long-haul and metropolitan portions of the network are being fed by broader pessimism in the financial markets. Rather than attempt to put forth detailed projections of future bandwidth supply and demand (we could cite projections that support either side of the argument), we will look at the way sentiment concerning network infrastructure builders has evolved over the last 18 months and explain the fundamentals underlying network providers' service value. We hope to give investors a reference point from which they can apply their own unprejudiced analyses concerning the future market for bandwidth and networking service providers.
Where We Came from and Where We Are
In our opinion, fears about a future bandwidth glut are central to the recent downward turn in the public valuations of next-generation network builders such as Level 3 Communications (LVLT), Global Crossing (GBLX), MetroMedia Fiber (MFNX), and Williams Communications (WCG). These concerns are not unfounded. But little thought was given to them in early 1999, when the number of Internet users was skyrocketing, e-commerce was exploding, and both industry and research analysts were predicting the coming of innumerable bandwidth-hungry Internet applications. The crash of the Victoria's Secret fashion-show website and the inability to download the Starr Report (whether or not these system deficiencies were linked to a lack of bandwidth capacity) made infrastructure builders such as Level 3 look like saviors to public investors. In a knee-jerk reaction, the public markets poured billions of dollars into the coffers of these aggressive infrastructure builders to support their business visions.
The recent turn in investor sentiment concerning first-generation dot-coms, business-to-business commerce enablers, application service providers, and voice-based competitive local exchange carriers (CLECs) has subdued the previously unbridled enthusiasm for a bandwidth-hungry world. Network builders, however, have continued to pursue their aggressive infrastructure deployments. While analysts concede that the Internet is not going away anytime soon, investors, looking at the amount of money currently being spent and the infrastructure coming online over the next 12 months, have become increasingly gun-shy. The motto "build it, and they will come" has changed to "build it, and will they come?" Such fears are not new (articles on a bandwidth glut started appearing in late 1998), but pessimism in the industry and financial markets and increased attention from investors and the press have fueled them.
Bandwidth Glut -- Throwing the Bandwidth Out with the Bath Water
We do not believe there is going to be a long-term oversupply of bandwidth. In researching this report, we revisited much of the analysis of this subject over the last 18 months and find it amusing that research written in early 1999 looks very similar to research being written today. Most use trite analysis or apples-to-oranges comparisons.
For capacity projections, most analyses simply take the number of fiber miles currently being deployed and assume a certain level of capacity based on future developments in dense wave division multiplexing (DWDM) technologies. For demand projections, hypotheticals are used, such as assuming a throughput rate for every U.S. household based on the size of current streaming-media files. Usually, these hypotheticals end with a curt statement such as "with the capacity assumed by these fiber deployments, this would allow for 1,000 billion simultaneous streams of the TV show Charlie's Angels!" Most cite the continual decline in bandwidth pricing, often falsely presenting the falling cost of Internet access or DSL connectivity as evidence that bandwidth is becoming a "commodity."
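To make concrete why such back-of-envelope comparisons are so sensitive to their inputs, here is a minimal sketch of the style of calculation described above. Every figure in it is an invented placeholder for illustration, not an Epoch estimate or an industry number:

```python
# A hypothetical "capacity vs. demand" back-of-envelope calculation.
# Every number below is an invented placeholder for illustration only.

# Supply side: capacity of a single long-haul route cross-section
fibers_per_route = 96        # fiber strands in one conduit (assumed)
wavelengths_per_fiber = 80   # DWDM channels per lit fiber (assumed)
gbps_per_wavelength = 10     # e.g., OC-192 per channel (assumed)

route_capacity_gbps = fibers_per_route * wavelengths_per_fiber * gbps_per_wavelength

# Demand side: one modest stream to every U.S. household (assumed)
us_households = 105_000_000
stream_mbps = 1.5            # per-household streaming rate (assumed)

demand_gbps = us_households * stream_mbps / 1_000

print(f"one route cross-section: {route_capacity_gbps:,} Gbps")
print(f"all-household streaming: {demand_gbps:,.0f} Gbps")
# Small changes to any assumed input swing the comparison by orders of
# magnitude, which is why such projections can "prove" either side.
```

Doubling the assumed channel count or halving the assumed per-household rate flips which side of the glut argument the arithmetic appears to support.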
Associating local access or Internet connectivity pricing with general bandwidth pricing (i.e., price/bit per second) at the core of the network is similar to comparing the price of oranges to orange juice -- same denominator, different product. Second, declining pricing is not a defining characteristic of a commodity, as the term is defined in economic theory. Oranges and crude oil are commodities, and yet their prices rise and fall based on the interplay of supply and demand. At times, producing and selling these commodities is very profitable; sometimes it is less profitable. Commodities are defined not by the existence of continual price decline, but by the fact that there is full visibility into their pricing (on both a current and future basis) and that individual units can be freely exchanged without reducing the value of the commodity. Such price visibility and market liquidity allow for the trading of futures contracts on anticipated price movements, as well as for a liquid exchange market in these commodities.
By these economic definitions, bandwidth or long-haul data transit cannot be considered a commodity. Visibility into pricing is extremely limited, as evidenced by the fact that most (if not all) bandwidth deals are struck behind closed doors. Bandwidth exchanges, which promise to create a liquid market for raw bandwidth or capacity across specific routes, are currently in a rudimentary stage of development (Enron, one of the most aggressive bandwidth traders, is said to have executed only 50 trades in the first nine months of 2000). Also, bandwidth capacity is spotty (i.e., connectivity is as much about location as it is about throughput). Certain routes are less transited or have fewer competitive providers than others and are therefore more expensive. This is not the case with most commodities, whose prices are not tied to location. Finally, and most importantly, pricing markets for goods and services are (usually) rational. Irrationality may have pushed Internet stocks to unreasonable levels, but for the most part, the push-and-pull between supply and demand that enables price discovery tends to operate in a rational fashion. That said, investors must keep in mind that bandwidth deployed is not bandwidth available. Raw (dark) fiber accounts for less than 80% of the total cost of deploying an operational network; lighting dark fiber requires millions of dollars in electronics (switches, routers, etc.). Therefore, we believe additional capacity is more likely to be brought online as these networks reach full capacity and as the capital required to fund these build-outs becomes more readily available to network operators and service providers.
Long-Term Supply vs. Demand

[Chart: long-term supply vs. demand. Source: Epoch Partners]

Take the recent behavior of the financial markets as an illustration of how bandwidth differs from a commodity. Depressed public valuations in the equity and debt markets are telling us that investors have concerns regarding the long-term value of the assets being deployed by network builders and the ability of these assets to produce an acceptable return on investment. One reason behind this shift, as stated earlier, is the concern regarding an imminent bandwidth glut (too much supply = reduced pricing = reduced ROI). This has restricted companies' access to the capital markets to fund enhancement of the network assets that have already been deployed (i.e., dropping more electronics on these networks to create more capacity on certain routes or increasing capacity on fibers already lit). As a result, many carriers and network builders have said they will cut investments in these technologies for the time being. In a rational pricing market, we would anticipate (although there has been no evidence of this occurring yet) that this dynamic will surface in wholesale bandwidth pricing in the near to mid-term if access to capital remains constrained. As depicted in the chart above, the result is a roller-coaster effect between supply and demand. This roller-coaster effect, while unlikely to occur as smoothly as depicted above, is more likely to occur than a massive initial gap between capacity and demand that would trigger a disruptive pricing dynamic.
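The roller-coaster dynamic described above can be sketched as a toy simulation in which demand grows steadily while capacity arrives only in discrete lumps once utilization justifies the capital outlay. All parameters below are invented for illustration and model no real market:

```python
# Toy model of the supply/demand "roller-coaster" in bandwidth capacity.
# All parameters are invented for illustration; they model no real market.

demand, capacity = 100.0, 150.0  # arbitrary starting units
utilization_trigger = 0.85       # build-out threshold (assumed)
build_increment = 100.0          # capacity added per build-out (assumed)
growth = 1.10                    # 10% demand growth per period (assumed)

utilization_history = []
for period in range(12):
    demand *= growth
    if demand / capacity > utilization_trigger:
        capacity += build_increment  # new capacity comes online in a lump
    utilization_history.append(round(demand / capacity, 2))

print(utilization_history)
# Utilization climbs, drops sharply when a build-out lands, then climbs
# again -- a seesaw rather than a runaway glut or a runaway shortage.
```

The point of the sketch is qualitative: as long as build-outs are gated by a return threshold, utilization oscillates within a band instead of collapsing into a permanent oversupply.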
Future Demand -- Filling Up the Hole of Bandwidth Glut
Even if bandwidth does not fit the definition of a commodity and maintains a rational pricing dynamic over the long term, the question remains: What data will fill all of these fat pipes and drive future demand? We maintain that, over the long-term, demand for bandwidth (at both the core and the edge of the network) will keep pace with or even outstrip available bandwidth capacity. To support these claims, we are not going to attempt to formulate our own forward demand-versus-supply projection scenario (which would be likely to be built upon stretched and flimsy assumptions), or simply quote third-party research (since we could just as easily cite research that confirms either side of the argument). Rather, we will point to some of the macro trends occurring in communications services and networking technologies development that not only predict, but necessitate, a networking infrastructure whose capacity must continue to scale as demand for bandwidth continues to accelerate. These important macro-level trends include:
"The Hollowing Out of the Computer":Eric Schmidt of Novell is said to have first used this term in 1995 to describe a networked future where the processing and storage power of the computer was externalized from its centralized encasements and placed onto wide area networks (WAN). Traditionally, limited throughput and the high costs of deploying a WAN dictated that data remain within centralized local area networks (LAN). This significantly impeded how data was leveraged across different applications, users, and physical locations. Currently, more than 90% of all corporate data resides in private or LAN-based networks; this severely limits how this data can be leveraged, and who can do the leveraging. Distributive networking and computing architectures, in comparison, allow for greater interaction with applications and data across larger user bases, extended scalability of applications and processing power, and increased networking redundancy. Wide-area (or public) broadband networking is the key element to making such distributive computing a reality. With fast connectivity across wide networks to diverse computing environments and disparate end users, interactivity between applications and end users will catapult to levels of customization and breadth never possible in the LAN-based worlds of the mainframe or client-server. In such a distributive computing environment, developers will write applications and engineers will build new networking systems to take full advantage of inexpensive, widely available bandwidth. As more systems, applications, and end-users are placed onto these wide-area networks, Metcalfe's Law states that each new element not only increases the value of that network to all of its end-users, but also increases the amount of data demanded by each user. Service providers that benefit, alongside the data transport poviders, from this trend include public companies such as Akamai, Storage Networks and Digital Island. 
Private companies that are likely to benefit include: Scale8, Ensim, Centrata, United Devices, Cidera, and iPass.
Development and Adoption of Network-Based Applications: The amount of data associated with a new generation of applications that reside within wide-area networks is likely to grow rapidly in an era of proliferating and inexpensive bandwidth. These network-based applications will far surpass service providers' early attempts to simply "webify" existing enterprise-based applications. Rather than serving traditional applications from centralized systems over broadband connections (as is the case with the first-generation application-service-provider model), the elements that make up these applications and the data behind them will be pushed out to the edges of the network. Application delivery will be customized for end-user consumption based on specific business logic or user specifications -- thereby taking application interactivity from a one-to-many to a many-to-many dynamic. Furthermore, consumer-oriented network-based applications, now limited to basic Web-based content delivery and email, are likely to proliferate. Service providers that benefit, alongside the data transport providers, from this trend include public companies such as Akamai and Critical Path. Private companies that are likely to benefit include SmartPipes, Aventail, CoreExpress, Jamcracker and AppStream.
Wireless Access and Wireless Broadband: The continued proliferation of wireless personal digital assistants (PDAs) and the movement toward third-generation (3G) broadband wireless capabilities will necessarily increase the amount of data transported on land-based broadband networks. In contrast to wireline networks, which are entering an era of capacity abundance, wireless networks suffer from spectrum and capacity limitations. Therefore, to most efficiently utilize their valuable spectrum and airtime, wireless network and application providers will be forced to leverage land-based broadband networks to strategically push data and applications out to the edge of broadband networks, where they can be delivered locally over wireless networks to end-user devices.
Internationalization: The deployment of broadband network infrastructure in Europe and Asia, the increasing need for corporations to diversify internationally, and the overall globalization of commerce and culture will continue to drive data consumption worldwide. Currently, more than 60% of European and Asian Internet traffic transits through the United States. This will shift dramatically as the broadband infrastructure on these continents is built out, as content becomes locally customized, and as corporations extend their private networks to support distributive application delivery or global data mirroring.
History Is Likely to Repeat Itself: Technology paradigms tend to repeat themselves. Again and again in technology circles, we have seen demand for processing power, storage capacity and networking throughput exceed early demand projections. A key driver behind this occurrence is the recurring relationship between falling unit prices and increased consumption of functionality (processing power, storage capacity, data throughput), or what is known as "price elasticity" of demand. A Federal Communications Commission (FCC) study of bandwidth price elasticity revealed that for every unit price drop, sales increased by an additional unit -- a one-to-one ratio. Since the late 1980s, according to Barron's, a 1% decline in the cost of processing power has resulted in a 2% increase in demand. According to Dataquest and Epoch Partners estimates, retail pricing for local access T-1s has dropped more than 70% since 1994, while the number of T-1s in service has increased 177%. We believe this dynamic among bandwidth pricing, deployed capacity and demand will perpetuate itself, thereby accelerating bandwidth demand despite increasing supply.
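Using the T-1 figures cited above, the implied price elasticity of demand can be computed directly. This is a rough arc-style calculation for illustration, not Epoch's formal methodology:

```python
# Implied price elasticity of demand from the cited T-1 figures.
price_change = -0.70    # retail T-1 pricing down ~70% since 1994
quantity_change = 1.77  # T-1s in service up ~177% over the same period

# Elasticity = percentage change in quantity / percentage change in price
elasticity = quantity_change / price_change
print(f"implied elasticity: {elasticity:.2f}")  # prints -2.53
```

An absolute elasticity greater than 1 means total spending can rise even as unit prices fall, which is the self-perpetuating dynamic the paragraph describes.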
We believe these five key macro trends set into motion a recurring and accelerating cycle of bandwidth and data consumption. It will be difficult, if not impossible, to predict how the actual metrics of supply and demand will play out over time. Broader economic events will have a strong impact on the near-term interplay between supply and demand. What we are sure of, however, is that these long-term trends support a thesis that calls for the continued proliferation and consumption of bandwidth and a stable and rational pricing model for backbone transit services over the longer term.
Summary
If investors agree with the central theme of this piece, what should their investment strategy be in today's market? Even with depressed valuations across the board for communication service providers (including those that bill themselves as "data oriented"), we do not believe investors should simply "buy the lot." In our opinion, rather, investors would be best served seeking out providers that are well capitalized and that focus primarily on data-centric network building.
The day of reckoning will soon come for legacy carriers that are currently "jimmying" their voice-centric networks to meet increasing demand for data connectivity. However, even if one believes in the value of next-generation networks over their legacy brethren, access to capital in the public markets may remain constrained for some time. Therefore, investors should stick with data-network builders that are well capitalized now: Level 3 Communications, Enron Communications, and 360 Networks fit this description. Companies to avoid would include the integrated voice-oriented players that are attempting to layer data functionality over their primarily voice networks, or that run networks built on older technologies and architected when integrated telecom ruled the day. These include companies such as WorldCom, Sprint, and AT&T. We would list Qwest, Williams and Broadwing as "tweeners" -- companies that have newer technologies, and therefore are not as exposed as the traditional telecom players, but that have architected their networks to pursue a more old-line, integrated telecom-provider business model.
The information contained herein is based on sources believed to be reliable but is neither all inclusive nor guaranteed by Epoch Partners. Opinions, if any, reflect our judgment at this time and are subject to change. Epoch Partners does not undertake to advise of changes in its opinion or the information. Epoch Partners may perform or seek to perform investment banking services for the issuers of securities which are the subject of our Research. Most of the companies Epoch Partners follows are emerging growth companies whose securities typically involve a higher degree of risk and more volatility than the securities of more established companies. The securities discussed in the Epoch Partners Research may be unsuitable for investors depending on their specific investment objectives and financial situation and needs. No report included in the Epoch Partners Research is a recommendation that any particular investor should purchase or sell any particular security in any amount or at all and is not a solicitation of any offer to purchase or sell from or to any particular investor. For additional information that may be available on the securities mentioned, please contact Epoch Partners.
This document has been published in the United States for Residents of the United States. Copyright 2000, Epoch Securities Inc. All rights reserved. Member NASD/SIPC.