First Monday

The many paradoxes of broadband

Abstract
The many paradoxes of broadband by Andrew Odlyzko

There is much dismay and even despair over the slow pace at which broadband is advancing in the United States. This slow pace is often claimed to be fatally retarding the recovery of the entire IT industry. As a result there are increasing calls for government action, through regulation or even through outright subsidies.

A careful examination shows that broadband is full of puzzles and paradoxes, which suggests caution before taking any drastic action. As one simple example, the basic meaning of broadband is almost universally misunderstood, since by the official definition, we all have broadband courtesy of the postal system. Also, broadband penetration, while generally regarded as disappointingly slow, is actually extremely fast by most standards, faster than cell phone diffusion at a comparable stage. Furthermore, many of the policies proposed for advancing broadband are likely to have perverse effects. There are many opportunities for narrowband services that are not being exploited, some of which might speed up broadband adoption.

There are interesting dynamics to the financial and technological scenes that suggest broadband access may arrive sooner than generally expected. It may also arrive through unexpected channels. On the other hand, fiber-to-the-home, widely regarded as the Holy Grail of residential broadband, might never become widespread. In any case, there is likely to be considerable turmoil in the telecom industry over the next few years. Robust growth in demand is likely to be combined with a restructuring of the industry.

Contents

1. Introduction
2. Making money in telecom the Yellow Pages way
3. The state of the telecom industry
4. Telecoms and nineteenth century railroads
5. Demand for telecommunications
6. What is broadband?
7. What is broadband good for?
8. Neglected opportunities
9. Telecom today and nineteenth century postal systems
10. Diffusion of new technologies
11. Continuing technological progress
12. Costs of connectivity
13. Financial markets and the arrival of broadband
14. A spoiler at the broadband party
15. Conclusions

 


 

++++++++++

1. Introduction

Broadband was the mantra of the dot-com and telecom booms, and is being offered as a magic elixir for curing the woes of the high tech sector. Once American businesses and households have high speed links to the Internet, the claims run, they will open up their wallets and buy new software and hardware from Cisco, Intel, Microsoft, and numerous other suppliers. That will then lead to a revival of the entire information technology (IT) industry and spur faster growth of the general economy. There is even a school of thought that claims the dot-com and telecom booms ended in crashes only because the telecom industry did not deliver broadband access to the home. Three samples of recent calls for action to deliver broadband quickly are [ 28, 36, 69].

One paradox, an inconvenient one for broadband enthusiasts, is that while there is extensive moaning and groaning about slow deployment of this new communication service, broadband was already available to the vast majority (well over 80 percent) of American households by 2001. Yet only about 10 percent of those households had chosen to subscribe by year-end 2001 [19, 23]. Thus, as is increasingly being recognized (cf. [19, 70]), it is adoption, not deployment, that is the issue. At year-end 2001 there were 12.8 million broadband lines in the U.S. according to FCC statistics [23] (with broadband defined as offering a speed exceeding 200 Kb/s in at least one direction). At the same time, there were 128 million cell phones in the U.S. [13]. The average monthly fees for wireless telephony and broadband are comparable (US$40-50). So here we had a population that was voting with its wallets 10:1 in favor of cell phones over broadband. Somehow all those promises of a glorious future of telecommuting, telemedicine, and distance learning failed to sway the citizenry, and they opted to spend their money on mundane voice calls over a narrowband channel with lousy quality. Mobility seemed to trump broadband.

 

Table 1: Millions of broadband subscribers in U.S. in December of each year.

Year    Number of subscribers
1999     2.8
2000     7.1
2001    12.8
2002    19.9

 

 

Table 2: Millions of cellular subscribers in U.S. in December of each year.

Year    Number of subscribers
1989     3.5
1990     5.3
1991     7.6
1992    11.0
1993    16.0
1994    24.1

 

Another broadband paradox that offers a different perspective appears when we look at these statistics more closely. While raw numbers do show a 10:1 edge for narrowband wireless over broadband at year-end 2001, if one considers the rate at which services are taken up, it appears that broadband is much more attractive than cellular. Tables 1 and 2 (based on [13, 23]) show that in the three years between year-end 1999 and year-end 2002, broadband advanced about as much as cellular did in the five years between year-end 1989 and year-end 1994.
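To make the comparison concrete, here is the back-of-the-envelope arithmetic implied by Tables 1 and 2:

\[
\frac{19.9}{2.8} \approx 7.1\times \quad \text{(broadband, year-end 1999 to year-end 2002, three years)},
\]
\[
\frac{24.1}{3.5} \approx 6.9\times \quad \text{(cellular, year-end 1989 to year-end 1994, five years)}.
\]

So broadband multiplied its subscriber base in three years by roughly the factor that took cellular five years.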

Broadband’s spread is therefore slow only by the standards of "Internet time," but then Internet time is a dangerous myth, one of the key culprits responsible for the Internet bubble [51]. Most technologies take on the order of a decade to diffuse widely, and by that standard broadband is doing quite well [43, 71]. (Of course it is not doing well by comparison with its advance in South Korea, say, but that is another question, related to another broadband puzzle.) Lower prices and more vigorous marketing would likely accelerate the spread of broadband, but is that a worthwhile use of limited resources?

Yet another paradox of broadband is that few people understand what broadband is. If we use a literal interpretation of the official FCC definition (a link with a speed of over 200 Kb/s in at least one direction), then we all have broadband (and have had it for decades) courtesy of the postal service! (This claim and its implications, as well as related questions, are discussed in Section 6, "What is broadband?".)

The aim of this note is to explore some of the numerous and varied puzzles and paradoxes of broadband. The basic questions that are addressed are:

Let me state upfront my personal preferences and beliefs. I am a broadband addict. After two decades of various types of access methods provided by my previous employer at home (starting with early 300 baud modems, and going on through ISDN and cable modems) I am currently paying out of my own pocket for two broadband links (DSL and cable modem). (In addition, I have even faster connections at the office, both wired and wireless.) On trips, I am happy to pay the US$10 per day fee for broadband access that some hotels charge. I believe (and can demonstrate) that broadband makes me much more productive and has changed my life for the better. Furthermore, I believe that eventually broadband will achieve very high penetration in our society. Historical evidence for services such as mail and the telephone shows that penetration and usage eventually reached far higher levels than even the most ardent early proponents predicted. However, this took time, and broadband may also require time. Furthermore, broadband is not necessarily the most important obstacle to economic development, and the case for huge public investments in it is questionable. As one example, my 0.6 Mb/s DSL and 1.5 Mb/s cable modem connections provide about equal performance, as far as my personal usage goes. The 10 Mb/s connection at the office is distinctly better, but I would give that up in favor of a larger screen with higher resolution, say a large LCD screen with 10 megapixels. That would improve my productivity to a greater extent. Should we therefore make improving screen technology a government priority? That is a key question. We can invest in broadband, but is that the most productive use of our resources?

Among the many paradoxes of broadband is that although there is a remarkable degree of unanimity that broadband is great and highly desirable, we don’t really know what it is good for, and in general are not willing to pay much for it. A May 2003 survey [5] shows that of U.S. residential Internet subscribers, nine percent are still using 28.8/33.6 Kb/s modems, and three percent 14.4 Kb/s ones! Another recent survey even shows that about 40 percent of the U.S. population is simply not interested in getting access to the Internet, whether narrowband or broadband, and this includes some people who have used the Internet in the past [64]. Historical precedents suggest that this fraction will diminish with time. On the other hand, other historical precedents suggest we should not expect this to happen very quickly.


How quickly we get broadband is likely to depend on the dynamics of the financial markets more than on regulatory moves or tax credits. And there are interesting developments, discussed in Sections 12, "Costs of connectivity," and 13, "Financial markets and the arrival of broadband," both technological and financial, that suggest that broadband may arrive sooner than is currently expected. The long-awaited convergence is finally arriving, and is likely to lead to intense competition. This might dismay investors, as it might lead to losses even from what seemed to be safe investments. However, it might produce a rush to deploy and market residential broadband.

The general expectation for a long time has been that the ultimate form of broadband connectivity is via fiber. Advances in photonics offer the prospect of essentially endlessly upgradeable bandwidth over the same physical fiber link. Commercial, government, and academic institutions are increasingly taking advantage of this capability. The only question seemed to be when fiber-to-the-home, FTTH, might become feasible. Cable modems and DSL have sometimes been regarded as just way stations on the way to FTTH. The thinking was that whichever carrier managed to get the highest broadband market share in an area would then have the resources and justification for deploying FTTH. Yet the prospects for FTTH, which appeared to brighten recently as a result of technical advances and announcements from most of the large ILECs, have been troubling for some public policy advocates. Fiber appears to be a real natural monopoly. Once connected to a home, it can carry all conceivable communications for the foreseeable future, and could preclude any competition. Hence the owner of that fiber would have a stranglehold over an increasingly vital artery of social, political, and economic life. However, as is outlined in Section 14, "A spoiler at the broadband party," it is quite possible that FTTH may never become widespread. Given the relative rates at which household bandwidth demand is growing, and at which wireless technology is advancing, there is a substantial probability that residential demands might be met by fixed wireless services. The scaling properties of wireless services are much more conducive to multiple competing carriers than are those of wireline services, where much of the basic infrastructure cost is independent of the number of customers. Therefore, should the fixed wireless solution dominate, the public policy concerns over fiber monopolies would be alleviated.


The puzzles and paradoxes of broadband are just that. I do not claim to be able to resolve them. The goal of this paper is to illustrate some of the basic issues and likely developments, in many cases through current statistics and historical analogies. The emphasis is on bringing out some unconventional views, not to present a comprehensive overview such as that of [41]. The concluding section discusses some possible courses of action for government. In general, I feel that few clear and practical recommendations can be formulated.

The discussion in this paper is very U.S.-centric, based on the particular constellation of carriers and technologies that dominate here. Many of the examples and arguments may be applicable in other countries, but not necessarily directly.

 

++++++++++

2. Making money in telecom the Yellow Pages way

The telecommunications industry is widely regarded as a disaster area, with widespread bankruptcies, including companies as large as WorldCom, and hundreds of thousands of job losses. Yet surprisingly high profits are being made in some sectors of this industry. A particularly intriguing example is that of phone directories. In the summer of 2002, in order to avoid bankruptcy, Qwest sold its directory division for about US$7 billion. This (almost exclusively) print directory business had annual revenues of only US$1.6 billion, but margins of 63 percent, and free cash flow of US$0.5 billion per year [7, 66]. Thus the financial performance of this old technology unit was outstanding, something that even Microsoft would not sneer at.

The financials of other ILEC Yellow Pages units are apparently almost as attractive as those of Qwest. This is so even in the face of vigorous competition from other print directory services (which are apparently quite often also profitable, even though nowhere near as profitable as those of the ILECs) and online information providers. Yet wasn’t the Internet supposed to obliterate all these businesses, and provide far better service (and save innumerable trees)? That this has not happened suggests several related thoughts that will be explored at greater length later. One is that technology almost invariably takes longer to rework society than its enthusiasts predict. Another is that profits are increasingly tied to intangibles such as customers’ inertia as opposed to concrete physical plant. This calls into question many arguments (including some presented later in this paper) about the advantages that lower costs offer to a new technology in penetrating a market. It may also explain, better than any conspiracy, the lack of interest that the ILECs have shown in competing with each other, in spite of their constant complaints that the rates set by regulators for UNE leasing offered new entrants unfair subsidies. (Their reluctance to compete was shown most graphically by SBC. As a condition for permission to acquire Ameritech, it promised to move into a number of other ILECs’ markets. It quickly reneged on this promise, once the merger was completed.) The important role of customer inertia might also help explain the failure of the CLECs [15, 74].

 

++++++++++

3. The state of the telecom industry

The telecom industry is widely regarded as being in a depression, and most of the discussion is whether it has hit bottom yet. A somewhat different perspective emerges when we consider actual statistics of different sectors of this industry. The real disaster has been in the telecom supplier sector, while the service sector as a whole has been pretty healthy, although subject to major internal shifts and upheavals.

Table 3 shows total U.S. telecommunications service revenues for the last few years. I first digress by discussing these statistics. They are derived from Table 3 of [22]. However, the statistics for 2001 in Table 3 of [22] include US$66 billion in services sold to other carriers for resale, so actual end user spending was only US$236 billion. On the other hand, the statistics of Table 3 in [22] exclude US$48 billion of various other types of revenue from reporting carriers, such as inside wiring maintenance, directory publishing, and Internet access. They also exclude revenues of cable TV companies for providing Internet access, as well as revenues of many other ISPs. Thus one can come up with other figures, and [61], for example, credits the U.S. with telecom spending of US$345 billion in 2001. For our purposes in this section, we are interested primarily in trends, and so just about any consistent set of statistics is adequate.

 

Table 3: Total telecommunications revenues in U.S., with data for 2002 preliminary.

Year    Revenue (US$ billions)    Increase (percent)
1995    190                       --
1996    212                       11.6
1997    231                        9.0
1998    246                        6.5
1999    269                        9.3
2000    293                        8.9
2001    302                        3.0
2002    294                       -2.7

 

The preliminary estimate for 2002 in Table 3 suggests that there was an actual decline in telecom service revenues in 2002, but by a mild 2.7 percent. This followed a year of mild three percent growth. However, the preceding few years had seen substantial increases. They may not have been quite up to the expectations of the era (when IT spending as a whole was often growing 15 to 20 percent per year), but they were above historical norms. Back in 1850, spending on telecommunications (primarily the postal service, with a pinch of the electric telegraph thrown in) in the U.S. was about 0.2 percent of GDP [49]. By 2000, that had grown to perhaps four percent (including traditional voice telephony, Internet, cellular, and parts of the postal system and of express delivery companies such as FedEx). Thus over the last 150 years, telecom spending has been growing about two percent per year faster than the economy as a whole. In the late 1990s, it grew even faster, and it could be that the decline to sub-par growth in 2001 and 2002 just corrects an overshoot.
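As a rough check on the two percent figure, the GDP shares just cited imply an average growth differential of

\[
\left(\frac{4\%}{0.2\%}\right)^{1/150} = 20^{1/150} \approx 1.020,
\]

that is, telecom spending grew roughly two percent per year faster than the economy over those 150 years.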

As a small digression, let us note that statistics in [61] show that telecom service revenues grew extremely rapidly in the late 1990s in most industrialized economies. In many countries this represented a period of catching up from a position where their telecom sectors were far smaller relative to the sizes of their economies than in the U.S., a move spurred by widespread deregulation and privatization.

Table 3 and the discussion above show that the telecom service provider sector has done quite well as a whole. However, there was a lot of turmoil. Some segments have collapsed (CLECs and the new long distance data carriers), others have been squeezed significantly (traditional long distance carriers), and wireless has boomed.


What really crashed in the telecom area is the supplier sector, represented by companies such as Alcatel, Ciena, Lucent, and Nortel. Capital expenditures by carriers had exploded in the late 1990s, growing almost 2.5x from 1997 to the peak in 2000, and have since returned to about their former level. This is shown graphically in the figure in [30].

The crash of the telecom suppliers and the dot-coms was accompanied by a collapse of the hope for effortless stock option riches. The telecom suppliers are likely to recover, although almost surely not to the elevated levels of the bubble, since telecom demand continues to grow, just as it has historically. Whether the telecom share bubble will recur is another question. An instructive comparison can be made with the early history of the railroads.

 

++++++++++

4. Telecoms and nineteenth century railroads

Figure 1 shows authorizations by the British Parliament for building new railroads, in miles of track, during a crucial formative period of the railway industry, 1833-1850 [4]. Not all the authorized railways were built. The authorizations represented in Figure 1 come to about 12,000 miles, whereas by 1850 only about 6,000 miles of railways were in service. Still, authorizations were cumbersome and expensive to obtain (cf. [59]), so they can be compared to IPOs in the U.S. during the late 1990s, and show the level of speculative excitement among investors.


Figure 1: Miles of railways authorized by British Parliament from 1833 to 1850.

The investment boom and bust cycles seen in Figure 1 are very pronounced. (There was even an earlier and smaller railway boom in the mid-1820s, discussed in [59].) However, this did not come from any volatility in demand. The industry continued growing, with steady increases in traffic and revenues throughout this period. By 1840, at the trough of the first bust visible in Figure 1, there were about 2,000 miles of railways in service in Britain, mostly short lines relieving local transportation bottlenecks. (Moreover, canal traffic, along with horse transport, continued growing vigorously, cf. [59].) By 1850, there were about 6,000 miles of functioning railways, connecting all the major cities. Traffic continued growing, but the industry was in the dumps. In 1857, The Economist (which had changed its name in 1845 to The Economist, Weekly Commercial Times, Bankers’ Gazette, & Railway Monitor, to reflect the importance of the railroads) was lamenting that "[i]t is a very sad thing unquestionably that railways, which mechanically have succeeded beyond anticipation and are quite wonderful for their general utility and convenience, should have failed commercially." Yet railway technology was not abandoned, and continued attracting new investments. By 1900, railway mileage in Britain had grown further to about 20,000 miles, or about 3x the level of 1850.

Traffic (as measured by revenues, passenger trips, or freight ton-miles) grew about 10x during this period, 1850 to 1900. The problem was not that railroad technology was faulty, nor even that the basic business model was deficient, but that "irrational exuberance" led investors to pour too much money into railways too soon. The underlying demand for the planned and built capacity did materialize, but took time to develop.

Many instructive comparisons can be made between the Internet and nineteenth century railroads [59]. In particular, it is worth noting that there were no serious service interruptions on railways. Shareholders and sometimes even bondholders did get wiped out every once in a while. A few lines did get shut down, but on the whole customers did get served, even at the depth of the depression after the mid-1840s boom portrayed in Figure 1. Moreover, this happened in an almost unfettered market. While there was some government oversight and intervention, strong government regulation did not arrive (in either Britain or the U.S.) until late in the nineteenth century.

The phenomenon of financial excess associated with promising novel technologies is a recurring feature of the last two centuries. The basic pattern of thinking that causes this behavior was recognized early on. For example, in 1825 an American author analyzed the finances of British canals [2]. He concluded that although several were earning return on investment of over 100 percent, on the whole industry profits were disappointing:

"[Canals] have been ruinous to their proprietors, but porbably [sic] have been beneficial to the public. Hence the absurdity of that canal mamia [sic], which is beginning to prevail in the United States, — the absurdity of supposing because canals and other works have proved beneficial when constructed in proper situations that they are beneficial in every situation."

In the same pattern, in the late 1990s, seduced by tales of "Internet traffic doubling every 100 days," investors decided that if three nationwide optical fiber networks were good, then 13 were going to be better. Even more than with canals two centuries earlier, this was folly that led to gigantic financial losses and company and personal dislocations.

But demand did continue to grow. It’s just that investments were made on the assumption of faster growth than materialized. In less-competitive areas, such as the directory business mentioned above, or even among rural carriers [31], there is none of the gloom that pervades most of the telecom industry. However, those areas also lack the excitement of the bubble years.

Are we going to have another period of "irrational exuberance"? History strongly suggests (through Figure 1 and many other instances) that we will. However, history suggests (again through Figure 1, or the Japanese experience of the 1980s and 1990s, or many other cases) that it will take till the end of this decade or later, and may not be centered on telecom. On the other hand, there could be smaller but still substantial recoveries in telecom, as well as spectacular successes of particular companies or sectors of the industry. There are arguments that telecom might become more capital intensive [54], which would lift sales of supplier companies. There could also be a secondary boom induced by the Y2K effect, possibly in late 2003 or early 2004. The peak of the telecom bubble was enlarged by spending to cope with the potential threat of the Y2K phenomenon. Much of what was installed then is getting dated, and the case for upgrades is getting stronger all the time. Of course, we are now supposedly living in a new era, in which all new spending has to be justified on the basis of return on investment. However, that is part of the same herd mentality that during the boom rewarded any spending on any and all e-initiatives, and could change quickly. What happens in the short run depends very much on mass psychology, and seems impossible to predict. Thus there is going to be considerable uncertainty about broadband deployment. However, even in the absence of a big share price recovery, we could get vigorous action on the broadband front. As an example, Japan, more than a decade into a general financial and economic slump, is moving rapidly into residential broadband.

The two most important motivating forces in business are greed and fear. After the debacle of the telecom crash, it might be hard for greed to spark another boom or even boomlet. However, fear might do it, fear awakened by sharpened competition, possibly from unexpected sources. For technology is moving ahead, and demand for telecommunications is growing.

 

++++++++++

5. Demand for telecommunications

As was already mentioned in the previous section, telecom service provider revenues have slowed down their growth, but have not crashed. On the other hand, actual telecom traffic continues to grow vigorously. In particular, Internet traffic is still about doubling every year, as it has been doing ever since 1997 [58]. Moreover, contrary to reports about e-mail displacing voice calls, and so on, just about every telecom service is seeing growth, consistent with historical precedents [49]. In many cases we do not have solid data to be sure of what is happening. For example, long distance voice calls carried by traditional carriers are declining in the U.S. However, this may be due to such calls being handled by wireless carriers (in which case such calls still traverse terrestrial fiber optic long distance networks, but are not counted by traditional measures).

Table 4 presents data on usage of some of the main telecommunication services in the U.K. over the last few years, based on reports at [62]. I am citing British data because the U.K. regulator, Oftel, requires carriers to provide detailed statistics of traffic on their networks, more detailed than we have in the U.S. The quarters listed are calendar quarters, not the British government fiscal quarters used in the reports. The wireline voice figure is understated, since it leaves out the voice calls that fall in the "other" Oftel category (including toll-free calls and premium services). At the end of 2002, about half the volume of wireline calls was for dial modem Internet access, and about half was for voice calls (which, however, also included fax calls, as there is no way to distinguish those).

 

Table 4: Telecommunications traffic in the U.K. Total wireline usage, wireline voice usage, and wireless voice usage in millions of minutes of outgoing calls. Short message service usage in millions of messages.

Quarter   Wireline total   Wireline voice   Wireless voice   SMS
1999q2    47220            36979             4956             159
1999q3    50608            37590             5804             297
1999q4    53786            38869             7092             599
2000q1    56728            38806             7848            1306
2000q2    58339            37783             8388            1421
2000q3    62783            38237             9340            1648
2000q4    68289            38536            10525            2215
2001q1    73525            39349            11064            2758
2001q2    71940            37166            10874            2762
2001q3    75047            37671            11222            3069
2001q4    78429            37963            11867            3447
2002q1    83779            37887            12330            3924
2002q2    82874            36179            12817            4136
2002q3    81510            35756            13118            4210
2002q4    84003            36234            13914            4683

 

We observe that wireline voice is holding steady, while wireless voice is growing rapidly. Furthermore, the explosive growth in SMS goes along with the continuing growth of wireless voice. Although residential broadband is growing very rapidly in the U.K. (along with volume of e-mail, but we do not have comprehensive data on either service), dial modem Internet access has yet to decline. (There was a drop in 2002q3, which the report at [62] attributed to diversion of users to broadband, but this drop was then reversed in 2002q4. Eventually we should expect most Internet access to migrate to broadband links, but this migration is taking its time.) All this evidence fits the historical pattern of communications usage and spending growing, and established services not being displaced very easily [49]. Technologies do fade away, sometimes slowly, as with the electric telegraph, sometimes faster, as with pagers. They almost never vanish rapidly. As just one example, the fax is still ubiquitous, even though in a world of PCs and e-mail, it seems obsolete and redundant. There are undocumented claims that the number of faxes sent has dropped by half between 1998 and 2002 as a result of displacement by e-mail [68]. On the other hand, the number of standalone fax machines sold has declined only slightly from its peak in 2000. The examples of fax and other services suggest, for example, that while Internet access will be moving to broadband links, wireline and wireless, dial modem access will remain a significant factor for a long time.

Table 4 will be cited later, in Section 8, "Neglected opportunities," in discussion of telecom growth opportunities. Next we consider some questions about broadband. One of the key paradoxes of broadband is that it attracts all the public attention, but not that much spending, while the growth of many other telecom services passes almost unnoticed. What is so special about broadband? First, though, we should ensure we know what broadband is.

 

++++++++++

6. What is broadband?

To qualify as a broadband connection under the standard FCC definition, a link has to have a speed of over 200 Kb/s in at least one direction [23]. That rules out ISDN, and includes almost all DSL and cable modem services. However, it also includes postal services! For US$40 per month (less than what most DSL and cable modem subscribers in the U.S. pay) one can send a 10-pound package each week (at least for many distance-destination pairs) that will contain 160 CD-ROMs, each one with 650 MB of data, or a total of 416 GB of data per month. A 1 Mb/s data link, running at full speed over a month, will deliver only 324 GB of data. Moreover, in practice DSL and cable modem links are run at less than one percent of their capacity, with current typical residential broadband subscribers in the U.S. downloading between one and two GB per month [58]. (A single movie DVD is usually several GB.) Thus postal services have been providing broadband connections at least since the introduction of the CD-ROM two decades ago!
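Spelled out, the arithmetic behind this comparison (using the figures just cited, and taking a 30-day month) is:

\[
160 \times 650\ \mathrm{MB} \approx 104\ \mathrm{GB\ per\ week} \approx 416\ \mathrm{GB\ per\ month},
\]

versus, for a 1 Mb/s link running continuously,

\[
1\ \mathrm{Mb/s} \times 2{,}592{,}000\ \mathrm{s} = 2{,}592{,}000\ \mathrm{Mb} \approx 324\ \mathrm{GB\ per\ month}.
\]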

The observation that physical transport of storage media provides high bandwidth is not new. One of the earliest examples appears to have been the saying, attributed usually to Andrew Tanenbaum, that "one should never underestimate the bandwidth of a station wagon full of magnetic tapes." A similar principle applies also to very high capacity links. As is explored in detail in [38], fiber optic transoceanic cables provide lower data transmission capacity than large container ships filled with CD-ROMs. This is not just a thought experiment. Many large commercial and scientific databases are copied to remote locations using tapes or (increasingly) hard disks. Furthermore, the situation is not likely to change at any time soon, since storage and transmission are currently growing at comparable rates (cf. [11, 12]).

The point of the discussion above is not just to argue that current broadband services are not going to destroy video rental stores and NetFlix any time soon. More importantly, these comparisons place broadband in a wider setting, as just one communication service among many, and they raise the question of what the crucial features of a communication system are. Postal service transportation of CD-ROMs provides high bandwidth by delivering large volumes of data, but with delay. What makes DSL and cable modems (and, in the narrowband arena, dial Internet access) attractive is the low transaction latency, being able to get the data one wants quickly. This feature comes at the cost of lower volumes of data. The reason that the "always-on" feature of DSL and cable modems is so attractive (and for many users it is the main attraction of broadband connectivity) is that it reduces the transaction latency of dial modems (which have to dial an access port, determine maximal transmission speed, log in, etc.). In general, there appear to be four main dimensions to a communication service: reach (including mobility), volume of data that can be transferred, transaction latency, and price.

When faced with choices of different communication services, users select based on their preferences. At present, the greater reach of low volume cell phones appears to be more attractive than the tethered high volume Internet access. (Cellular also has a higher transaction latency, since there is the call set-up time at the start of a conversation.) Back around 1870 (before the invention of the telephone), the available services were the electric telegraph and the postal system. They had comparable reach, with postal services excelling in volume, and the telegraph in transaction latency, with mail being far less expensive. The result of users voting with their pocketbooks was that revenues of the telegraph industry in the U.S. never got above a third of those of the postal service [49].

The classification through the four main dimensions listed above omits any mention of what is usually considered the most important feature of a data communication service, namely isochronicity. That is certainly vital in voice telephony as well as in real-time video. However, isochronicity can be obtained as a by-product of low transaction latency using memory for buffering. Moreover, for many transmissions that are often regarded as requiring isochronicity, such as video, where there is little or no interaction from the two ends, substantial delays are tolerable if one uses memory buffers. As a result, the tremendous bias that has been present from the beginnings of data networks towards designing for real-time streaming traffic is largely misplaced [48, 56, 65]. Residential broadband, as well as Internet backbones, is likely to be dominated (as it is right now) by file transfers, with high bandwidth assuring low transaction latency.

The classification above also omits any mention of "content." The historical preoccupation of the telecom industry with professionally prepared material, in the face of repeated disappointments, is an amazing phenomenon [50]. Fortunately, some carriers appear to be learning that such misapprehension leads to misallocation of resources. A recent story about South Korea, the country with the highest residential broadband penetration, suggests as much [6]:

""The killer application of the Internet is speed," said Lee Yong Kyung, the chief executive of the KT Corporation, formerly known as Korea Telecom, which controls nearly half of the country’s broadband market. "The money is in the pipes.""

The main conclusion of this section is that broadband does offer new options. However, it should be viewed in a broader context of all communication services. And when one does that, one can understand better just what it is that broadband provides, and also what can be done with other, more established services. It also leads to a consideration of the tradeoffs between high bandwidth wireline systems versus lower bandwidth mobile or movable systems. Broadband and the Internet as a whole are new and powerful communications technologies, but they are not the only game in town.

 

++++++++++

7. What is broadband good for?

We are beginning to learn what promotes the spread of broadband, especially through international comparisons, such as [1, 60, 29]. Competition in general, and facilities-based competition in particular, is good. Low prices are great. (However, low prices are not the complete answer. For example, the supposedly low Korean prices are actually quite high compared to earnings, as is noted in [1]. Hence cultural and institutional factors cannot be neglected.)

Still, there are many open questions. In particular, how do we transition from a network in which most of the revenues come from narrowband voice transmission to one where voice is just one of many applications riding on top of a high bandwidth data network? We now know that it is possible to offer inexpensive broadband connections over existing infrastructures of the phone and cable TV companies, if the costs of those infrastructures are neglected. However, we still do not know how to get enough revenues from those broadband connections to pay for the infrastructures, nor do we even know whether those are the right infrastructures for the future. (There will be more discussion of economics of networks in Sections 12, "Costs of connectivity," and 14, "A spoiler at the broadband party".)

We know even less about what we will do with broadband when we have it. The standard list of applications (cf. Table 1 in [36]) consists of e-education, e-medicine, e-government, e-commerce, and e-entertainment. Those are the same applications that were touted as reasons for building the "Information Superhighway" a decade ago. Some have developed well (primarily e-commerce, to be discussed in Section 10, "Diffusion of new technologies"), others very little, at least so far (e-education, e-medicine, and e-government), and others in ways different than envisaged (with e-entertainment consisting so far primarily of illicit swapping of copyrighted music files instead of paid services). Technological predictions have always been hard, of course, and much of what broadband proponents say has to be treated cautiously. As just one example, let us consider the following claim [3]:

"Real sustainable economic growth and international security will come from expanding the information revolution to all parts of our society. Metcalfe’s Law states that the value of a network increases exponentially in relation to the number of users."

Well, the "law" stated in the second sentence is actually "Reed’s Law." "Metcalfe’s Law" only says that the value of a network increases as the square of the number of participants. The main problem, though, is that both "Metcalfe’s Law" and "Reed’s Law" are wrong, at least in the precise quantitative form in which they are stated [49]. Yes, there is value in connecting more people, but locality of traffic is a key feature that cannot be neglected. Most communications are local, and the Internet is likely to increase the locality of its transmissions. (This phenomenon has happened in the past with some other services, such as the mail [49].) "The death of distance" is greatly exaggerated. Some of the venture capitalists who proclaim "the death of distance" the loudest are among those who insist that startups have to be based in easy driving distance of their offices on Sand Hill Road. An interesting example (referenced in [49]) was the tech branch of an investment bank that moved from San Francisco to Menlo Park, because San Francisco was too far from the scene of the action in Silicon Valley! The value of locality is diminishing in some jobs (which are then migrating to India and other places) but is getting ever more important in other jobs. Broadband is encouraging the evolution, but there are no clear-cut rules for how it will evolve. As just one example, broadband is often promoted as a way to keep populations in rural areas from declining, by enabling telecommuting. Yet if a job can be exported to a farm in Manilla, Iowa, why couldn’t it be exported at even lower cost to an office building in Manila, The Philippines?

The first part of the quote from [3] presented above also has to be treated with some reservation. That new communication technologies would lead to peace has been hoped for for ages, starting with the postal service. The hopes for an impact on the economy are more likely to come to pass, but even there this will often happen in unexpected ways, and more slowly than many proponents hope.

The frequently voiced hope that broadband would reduce travel by encouraging telecommuting flies in the face of overwhelming evidence that travel and communications are positively correlated. (This hope is consistent, though, with similarly misplaced hopes expressed almost two centuries ago about the relation between postal services and personal travel.) Yes, there will be more telecommuting, but there will also be more travel.

Although we surely don’t know just how broadband will be used, that is not a novel or insurmountable problem. Technological forecasting has an atrocious record. As just one example, consider the Liverpool and Manchester Railway, the one whose opening in 1830 is usually regarded as the start of the modern railroad era. Its financial success did much to spark the boom in the mid-1830s visible in Figure 1. However, as was noted by a mid-1850s observer [9], the Liverpool and Manchester Railway missed its promoters’ projections by a large margin. Costs of construction were three times as high as projected, and the line’s principal role was to carry passengers, as opposed to freight that had been originally envisaged as the main revenue producer. Marginal operating expenses were also far higher than expected. (It should also be noted that the line was started in the mid-1820s, before it had been settled whether trains would be drawn by horses, by stationary steam engines pulling wagons by ropes, or by locomotives. Thus this line represented an extreme example of the faith in the progress in technology that animates many startups.) Yet revenues as of 1845 were 4.3x the projected level, which made up for all the defects in the projections.

As the Liverpool and Manchester Railway example shows, the "build it and they will come" attitude, which animated the dot-com and telecom bubbles, does pay off at times. Unfortunately, all too often they don’t come (as with Iridium), or come and don’t do much (as with Minitel). And even when they do come and embrace the service enthusiastically, there can be losses, if too much of "it" is built too early. That happened with railways in Britain in the 1840s and with long haul fiber networks in the U.S. (and many other countries) in the late 1990s. A key issue for a financially sustainable business is to estimate the rate of adoption correctly. Given the difficulty of predicting either technological developments or how society reacts to them, it is no wonder that booms and busts occur.

The main justification that is cited (as in the quote from [3] above) for a major push to develop broadband connectivity is that it would increase economic growth. The Internet is credited with a large contribution to the dramatic increase in productivity growth that was observed in the U.S. economy in the late 1990s. After slow growth at only about 1.3 percent per year over the preceding two decades, output per hour worked increased at a rate of about 2.8 percent per year in the 1995-2001 period. (For sources of data and references to studies of this subject, see [18, 36].) Of this 1.5 percent annual increase, some was likely caused by various cyclical and other factors. Still, many economists estimate that about one percent per year was due to IT. For several decades, rapidly increasing investment of IT was accompanied by the "Solow paradox" of "you can see the computer age everywhere but in the productivity statistics." Finally, in the late 1990s, the payoff seemed to finally appear. However, to many it seemed to raise productivity growth by a miserly one percent per year. This was nowhere near the "New Economy" expectations that would have led to profit growth of 20 percent per year forever and to Dow Jones at 36,000 instantly.

The common underappreciation of the value of an increase in the growth rate by one percent per year shows two things. One is a lack of understanding of how slowly economies change in general. The spurts of 10 percent annual growth that a few economies, such as those of Japan and South Korea, managed to show for a few years happen only in unusual circumstances, mostly when a country is catching up with the leaders by exploiting technology and markets that had already been developed elsewhere. The other is the lack of appreciation for the power of compound interest. Raising productivity growth by one percent per year has a huge impact over time. But few people appreciate this; instead, many are willing to believe tales of "Internet traffic doubling every 100 days" [58], in spite of the improbabilities and inconsistencies in those stories, and to accept promises of unending profit growth of 20 percent per year.
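A short compounding calculation makes the point concrete:

\[
(1.01)^{30} \approx 1.35, \qquad (1.01)^{70} \approx 2.0,
\]

so an extra one percent of annual growth leaves the economy about a third larger after 30 years, and roughly twice as large after 70 years, than it would otherwise have been.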

A little historical perspective could have tempered the exaggerated expectations of what the Internet (or IT as a whole) could do for productivity. Railroads by the end of the nineteenth century were at least as large a fraction of the economy as IT is today [59]. They were the most influential industry in that century, and profoundly affected all of society. Yet their impact on rates of economic growth was surprisingly modest. The basic source for this revisionist view is the work of Fogel [25]. There is controversy about Fogel’s thesis (see [16], for example, or the references listed in [59]), but it is now accepted that railroads by themselves did not lead to a big spurt of economic growth.

The moral of this section is that improved telecommunications and improved IT can have a big impact on economic performance in the long run. However, this results primarily from compounding of small improvements. This weakens the case for drastic action. (And indeed, one can raise the basic question: What has South Korea gained from its world-record broadband penetration? Lots of interactive online video gaming [6, 27], certainly, but what else? I am sure that with time there will be more concrete payoffs, but it is likely to take some time.)

Although history does teach not to expect dramatic gains in productivity from deployment of better telecom services, it also teaches the advantages of flexibility. As with the Liverpool and Manchester Railway, how systems are eventually used often is at wide variance with projections. Hence there are strong advantages of flexible policy frameworks that can accommodate new technologies and services. The telecom industry has done an abysmal job of providing what users wanted, with a history of technologies such as Minitel, ISDN, ATM, Iridium, and WAP. Most of the successful services, such as the Internet, World Wide Web, Napster, and search engines, came from outside, and most were made possible by the flexibility of the Internet.

 

++++++++++

8. Neglected opportunities

The Internet in general and residential broadband in particular do offer unprecedented opportunities in communications. But there are other opportunities that are not being exploited, associated with the seemingly more mundane voice and e-mail. Many would be relatively simple to take advantage of, and could lead to faster public acceptance and profits for carriers than broadband deployment. After all, voice still provides most of telecom revenues. Thus a small percentage increase in spending on voice services could lead to a bigger gain than a much larger percentage gain in Internet revenues. Table 3 shows that in 2001, U.S. telecom revenues were US$302 billion. Total Internet revenues (not fully captured in Table 3) were about US$35 billion (US$15 billion from dedicated access, and about US$10 billion each from dial modem and residential broadband services). By contrast, cellular produced US$75 billion.

Where are the opportunities to provide better communication services? Well, let us consider Table 4. Something that really stands out there is the rapid growth in SMS. Although the success of SMS and the failure of WAP were consistent with a long historical record [50], cellular carriers were oblivious to this, and poured huge efforts into WAP, and basically stumbled into SMS by accident. However, the attractiveness of SMS has now been well established for many years, and it is a proven money maker for the carriers. Why is it then that it is only now that wireline carriers are beginning to offer its equivalent, and even then apparently only in some places in Europe [17], and only because of the pressure of competition from wireless carriers? Furthermore, why don’t both wireless and wireline carriers promote services that would allow callers to have their voice messages delivered to recipients’ voice mail boxes, thus imitating one of the most attractive features of e-mail, namely its non-intrusive nature?

Given how attractive and profitable SMS is in Europe and Asia, why aren’t U.S. carriers exploiting it? The standard answer is that the U.S. has inexpensive voice calls, both wireline and wireless, so less need for SMS. But that answer is not convincing, since the U.S. has the highest intensity of (wireline) e-mail usage. It is more likely that SMS in the U.S. suffers from lack of promotion and interoperability. While there is little the U.S. government can do about carriers’ marketing, if it feels that it would like to push the industry forward, it could mandate SMS interoperability (including the wireline variety, once it appears).

Further, even in Europe and Asia, where SMS is already popular, its usage could be expanded. If we consider Table 4 together with the fact that there were just about 50 million cellular subscribers in the U.K. in the fourth quarter of 2002 [62], we see that there were about 90 SMS messages for each subscriber that quarter, or just about one per day. That is a very low number, especially when compared to the number of e-mails that are sent and received. It seems likely that one could stimulate substantial growth in SMS usage. Doing so would not require any new technologies, just some marketing, and in particular a shift towards more attractive pricing plans, either flat rate or for blocks of messages [52].
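The per-subscriber figure follows directly from the 2002q4 row of Table 4 and the subscriber count:

\[
\frac{4{,}683\ \text{million messages}}{50\ \text{million subscribers}} \approx 94\ \text{messages per subscriber in the quarter} \approx 1\ \text{per day}.
\]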

Another opportunity that stares out of Table 4 is for wireless voice to replace wireline voice. As was noted in Section 5, "Demand for telecommunications," communications services are not easily displaced by others if there are any material differences between them. However, in the wireless vs. wireline voice comparison, wireless is in principle capable of offering everything that wireline has, plus greater reach. In the U.K., there is still more than three times as much wireline as wireless voice usage. (The wireline voice figure in Table 4 is understated, as mentioned above. There is also the additional factor that incoming and outgoing calls are more balanced in wireline usage, and Table 4 shows only outgoing calls.) Thus there is a huge opportunity for expansion of cellular voice usage at the expense of wireline usage. However, that is not what the wireless industry in the U.K. or anyplace else is concentrating on.

The one country where substitution of wireless for wireline usage appears to be starting is the U.S., and for reasons that are going almost totally unnoticed. The statistics in Table 4, combined with the count of almost exactly 50 million cellular subscribers in the U.K. at the end of 2002, show that the average cell phone usage in that country consists of about three minutes of outgoing calls per day (and under 4.5 minutes total). Further data at [62] shows that this usage has been stable for the last few years, and is comparable to the average usage in most of the countries for which I have firm statistics or even estimates. The one exception is the U.S. Table 5, based on data from CTIA and company reports, shows that U.S. usage per subscriber was for many years comparable to that in Britain now. Starting in 1998, though, a new trend set in of rapidly growing usage. This is leading to an increasing number of users abandoning their wireline phones. What is really amazing is that so little attention has been paid to this trend or its causes. There are many laments about the U.S. being behind in wireless. It is indeed behind in areas such as the introduction of new sophisticated features in handsets, or in the fraction of the population that has cell phones. But in usage per subscriber, the U.S. appears to be the world champion. (However, even in the U.S., there is more than three times as much wireline voice usage as cell phone usage, so the substitution effect is just starting. What we have seen so far has mostly been new usage.)
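As a rough check, the three-minute figure follows from the 2002q4 row of Table 4 (taking the quarter as 92 days):

\[
\frac{13{,}914\ \text{million minutes}}{50\ \text{million subscribers} \times 92\ \text{days}} \approx 3\ \text{minutes of outgoing calls per subscriber per day}.
\]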

 

Table 5: U.S. cell phone usage, minutes per day around June of each year.

Year    Usage (minutes/day)
1993     4.0
1994     4.2
1995     3.8
1996     4.0
1997     3.6
1998     3.9
1999     5.2
2000     7.4
2001    10.5
2002    13.4

 

How did the U.S. reach its leadership position in wireless usage? It stumbled into it. In the spring of 1998, AT&T Wireless introduced (after intensive internal controversy, and with low expectations) the AT&T Digital One Rate™ plan, which provided for a monthly block of minutes for a fixed price, with no long distance or roaming fees. As should have been eminently predictable (on the basis of AT&T’s experiences alone, if not the huge body of other evidence [49, 52]), this plan proved wildly popular, and led other carriers to respond with their own bucket pricing plans, which led to rising usage. What this shows is, first, that the industry can be stunningly blind to the opportunities open to it, and second, that simple, non-technological methods can lead to huge expansions in usage.

Currently, the wireless industry around the world is mesmerized by the prospect of delivering various data services using 3G technology. This is increasingly being recognized as a disappointment, and should have been anticipated from the beginning [49, 50]. On the other hand, the increased bandwidth of 3G offers opportunities for increased voice usage [49, 50, 53].

What can the wireless industry do to stimulate voice usage and (eventually) cannibalize wireline voice usage? It can push forward with bucket pricing plans, and eventually with totally flat rate plans. It can also provide differentiated quality for voice transmission. Right now cellular suffers from quality that is marginal. Using the increased bandwidth of 3G for voice would offer a chance to segment the market (since there would still not be enough bandwidth to provide for high quality transmission of all voice telephony calls), and draw more revenues from the business community. (Wireless is much less successful than the wireline industry in exploiting the greater ability and willingness of business customers to pay for communications.)

The wireless industry can also stimulate usage by offering toll-free calling. Airlines and other businesses are willing to pay for customers to call them from wireline phones, so why should they not be willing to pay for wireless calls?

There are likely many more simple techniques that can be developed on top of ordinary voice services that would be attractive to customers. Nextel’s "push-to-talk" feature is likely just one example.

An international comparison shows huge differences in wireline voice usage per person [49], with the U.S. typically higher by factors of two or three than other countries. This is caused primarily by differences in pricing, with the flat rate residential calling plans in the U.S. stimulating usage (without harming carrier profits). This argues that other countries have easy ways to increase their voice usage by pushing for changes in pricing. (They also have increasing evidence of the effectiveness of this method, through statistics on Internet access being stimulated by flat rates.)

An objection to the measures proposed in this section is that they are all about voice, that old-fashioned technology. But voice is a marvelously flexible communication medium, something that people are very good at. Moreover, its role in making the economy efficient should not be underestimated. While trillions of dollars are now transacted in e-commerce, even more trillions of dollars are now transacted in t-commerce, where the "t" stands for the telephone. After all, voice calls still play a key role in most large commercial transactions. And while farmers in the U.S. Midwest do use broadband to check on prices of their crops, fishermen in Bangladesh use cell phones to check on prices of different types of fish in accessible harbors, with similar effects of increasing productivity. When we talk of the faster growth in the U.S. economy in the late 1990s and ascribe it to broadband Internet, how can we be sure it was not due to the narrowband cell phones?

A recent ad from AT&T (for one of its new flat rate calling plans) said that "Talk is good." People are willing to pay a lot for it, and do use it extensively. So why not give them the opportunity to use it even more if they so choose?

The big paradox of this section is then that there is so much concern about broadband, while there are still plenty of opportunities in voice. Furthermore, as we will see in Section 14 ("A spoiler at the broadband party"), some of the measures for promoting voice usage (especially for promoting substitution of wireless for wireline voice) could also play a big role in promoting broadband.

 

++++++++++

9. Telecom today and nineteenth century postal systems

Section 4 ("Telecoms and nineteenth century railroads") suggested that nineteenth century railways provide good analogies to the evolution of telecommunications today. Nineteenth century postal systems are yet another area full of fruitful comparisons.

"Insatiable demand for bandwidth" was one of the key and most destructive mantras of the Internet bubble. As late as September 2000, Kevin Boyne, the COO of WorldCom’s UUNet, was quoted in the press as saying that "as soon as more capacity becomes available, the Internet community will find interesting, clever ways to use it." Such claims inspired the overinvestment that produced the telecom crash. Yet history provides a valuable perspective that should have warned investors and managers not to believe in the "insatiable demand for bandwidth." Telecommunications has been a growth industry for centuries. As was mentioned in Section 3 ("The state of the telecom industry"), in the U.S. it grew from about 0.2 percent of GDP in 1850 to about four percent in 2000, and other countries have shown similar increases. However, there has not been a single explosive increase in spending similar to what would have been required to make the business plans of the bubble years a reality.

A particularly instructive example is provided by the famous "Penny Post" reform of 1840 in Britain. It reduced the cost of sending a letter anywhere in the United Kingdom to one penny, bringing average postal rates down by more than 80 percent. The effect of this reform (shown in Table 6) was that the number of letters sent jumped dramatically, up 122 percent from 1839 to 1840. (However, much of this increase appears to have come from a decline in letter smuggling, not real growth in usage.) On the other hand, the British Post Office’s revenues dropped 43 percent in that period. This disproved claims of the reform’s most ardent advocates, who had predicted usage would increase faster than prices would drop.

 

Table 6: British Post Office in a period of disruptive change: volume (letters in millions), revenue (million pounds), and profit (million pounds).

    Year    Volume    Revenue    Profit
    1839      75.9      2.4       1.6
    1840     168.8      1.4       0.5
    1841     195.5      1.5       0.6
    1842     208.4      1.6       0.6
    1843     220.5      1.6       0.6
    1844     242.1      1.7       0.7
    1851     360.6      2.4       1.1

 

There was pent-up demand for mail, but no "insatiable demand for mail." The 1840 Penny Post reform was wildly popular with the public and it made Britain the envy of the world. (Just as today there are studies being produced around the world bemoaning that one country or another is getting ahead in the race for broadband, in the mid-19th century, after the Penny Post reform, there were studies complaining that Britain was far ahead in postal communication. The Post Office was the communication technology then.) After decades of stagnation, communication traffic in Britain started to grow. The volume of letters delivered and total revenues grew at annual rates of 6.3 percent and five percent, respectively, between 1841 and 1851. Eventually both revenues and profits exceeded the 1839 figures. But it did take time. Enterprises and individuals had to learn to use the less expensive and more convenient service productively. Similarly, it takes time for greater and less expensive bandwidth to be incorporated into our economy.
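For readers who want to check those growth figures against Table 6, here is a minimal sketch in Python (the choice of 1841 and 1851 as endpoints, and the variable names, are mine) that computes the implied compound annual growth rates:

    # Rough cross-check of the growth rates quoted above, using the Table 6 values.
    volume_1841, volume_1851 = 195.5, 360.6    # letters, millions
    revenue_1841, revenue_1851 = 1.5, 2.4      # million pounds
    years = 1851 - 1841

    def cagr(start, end, years):
        """Compound annual growth rate between two values."""
        return (end / start) ** (1.0 / years) - 1.0

    print(f"volume growth:  {cagr(volume_1841, volume_1851, years):.1%} per year")    # ~6.3%
    print(f"revenue growth: {cagr(revenue_1841, revenue_1851, years):.1%} per year")  # ~4.8%, i.e. about five percent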

In general, there have been many instances of underestimates of the growth potential of new telecommunications services. The electric telegraph (derided by Henry David Thoreau) and the telephone both had their skeptics. A more recent example is the infamous McKinsey study of the early ’80s that predicted there would be fewer than a million cellular users in the U.S. in the year 2000. As it turned out, there were nearly 100 million.

The rapid growth of the Internet in the early and mid-’90s also caught many (including most top managers in the telecom industry) by surprise. In reaction, unrealistically high estimates of demand became generally accepted. Yet there was plenty of publicly available evidence that the "insatiable demand for bandwidth" was simply not there. As simple examples, corporate and academic data networks were lightly utilized [46, 48], and ISP subscribers were slow to upgrade their modems (with more than half still not having 56K modems as late as 2000, and three percent still using 14.4 Kb/s modems by mid-2003 [5]). There were even controlled studies showing limited willingness to pay for broadband [72]. In general, even in the absence of bandwidth limitations, the experience of large, diverse, and well-wired institutions has been that traffic does not grow faster than about 100 percent per year [10-12, 58].

Jim Crowe of Level 3 used to cite studies from a famous industrial research lab that supposedly proved that the demand elasticity of bandwidth was three or four. That those studies were wrong is apparent from the collapse of the new long distance fiber carriers. Dramatic declines in prices did not lead to an increase in revenue. The mistake these studies made was to assume that long-term correlations between pricing and demand, which reflect complicated relationships between economics, technological progress, and diffusion rates, would predict short-term responses to sudden changes in prices. The British Penny Post reform of 1840 could have served as a warning. Prices did drop, but demand did not increase enough to compensate.
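To see why those elasticity estimates were implausible, consider a minimal sketch under a constant-elasticity assumption (the assumption and the example price drop are mine, not taken from the studies Crowe cited): with any elasticity above one, revenue should rise, not collapse, when prices fall.

    # Illustrative only: what a constant price elasticity of demand would imply.
    # With q proportional to p**(-elasticity), revenue = p*q is proportional to p**(1 - elasticity),
    # so any elasticity above 1 means revenue should RISE when prices fall.
    def revenue_multiplier(price_ratio, elasticity):
        """Revenue after a price change, relative to before, assuming constant elasticity."""
        return price_ratio ** (1.0 - elasticity)

    # Example: bandwidth prices falling by 90 percent (price_ratio = 0.1).
    for elasticity in (3.0, 4.0):
        print(elasticity, revenue_multiplier(0.1, elasticity))   # roughly 100x and 1000x revenue
    # The new long haul carriers saw nothing of the sort, so short term elasticity
    # was evidently far below the values those studies claimed.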

The Internet community is finding "interesting, clever ways to use" the growing bandwidth, so we should expect vigorous growth in data traffic, but not at the unrealistic pace that had been predicted. However, while traffic is growing, there is no sign of willingness to dramatically increase spending. Service providers will have to resign themselves to relatively modest increases in revenues, with growth in data coming largely at the expense of traditional voice [58]. As with Britain’s postal reform, we’re now entering a phase in which companies and individuals must learn to use a less expensive and more convenient service in a manner that makes economic sense.

The British Penny Post reform of 1840 provides yet other lessons. This reform was wildly popular with the public (but less so with government officials, since it changed the Post Office from exceedingly profitable to only very profitable). It did lead to an increase in the volume of communication, and thereby surely did make the British economy more efficient. But the increase in productivity was not large enough to be quantifiably attributable to the postal reform.

 

++++++++++

10. Diffusion of new technologies

Perhaps the dominant reason for the dot-com and telecom bubbles and crashes was the conviction that technological progress and its diffusion through society had accelerated, and that the world would now evolve on "Internet time." Yet new technologies and business models take time to spread through society. This was already noted in 1965 by J.C.R. Licklider, the "grandfather of the Internet" [35]:

"A modern maxim says: "People tend to overestimate what can be done in one year and to under-estimate what can be done in five or ten years.""

The Internet has not changed this. It still takes on the order of a decade for fundamental change [43, 51]. The browser, which did much to inspire the myth of "Internet time," was an exception. (And there were many special factors involved. The browser did have an unbeatable zero price. It also took advantage of the existing voice telephony infrastructure and of the millions of PCs that were already in place and widely used.) As a simple example, personal video recorders such as TiVo may finally be taking off. However, although their owners are almost universally enthusiastic about them, it has been several years since they were introduced (and their producers have struggled financially all along).

The disenchantment with dot-coms brought about by the crash has concealed the fact that quite a few of the dot-com ideas had real merit, if not on the scale or at the speed envisaged in the boom years. As an example, one of the more tragic stories of that period is that of Webvan, which wasted over US$1 billion in attempting to build an online grocery business before closing its doors. Yet selling groceries online is not a stupid idea. It is being developed, although slowly and in niche markets. It is finally making money for both brick-and-mortar grocers and specialized startups [34]. Many more examples of dot-com concepts that are finally making their way are cited in [40]. E-commerce in general is large and growing, both business-to-consumer and especially business-to-business.

The point of this section is to reinforce the arguments of earlier parts of the paper that one should not expect sudden changes. Another point is that some ideas that have been given up for dead can be revived, either through rethinking the basic business model, or through advances in technology. In particular, the proposal for Internet access through fixed wireless, which led to major losses at Winstar and Teligent, as well as at AT&T Wireless and Sprint, may yet turn out to be the best way to provide residential broadband access, an idea that will be discussed in more detail in Section 14 ("A spoiler at the broadband party").

 

++++++++++

11. Continuing technological progress

Sometimes a new technology fails because it is simply not competitive. Whether it can be revived depends on the relative rates of improvement in its cost/performance ratio versus that of competitors. Back in the 1970s and 1980s, substantial investments were put into renewable energy sources. These efforts did not bear fruit, though, since all renewable technologies that were tried turned out to be too expensive, especially when fossil fuel prices declined. However, after two decades of assiduous work, the economics of these technologies are much more attractive [8]:

"As a result of technological advances, along with government incentives, industry analysts say, the cost for many of the forms of energy has plummeted. Wind power, for instance, has dropped from a cost of about 38 cents a kilowatt hour in the early 1980s to 3 to 3.5 cents now. By comparison, electricity produced by natural gas costs about 5.5 cents a kilowatt hour, a jump of about 1.5 cents since last year, analysts say."

While fossil fuel prices are volatile, and may well decline, the rate of progress in technology suggests that in the long run wind power will be consistently less expensive (in areas where there is a lot of wind). Hence we can expect that the sincerity of renewable energy advocates will be sorely tested in the next few years, as entrepreneurs attempt to build many more wind farms near the homes of those advocates. We will be witnessing the threshold effect, as renewable energy emerges from niche markets to compete for mainstream business.

Telecommunications also offers examples of continuing advances and threshold effects. This has been widely recognized for the long haul, with dramatic and widely publicized advances in photonics and routers. However, there has also been great but less widely known progress in the access networks. As a simple example, the marginal cost to a carrier of offering DSL service over an existing copper infrastructure is estimated at only US$10 to US$15 per month [20]. (This is due, to a considerable but not exclusive extent, to the price of the modem and DSLAM combination dropping to about US$100 per user. An important additional factor is that with the latest technology, this service can usually be installed by the customer, eliminating the expensive "truck rolls" that plagued early installations.)

Even cellular, which is often thought of as relatively static, shows great technical advances. That the costs of handsets have plummeted is widely known (from US$3,500 for the first bulky ones in 1984 to the sub-US$100 ones of far greater functionality and usability today). At the carrier level, on the other hand, at first glance there seems to have been less progress. Figures from CTIA [13] show that over the last decade in the U.S., capital expenditures have been close to US$1,000 per new subscriber. Also, the average monthly spending per subscriber has stayed relatively constant at about US$50 in recent years. This appears to show a static industry. However, Table 5 shows that the volume of traffic per customer has grown more than three-fold over the last five years. Thus even with 2G wireless technology, there has been a threefold improvement in the performance and service delivered to the customer.

The relative improvements in different transmission technologies are hard to gauge. Still, it is clear that the cost of electronics is decreasing rapidly. This leads to a convergence of capabilities, with carriers that have copper into consumers' homes and those that have coax able to offer comparable services. It is also likely that there will be a threshold effect because of the distribution of costs in wireline and wireless technologies. These two effects will be discussed in the next two sections.

 

++++++++++

12. Costs of connectivity

The core of the Internet is huge in terms of transmission capacity. However, it does not attract large revenues and is not expensive to run. This is the result of advances in technology and deployment outstripping demand (with the entire U.S. Internet backbone traffic transmittable in principle through a single fiber strand) and of competition. Some estimates are presented in [58]. I will not repeat the figures and arguments from [58] here, and instead will cite some supporting evidence for the small and diminishing role of the core of the network.

Consider Cogent Communications, which started out with the exclusive goal of providing high speed Internet access to enterprise customers over fiber. More recently its business model has become more involved because of the acquisition of PSINet. However, that makes the rough arguments that follow all the more compelling. There are serious concerns about whether Cogent is viable; recently it had to restructure its finances to deal with a default on debt to Cisco, for example. But let us ignore the revenue side of Cogent, and consider just its capital and operational costs. If we examine the Cogent financial report (available through the SEC Web site, for example) for the quarter ending 31 March 2003, we find that the book value of its property and equipment (without deducting depreciation) was a little under US$400 million. Network operations cost US$11 million that quarter, or a rate of US$44 million per year. It is overwhelmingly likely (especially when we consider how Cogent’s network operations expenses have been growing) that most of both capital and operational expenses are associated with local connectivity. But even if we ignore this, we can conclude that a backbone the size of Cogent’s could be built for at most US$400 million. (In all likelihood, it could be done for far less, since most of the costs are likely to be associated with local connectivity, and also because Cogent was started at the height of the telecom boom, when prices were higher.) It could also be run for under US$50 million per year.

Now Cogent’s backbone (run currently at 80 Gb/s, or eight OC-192 wavelengths, on two giant rings leased from Williams) is among the largest in the U.S. in terms of capacity. To provide coverage comparable to that of the large established players such as AT&T, WorldCom, or Sprint, it would need more fiber. (At the moment Cogent does not cover such major metropolitan areas as Minneapolis/St. Paul, for example. It has 12,500 route-miles of fiber, whereas larger players tend to have twice as much.) Making allowance for the need to double the Cogent network, we conclude that an extremely generous upper bound for the cost of constructing what would likely be the largest Internet backbone in the U.S. would be US$1 billion, and operational costs would be under US$100 million per year. This would be just for the backbone and the points of presence, not for hooking up all the individual customers. Still, this thought experiment does make the point that the backbones are not a bottleneck and are not likely to be one in the foreseeable future.
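The arithmetic of this thought experiment can be summarized in a few lines of Python (the figures are those quoted above; the doubling factor is the coverage allowance just described):

    # Rough bounds from the Cogent thought experiment above (figures as quoted in the text).
    book_value_plant = 400e6           # US$, property and equipment, Q1 2003, before depreciation
    network_ops_per_quarter = 11e6     # US$
    coverage_factor = 2                # allowance for doubling the network's footprint

    capex_upper_bound = book_value_plant * coverage_factor
    opex_upper_bound = network_ops_per_quarter * 4 * coverage_factor

    print(f"capital cost upper bound:   US${capex_upper_bound / 1e9:.1f} billion")      # ~US$0.8 billion
    print(f"operating cost upper bound: US${opex_upper_bound / 1e6:.0f} million/year")  # ~US$88 million/year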

One can object that Cogent could build its network inexpensively only because of the fiber glut. That may be so, but there are two responses. One is that building a complete nationwide network capable of handling voice and data from scratch is not all that expensive. Various startups, such as Williams, Level 3, or Qwest (before it acquired US West), typically did it for around US$10 billion. The other response is that the fiber glut is here. The fiber is not decaying, is available for use, and using ("lighting") it is getting less expensive all the time. This fiber is a (wasteful) gift to the nation from that strange period when investors plowed money into new long haul networks, ignoring the signs that demand would not be there, and that the revenue opportunities in telecom were moving inexorably to the edges. It would not require much vigilance on the part of the federal government to prevent the long haul fiber supply from being monopolized (even if there were a player capable of cornering the market). Even if this supply did get monopolized, new fiber could be laid easily.

Progress in technology has been decreasing the costs of long haul transport much faster than of access links for a long time. As a result, even before the rise to prominence of the Internet, the long distance carriers were already doomed to a decline. The Internet has accelerated this trend. The backbones, which attract most of the interest, are almost irrelevant in the grand scheme of things. They are a commodity, and are likely to be run as commodity plays. Eventually some will start making good profits (as commodity markets are often surprisingly profitable). However, they are not now, and are not likely to be, where the big money is.

The metro area is also experiencing fast cost declines, which is making high bandwidth connectivity for enterprises increasingly affordable. Economics and technology appear to favor a trend in which large institutions (commercial, academic, or government), and increasingly medium-sized ones as well, or their landlords, buy or lease fiber from their buildings to local switching centers. This is part of the natural move towards customer-owned networks. It will create new revenue opportunities, but this will likely require a major restructuring of the industry [54].

Most of the discussions about the future of the Internet concentrate on residential "first mile" connectivity, as that is where the bottleneck is the most serious right now. However, it should be kept in mind that in the U.S. right now, most of the traffic appears to be business-to-business, and it is growing vigorously. (See [58] for estimates. In other countries this may not be the case, and [58] cites data for Australia, for example, which shows that in that country residential users dominate.) Thus it is not at all clear that residential users are required for healthy growth of the Internet. However, since residential broadband connectivity is what most of the public discussion is about, we concentrate on it from now on.

Telecom is supposed to be a high tech business, but a surprisingly high proportion of its costs come from very low tech aspects of its operations. In particular, the costs of installing a new wireline link have a high and seemingly irreducible component of about US$1,500 per location. Whether one is putting in traditional copper, coax, or fiber, the cost ends up someplace in that vicinity. (There are various estimates, and they differ, often depending on the scale of the project and local conditions, but that is roughly the cost range one sees from various sources.) Of this US$1,500, it appears that about half is for bringing the cable through the neighborhood, and half for actually hooking up a household or business. (There are disputes about whether the split is half-and-half or 60-40, but again it tends to be in that range.)

Another way to confirm this cost figure is by considering the data for the ILECs. They have approximately 180 million lines going to about 110 million households and businesses. (More exact figures are available, but I am using round numbers to make estimates easier. I will be working with extremely rough estimates, just to get a sense for the magnitude of various pieces of the puzzle.) As is reported in [67], the undepreciated historic cost of the ILEC plant is about US$340 billion. However, the depreciated cost is US$166 billion, and the TELRIC estimates are that replacing the network from scratch with the most cost effective modern technology would cost about US$180 billion. Thus the general conclusion is that to replace the ILEC plant with modern technology would cost around US$1,500 per endpoint, and around US$1,000 per line.
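A minimal sketch of that division, using the round numbers above:

    # Back-of-the-envelope check on the per-line and per-endpoint figures above.
    replacement_cost = 180e9    # US$, TELRIC-style estimate for rebuilding the ILEC plant
    lines = 180e6               # approximate number of ILEC lines
    endpoints = 110e6           # approximate households and businesses served

    print(f"per line:     US${replacement_cost / lines:,.0f}")       # US$1,000
    print(f"per endpoint: US${replacement_cost / endpoints:,.0f}")   # about US$1,600, i.e. roughly US$1,500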

As we noted before, the capital investment of the wireless carriers appears currently to be close to US$1,000 per subscriber in the U.S.

For comparison, estimates from various WiFi projects inside enterprises (and thus in controlled environments, with easy availability of power, etc.) appear to cluster in the range of US$1,000 to US$2,000 per access point. This is not dissimilar from the estimates for rewiring enterprise networks, which tend to come in around US$1,000 per connection (whether this is with improved copper, coax, or fiber). On the other hand, estimates have been cited of US$3,000 to US$5,000 for converting pay phone booths for WiFi. In all these cases, the cost of the access point (even a commercial strength one) is practically negligible compared to the labor and related costs of hooking up electric power, and so on.

The general conclusion is that the cost of connecting any kind of endpoint, wired or wireless, tends to be in the range of US$1,500 to US$3,000. Furthermore, those costs are not coming down, since they involve primarily labor and overhead. The difference is that in the wireline environment, this cost has to be incurred for every business or residence. With wireless technology, one can potentially share this cost among several customers. This will be considered in more detail in Section 14 ("A spoiler at the broadband party").
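A hypothetical illustration of that sharing (the household counts are arbitrary; the per-access-point cost is taken from the range just quoted):

    # Hypothetical illustration: one wireless access point shared by several households.
    hookup_cost = 2000.0    # US$ per access point, within the US$1,500-3,000 range quoted above
    for households in (1, 5, 10, 20):
        print(f"{households:2d} households sharing -> US${hookup_cost / households:,.0f} per household")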

 

++++++++++

13. Financial markets and the arrival of broadband

How quickly we move on to faster connectivity depends more on the financial markets than on government action. The tax credits, regulatory moves, and even outright subsidies that are being discussed (cf. [36]) are rather modest compared with the money that Wall Street (and Sand Hill Road) can marshal. As a comparison, construction of the new long haul fiber networks in the U.S. cost someplace in the range of US$70 billion to US$100 billion. The US$750 billion figure cited in [39] for the telecom boom years includes all sorts of financing, in both service and supplier sectors, and counts paper deals, in which overvalued shares of one company were swapped for even more overvalued shares of another. The trillion dollar losses that are sometimes mentioned refer to the destruction of fantasy paper profits. The actual sums that were invested in building networks were far more modest: the US$70 billion to US$100 billion mentioned above in long haul, and smaller amounts on CLECs. (This is one area where the Internet and nineteenth century railroads differed substantially [59].)

The rough estimates of the previous section show that the minimum of US$70 billion that was thrown away on long haul networks would suffice to provide broadband solutions to everyone in the U.S., provided it was not necessary to go into homes. Thus if one were to bring fiber to the neighborhood and then use the cable TV provider’s coax or the ILEC's copper, one could surely provide 100 Mb/s (Fast Ethernet) access for the US$600 or so per residence that would be available. On the other hand, FTTH would not be feasible. To install FTTH, we would need not only the approximately US$180 billion to wire up the homes, but also something like US$150 billion to US$250 billion for electronics (although the latter cost is decreasing rapidly, as technology advances). Thus even in the financially giddy atmosphere of the bubble years, and with today’s technology, FTTH almost surely could not have been financed.
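The arithmetic behind these claims can be sketched with the round numbers used earlier (the 110 million households and businesses figure comes from the previous section; treating it as the divisor here is my assumption):

    # Sanity check on the "US$600 or so per residence" figure.
    long_haul_overinvestment = 70e9    # US$, lower end of the long haul construction estimates
    endpoints = 110e6                  # households and businesses, the round figure used earlier
    print(f"US${long_haul_overinvestment / endpoints:,.0f} per endpoint")   # roughly US$640

    # FTTH, by contrast, as estimated in the text:
    wiring = 180e9
    electronics_low, electronics_high = 150e9, 250e9
    print(f"FTTH total: US${(wiring + electronics_low) / 1e9:.0f} to {(wiring + electronics_high) / 1e9:.0f} billion")
    # i.e. roughly US$330 to US$430 billion, several times the bubble-era long haul overinvestment.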

As was discussed in Section 4 ("Telecoms and nineteenth century railroads"), it is likely to be quite a while before another bubble appears, and when it does appear, it may not strike in telecom. However, greed and fear will continue to operate. In particular, advances in technology are making it easier for different sectors of the broadcast and telecommunication industries to encroach on each other’s turf. Furthermore, financial market valuations are likely to force companies to move into other sectors. The result is likely to be substantial upheaval in share prices, and a potentially rapid deployment of broadband.

To substantiate the claim above, consider the current valuations of various companies, in terms of so-called enterprise value (i.e., the sum of the stock market valuation of the shares and the amount of debt, the standard measure of the total value of the entire company) per subscriber, and compare them to replacement costs. As was mentioned in the previous two sections, cellular carriers in the U.S. appear to require about US$1,000 of capital investment for each customer. Their enterprise values seem to be in the range of US$1,500 per customer, a premium over replacement cost, but not a giant one. (At the peak of the bubble these companies were valued in some cases at well over US$5,000 per customer.)

The ILECs appear to be valued at over US$2,000 per line, and there are reports of several sales of large local systems in various parts of the U.S. that fetched over US$3,000 per line [73]. On the other hand, replacement cost is only around US$1,000 per line, as was noted in the previous section.

For cable TV networks, replacement costs are around US$1,500 per subscriber. Their enterprise values appear to be based on valuations of around US$3,500 per subscriber, though.
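Putting the last three paragraphs together, a rough per-subscriber analogue of the Tobin Q discussed next can be computed from the quoted figures (the pairing of enterprise value with replacement cost is mine):

    # Rough per-subscriber valuation-to-replacement ratios implied by the figures above.
    sectors = {
        "wireless": (1500, 1000),   # (enterprise value, replacement cost) per subscriber or line, US$
        "ILEC":     (2000, 1000),
        "cable TV": (3500, 1500),
    }
    for name, (value, replacement) in sectors.items():
        print(f"{name:8s} {value / replacement:.1f}x replacement cost")
    # About 1.5x for wireless, 2x or more for the ILECs, and 2.3x for cable.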

Traditionally, the Tobin Q ratio (of valuations to replacement value for the whole economy) has been below 1. During the bubble years in the U.S., it soared far above that level, to close to 2 (a level also visited by the Japanese markets during their bubble years in the late 1980s). It has now come down, but is still higher than historical norms. That the wireless carriers’ Q ratio is close to that of the general market probably reflects competition. On the other hand, the high ratio for the ILECs and cable TV networks likely reflects the perceived monopoly positions of those enterprises.

As Section 2 ("Making money in telecom the Yellow Pages way") warned, physical plant is playing a smaller role than in the past, and intangibles are more important. The prototypical examples are Microsoft and eBay, which have hardly any physical assets, but high valuations, due to the lock-in effect on their customers. That effect (as well as the difficulty of obtaining orbital slots and arranging deals with content providers) appears to be behind valuations of around US$2,000 per customer for the direct broadcast satellite services (EchoStar, DirecTV). (Their customer acquisition costs, like those of the cable companies and cellular carriers, appear to be someplace in the range of US$200 to US$400, far short of their valuations.) Similar effects seem to have been behind the pricing of AOL shares. AOL at its peak was valued at around US$5,000 per subscriber (although revenues per subscriber were about US$250 per year), presumably reflecting expectations that network effects, first mover advantage, and all the other mantras of the boom would provide a way to derive high profits. Today, financial analysts’ estimates of the market worth of AOL, were it to be spun off from AOL Time Warner, are in the range of US$200 to US$300 per subscriber, close to its customer acquisition costs. This reflects a more sober view of AOL’s prospects.

The dot-com and telecom bubbles appear to have been animated to a large extent by the expectations that the intangibles were in the future going to be the dominant factor, and so "buzz," "mind share," "eyeballs," and similar factors were going to matter much more than actual facilities [51, 59]. The crash has destroyed that illusion for most companies. It is not easy to be a Microsoft or an eBay. In particular, the ILECs and cable TV networks are still primarily in the business of providing very mundane connectivity over expensive physical plant, and valuations much over the replacement values of their plant likely reflect their monopoly positions more than anything else.

Currently the reigning broadband contestants are perceived to be ILECs with DSL, and cable TV networks with cable modems. Not only have these technologies been shown to work at reasonable cost, but the ILECs and cable networks have the financial and organizational resources to provide broadband services to most of the population. So far they have been competing primarily for high speed Internet access. The ILEC forays into entertainment have so far been half-hearted, and the cable companies have not done much in telephony. In Internet access, ILECs have been slow to push DSL, and appear to be finally starting to move primarily because of competition from cable. However, so far the two camps have been coexisting pretty peacefully.

What is likely to disturb the equilibrium is a combination of technological progress and dynamics of the financial markets. Costs of electronic equipment are falling, so the costs of offering high speed Internet access and entertainment over ILEC copper are declining. Similarly, the costs of providing Internet access and voice over cable networks’ coax are decreasing. Thus the long awaited convergence is finally about to make its mark. It comes later than predicted by its enthusiasts, but it appears to be near.

Cable networks have the greater incentive to encroach on ILEC turf. Unlike the ILECs, which have been very profitable, cable has never made much money. Furthermore, in its basic area of delivering entertainment TV, cable is getting squeezed by direct broadcast satellite, which has much more favorable economics, with essentially zero marginal cost of serving an additional subscriber. Just to meet the basic competition from satellite, cable has had to invest in expensive digital upgrades, and yet it is losing subscribers to satellite. Even if it did not have to worry about satellite competition, cable has a fundamental problem. There is no way it can satisfy the rate of return expectations embodied in its US$3,500 per subscriber valuations through entertainment alone. The US$40 or US$50 per month it gets from each subscriber appears to go up only modestly with digital services. There are also advertising revenues, but on the negative side there are also the increasingly heavy costs of the broadcast material. (That is one of the key fallacies of the "content is king" dogma [50]: good content does not come cheap. Telephony customers, on the other hand, provide their own (free) "content," in addition to being willing to pay more for it.) The only way cable can justify its valuations is by selling its subscribers bundles of entertainment, voice, and high speed Internet access. Thus, sooner or later, cable has to go after the ILECs’ bread-and-butter business.

ILECs are in a more comfortable position, in that they are nicely profitable. Their monopoly profits are probably enough to justify their stock valuations, as long as they stay stable. However, the situation is not stable, in that the ILECs are beginning to lose voice customers to wireless carriers. They are also losing some customers to CLECs, but that does not seem a serious threat in the long run. Furthermore, for the reasons mentioned in the previous paragraph, they are likely to start losing voice customers to cable. This will force them to respond. Many ILECs own cellular operations, and so could hope to rely on growth in the wireless arena. However, that is unlikely to be sufficient, and so they are also likely to respond much more aggressively by moving to higher speed DSL that will enable them to offer not only Internet access but also entertainment TV.

The scenario of cable networks and ILECs competing vigorously could provide fast deployment of improved broadband services, including price wars that might significantly accelerate the penetration of this technology. That would be good for consumers and the economy as a whole. There is a basic problem with this scenario, though, namely that the current stock market valuations seem to anticipate that both cable networks and ILECs win. For cable to justify its US$3,500 valuation per subscriber would probably require almost all households in a served area to purchase the complete bundle of entertainment TV, voice, and Internet access. But that would leave no customers for the ILECs, whose valuations appear to anticipate continuing high revenues and profits from their customers. (Moreover, this would require cellular operators to be unsuccessful in capturing the wireline voice business.) Thus there is a high potential for rapid broadband deployment combined with financial setbacks for the ILECs, or cable, or both. This potential is only increased by the possibility of an unexpected entrant making major inroads, as will be discussed next.

 

++++++++++

14. A spoiler at the broadband party

At the height of the telecom boom, much attention was paid to alternative technologies for providing "first mile" connectivity: satellite broadcast, power lines, free space optics, and fixed wireless. None of them succeeded in the marketplace. Hence, in public discourse, the race to provide broadband to the home is thought to have narrowed down to two choices, DSL and cable modems. The focus of public policy debate (as in the FCC decision of February 2003) is on how much monopoly control needs to be given to the providers of these services to motivate them to build out the necessary infrastructure. The hope is that eventually they might be convinced to deploy FTTH.

Yet appearances often deceive, and current common wisdom may be missing a fundamental transformation that may render DSL, cable modems, and FTTH irrelevant. The two main factors behind this surprising possibility are the moderate rate of growth in the public’s appetite for broadband, and the rapid advances in wireless transmission. Put together, these make feasible a totally different future, in which most users would get their broadband connectivity over the radio. Instead of facing a broadband monopoly, they might enjoy a competitive service provider market.

Wireless is progressing rapidly. WiFi, in particular, is a shining bright spot in the telecom sector. Sales are skyrocketing, with the number of units sold more than doubling in 2002, and new notebooks increasingly shipping with built-in cards. Cellular carriers are arguing whether they should embrace WiFi as a synergistic adjunct to their own planned 3G networks, or fight it as a competitive threat.

There is no doubt that WiFi has great allure. Even if one discounts Negroponte’s futuristic "lilypads" vision of a self-organizing mesh of interconnected Wi-Fi islands, this technology does greatly simplify home and office networking, and, through efforts of enterprises such as Boingo and Cometa, might provide at least nomadic computing.

WiFi also has problems, especially in the security, scaling, and business model areas. However, the doubts hanging over WiFi should not obscure a more important fundamental point. Whether WiFi itself succeeds or fades away, it is a harbinger of a new wireless future. Although many fixed wireless projects (such as Teligent, WinStar, and the MMDS and LMDS efforts by major carriers) have gone down in flames, and 3G is being increasingly recognized as a giant mistake, the key factor is that technology is improving relentlessly, with quality rising and costs declining. What failed a few years ago can become a success now. WiFi is the outstanding example of this phenomenon, with PC cards and access points for home use in the US$100 range, and simple enough that most people can install it themselves. Future developments ought to be able to offer even greater functionality.

What is needed is a wireless technology that provides bandwidth of a few tens of megabits per second (all that most consumers will need for a while, given how slowly display technology is changing [56]), a range of a few hundred meters (enough to serve a number of households), and the ability to offer voice (which is where the money will continue to be for quite a while yet, and which is not hard to do when there is enough bandwidth). Once that is available, one could build new wireless services to compete with established wireline ones. Whether such wireless systems would use licensed or unlicensed spectrum is an open question.

Wireless has the major advantage of not having the same economies of scale as wireline. It does not suffer from the same "tyranny of homes passed," in which a carrier’s infrastructure costs are proportional to the number of homes in an area, and not to the number of customers. That is why we typically have just one wireline carrier (or two, where the cable company offers voice), but several cellular carriers. Similarly, we could easily have several competing wireless broadband carriers in the same area.

Where does this leave the wireline carriers, including cable TV networks? If the wireless option is realized, they will be faced with the loss not only of voice but also of traditional video transmissions. Wireless for local connectivity will still require the higher bandwidth of fiber for medium and long distance service, but that is a much smaller wholesale business. (It could also be supplied by new players in telecom, such as electric or gas utilities that already have rights of way, or by municipalities.) Wireless broadband would not even have to gain the lion’s share of the residential customers to be a factor. As long as it had a critical mass that would support at least one carrier in most areas, wireless broadband could exert pricing pressure on its wired competitors. The ILECs and cable networks are likely to hang onto many of their customers, since they can always write down the value of their fixed assets and settle for lower revenues, but that would be disastrous for their shareholders and bondholders. In order to have any hope of hanging onto their retail customers, the wireline carriers will have to exploit the advantage of their higher bandwidth (especially as they push fiber closer to the customer). To do that they will need to develop their customers’ appetite for bandwidth. This will mean abandoning streaming audio and video, and instead marketing the advantages of faster-than-real-time transmission, for local storage and transfer to portable devices, for example. The basic mindset will have to change, from offering a fixed service such as 640 Kb/s DSL to periodically upgrading the connection’s bandwidth.

The dominance of wireless was predicted before, by George Gilder and others [32, 37, 42]. Many predictions were premature, but it appears that the time for wireless is rapidly approaching. We should note that the likely rise of wireless will not mean unbounded wireless bandwidth. It will be the result of the rate of improvement in wireless transmission exceeding the rate of growth in residential demand. If consumers were really interested in and willing to pay for 1 Gb/s connectivity, wireless would not be an option.

The arrival of wireless broadband will be welcomed by those who have been worried about public policy aspects of FTTH. Instead of a monopoly, we may instead enjoy the benefits of a lower cost technology provided by several competitive carriers, a situation that is likely to lead to greater innovation and efficiency.

 

++++++++++

15. Conclusions

Broadband is a great technology. However, it is poorly understood, both as to how best to deliver it, and how it will be used. The case for making major public investments in it is rather questionable. It is not likely to lead to a spurt in economic growth. There are good chances that progress in technology combined with the dynamics of financial markets will lead to relatively rapid spread in the U.S. of more advanced forms of broadband than are available now, even in the absence of government intervention.

What can the government do to promote broadband, if it is determined to do something? Subsidies and tax credits are not likely to have much of an effect. It appears unlikely that Washington could find enough money to make a big difference. In general, as is shown very conclusively in [21], it is hard for the government to get carriers to do things that they do not want to do on their own. As Bruce Kushnick has been pointing out, the ILECs did get various types of rate relief in the mid-1990s in return for a promise to deploy broadband, a promise they failed to deliver on. Deregulation of the ILECs, advocated by some (cf. [70]), is similarly of doubtful efficacy, and could work against broadband by diminishing competitive threats.

Instead, let me suggest three other methods for stimulating broadband, one intriguing but totally impractical, one very practical but incremental, and one speculative.

The impractical method for stimulating broadband adoption is to make music free on the Internet. Currently, music file sharing appears to be one of the main drivers behind the spread of broadband. (It is certainly among the main generators of traffic.) Instead of using the law to choke file swapping, perhaps we should encourage the telecom industry to buy out the music studios, as was suggested in [50]. Recorded music sales in the U.S. come to about US$15 billion per year, while telecom spending is over 20 times higher. Moreover, of that US$15 billion, only about half goes to the studios. Thus in the abstract, it might be a wise investment for the phone companies to buy out the studios. This is of course wildly impractical for business and legal reasons, but it would quickly stimulate demand for broadband. (It would also demonstrate that the content tail should not be wagging the telecom dog, as it too often does in political, legal, and business discussions.) A slightly more practical method would be for the government to enact a compulsory licensing scheme that would have a similar effect. However, given all the concerns about fairness and consensus, it is doubtful the government could come up with an acceptable scheme fast enough to do much good.
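The rough arithmetic behind this suggestion, using the figures just quoted:

    # The rough comparison behind the "buy out the studios" thought experiment.
    music_sales = 15e9                    # US$ per year, total U.S. recorded music sales
    studio_share = 0.5                    # roughly half goes to the studios
    telecom_spending = 20 * music_sales   # "over 20 times higher", i.e. over US$300 billion per year

    print(f"studios' annual take: US${music_sales * studio_share / 1e9:.1f} billion")
    print(f"telecom spending:     over US${telecom_spending / 1e9:.0f} billion")
    # The studios' entire revenue stream amounts to only a few percent of telecom spending.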

A more practical method for stimulating broadband is to encourage migration of voice calls to cell phones. With their bread-and-butter business declining rapidly, the ILECs would then have to exploit the competitive advantage of wired links by promoting broadband connectivity. This migration could be sped up by forcing the ILECs to spin off their wireless subsidiaries, to prevent cross-subsidization and encourage competition. The cellular operations are already run almost as separate businesses, so there would be little of the problem of unclear boundaries that bedevils other proposals, such as the proposal to separate the ILECs into basic connectivity providers and service providers. Making more spectrum available for cellular would also promote the move of voice telephony to radio channels.

Finally, the third technique for stimulating broadband is to encourage innovative new wireless technologies. This could include both conventional and ultra wideband technologies, and both licensed and unlicensed approaches. It would require making substantial additional spectrum available for wireless. The advantages of wireless include not only the potential of lower costs, but also the prospect of having multiple local carriers providing "first mile" connectivity.

 

About the Author

Andrew Odlyzko is Director of the interdisciplinary Digital Technology Center and an Assistant Vice President for Research at the University of Minnesota. Prior to assuming that position in 2001, he devoted 26 years to research and research management at Bell Labs and AT&T Labs. He has written over 150 technical papers and has three patents. He has managed projects in diverse areas, such as security, formal verification methods, parallel and distributed computation, and auction technology. In recent years he has also been working on electronic publishing, electronic commerce, and economics of data networks. All his recent papers as well as further information can be found on his home page at http://www.dtc.umn.edu/~odlyzko.

 

Acknowledgements

Parts of this paper appeared previously in [54, 55, 57].

I thank Rolf Engstrand, Bob Frankston, Jim Gray, Steve Kamman, Dave Schaeffer, Tom Schmidt, Richard Shockey, and Bill St. Arnaud, for comments and corrections to an earlier draft.

 

Notes

1. I. Aizu, "Comparative study of broadband in Asia: Deployment and policy," Asia Network Research report, third draft, 29 September 2002, at http://www.anr.org/web/html/output/2002/bbasia0929.pdf.

2. Anonymous, 1825. Facts and Arguments in Favour of Adopting Railways in Preference to Canals, in the State of Pennsylvania. Philadelphia: W. Fry; reprint edition, New York: Arno Press, 1970.

3. "Extending the Information Revolution," Athena Alliance white paper, February 2002, at http://www.athenaalliance.org/pdf/AA202WhitePaper.pdf.

4. P.S. Bagwell, 1974. The Transport Revolution from 1770. New York: Harper & Row.

5. WebSiteOptimization.com, 2003. "Bandwidth Report" (June), at http://www.websiteoptimization.com/bw/0306/.

6. K. Belson and M. Richtel, 2003. "America’s broadband dream is alive in Korea," New York Times (5 May), and at http://news.com.com/2100-1034_3-999695.html.

7. D.K. Berman and R. Frank, 2002. "Qwest bankruptcy talk eases amid deal for directories unit," Wall Street Journal(21 August).

8. J. Carlton, 2003. "Sun and wind will be sources for more power in next decade," Wall Street Journal (19 June), and at http://www.bluefish.org/sunwindb.htm.

9. E.D. Chattaway, 1855-1856. Railways: Their Capital and Dividends, with Statistics of Their Working in Great Britain and Ireland, &c., &c. London: John Weale.

10. K.G. Coffman and A.M. Odlyzko, 1998. "The size and growth rate of the Internet," First Monday, volume 3, number 10 (October), at http://firstmonday.org/issues/issue3_10/coffman and also at http://www.dtc.umn.edu/~odlyzko.

11. K.G. Coffman and A.M. Odlyzko, 2002. "Internet growth: Is there a "Moore’s Law" for data traffic?," In: J. Abello, P.M. Pardalos, and M.G.C. Resende (editors). Handbook of Massive Data Sets. Dordrecht: Kluwer Academic, pp. 47-93, and at http://www.dtc.umn.edu/~odlyzko/doc/internet.moore.pdf.

12. K.G. Coffman and A.M. Odlyzko, 2002. "Growth of the Internet," In: I.P. Kaminow and T. Li (editors). Optical Fiber Telecommunications IV B: Systems and Impairments. San Diego: Academic Press, pp. 17-56, and at http://www.dtc.umn.edu/~odlyzko/doc/oft.internet.growth.pdf.

13. CTIA, "CTIA’s semi-annual wireless industry survey results, June 1985-December 2002," at http://www.wow-com.com/industry/stats/surveys/.

14. M. Cooper, 2002. "What ails broadband?," IEEE Spectrum volume 39, issue 9 (September), pp. 15-16, and at http://www.spectrum.ieee.org/WEBONLY/resource/sep02/speak2.html.

15. L.F. Darby, J.A. Eisenach, and J.S. Kraemer, 2002. "The CLEC experiment: Anatomy of a meltdown," September 2002 report from the Progress & Freedom Foundation, at http://www.pff.org/publications/POP9.23CLEC.pdf

16. P.A. David, 1969. "Transport innovation and economic growth: Professor Fogel on and off the rails," Economic History Review (second series), volume 22, number 3 (December), pp. 506-525.

17. K.J. Delaney, 2003. "Telephone carriers pitch fixed-line text messaging," Wall Street Journal (29 May).

18. J. Bradford DeLong, 2002. "Productivity growth in the 2000s," April 2002 report, available at http://www.j-bradford-delong.net/movable type/2003 archives/000275.html.

19. U.S. Department of Commerce, Office of Technology Policy, 2002. "Understanding broadband demand: A review of critical issues," 23 September report, at http://www.ta.doc.gov/reports/TechPolicy/Broadband 020921.pdf.

20. DSL Prime electronic newsletter, Dave Burstein (editor), archived at http://www.dslprime.com/.

21. G. Faulhaber, in press. "Policy-induced competition: The telecommunications experiments," Information Economics & Policy, at http://rider.wharton.upenn.edu/~faulhabe/Policy-InducedCompetition.pdf.

22. U. S. Federal Communications Commission, 2003. "Telecommunications Industry Revenues 2001," March report, at http://www.fcc.gov/wcb/iatd/stats.html.

23. United States Federal Communications Commission, 2003. "High-Speed Services for Internet Access: Status as of December 31, 2002," June report, at http://www.fcc.gov/wcb/iatd/stats.html.

24. P.C. Fishburn and A.M. Odlyzko, 1998. "Dynamic behavior of differential pricing and Quality of Service options for the Internet," In: Proceedings of the First International Conference on Information and Computation Economies (ICE-98). New York: ACM Press, pp. 128-139, and at http://www.dtc.umn.edu/~odlyzko/doc/differential.pricing.pdf.

25. R.W. Fogel, 1964. Railroads and American Economic Growth: Essays in Econometric History. Baltimore: Johns Hopkins University Press.

26. S. Helms, 2002. "Recapturing an expensive resource: Bandwidth," Cable Datacom News (1 October), at http://www.cabledatacomnews.com/oct02/oct02-6.html.

27. J. C. Herz, 2002. "The bandwidth capital of the world," Wired, volume 10, number 8 (August), pp. 90-97, and at http://www.wired.com/wired/archive/10.08/korea.html.

28. "IEEE-USA Board of Directors statement," February 2003, available at http://www.ieeeusa.org/FORUM/POSITIONS/broadband.html.

29. International Telecommunications Union, Promoting Broadband Web site, http://www.itu.int/osg/spu/ni/promotebroadband/index.html.

30. M. Jander, 2001. "Optical Oracle: More Carrier Cutbacks," Light Reading (15 November), at http://www.lightreading.com/document.asp?site=lightreading&doc id=9699.

31. M. Jander, 2003. "Rural telcos: The steady hand," Light Reading (16 April), at http://www.lightreading.com/boardwatch/document.asp?doc id=31422.

32. J.H. Johnston and J.H. Snider, 2003. "Breaking the chains: Unlicensed spectrum as a last-mile broadband solution — Issue brief," New American Foundation white paper (June), at http://www.newamerica.net/Download Docs/pdfs/PubFile1250_1.pdf.

33. E. Larson, 2002. "IDC: Telco data boom underway," Light Reading (31 January), at http://www.lightreading.com/document.asp?doc_id=11402.

34. L. Lee, 2003. "Online grocers: Finally delivering the lettuce," Business Week (28 April), and at http://www.businessweek.com/magazine/content/03_17/b3830074.htm.

35. J.C.R. Licklider, 1965. Libraries of the Future. Cambridge, Mass.: MIT Press.

36. Office of Senator Joseph I. Lieberman, 2002. "Broadband: A 21st Century Technology and Productivity Strategy," May 2002 white paper, at http://www.senate.gov/~lieberman/press/02/05/broadband.pdf.

37. A. Lightman, 2002. Brave New Unwired World: The Digital Big Bang and the Infinite Internet. New York: Wiley.

38. C. Lu, 1998. The Race for Bandwidth: Understanding Data Transmission. Redmond, Wash.: Microsoft Press.

39. O. Malik, 2003. Broadbandits: Inside the $750 Billion Telecom Heist. New York: Wiley.

40. T.J. Mullaney with H. Green, M. Arndt, R.D. Hof, and L. Himelstein, 2003. "The E-biz surprise: It wasn’t all hype. For companies as well as consumers, e-commerce is hotter than ever," Business Week (12 May), and at http://www.businessweek.com/magazine/content/03_19/b3832601.htm.

41. Committee on Broadband Last Mile Technology, Computer Science and Telecommunications Board, Division on Engineering and Physical Sciences, National Research Council, 2002. Broadband: Bringing Home the Bits. Washington, D.C.: National Academy Press, and at http://www.nap.edu/books/0309082730/html/.

42. W.R. Neuman, L. McKnight, and R.J. Solomon, 1997. The Gordian Knot: Political Gridlock on the Information Highway. Cambridge, Mass.: MIT Press.

43. A.M. Odlyzko, 1997. "The slow evolution of electronic publishing," In: A.J. Meadows and F. Rowland (editors). Electronic Publishing ’97: New Models and Opportunities: Proceedings of an ICCC/IFIP conference held at the University of Kent at Canterbury, England, 14-16 April 1997. Washington, D.C.: ICCC Press, pp. 4-18, and at http://www.dtc.umn.edu/~odlyzko/doc/slow.evolution.pdf.

44. A.M. Odlyzko, 1998. "The economics of the Internet: Utility, utilization, pricing, and Quality of Service," unpublished manuscript, at http://www.dtc.umn.edu/~odlyzko/doc/internet.economics.pdf.

45. A.M. Odlyzko, 1998. "Smart and stupid networks: Why the Internet is like Microsoft," ACM netWorker, volume 2, number 5 (December), pp. 38-46, and at http://www.dtc.umn.edu/~odlyzko/doc/stupid.networks.pdf.

46. A.M. Odlyzko, 1999. "Data networks are mostly empty and for good reason," IT Professional, volume 1, number 2 (March/April), pp. 67-69, and at http://www.dtc.umn.edu/~odlyzko/doc/high.network.cost.pdf.

47. A.M. Odlyzko, 1999. "The current state and likely evolution of the Internet," Proceedings, Globecom’99, pp. 1869-1875, and at http://www.dtc.umn.edu/~odlyzko/doc/globecom99.pdf.

48. A.M. Odlyzko, 2000. "The Internet and other networks: Utilization rates and their implications," Information Economics & Policy, volume 12, pp. 341-365 (presented at the 1998 Telecommunications Policy Research Conference), and at http://www.dtc.umn.edu/~odlyzko/doc/internet.rates.pdf.

49. A.M. Odlyzko, 2000. "The history of communications and its implications for the Internet," 2000 unpublished manuscript, at http://www.dtc.umn.edu/~odlyzko/doc/history.communications0.pdf.

50. A.M. Odlyzko, 2001. "Content is not king," First Monday, volume 6, number 2 (February), http://firstmonday.org/issues/issue6_2/odlyzko/, and at http://www.dtc.umn.edu/~odlyzko/doc/recent.html.

51. A.M. Odlyzko, 2001. "The myth of Internet time," Technology Review (April), pp. 92-93, and at http://www.dtc.umn.edu/~odlyzko/doc/internet.time.myth.txt.

52. A.M. Odlyzko, 2001. "Internet pricing and the history of communications," Computer Networks, volume 36, pp. 493-517, and at http://www.dtc.umn.edu/~odlyzko/doc/history.communications1b.pdf.

53. A.M. Odlyzko, 2001. "Talk, Talk, Talk: So who needs streaming video on a phone? The killer app for 3G may turn out to be — surprise — voice calls," Forbes (20 August), p. 28, and at http://www.dtc.umn.edu/~odlyzko/doc/3g.accidental.success.txt.

54. A.M. Odlyzko, 2002. "Roxane Googin’s predictions and the telecom world," Cook Report on the Internet, volume 11, numbers 1-2 (April-May), pp. 53-58, and at http://www.dtc.umn.edu/~odlyzko/doc/googin.ilec.txt.

55. A.M. Odlyzko, 2002. "Comments on 'Solving the broadband paradox' by Adam D. Thierer," letter to the editor, Issues in Science and Technology, volume 18, number 4 (Summer), pp. 7-8, and at http://www.dtc.umn.edu/~odlyzko/misc/index.html.

56. A.M. Odlyzko, in press. "Internet TV: Implications for the long distance network," In: E. Noam, J. Groebel, and D. Gerbarg (editors). Television over the Internet. Mahwah, N.J.: Lawrence Erlbaum Associates, (proceedings of workshop held at Columbia University in November 2000), and at http://www.dtc.umn.edu/~odlyzko/doc/tv.internet.pdf.

57. A.M. Odlyzko, 2003. "False hopes," Red Herring, number 123 (March), p. 31, and at http://www.dtc.umn.edu/~odlyzko/doc/redh.false.hopes.txt.

58. A.M. Odlyzko, in press. "Internet traffic growth: Sources and implications," invited paper for ITCOM 2003, to appear in the proceedings, and at http://www.dtc.umn.edu/~odlyzko/doc/itcom.internet.growth.pdf.

59. A.M. Odlyzko, "The railroads and the Internet," manuscript in preparation.

60. OECD, 2001. "The development of broadband access in OECD countries," DSTI/ICCP/TISP(2001)2/FINAL report (29 October), at http://www.oecd.org/pdf/M00020000/M00020255.pdf.

61. OECD, "OECD Communications Outlook 2003," electronic and paper copies can be purchased through http://www.oecd.org/.

62. Oftel Market Information Updates, available at http://www.oftel.gov.uk/publications/market info/index.htm.

63. L. Perdue, 2002. Eroticabiz: How Sex Shaped the Internet. San Jose, Calif.: Writers Club Press, and at http://www.eroticabiz.com/.

64. Pew Internet report, "The Ever-Shifting Internet Population: A new look at Internet access and the digital divide," at http://www.pewinternet.org/reports/toc.asp?Report=88.

65. B. St. Arnaud, 1997. "The future of the Internet is NOT multimedia," Network World (November), at http://www.canarie.ca/~bstarn/publications.html.

66. K. Scannell, 2002. "To buyout artists seeking green, Yellow-Pages units look golden," Wall Street Journal (21 August).

67. B. Stuck and M. Weingarten, 2002. "A tale of two decisions," Business Communications Review (July), pp. 47-48, and at http://signallake.com/publications/.

68. E.A. Taub, 2003. "Ease of paperless e-mail sidelines the forlorn fax," New York Times (13 March).

69. "TechNet CEOs call for national broadband policy," 15 January 2002 press release, at http://www.technet.org/news/newsreleases_/2002-01-15.62.html.

70. A.D. Thierer, 2002. "Solving the broadband paradox," Issues in Science and Technology (Spring), pp. 57-62, and at http://www.nap.edu/issues/18.3/thierer.html.

71. L.K. Vanston, 2002. "Residential broadband forecasts," Technology Futures, Inc., white paper, at http://www.tfi.com/pubs/whitepapers/pdf/ti_broadband.pdf.

72. H.R. Varian, "Estimating the demand for bandwidth," at http://www.sims.berkeley.edu/simhal/Papers/wtp/wtp.html#learn99.

73. Verizon press release, reported at Light Reading (3 September 2002), at http://www.lightreading.com/document.asp?doc_id=20589.

74. M. Weingarten and B. Stuck, 2002. "CLECs: The view post-bankruptcy," Business Communications Review (May), at http://signallake.com/publications/.


Editorial history

Paper received 15 July 2003; accepted 31 July 2003.



Copyright ©2003, First Monday

Copyright ©2003, Andrew Odlyzko

The many paradoxes of broadband by Andrew Odlyzko
First Monday, volume 8, number 9 (September 2003),
URL: http://firstmonday.org/issues/issue8_9/odlyzko/index.html