The
New York Times ran an
editorial this week on net neutrality. Granted, the piece doesn't add anything new to
the debate that's raging between the neutrality advocates and corporate defenders like
Verizon's CEO, but the newspaper with the largest circulation weighing in on what was once an obscure debate is, well, pretty cool. The
Times concluded,
If access tiering takes hold, then Internet providers, rather than consumers, could become the driving force in how the Internet evolves. Those corporations' profit-driven choices, rather than users' choices, would determine which sites and methodologies succeed and fail. They also might be able to stifle promising innovations, like Internet telephony, that compete with their own business interests.
The editorial followed a Feb. 7th Congressional Hearing on Net Neutrality in which a number of corporate representatives, Congresspeople, and law professors testified. The NYTimes editorial articulates the gist of the argument for network neutrality. But can regulation be promoted on the basis of an assertion that access tiering "might be able to stifle promising innovations"? That seems like a fairly loose basis for legislation that regulates the ISP market. As Kyle McSlarrow, President and CEO of the National Cable & Telecommunications Association, testifies,
Government, by its nature, is ill-equipped to make judgements about the best business models for an industry... in the absence of any problem calling for a legislative solution - and since the broadband services marketplace is characterized by robust competition - Congress should refrain from premature legislative action and allow the marketplace to continue to grow and change so network and applications providers can offer consumers the fullest range of innovative service options.
...hmmm, ok, McSlarrow's words make sense right up until "robust competition." And THIS is where the serious debate is, and where I think lessons from the past are most striking. The net neutrality side now articulates their case in a way that echoes Doc Searls'
Saving the Net article, which is to say the focus is on the economic, not the philosophical. This shift suggests that there will be a lot more direct clash between the sides. Lawrence Lessig, one of the invited panelists, concisely summarized the issue of tiering (whether it be access tiering or consumer tiering):
Consumer-tiering, however, should not discriminate among content or application providers. There's nothing wrong with network owners saying "we'll guarantee fast video service on your broadband account." There is something wrong with network owners saying "we'll guarantee fast video service from NBC on your broadband account." And there is something especially wrong with network owners telling content or service providers that they can't access a meaningful broadband network unless they pay an access-tax... I mean "wrong" in the sense that such a policy will inevitably weaken application competition on the Internet, and that in turn will weaken Internet growth.
This is a fairly intuitive argument. Because the internet is an "end-to-end" network, any application can run on it. That means that for every successful application (VoIP, email, instant messaging, chat rooms, community boards, even fantasy football) there have been dozens, if not hundreds, of attempts leading up to the products we have now. And the way these products became BETTER was that the free nature of the internet allowed CONSTANT INNOVATION. For example,
Wired Magazine named Firefox its product of the year, noting its exceptional rise in popularity. The force behind such a brilliant product? User feedback, both from average users and tech gurus.
But if AOL or another ISP were able to access-tier, the economics of the internet change. No longer is innovation free; in fact, the more users a product attracts, the more that product costs. While that may make sense in the long run, there is no long run if products can't get off the ground.
That's the stifled-innovation-and-competition argument. The response, as articulated by Gregory Sidak, visiting professor at Georgetown University Law Center, is that
the overarching reason why anticompetitive behavior of any sort is implausible is that competition will constrain the market power of any given carrier. In most geographic markets, four or more separate firms will supply broadband Internet access.
Now, put into context, this conclusion comes after a long analysis of the traditional economic reasons that ISPs ought to be allowed to alter their business models as they see fit. Essentially it boils down to: 1) ISPs bear the costs if Google launches a video service that takes up more bandwidth, so 2) Google should have to pay extra. (The reverse, of course, is that ISPs grow because of content like Google's, so ISPs should pay for content.) And Prof. Sidak's testimony is logical, reasonable, and based on sound economic models (their applicability to internet business models, however, is questionable). If Sidak is correct in his analysis, Google won't have to worry much about paying that access tax, because any ISP that tried to tax it would likely lose its position in the ISP market over time.
...and there's the rub. The two sides VEHEMENTLY disagree on whether or not sufficient competition exists in the market. This was the battle cry
Saving the Net sounded. Kyle Dixon, Senior Fellow and Director of the Federal Institute for Regulatory Law & Economics at The Progress & Freedom Foundation, argues,
The FCC reports that nearly all zip codes are served by at least one broadband provider, and a solid majority is served by several... There is no single, dominant broadband network provider and none seems likely to emerge in the immediate future. Instead, cable and phone companies vie to expand their respective, substantial market shares and to defend against wireless and other firms who hope to use less established technologies to enter new markets and expand existing footholds.
...well, not really. The section of the
FCC report that Dixon is citing reads:
The Commission's data collection program also requires service providers to identify each zip code in which the provider has at least one high-speed service subscriber. As of December 31, 2004, subscribers to high speed services were reported in 95% of the nation's zip codes. In 83% of the nation's zip codes more than one provider reported having subscribers.
So there is no single, dominant provider. I think most people would agree with that... in fact there are TWO dominant providers, AOL/Time Warner and AT&T/MCI (see
Saving the Net). Which means that "several" may be a stretch. And this idea about wireless and other forms of service is contradicted by the very same report he cites (I'll get to that later). So Dixon's argument sort of falls flat when considered in context... and I have NO idea what Prof. Sidak bases his assumption of "at least four providers" being available to every household on. Vint Cerf, the ubiquitous voice of internet evangelism and Google's mouthpiece, slammed the opposing side's analysis:
Most American consumers today have few choices for broadband service. Phone and cable operators together control 98 percent of the broadband market, and only about half of consumers actually have a choice between even two providers. Unfortunately, there appears to be little near-term prospect for meaningful competition from alternative platforms.
...and
In 2004, the Commission reported that only 53 percent of Americans have a choice between cable modem service and DSL service. Of the remaining consumers, 28 percent have only one choice, and 19 percent have no choice at all. Thus, nearly half of all consumers lack meaningful choice in broadband providers.
Cerf's footnotes are, however, conspicuously absent when it comes to verifying these fun factoids. And after looking through the 2004 FCC report, I'm not sure where the 53 percent figure comes from (they do, after all, note that in 73% of zip codes more than one company offers services, although I suppose a provider might only cover part of a zip code). Given the half-truths on both sides, it appears that while some competition exists, there are huge barriers to the robust marketplace that the anti-regulation advocates claim. One point Vint Cerf makes, grounding this in a seemingly inescapable reality, is that broadband provision requires an enormous infrastructure, a characteristic that even ISP management admits prevents small-company competition. Earl Comstock, President and CEO of CompTel, clarifies the economic side of the debate by hammering on this issue:
The FCC's reversal is predicated on a flawed assumption, namely that the barriers to entry for transmission networks are so low that anyone who wants to compete can build their own network. Nothing is further from the truth. The truth is that all three of the ubiquitous wired networks - telephone, cable, and power - were built in a monopoly environment... The FCC points to wireless and powerline operators (both of which have significant facilities) as potential competitors. But an examination of the facts regarding broadband over powerline (BPL) and wireless make clear they are not real competitive threats for the foreseeable future... Nowhere in the world are BPL or wireless being commercially used as the primary means for data or video communications. In the US, the latest FCC report on broadband shows that wireless, BPL, and satellite account for less than 3 percent of the market, and their share of the market is actually declining.
So the economic model that Sidak is so aggressive in applying doesn't quite work... unfortunately, neither side strikes me as having a strong foundation for their claims of competitiveness or lack thereof. In the next post, I'll do some research to try and explore the issue of broadband competition a little more...
...But before I end this post, I want to note one other claim Cerf makes that worries me. He writes
Network neutrality need not prevent anyone - carriers or applications providers - from developing software solutions to remedy end users' concerns such as privacy, security, and quality of service. The issue arises where the network operator decides to place the functionality in the physical or logical layers of the network, rather than in the application layer where they belong.
Putting faith in carriers or application providers to "remedy" (remedy?) privacy, security, and quality concerns might sound reasonable, but restricting such fixes to the application layer creates a lot of insecurity. That's exactly the problem that Jonathan Zittrain writes about in
"Without a Net" (my coverage
here).