The net neutrality debate reached a fever pitch last week when the D.C. Circuit heard oral arguments in Verizon v. FCC. Although many pundits have predicted what the appeals court will do, let’s search instead for an instructive lesson for reforming the FCC, something that policy wonks on all sides of the debate agree is necessary.
For years, I have been peddling a “compromise” on net neutrality between the folks who want to level the playing field for websites (or “edge providers”) and the folks who want to turn down the lights at the FCC.
Before explaining the idea, a quick backgrounder is in order: In December 2010, the FCC issued its Open Internet Order, which effectively proscribed certain practices by Internet service providers (ISPs), including selectively blocking traffic and contracting for priority delivery with websites. Rather than imposing an outright ban on “pay for priority” contracts, the FCC sternly warned ISPs that “as a general matter, it is unlikely that pay for priority would satisfy the ‘no unreasonable discrimination’ standard.” Put differently, such arrangements would presumptively violate the FCC’s new “non-discrimination” rule, and the burden would be on the ISP to reverse that presumption if it was ever foolish enough to try such a thing.
Of course, these rules have nothing to do with discrimination in the classic sense—that is, treating someone or something differently on the basis of some exogenous attribute (such as age, race, or lack of affiliation). For example, under the FCC’s Open Internet Order, if Time Warner (an ISP) entered into a pay-for-priority arrangement with Sony (a website) to support a Sony online-gaming application, that contract would presumptively violate the FCC’s “non-discrimination” rules even if Time Warner stood ready to extend the same economic terms to all comers. Calling these rules the “zero-price rule” or the “no-economic-relation” rule would have been more accurate, but less politically appealing.
Flash forward three years to last week’s hearing, and the FCC’s attorneys were arguing that pay for priority was tolerated under the order. Huh? Competition Law 360 ran an article appropriately titled “FCC Switches Gears In D.C. Circuit Net Neutrality Fight.” In response to the FCC’s surprise claim that ISPs were free to strike pay-for-priority deals, Judge Tatel exclaimed: “I thought the whole purpose of the rule was so edge providers would have free and unfettered access.” And Judge Silberman asked how an ISP could charge websites for priority but have no recourse if the website refused to pay.
Setting aside the legal niceties raised by Verizon’s challenge, the best outcome for consumers would be a rule that permits ISPs to contract for priority delivery but polices such arrangements for discriminatory conduct (say, in favor of an affiliated website) on a case-by-case basis. Under this ex post approach, priority arrangements would be presumptively legal, and the burden of overturning that presumption would fall on the complaining website. It bears noting that the FCC uses the exact same framework to adjudicate discrimination complaints in the cable video space.
Like any compromise, this solution would entail some sacrifice on both sides: Nascent websites would be forced to pay up if they wanted the same quality of service as their (paying) competitors, and ISPs would be prevented from playing favorites. It is hard to identify any “losers” in an economic sense, except perhaps those websites that preferred a free ride.
The consumer benefits of this ex post approach would be enormous: Investments would likely pour into real-time applications such as telemedicine, distance learning, and interactive games, which could exploit the priority services. Users could enjoy these apps as they were originally intended. And ISPs could use the new revenues to extend or enhance their networks.
Another benefit of this approach is that it appears to solve at least one of the FCC’s legal predicaments. Judge Tatel warned during his questions last week that the Open Internet Order’s “non-discrimination” rule did not afford “any room for negotiation” between ISPs and websites. And Judge Silberman said, “As a matter of law, this [non-discrimination] rule requires some level of access, which runs afoul of common carriage.” By permitting pay for priority contracts, the ex post approach would provide ISPs the very flexibility that is needed to get them out of this mess.
So how can this grand compromise come about? The D.C. Circuit will likely vacate the “non-discrimination” portion of the order and send it back to the FCC. At that point, the FCC will have an opportunity to revise its rules along the lines of what I’ve described here. With any luck, the agency will recognize that the ex post approach is a template for much more than fixing its overreach in the Open Internet Order. Forsaking broad proscriptions in favor of regulatory tolerance toward new business relations combined with case-by-case review is the path forward in the new digital era.
Today the Senate will convene a distinguished panel of experts to discuss the state of wireless competition in America. Although it is trendy among the cognoscenti to complain about the wireless industry, the reality is that wireless competition is vibrant here, and U.S. carriers are leaving their European counterparts in the dust.
A common refrain among those calling for regulators to “level the playing field” is that two carriers—AT&T and Verizon—are running away from the pack, due to their allegedly superior spectrum holdings. The resulting imbalance in competition can be remedied, they claim, by capping the spectrum holdings of the larger carriers and steering newly available spectrum to smaller carriers. Any relative improvement in the smaller carriers’ networks would attract more customers, which would reduce wireless concentration.
One problem with this story is that wireless concentration—a very fuzzy indicator of competition when it comes to wireless services—is not climbing as predicted. In fact, U.S. wireless concentration as measured by the FCC has held steady since 2008, indicating that Sprint and T-Mobile are not losing ground. Indeed, 2012 was a particularly good year for these carriers, as both enjoyed significant subscriber gains. T-Mobile recently completed its merger with MetroPCS, giving the combined company access to more subscribers and more spectrum.
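The concentration measure at issue is the Herfindahl-Hirschman Index (HHI), the sum of squared market shares. A minimal sketch of the calculation, using purely hypothetical subscriber shares (not the FCC’s actual figures), shows why modest share shifts barely move the index:

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    return sum(s ** 2 for s in shares_pct)

# Hypothetical national subscriber shares, for illustration only
shares_2008 = [30, 28, 18, 12, 12]   # four national carriers plus a fringe
shares_2012 = [32, 30, 16, 12, 10]

print(hhi(shares_2008))  # 2296
print(hhi(shares_2012))  # 2424
```

Even with the two leaders gaining a couple of points each, the index moves only slightly—consistent with the observation that concentration has “held steady.”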
Perhaps the best indicator of the smaller carriers’ prospects is the bidding war for Sprint that has erupted between Softbank and Dish Network. If Sprint stood no chance to compete with AT&T and Verizon due to its allegedly inferior spectrum, then these savvy investors would not be so bullish about Sprint’s future. Put differently, Sprint’s spectrum holdings are valued dearly in the marketplace despite their “high-frequency” nature.
The same voices calling for intervention will likely cite lower wireless prices in Europe as proof that reducing concentration will bring lower prices. But a new study by GSMA, a trade association representing 800 of the world’s mobile operators, concludes that “Europe now lags far behind the United States in the deployment of next-generation mobile technologies and the advanced services made possible through mobile,” rendering any straight-up price comparison unreliable. The study found that U.S. mobile customers consume five times more voice minutes and nearly twice as much data as their European counterparts, and average mobile data connection speeds in the United States are now 75 percent faster than those in Europe.
By convening a panel on the state of wireless competition, the Senate must be careful not to miss the forest for the trees. The phrase “wireless competition” implies incorrectly that wireless carriers compete exclusively among themselves. New data suggests that wireless competes increasingly with wireline connections such as cable modem and DSL for broadband customers. According to a consumer survey by Leichtman Research Group, hundreds of thousands of Americans canceled their home Internet service in 2012, taking advantage of the proliferation of Wi-Fi hot spots and fast new wireless networks accessible to smartphones and tablets. Indeed, more U.S. households stopped paying for home Internet subscriptions (and relied on wireless access instead) than cancelled their pay-television subscriptions (and relied on video over Internet services).
How quickly will wireless overtake wireline broadband connections? Dish’s chairman is projecting that as many as a third of all Americans one day could find it more efficient to get their home Internet service wirelessly; Cisco IBSG recently projected that up to 15 percent of U.S. consumers could “cut their cord” in favor of a mobile data connection by 2016; and Samsung recently predicted that mobile networks could supplant wireline broadband by 2020.
The oncoming battle between wireless and wireline Internet providers suggests a more permissive attitude toward wireless concentration. For those who can’t (or won’t) recognize this “inter-modal competition,” any increase in wireless concentration is mistakenly perceived as bad news for consumers. The quest to promote wireless competition via spectrum policy could result in less competition where it matters most.
Since the publication of Susan Crawford’s book on the alleged failings of U.S. Internet policy, several mainstream outlets have run stories repeating her mantra that Internet speeds are too slow, coverage is shoddy, there is a growing “digital divide” among rich and poor, and broadband prices are too high.
Consider the barrage of “bad news” in just one week:
- The Wall Street Journal reported that six percent of Americans “lack high-speed service” in a story provocatively titled “Gaps Persist in High-Speed Web Access”;
- The Financial Times reported that the United States ranks 16th in Internet speeds, and that U.S. prices on a per-megabit-per-second basis (Mbps) are more than double those in Europe; and
- Digital Trends ran an article touting Ms. Crawford’s policies titled “Admit It: U.S. Internet Service Sucks.”
Are things as gloomy as the naysayers claim? A close look at the facts suggests otherwise. (Yes, that is a link to Need for Speed, my new e-book on Internet policy from Brookings Press; if Bob Woodward can shamelessly promote his book in the Washington Post when reporting the origins of the sequester, surely I can do the same.)
Let’s start with connection speeds. According to Akamai, a global provider of Internet services, the United States ranked ninth in average connection speeds (7.7 Mbps) in the third quarter of 2012, and seventh in percent of Internet connections with speeds above 10 Mbps (18 percent). South Korea leads both categories (average speed of 14.7 Mbps, 52 percent above 10 Mbps). It’s a bit misleading to compare our speeds with those of the fastest country in the world; a seven-minute-per-mile runner looks shoddy compared to the fastest runner in the world. And like any average, our nationwide average speed combines fast connections with slow ones. For example, the average connection speed in eight states (mostly along the densely populated Northeast corridor) exceeds 9 Mbps; any of those states would rank third fastest in the world on Akamai’s list. It’s a bit of a stretch to say that we are the tortoise among rabbits; the United States is more like Danica Patrick, who finished eighth at Daytona on Sunday.
Moving on to coverage gaps. The empirical basis for the share of Americans without “high-speed service” is the FCC’s annual report on the state of broadband deployment. There are two important caveats to keep in mind when assessing these data: The FCC counts wireline connections only, and only those wireline connections that exceed 4 Mbps. Thus, a wireline connection of say 3 Mbps (such as DSL) would not be counted in the FCC’s tally, and a wireless connection of say 10 Mbps (such as 4G LTE) would also be ignored. As of 2011, the latest year for which the FCC has reliable data, only about 7 million U.S. households did not have broadband access; if wireless broadband technologies are counted, the number of households without access to broadband at the FCC’s minimum speed is in the range of 2 to 5 million. It is hyperbole to suggest that broadband operators have ignored large swaths of the country.
And what about that growing “digital divide”? Once again, the naysayers ignore speedy wireless connections to create the appearance of a problem. It is not surprising that wealthier people have greater access to the Internet; they likely have greater access to most goods in the U.S. economy. A 2012 Pew survey shows that the same percentage of white, black, and Hispanic adults (roughly 62 percent) go online wirelessly with a laptop or a cellphone; that slightly more blacks and Hispanics own a smartphone than do whites (49 versus 45 percent); and that twice as many blacks and Hispanics go online mostly using their cell phone compared to whites (38 versus 17 percent).
The third statistic may indicate that blacks and Hispanics lack wireline access relative to whites or that blacks and Hispanics simply have stronger preferences for wireless connections relative to whites; if the latter, there is no problem to be solved. And if income differences explain the differences in broadband choices, income-based subsidies are the logical policy instrument.
Broadband price comparisons. There is a lot of casual empiricism in this area. International price comparisons of a differentiated product such as Internet connectivity should be taken with a grain of salt because the quality of Internet service might not be comparable. Moreover, if you put a gun to a provider’s head (as regulators do in Europe), and require it to make its services available to resellers at incremental costs, you are going to get cheap service—and destroy investment incentives as a nasty byproduct. Citing “harsher rules that have sapped profitability,” Reuters reported that European telco stocks were trading at roughly 9.9 times earnings compared to 17.6 times for their U.S. peers.
In large swaths of this country, the incumbent cable operator faces a fiber-based telco offering triple-play packages. Unless you think that cable operators are colluding with the telcos—a position espoused by Ms. Crawford—Internet prices are less than monopoly levels where telco-based fiber is available. And help is on the way for the rest of us in the form of wireless 4G LTE offerings, satellite broadband connections, and further telco deployment.
This is not to say that market forces and a largely hands-off Internet policy have delivered the ideal state of competition. In a market with large fixed costs, when consumers are reluctant to switch providers, and when certain must-have video programming is controlled by the incumbent cable operator, we shouldn’t expect ten broadband providers in each zip code.
The United States appears to be doing just fine in the broadband race; perhaps not in first place, but certainly deserving of a cameo on the next GoDaddy commercial. Any efforts to stimulate greater deployment should be targeted, and they should respect the incentives of broadband operators to continually upgrade their networks. The naysayers have misdiagnosed the state of broadband competition.
With InBev Suit, Feds Fight To Keep Beer Cheap For Young Blue-Collar Men. Maybe That’s Not A Good Idea.
Last week, the Department of Justice sued to block the merger of Anheuser-Busch InBev (“ABI”) and Grupo Modelo (“Modelo”). The coming battle between the antitrust agency and the merging parties could raise several important issues for merger review, including the role of entrants (craft beer makers) and negative externalities (associated with consuming beer).
ABI, the maker of Bud, Bud Light, and Busch, already owns 35 percent of Modelo; the DOJ’s lawsuit seeks to keep ABI’s share right there. For those who haven’t carefully studied the back of their Mexican beer bottles, Modelo is the maker of popular Mexican imports such as Corona Extra, Corona Light, and Pacifico.
ABI’s “partial ownership” of Modelo is no small detail; it complicates the DOJ’s analysis relative to a garden-variety merger analysis. Writing in the Antitrust Law Journal, Salop and O’Brien explain that the “competitive effects of partial ownership depend critically on two separate and distinct elements: financial interest and corporate control.” Depending on those variables, partial mergers “can occur in ways that result in greater or lower harm to competition than a complete merger.” The implication of their finding is that a movement from a partial merger to a complete one could raise or lower prices.
The DOJ’s complaint doesn’t tell us much about the nature of ABI’s existing control over Modelo, except for noting that ABI’s annual report claims that ABI does not have “effective control” over Modelo. Despite this disclaimer, and despite the “firewalls” designed to prevent ABI-appointed members of Modelo’s board from learning about pricing information, it is possible that ABI exerts some influence over Modelo’s decision-making. Setting aside the degree of ABI’s control over Modelo’s prices, economic theory predicts that ABI’s financial interest in Modelo could affect ABI’s prices. The question is whether a full transfer of ownership would really make things worse.
The DOJ’s primary theory of harm is that the merger would facilitate coordinated pricing between ABI and MillerCoors, the second largest beer manufacturer in the United States. According to the complaint, ABI and MillerCoors have been forced to discount their prices to discourage consumers from “trading up” to Modelo brands; take away Modelo’s aggressive pricing and the industry leaders could better coordinate their price increases. Secondarily, the DOJ argues that the merger would permit ABI to unilaterally raise its prices without concern about customer defection to Modelo’s brands.
One bone of contention between the dueling antitrust experts will be the likely role of “craft beers” or microbrews in the coming years. To the extent that craft beers play a larger role in the near future—one estimate suggests that craft beers currently account for six percent of all sales but are growing at 13 percent—then a merger of two “low-end” labels is not as important for consumers. According to the Brewers Association, there were 2,000 U.S. breweries in operation by the end of 2012, and there are another 1,000 in the planning stage; the expansion of microbreweries suggests a “shift in the palate” of U.S. beer consumers toward craft beers. With this backdrop, the combination of two low-end brands might not generate much pricing power.
To be fair, ABI has some high-end labels, such as Stella Artois and Beck’s, and craft beers such as Goose Island and Shock Top. But these brands are drowned out in a sea of differentiated flavors, including popular brews such as Abita, Lagunitas, and Shiner. There is an exciting microbrew story in nearly every state—for example, you can’t visit the Blue Ridge region of Virginia without stopping at Devils’ Backbone (Roseland) or Blue Mountain Brewery (Afton).
The DOJ’s discussion of the proposed “relevant product market” is good reading. Apparently, ABI’s Bud Light Lime-a-Rita sits within the “premium plus” category. Where I come from, serving a margarita in an aluminum can is blasphemy. The agency asserts that all segments of the beer industry—from the “sub-premium” segment to “high-end”—compete in the same product market: Query whether sub-premium beers or even the “premium” segment are not constrained by the price of water, the closest available substitute. Craft beers are mentioned in passing only.
The key demographic for low-priced beer drinkers is blue-collar males in their 20s, who might shy away from the premium prices commanded by craft beers. Presumably, the DOJ’s lawsuit aims to protect these drinkers. Given the negative externalities associated with consuming alcohol, however, the movement to higher priced, heavier-tasting, craft beers that are not guzzled like Mad Dog might not be a bad thing. Which leads one to wonder: Should the supply of beer be competitive, or should we tolerate a little market power along with reduced levels of consumption?
If the DOJ has its way and blocks this merger—and if the agency is right about the likely price effects—then we will get more alcohol consumption relative to a world in which ABI owns 100 percent of Modelo. Be careful what you wish for.
Before Washingtonians could fully digest the election results in early November, there was a mild tremor in the tele-cosmos that could have a significant impact on broadband deployment and hence the U.S. economy. AT&T announced that it planned to upgrade its copper network to an IP-based technology and replace some rural lines with wireless connections. It also petitioned the Federal Communications Commission to commence a proceeding in which market trials would be conducted to determine the policy implications associated with its IP transition. According to one consumer advocate, the news was the “single most important development in telecom since passage of the Telecommunications Act of 1996.”
To understand why, one needs a bit of history. A century ago, voice services were provided by a single firm (also named AT&T) based on a social compact struck in 1913 that has lost its relevance due to the advance of technology. In exchange for monopoly privileges, AT&T submitted (over the course of the next decade) to rate regulation and a universal service obligation. And the compact delivered on universality: By the early 1980s, over 90 percent of American households had basic telephone service.
But a funny thing has happened since the era of the Commodore 64 and the Walkman. Our nation was rewired for a second time by cable plant, a third time by wireless networks, and a fourth time by satellite networks. By 2012, high-speed Internet over a cable connection—which supports voice as one of several IP-based applications—was available to 93 percent of U.S. households. By 2010, 99.8 percent of the U.S. population was covered by at least one wireless voice network. And in September 2012, Dish Network launched a nationwide satellite broadband service, targeting customers in rural areas that are underserved with a $40 per month offer that supports, among other IP-based applications, voice services.
Competitive entry puts telecom regulators in a pickle. Anyone following the recent spat between D.C. taxi drivers and Uber, or the decade-old spat between cable operators and telco-based video providers, understands that when regulators can no longer provide monopoly protection to an incumbent, their basis for imposing monopoly-related fees or obligations washes away. Why should I pay you for the privilege of driving a cab in your city, the taxi driver asks, when my competitor is free from such obligations?
When it comes to voice services, the regulatory obligation that is now under scrutiny is the duty to provide universal telephone service over the old copper network. Based on the original social compact, that duty falls uniquely (and thus perversely) on the telcos. Cable, wireless and satellite providers are free to provide voice service (or not) over the network of their choosing, and they are free to pick and choose which homes to serve. In contrast, telcos must operate two networks at once—an outdated, copper-based legacy network that provides service to a shrinking customer base and a modern, IP-based network that supports data, video, and voice applications.
To understand how onerous these rules are, consider the decision of Google, a recent entrant to the broadband space, not to offer voice service as part of its Google Fiber offering in Kansas City. After studying state and federal regulations for voice services, the vice president of Google Access Services concluded: “We looked at doing that [VoIP]. The cost of actually delivering telephone services is almost nothing. However, in the United States, there are all of these special rules that apply.” It makes little sense to have the telcos abide by those same rules when cable operators and wireless providers (typically five in a city) are direct competitors for voice services.
If supporting two separate networks imposed trivial costs on the telcos, then consumers would be held harmless. Alas, telcos invest a significant amount of resources to maintain the legacy network. One study by the Columbia Institute for Tele-Information estimated that nearly half of telcos’ capital expenditures are sunk into maintaining the legacy network. Freed from these obligations, telcos could deploy these resources to higher value services, including expanding the reach of their IP-based networks. Broadband consumers, particularly those living in areas served by a single wireline provider of broadband services, would benefit from the enhanced competition with cable operators.
There appears to be a growing consensus on the need for reform. Indeed, Public Knowledge, a consumer advocacy group typically at odds with the telcos, acknowledged that the petition for deregulation “raises a valid point of concern if the rules for the [legacy] to IP [conversion] apply only to it and other Local Exchange Carriers (LECs) upgrading their networks.”
Of course, there are still voices who advocate continued monopoly-era obligations, regardless of how many distinct technologies cover or nearly cover the entire nation for voice service. A recent op-ed in the New York Times fantastically asserted the existence of a telco-cable “cartel.” These incessant calls for a public-utility-style approach are outliers in the policy arena, as rational voices from both the left and right seem to be coalescing around the proper idea for how to transition to the modern telecom era.
Although the elections were polarizing for many policy matters, at least broadband policy seems to be bringing folks to the middle for constructive debate and problem solving. It’s time to bring communications policy into alignment with the modern era.
In light of recent stories hinting that the Federal Trade Commission (FTC) will not pursue antitrust claims that Google discriminates in its search results, advocates for rival websites are sounding the alarms. One attorney who represents several websites that have complained about Google’s alleged favoritism in search warned: “If a settlement were to be proposed that didn’t include search, the institutional integrity of the FTC would be at issue.” Ironically, the opposite is true: By reportedly dropping search discrimination from its case, the FTC has bolstered its integrity.
This is not to say that discrimination against rival websites is a good thing. Rather, discrimination of the kind allegedly practiced by Google is generally not recognized as an antitrust violation. With the exception of extreme cases, such as when a monopolist refuses for discriminatory reasons to sell a product or service to a competitor that it makes available to others, a firm does not expose itself to antitrust liability by merely refusing to deal with a competitor. (By contrast, a firm may expose itself to antitrust liability by refusing to deal with customers or suppliers so long as they deal with the firm’s rival.) Because Google is not refusing to sell a product or service to a rival website that it makes available to others, but instead places its specialized search results—such as maps, images, shopping, or local results—at the top of the page when it believes they will be useful to consumers, Google arguably has no “duty to deal” under the antitrust laws.
To make a discrimination square peg fit into an antitrust round hole, the FTC would have needed to invoke an unorthodox section of the FTC Act (Section 5), thereby stretching the agency’s authority. By recognizing the incongruence between the conduct that the antitrust laws are meant to stop and the consumer-centric justifications for Google’s behavior, the FTC appears to have spared itself a tough slog. For example, one element of a duty-to-deal claim under the Sherman Act is proving that Google’s treatment of rival websites harms consumers; even the cleverest economist would be stumped with that assignment.
Google’s rivals are now seeking a do-over at the Justice Department (DOJ). They analogize the Google case to the FTC’s Microsoft investigation, where the DOJ picked up that case shortly after the FTC commissioners deadlocked in 1993. But the FTC does not appear to be deadlocked here; the agency is likely rejecting the Google case because the antitrust law does not support the complainants’ arguments.
Although regulatory relief at the FTC appears to be fleeting (and the DOJ is not the proper forum), website rivals could seek protection against search discrimination from Congress. The blueprint is already established: In 1992, Congress amended the Cable Act to protect independent cable networks against discrimination by vertically integrated cable operators. Section 616(a)(3) of the Act directs the Federal Communications Commission to establish rules governing program carriage agreements that “prevent [a cable operator] from engaging in conduct the effect of which is to unreasonably restrain the ability of an unaffiliated video programming vendor to compete fairly by discriminating in video programming distribution on the basis of affiliation or nonaffiliation of vendors in the selection, terms, or conditions for carriage of video programming provided by such vendors.”
This explains why, for example, the NFL Network brought a discrimination case against Comcast—a vertically integrated cable operator that owns a national sports network—under the Cable Act and not under the Sherman Act. Had the NFL Network pursued its discrimination claims in an antitrust court, it likely would have failed. By styling its case as a program-carriage complaint, however, the NFL Network took advantage of cognizable harms under the Cable Act such as preserving independent voices that, for better or worse, are not appreciated by the antitrust laws.
If independent websites such as Nextag want relief, then they should lobby Congress to write the analogous non-discrimination provisions covering search engines. Once an agency is designated with the authority to police Google and other vertically integrated search engines (Bing included), website rivals could pursue individual discrimination claims just like the NFL. Importantly, website rivals would have to fund these battles, not with taxpayer money (of which millions were likely spent by the FTC in its antitrust investigation of Google), but with their own resources. Self-funding ensures that only the strongest discrimination cases would come forward; when someone else is footing the bill, all bets are off.
Admittedly, the relief contemplated here would not come quickly. It took years for independent cable networks to convince Congress of their plight. But the impatience of Google’s rivals is no reason for the FTC to bend the antitrust laws. Better to keep the powder dry—and the FTC’s integrity intact—and go after a monopolist that is more blatantly violating the antitrust laws on another day.
The New York Times just ran a provocative story titled “Americans Paying More for LTE Service,” suggesting that prices charged by U.S. wireless operators for access to their new 4G networks are triple what they would be were our wireless markets more competitive. In support of this claim, the story compares the price per gigabyte charged by Verizon Wireless for its bundled voice-data plan ($7.50) to the “European average” LTE price for data-only plans ($2.50), as calculated by the consultancy Wireless Intelligence. Time to call in the trust busters? Hardly.
As any first-year economics student understands, prices are determined by supply and demand conditions. When performing international price comparisons, one should account for cross-country differences in those conditions before proclaiming that U.S. consumers spend “too much” on a particular service. Of course, it is much easier to generate readership (and hence advertising dollars) with fantastic claims that our wireless markets are not competitive.
Let’s start with differences in demand that could affect the value of wireless data services and thus relative prices. While it makes sense for The Economist to compute a Big Mac Index for a product that is basically the same wherever it is sold, price comparisons of services that are highly differentiated across countries are less revealing. And the quality of wireless LTE networks varies significantly. Verizon’s LTE network covered two-thirds of the U.S. population in April 2012. In contrast, the geographic coverage of European carriers’ LTE networks is anemic, prompting European Commissioner Neelie Kroes to proclaim this month that the continent’s lack of LTE coverage had become a major problem. No wonder it is hard to get Europeans to pay dearly for LTE services!
Turning to the supply side of the equation, while the surface area of the U.S. LTE “coverage blanket” is relatively larger, the European coverage blanket is thicker than ours. U.S. wireless carriers don’t have as much spectrum, the key ingredient in delivering wireless service, as their European counterparts. As wireless analyst Roger Entner points out, U.S. carriers have only one-third of the spectrum available in Italy (on a MHz-per-million-subscribers basis), and one-fifth as much as France, Germany, and the UK. Given this relative scarcity of spectrum, U.S. carriers must prevent overuse of their LTE networks through the price mechanism—else their data networks would be worthless. As more spectrum comes online, basic economic theory predicts that U.S. data prices will fall.
The staggered LTE offerings by U.S. carriers are another factor affecting the supply side of the equation. As the New York Times article notes, Verizon was the first to market LTE in the United States, in December 2010. AT&T, Sprint, and T-Mobile unveiled LTE offerings later and are playing catch-up. To compete for LTE customers, these latecomers are undercutting Verizon, which, in turn, will lead to lower prices. By offering unlimited LTE data plans, Sprint charges $0 per gigabyte at the margin. T-Mobile likewise offers an “Unlimited Nationwide 4G” plan at $90 per month (including unlimited voice minutes) that sets the marginal price per gigabyte to zero. Although AT&T does not offer unlimited data plans, one can compute the “imputed” price per gigabyte for its bundled voice-data plans by subtracting the price of a comparable unlimited voice plan and then dividing by the gigabytes permitted. The result? A lower price per gigabyte than the European average. (Interested readers can email me for the math.)
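For readers who want to see the mechanics, the imputed-price method described above can be sketched in a few lines of Python. The plan prices and data cap below are hypothetical placeholders, not AT&T’s actual rates; only the arithmetic (subtract the voice component, divide by the data allowance) comes from the text.

```python
# Sketch of the "imputed" price-per-gigabyte calculation described above.
# All dollar figures and the data cap are HYPOTHETICAL illustrations,
# not any carrier's actual 2012 pricing.

def imputed_price_per_gb(bundle_price, unlimited_voice_price, gigabytes):
    """Strip the voice component out of a bundled voice-data plan,
    then spread the remaining data charge across the data allowance."""
    data_component = bundle_price - unlimited_voice_price
    return data_component / gigabytes

# Example: a hypothetical $85/month voice-data bundle with a 5 GB cap,
# where a comparable unlimited voice-only plan sells for $70/month.
price = imputed_price_per_gb(bundle_price=85.0,
                             unlimited_voice_price=70.0,
                             gigabytes=5.0)
print(f"${price:.2f} per GB")  # → $3.00 per GB
```

Whether the imputed figure beats the $2.50 European average obviously depends on the actual plan prices plugged in; the point is only that a bundled plan’s headline price overstates what subscribers pay for data alone.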
Thus, even if you think U.S. wireless data prices are “too high” today, the competitive process should be given more than one year to work its magic. Consider the competition for wireless voice services, which has played out over a decade. According to Merrill Lynch, the United States enjoyed a lower price for voice services on a per-minute-of-use basis ($0.03) than France ($0.10), Germany ($0.08), or the UK ($0.08) in the fourth quarter of 2011. How can the New York Times say, on the one hand, that these European countries serve as a competitive benchmark for wireless data services in the United States and, on the other, that the prices for voice services in these same countries should be ignored? Are we to mimic European policies with respect to data services and shun their policies with respect to voice services?
The lesson here is that what’s happening to European prices for wireless voice, wireless data, healthcare, or any differentiated product for that matter depends on many factors, none of which these simplistic international price comparisons control for. I know, I know. We need to sell Internet advertising. Can you imagine the headline: “Difference-in-differences regression shows that U.S. data prices are just right”?