Since the publication of Susan Crawford’s book on the alleged failings of U.S. Internet policy, several mainstream outlets have run stories repeating her mantra that Internet speeds are too slow, coverage is shoddy, there is a growing “digital divide” between rich and poor, and broadband prices are too high.
Consider the barrage of “bad news” in just one week:
- The Wall Street Journal reported that six percent of Americans “lack high-speed service” in a story provocatively titled “Gaps Persist in High-Speed Web Access”;
- The Financial Times reported that the United States ranks 16th in Internet speeds, and that U.S. prices on a per-megabit-per-second basis (Mbps) are more than double those in Europe; and
- Digital Trends ran an article touting Ms. Crawford’s policies titled “Admit It: U.S. Internet Service Sucks.”
Are things as gloomy as the naysayers claim? A close look at the facts suggests otherwise. (Yes, that is a link to Need for Speed, my new e-book on Internet policy from Brookings Press; if Bob Woodward can shamelessly promote his book in the Washington Post when reporting the origins of the sequester, surely I can do the same.)
Let’s start with connection speeds. According to Akamai, a global provider of Internet services, the United States ranked ninth in average connection speeds (7.7 Mbps) in the third quarter of 2012, and seventh in the share of Internet connections with speeds above 10 Mbps (18 percent). South Korea leads both categories (average speed of 14.7 Mbps, 52 percent above 10 Mbps). It’s a bit misleading to compare our speeds with those of the fastest country in the world; a seven-minute-per-mile runner looks shoddy compared to the fastest runner in the world. And like any average, our nationwide average speed combines fast connections with slow ones. For example, the average connection speed in eight states (mostly along the densely populated Northeast corridor) exceeds 9 Mbps; any of those states would rank third fastest in the world on Akamai’s list. It’s a bit of a stretch to say that we are the tortoise among rabbits; the United States is more like Danica Patrick, who finished eighth at Daytona on Sunday.
Moving on to coverage gaps. The empirical basis for the share of Americans without “high-speed service” is the FCC’s annual report on the state of broadband deployment. There are two important caveats to keep in mind when assessing these data: The FCC counts wireline connections only, and only those wireline connections that exceed 4 Mbps. Thus, a wireline connection of, say, 3 Mbps (such as DSL) would not be counted in the FCC’s tally, and a wireless connection of, say, 10 Mbps (such as 4G LTE) would also be ignored. As of 2011, the latest year for which the FCC has reliable data, only about 7 million U.S. households did not have broadband access; if wireless broadband technologies are counted, the number of households without access to broadband at the FCC’s minimum speed is in the range of 2 to 5 million. It is hyperbole to suggest that broadband operators have ignored large swaths of the country.
And what about that growing “digital divide”? Once again, the naysayers ignore speedy wireless connections to create the appearance of a problem. It is not surprising that wealthier people have greater access to the Internet; they likely have greater access to most goods in the U.S. economy. A 2012 Pew survey shows that the same percentage of white, black, and Hispanic adults (roughly 62 percent) go online wirelessly with a laptop or a cellphone; that slightly more blacks and Hispanics own a smartphone than do whites (49 versus 45 percent); and that twice as many blacks and Hispanics go online mostly using their cell phone compared to whites (38 versus 17 percent).
The third statistic may indicate that blacks and Hispanics lack wireline access relative to whites or that blacks and Hispanics simply have stronger preferences for wireless connections relative to whites; if the latter, there is no problem to be solved. And if income differences explain the differences in broadband choices, income-based subsidies are the logical policy instrument.
Broadband price comparisons. There is a lot of casual empiricism in this area. International price comparisons of a differentiated product such as Internet connectivity should be taken with a grain of salt because the quality of Internet service might not be comparable. Moreover, if you put a gun to a provider’s head (as regulators do in Europe), and require it to make its services available to resellers at incremental costs, you are going to get cheap service—and destroy investment incentives as a nasty byproduct. Citing “harsher rules that have sapped profitability,” Reuters reported that European telco stocks were trading at roughly 9.9 times earnings compared to 17.6 times for their U.S. peers.
In large swaths of this country, the incumbent cable operator faces a fiber-based telco offering triple-play packages. Unless you think that cable operators are colluding with the telcos—a position espoused by Ms. Crawford—Internet prices are less than monopoly levels where telco-based fiber is available. And help is on the way for the rest of us in the form of wireless 4G LTE offerings, satellite broadband connections, and further telco deployment.
This is not to say that market forces and a largely hands-off Internet policy have delivered the ideal state of competition. In a market with large fixed costs, where consumers are reluctant to switch providers, and where certain must-have video programming is controlled by the incumbent cable operator, we shouldn’t expect ten broadband providers in each zip code.
The United States appears to be doing just fine in the broadband race; perhaps not in first place, but certainly deserving of a cameo on the next GoDaddy commercial. Any efforts to stimulate greater deployment should be targeted, and they should respect the incentives of broadband operators to continually upgrade their networks. The naysayers have misdiagnosed the state of broadband competition.
With InBev Suit, Feds Fight To Keep Beer Cheap For Young Blue-Collar Men. Maybe That’s Not A Good Idea.
Last week, the Department of Justice sued to block the merger of Anheuser-Busch InBev (“ABI”) and Grupo Modelo (“Modelo”). The coming battle between the antitrust agency and the merging parties could raise several important issues for merger review, including the role of entrants (craft beer makers) and negative externalities (associated with consuming beer).
ABI, the maker of Bud, Bud Light, and Busch, already owns 35 percent of Modelo; the DOJ’s lawsuit seeks to keep ABI’s share right there. For those who haven’t carefully studied the back of their Mexican beer bottles, Modelo is the maker of popular Mexican imports such as Corona Extra, Corona Light, and Pacifico.
ABI’s “partial ownership” of Modelo is no small detail; it complicates the DOJ’s analysis relative to a garden-variety merger analysis. Writing in the Antitrust Law Journal, O’Brien and Salop explain that the “competitive effects of partial ownership depend critically on two separate and distinct elements: financial interest and corporate control.” Depending on those variables, partial mergers “can occur in ways that result in greater or lower harm to competition than a complete merger.” The implication of their finding is that a movement from a partial merger to a complete one could raise or lower prices.
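The financial-interest channel can be illustrated with a toy pricing model, in the spirit of (though far simpler than) the O’Brien-Salop framework. The demand parameters and the fixed-point solver below are invented purely for illustration; this is a sketch, not a model of the actual beer market.

```python
# Toy illustration of the financial-interest channel: a purely passive stake
# in a rival (with no control) can raise the stakeholder's own prices.
# Differentiated-products Bertrand with linear demand q_i = a - b*p_i + c*p_j
# and zero marginal cost; all parameter values are invented for illustration.

def equilibrium(beta, a=10.0, b=2.0, c=1.0, iters=200):
    """Firm 1 maximizes pi_1 + beta * pi_2, where beta is its financial
    interest in firm 2; firm 2 maximizes only its own profit.
    Prices are found by iterating the firms' best-response functions."""
    p1 = p2 = 0.0
    for _ in range(iters):
        # Firm 1's first-order condition internalizes beta of firm 2's profit.
        p1 = (a + c * (1 + beta) * p2) / (2 * b)
        p2 = (a + c * p1) / (2 * b)
    return p1, p2

no_stake = equilibrium(beta=0.0)
partial = equilibrium(beta=0.35)  # a stake the size of ABI's in Modelo
print(no_stake, partial)  # both firms' prices are higher with the stake
```

In this toy setup, firm 1’s stake softens its own pricing even though it exerts no control over firm 2, which is why the move from 35 percent to 100 percent ownership, rather than ownership per se, is the margin the DOJ must analyze.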
The DOJ’s complaint doesn’t tell us much about the nature of ABI’s existing control over Modelo, except for noting that ABI’s annual report claims that ABI does not have “effective control” over Modelo. Despite this disclaimer, and despite the “firewalls” designed to prevent ABI-appointed members of Modelo’s board from learning about pricing information, it is possible that ABI exerts some influence over Modelo’s decision-making. Setting aside the degree of ABI’s control over Modelo’s prices, economic theory predicts that ABI’s financial interest in Modelo could affect ABI’s prices. The question is whether a full transfer of ownership would really make things worse.
The DOJ’s primary theory of harm is that the merger would facilitate coordinated pricing between ABI and MillerCoors, the second largest beer manufacturer in the United States. According to the complaint, ABI and MillerCoors have been forced to discount their prices to discourage consumers from “trading up” to Modelo brands; take away Modelo’s aggressive pricing and the industry leaders could better coordinate their price increases. Secondarily, the DOJ argues that the merger would permit ABI to unilaterally raise its prices without concern about customer defection to Modelo’s brands.
One bone of contention between the dueling antitrust experts will be the likely role of “craft beers” or microbrews in the coming years. To the extent that craft beers play a larger role in the near future—one estimate suggests that craft beers currently account for six percent of all sales but are growing at 13 percent—a merger of two “low-end” labels matters less for consumers. According to the Brewers Association, there were 2,000 U.S. breweries in operation by the end of 2012, with another 1,000 in the planning stage; the expansion of microbreweries suggests a “shift in the palate” of U.S. beer consumers toward craft beers. Against this backdrop, the combination of two low-end brands might not generate much pricing power.
To be fair, ABI has some high-end labels, such as Stella Artois and Beck’s, and craft beers such as Goose Island and Shock Top. But these brands are drowned out in a sea of differentiated flavors, including popular brews such as Abita, Lagunitas, and Shiner. There is an exciting microbrew story in nearly every state—for example, you can’t visit the Blue Ridge region of Virginia without stopping at Devils Backbone (Roseland) or Blue Mountain Brewery (Afton).
The DOJ’s discussion of the proposed “relevant product market” is good reading. Apparently, ABI’s Bud Light Lime-a-Rita sits within the “premium plus” category. Where I come from, serving a margarita in an aluminum can is blasphemy. The agency asserts that all segments of the beer industry—from the “sub-premium” segment to “high-end”—compete in the same product market. Query whether sub-premium beers, or even the “premium” segment, are constrained by the price of water, the closest available substitute. Craft beers are mentioned only in passing.
The key demographic for low-priced beer drinkers is blue-collar males in their 20s, who might shy away from the premium prices commanded by craft beers. Presumably, the DOJ’s lawsuit aims to protect these drinkers. Given the negative externalities associated with consuming alcohol, however, the movement to higher-priced, heavier-tasting craft beers that are not guzzled like Mad Dog might not be a bad thing. Which leads one to wonder: Should the supply of beer be competitive, or should we tolerate a little market power along with reduced levels of consumption?
If the DOJ has its way and blocks this merger—and if the agency is right about the likely price effects—then we will get more alcohol consumption relative to a world in which ABI owns 100 percent of Modelo. Be careful what you wish for.
The New York Times just ran a provocative story titled “Americans Paying More for LTE Service,” suggesting that prices charged by U.S. wireless operators for access to their new 4G networks are triple what they would be were our wireless markets more competitive. In support of this claim, the article compares the price per gigabyte charged by Verizon Wireless for its bundled voice-data plan ($7.50) to the “European average” LTE price for data-only plans ($2.50), as calculated by the consultancy Wireless Intelligence. Time to call in the trust busters? Hardly.
As any first-year economics student understands, prices are determined by supply and demand conditions. When performing international price comparisons, one should account for differences in those conditions across countries before proclaiming that U.S. consumers spend “too much” on a particular service. Of course, it is much easier to generate readership (and hence advertising dollars) with fantastic claims that our wireless markets are not competitive.
Let’s start with differences in demand that could affect the value of wireless data services and thus relative prices. While it makes sense for The Economist to compute a Big Mac Index for a product that is basically the same wherever it is sold, price comparisons of services that are highly differentiated across countries are less revealing. And the quality of wireless LTE networks varies significantly. Verizon’s LTE network covered two-thirds of the U.S. population in April 2012. In contrast, the geographic coverage of European carriers’ LTE networks is anemic, prompting European Commissioner Neelie Kroes to proclaim this month that the absence of LTE across the continent is proving to be a major problem. No wonder it is hard to get Europeans to pay dearly for LTE services!
Turning to the supply side of the equation, while the surface area of the U.S. LTE “coverage blanket” is relatively larger, the European coverage blanket is thicker than ours. U.S. wireless carriers don’t have as much spectrum, the key ingredient in delivering wireless service, as their European counterparts. As pointed out by wireless analyst Roger Entner, U.S. carriers have only one-third of the spectrum available in Italy (on a MHz-per-million-subscribers basis), and one-fifth as much as France, Germany, and the UK. Given this relative scarcity of spectrum, U.S. carriers must prevent overuse of their LTE networks through the price mechanism—else their data networks would be worthless. As more spectrum comes online, basic economic theory predicts that U.S. data prices will fall.
The staggered LTE offerings by U.S. carriers are another factor affecting the supply side of the equation. As the New York Times article notes, Verizon was the first to market LTE in the United States in December 2010. AT&T, Sprint, and T-Mobile unveiled LTE offerings at a later date and are playing catch-up. To compete for LTE customers, these latecomers are undercutting Verizon, which, in turn, will lead to lower prices. By offering unlimited LTE data plans, Sprint charges $0 on a per-gigabyte basis at the margin. T-Mobile also offers an “Unlimited Nationwide 4G” plan at $90 per month (including unlimited voice minutes) that sets the marginal price on a per-gigabyte basis to zero. Although AT&T does not offer unlimited data plans, one can compute the “imputed” price per gigabyte for its bundled voice-data plans by subtracting the price of a comparable unlimited voice plan and then dividing by the gigabytes permitted. The result? A lower price per gigabyte than the European average. (Interested readers can email me for the math.)
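The imputation described above can be sketched in a few lines. The plan prices and data cap below are hypothetical placeholders chosen for round numbers, not any carrier’s actual rates.

```python
# Back out an "imputed" price per gigabyte from a bundled voice-data plan
# by subtracting the price of a comparable unlimited-voice plan.
# All dollar figures here are hypothetical, not any carrier's actual rates.

def imputed_price_per_gb(bundle_price, voice_only_price, data_cap_gb):
    """Implicit data price: (bundle price - unlimited-voice price) / GB cap."""
    if data_cap_gb <= 0:
        raise ValueError("data cap must be positive")
    return (bundle_price - voice_only_price) / data_cap_gb

# Hypothetical bundle: $85/month for unlimited voice plus 10 GB of data,
# versus a $60/month unlimited voice-only plan.
print(imputed_price_per_gb(85.0, 60.0, 10.0))  # 2.5 dollars per GB
```

The same subtraction applies to any bundled plan, so the comparison to data-only European prices is at least apples-to-apples in units, even if quality differences remain.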
Thus, even if you think U.S. wireless data prices are “too high” today, the competitive process should be given more than one year to work its magic. Consider the competition for wireless voice services, which has played out over a decade. According to Merrill Lynch, the United States enjoyed a lower price for voice services on a per-minute-of-use basis ($0.03) than France ($0.10), Germany ($0.08), or the UK ($0.08) in the fourth quarter of 2011. How can the New York Times say, on the one hand, that these European countries serve as a competitive benchmark for wireless data services in the United States but, on the other, that the prices for voice services in these same countries should be ignored? Are we to mimic European policies with respect to data services and shun their policies with respect to voice services?
The lesson here is that what’s happening to European prices for wireless voice, wireless data, healthcare, or any differentiated product for that matter depends on several things, none of which is controlled for when making these simplistic international price comparisons. I know, I know. We need to sell Internet advertising. Can you imagine the headline: “Difference-in-difference regression shows that U.S. data prices are just right?”
Two dominant schools of thought have emerged in the broadband policy arena. The first, represented by the views of Susan Crawford, a visiting professor at Harvard Law School, is that there is not enough competition to cable modem service and thus government must intervene to prevent a likely abuse of market power. A second camp believes that there is no basis for proactive policies designed to increase the number of broadband providers, even in local markets served by a single provider. The high margins enjoyed by the first provider, they claim, reward risk-taking behavior and will induce further entry.
A third perspective, one that is gaining some traction and to which I (and hopefully a few others) subscribe, posits that there is still a limited role for policy so long as improving consumer welfare is the objective. After penning this blog, I might be disinvited to the Christmas parties of camps one and two this year.
Camp three is agnostic as to the “right” number of broadband providers, but believes that “more than one” will likely increase consumer welfare. Although government should not subsidize entry by rivals—this is tantamount to appropriating the returns of first movers, which decreases consumer welfare in the long run—it should remove any barriers that prevent more robust competition. Whereas my camp has a healthy respect for investment incentives on a going-forward basis, camp one sees investments by cable operators as sunk and thus ripe for the taking.
The role of wireless 4G networks likely separates those with at least some faith in market forces from those without any (camp one). Ms. Crawford and her ilk relegate wireless to somewhere less relevant than pink elephants when it comes to broadband competition. At a Brookings event last week, she referred to wireless as a “complementary product” for most Americans, the insinuation being that wireless is not to be taken seriously as a solution to Internet connectivity.
Although wireless might be perceived as a complement to wireline connections today, new 4G mobile connections will afford consumers downloads roughly seven times faster than the experience on prior generations (3G) of smartphones. With sufficient spectrum to provide endurance (another dimension of network quality), 4G operators could soon offer broadband consumers the full suite of services to which they have become accustomed on wireline connections.
If you don’t believe in wireless, and if you think that no amount of tinkering with the rules will get fiber deployed in more areas, then you have what Ms. Crawford refers to as a “natural monopoly” in homes served by cable modem providers but not fiber. What to do then?
In these cases, says Ms. Crawford, government “has a very important role to play.” In particular, government should “provide assistance to people who don’t have fiber access;” it should “make sure pricing is fair;” and it should provide “equal facilities to all Americans.” This is scary stuff. Although I have been critical of certain cable practices, it is a step too far to suggest that cable companies should be subject to price regulation or government-subsidized overbuilding because they invested in neighborhoods where no one else has been willing to follow.
So what policies are being peddled by camp three? When it comes to broadband competition, the FCC should remove barriers to entry for wireless broadband operators seeking to deploy 4G wireless technologies, and eliminate the disincentives facing telcos for deploying fiber beyond the 55 million U.S. homes that were served as of March 2012.
Two FCC Commissioners recently sent signals to the marketplace along these lines. In a speech at the Wharton business school, Chairman Genachowski discussed the need for additional spectrum: “In addition to promoting competition, reducing barriers to broadband build-out and driving broadband investment, we of course need to keep clearing inefficiently used spectrum and reallocating it for licensed flexible use.” Can I get an Amen?
On C-SPAN’s The Communicators, Commissioner Ajit Pai was asked about how to spur additional fiber investment: “For one, we shouldn’t extend legacy regulations of copper wire telephone monopoly era to next generation networks. The Title II docket remains open to this day. To the extent we wanted to send a signal to the private sector that we weren’t going to take a heavy handed approach, we should close that docket.” Translation: The FCC should clarify its rules towards IP networks so that telcos understand the implications of making fiber investments; if those investments are subject to onerous requirements, then telcos will be less inclined to invest.
Dare I count the Chairman of the FCC and FCC Commissioner Pai as honorary members of my third camp? I’ll let you know if I get any Christmas invitations.