Capital Expenditures in Broadband: 2Q-16 Update

Capital continues to flee digital infrastructure. The table below compares the first six months of 2016 with the same period in 2014, the last year in which ISPs were not subject to Title II regulation.

[Table: ISP capital expenditures, first half of 2016 vs. first half of 2014]

Aggregate capital expenditure (capex) declined by nearly $2.7 billion relative to the same period in 2014. While Title II can’t be blamed for all of the capex decline, it is reasonable to attribute some portion to the FCC’s draconian rules. After all, the rules (needlessly) bar ISPs from creating new revenue streams from content providers, and (needlessly) expose ISPs to price controls. Both measures truncate an ISP’s return on investment, which makes investment less attractive at the margin.

This is incredibly frustrating because net neutrality protections could have been achieved through an alternative source of legal authority (section 706). Mr. Wheeler took ISP investment for granted. Bad assumption.


GW Institute of Public Policy Event Livestream


The Federal Communications Commission’s Path to Populism: A Search for Relevancy in the Digital Age

July 11, 2016

http://www.wirestream.tv/customer/livecoverage/2016/07-11/video.asp

Event programming will begin at approximately 12:45 pm.


Has the FCC Righted Its Economics Ship in the Business Data Services Proceeding?

On the heels of a scathing critique by Judge Williams of the economic analysis in its Open Internet Order, the FCC needs to avoid any appearance of further “economics-free” rulemaking. In a mind-numbingly complicated and extended proceeding to decide how to retain the agency’s foothold over business data services (BDS), the FCC released three economic statements from its staff this week that purportedly justify the draconian actions it wants to take in that space. The agency even went so far as to reveal that it subjected its core economic analysis to peer review. A boon for economics, right?

Not exactly. The FCC is merely paying lip service to infusing its decision-making with economic analysis. As revealed in the peer review, the flaws in the underlying economic work that undergirds the proposed regulation of BDS (previously called “special access” services) are potentially fatal, rendering the analysis useless as the basis for the agency’s proposed regulations.

At least one party directly affected by the agency’s move has already expressed outrage over the broken process by which the FCC subjected its economist’s work to peer review, including the FCC’s decision not to release the peer review responses until comments were due. Rather than piling onto the FCC’s process errors, I’ll focus on the merits of the FCC’s economic support.

By way of a quick history, the FCC retained an external economist, Dr. Marc Rysman, Professor of Economics at Boston University, to establish “direct evidence of market power” in the supply of BDS for its Further Notice of Proposed Rulemaking (FNPRM). A finding of market power is critical to the FCC’s endgame, as the agency cannot compel a BDS provider to share its facilities at cost-based rates with a rival, nor subject its retail rates to price-cap regulation, in the presence of competition.

But as George Ford so ably points out, Rysman’s paper, which appears as Appendix B to the FNPRM, “says nothing about market power.” That BDS markets served by a single BDS provider exhibit higher prices on average than markets served by two providers does not rule out the possibility that BDS providers earn zero profits in so-called monopoly markets—that is, they generate revenues just high enough to cover their fixed costs. Moreover, Rysman’s estimates of the difference in prices between monopoly and competitive markets are not headline-grabbing: Before considering any critiques of his methodology, competition allegedly brings a modest price reduction of roughly ten percent for DS3 lines and three percent for DS1 lines.

The FCC sought peer review of the Rysman study by Andrew Sweeting, Associate Professor of Economics at University of Maryland, and Tommaso Valletti, Professor of Economics at Imperial College London. Interestingly, neither reviewer seems to be persuaded by Rysman’s empirical analysis.

Both reviewers criticized Rysman for relying on a cross-sectional database of buildings in 2013, rather than on panel data, which would contain prices for each building over a span of years. Sweeting notes that an “obvious concern” with Rysman’s limited dataset is the possible presence of “unobservable differences across customers that are correlated with both prices and competition by using census tract or county fixed effects,” which could bias the estimates on Rysman’s competition variable. Rysman’s estimated price effects from competition could be upwardly biased if, for example, as Sweeting notes, cable providers “might be particularly good at picking off customers who want fancier services from the ILEC, so that in locations with [cable] competition, ILECs are left serving customers who are purchasing relatively cheap products. . . .” Sweeting concludes that “most economists would regard within-customer-over-time changes in prices, that could have been identified and estimated with panel data, as more compelling.”
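To put the reviewers’ point in shorthand (my notation, not the actual specifications in Appendix B): in a single cross-section, the competition coefficient is identified off differences across buildings; with panel data, it could instead be identified off changes within a building over time.

```latex
% Stylized shorthand only -- not the Appendix B specifications.
\begin{align*}
\text{Cross-section (2013 only):} \quad
  \ln p_{b}  &= \alpha + \beta\,\mathrm{Comp}_{b}  + \gamma' X_{b}  + \mu_{\mathrm{tract}(b)} + \varepsilon_{b} \\
\text{Panel (reviewers' preference):} \quad
  \ln p_{bt} &= \beta\,\mathrm{Comp}_{bt} + \gamma' X_{bt} + \theta_{b} + \tau_{t} + \varepsilon_{bt}
\end{align*}
```

Any building-level unobservable that is correlated with both price and the presence of a competitor sits in the error term of the first equation and biases the estimated competition coefficient; in the second, the building fixed effect absorbs it, which is why Sweeting views “within-customer-over-time changes in prices” as more compelling.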

In addition to the problems relating to the cross-sectional nature of his data, Sweeting identifies deficiencies in Rysman’s econometric model. For example, Sweeting notes that many of the contracts in Rysman’s dataset “are likely to have been negotiated some time prior to 2013, when local competition may have been different.” He adds that there has been significant growth in fiber-based cable networks since 2013, which suggests that the relationships between prices and competition estimated from 2013 data might no longer hold in 2016. He faults Rysman for failing to control for contract terms, such as duration, additional equipment, or contract date, all of which could explain pricing variation for the same products. Sweeting is also critical of Rysman’s assumption that the absence of a competitor in a building indicates limited competition, even when a competitor operates in the same census block.

Valletti also addresses the limitations of Rysman’s cross-sectional dataset: “[T]he question remains whether it is still possible that unobserved factors that can affect prices (particular demand and supply characteristics) differ within the census tract, and could drive the entry of CPs.” If so, and if Rysman failed to control for the “endogeneity of competitive entry,” then his estimates for the competition variables could be biased.

But perhaps Valletti’s most devastating critique occurs on page five, when he ponders whether prices per building (Rysman’s dependent variable) are even capable of responding to competitive entry given that DS1 and DS3 prices are based on tariffs:

If a service is tariffed, which I understand is true, for instance, of DS1 and DS3 services, then that service must be generally available to all at the same price. I also understand that the carriers can and do under the tariff differentiate services based on geographic locations, and that under the tariff prices can also vary, for instance, with volume, term commitment, and quality of SLAs. But if one could control for all these factors, the prices should not change with the number of competitors, as the same conditions must be offered to everyone. So my main point here is to understand why – having controlled for all the “right” factors – competition should have a role for tariffed services. Else the interpretation of the regression results could be substantially different: if, say, regulated prices could not react at all to the number of competitors, then the present statistical findings are simply pointing to the spurious correlation that competitors seem to enter in particular buildings where particular contractual elements (not observed by the econometrician) are present.

Put differently, Rysman’s basic premise that DS1 and DS3 prices should be correlated with competitive circumstances makes no sense so long as prices are tariffed and available to all comers at the same terms.

In seeming disregard of these significant criticisms, the FCC presses forward with its radical proposal, which would subject both telcos (incumbents) and facilities-based entrants (cable companies) to price controls. None of the economic statements released by the staff this week credibly addresses the critical errors reviewed here. Peer review is great in theory, but if it doesn’t cause the Commission to alter its approach, then what good is it?


The FTC Should Not Have Blocked Staples-Office Depot

The Federal Trade Commission (FTC) can’t bring a suit against every merger that threatens competition. Given its scarce resources (limited staff and budgets for external experts), the agency has to pick its battles carefully. Among the set of mergers that would raise prices, how should it choose?

There are some obvious filters. For example, the FTC should not pursue cases in dying industries: As dinosaurs are toppled by new technologies, the opportunity to preserve consumer welfare is limited. Nor should the FTC pursue cases where the victims are large, sophisticated buyers (think Fortune 100 firms).

Consider a subtler filter: The FTC should not pursue cases where the industry is characterized by large negative externalities. Before turning to paper-based office supplies, the products implicated by the Staples-Office Depot merger, let’s focus on toxic chemicals to drive the point home.

Two tanneries are situated along a river, spewing pollution into the water like nobody’s business, in proportion to each firm’s output. The tanneries are tired of competing, and decide to merge. The FTC hires a world-class economist from Berkeley, who estimates that the merger will lead to monopoly prices. Should the FTC block it?

There’s no doubt the merger is anticompetitive. But as any good economist knows, relative to the competitive equilibrium, a monopolist produces at levels closer to the socially optimal output in the presence of negative externalities. We want pollution to come down. An efficient way to achieve this objective is to permit the two firms to merge and raise prices, thereby discouraging consumption of leather-based products. (We should also enforce the relevant environmental laws, but that is outside the scope of the antitrust agencies.)
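A stylized numerical example (my numbers, chosen purely for illustration) makes the point:

```python
# Toy example: linear demand P = 100 - Q, private marginal cost of 20,
# and an external (pollution) cost of 30 per unit of output.
a, b = 100, 1        # demand: P = a - b*Q
mc, ext = 20, 30     # private marginal cost and per-unit external cost

q_competitive = (a - mc) / b            # price = MC        -> Q = 80
q_social_opt  = (a - mc - ext) / b      # price = MC + ext  -> Q = 50
q_monopoly    = (a - mc) / (2 * b)      # MR = MC           -> Q = 40

print(q_competitive, q_social_opt, q_monopoly)  # 80.0 50.0 40.0
# The monopoly output (40) sits closer to the social optimum (50) than the
# competitive output (80), which ignores the external cost entirely.
```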

Now back to paper-based office products. Due to environmental concerns, and due to a need to quickly incorporate materials into work product, I prefer to read documents on my screen. But my (generally older) partners can’t print fast enough. Some appear to print every email. And why not? The cost to the partner of printing an email is zero. And the cost to the firm is near zero.

If that doesn’t annoy you, consider these stats from the Paperless Project:

  • Americans consume more paper per capita – upwards of 500 lbs. annually – than anyone else on earth. On average, a person in the United States uses more than 700 pounds of paper every year.
  • The United States uses approximately 68 million trees each year to produce paper and paper products.
  • The average office worker continues to use a staggering 10,000 sheets of copy paper every year.
  • Discarded paper is a major component of many landfill sites, accounting for about 35% by weight of municipal solid waste.
  • Pulp and paper is the third largest industrial polluter to air, water and land in both Canada and the United States, and releases well over 100 million kg of toxic pollution each year.
  • 40% of the world’s industrial logging goes into making paper, and this is expected to reach 50% in the near future.

Which brings me to the FTC’s dogged pursuit of Staples-Office Depot. I am skeptical of the merits of the FTC’s claims regarding anticompetitive effects. Although the price discipline of Amazon is hard to detect looking backwards, on a going-forward basis, Amazon Business, which hit $1 billion in sales in its first year and is growing at 20 percent per month, will likely displace these two dinosaurs in a matter of years.

But let’s grant the FTC its (unpersuasive) theory of harm. Why in the world should we encourage competitive pricing of paper-based office products, given the horrific things paper does to the environment? A monopoly provider of paper-based office products will raise the price of printing emails, which is a good thing in my book.

For now, environmentalists can only pray that Amazon chases the dinosaurs out of business, and then raises prices to monopoly levels. Either that or pray that Millennials have different habits when it comes to printing emails in the office.

For industries characterized by negative externalities, the FTC should think twice before stopping mergers to monopoly. Did I mention that alcohol consumption is associated with violence? Of course, no merging beer producer would make this argument before the antitrust agencies. But a snarky economist with no skin in the game just might.


The FCC’s Merger Conditions in Charter-TWC

Some economic thoughts are just too complicated for Twitter. I need about 560 words to vent properly on the merger conditions that the FCC imposed on Charter-TWC. So sit back.

The first condition that deserves economic scrutiny is the FCC’s seven-year zero-price regulation on interconnection. For seven years, Netflix will make zero contribution to Charter’s recovery of broadband infrastructure costs. Although economists object on efficiency grounds to placing 100 percent of cost recovery on the more price-sensitive side (broadband users) of this two-sided market, at least this condition bears some resemblance to the FCC’s theory of harm: namely, that the combination of Charter and TWC would increase the merged firm’s pricing power vis-a-vis national content providers such as Netflix. But if you really believe this, as my liberal friends do, then why not just block the merger? Why approve a merger that facilitates the exercise of market power only to regulate that exercise out of existence?

That Charter offered up this condition to see the deal through, or that Charter had not previously charged for interconnection, does not make this condition any less a blatant price control. Apparently, price controls are in vogue these days, with the FCC regulating the price of business broadband service at both the wholesale and retail level. But to an economist, this stuff is hard to swallow. Consumers hate higher prices, but high prices are also a signal to broadband rivals to deploy duplicative networks. If you think there aren’t tradeoffs, check out what’s going on across the pond, where price regulation (in the form of mandatory unbundling) and the absence of facilities-based competition go hand in hand.

The second condition that riled me up was the obligation for Charter to invade a rival cable operator’s territory. This requirement is misguided for at least two reasons. First, the merger of Charter and TWC does not reduce competition in the provision of broadband service to any household in America, as the two cable operators serve non-overlapping territories. This condition, like so many other FCC merger-related requirements in the past, is completely divorced from the competitive harms raised by the merger. It is yet another example of the FCC using the merger review process as a means to regulate the industry.

Second, if Charter satisfies this overbuild requirement in the territory of a cable operator with fewer broadband subscribers, then the requirement perversely increases concentration at the national level.

To see how, consider the following illustrative example: Charter invades the territories of Cox, Cablevision, and other smaller cable operators, shifting three percentage points of market share (about 1.3 million subscribers with speeds of 10 Mbps down or faster) towards Charter. (The market share figures are from a January 26, 2016 story in Ars Technica.)

| Provider | Post-Merger Subscriber Share | Share Squared | Post-Invasion Subscriber Share | Share Squared |
| --- | --- | --- | --- | --- |
| Comcast | 29% | 841 | 29% | 841 |
| New Charter | 24% | 576 | 27% | 729 |
| Cox | 5% | 25 | 4% | 16 |
| Cablevision | 4% | 16 | 3% | 9 |
| Suddenlink | 1% | 1 | 1% | 1 |
| Mediacom | 1% | 1 | 1% | 1 |
| Wide Open West | 1% | 1 | 1% | 1 |
| Cable One | 1% | 1 | 1% | 1 |
| RCN | 0% | 0 | 0% | 0 |
| Other Cable | 5% | 25 | 4% | 16 |
| Verizon | 9% | 81 | 9% | 81 |
| AT&T | 12% | 144 | 12% | 144 |
| CenturyLink | 4% | 16 | 4% | 16 |
| Frontier | 1% | 1 | 1% | 1 |
| Windstream | 1% | 1 | 1% | 1 |
| Fairpoint | 0% | 0 | 0% | 0 |
| Cincinnati Bell | 0% | 0 | 0% | 0 |
| Other LECs | 1% | 1 | 1% | 1 |
| Other | 1% | 1 | 1% | 1 |
| TOTAL | 100% | 1,732 | 100% | 1,860 |

As the table shows, if Charter takes subscribers from smaller cable operators, then nationwide concentration (or HHI) increases from 1,732 to 1,860. This is just one illustration, but the result holds so long as Charter invades any cable operator’s territory save Comcast’s.
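For readers who want to check the arithmetic, the HHI is just the sum of squared market shares (in percentage points). A quick sketch using the shares from the table:

```python
# National broadband shares (percent) from the table above: post-merger vs. the
# hypothetical post-invasion scenario in which Charter gains three points at
# the expense of Cox, Cablevision, and other smaller cable operators.
post_merger   = [29, 24, 5, 4, 1, 1, 1, 1, 0, 5, 9, 12, 4, 1, 1, 0, 0, 1, 1]
post_invasion = [29, 27, 4, 3, 1, 1, 1, 1, 0, 4, 9, 12, 4, 1, 1, 0, 0, 1, 1]

def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared percentage-point shares."""
    return sum(s * s for s in shares)

print(hhi(post_merger), hhi(post_invasion))  # 1732 1860
```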

Recall the FCC’s theory of harm is that the merger would permit Charter to exercise market power vis-a-vis Netflix. By perversely increasing nationwide concentration, the FCC’s overbuild requirement will have strengthened Charter’s pricing power!

There. I’m done.


No Special Favors for Special Access

When FCC Chairman Tom Wheeler said back in February 2015 that his agency was not interested in regulating the rates charged by Internet service providers, he neglected to offer any caveats. And for good reason: The more caveats, the less powerful his message of “forbearance.”

What he meant to say was, “No rate regulation for residential broadband offerings. Business broadband is fair game.” But that doesn’t have as nice a ring to it.

The FCC Chairman revealed his intention when he publicly voiced his opposition to a bill that would neuter the agency’s ratemaking authority. Clearly, the FCC wants to preserve its right to engage in ex post rate regulation of residential broadband offerings.

The FCC is even more clear about its intentions for business broadband. The agency already embraced rate regulation for wholesale rates charged to resellers of business broadband. In August 2015, the FCC imposed an “Interim Rule” that compelled telcos (but not cable operators) to make IP-based broadband connections (running over fiber connections) available at “reasonably comparable rates, terms, and conditions” to resellers if the telco decommissions its TDM-based services (running over copper connections) to a building.

The FCC is now doubling down by considering a proposal from the resellers’ trade association that would bring rate regulation to retail rates in its “special access” proceeding; a telco’s Ethernet services offered over new fiber networks would be subject for the first time to pre-approved price caps. So much for forbearance. [Update: Yesterday, the FCC issued a news release announcing a forthcoming NPRM that would solicit comments on how to structure price regulation of business broadband.]

Basic economics suggests that, by truncating returns on investment, price controls undermine a network operator’s incentive to invest. But by how much? A new paper I released last week on behalf of USTelecom models how fiber deployment to business districts likely will unfold with and without rate regulation.

To model the impact of retail or wholesale price regulation, one first needs to model the “baseline” level of investment—that is, predict how many buildings in a given city not currently lit for fiber would be wired over the coming years if the FCC does nothing.

This is where things get tricky. The cost to wire an unlit building in a given city depends on the distance of that building to the telco’s existing fiber footprint. To understand the distribution of distances (and associated connection costs) for unlit buildings, we selected a city (Charlotte, North Carolina) that was representative of the population of cities in terms of the total number of establishments, the number of large establishments, and the number of large establishments per square mile.

Our investment model was aimed at simulating AT&T’s fiber-expansion plan in Charlotte. Although the same model could be used to simulate the likely deployment patterns of competitors, who collectively serve over 267,000 buildings with fiber across the nation, the greatest investment impact of price regulation aimed at incumbent telcos will be (drum roll please) on incumbent telcos.

Today, AT&T’s fiber network reaches roughly ten percent of all buildings with over 20 employees in Charlotte. Although we cannot observe the layout of AT&T’s fiber network, we can use the locations of AT&T’s lit buildings to construct the most efficient network connecting them with a minimum-spanning-tree algorithm. Next, we calculated the distance from the existing network to the closest unlit buildings, which can be translated (with estimates of underground and aerial fiber costs per mile) into each building’s fiber-connectivity costs.
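For the curious, here is a minimal sketch of that network-construction step (toy coordinates of my own invention, using SciPy’s spanning-tree routine; the actual model works from geocoded buildings and engineering costs):

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

# Hypothetical coordinates (in miles) of buildings already lit with fiber.
lit_buildings = np.array([
    [0.0, 0.0],
    [1.0, 0.2],
    [2.1, 0.4],
    [1.4, 1.6],
])

# Pairwise straight-line distances between lit buildings.
dists = distance_matrix(lit_buildings, lit_buildings)

# The minimum spanning tree approximates the most efficient network that
# connects every lit building.
mst = minimum_spanning_tree(dists)
print(f"Proxy network length: {mst.sum():.2f} route miles")
```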

We combined this building-specific cost data with an estimate of telecom revenues for the building. Unlit buildings that generated positive cash flows net of capital expenses in 18 months were assumed to be lit, which expanded the existing network and reduced the distance to serve the marginally unprofitable (and unlit) buildings. We allowed this iterative process to bring more and more buildings onto the network. By the time the process was complete, AT&T’s fiber network covered 20 percent of all buildings in Charlotte. This is not to say that 20 percent is the long-term maximum; a future technology shock could reduce AT&T’s costs, making marginal buildings profitable.
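The iterative logic is easier to see in code than in prose. The sketch below is a drastic simplification, not the model itself: it treats the network as the set of lit buildings, uses straight-line distance to the nearest lit building in place of spanning-tree routing, treats revenue as cash flow, and runs on invented cost and revenue figures.

```python
import math

# Hypothetical inputs: (x, y) coordinates in miles for lit buildings, and
# coordinates plus estimated monthly telecom revenue for unlit buildings.
lit   = [(0.0, 0.0), (1.0, 0.2), (2.1, 0.4)]
unlit = [((0.4, 0.3), 9_000), ((1.5, 1.8), 14_000), ((4.0, 3.5), 6_000)]

COST_PER_MILE  = 250_000   # blended aerial/underground construction cost (assumed)
DROP_COST      = 20_000    # building connection cost (assumed)
PAYBACK_MONTHS = 18        # the 18-month payback threshold described above

def connection_cost(xy, network):
    """Cost to reach the nearest point on the existing network (simplified)."""
    dist = min(math.dist(xy, node) for node in network)
    return dist * COST_PER_MILE + DROP_COST

# Keep sweeping until no remaining building clears the payback hurdle. Each
# newly lit building extends the network and shortens its neighbors' paths.
added = True
while added:
    added = False
    for entry in list(unlit):
        xy, monthly_rev = entry
        if monthly_rev * PAYBACK_MONTHS > connection_cost(xy, lit):
            lit.append(xy)
            unlit.remove(entry)
            added = True

print(f"lit: {len(lit)} buildings; still unlit: {len(unlit)}")
```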

To gauge the impact of price regulation, we started the simulation over from scratch, this time reducing the expected revenue in each unlit building by 30 percent. As it turns out, the economic literature shows that prior unbundling efforts and price-cap rules have been associated with price declines of this magnitude. Which regulation ultimately causes Ethernet prices to fall is irrelevant; the point is that some unlit buildings that would otherwise have been served are no longer profitable to serve.
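In terms of the sketch above, the policy scenario is essentially a one-line change (again, purely illustrative):

```python
# Price-regulation scenario: reset lit/unlit to their original values, cut the
# expected revenue of every unlit building by 30 percent, then rerun the loop.
unlit = [(xy, monthly_rev * 0.70) for (xy, monthly_rev) in unlit]
```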

The results were sobering: Rather than expand to 20 percent penetration, AT&T would halt its investment in Charlotte at 14 percent. Put differently, 324 buildings in Charlotte that would have been lit under the baseline would no longer be profitable to serve under rate regulation. Because it costs on average $80,000 to wire a building, AT&T would invest roughly $26 million less in Charlotte under price regulation (a 55 percent decline in investment).

We extrapolated these results to the population to determine what the investment impacts would look like assuming the Charlotte experience was representative. If there is no rate regulation, incumbent telcos would light up nearly 122,000 buildings nationwide, representing $9.9 billion in capital expenditures and 4,900 new fiber route miles. Price regulation would cut projected investment (again by 55 percent) to an estimated $4.4 billion, providing fiber to only 55,100 buildings with 2,200 new fiber route miles.
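The dollar figures follow from simple arithmetic, reproduced here as a sketch (the building counts and the per-building cost come from the paper):

```python
# Charlotte: buildings that drop out under rate regulation, at the average
# per-building connection cost of $80,000.
lost_buildings_charlotte = 324
avg_cost_per_building = 80_000
print(lost_buildings_charlotte * avg_cost_per_building)  # 25,920,000 -> roughly $26 million

# Nationwide extrapolation: baseline vs. regulated capex, in billions of dollars.
baseline_capex, regulated_capex = 9.9, 4.4
print(f"{(baseline_capex - regulated_capex) / baseline_capex:.1%}")  # 55.6%, the ~55 percent decline
```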

Policymakers with a fixation on prices might not be moved by investment declines in the abstract. To put a human touch on these declines, we used fiber-specific employment multipliers from the literature. Because every $1 million in fiber investment supports 20 full-time jobs through the multiplier effect and another 20 jobs via the spillover effect, removing slightly over $1 billion per year (equal to $5.5 billion reduction spread over five years) means 43,000 fewer jobs.
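The jobs figure is just the employment multiplier applied to the annual reduction in investment. A back-of-the-envelope version (the exact total depends on how the annual reduction is rounded):

```python
jobs_per_million_invested = 20 + 20     # direct multiplier effect plus spillover effect
annual_reduction_millions = 5_500 / 5   # $5.5 billion reduction spread over five years
print(jobs_per_million_invested * annual_reduction_millions)  # 44,000.0, in line with the ~43,000 cited above
```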

The purpose of the exercise is to show policymakers the tradeoffs associated with intervening in today’s competitive broadband markets. While the FCC has a role in promoting the public interest, it also has a mandate to promote broadband deployment. Rate regulation is anathema to that mandate.

 

Twitter: @halsinger

[[This piece originally appeared in Bloomberg BNA on April 22, 2016. It has been reposted here with Bloomberg BNA’s permission]]


ISP Capital Expenditures in the Title II Era (4Q Edition)

We are getting close to a final tally on ISP capex in 2015, the first year of the Title II era. Here is the latest scorecard.

[Table: ISP capital expenditures, 2015 vs. 2014]

As of this post, Suddenlink had not reported its 4Q-2015 results. So I’m using its data through the third quarter of 2014 and 2015 as a placeholder.

Across this sample of the twelve largest ISPs, capex growth was slightly negative in 2015. The net change for the year was -0.4 percent, down nearly a quarter of a billion dollars from 2014 levels.

It’s important to put this finding in context: According to USTelecom, which uses a larger sample of ISPs, broadband capex increased by 8.7 percent in 2013 (from $69 billion to $75 billion), and by 4.0 percent in 2014 (from $75 billion to $78 billion). Those were sizable gains. (Across the twelve ISPs in my sample, the increase in capex was slightly above four percent in 2014.) What happened in 2015 that caused capex growth to stagnate?

If a firm’s earnings growth followed this trajectory, its managers would likely be canned. Yet in several Title II anniversary pieces, the FCC’s defenders at Consumerist, Techdirt, and the Huffington Post are popping champagne. They build their cases by selectively choosing ISPs (like Comcast) that increased their capex in 2015, rather than looking at the bigger picture. Alas, the aggregate investment data do not paint a rosy picture.

This is not to suggest that Title II solely caused capital accumulation to stagnate. Several factors could be at play. But when investment theory is corroborated by evidence, as it is here, it is reasonable to infer that reclassification of ISPs as Title II common carriers was not a good thing for investment.

The theory provides a crisp prediction: Reclassification is a prerequisite for price regulation and mandatory unbundling, both of which are recognized in the economics literature to cause capital flight. Although the FCC promises to forbear from engaging in Title II-based price regulation of broadband services, so long as ISPs don’t believe that promise, capital formation should slow. Unfortunately, this FCC has no way to bind itself or future Commissions to obey this promise. In the language of game theory, the FCC lacks a “commitment mechanism.”

Indeed, just a few months after reclassifying ISPs, the FCC broke its promise with respect to unbundling of super-fast broadband services to businesses (called “Ethernet services”). In August, the FCC issued a Technology Transition Order that sought to extend the FCC’s unbundling obligations into fiber-based broadband connections for business customers. In particular, the FCC adopted a rule that required telcos “that discontinue a TDM-based service to provide competitive carriers reasonably comparable wholesale access on reasonably comparable rates, terms, and conditions during the pendency of the special access proceeding.” In other words, if a telco seeks to replace its copper-based broadband connection to a business, it now faces a fresh disincentive to invest in fiber, in that the wholesale-access requirements will extend to its Ethernet services provided over a fiber-based network. So much for forbearance.

In his dissent to the order, Commissioner Pai explained that “the Commission now leverages its discontinuance authority to get a foothold in the Ethernet market, exporting its legacy economic regulations into an all-IP world.” Commissioner O’Rielly similarly recognized the threat to fiber investment: “Providers that had voluntarily agreed to offer a commercial wholesale platform service to ease the transition for competitive carriers after the obligation to provide UNE-P was struck down by the Courts are now being forced to carry it forward into an IP world for a to-be-determined duration.”

The ISPs have good reason to doubt the FCC’s promise not to act opportunistically under Title II.

And as investment slows, the Obama Administration has good reason to be concerned that the fears about Title II depressing investment are coming true. That calls into question the Administration’s ability to deliver on its promise to encourage broadband investment and to extend the reach of broadband networks to areas not currently served.

Twitter: @halsinger

 
