Category Archives: Infrastructure

The FCC's Reign of Terror on Transaction Reviews

by Larry Downes and Geoffrey A. Manne

Now that the election is over, the Federal Communications Commission is returning to the important but painfully slow business of updating its spectrum management policies for the 21st century. That includes a process the agency started in September to formalize its dangerously unstructured role in reviewing mergers and other large transactions in the communications industry.

This followed growing concern about “mission creep” at the FCC, which, in deals such as those between Comcast and NBCUniversal, AT&T and T-Mobile USA, and Verizon Wireless and SpectrumCo, has repeatedly been caught with its thumb on the scales of what is supposed to be a balance between private markets and what the Communications Act refers to as the “public interest.”

Commission reviews of private transactions are only growing more common—and more problematic. The mobile revolution is severely testing the FCC’s increasingly anachronistic approach to assigning licenses for radio frequencies in the first place, putting pressure on carriers to use mergers and other secondary market deals to obtain the bandwidth needed to satisfy exploding customer demand.

While the Department of Justice reviews these transactions under antitrust law, the FCC has the final say on the transfer of any and all spectrum licenses. Increasingly, the agency is using that limited authority to restructure communications markets, Beltway-style, elevating the appearance of increased competition over the substance of an increasingly dynamic, consumer-driven mobile market.

Given the very different speeds at which Silicon Valley and Washington operate, the expanding scope of FCC intervention is increasingly doing more harm than good.

 

Deteriorating Track Record

We’re trapped in a vicious cycle: the commission’s mismanagement of the public airwaves is creating more opportunities for the agency to insert itself into the internet ecosystem, largely to fix problems caused by the FCC in the first place. That is happening despite the fact that Congress clearly and precisely circumscribed the agency’s authority here, a key reason the internet has blossomed while heavily regulated over-the-air broadcasting and wireline telephone fade into history.

Desperate for continued relevance, the FCC can’t resist the temptation to tinker with one of the only segments of the economy that is still growing and investing. The agency, for example, fretted over Comcast’s merger with NBCUniversal for 10 months, approving it only after imposing a 30-page list of conditions, including details about which channels had to be offered in which cable packages.

Regulating-by-merger-condition has become a popular sport at the FCC, one with dangerous consequences. While it conveniently allows the agency to get around the problem of intervening where it has no authority, the result is a regulatory crazy quilt with different rules applying to different companies in different markets. Consumers, the supposed beneficiaries of this micromanagement, cannot be expected to understand the resulting chaos.

For example, Comcast also agreed to abide by an enhanced set of “net neutrality” rules even if, as appears likely, a federal appeals court throws out the FCC’s 2010 industry-wide rulemaking for exceeding the agency’s jurisdiction. As with all voluntary concessions, Comcast’s acquiescence isn’t reviewable in court.

The FCC made an even bigger hash in its review of AT&T’s proposed merger with T-Mobile. Once it became clear that the FCC was bowing to political pressure to reject the deal, the companies pulled their applications for license transfers to focus on winning over the Department of Justice first. But FCC Chairman Julius Genachowski, determined to have his say, simply released an uncirculated draft of the agency’s analysis of the deal anyway.

The report found that the combination, as initially proposed, would control too much spectrum in too many local markets. But that was only after the formula, known as the “spectrum screen,” was manipulated to reduce substantially the amount of frequency included in the denominator. Hidden in a footnote, the report noted cryptically that the reduction was being made (and explained) in an unrelated order yet to be published.
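For readers unfamiliar with the mechanics, the screen is simple arithmetic: a carrier’s post-transaction holdings in a local market are divided by the total spectrum the agency counts as suitable and available for mobile broadband there, and the result is compared against a threshold of roughly one third. The sketch below uses made-up numbers rather than the FCC’s actual market data, but it shows how shrinking the denominator can flip the outcome without any change in the carriers’ actual holdings.

# Illustrative spectrum-screen arithmetic (hypothetical numbers, not the FCC's data).

def screen_share(carrier_mhz: float, total_suitable_mhz: float) -> float:
    """Carrier's share of the spectrum counted as suitable and available."""
    return carrier_mhz / total_suitable_mhz

THRESHOLD = 1 / 3            # the screen is commonly described as roughly one-third
combined_holdings = 120.0    # hypothetical post-merger holdings in one market, in MHz

for label, total_mhz in [("before the change", 400.0), ("after shrinking the denominator", 330.0)]:
    share = screen_share(combined_holdings, total_mhz)
    verdict = "exceeds the screen" if share > THRESHOLD else "passes the screen"
    print(f"{label}: {share:.0%} of counted spectrum -> {verdict}")

# before the change: 30% of counted spectrum -> passes the screen
# after shrinking the denominator: 36% of counted spectrum -> exceeds the screen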

When the other order was released months later, however, it made no mention of the change. It never actually happened. With the T-Mobile deal off the table, apparently, the chairman found it more expedient to leave the screen as it was, at least until further gerrymandering proved useful. Unwittingly, Genachowski had exposed his hand in rigging a supposedly objective test applied by a supposedly independent agency.

 

Leave It to the Experts

This amateurish behavior, unfortunately, is increasingly the norm at the FCC. Politics aside, part of the problem is that while federal antitrust regulators enforce statutes under a long line of interpretive case law, the FCC’s review of license transfers is governed by an undefined and largely untested public interest standard.

Now the commission is asking interested parties how, if at all, it needs to formalize its transaction review process, particularly the spectrum screen calculation it blatantly manipulated in the AT&T/T-Mobile review. It’s even asking whether it should re-impose a rigid cap on the amount of spectrum any one carrier can license, a bludgeon of a regulatory tool the agency wisely abandoned in 2003.

We have a better idea. Do away with easily forged formulae and proxies with no scientific relevance. Instead, review transactions in the broader context of a dynamic broadband ecosystem that is disciplined not only by inter-carrier competition, but increasingly by device makers, operating system providers, app makers and ultimately by consumers.

Every user with an iPhone 5 knows perfectly well how complex and competitive the mobile marketplace has become. It’s now time for the government to abandon its 19th century toolkit and look at actual data—data that the FCC already collects and dutifully reports, then ignores when political expediency beckons.

Thanks to the FCC’s endemic misadventures in spectrum management, we can expect more, not fewer, mergers—necessitating more, not fewer, commission reviews. Rather than expanding the agency’s unstructured approach to transaction reviews, we should be reining it in. As the FCC embarks on its analysis of T-Mobile’s takeover of MetroPCS and Sprint’s acquisition by SoftBank, it’s time to put an end to dangerous mission creep at the FCC.

That, at least, would better serve the public interest.

(Reprinted, with permission, from Bloomberg BNA Daily Report for Executives, Dec. 6, 2012.  Our recent paper on FCC transaction review can be found at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2163169.)

The Latest Leak Makes Even Clearer UN Plans to Take Over Internet Governance

On Friday evening, I posted on CNET a detailed analysis of the most recent proposal to surface from the secretive upcoming World Conference on International Telecommunications, WCIT 12.  The conference will discuss updates to a 1988 UN treaty administered by the International Telecommunication Union (ITU), and throughout the year there have been reports that both governmental and non-governmental members of the ITU have been trying to use the rewrite to put the ITU squarely in the Internet business.

The Russian Federation’s proposal, which was submitted to the ITU on Nov. 13, would explicitly bring “IP-based Networks” under the auspices of the ITU, and in particular would substantially if not completely change the role of ICANN in overseeing domain names and IP addresses.

According to the proposal, “Member States shall have the sovereign right to manage the Internet within their national territory, as well as to manage national Internet domain names.”  And a second revision, also aimed straight at the heart of today’s multi-stakeholder process, reads:  “Member States shall have equal rights in the international allocation of Internet addressing and identification resources.”

Of course the Russian Federation, along with other repressive governments, uses every opportunity to gain control over the free flow of information, and sees the Internet as its most formidable enemy.  Earlier this year, Prime Minister Vladimir Putin told ITU Secretary-General Hamadoun Touré that Russia was keen on the idea of “establishing international control over the Internet using the monitoring and supervisory capability of the International Telecommunications Union.”

As I point out in the CNET piece, the ITU’s claims that WCIT has nothing to do with Internet governance and that the agency itself has no stake in expanding its jurisdiction ring more hollow all the time.  Days after receiving the Russian proposal, the ITU wrote in a post on its blog that “There have not been any proposals calling for a change from the bottom-up multistakeholder model of Internet governance to an ITU-controlled model.”

This would appear to be an outright lie, and also a contradiction of an earlier acknowledgment by Dr. Touré.  In a September interview, Touré told Bloomberg BNA that “Internet Governance as we know it today” concerns only “Domain Names and addresses.  These are issues that we’re not talking about at all,” Touré said. “We’re not pushing that, we don’t need to.”

The BNA article continues:

Touré, expanding on his emailed remarks, told BNA that the proposals that appear to involve the ITU in internet numbering and addressing were preliminary and subject to change.

‘These are preliminary proposals,’ he said, ‘and I suspect that someone else will bring another counterproposal to this, we will analyze it and say yes, this is going beyond, and we’ll stop it.’

Another tidbit from the BNA Interview that now seems ironic:

Touré disagreed with the suggestion that numerous proposals to add a new section 3.5 to the ITRs might have the effect of expanding the treaty to internet governance.

‘That is telecommunication numbering,’ he said, something that preceded the internet. Some people, Touré added, will hijack a country code and open a phone line for pornography. ‘These are the types of things we are talking about, and they came before the internet.’

I haven’t seen all of the proposals, of course, which are technically secret.  But the most outrageous of the Russian proposals are contained in a proposed new section 3A, titled “IP-based Networks.”

There’s more on the ITU’s subterfuge in Friday’s CNET piece, as well as these earlier posts:

1.  “Why is the UN Trying to Take Over the Internet?” Forbes.com, Aug. 9, 2012.

2.  “UN Agency Reassures:  We Just Want to Break the Internet, Not Take it Over,” Forbes.com, Oct. 1, 2012.

Praise for California passage of law protecting VoIP from local utility regulators

On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from any new regulation of Voice over Internet Protocol or other IP-based services without the legislature’s authorization.

California now joins over twenty states that have enacted similar legislation.

The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February.  It passed both houses of the California legislature with wide bipartisan majorities.

California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.

Whatever the cost-benefit of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice or Apple’s FaceTime.

Historically, as I argued in a Forbes piece last month, the imposition of public utility status on a service provider has been an extreme response to an extreme situation—a monopoly provider, unlikely to have competition because of the high cost of building and operating competing infrastructure (so-called “natural monopoly”), offering a service that is indispensable to everyday life.

Service providers meeting that definition are transformed by PUC oversight into entities that are much closer to government agencies than private companies.  The PUC sets and modifies the utility’s pricing in excruciating detail.  PUC approval is required for each and every change or improvement to the utility’s asset base, or to add new services or retire obsolete offerings.

In exchange for offering service to all residents, utilities in turn are granted eminent domain and rights of way to lay and maintain pipes, wires and other infrastructure.

VoIP services may resemble traditional switched telephone networks, but they have none of the features of a traditional public utility.  Most do not even charge for basic service, nor do they rely on their own dedicated infrastructure.  Indeed, the reason VoIP is so much cheaper to offer than traditional telephony is that it can take advantage of the existing and ever-improving Internet as its delivery mechanism.

Because entry is cheap, VoIP providers have no monopoly, natural or otherwise.  In California, according to the FCC, residents have their choice of over 125 providers—more than enough competition to ensure market discipline.

Nor would residents be in any way helped by interposing a regulator to review and pre-approve each and every change to a VoIP provider’s service offerings.  Rather, the lightning-fast evolution of Internet services provides perhaps the worst mismatch possible for the deliberate and public processes of a local PUC.

Software developers don’t need eminent domain.

But the most serious mismatch between PUCs and VoIP providers is that there is little inherently local about VoIP offerings.  Where a case can be made for local oversight of public utilities operating extensive–even pervasive–local infrastructure, it’s hard to see what expertise a local PUC brings to the table in supervising a national or even international VoIP service.

On the other hand, it’s not hard to imagine the chaos and uncertainty VoIP providers and their customers would face if they had to satisfy fifty different state PUCs, not to mention municipal regulators and regulators in other countries.

In most cases that would mean dealing with regulators on a daily basis, on every minor aspect of a service offering.  In the typical PUC relationship, the regulator becomes the true customer and the residents mere “rate-payers” or even just “meters.”

Public utilities are not known for their constant innovation, and for good reason.

Whatever oversight VoIP providers require, local PUCs are clearly the wrong choice.  It’s no surprise, then, that SB 1161 was endorsed by major Silicon Valley trade groups, including TechNet, TechAmerica, and the Silicon Valley Leadership Group.

The law is a win for California residents and California businesses—both high-tech and otherwise.

Links

  1. “Government Control of Net is Always a Bad Idea,” CNET News.com, June 4, 2012.
  2. “Memo to Jerry Brown:  Sign SB 1161 for all Internet users,” CNET News.com, August 30, 2012.
  3. “The Madness of Regulating VoIP as a Public Utility,” Forbes.com, Sept. 10, 2012.
  4. “Brown Endorses Hands off Stance on Internet Calls,” The San Francisco Chronicle, Sept. 28, 2012.

What Google Fiber, Gig.U and US Ignite Teach Us About the Painful Cost of Legacy Regulation

On Forbes today, I have a long article on the progress being made to build gigabit Internet testbeds in the U.S., particularly by Gig.U.

Gig.U is a consortium of research universities and their surrounding communities created a year ago by Blair Levin, an Aspen Institute Fellow and, until recently, the principal architect of the FCC’s National Broadband Plan.  Its goal is to work with private companies to build ultra high-speed broadband networks with sustainable business models.

Gig.U, Google Fiber’s Kansas City project, and the White House’s recently announced US Ignite project spring from similar origins and have similar goals.  The shared belief is that by building ultra high-speed broadband in selected communities, consumers, developers, network operators and investors will get a clear sense of the true value of Internet speeds that are 100 times as fast as those available today through high-speed cable-based networks.  And then go build a lot more of them.

Google Fiber, for example, announced last week that it would be offering fully-symmetrical 1 Gbps connections in Kansas City, perhaps as soon as next year.  (By comparison, my home broadband service from Xfinity is 10 Mbps download and considerably slower going up.)
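To put those numbers in perspective, here is a back-of-the-envelope comparison of transfer times at the two speeds. It assumes ideal, sustained throughput (which no real connection delivers) and a hypothetical 5 GB file, so treat it as an illustration of the ratio rather than a benchmark.

# Rough transfer-time comparison at idealized, sustained line rates (hypothetical file size).

def transfer_seconds(size_gb: float, speed_mbps: float) -> float:
    """Seconds to move size_gb gigabytes at speed_mbps megabits per second (decimal units)."""
    return (size_gb * 8 * 1000) / speed_mbps

FILE_GB = 5.0  # hypothetical file size, e.g. a high-definition video

for label, mbps in [("10 Mbps cable", 10), ("1 Gbps fiber", 1000)]:
    seconds = transfer_seconds(FILE_GB, mbps)
    print(f"{label}: about {seconds / 60:.0f} minutes" if seconds >= 120 else f"{label}: about {seconds:.0f} seconds")

# 10 Mbps cable: about 67 minutes
# 1 Gbps fiber: about 40 seconds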

US Ignite is encouraging public-private partnerships to build demonstration applications that could take advantage of next generation networks and near-universal adoption.  It is also looking at the most obvious regulatory impediments at the federal level that make fiber deployments unnecessarily complicated, painfully slow, and unduly expensive.

I think these projects are encouraging signs of native entrepreneurship focused on solving a worrisome problem:  the U.S. is nearing a dangerous stalemate in its communications infrastructure.  We have the technology and scale necessary to replace much of our legacy wireline phone networks with native IP broadband.  Right now, ultra high-speed broadband is technically possible by running fiber to the home.  Indeed, Verizon’s FiOS network currently delivers 300 Mbps broadband and is available to some 15 million homes.

But the kinds of visionary applications in smart grid, classroom-free education, advanced telemedicine, high-definition video, mobile backhaul and true teleworking that would make full use of a fiber network don’t really exist yet.  Consumers (and many businesses) aren’t demanding these speeds, and Wall Street isn’t especially interested in building ahead of demand.  There’s already plenty of dark fiber deployed, the legacy of earlier speculation that so far hasn’t paid off.

So the hope is that by deploying fiber to showcase communities and encouraging the development of demonstration applications, entrepreneurs and investors will get inspired to build next generation networks.

Let’s hope they’re right.

What interests me personally about the projects, however, is what they expose about regulatory disincentives that unnecessarily and perhaps fatally retard private investment in next-generation infrastructure.  In the Forbes piece, I note almost a dozen examples from the Google Fiber development agreement where Kansas City voluntarily waived permits, fees, and plodding processes that would otherwise delay the project.  As well, in several key areas the city actually commits to cooperate and collaborate with Google Fiber to expedite and promote the project.

As Levin notes, Kansas City isn’t offering any funding or general tax breaks to Google Fiber.  But the regulatory concessions, which implicitly acknowledge the heavy burdens (many of them the legacy of the early days of cable TV deployments) imposed on those who want to deploy new privately-funded infrastructure, may still be enough to “change the math,” as Levin puts it, making otherwise unprofitable investments justifiable after all.

Just removing some of the regulatory debris, in other words, might itself be enough to break the stalemate that makes building next generation IP networks unprofitable today.

The regulatory cost puts a heavy thumb on the side of the scale that discourages investment.  Indeed, as fellow Forbes contributor Elise Ackerman pointed out last week, Google has explicitly said that part of what made Kansas City attractive was the lack of excessive infrastructure regulation, and the willingness and ability of the city to waive or otherwise expedite the requirements that were on the books.  (Despite the city’s promises to bend over backwards for the project, she notes, there have still been expensive regulatory delays that promoted no public values.)

Particularly painful to me was testimony by Google Vice President Milo Medin, who explained why none of the California-based proposals ever had a real chance.  “Many fine California city proposals for the Google Fiber project were ultimately passed over,” he told Congress, “in part because of the regulatory complexity here brought about by [the California Environmental Quality Act] and other rules. Other states have equivalent processes in place to protect the environment without causing such harm to business processes, and therefore create incentives for new services to be deployed there instead.”

Ouch.

This is a crucial insight.  Our next-generation communications infrastructure will surely come, when it does come, from private investment.  The National Broadband Plan estimated it would take $350 billion to get 100 Mbps Internet to 100 million Americans through a combination of fiber, cable, satellite and high-speed mobile networks.  Mindful of reality, however, the plan didn’t even bother to consider the possibility of full or even significant taxpayer funding to reach that goal.
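A back-of-the-envelope division of the plan’s own figures, as quoted above, makes plain why taxpayer funding was never a realistic option:

# Quick division of the National Broadband Plan figures cited above (rough, per-person math).
estimated_cost_usd = 350e9   # estimated build-out cost
people_reached = 100e6       # Americans reached at 100 Mbps

print(f"Roughly ${estimated_cost_usd / people_reached:,.0f} per person reached")
# Roughly $3,500 per person reached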

Unlike South Korea, we aren’t geographically small, with a largely urban population living in just a few cities.  We don’t have a largely nationalized and taxpayer-subsidized communications infrastructure.  On a per-person basis, deploying broadband in the U.S. is much harder, more complicated, and more expensive than it is in many competing nations in the global economy.

Of course, nationwide fiber and mobile deployments by network operators including Verizon and AT&T can’t rely on gimmicks like Google Fiber’s hugely successful competition, where 1,100 communities applied to become a test site.  Nor can they, like Gig.U, cherry-pick research university towns, which have the most attractive demographics and density to start with.  Nor can they simply call themselves start-ups and negotiate the kind of freedom from regulation that Google and Gig.U’s membership can.

Large-scale network operators need to build, if not everywhere, then to an awful lot of somewheres.  That’s a political reality of their size and operating model, as well as the multi-layer regulatory environment in which they must operate.  And it’s a necessity of meeting the ambitious goal of near-universal high-speed broadband access, and of many of the applications that would use it.

Under the current regulatory and economic climate, large-scale fiber deployment has all but stopped for now.  Given the long lead-time for new construction, we need to find ways to restart it.

So everyone who agrees that gigabit Internet is a critical element in U.S. competitiveness in the next decade or so ought to look closely at the lessons, intended or otherwise, of the various testbed projects.  They are exposing in stark detail a dangerous and useless legacy of multi-level regulation that makes essential private infrastructure investment economically impossible.

Don’t get me wrong.  The demonstration projects and testbeds are great.  Google Fiber, Gig.U, and US Ignite are all valuable efforts.  But if we want to overcome our “strategic bandwidth deficit,” we’ll need something more fundamental than high-profile projects and demonstration applications.  To start with, we’ll need a serious housecleaning of legacy regulation at the federal, state, and local level.

Regulatory reform might not be as sexy as gigabit Internet demonstrations, but the latter ultimately won’t make much difference without the former.  Time to break out the heavy demolition equipment—for both.

The Federal Government Plays the Spectrum Shell Game

On CNET today, I’ve posted a long critique of the recent report by the President’s Council of Advisors on Science and Technology (PCAST) urging the White House to reverse course on a two-year-old order to free up more spectrum for mobile users.

In 2010, soon after the FCC’s National Broadband Plan raised alarms about the need for more spectrum for an explosion in mobile broadband use, President Obama issued a Memorandum ordering federal agencies to free up as much as 500 MHz of radio frequencies currently assigned to them.

After a great deal of dawdling, the National Telecommunications and Information Administration, which oversees spectrum assignments within the federal government, issued a report earlier this year that seemed to offer progress.  Some 95 MHz of very attractive spectrum could in fact be cleared in the ten years called for by the White House.

But reading between the lines, it was clear that the 20 agencies involved in the plan had no serious intention of cooperating.  Their cost estimates for relocation (which NTIA simply reported, with no indication of how they had been arrived at or even whether NTIA had been given any details) appeared to be set at levels that would make any move economically impossible.

And the NTIA’s suggestion that some of the bands could be “shared” sounded appealing until the details revealed that the feds would place impossible conditions on that sharing.

In the end, the NTIA report was 200 pages of classic smoke-and-mirrors from an entrenched bureaucracy that is expert at avoiding change.

The PCAST report seemed to throw in the cards and accept the political reality that actual spectrum clearing in the federal bands would never happen.  Instead, the President’s advisors doubled down on “sharing,” and called for a new “Spectrum Access System” that would be based on sharing technologies it admitted don’t exist yet.

SAS might be a better system in the long term, but current technical and political limitations make such a system impractical. I argue in the piece that the NTIA and PCAST reports are just providing cover for federal agencies, notably the DoD and Justice, to avoid actually having to follow the President’s order and take aggressive steps to free up spectrum that is needed now.  Whether this is intentional or not I leave to more savvy tea-leaf readers.

Updates to the Media Page

We’ve added over a dozen new posts to the Media page, covering some of the highlights in articles and press coverage for April and May, 2012.

Topics include privacy, security, copyright, net neutrality, spectrum policy, the continued fall of Best Buy, and antitrust.

The new posts include links to Larry’s inaugural writing for several publications, including Techdirt, Fierce Mobile IT, and Engine Advocacy.

There are also several new video clips, including Larry’s interview of Andrew Keen, author of the provocative new book, “Digital Vertigo,” which took place at the Privacy Identity and Innovation conference in Seattle.

June was just as busy as the rest of the year, and we hope to catch up with the links soon.