Monthly Archives: December 2010

Chairman Genachowski and His Howling Commissioners: Reading the Open Internet Report (Part I)

At the last possible moment before the Christmas holiday, the FCC published its Report and Order on “Preserving the Open Internet,” capping off years of largely content-free “debate” on the subject of whether or not the agency needed to step in to save the Internet.

In the end, only FCC Chairman Julius Genachowski fully supported the final solution.  His two Democratic colleagues concurred in the vote (one approved in part and concurred in part), and issued separate opinions indicating their belief that stronger measures and a sounder legal foundation were required to withstand likely court challenges.  The two Republican Commissioners vigorously dissented, which is not the norm in this kind of regulatory action.  Independent regulatory agencies, like the U.S. Courts of Appeal, strive for and generally achieve consensus in their decisions.

So for now we have a set of “net neutrality” rules that a bipartisan majority of the last Congress, along with industry groups and academics, strongly urged the agency not to adopt, and which were deemed unsatisfactory by four of the five Commissioners.  It’s hardly a moment of pride for the agency, which has been distracted by the noise around these proceedings since Genachowski was first confirmed by the Senate.  Important work freeing up radio spectrum for wireless Internet, reforming the corrupt Universal Service Fund, and promoting the moribund National Broadband Plan have all been sidelined.

How did we get here?  In October 2009, the agency first proposed new rules, but its efforts were sidetracked by an April 2010 court decision that held the agency lacked authority to regulate broadband Internet.  After flirting with the dangerous (and likely illegal) idea of “reclassifying” broadband to bring it under the old telephone rules, sanity seemed to return.  Speaking to state regulators in mid-November, the Chairman made no mention of net neutrality or reclassification, saying instead that “At the FCC, our primary focus is simple: the economy and jobs.”

Just a few days later, at a Silicon Valley event, the Chairman seemed to reverse course, promising that net neutrality rules would be finalized.  He also complimented the “very smart lawyers” in his employ who had figured out a way to do it without the authorization of Congress, which has consistently failed to pass enabling legislation since the idea first surfaced in 2003.  (Most recently, Democratic Congressman Henry Waxman floated a targeted net neutrality bill days before the mid-term elections, but never introduced it.)

From then until the Commission’s final meeting before the new Congress comes to town in January, Commissioners and agency watchers lobbied hard and feigned outrage at the most recent version of the rules, which the agency did not make public until after the final vote was taken on Dec. 21.  In oral comments delivered at the December meeting, two Commissioners complained that they hadn’t seen the version they were to vote on until midnight the night before the vote.  Journalists covering the event didn’t have the document all five Commissioners referenced repeatedly in their spoken comments, and had to wait two more days for all the separate opinions to be collated.

Why the Midnight Order?  FCC Commissioners do not serve at the whim of Congress or the President, so the mid-term election results technically had no effect on the chances of agency action.  Chairman Genachowski has had the votes to approve pretty much anything he wants to all along, and will for the remainder of his term.

Even with a Republican House, legislation to block or overturn FCC actions is unlikely.  The Republicans would have to get Democratic support in the Senate, and perhaps overcome a Presidential veto.

But Republicans could use net neutrality as a bargaining chip in future negotiations, and the House can make life difficult for the agency by holding up its budget or by increasing its oversight of the agency, forcing the Chairman to testify and respond to written requests so much as to tie the agency in knots.

So doing something as Congress was nearly adjourned and too busy to do much but bluster was perhaps the best chance the Chairman had for getting something—anything—in the Federal Register.

More likely, the agency was simply punting the problem.  Tired of the rancor and distraction of net neutrality, the new rules—incomplete, awkward, and without a solid legal foundation—move the issue from the offices of the FCC to the courts and Congress.  That will still tie up agency resources and waste even more taxpayer money, of course, but now the pressure of industry and “consumer advocate” groups will change its focus.  Perhaps this was the only chance the Chairman had of getting any real work done.

The Report and Order

Too much ink has already been spilled on both the substance and the process of this order, but there are a few tidbits from the documents that are worth calling out.  In this post, I look at the basis for issuing what the agency itself calls “prophylactic rules.”  In subsequent posts, I’ll look at the final text of the rules themselves and compare them to the initial draft, as well as to alternatives offered by Verizon and Google and Congressman Waxman.  Another post will review the legal basis on which the rules are being issued, and likely legal challenges to the agency’s authority.  I’ll also examine the FCC’s proposed approach to enforcement of the rules.

“Prophylactic” Rules

Even the FCC acknowledges that the “problem” these new rules solve doesn’t actually exist…yet.  The rules are characterized as “prophylactic” rules—a phrase that appears eleven times in the 87-page report.  The report fears that the lack of robust broadband competition in much of the U.S. (how many sets of redundant broadband infrastructure do consumer advocates want companies to build out, anyway?) could lead to ISPs using their market influence to squeeze content providers, consumers, or both.

This hasn’t happened in the ten years broadband Internet has been growing in both capability and adoption, of course, but still, there’s a chance.  As the report (¶ 21) puts it in challenged grammar, “broadband providers potentially face at least three types of incentives to reduce the current openness of the Internet.”

We’ll leave to the side for now the undiscussed potential that these new rules will themselves cause unintended negative consequences for the future development or deployment of technologies built on top of the open Internet.  Instead, let’s look at the sum total of the FCC’s evidence, collected over the course of more than a year with the help of advocates who believe the “Internet as we know it” is at death’s door, that broadband providers are lined up to destroy the technology that, ironically, is the source of their revenue.

To prove that these “potential” incentives are not “speculative or merely theoretical,” the FCC cites precisely four examples between 2005 and 2010 where it believes broadband providers have threatened the open Internet (¶ 35).  These are:

1. A local ISP that was “a subsidiary of a telephone company” settled claims it had interfered with Voice over Internet Protocol (VoIP) applications used by its customers.

2. Comcast agreed to change its network management techniques when the company was caught slowing or blocking packets using the BitTorrent protocol (the subject of the 2010 court decision holding the agency lacked jurisdiction over broadband Internet).

3. After a mobile wireless provider contracted with an online payment service, the provider “allegedly” blocked customers’ attempts to use competing services to pay for purchases made with mobile devices.

4. AT&T initially restricted the types of applications—including VoIP and Slingbox—that customers could use on their Apple iPhone.

In the world of regulatory efficiency, this much attention being focused on just four incidents of potential or “alleged” market failures is a remarkable achievement indeed.  (Imagine if the EPA, FDA, or OSHA reacted with such energy to the same level of consumer harm.)

But in legal parlance, regulating on such a microscopically thin basis goes well beyond mere “pretense”—it’s downright embarrassing the agency couldn’t come up with more to justify its actions.  Of the incidents, (1) and (2) were resolved quickly through existing agency authority, (3) was merely alleged and apparently did not even lead to a complaint filed with the FCC (the footnote here is to comments filed by the ACLU, so it’s unclear who is being referenced), and (4) was resolved, as the FCC acknowledges, when customers put pressure on Apple and on AT&T, then the sole iPhone network provider, to allow the applications.

Even under the rules adopted, (2) would almost surely still be allowed.  The Comcast case involved use of the BitTorrent protocol.  Academic studies performed since 2008 (by which time the protocol had been extended to more legal uses) find that over 99% of BitTorrent traffic still involves unlicensed copyright infringement.  Thus the vast majority of the traffic involved is not “lawful” traffic and, therefore, is not subject to the rules.  The no-blocking rule (§ 8.5) only prohibits blocking of “lawful content, applications, services or non-harmful devices.”  (emphasis added)

Indeed, the FCC encourages network providers to move more aggressively to block customers who use the Internet to violate intellectual property law.  In ¶ 111, the Report makes crystal clear that the new rules “do not prohibit broadband providers from making reasonable efforts to address the transfer of unlawful content or unlawful transfers of content…open Internet rules should not be invoked to protect copyright infringement….”  (Perhaps the FCC, which continues to refer to BitTorrent as an “application” or believes it to be a website, simply doesn’t understand how the BitTorrent protocol actually works.)

Under the more limited wireless rules adopted, (3) and (4) would probably still be allowed as well.  We don’t know enough about (3) to really understand what is “alleged” to have happened, but the no-blocking rule (§ 8.5) says only that mobile broadband Internet providers “shall not block consumers from accessing lawful websites, subject to reasonable network management; nor shall such person block applications that compete with the provider’s voice or video telephony service, subject to reasonable network management.”

A mobile payment application wouldn’t seem to be included in that limitation, and in the case of the iPhone, it was Apple, not AT&T, that wanted to limit VoIP.

Even so, the Report makes clear that the wireless rule (¶ 102) doesn’t apply to app stores: “The prohibition on blocking applications that compete with a broadband provider’s voice or video telephony services does not apply to a broadband provider’s operation of application stores or their functional equivalent.”  So if incidents (3) and (4) stemmed from the rejection of proposed apps for the respective mobile devices, there would still be no violation under the new rules.

And the caveat for “reasonable network management” (§ 8.11(d)) says only that a practice is “reasonable if it is appropriate and tailored to achieving a legitimate network purpose, taking into account the particular network architecture of the broadband Internet access service.”  Voice and video apps, depending on how they have been implemented, can put particular strain on a wireless broadband network.  Blocking particular VoIP apps or video apps like Slingbox might be allowed, in other words.

So that’s it.  At most four actual examples of non-open behavior by ISPs in ten years.  And the rules adopted to curb such behavior would probably only apply, at best, to the single case of Madison River (1), a local telephone carrier with six hundred employees, in a case the FCC agreed to drop without a formal finding of any kind nearly six years ago.

But maybe these aren’t the real problems.  Maybe the real problem is, as many regulatory advocates argue vaguely, the lack of “competition” for broadband.  Since the first deployment of high-speed Internet, multiple technologies have been used to deliver access to consumers, including DSL (copper), coaxial cable, satellite, cellular (3G and now 4G), wireless (WiFi and WiMax), and broadband over power lines.  According to the National Broadband Plan, 4% of the U.S. population still doesn’t have access to any of these alternatives.  In many parts of the country, only two providers are available and in others, the offered speeds of alternatives vary greatly, leaving high-bandwidth users without effective alternatives.

If lack of competition is the problem, though, why not solve that problem?  Well, perhaps the FCC would rather sidestep the issue, since it has demonstrated it is the wrong agency to encourage more competition.  The FCC, for example, has supported legal claims by states that they can prohibit municipalities from offering wireless service, and has dragged its feet on approving trials for broadband over power lines—the best hope for much of the 4% who today have no broadband option, most of whom live in rural areas which already have power line infrastructure.

Indeed, if there are anti-competitive behaviors now or in the future, existing antitrust law, enforceable by either the Department of Justice or the Federal Trade Commission, provides much more powerful tools both to prosecute and remedy activities that genuinely harm consumers.

It’s hard, by comparison, to find many examples in the long history of the FCC where it has used its sometimes vast authority to solve a genuine problem.  The Carterfone decision, which Commissioner Copps cites enthusiastically in his concurrence, and (finally) the opening of long distance telephony to competition, certainly helped consumers.  But both (and other examples) could also be seen as undoing harm caused by the agency in the first place.  And both dealt with technologies and applications that were mature.  Why does anyone believe the FCC can “prophylactically” solve a problem dealing with an emerging, rapidly-evolving new technology that has thrived in the last decade in part because it was unregulated?

The new rules, which are aimed at ensuring “edge” providers do not need to get “permission to innovate” from ISPs, may have the unintended effect of requiring ISPs—and edge providers—to get “permission to innovate” from the FCC.  That hardly seems like a risk worth taking for a problem that hasn’t presented itself.

Radio commentary on rescuing the National Broadband Plan

I recorded a commentary today for KQED, NPR in the Bay Area, on the importance of the National Broadband Plan.  In the wake of tumult over net neutrality, Title II, and other regulatory gibberish, the important goals of the NBP, published in March of 2010, have been lost.  That’s unfortunate, because the authors did a great job of setting out ambitious goals essential to maintaining U.S. competitiveness.  The plan also relies for funding on private investment and incentives, giving it a realistic chance of success.

While recent polls indicate that few Americans want the government involved in encouraging adoption of broadband, I believe this is one example where intervention–if only of the cheerleading and goal-setting variety–is appropriate.  As I’ve written extensively elsewhere, the Internet’s success is a function of network effects, as succinctly described by Metcalfe’s Law.  The more people who have broadband access, the more valuable the network is for everyone.  And the better the chances for serendipitous new uses and applications to flourish.

Those of us who already have broadband access, in other words, would benefit just as much from getting non-users online as those users themselves.

Perhaps even more.
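The network-effects argument above can be made concrete with a minimal numerical sketch of Metcalfe’s Law (the function name here is my own illustration, not from the NBP): a network’s potential value grows roughly with the number of possible pairwise connections among its users, n(n−1)/2, so each new broadband subscriber adds value for every existing one.

```python
def network_value(n: int) -> int:
    """Possible pairwise connections among n users: n * (n - 1) / 2.

    Metcalfe's Law takes this count as a rough proxy for the
    network's total potential value.
    """
    return n * (n - 1) // 2


# Doubling the user base roughly quadruples the potential value,
# which is why adding the last few percent of non-users benefits
# everyone already online:
for users in (10, 20, 40):
    print(users, network_value(users))
```

The quadratic growth is the whole point: linear adoption gains produce super-linear value gains, which is the economic case for even modest government cheerleading on adoption.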

Cloud Users and Providers Win Big Privacy Victory – U.S. v. Warshak

The Sixth Circuit ruled on Tuesday that criminal investigators must obtain a warrant to seize user data from cloud providers, voiding parts of the notorious Stored Communications Act.  The SCA allowed investigators to demand providers turn over user data under certain circumstances (e.g., data stored more than 180 days) without obtaining a warrant supported by probable cause.

I have a very long piece analyzing the decision, published on CNET this evening.  See “Search Warrants and Online Data: Getting Real.”  (I also wrote extensively about digital search and seizure in “The Laws of Disruption.”)  The opinion is from the erudite and highly readable Judge Danny Boggs.  The case is notable if for no other reason than its detailed and lurid description of the business model for Enzyte, a supplement that promises to, well, you know what it promises to do….

The SCA’s looser rules for search and seizure created real headaches for cloud providers and weird results for criminal defendants.  Emails stored on a user’s home computer or on a service provider’s computer for less than 180 days get full Fourth Amendment protection.  But after 180 days the same emails stored remotely lose some of their privacy under some circumstances.  As the commercial Internet has evolved (the SCA was written in 1986), these provisions have become increasingly anomalous, random, and worrisome, both to users and service providers, as well as to a wide range of public interest groups.

Why 180 days?  I haven’t had a chance to check the legislative history, but my guess is that in 1986 data left on a service provider’s computer would have taken on the appearance of being abandoned.

Assuming the Sixth Circuit decision is upheld and embraced by other circuits, digital information will finally be covered by traditional Fourth Amendment protections regardless of age or location.  Which means that the government’s ability to seize emails (Tuesday’s case applied only to emails, but other user data would likely get the same treatment) without a warrant that is based on probable cause will turn on whether or not the defendant had a “reasonable expectation of privacy” in the data.  If the answer is yes, a warrant will be required.

(If the government seizes the data anyway, the evidence could be excluded as a penalty.  The “exclusionary rule” was not invoked in the Warshak case, however, because the government acted on a good-faith belief that the SCA was Constitutional.)

Where does the “reasonable expectation of privacy” test come from?  The Fourth Amendment protects against “unreasonable” searches and seizures, and, since the Katz decision in 1967, Fourth Amendment cases turn on an analysis of whether a criminal defendant’s expectation of privacy in whatever evidence is obtained was reasonable.

Katz involved an electronic listening device attached to the outside of a phone booth—an early form of electronic surveillance.  Discussions about whether a phone conversation could be “searched” or “seized” got quickly metaphysical, so the U.S. Supreme Court decided that what the Fourth Amendment really protected was the privacy interest a defendant had in whatever evidence the government obtained.  “Reasonable expectation of privacy” covered all the defendant’s “effects,” whether tangible or intangible.

Which means, importantly, that not all stored data would pass the test requiring a warrant.  Only stored data that the user reasonably expects to be kept private by the service provider would require a warrant.  Information of any kind that the defendant makes no effort to keep private—e.g., talking on a cell phone in a public place where anyone can hear—can be used as evidence without a warrant.

Here the Warshak court suggested that if the terms of service were explicit that user data would not be kept private, then users wouldn’t have a reasonable expectation of privacy that the Fourth Amendment protected.  On the other hand, terms that reserved the service provider’s own right to audit or inspect user data did not defeat a reasonable expectation of privacy, as the government has long argued.

An interesting test case, not discussed in the opinion, would be Twitter.  Could a criminal investigator demand copies of a defendant’s Tweets without a warrant, arguing that Tweets are by design public information?  On the one hand, Twitter users can exclude followers they don’t want.  But at the same time, allowed followers can retweet without the permission of the original poster.   So, is there a reasonable expectation of privacy here?

There’s no answer to this simplified hypothetical (yet), but it is precisely the kind of analysis that courts perform when a defendant challenges the government’s acquisition of evidence without full Fourth Amendment process being followed.

To pick an instructive tangible evidence example, last month appellate Judge Richard Posner wrote a fascinating decision that shows the legal mind in its most subtle workings.  In U.S. v. Simms, the defendant challenged the inclusion of evidence that stemmed from a warranted search of his home and vehicle.  The probable cause that led to the warrant was the discovery in the defendant’s trash of marijuana cigarette butts.  The defendant argued that the search leading to the warrant was a violation of the Fourth Amendment, since the trash can was behind a high fence on his property.

Courts have held that once trash is taken to the curb, the defendant has no “reasonable” expectation of privacy and therefore is deemed to consent to a police officer’s search of that trash.  But trash cans behind a fence are generally protected by the Fourth Amendment, subject to several other exceptions.

Here Judge Posner noted that the defendant’s city had an ordinance that prohibited taking the trash to the curb during the winter, out of concern that cans would interfere with snow plowing.  Instead, the “winter rules” require that trash collectors take the cans from the resident’s property, and that the residents leave a safe and unobstructed path to wherever the cans are stored.  Since the winter rules were in effect, and the cans were left behind a fence but the gate was left open (perhaps stuck in the snow), and the police searched them on trash pickup day, the search did not violate the defendant’s reasonable expectation of privacy.

For better or worse, this is the kind of analysis judges must perform in the post-Katz era, when much of what we consider to be private is not memorialized in papers or other physical effects but is likely to be intangible—the state of our blood chemistry, information stored in various databases, heat given off and detectable by infrared scanners.

The good news is that the Warshak case is a big step in including digital information under that understanding of the Fourth Amendment.  Search and seizure is evolving to catch up with the reality of our digital lives.

Interview on NPR Marketplace today

Larry was interviewed this morning on NPR’s Marketplace Tech Report.  Larry talks about why the FCC’s proposed net neutrality rules treat wireless broadband with a lighter touch than wired access.  Given the explosion in demand and carriers’ limited ability to add towers, increase capacity, and access additional radio spectrum (itself a constraint imposed by the FCC), it makes sense to give wireless carriers more flexibility to restrict high-bandwidth applications.  File sharing, HD video, and other applications are technically possible on a smartphone, but the networks may not yet be able to handle the load.

Brief thoughts on today's FCC Net Neutrality announcement

Late last night, FCC Chairman Julius Genachowski made explicit what he’d been hinting for weeks–that he was going to call for a vote in December on the agency’s long-running net neutrality proceedings.

Today, the Chairman gave a speech outlining a new version of the rules he has circulated to fellow Commissioners, which will be voted on at the Commission’s Dec. 21, 2010 meeting.

The new order itself has not yet been made public, however, and the Chairman’s comments didn’t give much in the way of details.  The latest version appears to reflect the proposed legislation circulated before the mid-term recess by then-Commerce chair Henry Waxman.  That version, for those following the ball here, was itself based on the legislative framework proposed by Google and Verizon, which itself emerged from informal negotiations convened over the summer at the FCC.

So in some sense the agency is moving, albeit non-linearly, toward some kind of consensus.

I have a brief article this morning in the Orange County Register laying out the pros and cons of this latest iteration, to the extent that is possible without seeing the order.

The timing of today’s announcement, however, is significant.  This was Genachowski’s last chance to wrap up the proceedings before the new Congress, with its Republican House and more evenly divided Senate, clocks in.  Republicans on their own don’t have the votes to pass legislation blocking the FCC from voting on net neutrality, but Republican leaders had threatened to use their oversight authority to put additional pressure on the FCC not to enact net neutrality rules.

That might still happen, of course, and already today several Republican leaders have promised to do whatever they can to undo today’s developments.  Assuming the Commission approves the rule at its December 21, 2010 meeting, there’s also a strong likelihood of litigation challenging the rules and the FCC’s authority to issue them.

So this is not the end of the net neutrality soap opera by any stretch of the imagination.  If anything, it suggests a new chapter, one that will take the discussion farther away from the technical architecture of the Internet and the best interests of consumers and closer to pure political theater.