Category Archives: Digital Life

Doing Nothing to Save the Internet

My essay last week for Slate.com (the title I proposed is above, but it must have been too “punny” for the editors) generated a lot of feedback, for which I’m always grateful, even when it’s hostile and ad hominem.  Which much of it was.

The piece argues generally that when it comes to the Internet, a disruptive technology if ever there was one, the best course of action for traditional, terrestrial governments intent on “saving” or otherwise regulating digital life is to try as much as possible to restrain themselves.  Or as they say to new interns in the operating room, “Don’t just do something.  Stand there.”

This is not an argument in favor of anarchy, or even more generally for social Darwinism.  I have something much more practical in mind.  Disruptive technologies, by definition, do not operate within the “normal science” of those areas of life they impact.  Their problems can’t be solved by reference to existing systems and institutions.  In the case of the Internet, that’s pretty much all aspects of life, including regulation.

By design, modern democratic government is deliberative, incremental, and slow to change.  That is an appropriate model for regulating traditional areas including property, torts, criminal procedure, civil rights, and business law.  But when applied to a new ecosystem—to a new frontier, as I suggest in the piece—that model doesn’t work.

Digital life is changing much faster than traditional regulators can hope to keep up with.  It isn’t just an interesting business use of information anymore, it’s a social phenomenon, one that has gone far beyond companies finding more effective ways to share data.  It’s also, increasingly, a global phenomenon, a poor match for local and even national lawmaking.

Digital life moves at the speed of Moore’s Law, and that is the source of its true regulation.  The Internet—acting through its engineers, its users, and its enterprises—governs itself and, while far from perfect, certainly seems to be doing a better job than traditional governments in their traditional venues, let alone online.

The piece gives a short quote from Frederick Jackson Turner, the groundbreaking historian of the American West.  The full quote gives additional context to my frontier analogy:

The policy of the United States in dealing with its land is in sharp contrast with the European system of scientific administration.  Efforts to make this domain a source of revenue, and to withhold it from emigrants in order that settlement might be compact, were in vain.  The jealousy and fears of the East were powerless in the face of the demands of the frontiersman.  John Quincy Adams was obliged to confess:  “My own system of administration, which was to make the national domain the inexhaustible fund for progressive and unceasing internal improvement, has failed.”  The reason is obvious:  a system of administration was not what the West demanded:  it wanted land.

A few key points from this passage are worth highlighting:

1.      Parochialism – Traditional governments attempting to regulate new and disruptive technologies rarely have the best interests of the users in mind.  Instead, they try to exploit the new ecosystem, at best, as a stalking horse for regulation they couldn’t get away with in traditional contexts but hope to foist on the more poorly organized inhabitants of the frontier.  At worst, governments captured by the vested interests most threatened by the disruption of the new technology attempt to slow down the pace of change, to preserve the interests of those in the process of being upended.

That’s in part why, despite increasingly desperate efforts by the East to impose its regulatory will on the West, those efforts failed.  The East was interested in exploiting western lands for its own benefit, not optimizing the West’s potential to create a new kind of society and economic system.  The East was working against the momentum of transformation.  It understood little of how frontier life was evolving, and its laws couldn’t keep up with the pace of change even if they were enforceable, which they weren’t.  Nor should they have been.

One need only look to one of the first U.S. efforts to regulate the Internet for an example of the first kind of lawmaking.  The Communications Decency Act, passed in 1996 and signed by President Clinton, banned classes of content on the Internet that were perfectly legal in the U.S. in any other media.  (Similar bans have been enacted, often with more bite or more focused morality, in other countries and jurisdictions, including Thailand, Pakistan, China, and the E.U.)

That law, like subsequent efforts to impose an antediluvian morality on U.S. Internet users, was summarily tossed out by the U.S. Supreme Court as a facial violation of the First Amendment.  Its passage inspired John Perry Barlow to issue his famous “Declaration of the Independence of Cyberspace,” which pointed out correctly that traditional governments have anything but the best interests of this new environment in mind when they put pen to paper.

As an example of regulation to protect vested (and obsolescent) interests, consider the 1998 Digital Millennium Copyright Act, in which content owners unwilling or unable to adapt to the new physics of digital distribution convinced their lawmakers to impose brutally restrictive new limits on digital technologies.  They bought themselves far greater protection from reverse engineering, fair use, and the First Sale doctrine than they had achieved in the real world.

Whether those protections are enforceable, and whether content owners used the time they bought to prepare for a more orderly transition to digital life, remains to be seen.  But the prospects are predictably poor.  Just ask Pope Urban VIII, who had Galileo condemned for insisting that the Earth revolved around the Sun.  No matter how long Galileo remained under house arrest, the orbits didn’t change.

Indeed, it’s hard without doing an exhaustive survey to think of a single piece of traditional law aimed at helping or saving the Internet that wasn’t at best naïve and at worst intentionally harmful—including laws that grant law enforcement more powers online than they have in their native territory.  That’s why I’m surprised when some of my fellow frontiersmen short-sightedly rush back to Washington at the first sign of trouble with Native populations, or with saloon-keepers, or with the railroads, or with any other participant in the ecosystem who isn’t living up to their standards.  They should know that it’s both dangerous and pointless to do so.

2.      Impotence – In some sense, in other words, it doesn’t matter whether terrestrial governments regulate or not.  We have ample evidence—file-sharing, spam, political dissent, porn, gambling—that even those activities that have been banned go on without much regard for the legal consequences.  The government of Egypt (and Burma, and Pakistan, and China) can shut down Internet access for a short or for a long period of time.  But the disruption in service is a mere blink of the eye in Internet time.  Let’s see who wins the stand-off that ensues, and how quickly the Law of Disruption takes hold.  Bets gladly accepted here.

As Barlow wrote in his Declaration, “You have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear.”  Put another way, in nearly every conflict between Moore’s Law and traditional law, Moore’s Law wins.  Digital life will make its own “social contract” whether traditional governments give it permission to or not.

3.      Reverse engineering government – To repeat, the absence or ineffectiveness of traditional regulators in digital life does not translate to anarchy and chaos.  There is a social contract to online life, and it will be followed by more organized and organic forms of governance.  As I wrote in the piece, “the posse and the hanging tree gave way to local sheriffs and circuit-riding judges.”

That does not mean, however, that over time the old forms of government and regulation will finally win the battle and establish their norms on digital life.  Quite the opposite.  What has been and will continue to develop are forms of online governance that are suited to the unique environmental properties of digital life.

For now, we can already see that the new institutions will be more democratic—more directly democratic—for better and for worse.  (As Madison said, “Had every Athenian citizen been a Socrates, every Athenian assembly would still have been a mob.”)  Watch how the users of Facebook, Twitter, YouTube, World of Warcraft, iTunes, and Android respond to efforts by the sovereigns of these domains to dictate the terms of the social contract, and you’ll see how the new social contract is being worked out.

There’s more.  Turner points out that the organic forms of governance that emerged from the American West didn’t simply create a new form of frontier law.  They created American law.  Once the global inhabitants of digital life work out their rules and enforcement mechanisms, in other words, they are unlikely to settle for a system any less efficient back on terra firma.  Turner writes, “Steadily, the frontier of settlement advanced and carried with it individualism, democracy, and nationalism, and powerfully affected the East and the Old World.”

Who will impose their collective will on whom, and which form of government will become obsolete?  Again, anyone care to place a wager?

This is starting to sound like the outline of something much longer.  So I’ll stop there.

Badges? We don’t need no stinking badges: Reading the FCC’s Net Neutrality Order (Part V)

(Follow the links for Part I, Part II, Part III and Part IV.)

In this final post on the FCC’s Dec. 23, 2010 Open Internet Report and Order, I’ll look briefly at the problematic legal foundation on which the FCC has built its new regulations on broadband Internet access.  That discussion need only be brief largely because the extended legal analysis has already been admirably detailed by FCC Commissioner Robert McDowell.  His dissent (see pages 145-177 of the Report and Order) calmly and systematically dismantles the case made by the majority (See ¶¶ 115-150).

This is no theoretical discussion of statutory interpretation.  Even before the rules have been published in the Federal Register, two broadband providers—Verizon and then MetroPCS—have already filed lawsuits in the D.C. Circuit Court of Appeals challenging the FCC’s authority to regulate.  (See Jim DeLong’s definitive deciphering of Verizon’s efforts to secure exclusive jurisdiction in the D.C. Circuit.)  The arguments sketched out in Commissioner McDowell’s dissent are likely to mirror the complainants’ briefs in these and other likely Petitions for Review of the Order.

The Need for Authorization

Nate Anderson of Ars Technica, who did a great service in running side-by-side the provisions in the FCC’s final Order and the terms of Verizon-Google’s proposed legislative framework, asks the key question, “Why is Verizon suing over net neutrality rules it once supported?”

I wouldn’t and didn’t (see Part III) go as far as Anderson, who concludes that, “on substance, [Verizon] got exactly what it wanted.”  Both the final rules and the Verizon-Google proposal closely tracked, with important differences, the original order the FCC proposed in October, 2009.  And there are material differences between what Verizon-Google proposed and what the FCC ultimately voted on, notably in the treatment of mobile broadband.

But those details aside, there is one crucial difference that Anderson acknowledges.  As he writes, “the Verizon/Google proposal did make one other suggestion: it should be passed by Congress, not the FCC….”

That might seem like a small enough difference.  Rules are rules; what difference does it make whether the FCC passed them under its rulemaking authority or Congress put them into a new statute, such as the Internet Freedom Preservation Act, which would naturally have given the FCC authority to enforce them anyway?

But in fact that procedural difference embodies the principal objection not only to the Report and Order but to the process by which it was completed.  Put simply, Congress alone has the power to regulate; the FCC can only act on authority delegated to it by Congress.  Any rulemaking undertaken without authority is not only dangerous, but also unconstitutional.

And Congress, it’s clear, has not delegated authority to the FCC to regulate broadband Internet access.  What Verizon and others are most concerned with is that if the FCC somehow gets away with passing new rules anyway, the agency will have established a dangerous precedent.  Any time in the future that the FCC or any other federal independent agency wants to extend its power, it need only deputize itself.

That is the feature of the Open Internet Report and Order that has most alarmed the communications industry, members of Congress, and advocates of limited government.  And that is principally why the House has promised to reverse the ruling, even as Verizon and others challenge it in court.  In short, the text of the rules aside, it very much matters that the FCC, and not Congress, took up elements of the framework proposed by Verizon-Google.

Regulatory Overreach is not a New Problem

The problem of regulatory overreach goes far beyond net neutrality.  Under a novel and somewhat fragile arrangement that was worked out during the New Deal, independent federal regulatory agencies can exercise considerable authority that the Constitution, on its face, reserves to the Legislative and Judicial branches.  Indeed, the early New Deal Supreme Court overturned much of FDR’s regulatory agenda under the so-called “nondelegation doctrine.”

After FDR threatened to “pack the court” with more sympathetic Justices, a key swing Justice changed sides, saving the Court and the New Deal.  (The so-called “switch in time that saved nine,” which few people realize is a pun on the proverb that “a stitch in time saves nine.”)

But even so, federal regulators operate under strict controls that ensure they do not become, to use the Supreme Court’s word for earlier FCC power grabs, “untethered” in their authority.  FCC Commissioners are appointed by the President and confirmed by the Senate, and serve fixed terms from which they can be removed only for cause.  No more than three of the five Commissioners may be members of the same political party.

Both the rulemaking (legislative) and adjudicatory (judicial) powers of the agency are strictly limited by implementing statutes passed by Congress.  If the agency hasn’t been given explicit powers to regulate, only Congress can delegate additional powers, regardless of the appearance or reality of significant market failures.  And the courts, in the checks-and-balances system, are the final determinants of what powers have and have not been granted to an agency.

So the FCC has a problem.  It wants to regulate broadband Internet providers to ensure the “level playing field” it believes essential to the success of the Internet.  But Congress has never given it authority to do so, and has failed since 2004 to pass new legislation that would grant additional authority.

The FCC has actually lost ground during the rulemaking process.  An effort to enforce its Open Internet policy statement through adjudication against Comcast was rejected in April, 2010, further limiting the wiggle room the agency might have had to go forward with the formal rulemaking it began in October 2009.   (The rulemaking was, in some sense, an effort to formalize the policy statements.)

What’s the problem?  Briefly:  Under the Telecommunications Act of 1996, and consistent with earlier versions of the FCC’s implementing statute, the agency was given broad authority over common carrier telephone service (Title II of the Act) but almost no authority over information services (Title I of the Act), a category once known as “enhanced” or “ancillary” services that, before Internet access, included call waiting and other supplements to telephone service.  The one exception was Internet access provided by dial-up modems, which of course is no longer a significant source of access.

The Comcast case, in line with several earlier D.C. Circuit and Supreme Court cases, made clear that Title I simply did not delegate authority over broadband access.

There was nothing new in that.  The FCC has made numerous efforts to attach otherwise unauthorized regulations to Title I’s so-called “ancillary jurisdiction,” but the courts frequently reject these efforts as overreaching.

For example, in 2005 the D.C. Circuit rejected regulations the FCC approved that would have required consumer electronics manufacturers to include “broadcast flag” technology in any device capable of receiving a television signal—a regulation that was grounded in ancillary jurisdiction over television broadcasters.  But while the agency had unquestioned authority over broadcasters, it could not require non-broadcasters to comply with rules aimed at helping the broadcasters control unauthorized home taping.

At oral argument, the judges nearly laughed the FCC out of court.  “You’re out there in the whole world, regulating. Are washing machines next?” asked Judge Harry Edwards. Judge David Sentelle added, “You can’t regulate washing machines. You can’t rule the world.”

The result in the Comcast case was much the same.  And the October, 2009 NPRM had grounded its authority to proceed solely in Title I.  With that avenue all but foreclosed to the agency by Comcast, the Chairman found himself in one of several corners he inhabited over the last year.  Congress was unlikely to move on any of the net neutrality bills floating around committees (and indeed, did not do so), but Genachowski was committed to the rulemaking.

The FCC’s “Very Smart Lawyers” Try Again

What to do?  One option was to undertake a “reclassification” of broadband Internet to categorize it as a telephone service subject to Title II, a section of the law that comes with fifty-plus years of baggage from the regulation of the former telephone monopoly.  The Commission (for now) has wisely avoided taking that step, which itself would have been subject to substantial legal challenges.

The authority stalemate seemed to doom the net neutrality proceeding.  But then in late Fall FCC Chairman Julius Genachowski told the audience at the Web 2.0 Summit that the FCC’s “very smart lawyers” had figured out a way to get around the Title I/Title II problem.  The net neutrality faithful and faithless waited, with breath held.

In the final Report and Order, however, all we really got was a rerun of the argument that had failed in the Comcast case, with only minor tweaking.  Again, Commissioner McDowell’s detailed dissent explains the weakness of the argument without the need for much added commentary.

The courts have consistently told the FCC that to invoke ancillary jurisdiction, a rulemaking must be reasonably related to a specific delegated power elsewhere in the Communications Act.  It has to be “ancillary” to some other authority the Commission already has, in other words.  Title I gives no powers on its own over “information services.”  In the Comcast case, the FCC listed off several provisions in hopes that at least one of them would stick, but the court rejected all of them.

In the Order (¶¶ 124-137), the FCC tries several new provisions.  Obviously the best bets were already exhausted in the Comcast case, so these provide even weaker bases for ancillary authority over broadband Internet than the laundry list rejected by the court in Comcast.  Most get only perfunctory explanation.  The FCC knows it is on thin ice.

Instead, the Order relies principally on a new and unconvincing reading of Section 706 of the Act.  (See ¶¶ 117-123)  Section 706 had formed the principal argument in Comcast as well, but there the agency argued that Section 706 was the provision that enabled it to use ancillary authority over Title I Information Services.  The court rejected that argument.

The revised Section 706 argument is that that provision in and of itself provides sufficient authority for the FCC to implement the Open Internet rules.  Well, here it is:

SEC. 706. ADVANCED TELECOMMUNICATIONS INCENTIVES.

(a) IN GENERAL-The Commission and each State commission with regulatory jurisdiction over telecommunications services shall encourage the deployment on a reasonable and timely basis of advanced telecommunications capability to all Americans (including, in particular, elementary and secondary schools and classrooms) by utilizing, in a manner consistent with the public interest, convenience, and necessity, price cap regulation, regulatory forbearance, measures that promote competition in the local telecommunications market, or other regulating methods that remove barriers to infrastructure investment.

(b) INQUIRY-The Commission shall, within 30 months after the date of enactment of this Act, and regularly thereafter, initiate a notice of inquiry concerning the availability of advanced telecommunications capability to all Americans (including, in particular, elementary and secondary schools and classrooms) and shall complete the inquiry within 180 days after its initiation. In the inquiry, the Commission shall determine whether advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion. If the Commission’s determination is negative, it shall take immediate action to accelerate deployment of such capability by removing barriers to infrastructure investment and by promoting competition in the telecommunications market.

(c) DEFINITIONS- For purposes of this subsection:

(1) ADVANCED TELECOMMUNICATIONS CAPABILITY- The term ‘advanced telecommunications capability’ is defined, without regard to any transmission media or technology, as high-speed, switched, broadband telecommunications capability that enables users to originate and receive high-quality voice, data, graphics, and video telecommunications using any technology.

On its face, neither 706(a) nor 706(b) would appear to give the FCC power to put regulatory constraints of any kind on how broadband Internet access providers operate.  The goal of this section is to encourage the FCC to promote broadband by “regulating methods that remove barriers to infrastructure investment,” including forbearance from use of its existing powers.  As Commissioner McDowell explains, the provision was historically aimed at removing regulations of Title II telephone carriers that hindered their ability to provide advanced telecommunications capability.

The reliance on Section 706(b) is even stranger, and deeply cynical.  It requires the FCC to issue a regular report on broadband deployment and, if it finds such deployment is not taking place in a “reasonable and timely fashion,” to take “immediate action to accelerate deployment” by “removing barriers” to investment.

Again, as Commissioner McDowell notes, the 706(b) Reports have consistently found broadband deployment to be proceeding at a rapid pace, confirming what everyone already knows.  Americans are signing on to the Internet faster than with any previous information technology, whether through wireline or, increasingly, wireless broadband.

That is, until July, 2010, a few short months after the Comcast decision.  For the first time ever, the 706(b) Report found that “broadband deployment to all Americans is not reasonable and timely.”  (The Report, along with the Open Internet Order, was approved on a party-line 3-2 vote of the Commission.)  This despite the fact that broadband availability grew from 15% of Americans in 2003 to 95% in 2010 (data made available in the National Broadband Plan as well).

The negative 706(b) Report was clearly a pretext to give the agency the ability to trigger the “immediate action” language of Section 706(b), but even then, see above, the action the FCC is supposed to take is in the nature of deregulating broadband, not adding additional regulations.  How will rules that limit the operational flexibility of broadband providers “accelerate deployment”?  The majority argues simply (¶ 123) that “Section 706(b) provides express authority for the pro-investment, pro-competition rules we adopt today.”  Hardly.

The effort to connect Section 706 to the Open Internet rules is, charitably, flimsy at best.  But there’s yet another problem.  The FCC has already foreclosed that connection.  The agency has long rejected the view, which it now adopts, that Section 706 provides any explicit authority for rulemaking, whether on its own (the new argument) or as a hook for ancillary jurisdiction under Title I.

As the D.C. Circuit noted in the Comcast case (Slip. Op. at 30-31), “In an earlier, still-binding order, the Commission ruled that section 706 ‘does not constitute an independent grant of authority.’  Instead, the Commission explained, section 706 ‘directs the Commission to use the authority granted in other provisions . . . to encourage the deployment of advanced services.’”  So Section 706 doesn’t give the agency any regulatory authority, just guidance on how to apply (or not) other provisions in the Act.  That, at least, has long been the FCC’s own view of the law, a view to which courts will give considerable deference.

In dispensing with the Section 706 argument in Comcast, the court concluded that “Because the Commission has never questioned, let alone overruled, that understanding of section 706, and because agencies ‘may not . . . depart from a prior policy sub silentio,’ the Commission remains bound by its earlier conclusion that section 706 grants no regulatory authority.”  (citations omitted)

That last sentence seemed to leave the door open just a crack for the FCC to “depart from its prior policy” in an explicit way.  And, it’s possible to read the Report and Order as doing just that.  (See ¶ 122 for the majority’s hilarious explanation for why it had never before noticed that Section 706 granted explicit authority.)

But not so fast.  While agencies have broad discretion to overrule earlier decisions, there must be some rational basis for doing so. There must be some changed circumstances, some evidence, some explanation that passes the sniff test. A reviewing court will look to see if there is some external evidence that justifies the changed interpretation of Section 706.

And there’s nothing here that meets even that minimal standard.  Again, to quote Commissioner McDowell (Report at 148), “This move is arbitrary and capricious and is not supported by the evidence in the record or a change of law.”  Losing the Comcast case is not reason enough, but that seems to be all that’s happened to justify this surprising new understanding of a 15-year-old provision in the FCC’s implementing statute.

Preserving which Internet again?

The rest of the FCC’s “Authority” section, as noted, throws in the rest of the kitchen sink, largely provisions of Title II that Comcast didn’t already dispose of.  The connection between the Open Internet rules and the Commission’s regulatory powers over telephone service, television and radio broadcasting, cable TV, and spectrum management is just too tenuous to be convincing to a reviewing court.  If that authority is close enough to support net neutrality, it’s close enough to support anything, including, for example, the broadcast flag rules already overturned.

There’s more.  Tying the net neutrality rules to problems of VoIP, IP-based television broadcasting, IP radio, and other video and audio services, as one of my law professors used to say, proves too much.  It actually undermines the FCC’s position by bringing the agency’s real problem here into sharp focus.

Since Congress last reviewed the agency’s authority in 1996, the Internet’s packet-switching protocols have quickly and surprisingly taken over as the super-dominant technology for all forms of communications, traditional and new.  The worlds of television, radio, and computing have changed completely, leaving little of the world the 1996 Act gave the FCC authority to regulate.  Even the “Internet” as we knew it in 1996 looks little like the robust ecosystem of digital life that we enjoy today.

Which brings us squarely back to the problem of “nostalgia” I described in the previous post.  The FCC is operating under a statute that has its origins in the 1930’s, and which was last updated (poorly) fifteen years ago, when dial-up consumer Internet was still very much in its infancy.  The communications, computing and entertainment industries operated in silos with little overlap.  Each had its own established players and long histories of regulatory intervention.

But these and other related industries have all undergone nearly complete transformation in the intervening years, largely outside the notice or authority of the FCC to intervene.  Device and content convergence is a reality.  Consumers now use far more computing resources than do businesses.

Meanwhile, those aspects of the industry still under strict FCC control—including Plain Old Telephone Service (POTS) and over-the-air television and radio—have gone into deep decline.  They’ve become legacy businesses that owners can’t even exit, because no one is interested in the dwindling assets.

That’s no coincidence.  Those businesses (in some cases parts of companies whose unregulated operations are thriving), thanks to the regulatory environment in which they operate, are simply unable to respond quickly to rapidly evolving new technologies, applications, and consumer demands.  They suffer from a regulatory disease closely related to the Innovator’s Dilemma.  They couldn’t adapt even if they had the will to do so.

Continued efforts, including this one, to fit round regulatory pegs into square statutory holes underscore that the FCC has no authority over what has evolved into our new and magical communications platform.  The agency has none because Congress hasn’t given it any.  Period.

Moreover, invocations (incantations?) of outmoded, obsolete, and inapplicable provisions of the old communications law also remind us how much progress has been made during the period when the FCC has been unable or unwilling to intervene in the evolution of that platform.

Probably not the conclusion the FCC was hoping to have drawn from its nearly 200-page Report.  But there you have it.

TechFreedom Launches to "Promote the Progress of Technology"

I’m pleased to join in announcing the launch of TechFreedom, a non-profit, non-partisan technology policy think tank.  TechFreedom’s mission is “to promote the progress of technology that improves the human condition and expands individual capacity to choose.”

I have agreed to participate in TechFreedom’s activities as a Senior Adjunct Fellow.  As my first contribution, we will be republishing my multi-part blog on the FCC’s Open Internet Report and Order as a white paper, and hope to host a conference on the topic in late February.

TechFreedom officially launched last week, just after the State of the Net 2011 Conference.  Video of the opening conference, which featured FCC Commissioner Robert McDowell, is available by following the link.

The launch of TechFreedom also coincides with the publication of “The Next Digital Decade:  Essays on the Future of the Internet,” edited by Berin Szoka and Adam Marcus.  The book includes essays from a wide range of authors, including David Johnson, Hal Varian, Tim Wu, Jonathan Zittrain and the Hon. Alex Kozinski.

I have two essays in the book, one on information economics and the other on the so-called privacy “debate.”

The book is available as a free PDF download, or for purchase in hardcover.

Congress's Tech Agenda: Something Old, Something Older

I reported for CNET yesterday on highlights from the State of the Net 2011 conference, sponsored by the Advisory Committee to the Congressional Internet Caucus.  Though I didn’t attend last year’s event, I suspect much of the conversation hasn’t changed.

For an event that took place nearly a month after the FCC’s “final” vote on net neutrality, the issue seems not to have quieted down in the least.  A fiery speech from Congresswoman Marsha Blackburn promised a “Congressional hurricane” in response to the FCC’s perceived ultra vires decision to regulate where Congress has refused to give it authority, a view supported by House and Senate counsel who spoke later in the day.

There seemed to be agreement from Republicans and Democrats that undoing the Open Internet Report and Order was the Republicans’ top priority on the tech agenda.  Blackburn has already introduced a bill, with at least one Democratic co-sponsor, to make clear (clearer?) that the FCC has no authority to regulate any Internet activity.  And everyone agreed that the Republicans would move forward with a resolution of disapproval under the Congressional Review Act, and that the resolution would pass the House and probably the Senate.  (Such resolutions are filibuster-proof, so Senate Republicans would need only a few Democrats.)

House Energy and Commerce senior counsel Neil Fried had mentioned the CRA resolution at CES a few weeks ago.  But now it’s been upgraded from a possibility to a likelihood.

The disagreement comes over whether President Obama would veto the resolution. Speculating in a vacuum, as many participants did, doesn’t really help.   The answer will ultimately depend on what other horse trading is in progress at the time.  (See:  tax cuts, health care, etc.)  Much as those of us who follow net neutrality may think it’s the center of the political universe, the reality is that it could easily become a bargaining chip.

That’s especially so given that almost no one was happy with the rules as they were finally approved.  Among advocates, opponents, and even among the five FCC Commissioners, only Chairman Genachowski had any enthusiasm for the Order.  (He may be the only enthusiast, full stop.  On a panel on which I participated on the second day, advocates for net neutrality were tepid in their support of the Order or its prospects in court.  I think tepid is being generous.)

And everyone agreed that there would be legal challenges based on the FCC’s dubious statutory authority.  Amy Schatz of the Wall Street Journal said she knew of several lawyers in town shopping for friendly courts, and that pro-regulation advocates may themselves challenge the rule.  Timing could be important, or not.

Beyond net neutrality, which seems likely to dominate the tech agenda for the first six months of the new Congress, bi-partisan words were flung over the need to resolve the imminent (arrived?) “spectrum crisis,” and to reform the bloated and creaky Universal Service Fund.  These, it’s worth remembering, were two of the top priorities from last year’s National Broadband Plan, which sadly disappeared into the memory hole soon after publication.

Other possible agenda items I heard over the course of the two-day event, but much farther down the list:  revival of COICA (giving DHS new powers to seize domains used for trademark and copyright violations), privacy, cloud computing, cybersecurity, ECPA reform, retransmission, inter-carrier compensation, and the Comcast/NBC merger.  I missed a few panels, so I’m sure there was more.

What are the chances any of these conversations will actually generate new law?  Anybody?

The Future: It's Not What it Used to Be

Adam Thierer pointed me to a posting by Maria Popova at Brain Pickings, in which Popova has pieced together a somewhat cheesy 1972 documentary based on Alvin Toffler’s classic “Future Shock.”  The over-the-top narration by Orson Welles alone is worth your time.  I read Future Shock as a kid (I was 11 when it was published) and didn’t really understand it, but I loved its urgent tone and its sense of revealing a secret the “establishment” couldn’t handle.

Toffler was really the second person to point out the gap between the capacity of technology to change our lives and the ability of human beings to absorb that change.  The first (or at least an earlier version) comes from Henry Adams in his autobiographical “The Education of Henry Adams.”

Adams called the phenomenon, which he claimed had shipwrecked his life, “The Law of Acceleration,” a revelation that struck him in the hall of dynamos at the 1900 Great Exposition in Paris, where he seems to have suffered a serious mental breakdown.  Working with pseudo-math and pseudo-physics, Adams calculated that the accelerating pace of change would end civilization by around 1920.

The world didn’t end in 1920, nor did it end in 1970.  Although from the standpoint of Henry Adams (who died in 1918), the world of today might represent something so different and unpleasant that it could be described as post-apocalyptic.

Taking a more economic view of the process, I referred to it as The Law of Disruption.  Some technological breakthroughs cause considerable chaos and revolutionary change in business and society, it’s true, but what results on the other side, for those who survive the change, is always something better, or at least more efficient.

The period of adjustment is, to me, the most interesting.  Lucky for me, then, that we seem to move from one period of adjustment to the next, or even to overlapping periods coming faster and closer together as “The Law of Acceleration” proceeds.

As excited as we get by the latest innovations in information, medical, materials, and transportation science, we can never exploit those changes as fast as the technology itself would make possible.  We need time to adjust, and to replace our inadequate metaphors (horseless carriage, radio with pictures, iron rope, etc.) for a new reality in which disruptive technologies are recognized as something new and not simply an incremental improvement.

From a business standpoint, the gap is an opportunity—companies can focus on helping customers and other business partners move up the curve just a little faster than they might on their own.  In some ways it’s like the old joke about the two campers who hear a bear outside their tent.  One of them starts to put on his shoes and the other says, “What are you doing?  You can’t outrun a bear.”  “I don’t have to outrun a bear,” the first one says.  “I just have to outrun you.”

But outside the business context, anxiety about the future—future shock—can look like a threat rather than an opportunity.  In my experience, in fact, you can divide the world into two kinds of people:  those who look forward to the sometimes uneven and messy process of adaptation and those who fear it.  Will the future be utopian (the United Federation of Planets) or dystopian (The Borg)?  The answer is of course neither—some things change for the better, some don’t, but mostly the course is unpredictable.  How someone views the future says much more about who they are now than anything else.

Toffler, as best I remember the book, didn’t really take a position, but his talent and the phenomenal success of the book rested on his ability to make the anxiety seem as unbearable as possible.   The transformation of society was inevitable, Toffler agreed, but the process was creating social chaos and leaving people to suffer from “shattering stress and disorientation.”

The changes Toffler wrote about (as a journalist, he largely collected example after example and lined them up as overwhelming evidence) seem modest and even quaint by comparison to those we are now experiencing forty years later.  Clearly, humanity survived, more-or-less intact.

Which suggests that we’ll survive the next wave, and the one after that, as well.  (Further still, well, who knows?)  Transformation is an inevitable feature of modern life, and isn’t likely to go away.  When change is constant, the only thing you can predict is unpredictability.  But at least you can predict that!

Watching the documentary, you get a sense of perspective from the inability of a previous generation to imagine its survival.  The anxiety was real, but the result wasn’t nearly as disastrous as imagined.  So maybe we’ll cope just fine, too, with our dangerous inventions.

It’s not as if we have any choice.  As the Firesign Theatre famously albeit cryptically said, “Live in the future.  It’s happening now.”  (To which a character in the background retorts: “The future?  The future’s not here yet, man.”)

I met Alvin Toffler in 1999.  We were teamed together for a one-day program, incongruously, in Buenos Aires.  Toffler spoke in the morning, and I spoke in the afternoon.  We weren’t asked to coordinate our messages, but it more-or-less worked out that he was giving the pessimist view of the future and I was giving the optimist view.  Most of all, I was interested in what new examples Toffler had found in his research on future shock—examples I hoped to use in my own speeches.

He spoke eloquently and powerfully, but all the examples he gave were the same old ones he had been using all along.  His research—at least—was stuck in 1970, and he seemed largely unaware of the digital revolution already well in progress by 1999.   (I have not read his most recent book, published in 2006.)

But in some sense, Toffler found—perhaps unintentionally—a cure for future shock.  Just keep living in the past, even a past in which you still fear a future that has already arrived and kept going.

He is of course not alone.  The hell we’re familiar with is always easier to manage than the heaven that might be achieved.

“Preserving the Internet,” But Which One?: Reading the FCC’s Net Neutrality Order (Part IV)

This is Part IV of a five-part commentary on the FCC’s Dec. 23, 2010 “Open Internet” Report and Order.

Part I looked at the remarkably weak justification the majority gave for issuing the new rules.

Part II explored the likely costs of the rules, particularly the undiscussed costs of enforcement that will be borne by the agency and accused broadband access providers, regardless of the merits.  (See Adam Thierer’s post on the first attenuated claim of violation, raised before the rules even take effect.)

Part III compared the final text of the rules to earlier drafts and alternative proposals, tracing the Commission’s changing and sometimes contradictory reasoning over the last year.

Part IV (this part) looks at the many exceptions and carve-outs from the rules, and what, taken together, they say about the majority’s dogged determination to see the Internet as it was and not as it is or will become.

Part V will review the legal basis on which the majority rests its authority for the rules, likely to be challenged in court.

What does an Open Internet mean?

The idea of the “open Internet” is relatively simple:  consumers of broadband Internet access should have the ability to surf the web as they please and enjoy the content of their choice, without interference by access providers who may have financial or other anti-competitive reasons to shape or limit that access.

In the act of trying to translate that idea into enforceable rules—enforceable, inexplicably, by a federal regulatory agency with no legislative authority over any substantial feature of the Internet economy and no real justification for creating rules of any kind for a system that is working nearly flawlessly so far—the FCC has found itself tied in unholy knots.

The rules as enacted carved out exceptions and caveats that, taken together, render the final regulations not meaningless but certainly incoherent.

In exempting from the rules a host of important innovations in network management and infrastructure optimization developed over the last decade, the FCC has stepped back from the brink of its original plan, which would have returned the Internet to the days of unreliable dial-up access and static websites.

But it has also revealed the danger of trying to regulate a rapidly-evolving life form, and risked the unintended consequence of denying it future forms of nutrition and good health.  If these rules stand and are vigorously enforced, the Internet’s further growth and development may be stunted.

The Mythical Neutrality Principle

Back in the stone age of 1998, I wrote in “Unleashing the Killer App” that one of the fundamental bases on which the Internet became an engine of innovation and even social change was that its basic protocols are non-proprietary.  Anyone can make use of them, any device can support them, and every node is a peer—without paying royalties or other tribute to anyone.  As the “lowest common denominator” standard, TCP/IP benefited from network effects to overtake several popular proprietary standards, including IBM’s SNA.

The technical and legal openness of TCP/IP has been romanticized over the years, particularly by legal scholars and journalists who know less about technology than they think they do, into a view of the Internet as a meritocracy and a vehicle for true collaboration and consciousness-raising; the fruition, as Tim O’Reilly put it, of “what we were talking about at Esalen in the ’70s—except we didn’t know it would be technology-mediated.”

The ideal of neutrality—of a level playing field in which every website, application, and device is no more prominent than any other—is a persistent and compelling myth.  It evokes the heroism of the entrepreneur in the garage, developing the next Yahoo or Google or YouTube or Facebook or Twitter or Groupon, with little more than a great idea, technical skills, and the willingness to sacrifice sleep and social life for the promise of a future liquidity event (optimally, the great IPO), or the chance to change the world and make it a better place by connecting people and information in new and unexpected ways.  Wikipedia, for example.

Whatever the motivation, after a grueling race against the clock, the app is released.  If all goes well, it reaps the benefit of Metcalfe’s Law, goes viral, and becomes the next Big Thing, all in the span of time between one SXSW conference and the next Web 2.0 Summit.

No large corporation can stop the plucky inventor, or ransom a part of her invention.  No access provider can hold its invaluable user base hostage.  No competing content provider, no matter how giant, can buy up all the available market channels and freeze out the upstart start-up.  No government regulator need approve or license the invention before human testing and general use can begin.

When Worlds Collide

A considerably more mundane version of that ideal world did exist in the last half of the 1990’s.  It still exists today.  But it has become much more complex and nuanced in the last decade.

The Internet, the Web, the Cloud and the app-based economy of wireless computing devices, TVs and increasingly other things (including cars and other non-traditional computing platforms such as consumer electronics and home appliances) have evolved in interesting and productive ways, often “under the covers” of the network infrastructure.

Few consumers know or would care to know about the existence, let alone the details, of network optimization algorithms, content delivery networks, complex peering arrangements, caching and edge servers, file torrenting, mirror sites, specialized services, virtual private networks, packet prioritization based on media type, spam and other malware filters, dynamic IP addresses or domain name redirection.

All of these (and more) are mechanisms for speeding up the delivery of the most popular or the most bandwidth-intensive content.  Many have been developed by entrepreneurs or by the large access and hosting services, often working in concert with the voluntary protocol and technical committees of the Internet Society.

ISOC keeps the standards alive, flexible, and responsive to new opportunities for expansion and reinvention made possible through the agency of Moore’s Law, which continues to drive the basic technological components of digital life into the uncharted realm of the faster, cheaper, and smaller.

Strictly speaking, of course, all of these innovations violate the neutrality principle.  They recognize that some packets, whether because of file size or popularity or media characteristics or importance to the recipient, require special treatment in the transport from host to client.

Video (YouTube, Hulu, Netflix), for example, can consist of very large files, and the component packets must arrive at their destination with relatively short delays in order to maintain the integrity of streaming display.

Hosted services, such as medical monitoring, use parts of the same infrastructure as the public Internet, but cannot safely be left to the normal ebb and flow of Internet traffic patterns.  Limitations of the 3G wireless infrastructure—in large part a result of regulatory restrictions on cell siting and spectrum mismanagement—make it difficult to satisfy exploding customer demand for ever-more of the most bandwidth-intensive apps.
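For readers curious what this kind of “special treatment” looks like in practice, here is a minimal sketch (in Python) of class-based packet scheduling, the sort of practice the FCC’s “reasonable network management” exception contemplates.  The traffic classes, rankings, and names are invented purely for illustration, not anything prescribed by the rules or used by any actual provider; real networks rely on standardized markings such as DSCP code points and far more sophisticated queueing:

    import heapq
    from itertools import count

    # Hypothetical priority ranks; lower numbers are dequeued first.
    PRIORITY = {"voip": 0, "gaming": 1, "streaming": 2, "web": 3, "bulk": 4}

    class PriorityScheduler:
        """Toy scheduler: latency-sensitive traffic classes jump the queue."""

        def __init__(self):
            self._heap = []
            self._arrival = count()  # tie-breaker keeps FIFO order within a class

        def enqueue(self, traffic_class, payload):
            # Unknown classes get the lowest (bulk) priority.
            rank = PRIORITY.get(traffic_class, max(PRIORITY.values()))
            heapq.heappush(self._heap, (rank, next(self._arrival), payload))

        def dequeue(self):
            if not self._heap:
                return None
            rank, order, payload = heapq.heappop(self._heap)
            return payload

    sched = PriorityScheduler()
    sched.enqueue("bulk", "file chunk")          # arrives first
    sched.enqueue("streaming", "video segment")
    sched.enqueue("voip", "voice frame")         # arrives last
    print(sched.dequeue())  # 'voice frame' is sent first despite arriving last

Every dequeue decision here “discriminates” among packets, and yet the whole point of the discrimination is a better experience for the user whose voice call would otherwise stutter behind someone else’s file download.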

When all is said and done, the core problem with the FCC’s Open Internet Report and Order comes down to a clash of the idealized view of the neutral Internet with the reality of an always-evolving, always-improving technology infrastructure.

Chairman Genachowski, himself a former venture capitalist, is clinging to the myth of the Internet as virtual frontier, an understandable but highly dangerous indulgence in nostalgia, a remembrance of Internets past.  He’s not alone.  The romance of the American west has persisted more than a hundred years since historian Frederick Jackson Turner famously declared the frontier closed.

As he said in introducing the Open Internet proceeding in September, 2009, shortly after taking office:

“The Internet’s creators didn’t want the network architecture — or any single entity — to pick winners and losers. Because it might pick the wrong ones. Instead, the Internet’s open architecture pushes decision-making and intelligence to the edge of the network — to end users, to the cloud, to businesses of every size and in every sector of the economy, to creators and speakers across the country and around the globe. In the words of Tim Berners-Lee, the Internet is a ‘blank canvas’ — allowing anyone to contribute and to innovate without permission.”

Many of us fortunate enough to have been there at the moment the Internet reached its tipping point and became an unstoppable force, a kind of network gravity, share this nostalgia.  It was a moment that changed the trajectory of computing, upended giants, and unleashed tremendous creativity.  For me, it utterly transformed my career, much as my first FORTRAN course as an undergraduate had unintentionally started it.

But the effort to translate nostalgia into federal law—assuming, but only for the moment, that the FCC is the appropriate agency to preserve an Internet that has long since passed, if indeed it was ever the way we old-timers remember it—has already fallen down more than its fair share of abandoned mine shafts.

The Exceptions that Expose the Rule

Even the original Notice of Proposed Rulemaking and draft order released for comment in October, 2009 included many (necessary) exceptions from strict adherence to the neutrality principle.

The proposed rules, most important, made all six neutrality rules (§§ 8.5-8.15) subject to an exception for “reasonable network management.”  Reasonable network management was defined to include all “reasonable practices” broadband Internet access providers undertook to, among other things, “reduce or mitigate the effects of congestion on the network or to address quality-of-service concerns.”  (§ 8.3)  And bowing to legal limits on neutrality, the rules did not restrict efforts by broadband access providers to “address unlawful conduct on the Internet,” including unlicensed sharing of copyrighted content. (¶ 139)

In explaining “reasonable network management,” (¶¶ 135-141), the FCC acknowledged that the technology by which a user accessed the Internet could play a significant role in determining when a provider could act “inconsistently” with the neutrality principle but still not violate the rules.  Access over coaxial cable follows a different architecture—with different constraints—than fiber, copper, satellite, or cellular access.  For purposes of “quality of service,” the agency acknowledged that it might be appropriate for an access provider to implement a “network management practice of prioritizing classes of latency-sensitive traffic,” such as VoIP, gaming, and streaming media traffic.  (¶137)

Since the FCC has up until now had little role to play in the regulation of the Internet, it’s not surprising that the agency began this process with a highly outdated view of how the Internet “worked.”  So the NPRM, here and in eighty other sections, sought comment on the current state of the Internet ecosystem, the technologies of broadband access, network management principles in place, and the nature of the broadband access market throughout the U.S.—the latter a subject the agency took up again in the National Broadband Plan.

Not surprisingly, the FCC heard plenty.  The final report lists over 450 sources of comments and replies to the NPRM, many of which addressed themselves to educating the FCC on the technologies it had undertaken to regulate.

As a result of this formal (and no doubt a great deal of informal) feedback, the final rules added numerous additional exceptions, authorizing a wide range of ways a provider of broadband Internet access could act “inconsistently” with the neutrality principle but still not be thought to have violated them.

The new exceptions include:

  • Exemption from many of the rules for all providers of mobile broadband Internet access, including the “no unreasonable discrimination” rule and some of the “no blocking” rule.  (§§ 8.5, 8.7)
  • Explicit exemption from the “no blocking” rule for app stores and other control mechanisms used by mobile broadband providers.  (¶ 102)
  • A change from a strict “nondiscrimination” rule for wireline providers to a rule prohibiting only “unreasonable discrimination.” (§ 8.7)  (See Part III for a discussion of the difference between those two formulations.)
  • A limited definition of “broadband Internet access service” that applies the rules only to providers of a “mass market retail service” providing “the capability to transmit data to and receive data from all or substantially all Internet endpoints.”  (§ 8.11(a))  That change leaves out a range of relatively new Internet devices and services—including the Amazon Kindle, game consoles, cars, TVs and refrigerators—that offer some form of web access incidental to their main purpose in connecting to the network.  (See ¶ 47)
  • A broader definition of “reasonable network management,” that includes any practice that is “appropriate and tailored to achieving a legitimate network management purpose.”  (§ 8.11(d) and see ¶ 82)
  • Exemption for virtual private networks, which use much of the same infrastructure as the public Internet. (¶ 47)
  • Exemption for Content Delivery Networks and co-located servers that put particular content in closer proximity to important network nodes and therefore speed its transmission to requesting users. (see ¶ 47 and ¶ 76 note 235)
  • Exemption for multichannel video programming services (e.g., U-verse) that use TCP/IP protocols and existing Internet infrastructure.  (¶ 47)
  • Exemption for Internet backbone services.  (¶ 47)
  • Exemption for hosting or data storage services. (¶ 47)
  • Exemptions for “coffee shops, bookstores, airlines and other entities when they acquire Internet service from a broadband provider to enable their patrons to access the Internet from their establishments.” (¶ 52)
  • Exemption from the discrimination rule for “existing arrangements for network interconnection, including existing peering arrangements.”  (¶ 67 n. 209)
  • Exemption (for now) for “specialized services,” including multichannel video programming (see above) or facilities-based VoIP, that “share capacity with broadband Internet access services over providers’ last-mile facilities.”  (¶¶ 112-114)
  • A hedge on whether “paid priority” of some content, either of the access provider or a third party, would necessarily violate the “unreasonable discrimination” rule (¶ 76), and an explicit rejection of the argument that CDNs constitute illegal “pay for priority” though they have the same effect on consumer experience as prohibited prioritization schemes.  (¶ 77)
  • Recognition that end-users may elect to acquire Internet access that limits their choice of content, including services that support parental controls or which “allow end users to choose a service that provides access to the Internet but not to pornographic websites.”  (¶ 89).  Further, “[b]roadband providers are also free under this Order to offer a wide range of ‘edited’ services,” including a “service limited to ‘family friendly’ materials.”  (¶ 143, cf. ¶ 141)
  • Recognition that existing federal law allows all Internet Service Providers to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”   (¶ 89 n. 279)

Finding the Forest Amid the Exemptions

Of course these exceptions, particularly the measured approach to mobile broadband access and the provisional reprieve for specialized services, generated howls of indignation from advocacy groups hoping for pure neutrality, and led many of the Chairman’s initial supporters to abandon him over the course of the year the NPRM was publicly and privately debated.

My concern is quite different.  I think each of these exceptions makes good sense, and will keep the new rules, at least in the short-term, from causing life-threatening damage to the Internet ecosystem.

Rather, what the laundry list of exceptions demonstrates is that the majority just isn’t seeing the forest for the trees.  What the exceptions have in common is that each represents a change to the Internet’s architecture and service models that has emerged in the last decade and a half.  They are all new services, technologies, or service providers that, in these and other ways, violate the neutrality principle.

But these innovations have been developed for beneficial, not evil purposes.  The network is better in every sense imaginable, and will continue to improve in speed, efficiency, and usability so long as future innovations don’t run afoul of the rules and their enforcement.  The Internet is not “open” in the way it may have been in 1995 (it was never as open as the idealists imagine).  But in order for the Internet we have today—faster, cheaper, better—to exist, each of these changes had to be made.

The genius of a virtual infrastructure is that it can absorb redesign without any interruption in service.  One unfortunate side-effect of that ease of transformation is that users don’t see the construction cones and highway workers.  Consumers—and the FCC—don’t realize that we’re now traveling on a multi-lane highway rather than the old dirt road.  The technology is utterly changed, and the rules of the road have changed with it.  For better or worse, but largely for the better.

The final rules, with all their exceptions, suggest a majority clinging to the idealized past, and a stubborn refusal in the end to admit that the Internet has changed and continues to change—that it needs to change.

The exceptions for the “inconsistent” behavior of CDNs, specialized services, peering arrangements, e-readers and game consoles, and app stores have no logical rationale, other than that the FCC has now learned that they are part of the current status quo.  But they are being exempted because they are in place, and they work.

For example, paying a CDN to replicate your content and co-locate servers at key network access points is surely “paying for priority.”  It puts a start-up offering similar content but without the funds for similar services at a competitive disadvantage.  The cached content will arrive faster when requested by a consumer.  But for consumers, that feature is a good thing—an improvement—even though it is not “neutral.”
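A toy model (again in Python, with latency figures invented purely for illustration, not measurements of any real network) shows why the cached copy wins, and why consumers experience the arrangement as an improvement rather than a harm:

    # Toy model of content delivery with and without an edge cache.
    ORIGIN_RTT_MS = 120   # fetch from a distant origin server
    EDGE_RTT_MS = 15      # fetch from a CDN cache co-located near the user

    cache = set()

    def fetch(url):
        """Return simulated delivery latency in ms; populate the cache on a miss."""
        if url in cache:
            return EDGE_RTT_MS
        cache.add(url)  # the CDN replicates the object after the first request
        return ORIGIN_RTT_MS

    latencies = [fetch("/video/episode-1") for _ in range(5)]
    print(latencies)                        # [120, 15, 15, 15, 15]
    print(sum(latencies) / len(latencies))  # average falls as the cache warms

The start-up that can’t afford the replication service still gets its packets delivered; they simply make the longer trip every time.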

Likewise, the mobile Internet is given special treatment because it is “evolving rapidly.”  But the fixed Internet is evolving rapidly as well, as many of these exemptions implicitly recognize.

The majority is fixated on maintaining a neutral Internet even though it now understands that neutrality is a virtue more honored in the breach.  The final report uses the word “traditionally” 25 times, the word “historically” 9 times, and the word “typically” 21 times.  These are the only justifications for the exceptions, and they undermine the purpose of the rules that remain.  There is no neutral Internet to preserve.  There’s only one that works.

The reality is that we’re moving away from websites to the mobile, app-based economy, specialized services, and high-bandwidth applications such as video, which shouldn’t all be treated the same.  A “level playing field” doesn’t mean everyone gets a trophy.

The good news is that the final rules grandfather in many existing technologies that violate the neutrality principle.  That’s essential, even if each of the exceptions is granted in isolation and begrudgingly at that.  But the open Internet regulations could have the effect of freezing that status quo, for no good (and lots of bad) reasons.

Nostalgia can be fun.  I enjoy sitting around with my fellow veterans of the pre-bubble dot com boom talking about the good old days, toasting to our irrational exuberance. But translating that wistfulness into federal law, even as here with rules pockmarked by the blemishes of a reality that looks far different than our idealized view of the past, is a dangerous way to celebrate it.

Next:  Not to worry.  The FCC has no authority, either.