Category Archives: Copyright

“Fake Neutrality” or Government Takeover?: Reading the FCC’s Net Neutrality Report (Part III)

In Part I of this analysis of the FCC’s Report and Order on “Preserving the Open Internet,” I reviewed the Commission’s justification for regulating broadband providers.   In Part II, I looked at the likely costs of the order, in particular the hidden costs of enforcement.  In this part, I compare the text of the final rules with earlier versions.  Next, I’ll look at some of the exceptions and caveats to the rules—and what they say about the true purpose of the regulations.

In the end, the FCC voted to approve three new rules that apply to broadband Internet providers.  One (§8.3) requires broadband access providers to disclose their network management practices to consumers.  The second (§8.5) prohibits blocking of content, applications, services, and non-harmful devices.  The third (§8.7) forbids fixed broadband providers (cable and telephone, e.g.) from “unreasonable” discrimination in transmitting lawful network traffic to a consumer.

There has of course been a great deal of commentary and criticism of the final rules, much of it reaching fever pitch before the text was even made public.  At one extreme, advocates for stronger rules have rejected the new rules as meaningless, as “fake net neutrality,” “non neutrality,” or the latest evidence that the FCC has been captured by the industries it regulates.  On the other end, critics decry the new rules as a government takeover of the Internet, censorship, and a dangerous and unnecessary interference with a healthy digital economy.  (I agree with that last one.)

One thing that has not been seriously discussed, however, is just how little the final text differs from the rules originally proposed by the FCC in October, 2009.  Indeed, many of those critical of the weakness of the final rules seem to forget their enthusiasm for the initial draft, which in key respects has not changed at all in the intervening year of comments, conferences, hearings, and litigation.

The differences—significant and trivial—that have been made can largely be traced to comments the FCC received on the original draft, as well as interim proposals made by industry and Congress, particularly the framework offered by Verizon and Google in August and a bill circulated by Rep. Henry Waxman just before the mid-term elections.

1.      Transparency

Compare, for example, the final text of the transparency rule with the version first proposed by the FCC.

Subject to reasonable network management, a provider of broadband Internet access service must disclose such information as is reasonably required for users and content, application and service providers to enjoy the protections specified in this part. (Proposed)

A person engaged in the provision of broadband Internet access service shall publicly disclose accurate information regarding the network management practices, performance and commercial terms of its broadband Internet access service sufficient for consumers to make informed choices regarding use of such services and for content, application, service and device providers to develop, market and maintain Internet offerings. (Final)

The final rule is much stronger, and makes clearer what it is that must be disclosed.  It is also not subject to the limits of reasonable network management.  Rather than the vague requirement of the draft for disclosures sufficient to “enjoy the protections” of the open Internet rules, the final rule requires disclosures sufficient for consumers to make “informed choices” about the services they pay for, a standard more easily enforced.

By comparison, the final rule comes close to the version that appeared in draft legislation circulated but never introduced by Rep. Henry Waxman in October of 2010. It likewise reflects the key concepts in the Verizon-Google Legislative Framework Proposal from earlier in the year.

As the Report makes clear (¶¶ 53-61), the transparency rule has teeth.  Though the agency declines for now to make specific decisions about the contents of the disclosure and how it must be posted, the Report lays out a non-exhaustive list of nine major categories of disclosure, including network practices, performance characteristics, and commercial terms, that must be included.  It’s hard to imagine a complying disclosure that will not run to several pages of very small text.

That generosity, of course, may be the rule’s undoing.  As anyone who has ever thrown away a required disclosure from a service provider (mortgage, bank, drug, electronic device, financial statement, privacy, etc.) knows full well, information “sufficient” to make an informed choice is far more information than any non-expert consumer could possibly absorb and evaluate, even if they wanted to.   The more information consumers are given, the less likely they’ll pay attention to any of it, including what may be important.

The FCC recognizes that risk, but believes it has an answer.  “A key purpose of the transparency rule,” the Commission notes (¶ 60), “is to enable third-party experts such as independent engineers and consumer watchdogs to monitor and evaluate network management practices, in order to surface concerns regarding potential open Internet violations.”

Perhaps the agency has in mind here organizations like BITAG, which has been established by a wide coalition of participants in the Internet ecosystem to develop “consensus on broadband network management practices or other related technical issues.”  As for consumer watchdogs, perhaps the agency imagines that some of the public interest groups who have most strenuously rallied for the rules will become responsible stewards of their implementation, trading the acid pens of political rhetoric for responsible analysis and advocacy to their members and other consumers.

We’ll see.  I wish I shared the Commission’s confidence that, “for a number of reasons” (none cited), “the costs of the disclosure rule we adopt today are outweighed by the benefits of empowering end users and edge providers to make informed choices….”  (¶ 59).  But I don’t. Onward.

2.       Blocking

The final version of the blocking rule (§8.5) consolidated the separate content, applications and services, and devices rules of the original draft.  The final rule states:

A person engaged in the provision of fixed broadband Internet access services, insofar as such person is so engaged, shall not block lawful content, applications, services or non-harmful devices, subject to reasonable network management.

A more limited rule applies to mobile broadband providers, who

[S]hall not block consumers from accessing lawful websites, subject to reasonable network management, nor shall such person block applications that compete with the providers’ voice or video telephony services, subject to reasonable network management.

Much of the anguish over the final rules that has been published so far relates to a few of the limitations built into the blocking rule.  First, copyright-reform activists object to the word “lawful” appearing in the rule.  “Lawful” content, applications, and services do not include activities that constitute copyright and trademark infringement.  Therefore, the rule allows broadband providers to use whatever mechanisms they want (or may be required to) to reduce or eliminate traffic that involves illegal file-sharing, spam, viruses and other malware, and the like.

A provider who blocks access to a site selling unlicensed products, in other words, is not violating the rules.  And as the agency finds it is “generally preferable to neither require nor encourage broadband providers to examine Internet traffic in order to discern which traffic is subject to the rules” (¶ 48), there will be a considerable margin of error given to providers who block sites, services, or applications which may include some legal components.

On this view, though the FCC otherwise contradicts it—see footnote 245 and elsewhere—a complete ban on the BitTorrent protocol, for better or worse, might not be a violation of the blocking rule.  Academic studies have shown that over 99% of BitTorrent traffic constitutes unlicensed file sharing of protected content.  Other than inspecting individual torrents, which the agency disfavors, how else can an access provider determine what tiny minority of BitTorrent traffic is in fact lawful?

A second concern is the repeated caveat for “reasonable network management,” which gives access providers leeway to balance traffic during peak times, limit users whose activity may be harming other users, and other “legitimate network management” purposes.

Finally, disappointed advocates object to the special treatment for mobile broadband, which may, for example, block applications, services or devices without violating the rule.  There is an exception to the exception for applications, such as VoIP and web video, that compete with the provider’s own offerings, but that special treatment doesn’t keep mobile providers from using “app stores” to exclude services they don’t approve.  (See ¶ 102)

Of course even the original draft of the rules included the limitation for “reasonable network management,” and refused to apply any of the rules to unlawful activities.  The definition of “reasonable network management” in the original draft is different from, but functionally equivalent to, the final version.

The carve-out for mobile broadband, however, is indeed a departure from the original rules.  Though the Oct. 2009 Notice of Proposed Rulemaking expressed concern about applying the same rule to fixed and mobile broadband (see ¶¶ 13, 154-174), the draft blocking rule did not distinguish between fixed and mobile Internet access.  The FCC did note, however, that different technologies “may require differences in how, to what extent, and when the principles apply.”  The agency sought comment on these differences (and asked for further comment in a later Notice of Inquiry).  Needless to say, they heard plenty.

Wireless broadband is, of course, a newer technology, and one still very much in development.  Spectrum is limited, and capacity cannot easily be added.  Those are not so much market failures as they are regulatory failures.  The FCC is itself responsible for managing the limited radio spectrum, and has struggled by its own admission to allocate spectrum for its most efficient and productive uses—indeed, even to develop a complete inventory of who has which frequencies of licensed spectrum today.

Adding capacity faces yet another regulatory obstacle.  Though mobile users rail against their providers for inadequate or unreliable coverage, no one, it seems, wants to have cellular towers and other equipment near where they live.  Local regulators, who must approve new infrastructure investments, take such concerns very much to heart.  (There is also rampant corruption and waste in the application, franchising, and oversight processes at the state and local levels, a not-very-secret secret.)

The FCC, it seems, has taken these concerns into account in the final rule.  Its original open Internet policy statements—from which the rules derive—applied only to fixed broadband access, and the October, 2009 draft’s inclusion of mobile broadband came as a surprise to many.

The first indication that the agency was considering a return to the original open Internet policy came with the Verizon-Google proposal, where the former net neutrality adversaries jointly released a legislative framework (that is, something they hoped Congress, not the FCC, would take seriously) that gave different treatment to mobile.  As the V-G proposal noted, “Because of the unique technical and operational characteristics of wireless networks, and the competitive and still-developing nature of wireless broadband services, only the transparency principle would apply to wireless at this time.”

The Waxman proposal didn’t go as far as V-G, instead adding a provision that closely tracks the final rule.  Under the Waxman bill, mobile providers would have been prohibited from blocking “lawful Internet websites”, and applications “that compete with the providers’ voice or video communications services.”

So the trajectory of the specialized treatment for mobile broadband is at least clear and, for those following the drama, entirely predictable.  Yet the strongest objections to the final rule and the loudest cries of betrayal from neutrality advocates came from the decision to burden mobile providers less than their fixed counterparts.  (Many providers offer both, of course, so will be subject to different rules for different parts of their service.)

At the very least, the advocates should have seen it coming.  Many did.  A number of “advocacy” groups demonized Google for its cooperation with Verizon, and refused to support Waxman’s bill.  (It should also be noted that none of the groups objecting to the final rules or any interim version ever actually proposed their own version—that is, what they actually wanted as opposed to what they didn’t want.)

3.      Unreasonable discrimination

The final rule, applicable only to fixed broadband providers, demands that a provider not “unreasonably discriminate in transmitting lawful network traffic over a consumer’s broadband Internet access service.”  (§ 8.7, and see ¶¶ 68-79 of the Report).

Though subtle, the differences in language between the NPRM and the final rule are significant, as the FCC acknowledges.  The NPRM draft rule noted plainly that “a provider of broadband Internet access service must treat lawful content, applications, and services in a nondiscriminatory manner.”

The difference here is between “nondiscrimination,” which prohibits all forms of differential network treatment, and “unreasonable discrimination,” which allows discrimination so long as it is reasonable.

The migration from a strict nondiscrimination rule (subject, however, to reasonable network management) to a rule against “unreasonable” discrimination can be seen in the interim documents.  The Verizon-Google proposal, which called for a “Non-Discrimination Requirement,” nonetheless worded the requirement to ban only “undue discrimination against lawful Internet content, application, or service in a manner that causes meaningful harm to competition or to users.” (emphasis added)

Rep. Waxman’s draft bill, likewise, would have applied a somewhat different standard for wireline providers, who “shall not unjustly or unreasonably discriminate in transmitting lawful traffic over a consumer’s wireline broadband Internet access service,” also subject to reasonable network management.

Over time, the FCC recognized the error of its original draft and now agrees “with the diverse group of commenters who argue that any nondiscrimination rule should prohibit only unreasonable discrimination.” (¶ 77)

As between the suggested limiting terms “undue,” “unjust” and “unreasonable,” the FCC chose the last for the final rule.  Though many have complained that “unreasonable” is a nebulous, subjective term, it should be noted that of the three it is the only one with understood (if not entirely clear) legal meaning, particularly in the context of the FCC’s long history of rulemaking and adjudication.

The earliest railroad regulations, for example, which also laid the groundwork for the FCC’s eventual creation and its authority over communications industries, required reasonable rates of carriage, and empowered the Interstate Commerce Commission to intervene and eventually set the rates itself, much as the FCC later did with telephony.

One lesson of the railroad and telephone histories, however, is the danger of turning over to regulators decisions about what behaviors are reasonable. (Briefly, regulatory capture often ends up leaving the industry unable to respond to new forms of competition from disruptive technologies, with disastrous consequences.)

The V-G proposal gets to the heart of the problem in the text I italicized.  Despite the negative connotations of the word in common use, “discrimination” isn’t inherently bad. As the Report makes clear, in managing Internet access and network traffic, there are many forms of discrimination—which means, after all, affording different treatment to different things—that are entirely beneficial to overall network behavior and to the consumer’s experience with the Internet.

The draft rule, as the FCC now admits (see ¶ 77 of the Report), was dangerously rigid.  If any behavior should be regulated, it is the kind of discrimination whose principal purpose is to harm competition or users—though that kind of behavior is already illegal under various antitrust laws.

For one thing, users may want some kinds of traffic – e.g., voice and video – to receive higher priority than text and graphics, which do not suffer from latency problems.  Companies operating Virtual Private Networks for their employees may likewise want to limit Web access to selected sites and activities for workers while on the job.

A strict nondiscrimination rule would have also discouraged or perhaps banned tiered pricing, harming consumers who do not need the fastest speeds and the highest volume of downloads to accomplish what they want to do online.  (Without tiered pricing, such consumers effectively subsidize power users who, not surprisingly, are the most vociferous objectors to tiered pricing.)

Discrimination may also be necessary to manage congestion during peak usage periods or when failing nodes put pressure on the backbone.  Discrimination against spam, viruses and other malware, much of which is not “lawful,” is also permitted and indeed encouraged.  (See ¶¶ 90-92.)

By comparison, the Report notes (¶ 75) three types of provider discrimination that are of particular concern.  These are:  discrimination that harms competitors (e.g., VoIP providers of over-the-top telephone service, such as Skype or Vonage, that compete with the provider’s own telephone service), “inhibiting” end users from accessing content, services, and applications of their choice (but see the no-blocking rule, above, which already covers this), and discrimination that “impairs free expression,” including slowing or blocking access to a blog whose message the broadband provider does not approve.

On that last point, however, it’s important to note that Congress has already given broadband providers (and others) broad freedom to filter and otherwise curate content they do not approve of or which they believe their customers don’t want to see.  Under Section 230 of the Communications Decency Act,

“No provider or user of an interactive computer service shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

The goal of Section 230 was to immunize early Internet providers including CompuServe and Prodigy from efforts to exercise editorial control over message boards whose content was provided by customers themselves.  But it gives providers broad discretion in determining what kind of content they believe their customers don’t want to see.  So long as the filtering is undertaken in “good faith” (e.g., not with the intent of harming a competitor), there is no liability for the provider, who does not, for example, become a “publisher” for purposes of defamation law.

The FCC (¶ 89) acknowledges the limit that Section 230 puts on the discrimination rule.

On the harm to competitors prong, the FCC waffles (see ¶ 76) on whether “pay for priority”—the bugaboo that launched the neutrality offensive in the first place—actually constitutes a violation of the rules.  While a broadband provider’s offering to prioritize the traffic of a particular source for a premium fee “would raise significant cause for concern,” the agency acknowledges that such behavior has occurred and thrived for years in the form of third party Content Delivery Networks.  (See footnote 236)  CDNs are allowed.   (More on CDNs in the next post.)

So in the end the discrimination rule doesn’t appear to add much to the blocking rule or existing antitrust laws.  Discrimination against competing over-the-top providers would violate antitrust.  Blocking or slowing access to disfavored content is already subject to the blocking rule.  And interfering with “free expression” rights of users is already significantly allowed by Section 230.

What’s left?   “The rule rests on the general proposition,” the agency concludes (¶ 78), “that broadband providers should not pick winners and losers on the Internet,” even when doing so is independent of competitive interests.  What exactly this means—and how “reasonable” discrimination will be judged in the course of enforcing the rules—remains to be seen.

Next:  The exceptions and what they say about the real purpose of the rules

Domain Name Seizures and the Limits of Civil Forfeiture

I was quoted this morning in Sara Jerome’s story for The Hill on the weekend seizures of domain names the government believes are selling black market, counterfeit, or copyright infringing goods.

The seizures take place in the context of an on-going investigation where prosecutors make purchases from the sites and then determine that the goods violate trademarks or copyrights or both.

Several reports, including from CNET, The Washington Post and Techdirt, wonder how it is the government can seize a domain name without a trial and, indeed, without even giving notice to the registered owners.

The short answer is the federal civil forfeiture law, which has been the subject of increasing criticism unrelated to Internet issues.  (See http://law.jrank.org/pages/1231/Forfeiture-Constitutional-challenges.html for a good synopsis of recent challenges, most of which fail.)

The purpose of forfeiture laws is to help prosecutors fit the punishment to the crime, especially when restitution to victims or recovery of the cost of prosecution is otherwise unlikely to have a deterrent effect, largely because the criminal has no assets to attach.  In the war on drugs, for example, prosecutors can now seize pretty much any property used in the commission of the crime, including a seller’s vehicle or boat.  (See U.S. v. 1990 Toyota 4 Runner for an example and explanation of the limits of federal forfeiture law.)

Forfeiture laws have been increasingly used to fund large-scale enforcement operations, and many local and federal police now develop budgets for these activities based on assumptions about the value of seized property.  This has led to criticism that the police are increasingly only enforcing the law when doing so is “profitable.”  But police point out that in an age of regular budget cuts, forfeiture laws are all they have in the way of leverage.

Sometimes the forfeiture proceedings happen after the trial, but as with the domain names, prosecutors also have the option to seize property before any indictment and well before any trial or conviction.  Like a search warrant, a warrant to seize property requires only that a judge find probable cause that the items to be seized fit the requirements of forfeiture—in general, that they were used in the commission of a crime.

The important difference between a seizure and a finding of guilt—the difference that allows the government to operate with such a free hand—is that the seizure is only temporary.  A forfeiture, as here, isn’t permanent until there is a final conviction.

The pre-trial seizure is premised on the idea that during the investigation and trial, prosecutors need to secure the items so that the defendant doesn’t destroy or hide them.

If the defendant is acquitted, the seized items are returned.  Or, if the items turn out not to be subject to forfeiture (e.g., they were not used in the commission of any crimes the defendant is ultimately convicted for), they are again returned.  Even before trial, owners can sue to quash the seizure order on the grounds that there was insufficient (that is, less than probable) cause to seize them in the first place.

All of that process takes time and money, however, and many legal scholars believe that in practice forfeiture reverses the presumption of innocence, forcing the property owner to prove the property is “innocent” in some way.

In current (and expanding) usage, forfeiture may also work to short-circuit due process of the property owner.  (Or owners—indeed, seized property may be jointly owned, and the victim of the crime may be one of the owners, as when the family car is seized when the husband uses it to liaise with a prostitute.)

That’s clearly a concern with the seizure of domain names.  This “property” is essential for the enterprise being investigated to do business of any kind.  So seizing the domain names before indictment and trial effectively shuts down the enterprise indefinitely. (Reports are that most if not all of the enterprises involved in this weekend’s raid, however, have returned under new domain names.)

If prosecutors drag their heels on prosecution, the defendant gets “punished” anyway.  So even if the defendant is never charged or is ultimately acquitted, there’s nothing in the forfeiture statute that requires the government to make them whole for the losses suffered during the period when their property was held by the prosecution.  The loss of the use of a car or boat, for example, may require the defendant to rent another while waiting for the wheels of justice to turn.

For a domain name, even a short seizure effectively erases any value the asset has.  Even if ultimately returned, it’s now worthless.

Clearly the prosecutors here understand that a pre-trial seizure is effectively a conviction.  Consider the following quote from Immigration and Customs Enforcement Director John Morton, who said at a press conference today, “Counterfeiters are prowling in the back alleys of the Internet, masquerading, duping and stealing.”  Or consider the wording of the announcement placed on seized domain names (see http://news.cnet.com/8301-1023_3-20023918-93.html), implying at the least that the sites were guilty of illegal acts.

There’s no requirement for the government to explain the seizures are only temporary measures designed to safeguard property that may be evidence of crime or may be an asset used to commit it.  Nor do they have to acknowledge that none of the owners of the domain names seized has been charged or convicted of any crime yet.  But the farther prosecutors push the forfeiture statute, the bigger the risk that courts or Congress will someday step in to pull them back.

Updates to the media page

The fall has been filled with important developments in the technology world, and I continue to be a regular source for journalists as well as publishing frequent editorials and analyses of my own.  I’ve just posted another ten items to the Media Page of my website, including several articles I’ve written for CNET News.com, an election-day op-ed in Roll Call, legal analysis for The Wall Street Journal and a long review of “The Laws of Disruption” in the International Journal of Communications.  The accidents continue to pile up at the dangerous intersection of innovation and the law, the main theme of The Laws of Disruption.

Some highlights:

The U.S. Supreme Court heard arguments in EMA v. Schwarzenegger, which challenges California’s ban on violent video games on First Amendment grounds.  My article for CNET explained why the timing of the case is significant, with implications for all new media enterprises.

The European Commission is preparing new legislation to guarantee its citizens a “right to be forgotten.”  On CNET, I explain why that well-intentioned initiative could have disastrous consequences for the digital economy.

My election-day op-ed for Roll Call, the leading newspaper of Capitol Hill, urged Congress to stop the FCC’s dangerous plans to “reclassify” broadband Internet access and treat it like a 1930s-style telephone business.

My detailed analysis of Rep. Henry Waxman’s proposed net neutrality bill, a last-minute effort to resolve the long-running conflict before the election, was featured on The Wall Street Journal’s “All Things Digital.”

In the important Vernor decision, the Ninth Circuit Court of Appeals ruled that licensing agreements that deny users a right to resell copies of software are enforceable.  Though many viewed this decision as harmful to consumers, I explain why developments in the software industry have already relegated license agreements to the margins, in a controversial article for CNET News.com.

NextGenWeb, sponsored by the U.S. Telecom Association, interviewed me during one of many recent visits to Washington.

As the new Congress prepares to convene in January, watch for more important developments.

The end of software ownership

My article for CNET this morning, “The end of software ownership…and why to smile,” looks at the important decision a few weeks ago in the Ninth Circuit copyright case, Vernor v. Autodesk.  (See also excellent blog posts on Eric Goldman’s blog. Unfortunately these posts didn’t run until after I’d finished the CNET piece.)

The CNET article took the provocative position that Vernor signals the eventual (perhaps imminent) end to the brief history of users “owning” “copies” of software that they “buy,” replacing the regime of ownership with one of rental.  And, perhaps more controversially still, I try to make the case that such a dramatic change is in fact not, as most commentators on the decision have concluded, a terrible loss for consumers but a liberating victory.

I’ll let the CNET article speak for itself.  Here I want to make a somewhat different point about the case, which is that the “ownership” regime was always an aberration, the result of an unfortunate need to rely on media to distribute code (until the Internet) coupled with a very bad decision back in 1976 to extend copyright protection to software in the first place.

The Vernor Decision, Briefly

First, a little background.

The Vernor decision, in brief, took a big step in an on-going move by the federal courts to allow licensing agreements to trump user rights reserved by the Copyright Act.  In the Vernor case, the most important of those rights was at issue:  the right to resell used copies.

Vernor, an eBay seller of general merchandise, had purchased four used copies of an older version of AutoCAD from a small architectural firm at an “office sale.”

The firm had agreed in the license agreement not to resell the software, and had reaffirmed that agreement when it upgraded its copies to a new version of the application.  Still, the firm sold the media of the old versions to Vernor, who in turn put them up for auction on eBay.

Autodesk tried repeatedly to cancel the auctions, until, when Vernor put the fourth copy up for sale, eBay temporarily suspended his account.  Vernor sued Autodesk, asking the court for a declaratory judgment (essentially a preemptive lawsuit) that as the lawful owner of a copy of AutoCAD, he had the right to resell it.

A lower court agreed with Vernor, but the Ninth Circuit reversed, and held that the so-called “First Sale Doctrine,” codified in the Copyright Act, did not apply because the architectural firm never bought a “copy” of the application.  Instead, the firm had only paid to use the software under a license from Autodesk, a license the firm had clearly violated.  Since the firm never owned the software, Vernor acquired no rights under copyright when he purchased the disks.

The Long Arm of Vernor?

This is an important decision, since all commercial software (and even open source software and freeware) is made available by the producer only on condition that the user accept a license agreement.

These days, nearly all licenses purport to restrict the user’s ability to resell the software without permission from the producer.  (In the case of open source software under the GPL, users can redistribute the software so long as they preserve the license’s other terms, including the requirement that modifications to the software also be distributed under the GPL.)  Thus, if the Vernor decision stands, used markets for software will quickly disappear.

Moreover, as the article points out, there’s no reason to think the decision is restricted just to software.  The three-judge panel suggested that any product—or at least any information-based product—that comes with a license agreement is in fact licensed rather than sold.  Thus, books, movies, music and video games distributed electronically in software-like formats readable by computers and other devices are probably all within the reach of the decision.

Who knows?  Perhaps Vernor could be applied to physical products—books, toasters, cars—that are conveyed via license.  Maybe before long consumers won’t own anything anymore; they’ll just get to use things, like seats at a movie theater (the classic example of a license), subject to limits imposed—and even changed at will—by the licensor.  We’ll become a nation of renters, owning nothing.

Well, not so fast.  First of all, let’s note some institutional limits of the decision.  The Ninth Circuit’s ruling applies only within federal courts of the western states (including California and Washington, where this case originated).  Other circuits facing similar questions of interpretation may reach different or even opposite decisions.

Vernor may also appeal the decision to the full Ninth Circuit or even the U.S. Supreme Court, though in both cases the decision to reconsider would be at the discretion of the respective court.  (My strong intuition is that the Supreme Court would not take an appeal on this case.)

Also, as Eric Goldman notes, the Ninth Circuit already has two other First Sale Doctrine cases in the pipeline.  Other panels of the court may take a different or more limited view.

For example, the Vernor case deals with a license that was granted by a business (Autodesk) to another business (the architectural firm).  But courts are often hesitant to enforce onerous or especially one-sided terms of a contract (a license is a kind of contract) between a business and an individual consumer.  Consumers, more than businesses, are unlikely to be able to understand the terms of an agreement, let alone have any realistic expectation of negotiating over terms they don’t like.

Courts, including the Ninth Circuit, may decline to extend the ruling to other forms of electronic content, let alone to physical goods.

The Joy of Renting

So for now let’s take the decision on its face:  Software licensing agreements that say the user is only licensing the use of software rather than purchasing a copy are enforceable.  Such agreements require only a few “magic words” (to quote the Electronic Frontier Foundation’s derisive view of the opinion) to transform software buyers into software renters.  And it’s a safe bet that any existing End User Licensing Agreements (EULAs) that don’t already recite those magic words will be quickly revised to do so.

(Besides EFF, see scathing critiques of the Vernor decision at Techdirt and Wired.)

So.  You don’t own those copies of software that you thought you purchased.  You just rent them from the vendor, on terms offered on a take-it-or-leave-it basis and subject to revision at will.  All those disks sitting in all those cardboard albums on a shelf in your office are really the property of Microsoft, Intuit, Activision, and Adobe.  You don’t have to return them when the license expires, but you can’t transfer ownership of them to someone else because you don’t own them in the first place.

Well, so what?  Most of those boxes are utterly useless within a very short period of time, which is why there never has been an especially robust market for used software.  What real value is there to a copy of Windows 98, or last year’s TurboTax, or Photoshop Version 1.0?

Why does software get old so quickly, and why is old software worthless?  To answer those questions, I refer in the article to an important 2009 essay by Kevin Kelly.  Kelly, for one, thinks the prospect of renting rather than owning information content is not only wonderful but inevitable, and not because courts are being tricked into saying so.  (Kelly’s article says nothing about the legal aspects of ownership and renting.)

Renting is better for consumers, Kelly says, because ownership of information products introduces significant costs and absolutely no benefits to the consumer.  Once content is transformed into electronic formats, both the media (8-track) and the devices that play them (Betamax) grow quickly obsolete as technology improves under the neutral principle of Moore’s Law.  So if you own the media you have to store it, maintain it, catalog it and, pretty soon, replace it.  If you rent it, just as any tenant, those costs are borne by the landlord.

Consumers who own libraries of media find themselves regularly faced with the need to replace them with new media if they want to take advantage of the new features and functions of new media-interpreting devices.  You’re welcome to keep the 78’s that scratch and pop and hiss, but who really wants to?  Nostalgia only goes so far, and only for a unique subset of consumers.  Most of us like it when things get better, faster, smaller, and cheaper.

In the case of software, there’s the additional and rapid obsolescence of the code itself.  Operating systems have to be rewritten as the hardware improves and platforms proliferate.  Tax preparation software has to be replaced every year to keep up with the tax code.  Image manipulation software gets ever more sophisticated as display devices are radically improved.

Unlike a book or a piece of music, software is written only for the computer to “read” in the first place.  You can always read an old book, whether or not you prefer the convenience of a device such as a Kindle.  But you could never read the object code for AutoCAD even if you wanted to—the old version (which got old fast, and not just to encourage you to buy new versions) is just taking up space in your closet.

The Real Crime was Extending Copyright to Software in the First Place

In that sense, it never made any sense to “own” “copies” of software in the first place.  That was only the distribution model for a short time, necessitated by an unfortunate technical limit of computer architecture that has nearly disappeared.  CPUs require machine-readable code to be moved into RAM in order to be executed.

But core memory was expensive.  Code came loaded on cheap tape, which was copied to more expensive disks, which were then read into even more expensive memory.  In a perfect world with unlimited free memory, the computer would have come pre-loaded with everything.

That wouldn’t have solved the obsolescence problem, however.  The Internet solved that by eliminating the need for physical media copies altogether.  Nearly all the software on my computer was downloaded—if I got a disk, it was just to initiate the download and installation.  (The user manual, the other component of the software album, is only on the disk or online these days.)

As we move from physical copies to downloaded software, vendors can more easily and more quickly issue new versions, patches, upgrades, and added functionality (new levels of video games, for example).

And, as we move from physical copies to virtual copies residing in the cloud, it becomes increasingly less weird to think that the thing we paid for—the thing that’s sitting right there, in our house or office—isn’t really ours at all, even though we paid for it, bagged it, transported it, and unwrapped it just as we do all the other commodities that we do own.

That’s why the Vernor decision, in the end, isn’t really all that revolutionary.  It just acknowledges in law what has already happened in the market.  We don’t buy software.  We pay for a service—whether by the month, or by the user, or by looking at ads, or by the amount of processing or storage or whatever we do with the service—and regardless of whether the software that implements the service runs on our computer or someone else’s, or, for that matter, everyone else’s.

The crime here, if there is one, isn’t that the courts are taking away the First Sale Doctrine.  It’s not, in other words, that one piece of copyright law no longer applies to software.  The crime is that copyright—any part of it—ever applied to software in the first place.  That’s what led to the culture of software “packages” and “suites” and “owning copies” that was never a good fit, and which now has become more trouble than it’s worth.

Remember that before the 1976 revisions to the Copyright Act, it was pretty clear that software wasn’t protected by copyright.  Until then, vendors (there were very few, and, of course, no consumer market) protected their source code either by delivering only object code and/or by holding users to the terms of contracts based on the law of trade secrets.

That regime worked just fine.  But vendors got greedy, and took the opportunity of the 1976 reforms to lobby for extension of copyright for source code.  Later, they got greedier, and chipped away at bans on applying patent law to software as well.

Not that copyright or patent protection really bought the vendors much.  Efforts to use it to protect the “look and feel” of user interfaces, as if they were novels that read too closely to an original work, fell flat.

Except when it came to stopping the wholesale reproduction and unauthorized sale of programs in other countries, copyright protection hasn’t been of much value to vendors.  And even then the real protection for software was and remains the rapid revision process driven by technological, rather than business or legal, change.

But the metaphor equating software with novels had unintended consequences.  With software protected by copyright, users—especially consumers—became accustomed to the language of copies and ownership and purchase, and to the protections of the law of sales, which applies to physical goods (books) and not to services (accounting).

So, if consumer advocates and legal scholars are enraged by the return to a purely contractual model for software use, in some sense the vendors have only themselves—or rather their predecessors—to blame.

But that doesn’t change the fact that software never fit the model of copyright, including the First Sale Doctrine.  Because source code kind of sort of looked like it was written in a language readable by a very few humans, the infamous CONTU Commission, making recommendations to Congress, made the leap to treating software as a work of authorship by (poor) analogy.

With the 1976 Copyright Act, the law treated software as if it were a novel, giving exclusive rights to its “authors” for a period of time that is absurd compared to the short economic lifespan of any piece of code written since the time of Charles Babbage and Ada Lovelace.

The farther software evolves away from a traditional “work of authorship” (visual programming, object-oriented architecture, interpreted languages such as HTML), the more unfortunate that decision looks in retrospect.  Source code is just a convenience, making it easier to write and maintain programs.  But it doesn’t do anything.  It must be compiled or interpreted before the hardware will make a peep or move a pixel.
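The point that source text is inert until compiled or interpreted can be illustrated in a few lines of Python (a hypothetical sketch of my own, for illustration only; the names are invented and nothing like this appears in the cases discussed):

```python
# Source code is just convenient text; as a string, it "does" nothing.
source_text = "result = 6 * 7"

# Compilation turns the human-readable text into bytecode, the form the
# machine (here, the Python interpreter) can actually act on.
code_obj = compile(source_text, "<example>", "exec")

# Only execution produces any effect at all -- until this step, neither
# the string nor the code object has made "a peep or moved a pixel."
namespace = {}
exec(code_obj, namespace)
print(namespace["result"])  # prints 42
```

The same division holds for a compiled language like C: the `.c` file is a writing humans can read, but the hardware responds only to the object code the compiler emits from it.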

Author John Hersey, one of the CONTU commissioners, got it just right.  In his dissent from the recommendation to extend copyright to software, Hersey wrote that software “utters work.  Work is its only utterance and its only purpose.”

Work doesn’t need the incentives and protections we have afforded to novels and songs.  And consumers can no more resell work than they can take home their seat from the movie theater after the show.

Copyright Office Weighs in on Awkward Questions of Software Law

I dashed off a piece for CNET today on the Copyright Office’s cell phone “jailbreaking” rulemaking earlier this week.  Though there has already been extensive coverage (including solid pieces in The Washington Post, a New York Times editorial, CNET, and Techdirt), there were a few interesting aspects to the decision I thought were worth highlighting.

Most notably, I was interested that no one had discussed the possibility and process by which Apple or other service providers could appeal the rulemaking.  Ordinarily, parties who object to rules promulgated by administrative agencies can file suit in federal district court under the Administrative Procedure Act.  Such suits are difficult to win, as courts give deference to administrative determinations and review them only for errors of law.  But a win for the agency is by no means guaranteed.

The Appeals Process

What I found in interviewing several leading high-tech law scholars and practitioners is that no one was really clear how, or even whether, that process applied to the Copyright Office.  In the twelve years that the Register of Copyrights has been reviewing requests for exemptions, there are no reported cases of efforts to challenge those rules and have them overturned.

With the help of Fred von Lohmann, I was able to obtain copies of briefs in a 2006 lawsuit filed by TracFone Wireless that challenged an exemption (modified and extended in Monday’s rulemaking) allowing cell phone users to unlock their phones from an authorized network in hopes of moving to a different network.  TracFone sued the Register in a Florida federal district court, claiming that both the process and substance of the exemption violated the APA and TracFone’s due process rights under the Fifth Amendment.

But the Justice Department, in defending the Copyright Office, made some interesting arguments.  They claimed, for example, that until TracFone suffered a particular injury as a result of the rulemaking, the company had no standing to sue.  Moreover, the government argued that the Copyright Office is not subject to the APA at all, since it is an organ of Congress and not a regulatory agency.  The briefs hinted at the prospect that rulemakings from the Copyright Office are not subject to judicial review of any kind, even one subject to the highly limited standard of “arbitrary and capricious.”

There was, however, no published opinion in the TracFone case, and EFF’s Jennifer Granick told me yesterday she believes the company simply abandoned the suit.  No opinion means the judge never ruled on any of these arguments, and so there is still no precedent for how a challenge to a DMCA rulemaking would proceed and under what legal standards and jurisdictional requirements.

Should Apple decide to pursue an appeal (an Apple spokesperson “declined to comment” on whether the company was considering such an action, and read me the brief statement the company has given to all journalists this week), it would be plowing virgin fields in federal jurisdiction.  That, as we know, can often lead to surprising results—including, just as an example, a challenge to the Copyright Office’s institutional ability to perform rulemakings of any kind.

The Copyright Office Moves the Fair Use Needle…a Little

A few thoughts on the substance of the rulemaking, especially as it shines light on growing problems in applying copyright law in the digital age.

Since the passage of the 1998 revisions to the Copyright Act known as the Digital Millennium Copyright Act, the Register of Copyrights is required every three years to review requests to create specific classes of exemptions to some of the key provisions of the law, notably the parts that prohibit circumvention of security technologies such as DRM or other forms of copy protection.

The authors of the DMCA with some foresight recognized that the anti-circumvention provisions rode on the delicate and sharp edge where static law meets rapidly-evolving technology and new business innovation.  Congress wanted to make sure there was a process that ensured the anti-circumvention provisions did not lead to unintended consequences that hindered rather than encouraged technological innovation.  So the Copyright Office reviews requests for exemptions with that goal in mind.

In the rulemaking completed on Monday, of course, one important exemption approved by the Register was one proposed by the Electronic Frontier Foundation, which asked for an exemption for “jailbreaking” cell phones, especially iPhones.

Jailbreaking allows the customer to override security features of the iPhone’s firmware that limit which third-party applications can be added to the phone.  Apple strictly controls which third-party apps can be downloaded to the phone through the App Store, and has used that control to ban apps with, for example, political or sexual content.  Of course, the review process also ensures that the apps are technically compatible with the phone’s other software, don’t unduly harm performance, and aren’t duplicative of apps already approved.

Jailbreaking the phone allows the customer to add whatever apps they want, including those rejected by or simply never submitted to Apple in the first place, for whatever reason.

In approving the exemption, the Copyright Office noted that jailbreaking probably does involve copyright infringement.  The firmware must be altered as part of the process, and that alteration violates Apple’s legal monopoly on derivative or adapted works.  But the Register found that such alteration was de minimis and approved the exemption based on the concept of “fair use.”

Fair use, codified in Section 107 of the Copyright Act, holds that certain uses of a copyrighted work that would otherwise be reserved to the rights holder are not considered infringement.  These include uses that have positive social benefits but which the rights holder as a monopolist might be averse to permitting under any terms, such as quotations in a potentially-negative review.

EFF had argued initially that jailbreaking was not infringement at all, but the Register rejected that argument.  Fair use is a much weaker rationale, as it begins by acknowledging a violation, though one excused by law.  The law of fair use, as I note in the piece, has also been in considerable disarray since the 1980’s, when courts began to focus almost exclusively on whether the use (technically, fair use is an affirmative defense to a claim of infringement) harmed the potential commercial prospects for the work.

Courts are notoriously bad at evaluating product markets, let alone future markets.  So copyright holders now simply argue that future markets, thanks to changing technology, could include anything, and that therefore any use has the potential to harm the commercial prospects of their work.  So even noncommercial uses by people who have no intention of “competing” with the market for the work are found to have infringed, fair use notwithstanding.

But in granting the jailbreaking exemption, the Copyright Office made the interesting and important distinction between the market for the work and the market for the product or service in which the work is embedded.

Jailbreaking, of course, has the potential to seriously undermine the business strategy Apple has carefully designed for the iPhone and, indeed, for all of its products, which is to tightly control the ecosystem of uses for that product.

This ensures product quality, on the one hand, but it also means Apple is there to extract fees and tolls from pretty much any third party they want to, on technical and economic terms they can dictate.  Despite its hip reputation, Apple’s technical environment is more “closed” than Microsoft’s.  (The open source world of Linux being on the other end of the spectrum.)

In granting the exemption, the Copyright Office rejected Apple’s claim that jailbreaking harmed the market for the iPhone.  The fair use analysis, the Register said, focuses on the market for the protected work, which in this case is the iPhone’s firmware.  Since the modifications needed to jailbreak the firmware don’t harm the market for the firmware itself, the infringing use is fair and legally excused.   It doesn’t matter, in other words, that jailbreaking has a potentially big commercial impact on the iPhone service.

That distinction is the notable feature of this decision in terms of copyright law.  Courts, and now the Copyright Office, are well aware that technology companies try to leverage the monopoly rights granted by copyright to create legal monopolies on uses of their products or services.  In essence, they build technical controls into the copyrighted work that limit who can use the product or service and how, then claim their intentional incompatibilities are protected by law.

A line of cases involving video game consoles, printer cartridges and software applications generally has been understandably skeptical of efforts to use copyright in this manner, which quickly begins to smell of antitrust.  Copyright is a monopoly—that is, a trust.  So it’s not surprising that its application can leak into concerns over antitrust.  The law strives to balance the need for the undesirable monopoly (incentives for authors) with the risks to related markets (restraint of trade).

As Anthony Falzone put it in a blog post at the Stanford Center for Internet and Society, “The Library went on to conclude there is no basis for Apple to use copyright law to ‘protect[] its restrictive business model’ and the concerns Apple articulated about the integrity of the iPhone’s ‘ecosystem’ are simply not harms that would tilt the fair use analysis Apple’s way.”

The exemption granted this week follows the theory that protecting the work itself is what matters, not the controlled market that ownership of the work allows the rights holder to create.

The bottom line here:  messing with the firmware is a fair use because it doesn’t damage the market for the firmware, regardless of (or perhaps especially because of) its impact on the market for the iPhone service as Apple has designed it.  That decision is largely consistent with case law evaluating other forms of technical lockout devices.

The net result is that it becomes harder for companies to use copyright as a legal mechanism to fend off third parties who offer replacement parts, add-ons, or other features that require jailbreaking to ensure compatibility.

Which is not to say that Apple or anyone else trying to control the environment around copyright-protected software is out of luck.  As I note in the CNET piece, the DMCA is just one, and perhaps the weakest arrow in Apple’s quiver here.  Just because jailbreaking has now been deemed a fair use does not mean Apple is forced to accommodate any third party app.  Not by a long shot.

Jailbreaking the iPhone remains a breach of the user agreement for both the device and the service.  It still voids the warranty and still exposes the customer to action, including cancelling the service or early termination penalties, that Apple can legally take to enforce the agreement.  Apple can also still take technical measures, such as refusing to update or upgrade jailbroken phones, to keep out unapproved apps.

Contrary to what many comments have said in some of the articles noted above, the DMCA exemption does not constitute a “get out of jail free” card for users.

It’s true that Apple can no longer rely on the DMCA (and the possibility of criminal enforcement by the government) to protect the closed environment of the iPhone.  But consumers can still waive legal rights—including the right to fair use—in agreeing to a contract, license agreement, or service agreement.  (In some sense that’s what a contract is, after all—agreement by two parties to waive various rights in the interest of a mutual bargain.)

Ownership Rights to Software Remain a Mystery

A third interesting aspect of the Copyright Office’s rulemaking has to do with the highly confused question of software ownership.  For largely technical reasons, software has moved from intangible programs that must of necessity be copied to physical media (tapes, disks, cartridges) in order to be distributed, to intangible programs distributed electronically (software as a service, cloud computing, etc.).  With that technical evolution, the already tricky problem of ownership has gotten even trickier.

Under copyright law, the owner of a “copy” of a work has certain rights, including the right to resell their copy.  The so-called “first sale doctrine” makes legal the secondary market for copies, including used book and record stores, and much of what gets interesting on Antiques Roadshow.

But the right to resell a copy of the work does not affect the rights holders’ ability to limit the creation of new copies, or of derivative or adapted works based on the original.  For example, I own several pages of original artwork used in 1960’s comic books drawn by Jack Kirby, Steve Ditko, and Gene Colan.

While Marvel still owns the copyright to the pages, I own the artifacts—the pages themselves.  I can resell the pages or otherwise display the artifact, but I have no right until copyright expires to use the art to produce and sell copies or adaptations, any more than the owner of a licensed Mickey Mouse t-shirt can make Mickey Mouse cartoons.

(Mike Masnick the other day had an interesting post about a man who claims to have found unpublished lost negatives made by famed photographer Ansel Adams.  Assuming the negatives are authentic and there’s no evidence they were stolen at some point, the owner has the right to sell the negatives.  But copyright may still prohibit him from using the negatives to make or sell prints of any kind.)

Software manufacturers and distributors are increasingly trying to make the case that their customers no longer receive copies of software but rather licenses to use software owned by the companies.  A license is a limited right to make use of someone else’s property, such as a seat in a movie theater or permission to drive a car.

As software is increasingly disconnected from embodiment in physical media, the legal argument for license versus sale gets stronger, and it may be over time that this debate will be settled in favor of the license model, which comes with different and more limited rights for the licensee than the sale of a copy.  (There is no “first sale” doctrine for licenses.  They can be canceled under terms agreed to in advance by the parties.)

For now, however, debate rages as to whether and under what conditions the use of software constitutes the sale of a copy versus a license to use.  That issue was raised in this week’s rulemaking several times, notably in a second exemption dealing with unlocking phones from a particular network.

Under Section 117 of the Copyright Act, the “owner of a copy” of a computer program has certain special rights, including the right to make a copy of the software (e.g. for backup purposes, or to move it from inert media to RAM) or modify it when doing so is “essential” to make use of the copy.

Unlocking a phone to move it to another network, particularly a used phone being recycled, necessarily requires at least minor modification, and the question becomes whether the recycler or anyone lawfully in possession of a cell phone “owns a copy” of the firmware.

Though this issue gave the Copyright Office great pause and lots of pages of analysis, ultimately they sensibly hedged on the question of copy versus license.  The Register did note, however, that Apple’s license agreement was “not a model of clarity.”

In the interests of time, let me just say here that this is an issue that will continue to plague the software industry for some time to come.  It is a great example of how innovation continues to outpace law, with unhappy and unintended consequences.  For more on that subject, see Law Seven (copyright) and Law Nine (software) of “The Laws of Disruption.”

The Horses are Gone–So Let's Close Some Other Barn Door

The White House and the Federal Communications Commission have painted themselves into a very tight and very dangerous corner on Net Neutrality.  To date, a bi-partisan majority of Congress, labor leaders, consumer groups and, increasingly, some of the initial advocates of open Internet rules are all shouting that the agency has gone off the rails in its increasingly Ahab-like pursuit of an obscure and academic policy objective.

Now comes further evidence, none of it surprising, that all this effort has been a fool’s errand from the start.  Jacqui Cheng of Ars Technica is reporting today on a new study from Australia’s University of Ballarat that suggests only 0.3% of file sharing using the BitTorrent protocol is something other than the unauthorized distribution of copyrighted works.  Which is to say that 99.7% of the traffic they sampled is illegal.  The Australian study, as Cheng notes, supports similar conclusions of a Princeton University study published earlier this year.

Remember How We Got Here

What does that have to do with Net Neutrality?

Let’s recall how we got into this mess.

When it became clear in 2007 that Comcast was throttling or blocking BitTorrent traffic without disclosing the practice to consumers, the FCC held hearings to determine if the company had violated the agency’s 2005 Internet policy statement.  The Framework for Broadband Access to the Internet included the principle that “consumers are entitled to access the lawful Internet content of their choice . . . [and] to run applications and use services of their choice,” and many argued that Comcast’s behavior violated that principle.

In the interim, Comcast changed its method of managing high-volume activities and achieved a peaceful resolution with BitTorrent.  Still, the FCC concluded that Comcast had violated the policy and issued a non-financial sanction against the cable provider in 2008.

Comcast challenged the order to the U.S. Court of Appeals for the D.C. Circuit, which hears all appeals of FCC adjudications.  Comcast argued that the FCC lacked authority to enforce its policy, and the D.C. Circuit agreed.

While the D.C. Circuit case was pending, however, the FCC in October of last year issued its Notice of Proposed Rulemaking for “Preserving the Open Internet.”  The goal of this NPRM, still pending, is to codify and enlarge the 2005 Internet policy statement and transform it into enforceable net neutrality rules.

Why change the policy into rules?  In explaining the “Need for Commission Action,” the NPRM noted that “Despite our efforts to date, some conduct is occurring in the marketplace that warrants closer attention and could call for additional action by the Commission, including instances in which some Internet access service providers have been blocking or degrading Internet traffic, and doing so without disclosing those practices to users.”  (¶50)  The NPRM added to the four principles laid out in the 2005 policy a new requirement that ISPs make their network management practices more transparent to consumers.

But the NPRM premised the FCC’s authority to issue net neutrality rules on the same jurisdiction it used to issue the sanctions against Comcast, so-called “ancillary jurisdiction” under Title I of the Communications Act.

Once the D.C. Circuit ruled in April of this year that “ancillary jurisdiction” was insufficient, the FCC’s ability to complete and defend the NPRM was called into doubt.  The FCC couldn’t sanction Comcast under the policy statement, and may not be able to enforce the proposed rules either.  There may be no legal authority, the agency believes, to prohibit Comcast’s interruption of BitTorrent transfers.

So the FCC is now pursuing perhaps the most extreme option for shoring up its authority:  reclassifying broadband Internet access as a Title II “telecommunications” service, subject to a dizzying array of potential new rules and regulations at the federal, state, and local level.

It is that leap of madness that has splintered the net neutrality coalition, and united Congress in calling for the FCC to step back from the brink.

Back to BitTorrent

The Australian and Princeton research makes clear what everyone already knows.  Despite the technical merits of the BitTorrent protocol and the best efforts of the company that manages it, the vast majority of users availing themselves of this technology are using it for activities that violate U.S. and foreign copyright laws.

Here’s the problem.  The FCC’s Internet policy statement, the proposed rules, and the effort to ensure authority to enforce those rules under Title II are all premised on the sensible limitation that consumers should have the right to access the “lawful Internet traffic” of their choice.  (See ¶ 1 of the Title II Notice of Inquiry, e.g.) (emphasis added)

They don’t apply at all to unlawful activities, whether of consumers or content providers.  Which is to say, they don’t apply to the vast majority of BitTorrent file transfers.

Not clear?  Let’s keep going.  According to the NOI, the FCC reads the Comcast decision as holding “the Commission lacked authority to prohibit practices of a major cable modem Internet service provider that involved secret interruption of lawful Internet transmissions, which the Commission found were unjustified and discriminatory and denied users the ability to access the Internet content and applications of their choice.” (emphasis added)

The proposed net neutrality rules are equally emphatic:  they apply only to lawful Internet activity.  (The NPRM refers to “lawful content” nearly 50 times.)

If there’s any doubt about the intent of the old policy, the proposed new rules, or Title II to protect illegal file sharing, the FCC dispels it over and over in the NPRM.  “The draft rules would not prohibit broadband Internet access service providers from taking reasonable action to prevent the transfer of unlawful content,” according to the Executive Summary of the NPRM, “such as the unlawful distribution of copyrighted works.”  (emphasis added)

This is a fine how-do-you-do.  Comcast limited its arguments in the D.C. Circuit to jurisdictional and procedural flaws in the FCC sanctions.  But suppose Comcast had argued, as ample evidence now supports, that it was not blocking any (or nearly any) “lawful content” in the first place.  In that case, neither the old Internet policy nor the proposed net neutrality rules would actually apply to Comcast’s interference with BitTorrent transfers, the very “practice” that started this catastrophe and that has led us to the verge of policy warfare.

Indeed, under the Digital Millennium Copyright Act and other copyright laws, it’s very likely that Comcast could be compelled by the Department of Justice or affected copyright holders to stop the vast majority of BitTorrent transfers, on pain of large civil or even criminal penalties.  Which is yet another reason (if the FCC had needed another reason) that none of the proposed rules, regulations, or reclassifications would actually correct the only problem the FCC claims it is trying to address.

So neither the NPRM nor the Title II Notice of Inquiry, in the end, has anything to do with Comcast’s network management practices or the D.C. Circuit’s decision.  The sad irony here is that if the Commission goes ahead with reclassification and then completes the net neutrality rulemaking, there would be nothing to stop Comcast from going right back to blocking BitTorrent traffic.  There might even be legal authority compelling it to do so.

Meanwhile, the National Broadband Plan, the Commission’s stand-out achievement under Chairman Julius Genachowski, has taken a back seat to hyperventilating over a non-event and a non-problem.

Please, can we get back to making the Internet better for more Americans?