Category Archives: Digital Life

Cloud Users and Providers Win Big Privacy Victory – U.S. v. Warshak

The Sixth Circuit ruled on Tuesday that criminal investigators must obtain a warrant to seize user data from cloud providers, voiding parts of the notorious Stored Communications Act. The SCA allowed investigators to demand that providers turn over user data under certain circumstances (e.g., data stored more than 180 days) without obtaining a warrant supported by probable cause.

I have a very long piece analyzing the decision, published on CNET this evening. See “Search Warrants and Online Data: Getting Real.” (I also wrote extensively about digital search and seizure in “The Laws of Disruption.”) The opinion is from the erudite and highly readable Judge Danny Boggs. The case is notable if for no other reason than its detailed and lurid description of the business model for Enzyte, a supplement that promises to, well, you know what it promises to do….

The SCA’s looser rules for search and seizure created real headaches for cloud providers and weird results for criminal defendants. Emails stored on a user’s home computer, or on a service provider’s computer for less than 180 days, get full Fourth Amendment protection. But after 180 days, the same emails stored remotely lose some of that protection. As the commercial Internet has evolved (the SCA was written in 1986), these provisions have become increasingly anomalous, arbitrary, and worrisome, both to users and service providers, as well as to a wide range of public interest groups.

Why 180 days?  I haven’t had a chance to check the legislative history, but my guess is that in 1986 data left on a service provider’s computer would have taken on the appearance of being abandoned.

Assuming the Sixth Circuit decision is upheld and embraced by other circuits, digital information will finally be covered by traditional Fourth Amendment protections regardless of age or location. Which means that the government’s ability to seize emails (Tuesday’s case applied only to emails, but other user data would likely get the same treatment) without a warrant based on probable cause will turn on whether the defendant had a “reasonable expectation of privacy” in the data. If the answer is yes, a warrant will be required.

(If the government seizes the data anyway, the evidence could be excluded as a penalty.  The “exclusionary rule” was not invoked in the Warshak case, however, because the government acted on a good-faith belief that the SCA was Constitutional.)

Where does the “reasonable expectation of privacy” test come from? The Fourth Amendment protects against “unreasonable” searches and seizures, and, since the Katz decision in 1967, Fourth Amendment cases turn on an analysis of whether a criminal defendant’s expectation of privacy in whatever evidence is obtained was reasonable.

Katz involved an electronic listening device attached to the outside of a phone booth—an early form of electronic surveillance.  Discussions about whether a phone conversation could be “searched” or “seized” got quickly metaphysical, so the U.S. Supreme Court decided that what the Fourth Amendment really protected was the privacy interest a defendant had in whatever evidence the government obtained.  “Reasonable expectation of privacy” covered all the defendant’s “effects,” whether tangible or intangible.

Which means, importantly, that not all stored data would pass the test requiring a warrant. Only stored data that the user reasonably expects the service provider to keep private would require a warrant. Information of any kind that the defendant makes no effort to keep private—e.g., talking on a cell phone in a public place where anyone can hear—can be used as evidence without a warrant.

Here the Warshak court suggested that if the terms of service made explicit that user data would not be kept private, then users wouldn’t have a reasonable expectation of privacy that the Fourth Amendment protected. On the other hand, terms that merely reserved the service provider’s own right to audit or inspect user data did not defeat a reasonable expectation of privacy, contrary to what the government has long argued.

An interesting test case, not discussed in the opinion, would be Twitter.  Could a criminal investigator demand copies of a defendant’s Tweets without a warrant, arguing that Tweets are by design public information?  On the one hand, Twitter users can exclude followers they don’t want.  But at the same time, allowed followers can retweet without the permission of the original poster.   So, is there a reasonable expectation of privacy here?

There’s no answer to this simplified  hypothetical (yet), but it is precisely the kind of analysis that courts perform when a defendant challenges the government’s acquisition of evidence without full Fourth Amendment process being followed.

To pick an instructive example involving tangible evidence, last month appellate Judge Richard Posner wrote a fascinating decision that shows the legal mind in its most subtle workings. In U.S. v. Simms, the defendant challenged the admission of evidence obtained in a warranted search of his home and vehicle. The probable cause that led to the warrant was the discovery of marijuana cigarette butts in the defendant’s trash. The defendant argued that the search leading to the warrant violated the Fourth Amendment, since the trash can was behind a high fence on his property.

Courts have held that once trash is taken to the curb, the defendant has no “reasonable” expectation of privacy and therefore is deemed to consent to a police officer’s search of that trash.  But trash cans behind a fence are generally protected by the Fourth Amendment, subject to several other exceptions.

Here Judge Posner noted that the defendant’s city had an ordinance that prohibited taking the trash to the curb during the winter, out of concern that cans would interfere with snow plowing. Instead, the “winter rules” required that trash collectors retrieve the cans from the resident’s property, and that residents leave a safe and unobstructed path to wherever the cans were stored. Since the winter rules were in effect, the cans were behind a fence whose gate had been left open (perhaps stuck in the snow), and the police searched them on trash pickup day, the search did not violate the defendant’s reasonable expectation of privacy.

For better or worse, this is the kind of analysis judges must perform in the post-Katz era, when much of what we consider to be private is not memorialized in papers or other physical effects but is instead intangible—the state of our blood chemistry, information stored in various databases, heat given off and detectable by infrared scanners.

The good news is that the Warshak case is a big step in including digital information under that understanding of the Fourth Amendment.  Search and seizure is evolving to catch up with the reality of our digital lives.

Brief thoughts on today's FCC Net Neutrality announcement

Late last night, FCC Chairman Julius Genachowski made explicit what he’d been hinting at for weeks: that he was going to call for a vote in December on the agency’s long-running net neutrality proceedings.

Today, the Chairman gave a speech outlining a new version of the rules he has circulated to fellow Commissioners, which the Commission will vote on at its Dec. 21, 2010 meeting.

The new order itself has not yet been made public, however, and the Chairman’s comments didn’t give much in the way of details.  The latest version appears to reflect the proposed legislation circulated before the mid-term recess by then-Commerce chair Henry Waxman.  That version, for those following the ball here, was itself based on the legislative framework proposed by Google and Verizon, which itself emerged from informal negotiations convened over the summer at the FCC.

So in some sense the agency is moving, albeit non-linearly, toward some kind of consensus.

I have a brief article this morning in the Orange County Register laying out the pros and cons of this latest iteration, to the extent that is possible without seeing the order.

The timing of today’s announcement, however, is significant. This was Genachowski’s last chance to wrap up the proceedings before the new Congress, with its Republican House and more evenly divided Senate, clocks in. Republicans on their own don’t have the votes to pass legislation that would block the FCC from voting on net neutrality later, but Republican leaders had threatened to use their oversight authority to put additional pressure on the FCC not to enact new neutrality rules.

That might still happen, of course, and already today several Republican leaders have promised to do whatever they can to undo today’s developments. Assuming the Commission approves the rule at its December 21, 2010 meeting, there’s also a strong likelihood of litigation challenging the rules and the FCC’s authority to issue them.

So this is not the end of the net neutrality soap opera by any stretch of the imagination. If anything, it suggests a new chapter, one that will take the discussion further from the technical architecture of the Internet and the best interests of consumers and closer to pure political theater.

Domain Name Seizures and the Limits of Civil Forfeiture

I was quoted this morning in Sara Jerome’s story for The Hill on the weekend seizures of domain names the government believes are selling black market, counterfeit, or copyright infringing goods.

The seizures take place in the context of an on-going investigation where prosecutors make purchases from the sites and then determine that the goods violate trademarks or copyrights or both.

Several reports, including from CNET, The Washington Post and Techdirt, wonder how it is the government can seize a domain name without a trial and, indeed, without even giving notice to the registered owners.

The short answer is the federal civil forfeiture law, which has been the subject of increasing criticism unrelated to Internet issues.  (See http://law.jrank.org/pages/1231/Forfeiture-Constitutional-challenges.html for a good synopsis of recent challenges, most of which fail.)

The purpose of forfeiture laws is to help prosecutors fit the punishment to the crime, especially when restitution to victims or recovery of the costs of prosecution is otherwise unlikely to have a deterrent effect, largely because the criminal has no assets to attach. In the war on drugs, for example, prosecutors can now seize pretty much any property used in the commission of the crime, including a seller’s vehicle or boat. (See U.S. v. 1990 Toyota 4 Runner for an example and explanation of the limits of federal forfeiture law.)

Forfeiture laws have been increasingly used to fund large-scale enforcement operations, and many local and federal police now develop budgets for these activities based on assumptions about the value of seized property.  This has led to criticism that the police are increasingly only enforcing the law when doing so is “profitable.”  But police point out that in an age of regular budget cuts, forfeiture laws are all they have in the way of leverage.

Sometimes the forfeiture proceedings happen after the trial, but as with the domain names, prosecutors also have the option to seize property before any indictment and well before any trial or conviction.  Like a search warrant, a warrant to seize property requires only that a judge find probable cause that the items to be seized fit the requirements of forfeiture—in general, that they were used in the commission of a crime.

The important difference between a seizure and a finding of guilt—the difference that allows the government to operate with such a free hand—is that the seizure is only temporary.  A forfeiture, as here, isn’t permanent until there is a final conviction.

The pre-trial seizure is premised on the idea that during the investigation and trial, prosecutors need to secure the items so that the defendant doesn’t destroy or hide them.

If the defendant is acquitted, the seized items are returned. Or, if the items turn out not to be subject to forfeiture (e.g., they were not used in the commission of any crimes the defendant is ultimately convicted of), they are again returned. Even before trial, owners can sue to quash the seizure order on the grounds that there was insufficient (that is, less than probable) cause to seize the property in the first place.

All of that process takes time and money, however, and many legal scholars believe that in practice forfeiture reverses the presumption of innocence, forcing the property owner to prove the property is “innocent” in some way.

In current (and expanding) usage, forfeiture may also work to short-circuit due process for the property owner. (Or owners—indeed, seized property may be jointly owned, and the victim of the crime may be one of the owners, as when the family car is seized after the husband uses it to liaise with a prostitute.)

That’s clearly a concern with the seizure of domain names. This “property” is essential for the enterprise being investigated to do business of any kind. So seizing the domain names before indictment and trial effectively shuts down the enterprise indefinitely. (Reports are, however, that most if not all of the enterprises involved in this weekend’s raid have already returned under new domain names.)

If prosecutors drag their heels on prosecution, the defendant gets “punished” anyway. Even if the defendant is never charged or is ultimately acquitted, nothing in the forfeiture statute requires the government to make them whole for the losses suffered during the period when their property was held by the prosecution. The loss of the use of a car or boat, for example, may require the defendant to rent another while waiting for the wheels of justice to turn.

For a domain name, even a short seizure effectively erases any value the asset has.  Even if ultimately returned, it’s now worthless.

Clearly the prosecutors here understand that a pre-trial seizure is effectively a conviction.  Consider the following quote from Immigration and Customs Enforcement Director John Morton, who said at a press conference today, “Counterfeiters are prowling in the back alleys of the Internet, masquerading, duping and stealing.”  Or consider the wording of the announcement placed on seized domain names (see http://news.cnet.com/8301-1023_3-20023918-93.html), implying at the least that the sites were guilty of illegal acts.

There’s no requirement for the government to explain that the seizures are only temporary measures designed to safeguard property that may be evidence of a crime or an asset used to commit it. Nor does it have to acknowledge that none of the owners of the seized domain names has yet been charged with or convicted of any crime. But the farther prosecutors push the forfeiture statute, the bigger the risk that courts or Congress will someday step in to pull them back.

Updates to the media page

The fall has been filled with important developments in the technology world, and I continue to be a regular source for journalists as well as publishing frequent editorials and analyses of my own.  I’ve just posted another ten items to the Media Page of my website, including several articles I’ve written for CNET News.com, an election-day op-ed in Roll Call, legal analysis for The Wall Street Journal and a long review of “The Laws of Disruption” in the International Journal of Communications.  The accidents continue to pile up at the dangerous intersection of innovation and the law, the main theme of The Laws of Disruption.

Some highlights:

The U.S. Supreme Court heard arguments in EMA v. Schwarzenegger, which challenges California’s ban on violent video games on First Amendment grounds. My article for CNET explained why the timing of the case is significant, with implications for all new media enterprises.

The European Commission is preparing new legislation to guarantee its citizens a “right to be forgotten.” On CNET, I explain why that well-intentioned initiative could have disastrous consequences for the digital economy.

My election-day op-ed for Roll Call, the leading newspaper of Capitol Hill, urged Congress to stop the FCC’s dangerous plans to “reclassify” broadband Internet access and treat it like a 1930s-style telephone business.

My detailed analysis of Rep. Henry Waxman’s proposed net neutrality bill, a last-minute effort to resolve the long-running conflict before the election, was featured on The Wall Street Journal’s “All Things Digital.”

In the important Vernor decision, the Ninth Circuit Court of Appeals ruled that licensing agreements that deny users the right to resell copies of software are enforceable. Though many viewed this decision as harmful to consumers, I explain why developments in the software industry have already relegated license agreements to the margins, in a controversial article for CNET News.com.

NextGenWeb, sponsored by the U.S. Telecom Association, interviewed me during one of many recent visits to Washington.

As the new Congress prepares to convene in January, watch for more important developments.

Resurrecting the National Broadband Plan: Why is Washington Fighting a Winning Strategy?

I published an opinion piece today at CNET, calling on all tech stakeholders in Washington to stop the pointless quibbling and sniping about net neutrality, reclassification, and other side-show issues.  (I’m too depressed to list them here—but see “Fox-Cablevision and the Net Neutrality Hammer” for an example of just how degraded the conversation has become.)

Instead, why not focus on a positive message, one that has the potential for win-win-win-win?  For example, the National Broadband Plan, issued in March, eloquently made the case for a U.S. commitment to universal broadband adoption.  Not as a matter of gee-whiz futurism but in the interest of giving Americans “a better way of life.”

As a technology optimist, I happen to agree. Broadband Internet provides users with much more than cute kitten videos and reconnections with old friends on social networking sites (not that there’s anything wrong with these). As the plan makes clear, it also gives them access to education and employment opportunities otherwise hard to find (or available only at a much higher price), as well as access to government services, public safety, and better health care options. The Internet will play a key role in the development of a “smart” energy grid.

And as more urban countries with higher penetration rates and faster speeds have learned to their delight, the network effects of having everyone online generate all kinds of serendipitous positive returns.

Even better, achieving the goals of the NBP won’t require massive taxpayer spending, making it palatable to both Democrats and Republicans. Most of the $350 billion it will cost to get 100 Mbps speeds to 100 million Americans—a key benchmark of the plan—will come from private investment, much of it already planned for.

So moving forward with the Plan will improve the lives of ordinary citizens, make government more responsive and responsible, stimulate the economy, and help keep the U.S. competitive in a global information economy.  And it can be done without significant taxpayer expense or new regulatory overhead.

This is the feel-good story of the decade.  Come on, everybody!  We can use my barn.

It’s all in the plan. But given the Sturm und Drang expended over largely inside-the-Beltway minutiae, the NBP’s positive message has been drowned out.

Case in point:  a recent report from the NTIA reveals that among the 25% of American homes that don’t have a single Internet user, the most frequently cited reason not to sign up for a broadband service is that they just don’t want it.  A full two thirds of the non-users, according to the report, “reported a lack of need or interest as their primary reason for not having broadband at home.”  Cost was a much lower factor.  Only four percent cited lack of availability.

It’s depressing and disappointing that so many of my fellow citizens haven’t gotten the message:  the Internet is cool, and broadband access will pay for itself many times over.

It’s also frustrating to the authors of the NBP, whose herculean efforts were unfairly and unduly overshadowed by the universal hand-wringing that followed the D.C. Circuit’s decision in the Comcast case, which came out just a few weeks later.  (For the record, NBP executive director Blair Levin agrees with legal scholars who don’t believe Comcast undermined the FCC’s ability to move forward with the plan itself:  “I think there is a lot of good stuff that can be done to advance the National Broadband Plan,” he recently told CNET’s Marguerite Reardon, “that doesn’t require any action from Congress.”)

In some sense the mid-term elections have provided the opportunity for all stakeholders—Congress, the FCC, lobbyists and advocacy groups—to resurrect the NBP and feature it as the central document in a national dialogue on technology policy.  It’s the right thing to do for the economy, and for individuals.  And in one of those rare harmonic convergences, it’s also politically expedient.  It’s positive!  It’s bi-partisan!  It’s high-tech!

So why isn’t anyone doing it?

Europe Reinvents the Memory Hole

Inspired by thoughtful pieces by Mike Masnick on Techdirt and L. Gordon Crovitz’s column yesterday in The Wall Street Journal, I wrote a perspective piece this morning for CNET regarding the European Commission’s recently proposed “right to be forgotten.”

A Nov. 4th report promises new legislation next year “clarifying” this right under EU law, suggesting not only that the Commission thinks it’s a good idea but, even more surprisingly, that the right already exists under the landmark 1995 Privacy Directive.

What is the “right to be forgotten”?  The report is cryptic and awkward on this important point, describing “the so-called ‘right to be forgotten’, i.e. the right of individuals to have their data no longer processed and deleted when they [that is, the data] are no longer needed for legitimate purposes.”

The devil, of course, will be in the forthcoming details.  But it’s important to understand that under current EU law, the phrase “their data” doesn’t just mean information a user supplies to a website, social network, or email host.  Any information that refers to or identifies an individual is considered private information under the control of the person to whom it refers.  So “their data” means anyone’s data, even if the individual identified had nothing to do with its collection or storage.

And EU law doesn’t just limit privacy protections to computer data. Users have the right to control information about them appearing in printed and other analog formats as well.

As I say in the piece, the “right to be forgotten” begins to sound like Big Brother’s “memory hole” in Orwell’s classic 1984. But instead of Winston Smith “rectifying” newspaper articles at the direction of his faceless masters at the Ministry of Truth, a right to be forgotten creates a kind of personal memory hole. Something you did in the past that you would prefer never happened? Just issue orders to anyone who knows about it, and force them to destroy any evidence.

Of course such a right would be as impractical to enforce as it is ill-conceived to grant.

Both Masnick and Crovitz, in particular, worry about the free speech implications of such a right, both for the press and for individuals.  And those are indeed potentially catastrophic.  Having the power to rewrite history devalues any information, including information that hasn’t been erased.

The social contract operates on facts and the ability to sort out truth from lies. A right to be forgotten gives every individual the power to rewrite that contract whenever they feel like it. So who would sensibly enter into such a relationship in the first place?

My concern, however, is even more metaphysical.  The privacy debate currently going on in public policy circles is disturbing, perhaps most of all because it is being framed as a policy discussion.  Rather than work out what costs and benefits we get from increased information sharing with each other, those who are feeling anxious about the pace of change in digital life are running, as anxious people often do, to regulators, demanding they do something—anything—to alleviate their future shock.  And regulators, who are pretty anxious people themselves, are too-often happy to oblige, even when they understand neither the technology nor the implications of their lawmaking.

Beyond the worst possible choice of forum to begin a conversation, the privacy debate in its current form is no debate at all.  It is mostly a bunch of emotional people hurling rhetorical platitudes at each other, trading the worst-case examples of the deadly potential of privacy invasions (teen suicides, evil corporations) with fear-inspiring claims of the risk of keeping information secret (terrorists win).

It’s not really a debate at all when the two “sides” are talking about entirely different subjects.  And when no one’s really listening anyway. All that is happening is that the stress level amps up, and those not participating in the discussion get the distinct impression that the world is about to end.

A starting point for a real conversation about privacy—one that is dangerously absent from any of the current lawmaking efforts—is an understanding about the nature of information.  Privacy in general and a right to be forgotten specifically begins with the false assumption that information (private or otherwise) is a kind of property, a discrete, physical item that can be controlled, owned, traded, used up, and destroyed.  (Both “sides” have fallen into this trap, and can’t seem to get out.)

The fight often breaks down into questions of entitlement—who initially owns the information that refers to me?  The person who found it and translated it into a form that could be accessed by others, or the person to whom it refers, regardless of source?  Under what conditions can it be transferred?  Does the individual maintain a universal and inalienable right of rescission—the ability to take it back later, for any reason, and without compensating the person who now has it?

But these are the wrong questions to be asking in the first place.  Information isn’t property, at least not as understood by our industrial-age legal system or popular metaphors of ownership.  Information, from an economic standpoint, is a virtual good.  It can be “possessed” and used by everyone at the same time.  It can become more valuable in being combined with other information.  It can maintain or improve its value forever.

And, whether the law says so or not, it can’t be repossessed, put back in the safety deposit box, buried at sea, or “devoured by the flames” like the old newspaper articles Winston Smith rewrites when the truth turns out to be inconvenient to the past.  That of course was Orwell’s point.  You can send down the memory hole the newspaper that reported Big Brother’s promise of increased chocolate rations, but people still remember that he said it.  You can try to brainwash them, too, and limit their choice of language to eliminate the possibility of unsanctioned thoughts.  You can destroy the individual who rebels against such efforts.

But it still doesn’t work.  The facts, warts and all, are still there, even when their continued existence is subjectively embarrassing to an individual.  Believe me, I wish sometimes it were otherwise.  I would very much like to “rectify” high school, or my parents, or the recent death of my beloved dog.  The truth often hurts.

But burning all the libraries and erasing all the bits in the world doesn’t change the facts.  It just makes them harder to access.  And that makes it harder to learn anything from them.

Maybe the European Commission was just being sloppy in its choice of words.  Perhaps it has something much more limited in mind for a “right to be forgotten.”  Or perhaps as it begins the ugly process of writing actual directives that must then be implemented in law by member countries, it will see both the impossibility and danger of going down this path.

Perhaps they’ll then pretend they never actually promised to “clarify” such a right in the first place.

But we’ll all know that they did.  For whatever it’s worth.