
Announcing "VC/DC," a new webcast series for Forbes

 

This week, Forbes Senior Online Editor Kashmir Hill and I launched a new video series, “VC/DC,” where we review the latest developments at the accident-prone intersection of technology and policy. The first two episodes are embedded below.

We’ve envisioned the series as a regular look at the policy issues technology companies and investors should be paying attention to but probably aren’t.

Kashmir and I each bring a unique perspective to technology and policy. A former D.C. resident, Kashmir relocated to the Bay Area a few years ago to cover privacy, security, digital currency and other cutting edge topics.

I, on the other hand, am a Silicon Valley veteran who now spends nearly half my time in Washington at the Georgetown Center for Business and Public Policy, working to bridge the gap between disruptive innovations and the regulators who sometimes love them to death.

The program will cover a wide range of topics, and won’t be limited just to developments inside the beltway. As our inaugural episodes make clear, we’re also looking closely at how technology businesses are affected by local and international laws, as well as developments in the courts and the legal system overall.

I hope you like the series and find it interesting enough to subscribe.  We’d be grateful for your feedback in any case, as well as suggestions for future episodes.

 

Episode 1:  “The Accident-Prone Intersection of Innovation and Policy”

 

Episode 2:  “Security Standards and the Patent Crisis”

Larry Appears on C-SPAN's "The Communicators"


Larry is the guest this week on C-SPAN’s long-running program “The Communicators,” talking with host Peter Slen about the research behind “Big Bang Disruption.”  The program covers a wide range of topics, including the impact of exponential technologies, the fate of incumbents, and the role of regulators in impeding or encouraging disruptive innovation.

How Should Governments Approach Technology Policy? Modestly.

Earlier this week, I spoke in San Francisco at an event co-sponsored by the Reason Foundation, TechFreedom, and the Koch Institute.  Hosted by my CNET colleague Declan McCullagh, the topic was “DC vs. SF:  A Clash of Visions for Tech Policy.”

The discussion ranged widely, from copyrights and patents to NSA surveillance to the failure of the government’s Healthcare.gov website.  Although panelists from the political left and right disagreed on some issues, there was as usual widespread consensus that from the standpoint of entrepreneurs and engineers, the core problem in technology policy is that the pace of change for innovation continues to accelerate while the pace of government, at best, remains constant.  Moore’s Law, increasingly, trumps legislated law, often with unintended negative consequences.

At the same time, as I emphasized, the shift in commerce, social life, civil life and nearly everything else from offline to online media means that future collisions at the intersection of innovation and regulation are inevitable and, indeed, certain to increase in both frequency and the degree of collateral damage.  Governments claim a monopoly on regulating much of this activity, after all, and like any institution that believes in its own mission, a government is unlikely to let itself go quietly out of business as its markets change.

Governments rely for revenue on taxes. As more traditionally taxable activity migrates online, lawmakers are certain to follow. That’s been true in the development of any frontier, physical or virtual.

The longstanding Silicon Valley approach of ignoring Washington in hopes lawmakers won’t notice what we’re doing was always a dangerous policy, and has now become downright reckless.  So how should innovators engage with regulators?

Watch the video below to find out!

DC v. SF: A Clash of Visions for Tech Policy from Charles Koch Institute on FORA.tv

The return of Net Neutrality

Over on Forbes today, I have a very long post inspired by Monday’s oral arguments in Verizon’s challenge of the FCC’s Open Internet rules, passed in 2010.

I say “inspired” because the post has nothing to say about the oral arguments which, in any case, I did not attend.  Mainstream journalists can’t resist the temptation to try to read into the questions asked or the mood of the judges some indication of how the decision will come out.

But as anyone who has ever worked in a court or followed appellate practice well knows, the tone of oral arguments signals nothing about a judge’s point of view.  Often, the harshest questioning is reserved for the side a judge is leaning toward supporting, perhaps because the briefs filed were inadequate.  Bad briefs create more work for the judge and her clerks.

I use the occasion of the hearing to take a fresh look at the net neutrality “debate,” which has been ongoing since at least 2005, when I first started paying attention to it.  In particular, I try to disentangle the political term “net neutrality” (undefined and, indeed, not even used in the 2010 Open Internet order) from the engineering principles of packet routing.

According to advocates for government regulation of broadband access, the political argument for net neutrality regulation is simply a codification of the Internet’s design.  But regardless of whether it would even make sense to transform the FCC into the governing body of engineering protocols for the network (the Internet Society and its engineering task forces are and always have been doing a fine job, thanks very much), the reality is that the political argument has almost nothing to do with the underlying engineering.

Indeed, those most strongly advocating for more government regulation either don’t understand the engineering or intentionally mischaracterize it, or both.  That’s clear from the wide range of supposed competitive problems that have been lumped together under the banner of “net neutrality” issues over the years, almost none of which have anything to do with packet routing.

Fortunately, very little of the larger political agenda of the loose coalition of net neutrality advocates is reflected in the rules ultimately passed by a bare majority of the FCC in 2010.  Even so, those rules, limited as they were, face many challenges.

For one thing, the FCC, despite over a year of dedicated attention to the problem, could identify only four incidents suggesting any kind of market failure, only one of which (the Comcast-BitTorrent incident) was ever actually considered in detail by the Commission.  (Two of the others never even rose to the level of a complaint.)  The agency was left to regulate on the basis of “preserving” the Open Internet through what it called (nearly a dozen times) “prophylactic” rules.

Second, and of particular interest in the D.C. Circuit proceeding, Congress has never authorized the FCC to issue rules dealing with broadband Internet access.  Though many authorizing bills have circulated over the years, none have ever made it out of committee.  With no legal basis to regulate, the agency was left pointing to irrelevant provisions of the existing Communications Act, most of which were already rejected by the same court in the Comcast case.  Nothing in the law has changed since Comcast, and on that basis, regardless of the merits of Internet regulation, the FCC is very likely to lose, which the Commission surely knew when it passed the rules in 2010.

The piece ends by describing, as I did in my testimony before the House Judiciary Committee in early 2011, how the Report and Order betrays the technical reality that from an engineering standpoint, even the supposed neutrality of packet routing is largely a sentimental myth.  The FCC identified and exempted a dozen network management technologies, practices, and protocols that it acknowledged do not follow the neutrality principle, but which are essential to effective and efficient management of the network.  There is no “neutral” Internet to preserve, and never was.
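To see why such exempted practices are inherently non-neutral, consider ordinary priority queueing, the kind of traffic management any quality-of-service scheme relies on. The sketch below is purely illustrative (the traffic classes and names are my own invention, not taken from the FCC’s order or any vendor’s implementation): latency-sensitive voice packets are dequeued ahead of bulk transfers regardless of arrival order.

```python
import heapq

# Illustrative sketch only -- not any actual router's code.
# Lower number = higher priority; these classes are assumed for the example.
PRIORITY = {"voip": 0, "video": 1, "bulk": 2}

queue = []
seq = 0  # tie-breaker so equal-priority packets stay in arrival order

def enqueue(kind, payload):
    """Push a packet; the heap orders by (priority, arrival sequence)."""
    global seq
    heapq.heappush(queue, (PRIORITY[kind], seq, payload))
    seq += 1

# Packets arrive with bulk traffic first...
for kind, pkt in [("bulk", "file-chunk"), ("voip", "audio-frame"), ("video", "key-frame")]:
    enqueue(kind, pkt)

# ...but are serviced strictly by class, not arrival order.
order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)  # voice first, then video, then bulk
```

The point of the sketch is simply that prioritizing some packets over others is routine engineering, not an aberration, which is why a literal “neutrality” mandate would have to exempt it.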

The agency was right to exempt these practices.  But the problem with the rules as written is that they could not and did not extend to future innovations that new applications and new users will certainly make as essential as today’s management techniques.

If the rules stand, network engineers, application developers, device makers and others in the vibrant, dynamic Internet ecosystem will be forced to seek permission to innovate from the FCC, which will both slow the high-speed world of Internet design to a crawl and introduce a decision maker with no technical expertise and lots of political baggage.

That of course was the kind of counter-productive and unnecessary regulatory intrusion that Internet users successfully rose up against last year when the UN’s International Telecommunication Union threatened to assert itself in basic Internet governance, or the year before that when Congress, without technical understanding of the most basic variety, tried to re-architect the Internet on behalf of media companies in the failed SOPA and PIPA legislation.

If the FCC gains a foothold in broadband access with the Open Internet rules or other efforts to gain oversight where Congress has delegated none, expect a similar reaction.  Or, in any case, hope for one.

Where to next for the FCC?

crossroads

Tuesday was a big day for the FCC.  The Senate Commerce, Science and Transportation Committee held an oversight hearing with all five Commissioners, the same day that reply comments were due on the design of eventual “incentive auctions” for over-the-air broadcast spectrum.  And the proposed merger of T-Mobile USA and MetroPCS was approved.

All this activity reflects the stark reality that the Commission stands at a crossroads.  As once-separate wired and wireless communications networks for voice, video, and data converge on the single IP standard, and as mobile users continue to demonstrate insatiable demand for bandwidth for new apps, the FCC can serve as midwife in the transition to next-generation networks.  Or, the agency can put on the blinkers and mechanically apply rules and regulations designed for a by-gone era.

FCC Chairman Julius Genachowski, for one, believes the agency is clearly on the side of the future.  In an op-ed last week in the Wall Street Journal, the Chairman took justifiable pride in the focus his agency has demonstrated in advancing America’s broadband advantage, particularly for mobile users.

Mobile broadband has clearly been a bright spot in an otherwise bleak economy.  Network providers and their investors, according to the FCC’s most recent analysis, have spent over a trillion dollars since 1996 building next-generation mobile networks, today based on 4G LTE technology.

These investments are essential for high-bandwidth smartphones and tablet devices and the remarkable ecosystem of voice, video, and data apps they have enabled.  This platform for disruptive innovation has powered a level of “creative destruction” that would do Joseph Schumpeter proud.

Mobile disruptors, however, are entirely dependent on the continued availability of new radio spectrum.  In the first five years following the 2007 introduction of the iPhone, mobile data traffic increased 20,000%.  No surprise, then, that the FCC’s 2010 National Broadband Plan conservatively estimated that mobile consumers desperately needed an additional 300 MHz of spectrum by 2015 and 500 MHz by 2020.
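The scale of that growth is easy to understate. A quick back-of-the-envelope check, taking the 20,000% figure above at face value, shows what it implies as a compound annual rate:

```python
# A 20,000% increase over five years means traffic ended at 201x its
# starting level (the original 1x plus a 200x increase).
growth_factor = 1 + 20000 / 100   # 201x total
years = 5

# Implied compound annual growth rate
annual = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {annual:.0%}")  # roughly 189% per year
```

In other words, traffic nearly tripling every single year, which is why even the Broadband Plan’s 300 MHz and 500 MHz targets read as conservative.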

With nearly all usable spectrum long since allocated, the Plan acknowledged the need for creative new strategies for repurposing existing allocations to maximize the public interest.  But some current licensees, including over-the-air television broadcasters and the federal government itself, are resisting Chairman Genachowski’s efforts to keep the spectrum pipeline open and flowing.

So far, despite bold plans from the FCC for new unlicensed uses of TV “white spaces” and the passage early in 2012 of “incentive auction” legislation from Congress, almost no new spectrum has been made available for mobile consumers.  The last significant auction the agency conducted was in 2008, based on capacity freed up in the digital television transition.

The “shared” spectrum the agency has recently been touting would have to be shared with the Department of Defense and other federal agencies, which have so far stonewalled a 2010 Executive Order from President Obama to vacate their unused or underutilized allocations.  (The federal government is, by far, the largest holder of usable spectrum today, with as much as 60% of the total.)

And after over a year of on-going design, there is still no timetable for the incentive auctions.  Last week, FCC Commissioner Jessica Rosenworcel, speaking to the National Association of Broadcasters, urged her colleagues at least to pencil in some dates.  But even in the best-case scenario, it will be years before significant new spectrum comes online for mobile devices.  The statute gives the agency until 2022.

In the interim, the mobile revolution has been kept alive by creative use of secondary markets, where mobile providers have bought and sold existing licenses to optimize current allocations, and by mergers and acquisitions, which allow network operators to combine spectrum and towers to improve coverage and efficiency.  Many transactions have been approved, but others have not.  Efforts to reallocate or reassign underutilized satellite spectrum are languishing in regulatory limbo.  Local zoning bodies continue to slow or refuse permission for the installation of new equipment.  Delays are endemic.

So even as the FCC pursues its visionary long-term plan for spectrum reform, the agency must redouble efforts to encourage optimal use of existing resources.  The agency and the Department of Justice must accelerate review of secondary market transactions, and place the immediate needs of mobile users ahead of hypothetical competitive harms that have yet to emerge.

In conducting the incentive auctions, the agency needs to keep unrelated conditions and pet projects out of the mix, and qualified bidders must not be artificially limited to advance vague policy objectives that have previously spoiled some auctions and unnecessarily depressed prices on others.

Let’s hope today’s oversight hearing will hold Chairman Genachowski to his promise to “[keep] discussions focused on solving problems, and on facts and data….so that innovation, private investment and jobs follow.”  We badly need all three.

(A condensed version of this essay appears today in Roll Call.)

Disruptive Technologies and the Watchful Waiting Principle

When the smoke cleared and I found myself half caught-up on sleep, the information and sensory overload that was CES 2013 had ended.

There was a kind of split-personality to how I approached the event this year.  Monday through Wednesday was spent in conference tracks, most of all the excellent Innovation Policy Summit put together by the Consumer Electronics Association.  (Kudos again to Gary Shapiro, Michael Petricone and their team of logistics judo masters.)

The Summit has become an important annual event bringing together legislators, regulators, industry and advocates to help solidify the technology policy agenda for the coming year and, in this case, a new Congress.

I spent Thursday and Friday on the show floor, looking in particular for technologies that satisfy what I coined The Law of Disruption: social, political, and economic systems change incrementally, but technology changes exponentially.

What I found, as I wrote in a long post-mortem for Forbes, is that such technologies are well-represented at CES, but are mostly found at the edges of the show–literally.

In small booths away from the mega-displays of the TV, automotive, smartphone, and computer vendors, in hospitality suites in nearby hotels, or even in sponsored and spontaneous hackathons going on around town, I found ample evidence of a new breed of innovation and innovators, whose efforts may yield nothing today or even in a year, but which could become sudden, overnight market disrupters.

Increasingly, it’s one or the other, which is saying something all by itself.  For one thing, how do incumbents compete with such all-or-nothing innovations?

That, however, is a subject for another day.

For now, consider again the policy implications of such dramatic transformations.  As those of us sitting in room N254 debated the finer points of software patents, IP transition, copyright reform, and the misapplication of antitrust law to fast-changing technology industries (increasingly, that means ALL industries), just a few feet away the real world was changing under our feet.

The policy conference was notably tranquil this year, without such previous hot-button topics as net neutrality, SOPA, or the lack of progress on spectrum reform to generate antagonism among the participants.  But as I wrote at the conclusion of last year’s Summit, at CES, the only law that really matters is Moore’s Law.  Technology gets faster, smaller, and cheaper, not just predictably but exponentially.

As a result, the contrast between what the regulators talk about and what the innovators do gets more dramatic every year, accentuating the figurative if not the literal distance between the policy Summit and the show floor.  I felt as if I had moved between two worlds, one that follows a dainty 19th century wind-up clock and the other that marks time using the Pebble watch, a fully-connected new timepiece funded entirely by Kickstarter.

The lesson for policymakers is sobering, and largely ignored.  Humility, caution, and a Hippocratic-like oath of first-do-no-harm are, ironically, the most useful tools regulators have if, as they repeat at shorter intervals, their true goal is to spur innovation, create jobs, and rescue American entrepreneurialism.

The new wisdom is simple, deceptively so.  Don’t intervene unless and until it’s clear that there is demonstrable harm to consumers (not competitors), that there’s a remedy for the harm that doesn’t make things, if only unintentionally, worse, and that the next batch of innovations won’t solve the problem more quickly and cheaply.

Or, as they say to new interns in the Emergency Room, “Don’t just do something.  Stand there.”

That’s a hard lesson to learn for those of us who think we’re actually surgical policy geniuses, only to find increasingly we’re working with blood-letting and leeches.  And no anesthesia.

In some ways, it’s the opposite of an approach that Adam Thierer calls the Technology Precautionary Principle.  Instead of panicking when new technologies raise new (but likely transient) issues, first try to let Moore’s Law sort it out, unless and until it becomes crystal clear that it can’t.  Instead of a hasty response, opt for a delayed response.  Call it the Watchful Waiting Principle.

Not as much fun as fuming, ranting, and regulating at the first sign of chaos, of course, but far more helpful.

That, in any case, is the thread running through my dispatches from Vegas:

  1. “Telcos Race Toward an all-IP Future,” CNET
  2. “At CES, Companies Large and Small Bash Broken Patent System,” Forbes
  3. “FCC, Stakeholders Align on Communications Policy—For Now,” CNET
  4. “The Five Most Disruptive Technologies at CES 2013,” Forbes