Category Archives: Information Economics

Larry Appears on C-SPAN's "The Communicators"


Larry is the guest this week on C-SPAN’s long-running program “The Communicators,” talking with host Peter Slen about the research behind “Big Bang Disruption.”  The program covers a wide range of topics, including the impact of exponential technologies, the fate of incumbents, and the role of regulators in impeding or encouraging disruptive innovation.

How Should Governments Approach Technology Policy? Modestly.

Earlier this week, I spoke in San Francisco at an event co-sponsored by the Reason Foundation, TechFreedom, and the Koch Institute.  Hosted by my CNET colleague Declan McCullagh, the topic was “DC vs. SF:  A Clash of Visions for Tech Policy.”

The discussion ranged widely, from copyrights and patents to NSA surveillance to the failure of the government’s Healthcare.gov website.  Although panelists from the political left and right disagreed on some issues, there was, as usual, widespread consensus that from the standpoint of entrepreneurs and engineers, the core problem in technology policy is that the pace of innovation continues to accelerate while the pace of government, at best, remains constant.  Moore’s Law increasingly trumps legislated law, often with unintended negative consequences.
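The mismatch is easy to make concrete with a toy calculation (my illustration, not from the panel): if technological capability doubles on a fixed cycle, as Moore's Law suggests, while regulatory capacity grows by at most a constant increment each year, the gap between them compounds.

```python
# Toy illustration (my assumptions, not from the post): exponential growth in
# technology vs. linear growth in regulatory capacity, both starting at 1.0.
def tech_capability(years, doubling_period=2.0):
    """Exponential growth: doubles every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

def regulatory_capacity(years, annual_gain=1.0):
    """Linear growth: fixed additive gain per year."""
    return 1.0 + annual_gain * years

for year in (2, 6, 10, 14):
    ratio = tech_capability(year) / regulatory_capacity(year)
    print(f"year {year:2d}: tech x{tech_capability(year):7.1f}, "
          f"regulation x{regulatory_capacity(year):4.1f}, ratio {ratio:5.1f}")
```

Early on the linear process keeps pace, but within roughly a decade the exponential curve is an order of magnitude ahead and pulling away, which is the dynamic the panel's consensus describes.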

At the same time, as I emphasized, the shift in commerce, social life, civic life and nearly everything else from offline to online media means that future collisions at the intersection of innovation and regulation are inevitable and, indeed, certain to increase in both frequency and the degree of collateral damage.  Governments claim a monopoly on regulating much of this activity, after all, and, like any institution that believes in its own mission, they are unlikely to let themselves go quietly out of business as their markets change.

Governments rely for revenue on taxes. As more traditionally taxable activity migrates online, lawmakers are certain to follow. That’s been true in the development of any frontier, physical or virtual.

The longstanding Silicon Valley approach of ignoring Washington in hopes lawmakers won’t notice what we’re doing was always a dangerous policy, and has now become downright reckless.  So how should innovators engage with regulators?

Watch the video below to find out!

DC v. SF: A Clash of Visions for Tech Policy from Charles Koch Institute on FORA.tv

The return of Net Neutrality

Over on Forbes today, I have a very long post inspired by Monday’s oral arguments in Verizon’s challenge of the FCC’s Open Internet rules, passed in 2010.

I say “inspired” because the post has nothing to say about the oral arguments, which, in any case, I did not attend.  Mainstream journalists can’t resist the temptation to read into the questions asked or the mood of the judges some indication of how the decision will come out.

But as anyone who has ever worked in a court or followed appellate practice well knows, the tone of oral arguments signals nothing about a judge’s point of view.  Often, the harshest questioning is reserved for the side a judge is leaning toward supporting, perhaps because the briefs filed were inadequate.  Bad briefs create more work for the judge and her clerks.

I use the occasion of the hearing to take a fresh look at the net neutrality “debate,” which has been ongoing since at least 2005, when I first started paying attention to it.  In particular, I try to disentangle the political term “net neutrality” (undefined and, indeed, not even used in the 2010 Open Internet order) from the engineering principles of packet routing.

According to advocates for government regulation of broadband access, the political argument for net neutrality regulation is simply a codification of the Internet’s design.  But regardless of whether it would even make sense to transform the FCC into the governing body of engineering protocols for the network (the Internet Society and its engineering task forces are and always have been doing a fine job, thanks very much), the reality is that the political argument has almost nothing to do with the underlying engineering.

Indeed, those most strongly advocating for more government regulation either don’t understand the engineering or intentionally mischaracterize it, or both.  That’s clear from the wide range of supposed competitive problems that have been lumped together under the banner of “net neutrality” issues over the years–almost none of which have anything to do with packet routing.

Fortunately, very little of the larger political agenda of the loose coalition of net neutrality advocates is reflected in the rules ultimately passed by a bare majority of the FCC in 2010.  Even so, those rules, limited as they were, face many challenges.

For one thing, the FCC, despite over a year of dedicated attention to the problem, could identify only four incidents that suggested any kind of market failure, and only one of which (the Comcast-BitTorrent incident) was ever actually considered in detail by the Commission.  (Two of the others never even rose to the level of a complaint.)  The agency was left to regulate on the basis of “preserving” the Open Internet through what it called (nearly a dozen times) “prophylactic” rules.

Second, and of particular interest in the D.C. Circuit proceeding, Congress has never authorized the FCC to issue rules dealing with broadband Internet access.  Though many authorizing bills have circulated over the years, none has ever made it out of committee.  With no legal basis to regulate, the agency was left pointing to irrelevant provisions of the existing Communications Act–most of which were already rejected by the same court in the Comcast case.  Nothing in the law has changed since Comcast, and on that basis, regardless of the merits of Internet regulation, the FCC is very likely to lose, which the Commission surely knew when it passed the rules in 2010.

The piece ends by describing, as I did in my testimony before the House Judiciary Committee in early 2011, how the Report and Order betrays the reality that, from an engineering standpoint, even the supposed neutrality of packet routing is largely a sentimental myth.  The FCC identified and exempted a dozen network management technologies, practices, and protocols that it acknowledged do not follow the neutrality principle, but which are essential to effective and efficient management of the network.  There is no “neutral” Internet to preserve, and never was.

The agency was right to exempt these practices.  But the problem with the rules as written is that they could not and did not extend to future innovations that new applications and new users will certainly make as essential as today’s management techniques.

If the rules stand, network engineers, application developers, device makers and others in the vibrant, dynamic Internet ecosystem will be forced to seek permission to innovate from the FCC, which will both slow the high-speed world of Internet design to a crawl and introduce a decision maker with no technical expertise and lots of political baggage.

That of course was the kind of counter-productive and unnecessary regulatory intrusion that Internet users successfully rose up against last year, when the UN’s International Telecommunication Union threatened to assert itself in basic Internet governance, or the year before that, when Congress, without technical understanding of the most basic variety, tried to re-architect the Internet on behalf of media companies in the failed SOPA and PIPA legislation.

If the FCC gains a foothold in broadband access with the Open Internet rules or other efforts to gain oversight where Congress has delegated none, expect a similar reaction.  Or, in any case, hope for one.

Big Bang Disruption vs. Conventional Wisdom

Last month, Paul F. Nunes and I hosted an audio webinar for Harvard Business Review on “Big Bang Disruption,” our article from the March 2013 issue of the magazine.  An archive of the webinar has now been posted, including the PowerPoint slides.

We had a lively discussion with the audience, who posted some terrific questions and comments during the course of the session.

What companies, products, start-ups, and industries do you see being transformed by this new kind of disruptive innovation?  We’d like to know.

Where to next for the FCC?


Tuesday was a big day for the FCC.  The Senate Commerce, Science and Transportation Committee held an oversight hearing with all five Commissioners, the same day that reply comments were due on the design of eventual “incentive auctions” for over-the-air broadcast spectrum.  And the proposed merger of T-Mobile USA and MetroPCS was approved.

All this activity reflects the stark reality that the Commission stands at a crossroads.  As once-separate wired and wireless communications networks for voice, video, and data converge on the single IP standard, and as mobile users continue to demonstrate insatiable demand for bandwidth for new apps, the FCC can serve as midwife in the transition to next-generation networks.  Or, the agency can put on the blinkers and mechanically apply rules and regulations designed for a bygone era.

FCC Chairman Julius Genachowski, for one, believes the agency is clearly on the side of the future.  In an op-ed last week in the Wall Street Journal, the Chairman took justifiable pride in the focus his agency has demonstrated in advancing America’s broadband advantage, particularly for mobile users.

Mobile broadband has clearly been a bright spot in an otherwise bleak economy.  Network providers and their investors, according to the FCC’s most recent analysis, have spent over a trillion dollars since 1996 building next-generation mobile networks, today based on 4G LTE technology.

These investments are essential for high-bandwidth smartphones and tablet devices and the remarkable ecosystem of voice, video, and data apps they have enabled.  This platform for disruptive innovation has powered a level of “creative destruction” that would do Joseph Schumpeter proud.

Mobile disruptors, however, are entirely dependent on the continued availability of new radio spectrum.  In the first five years following the 2007 introduction of the iPhone, mobile data traffic increased 20,000%.  No surprise, then, that the FCC’s 2010 National Broadband Plan conservatively estimated that mobile consumers desperately needed an additional 300 MHz of spectrum by 2015 and 500 MHz by 2020.
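To put that growth figure in perspective, a 20,000% increase means traffic ended the period at 201 times its starting level.  A quick back-of-the-envelope sketch (mine, using only the figures in the paragraph above) gives the implied compound annual growth rate over those five years:

```python
# Back-of-the-envelope from the figures above: a 20,000% increase over the
# five years after 2007 means traffic grew to 201x its starting level.
increase_pct = 20_000
years = 5
growth_factor = 1 + increase_pct / 100      # 201x total growth
cagr = growth_factor ** (1 / years) - 1     # implied compound annual rate
print(f"total growth: {growth_factor:.0f}x")
print(f"implied annual growth rate: {cagr:.0%}")
```

The implied rate works out to roughly 189% per year, i.e., traffic nearly tripling annually, which makes clear why the Broadband Plan's spectrum targets were, if anything, conservative.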

With nearly all usable spectrum long since allocated, the Plan acknowledged the need for creative new strategies for repurposing existing allocations to maximize the public interest.  But some current licensees, including over-the-air television broadcasters and the federal government itself, are resisting Chairman Genachowski’s efforts to keep the spectrum pipeline open and flowing.

So far, despite bold plans from the FCC for new unlicensed uses of TV “white spaces” and the passage early in 2012 of “incentive auction” legislation from Congress, almost no new spectrum has been made available for mobile consumers.  The last significant auction the agency conducted was in 2008, based on capacity freed up in the digital television transition.

The “shared” spectrum the agency has recently been touting would have to be shared with the Department of Defense and other federal agencies, which have so far stonewalled a 2010 Executive Order from President Obama directing them to vacate their unused or underutilized allocations.  (The federal government is, by far, the largest holder of usable spectrum today, with as much as 60% of the total.)

And after over a year of ongoing design, there is still no timetable for the incentive auctions.  Last week, FCC Commissioner Jessica Rosenworcel, speaking to the National Association of Broadcasters, urged her colleagues at least to pencil in some dates.  But even in the best-case scenario, it will be years before significant new spectrum comes online for mobile devices.  The statute gives the agency until 2022.

In the interim, the mobile revolution has been kept alive by creative use of secondary markets, where mobile providers have bought and sold existing licenses to optimize current allocations, and by mergers and acquisitions, which allow network operators to combine spectrum and towers to improve coverage and efficiency.  Many transactions have been approved, but others have not.  Efforts to reallocate or reassign underutilized satellite spectrum are languishing in regulatory limbo.  Local zoning bodies continue to slow or refuse permission for the installation of new equipment.  Delays are endemic.

So even as the FCC pursues its visionary long-term plan for spectrum reform, the agency must redouble efforts to encourage optimal use of existing resources.  The agency and the Department of Justice must accelerate review of secondary market transactions, and place the immediate needs of mobile users ahead of hypothetical competitive harms that have yet to emerge.

In conducting the incentive auctions, unrelated conditions and pet projects need to be kept out of the mix, and qualified bidders must not be artificially limited to advance vague policy objectives that have previously spoiled some auctions and unnecessarily depressed prices on others.

Let’s hope the oversight hearing will hold Chairman Genachowski to his promise to “[keep] discussions focused on solving problems, and on facts and data….so that innovation, private investment and jobs follow.”  We badly need all three.

(A condensed version of this essay appears today in Roll Call.)

Big Bang Launch of "Big Bang Disruption"–and a Note on Regulatory Implications

In the upcoming issue of Harvard Business Review, my colleague Paul Nunes at Accenture’s Institute for High Performance and I are publishing the first of many articles from an on-going research project on what we are calling “Big Bang Disruption.”

The project is looking at the emerging ecosystem for innovation based on disruptive technologies, following up on work we have done separately and now together over the last fifteen years.

Our chief finding is that the nature of innovation has changed dramatically, calling into question much of the conventional wisdom on business strategy and competition in information-intensive industries–which is to say, these days, every industry.

The drivers of this new ecosystem are ever-cheaper, faster, and smaller computing devices, cloud-based virtualization, crowdsourced financing, collaborative development and marketing, and the proliferation of mobile everything (including, increasingly, not just people but things).

The result is that new innovations now enter the market cheaper, better, and more customizable than the products and services they challenge.  (For example, smartphone-based navigation apps versus standalone GPS devices.)  In the strategy literature, such innovation would be characterized as thoroughly “undisciplined.”  It shouldn’t succeed.  But it does.

So when the disruptor arrives and takes off with a bang, often after a series of low-cost, failed experiments, incumbents have no time for a competitive response.  The old rules for dealing with disruptive technologies, most famously from the work of Harvard’s Clayton Christensen, have become counter-productive.  If incumbents haven’t learned to read the new tea leaves ahead of time, it’s game over.

The HBR article doesn’t go into much depth on the policy implications of this new innovation model, but the book we are now writing will.  The answer should be obvious.

This radical new model for product and service introduction underscores the robustness of market behaviors that quickly and efficiently correct many transient examples of dominance, especially in high-tech markets.

As a general rule (though obviously not one without exceptions), the big bang phenomenon further weakens the case for regulatory intervention.  Market dominance is sustainable for ever-shorter periods of time, with little opportunity for incumbents to exploit it.

A predictable next wave of technology will likely put a quick and definitive end to any “information empires” that have formed from the last generation of technologies.

Or, at the very least, do so more quickly and more cost-effectively than alternative solutions from regulation.  The law, to paraphrase Mark Twain, will still be putting its shoes on while the big bang disruptor spreads halfway around the world.

Unfortunately, much of the contemporary literature on competition policy from legal academics is woefully ignorant of even the conventional wisdom on strategy, not to mention the engineering realities of disruptive technologies already in the market.  Looking at markets solely through the lens of legal theory is, truly, an academic exercise, one with increasingly limited real-world applications.

Indeed, we can think of many examples where legacy regulation actually makes it harder for the incumbents to adapt as quickly as necessary in order to survive the explosive arrival of a big bang disruptor.  But that is a story for another day.

Much more to come.

Related links:

“Why Best Buy is Going out of Business…Gradually,” Forbes.com

“What Makes an Idea a Meme?,” Forbes.com

“The Five Most Disruptive Technologies at CES 2013,” Forbes.com