Virulent Word of Mouse

May 26, 2014

Did the Internet Prevent all Invention from Moving to one Place?

The diffusion of the internet has had varying effects on the location of economic activity, leading to both increases and decreases in geographic concentration. In an invited column at VoxEU, Chris Forman, Avi Goldfarb, and I present evidence that the internet worked against increasing concentration in invention. This relationship is particularly strong for inventions with more than one inventor, and when inventors live in different cities. Check out the post here.



April 23, 2014

The Fault Lines Along Fast Lanes

Until recently, a fast lane from a broadband ISP was a remote possibility in the US. ISPs had to give data equal treatment, regardless of the source, and could not offer faster delivery for a higher price while giving slower service as a default.

Although fast lanes were allowed by regulators a few years ago in wireless networks, carriers hesitated to offer them. In December 2013, AT&T Wireless broke with the norm and announced just such a program. FCC regulations forbidding fast lanes at landline broadband ISPs had also prevented them, but in January 2014 a US appeals court struck down those regulations.

Is that a good or bad trend? The answer depends on who’s talking. Critics of government regulation despise the rules forbidding fast lanes, whereas net neutrality supporters view the presence of fast lanes as a nightmare come to life.

Legal and political aspects of this topic typically get most of the attention, as do the implications for the variety of speech online. Most reporters find these aspects interesting, and understand them. However, the economics of fast lanes receives less attention. That is a surprise, because the economics is not very difficult, and it’s worth understanding. It illuminates the fault lines between many different points of view.

Mirrors and servers

The public Internet has evolved considerably since the days when the design for packet networks presumed that the message did not have to arrive at an inbox immediately. Users today prefer and expect speedier services. That goes for more than just IP telephony and video chat, where users notice the smallest delay. It also holds true for video, such as YouTube and many online games. Many providers believe it also affects the bottom line—namely, that users switch services if they do not get fast delivery of data.

Long before fast lanes became a real possibility, many participants in the Internet made investments aimed at reducing delays. For example, for some time now, Akamai has sold a well-known approach to improving speed. Its service also defines the first fault line, so this is a good place to start the discussion. Opponents of net neutrality ask why Akamai can operate a business to speed up data delivery but a carrier cannot.

Akamai’s service places servers inside ISPs, closer to households. Any seriously large Internet content firm must buy these services; it is considered a cost of doing business online. Many ISPs like working with Akamai, because their customers experience better service without much investment from the ISP.

That is not the only method for speeding up data. For example, Google has bypassed Akamai’s charges in many locations by building its own data network to ISPs. Netflix has recently sought to do the same, though it is not quite done (because it has not successfully negotiated a presence with every US ISP). Any gathering of more than three Internet engineers will generate discussion of even more potential solutions in the cloud. Amazon built a content delivery network with enormous geographic range. Microsoft has similar investments and aspirations, as does IBM. The list goes on.

That leads to the deeper question. The last few years have witnessed robust experimentation among distinct approaches to functional improvement, and these might be either complements to, or substitutes for, each other. Accordingly, carriers have had two roles. They act as a firm whose users benefit from faster delivery, and they act as a supplier that could choose to cooperate—or refuse to cooperate—with solutions offered by others.

When a carrier had no investments in fast lanes, it had every reason to cooperate with solutions offered by others. Will that change if the carrier has its own fast lane?

The answer defines a fault line between points of view. Some observers dismiss this as a possibility that might never arise. They want a regulatory response only when a problem emerges, and otherwise they expect a regulator to err. Net neutrality supporters, in contrast, think regulators have an obligation to protect the Internet. They worry that introducing fast lanes messes with a system that already works well, and they do not trust carriers to cooperate with solutions that might substitute for a fast lane business or threaten an investment in some way.

Competition and monopoly

The next fault line has to do with the role of money. Defenders of fast lanes expect them to become a cost of doing business for content firms, and forecast that fast lanes will be profitable and generate more investment. Opponents have the same forecast about profitability, but a different interpretation. They worry that fast lanes will lead to an Internet where only rich firms can deliver their content effectively.

This concern tends to get plenty of press, and a few rhetorical questions illuminate the fault line. Will the default speeds offered by ISPs be good enough for startups or for small specialty websites? One side believes that the defaults will be good enough, whereas the other believes that fast lanes will lead ISPs to neglect investing in their slow services.

One’s point of view about the state of competition among ISPs plays a big role in interpreting the role of money. Some believe a competitive ISP market would melt away most problems. Others argue that the belief in competitive ISP markets is a fantasy that masks many dangers.

The belief in competition is not a belief in magic, so it is worth examining. It is, rather, a belief that competition is a painful, disciplining process: in competitive markets, customers substitute into alternatives if they do not like what a supplier does, and suppliers hesitate to do things that make their users angry. In other words, ISPs would compete for customers by offering better fast lanes. In this view, users would get angry if they perceived that carriers were slowing down content from firms they cared about, and angry users would find another carrier.

Where is the fault line? It lies in the two key factors that make ideal competitive markets operate well: transparency and the availability of many options for users.

Just about everybody is in favor of transparency, but not everybody favors rules that require more of it. Those with faith in competitive processes tend to see merit in nothing more than a few light-handed requirements, such as programs to facilitate measuring the speed of different ISPs. The other side asks for much more, such as the publication of all fast lane contracts (more on that later).

As for the second factor, the availability of options, consider the key open question: Do users have many options available to them, or do they face de facto monopoly ISP markets? Once again, there are different beliefs about the preponderance of competition and monopoly across US locales. Those who presume that competition is inadequate have little sympathy for leaving ISPs alone, whereas those who presume it is adequate see no need to intervene.

That also leads to different interpretations of how lucrative fast lanes will be. Supporters of fast lanes say that ISPs should charge whatever the market will bear, and that competition will discipline pricing. Opponents say that the monopolies emerged from grants of public franchises and the use of public rights of way, and they characterize high prices as misuse of utility franchises.

A classic debate about government merger policy also arises. Net neutrality supporters argue that fast lanes give ISPs artificial incentives to consolidate in order to increase their bargaining leverage with content providers, thus concentrating economic power in ISPs. Net neutrality opponents do not see anything wrong with large ISPs. In a competitive market, size is irrelevant.

Mixed incentives

The foregoing leads into the last fault line in discussions about fast lanes—namely, views about mixed incentives at carriers. A mixed incentive arises when a carrier distributes a service that substitutes for one available on the public Internet.

Many broadband ISPs have a thriving broadband business, also provide video on demand, and make a pretty good margin on both services. Will most cable firms want to sell a fast lane service to Netflix at a low price? If a carrier did not make money on video on demand, its price for a fast lane for Netflix would be lower, and the same goes for fast lanes for entrepreneurial firms offering video services. That also begins to suggest the intuition behind the concern that cable firms will tilt their other actions against online video to protect their existing businesses.
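
To make that intuition concrete, here is a minimal numerical sketch of the opportunity-cost logic, written in Python. The function and every number in it are hypothetical illustrations added for this post, not actual carrier costs or margins.

    # Hypothetical sketch: the lowest fast-lane fee (per subscriber, per month)
    # a carrier would accept from a rival video service is its delivery cost
    # plus the video-on-demand margin it expects to lose to that rival.

    def min_fast_lane_fee(delivery_cost, vod_margin, share_cannibalized):
        """All arguments are per subscriber per month; share_cannibalized is the
        fraction of subscribers expected to drop the carrier's own video on demand."""
        return delivery_cost + vod_margin * share_cannibalized

    # A carrier with no video-on-demand business has nothing to protect.
    print(min_fast_lane_fee(delivery_cost=1.00, vod_margin=0.00, share_cannibalized=0.0))   # 1.0

    # A carrier earning a $10 monthly video-on-demand margin, expecting 20% of
    # subscribers to switch to the rival, needs a higher fee just to break even.
    print(min_fast_lane_fee(delivery_cost=1.00, vod_margin=10.00, share_cannibalized=0.2))  # 3.0

The gap between those two break-even fees is the mixed incentive: the second carrier rationally charges more, or cooperates less, before any strategic behavior enters the picture.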

Mixed incentives also come up in discussions about scrutinizing carrier contracting practices. To put this fault line in perspective, consider a hypothetical scenario: What would happen after a carrier sells a fast lane to, say, ESPN? Can anyone else expect the same terms, even Netflix? Yet again, one side argues that competition will solve these issues, and the other sees a need for regulatory intervention to make terms of fast lane contracts public.

A mixed incentive can also emerge when a carrier has an economic incentive to protect a partner’s business in which it gets a cut. Is it okay, for example, if ESPN gets a better deal than Fox Sports because an ISP made a deal with a local team that competes with something Fox Sports offers? The same fault line appears: should competition solve this question, or should governments intervene to make fast lane contracts public? Should ISPs be required to give the same terms to all takers?

To summarize, the fault lines between perspectives hinge crucially on several beliefs about the economics. Forecasts depend on whether the observer sees a preponderance of competitive or monopoly markets for ISP services. They also depend on whether transparency resolves potential problems.


Copyright held by IEEE. To view the original, see here.


March 22, 2014

USPTO public hearing on Attributable Ownership.

Filed under: Announcements — Shane Greenstein @ 12:12 pm

Attributable Ownership Public Hearing in San Francisco on March 26, 2014: Testimony and Written Comments Invited

The USPTO announces a public hearing on Wednesday, March 26, 2014 at U.C. Hastings College of Law in San Francisco from 9 a.m. until noon to receive feedback about proposed rules concerning the ownership of patents and applications (aka “attributable ownership proposed rules”). The public is invited to attend the hearing in person or via Webcast. Additionally, the public is invited to give testimony in person at the hearing and/or to submit written comments about the proposed rules.

To request to give testimony, please send an email to: aohearingrequest@uspto.gov. To submit written comments, please email: AC90.comments@uspto.gov.

The attributable ownership proposed rules require that the attributable owner, including the ultimate parent entity, be identified during the pendency of a patent application and at specified times during the life of a patent. The goal of the proposed rules is to increase the transparency of patent ownership rights. More details about the attributable ownership proposed rules are available here: http://www.gpo.gov/fdsys/pkg/FR-2014-01-24/pdf/2014-01195.pdf

Hearing Logistics:
• Wednesday, March 26, 2014, from 9 a.m. until noon (PT)
U.C. Hastings College of the Law
Louis B. Mayer Lounge
198 McAllister Street
San Francisco, CA 94102

LiveStream Access Information:
https://new.livestream.com/uspto/usptopublichearing
An agenda for the hearing is available here: http://www.uspto.gov/patents/init_events/ao_agenda_san_francisco_3-26-2014.pdf


March 11, 2014

Podcast about bias and slant on Wikipedia

Filed under: Academic Research — Shane Greenstein @ 9:13 pm

The website Surprisingly Free organized a podcast about my recent paper, Collective Intelligence and Neutral Point of View: The Case of Wikipedia, coauthored with Harvard assistant professor Feng Zhu. Click here.

The paper takes a look at whether Linus’ Law applies to Wikipedia articles. Do Wikipedia articles have a slant or bias? If so, how can we measure it? And, do articles become less biased over time, as more contributors become involved?

Jerry Brito conducts the interview, which is sponsored by the Mercatus Center at George Mason University. In the podcast we discuss the findings of the research.

Click here.


March 7, 2014

The Irony of Public Funding

Misunderstandings and misstatements perennially pervade any debate about public funding of research and development. That must be so for any topic involving public money, almost by definition, but arguments about funding for scientific research and development contain a special irony.

Well-functioning government funding is, by definition, difficult to assess because of two criteria common to R&D subsidies in virtually all Western governments: governments seek to fund activities yielding large benefits, and those activities should be ones the private sector would not otherwise undertake.

The first criterion leads government funders to avoid scientific research with low rates of return. That sounds good because it avoids wasting money. Combining it with the second criterion, however, does some funny things. If private firms fund only the scientific R&D whose rate of return can be measured precisely, then government funding tends to flow to activities whose returns are imprecisely measured.

That is the irony of government funding of science. Governments tend to fund scientific research in precisely the areas where the returns are believed to be high, but where there is little data to confirm or refute the belief.

This month’s column illustrates the point with a small example: the server software Apache. As explained in a prior column (“How Much Apache?”), Apache was born and invented with government funding. Today it is rather large and taken for granted. But how valuable is it? What was the rate of return on this publicly funded invention? It has been difficult to measure.

(more…)

January 30, 2014

Google and Motorola in the Wake of Nortel

Google has announced a plan to sell Motorola to Lenovo for just under three billion dollars. Google paid more than twelve billion only two years ago, and many commentators have declared that this is Larry Page’s first big bet, and potentially his first big experiment to go sour.

Even the best reporters characterize the strategy incorrectly, however, and forget the motivation. They recognize that the acquisition had several motives, but still use wishy-washy language to discuss the priorities. Here is the language of the New York Times, for example:

“The deal is not a total financial loss for the extremely wealthy Google, which retains patents worth billions of dollars, but it is a sign of the fits and starts the company is experiencing as it navigates business in the mobile age, which has upended technology companies of all types.

In addition to using Motorola’s patents to defend itself in the mobile patent wars, Google pledged to reinvent mobile hardware with Motorola’s new phones, and directly compete with Apple by owning both mobile hardware and software.”

I have a bone to pick here. Even the best reporters are not recalling the sequence of events. Public policy shares some of the blame, and viewed from that perspective, much of this looks like a waste of resources. Let’s get that interpretation on the table by doing a bit of a flashback, shall we? (more…)

January 12, 2014

How Much Apache?

Filed under: Academic Research,Essays,Internet economics — Shane Greenstein @ 4:48 pm

With almost inexorable momentum, the Internet hurls itself into new territory. Some time ago, more than two billion humans had adopted at least one Internet-enabled device in some form, and nobody doubts that another two billion will follow soon. New webpages increasingly find ways to inform readers, as more information in a variety of formats continues to be layered on the basic system of data internetworking.

That growth has been measured in a variety of dimensions. Today I would like to report on some research to measure one aspect of the Web’s growth, which I did with Frank Nagle, a doctoral student at Harvard Business School. We sought to figure out how much Apache served web surfers in the United States.

That is not a misprint. Apache is the name for the most popular webserver in the world. It is believed to be the second most popular open source project after Linux.
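
Measuring that is, at bottom, a protocol exercise: a web server usually announces what software it runs in the Server header of its HTTP responses. The sketch below, in Python, only illustrates that general idea with a couple of placeholder sites; it is not the procedure Frank and I used. A real measurement also has to deal with sampling, proxies, CDNs, and servers that hide or spoof the header.

    # Illustrative sketch only: read the HTTP "Server" response header to see
    # what web server software a site reports running.
    import urllib.request

    def server_software(url):
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.headers.get("Server", "unknown")

    # Placeholder sites for illustration; a real measurement needs a large,
    # representative sample of the sites people actually visit.
    for site in ["https://www.example.com", "https://www.apache.org"]:
        try:
            print(site, "->", server_software(site))
        except Exception as exc:
            print(site, "-> error:", exc)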

Why do this? Measuring Apache is a key step in understanding the underlying economics. Because it’s free, Apache’s value is easy to mismeasure, and that makes its economics easy to misunderstand. (more…)

December 31, 2013

End the broadband panic meme

Filed under: Editorial,Internet economics and communications policy — Shane Greenstein @ 9:22 am


It happens about every twelve months, maybe more frequently lately. Another reporter writes about how the US is falling behind international rivals in the supply of broadband. I am growing very tired of this meme, and of answering emails from friends wondering whether it is so. There are serious issues to debate, but this standard meme takes attention away from them.


The latest version came from the New York Times, under the title “US Struggling to Keep Pace in Broadband Service.” It brought out the usual concern that US growth will fall behind if the US does not have the fastest broadband in the world. If you are curious, read this.


Why is this tiring? Let me count the ways.


First, while it is irritating to have slow service at home, US productivity does not depend much on that. Household broadband matters less for economic growth than broadband to businesses, and speed to businesses is what really matters for productivity. The number of minutes it takes a household to download a Netflix movie is statistically irrelevant for productivity growth compared with the time it takes to download the information needed to conduct business transactions with employees, suppliers, and customers. We get measures of broadband speed to homes because that is what we can easily measure, not because it really matters.


Is there any sense in which US business Internet is too slow? Well, perhaps the speed of a household’s Internet says something about the speed of business Internet, but I doubt it. In all the major cities of the US there is no crisis at all in the provision of broadband. Broadband speeds in downtown Manhattan and on Wall Street are extraordinary. The Silicon Valley firms that need fast speeds can get them; same with the firms in Seattle. Hey, the experiments with Google Fiber in Kansas City raise questions about whether entrepreneurship will follow the installation of super-high speeds, but that remains an open and interesting question, not a crisis.


These issues do arise, however, in some small and medium-sized cities in the US, and in a few rural areas where there is no broadband. In some places satellite or fixed wireless is the best option available. These can be okay, but not great, for many business needs, and they can limit what a business can do. These issues have also been present for a while, so most of the businesses that really needed the speed simply left the areas where speeds were slow. As a country we just let that happen many years ago, and, frankly, it will be hard to reverse at this point. (It made me sad at the time; I even spent some time doing research on the topic, though I have stopped in the last few years.) Again, this is an interesting question, but it is a crisis only in the places where it matters, not at the national level.


Second, as for household speeds, many people simply don’t want the highest speeds and do not want to pay for them. There is plenty of evidence that those high-speed Korean lines did not get used right away, and lots of fiber goes to waste. Having said that, there are some interesting open questions here as well, namely, what speeds are people willing to pay for at home? Let’s not get panicked over supply if there is little demand, ok?

The last serious study of the willingness to pay for speed was done at the end of 2009, as part of the National Broadband Plan. The study was definitive at the time: only a few households were willing to pay for high speeds. But, of course, that was a while ago. What has changed since then? Well, arguably, demand for data-intensive stuff has risen. That growth is not coming from torrent traffic; recent data are pretty clear about that. It is coming from Netflix, YouTube, and Facebook. Once again, that is a great open question, but panic about speed does nothing to focus on it. Instead, let’s study demand and whether it goes unsatisfied.


Third, if we study demand, can we all acknowledge that demand is very skewed in the US? The top 10% of users account for far more than 50% of household data traffic, and on most systems the top 20% account for more than 80% of data use. Use is growing everywhere from the median of that distribution to its top, so there is good reason to think demand for data is growing among all major users. Will there be capacity to handle those intensive users of data? The answer is unclear.


That hints at an open question worth debating. Because flat-rate pricing has been so common across the US, not everyone pays the same price per unit of data: the top 10% of users pay very low prices per megabit. Even if total expenditure per month for the biggest users is twice as high in the US as in other countries, it is still pretty cheap per megabit. Just to be clear, I am not saying prices are too high or too low, nor am I making any comment about whether US markets are competitive enough. I am just saying that the international comparisons are flawed for big users in the US.
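
To see why flat-rate pricing muddies per-unit comparisons, here is a small arithmetic sketch in Python. The plan prices and usage figures are invented for illustration only; they are not estimates of actual US or foreign prices.

    # Hypothetical flat-rate plans: the monthly fee does not vary with use, so
    # the effective price per gigabyte falls sharply for heavy users.

    def price_per_gb(monthly_fee, gb_used):
        return monthly_fee / gb_used

    light = price_per_gb(monthly_fee=40.0, gb_used=10)    # $4.00 per GB
    heavy = price_per_gb(monthly_fee=60.0, gb_used=300)   # $0.20 per GB
    print("light user: $%.2f/GB  heavy user: $%.2f/GB" % (light, heavy))

    # Even if the heavy user's bill were doubled, the effective price would be
    # $0.40 per GB, still a tenth of the light user's rate -- which is why
    # country-level comparisons of list prices say little about what the most
    # intensive US users actually pay per unit of data.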


That hints at an even more challenging question. For better or worse, it is these high-intensity users, especially households with young adults or teenagers, who seem to be the early users of new services. So the US entrepreneurial edge might actually be coming from the low prices and high speeds our biggest users have enjoyed all these years. Are we in danger of ending that? That is the provocative question to ask, and it is not about general speeds across the country. It is about the highest speeds to select users.


Finally, my last problem with this meme: it is old, tired, and potentially irrelevant. Maybe this concern about wireline broadband is all a tempest in a teapot. Many observers believe wireless is the new frontier for innovative applications. Maybe five years from now everybody will look back on this panic and just shake their heads. How can we have an entire article about broadband speeds to households without a peep about the experience most people have on a daily basis, which is determined by wireless speeds?


Just something to think about.



December 18, 2013

Top Dozen Tech Events of 2013

Filed under: Amusing diversions,Computer and Internet Humor,We call it life — Shane Greenstein @ 10:48 pm

It is time to look back, and give some awards for the best events in information and communications technology. And what a year it was — with Snowden, Healthcare IT, the Twitter IPO, and plenty of other events deserving both recognition and sarcastic observation.

Just like last year, there are four criteria for winning. The winner had to do something in the calendar year. The action had to involve information and communications technology. It had to be notable. That is not asking much, so the final feature is the most important: The award winner has to contain something that deserves a snarky remark or a bit of sarcasm. Like last year, every winner gets a virtual trophy called a “Sally,” affectionately named for Sally Field. Why her? Because she memorably said, “You like me, you really like me.” That label is meant to convey a simple message: none of this should be taken too seriously.

Here are a dozen. If you disagree with my choices for awards, feel free to suggest your own in the comments section. Let’s get to it. (more…)

October 29, 2013

William C. Lowe

Filed under: biography,We call it life — Shane Greenstein @ 8:37 am

All students of the computer industry have heard of Bill Lowe, the leader of the IBM Boca Raton facility that launched the IBM PC. That launch was a signal event in computing, and it catalyzed growth in the small systems market.

I had the great pleasure of interviewing Bill a few years ago for a research project comparing the responses of large firms to external events (Tim Bresnahan, Shane Greenstein, and Rebecca Henderson, 2012, “Schumpeterian Economies and Diseconomies of Scope: Illustrations from the Histories of IBM and Microsoft,” in The Rate and Direction of Technical Change, 50 Year Anniversary, edited by Josh Lerner and Scott Stern, University of Chicago Press, pp. 203-276).

The news this morning announced Bill Lowe’s passing. I am greatly saddened, and my sympathies go to his many friends and family. He had a unique role in computing history. In this post I would like to share a few memories of those interviews.

******************
(more…)
