Archives for November 2007

Radiohead's Low Price Might Mean Higher Profit

Radiohead’s name-your-own-price sale of its new In Rainbows album has generated lots of commentary, especially since comScore released data claiming that 62% of customers set their price at zero, with the remaining 38% paying an average of $6, which works out to an average of $2.28 across all customers. (There are reasons to question these numbers, but let’s take them as roughly accurate for the sake of argument.)

Bill Rosenblatt bemoaned the low price, calling it a race to the bottom. Tim Lee responded by pointing out that Rosenblatt’s “race to the bottom” is just another name for price competition, which is hardly a sign of an unhealthy market. The music market is more competitive than before, and production costs are lower, so naturally prices will go down.

But there’s another basic economic point missing in this debate: Lower average price does not imply lower profit. Radiohead may well be making more money because the price is lower.

To see why this might be true, imagine that there are 10 customers willing to pay $10 for your album, 100 customers willing to pay only $2, and 1000 customers who will only listen if the price is zero. (For simplicity assume the cost of producing an extra copy is zero.) If you price the album at $10, you get ten buyers and make $100. If you price it at $2, you get 110 buyers and make $220. Lowering the price makes you more money.
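Here is a quick sketch of that arithmetic, using the hypothetical demand curve above (the numbers are illustrative, not real sales data):

```python
# Hypothetical demand curve from the example above:
# (willingness to pay, number of customers at that valuation)
demand = [(10, 10), (2, 100), (0, 1000)]

def revenue_at_fixed_price(price):
    """Revenue when everyone who values the album at >= price buys a copy."""
    buyers = sum(count for value, count in demand if value >= price)
    return price * buyers

for price in (10, 2):
    print(f"fixed price ${price}: revenue ${revenue_at_fixed_price(price)}")
# fixed price $10: revenue $100
# fixed price $2: revenue $220
```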

Or you can ask each customer to name their own price, with a minimum of $2. If all customers pay their own valuation, then you get $10 from 10 customers and $2 from 100 customers, for a total of $300. You get perfect price discrimination – each customer pays their own valuation – which extracts the maximum possible revenue from these 110 customers.

Of course, in real life some customers who value the album at $10 will name a price of $2, so your revenue won’t reach the full $300. But if even one customer pays more than $2, you’re still better off than you’d be with any fixed price. Your price discrimination is imperfect, but it’s still better than not discriminating at all.
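To see how robust this is, here is a small extension of the sketch above: suppose some fraction of the $10-value customers shade their bids down to the $2 minimum (the fraction is an assumption for illustration):

```python
def nyop_revenue(shading_fraction):
    """Name-your-own-price revenue with a $2 minimum, when a given
    fraction of the ten $10-value customers bid only $2 and the
    hundred $2-value customers bid honestly."""
    honest_tens = 10 * (1 - shading_fraction)
    shaded_tens = 10 * shading_fraction
    return honest_tens * 10 + (shaded_tens + 100) * 2

for f in (0.0, 0.5, 1.0):
    print(f"{f:.0%} shading: revenue ${nyop_revenue(f):.0f}")
# 0% shading: revenue $300
# 50% shading: revenue $260
# 100% shading: revenue $220 (no better than the best fixed price)
```

Any shading fraction below 100% leaves you ahead of the best fixed price, which is the point of the paragraph above.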

Now imagine that you can extract some nonzero amount of revenue from the customers who aren’t willing to pay at all, perhaps because listening will make them more likely to buy your next album or recommend it to their friends. If you keep the name-your-own-price deal, and remove the $2 minimum, then you’ll capture this value because customers can name a price of zero. Some of the $10-value or $2-value people might also name a price of zero, but if not too many do so you might be better off removing the minimum and capturing some value from every customer.

If customers are honest about their valuation, this last scenario is the most profitable – you make $300 immediately plus the indirect benefit from the zero-price listeners. Some pundits will be shocked and saddened that your revenue is only 27 cents per customer ($300 spread across all 1,110 listeners), and that 90% of your customers paid nothing at all. But you won’t care – you’ll be too busy counting your money.

Finally, note that none of this analysis depends on any assumptions about customers’ infringement options. Even if it were physically impossible to make infringing copies of the album, the analysis would still hold because it depends only on how badly customers want to hear your music and how likely they are to name a price close to their true valuation. Indeed, factoring in the possibility of infringement only strengthens the argument for lowering the average price.

By all accounts, Radiohead’s album is a musical and financial success. Sure, it’s a gimmick, but it could very well be a smart pricing strategy.

Verizon Violates Net Neutrality with DNS Deviations

While many of us were discussing Comcast’s partial blocking of BitTorrent traffic, and debating its implications for the net neutrality debate, a more clear-cut neutrality violation was apparently taking place on Verizon’s network: a redirection of Verizon customers’ failed DNS lookups, designed to drive traffic to Verizon’s own search engine.

Here’s the background. Suppose you’re browsing the web and you mistype an address – say you type “fredom-to-tinker”. Your browser will try to use DNS, the system that maps textual machine names to numeric IP addresses, to translate the name you typed into an address it can actually connect to across the Net. DNS will return an error, saying that the requested name doesn’t exist. Your browser (if it’s a recent version of IE or Firefox) will respond by doing a search for the text you typed, using your default search engine.

What Verizon did was change how DNS works (for its residential subscribers) so that when a customer’s computer looks up a DNS name that doesn’t exist, rather than returning the name-doesn’t-exist error, DNS says that the (non-existent) name maps to Verizon’s search site. This causes the browser to go to the Verizon search site, which shows the user search results (and ads) related to what they typed.

(This is the same trick used by VeriSign’s ill-fated SiteFinder service a few years ago.)
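For the technically inclined, here is a minimal Python sketch of what the difference looks like from a client’s perspective (the typo domain is made up; which branch runs depends on the resolver your network hands you):

```python
import socket

# A name that should not exist anywhere (hypothetical typo domain).
bogus_name = "fredom-to-tinker.example"

try:
    addr = socket.gethostbyname(bogus_name)
    # A standards-following resolver never reaches this line.
    print(f"lookup 'succeeded' with {addr} -- the resolver is rewriting errors")
except socket.gaierror:
    print("name not found -- the standard NXDOMAIN behavior")
```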

This is a clear violation of net neutrality: Verizon is interfering with the behavior of the DNS protocol, in order to drive traffic to its own search site. And unlike the Comcast scenario, which might possibly have been justifiable as legitimate network management, in this case Verizon cannot claim to be helping its network run more smoothly.

Verizon’s actions have two effects. The obvious effect is to drive traffic from the search engines users chose to Verizon’s own search engine. That harms users (by overriding their choices) and harms browser vendors (by degrading their users’ experiences).

The less obvious effect is to break some other applications. DNS lookups that have nothing to do with browsing will still be redirected, because the DNS infrastructure has no way of knowing which requests relate to browsing and which don’t. So if some other application does a DNS lookup and the result should be a not-found error, Verizon will cause the result to point to a Verizon server instead. If a non-browser program expects to see not-found errors sometimes and has a strategy for dealing with them, it won’t be able to carry out that strategy because it won’t see the errors it should be seeing. This will even cause browsers to misbehave in some circumstances.
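As a concrete illustration (the server names and fallback logic below are hypothetical, not any particular product’s), consider a client that probes a list of candidate hosts and relies on not-found errors to skip the ones that don’t exist:

```python
import socket

# Hypothetical candidate servers; only some are expected to exist.
CANDIDATES = ["srv1.corp.example", "srv2.corp.example", "backup.corp.example"]

def pick_server(candidates):
    """Return the first candidate that resolves, skipping NXDOMAINs."""
    for name in candidates:
        try:
            return name, socket.gethostbyname(name)
        except socket.gaierror:
            # Expected for non-existent names -- but with a redirecting
            # resolver this branch never runs, so the client "connects"
            # to the ISP's server instead of the right fallback host.
            continue
    raise RuntimeError("no candidate resolved")
```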

The effects of Verizon’s neutrality violation can be summarized simply: they interfere with a standard technical protocol; they cause harm on the whole, in part by breaking unrelated services; and they do this in order to override consumer choice by shifting traffic from consumer-chosen services to Verizon’s own services. This is pretty much the definition of a net neutrality violation.

This example contradicts at least two of the standard arguments against net neutrality regulation. First, it shows that violations do happen, and they do cause harm. Second, it shows that at least sometimes it’s easy to tell a harmful violation apart from legitimate network management.

But it doesn’t defeat all of the arguments against net neutrality regulation. Even though violations do occur, and do cause harm, it might turn out that the regulatory cure is worse than the disease.

How Can Government Improve Cyber-Security?

Wednesday was the kickoff meeting of the Commission on Cyber Security for the 44th Presidency, of which I am a member. The commission has thirty-four members and four co-chairs: Congressmen Jim Langevin and Michael McCaul, Admiral Bobby Inman, and Scott Charney. It was organized by the Center for Strategic and International Studies, a national security think tank in Washington. Our goal is to provide advice about cyber-security policy to the next presidential administration. Eventually we’ll produce a report with our findings and recommendations.

I won’t presume to speak for my fellow members, and it’s way too early to predict the contents of our final report. But the meeting got me thinking about what government can do to improve cyber-security. I’ll offer a few thoughts here.

One of the biggest challenges comes from the broad and porous border between government systems and private systems. Not only are government computers networked pervasively to privately owned computers, but government also relies heavily on off-the-shelf technologies whose characteristics are shaped by the market choices of private parties. While it’s important to better protect the more isolated, high-security government systems, real progress elsewhere will depend on ordinary technologies getting more secure.

Ordinary technologies are designed by the market, and the market is big and very hard to budge. I’ve written before about the market failures that cause security to be under-provided. The market, subject to these failures, controls what happens in private systems, and in practice also in ordinary government systems.

To put it another way, although our national cybersecurity strategy might be announced in Washington, our national cybersecurity practice will be defined in the average Silicon Valley cubicle. It’s hard to see what government can do to affect what happens in that cubicle. Indeed, I’d judge our policy as a success if we have any positive impact, no matter how small, in the cubicle.

I see three basic strategies for doing this. First, government can be a cheerleader, exhorting people to improve security, convening meetings to discuss and publicize best practices, and so on. This is cheap and easy, won’t do any harm, and might help a bit at the margin. Second, government can use its purchasing power. In practice this means deliberately overpaying for security, to boost demand for higher-security products. This might be expensive, and its effects will be limited because the majority of buyers will still be happy to pay less for less secure systems. Third, government can invest in human capital, trying to improve education in computer technology generally and computer security specifically, and supporting programs that train researchers and practitioners. This last strategy is slow but I’m convinced it can be effective.

I’m looking forward to working through these problems with my fellow commission members. And I’m eager to hear what you all think.