Last night I linked to an interview with Rupert Murdoch in which he says that News Corp will probably de-index their sites from Google.
I figured it was all bluster. Search engine traffic is more valuable than Murdoch suggests, and there are probably plenty of people in high places at News Corp who know it.
But Cory Doctorow suggests:
So here’s what I think is going on. Murdoch has no intention of shutting down search-engine traffic to his sites, but he’s still having lurid fantasies inspired by the momentary insanity that caused Google to pay him for the exclusive right to index MySpace (thus momentarily rendering MySpace a visionary business-move instead of a ten-minutes-behind-the-curve cash-dump).
So what he’s hoping is that a second-tier search engine like Bing or Ask (or, better yet, some search tool you’ve never heard of that just got $50MM in venture capital) will give him half a year’s operating budget in exchange for a competitive advantage over Google.
Jason Calacanis has suggested this approach as a means to “kill Google.”
But it may actually be the death of neither Google nor News Corp, even if they are foolish enough to carry out this plan. It could be the death of the robots exclusion standard. I would guess News Corp would use robots.txt to de-index their sites. But it’s a “purely advisory” protocol that Google is under no obligation to honor. Google could keep indexing News Corp’s sites if it chose to. So could every other search engine, big or small. And I’d guess they would if big content providers started going exclusive with search engines.
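For what it’s worth, the “advisory” part is visible in how a crawler actually uses robots.txt: compliance lives entirely in the crawler’s own code. Here’s a minimal Python sketch using the standard library’s urllib.robotparser; the wsj.com URL (a News Corp property) and the “MyCrawler” user agent are just illustrative.

```python
# Minimal sketch: robots.txt only works if the crawler bothers to check it.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.wsj.com/robots.txt")  # example News Corp property
rp.read()  # fetch and parse the site's robots.txt

url = "https://www.wsj.com/some-article"

# A well-behaved crawler consults the rules before fetching...
if rp.can_fetch("MyCrawler", url):
    print("robots.txt permits crawling:", url)
else:
    print("robots.txt disallows crawling:", url)

# ...but nothing in the protocol stops a crawler from skipping this check
# and fetching the page anyway. There is no enforcement mechanism.
```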
If News Corp puts all its content behind a paywall, the point is moot – Google and other search engines won’t be able to index it anyway, and robots.txt will be fine. But it’s something to think about.
(Hat tips to Jay Rosen for the TimesSelect link and Chris Arkenberg for the Jason Calacanis video)