Eric Schmidt is singing a different tune from his troops this week, in an attempt to calm a storm brewing with the World Association of Newspapers (WAN), the de facto global organisation for newspaper publishers, over WAN’s attempt to level the playing field around how content is distributed across the Web.
Last week, at the Guardian’s Changing Media Summit, Rob Jonas, Google’s head of media and publishing partnerships in Europe, said in his keynote that the current standard (robots.txt) “provides everything that most publishers need to do”. In response, Gavin O’Reilly, chairman of the World Association of Newspapers, challenged this stance, arguing that publishers strongly disagree.
This week, Google CEO Eric Schmidt changed the tune, telling ITWire that the problem is not that Google doesn’t want to implement ACAP, but that it is having trouble implementing it.
- The current standard, robots.txt, is limited to telling crawlers whether they can or can’t crawl content.
- The proposed new standard, the Automated Content Access Protocol (ACAP), gives publishers much more control, such as putting a time limit on how long content can be used.
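To make the contrast concrete, here is a rough sketch of what the two approaches look like side by side. The robots.txt lines are standard syntax; the ACAP directives are illustrative only, and the directive names and the time-limit parameter shown (`ACAP-allow-crawl`, `time-limit`) are AdViking’s assumptions about how such a policy might be expressed, not confirmed spec.

```
# Classic robots.txt: a binary crawl / no-crawl decision, nothing more
User-agent: *
Disallow: /subscribers/
Allow: /news/

# ACAP-style extension (illustrative sketch, directive names assumed):
# the publisher can still allow crawling, but also express usage
# conditions such as how long a copy of the content may be retained.
ACAP-crawler: *
ACAP-disallow-crawl: /subscribers/
ACAP-allow-crawl: /news/ time-limit=7days
```

The point is the shape of the difference: robots.txt answers one yes/no question, while ACAP aims to carry a richer set of publisher-defined conditions along with the content.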
It’s probably too obvious a point to make, and maybe AdViking is getting too jaded here, but it seems that it isn’t in Google’s self-interest to solve the technical issues needed to implement a new standard that gives publishers much more control over how their content is used. If I were working on the crawler team at Microsoft Live Search, Ask or even Yahoo!, I would be looking to cosy up to WAN and the other publisher groups, get ACAP supported, and score a quick PR win.