A Serious Threat? Google’s Second Search Box

28 March, 2008

With the launch of Google’s Second Search Box, are Publishers under a serious threat to their digital revenue streams? AdViking believes they are, for two primary reasons:

  • Lost Revenue
  • Loss of User Experience Control

Lost Revenue

Based on recent crunching of revenue numbers, the general trend seems to be that Google’s revenue growth comes at the cost of traditional media owners. Any Publisher who doesn’t see this as a further direct and competitive threat to their business really needs to wake up and smell the coffee.

From early feedback AdViking has received, Publishers are giving the Second Search Box mixed reviews. The more forward-thinking and strategic are aghast, while the less aggressive aren’t too worried, saying that increased traffic will make up for the initial revenue hit. This is erroneous thinking for a few reasons:

In the short term, the loss of traffic will lead directly to lost revenue (see the back-of-the-envelope sketch after the long-term point below).

In the medium term, by counting on this additional boost in traffic, Publishers will become even more dependent on Google for driving traffic.

In the long term, as Publisher metrics worsen (eCPM, traffic, etc.), this will lead to the further loss of advertisers as they spend more and more money with Google.
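
To make the short-term point concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes only the standard eCPM definition (revenue per 1,000 impressions); the traffic, eCPM and leakage figures are invented purely for illustration.

    # Back-of-the-envelope: revenue lost when on-site search traffic
    # leaks to Google via the Second Search Box.
    # All figures below are illustrative assumptions, not real data.

    def monthly_revenue(page_views, ecpm):
        """Revenue from ad impressions at a given eCPM (revenue per 1,000 views)."""
        return page_views / 1000.0 * ecpm

    page_views = 10_000_000   # assumed monthly search result page views
    ecpm = 5.0                # assumed blended eCPM in dollars
    leakage = 0.15            # assumed share of searches lost to Google

    before = monthly_revenue(page_views, ecpm)
    after = monthly_revenue(page_views * (1 - leakage), ecpm)
    print(f"Monthly revenue before: ${before:,.0f}")
    print(f"Monthly revenue after:  ${after:,.0f}")
    print(f"Direct loss:            ${before - after:,.0f}")

At a constant eCPM the loss is linear in the leaked traffic, and that is before the medium- and long-term effects above compound it.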

Loss of User Experience Control

Publishers have spent a lot of time and money working on the User Experience and, as Search has become the primary method of finding content, on the relevancy models for their search results.

By allowing Google to apply a one-size-fits-all model, all of the benefits of knowing your content, knowing your users, etc. are thrown out the window.

Below are two examples that show the negative impact of Google’s Second Search Box on the User Experience.

Bear Stearns on New York Times

[Screenshot: The New York Times search results for ‘bear stearns’]

[Screenshot: Google search results for ‘bear stearns site:nytimes.com’]

Laptop on BestBuy

[Screenshot: BestBuy search results for ‘laptop’]

[Screenshot: Google search results for ‘laptop site:bestbuy.com’]

Postscript:

  • Official Google post about Second Site Search
  • The cynic in AdViking asks: will this feature go away after the end of Q1?

A QR Wish

26 March, 2008

Without realising it, AdViking has been writing about potential uses for QR for a couple of years now, and while walking to the train station this morning, AdViking was reminded of the need for real-world, applicable uses of QR technology.

The reason being that AdViking walked past what looked like an interesting wine cellar and yet, while writing this under 15 minutes later, the name and website URL are already gone from memory.

The QR wish: why can’t one do a quick scan of a QR code and have the wine cellar automatically stored with my preferred suppliers (e.g. Live, Delicious, FB, etc.), or better, with a QR server that, through OpenID and with the correct permissions, talks to all of my webtop points of contact? A rough sketch of what that flow could look like follows.
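
Purely as a thought experiment, here is a minimal sketch in Python of what that flow could look like. The QR decoding uses the real pyzbar and Pillow libraries; the qrserver.example endpoint, the payload shape and the service list are hypothetical stand-ins, since no such QR server exists.

    # Hypothetical sketch: scan a QR code and push the decoded URL to a
    # personal "QR server", which fans it out to the bookmarking services
    # the user has authorised (via OpenID in the original wish).
    # The server endpoint and payload shape are invented for illustration;
    # only the QR decoding calls are real library APIs.

    import requests                      # real HTTP library
    from PIL import Image                # real imaging library
    from pyzbar.pyzbar import decode     # real QR/barcode decoder

    def scan_qr(image_path: str) -> str:
        """Decode the first QR code found in an image and return its text."""
        results = decode(Image.open(image_path))
        if not results:
            raise ValueError("No QR code found in image")
        return results[0].data.decode("utf-8")

    def store_everywhere(url: str) -> None:
        """Hand the URL to a (hypothetical) QR server that, with the right
        permissions, syndicates it to all of the user's webtop services."""
        requests.post(
            "https://qrserver.example/bookmarks",   # hypothetical endpoint
            json={"url": url, "services": ["Live", "Delicious", "Facebook"]},
            timeout=10,
        )

    if __name__ == "__main__":
        store_everywhere(scan_qr("wine_cellar_sign.jpg"))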


February Stats Show More Churn for Facebook UK

26 March, 2008

Following on from the decline in traffic in December/January for Facebook UK, Nielsen Online have released provisional February stats, and it looks like Facebook UK has continued its downward trend in users, dropping from 8.9 million unique visitors in December to 8.3 million in February.


Happy Easter

20 March, 2008

In Europe, the weeks surrounding Easter are a bit of a social contract time out.

Meaning, this is going to be the last post for a little while.

Have a good one.


Must Read Posts

20 March, 2008

Two must-read posts:

Need to digest these over a long holiday weekend… Lots to think about, and I have to say this should prove a great spark for a wider discussion on ad networks, branding and the next phase of online advertising and publishing.


Worth a Read: Google Sucks Life Out of Old Media

19 March, 2008

This analysis on SAI is worth a read. And I don’t think the sample includes any YPs…


Google Blames Their Technology

19 March, 2008

Eric Schmidt is singing a different tune from his troops this week, in an attempt to calm a storm brewing with the World Association of Newspapers (WAN), the de facto global organisation for newspaper publishers, over WAN’s attempt to level and organise the playing field around the way content is distributed across the Web.

Last week, at the Guardian’s Changing Media Summit, Rob Jonas, Google’s head of media and publishing partnerships in Europe, said in his keynote that the current standard (robots.txt) “provides everything that most publishers need to do”. In response, Gavin O’Reilly, the chairman of the World Association of Newspapers and COO of Independent News & Media, questioned this stance, as from his view Publishers strongly disagree.

This week, Google CEO Eric Schmidt changed the tune, telling ITWire that the problem is not that Google doesn’t want to implement the proposed standard; it’s that they are having problems implementing it.

  • The current standard, robots.txt, is limited to telling crawlers whether they can or can’t crawl the content.
  • The newly proposed standard, the Automated Content Access Protocol (ACAP), provides Publishers with much more control, such as putting a time stamp on how long content can be used (see the illustrative snippet below).
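
To make the contrast concrete, here is an illustrative side-by-side. The first lines use the real, long-standing robots.txt syntax; the ACAP lines are only a paraphrase in the style of ACAP 1.0’s proposed robots.txt extensions, meant to convey the extra expressiveness rather than exact field names.

    # robots.txt today: a crawler may either fetch a path or not
    User-agent: *
    Disallow: /archive/

    # ACAP-style terms (illustrative paraphrase, not guaranteed exact syntax):
    # permission plus conditions on how the content may be used
    ACAP-crawler: *
    ACAP-disallow-crawl: /archive/
    ACAP-allow-crawl: /news/
    # e.g. a time limit on how long crawled content may be re-used
    ACAP-allow-index: /news/ time-limit=7-days

The point is not the exact field names but the added degree of control: allow/deny plus publisher-defined terms of use, instead of allow/deny alone.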

It’s probably too obvious a point to make, and maybe AdViking is getting too jaded here, but it seems that it isn’t in Google’s self-interest to solve the tech issues needed to implement a new standard that gives Publishers much more control over how their content is used. If I were working on the crawler team at Microsoft Live Search, Ask or even Yahoo!, I would be looking to pony up to WAN and the other publisher groups to get ACAP supported and score a quick PR win.