Real-time search: 4 questions that deserve answers - SmartBrief





Last week’s post on Google’s introduction of “real-time” search sparked quite a bit of discussion. SmartBrief on Social Media readers had a number of questions, concerns and fears about what this shift in the search landscape might mean for marketers. For answers, I turned to George Bounacos, COO of Silver Beacon Marketing and an SEM practitioner who has been involved in monetizing online efforts since 1986. Here is an edited transcript of that conversation:

Are real-time results on Google, Bing, etc., a permanent shift or more of an experiment?

This is the beginning of a permanent shift and natural cycle. Home Internet usage grew when home modem prices decreased.  Mapping became prominent on search-engine-results pages (SERPs) when people searched online maps for information.  Search engines offer presumptive results now, like tracking a UPS package, if the search string is recognizable.  Real-time is evolutionary.

The collective wisdom of tens of millions of individuals actively communicating will be packaged.  Marketers previously dealt with PageRank to sort sites, and I expect Google and Bing will create a “SocialRank” that examines a person’s connections and values their real-time contribution.

Epinions modeled something similar a decade ago.  They call their metric the Web of Trust, and that model still dictates a large part of the order and visibility of reviews and people on the site.

Real-time search is simply a function of Main Street acceptance.  Twitter is the world’s biggest chat room with a memory.  Had chat rooms become popular at the scale of Twitter and Facebook, real-time search results would already be published.

Many SEO experts recognize the importance of social media on organic results. Will real-time social results play a larger part in defining organic ranking?

The people who take action and engage with the real-time results on a search page will skew the results most. This is part of the long-standing second-click problem. If a search engine embeds videos, maps and other data in the SERP, it controls both the search and the outcome of providing actionable information.

For example, I recently posted a traffic warning I heard on local radio on Twitter.  Others re-conveyed that information, but the original creator is not involved in the information exchange. Attribution only solves part of that problem.

That’s because people act inside the SERP instead of on another site. If I appropriately attribute a story to a news outlet or Web site, but that tweet or social media comment stays on a SERP for interaction, rankings will be affected. We’ll also continue driving media entities that employ reporters and fact-checkers out of business faster.

Combine this with personalized search as the Google default regardless of an individual’s logged-in state, and organic ranking and trend data appear to have limited practical use in the future.

Marketers must rethink how to describe successful organic efforts. I asked a panel at SES San Jose about this in August, and they agreed: Rankings are easy to show clients and bosses, so we do.  As an industry, we have dug our own hole.  Now we have to find a better way to show success.

We’ve noticed that when “SmartBrief” is searched, the real-time results are displayed first, and in most other cases, it’s toward the middle of the results. What dictates where “real-time results” appear on the SERP?

Part of a search engine’s world is ABT — always be testing — with enough traffic to get statistically valid results fast. They still test placement of maps, calls to action and links. Our analysis is also finding real-time information displayed alongside the local map for brands like Comcast that have strong social media outreach.

For big brands, engines are balancing news results, maps, videos and sitelinks while adding real-time streams. But it’s critical to remember that advertising is the real balance.

Google generates a billion in cash flow every month and maybe $550 million in net income.  Nothing can diminish those funds.  That’s why the rotated L-shape where ads are displayed on the top and right of a SERP remains consistent.  But real-time advertising components could be tested. I wouldn’t be surprised if Bing tested wedging advertising into a SERP’s real-time results.

We know that Google is now sourcing Twitter, Facebook and MySpace. What’s next? Can we expect to see updates from LinkedIn, blog comments (via BackType, Disqus, etc.)?

Great question! LinkedIn is sometimes ignored in social media, but they have business world credibility and their rich, recent data could make them a content player. I almost always open LinkedIn’s status update e-mail that tells me if someone I know has changed companies or updated their profile.

Plaxo is a social media dark horse. Many questions were answered when Comcast acquired them. But they have millions of users beyond Comcast’s traditional base, five million as a conservative estimate, with unified address books and feeds from blogs and social networks.

Blogging certainly isn’t dead, but almost all blogs have small audiences. I like Disqus’s technology and recommend it, but they won’t have the impact of a LinkedIn or Plaxo. And don’t stay too North America-centric by leaving out Bebo. The Bebo demographic skews much younger than Twitter’s, and their engagement time is at Facebook levels.

There’s a lot of talk about hackers taking advantage of real-time search. How do you think Google can combat this moving forward? What can business do to combat this?

Technology threats to any brand or reputation always exist. Search engines will use the same techniques that identify black hat SEOs or devalue sites. Someone in your company or your agency should already be using good monitoring tools to identify trouble spots.

Real-time spam will occur, just as there is map spam and other poor Web experiences. But there is too much search engine vigilance for unethical marketers to remain successful against an army of really smart people who study things like word patterns and linking structures for a living.

Businesses should have boilerplate language ready to send a webmaster to defend their intellectual property and be vigilant about monitoring. That’s a great role for a marketing assistant or anyone who wants to be more involved in the organization’s branding because that person is often much closer to the actual keywords being used. Arm them with good analytics and turn them loose on the bad guys!

Image credit: GeofferyHolman, via iStock