Site Search: The Ultimate Mashup

July 24, 2007

So you’ve achieved 100% indexation of your site in all the main search engines. Great.

Now it’s time to recycle that content a bit to boost the number of indexed pages further. Yes, I’m talking about the same content, mashed up in creative ways. You can most likely create loads of category pages by segmenting your product catalogues (creating what are called views); you can split large content pages into subpages (especially if your pages are long, or if you use tab navigation within them). You can ride the current tag cloud hysteria to organize your content by user-defined (or SEO-defined) criteria. And you can create literally hundreds of content pages by allowing SE bots to index your site search result pages. So how do you go about it?
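To make the “views” idea concrete, here is a minimal Python sketch (the three-product catalogue and the attribute names are invented for illustration) of how a product list could be segmented into extra indexable category pages; treat it as an illustration of the grouping logic, not a drop-in implementation:

    from collections import defaultdict

    # Hypothetical catalogue: each product carries a few attributes we can segment on.
    catalogue = [
        {"name": "Red running shoes",  "brand": "Acme",   "colour": "red",  "sport": "running"},
        {"name": "Blue running shoes", "brand": "Zenith", "colour": "blue", "sport": "running"},
        {"name": "Red tennis racket",  "brand": "Acme",   "colour": "red",  "sport": "tennis"},
    ]

    def build_views(products, attributes=("brand", "colour", "sport")):
        """Group products by each attribute value; every group becomes one 'view' page."""
        views = defaultdict(list)
        for product in products:
            for attr in attributes:
                views[(attr, product[attr])].append(product)
        return views

    for (attr, value), items in build_views(catalogue).items():
        # Each view is an extra landing page for the bots to index, e.g. /views/colour/red/
        print(f"/views/{attr}/{value}/  ({len(items)} products)")

The same grouping logic works for tag clouds: just replace the fixed attribute list with user-defined (or SEO-defined) tags.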

1. Optimize the on-page content of your site search results: review the title, metas, and content. It must be relevant, and it must convert (check out those bounce rates in Google Analytics or whatever other web statistics program you use!).

2. Optimize the cross-linking of your site search result pages (SSRP). Check out Amazon: their “users who bought this also bought that” concept can easily be applied to SSRP (users who searched for this also searched for that). If you don’t have user data, define the relations between search queries yourself, using some simple similarity algorithm (see the sketch after this list). What queries, you ask? There are at least three easily accessible sources: site search query logs (if you don’t have them, talk to your sysadmin or the person who programmed the site search), web server logs of the queries people use in public SE to reach your site (again, your sysadmin will help), and other sources (such as the infamous AOL data leak, etc.).

3. Optimize the linking between the new SSRP and your other landing pages. The benefits of extensive SSRP indexing are twofold: additional direct traffic, and additional links to your regular landing pages (because the principal content element of your SSRP is links to your regular landing pages, right?). I dare say that in 90% of the site search mashups I’ve analyzed, the linking effect exceeds the direct traffic effect.
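To illustrate the “simple similarity algorithm” mentioned in step 2, here is a minimal Python sketch (the query list is invented; in practice you would feed it the queries extracted from your site search or web server logs) that relates queries by the overlap of their terms and picks the candidates to cross-link from each SSRP:

    def jaccard(query_a, query_b):
        """Similarity of two queries measured as the overlap of their terms (Jaccard index)."""
        a, b = set(query_a.lower().split()), set(query_b.lower().split())
        return len(a & b) / len(a | b)

    def related_queries(query, all_queries, top_n=5, threshold=0.2):
        """Return the most similar other queries, i.e. the SSRP worth linking from this one."""
        scored = [(other, jaccard(query, other)) for other in all_queries if other != query]
        scored = [pair for pair in scored if pair[1] >= threshold]
        return [q for q, _ in sorted(scored, key=lambda p: p[1], reverse=True)[:top_n]]

    # Hypothetical query log
    queries = ["cheap flights madrid", "cheap flights barcelona",
               "madrid hotels", "flights to madrid", "car rental barcelona"]
    print(related_queries("cheap flights madrid", queries))

If you do have user data, you can swap the term-overlap score for co-occurrence counts within the same search session, which is closer to Amazon’s “also bought” logic.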

I would love to hear from people who’ve tried this recipe; drop me a comment/email if you are willing to share (in a strictly confidential manner) your results.

Disorganized Google

February 27, 2007

Funny. No matter how hard Google tries to fulfill its mission to organize the world’s information, there will always be people who use it to increase chaos and entropy. This guy created a Google personalized homepage more than 8,000 pixels long…


Balancing your links – Be Natural

November 9, 2006

A certain Steven Bradley recently wrote an interesting post about balancing your links. The gist is that you should make them look “natural” and avoid over-optimization.

There is nothing wrong with this advice, if your site is purely designed for Googlebot, Slurp or msnbot.

If your site is designed for real users, and your goal is to sell them something (be it a product, a service, or a subscription) or to give them something useful (such as valuable posts and articles, or some useful utility, plugin or extension), let me give you a different piece of advice.

Instead of SEOing your site and worrying about making it look natural, be natural. By that I mean that every link you put on your site should be of value to your users.

There is nothing wrong with Steven’s advice, because by trying to look natural you will probably end up with a linking structure that benefits your users. However, I strongly believe that by being natural and focusing on your users’ needs you can achieve superior results, both in the usability of your site and in long-term SEO. It pays to be authentic and natural: you can stop worrying about ongoing changes in ranking algorithms and focus on building a useful site. SE bots learn and get better every month at distinguishing value from manipulation. Being natural (instead of looking natural) is the safest long-term SEO strategy.

Let me give you a natural alternative to Steven’s advice.

  • Steven: Mix up links to your home page and to deeper pages of your site
  • Ubibene: Try to anticipate the user’s informational needs on every page and provide links that satisfy them
  • Steven: Vary the anchor text in links to your pages
  • Ubibene: Use anchor text that best explains the content of the target page. Avoid misleading the user. Different contexts call for different (but related) anchor texts.
  • Steven: Trade a few links to balance out the one way inbound links
  • Ubibene: Mix outlinks and internal links, always keeping the user and his/her needs in mind. There is nothing wrong with NOT having any outlinks on a page if the user does not really need them there. There is nothing wrong with having ONLY outlinks if it benefits the user.
  • Steven and Ubibene: Link out as well as in. Here we completely agree; however, I suspect the motivation is different (Steven focuses on a natural-looking linking structure, ubibene on a linking structure that best satisfies user needs).
  • Steven: Get site wide and one off links (this one I don’t really get; my English is not that good).

Google AdWords Website Optimizer

October 25, 2006

This is probably the most important SEO news of the month. It offers a simple way to improve SEO payout by improving the conversion rate of landing pages.

While it does not in itself generate more traffic, it helps improve the user experience and increases the “goals” (such as sales, bookings, subscriptions, etc.).

It will also have an important side effect on quite a lot of services/companies with similar usability improvement goals. So expect something similar to the “analytics earthquake” and the “maps earthquake” once the tool is released (free of charge) to all interested webmasters.

Google AdWords Website Optimizer is (at the moment) in invitation-only beta, but if I had a usability services company, I would be worried. Very worried…


Google playing catch-up game

October 25, 2006

Google has finally decided to match the MSN Search offer by letting users create their own search engines, using subsets of its huge index.

Under the umbrella of Google Co-op, users can create specialized index(es) defining several hundred thousand(!) websites they want to include.

Why is this news interesting from an SEO point of view? Because it gives SEOs the possibility to combine the traffic from well positioned keywords with the traffic from a well positioned specialized search site (using an index that conveniently leaves out sites that the SEO is not able to beat in the regular index, or that are spammy and useless for the site’s target audience).