Site Search: The Ultimate Mashup

July 24, 2007

So you’ve achieved 100% indexation of your site in all the main search engines. Great.

Now it’s time to recycle that content a bit to further boost the number of pages. Yes, I’m talking about the same content, mashed up in creative ways. You can most likely create loads of category pages by segmenting your product catalogues (creating what are called views); you can split large content pages into subpages (especially if your pages are long, or if you use tab navigation within them). You can ride the current tag cloud hysteria to organize your content by user-defined (or SEO-defined) criteria. And you can create literally hundreds of content pages by allowing search engine bots to index your site search result pages. So how to go about it?

1. Optimize the on-page content of your site search results: review the title, metas and content. It must be relevant, and it must convert (check out those bounce rates in Google Analytics or whatever other web statistics program you use!).

2. Optimize the cross-linking of your site search result pages (SSRPs). Check out Amazon: their “users who bought this also bought that” concept can easily be applied to SSRPs (users who searched for this also searched for that). If you don’t have user data, define the relations between search queries yourself, using some simple similarity algorithm (see the sketch after this list). Which queries, you ask? There are at least three easily accessible sources: your site search query logs (if you don’t have them, talk to your sysadmin or the person who programmed the site search), web server logs of the queries people use in public search engines to reach your site (again, your sysadmin will help), and other sources (such as the infamous AOL data leak, etc.).

3. Optimize the linking between the new SSRPs and your other landing pages. The benefits of extensive SSRP indexing are twofold: additional direct traffic, and additional links to your regular landing pages (because the principal content element of your SSRPs is links to your regular landing pages, right?). I dare say that in 90% of the site search mashups I’ve analyzed, the linking effect exceeded the direct traffic effect.
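To make step 2 concrete, here is a minimal sketch of the “users who searched for this also searched for that” idea for the simplest case: you have only a flat list of logged queries and no per-user data. It scores query pairs with plain Jaccard similarity over their word sets; the function names and the sample log are illustrative, not from any real site.

```python
# Minimal related-queries sketch for SSRP cross-linking (step 2).
# Assumption: no per-user data, just a flat list of logged queries;
# similarity is plain Jaccard overlap of the queries' word sets.

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |intersection| / |union|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def related_queries(query, log, top_n=5):
    """Return the top_n logged queries most similar to `query`."""
    q_terms = set(query.lower().split())
    scored = [
        (jaccard(q_terms, set(other.lower().split())), other)
        for other in log if other != query
    ]
    # keep only queries sharing at least one term, best matches first
    scored = sorted((s for s in scored if s[0] > 0), reverse=True)
    return [q for _, q in scored[:top_n]]

# Illustrative log; in practice this comes from your site search logs.
log = ["cheap flights madrid", "cheap flights barcelona",
       "hotels madrid centre", "flights madrid barcelona"]
print(related_queries("cheap flights madrid", log))
# e.g. ['flights madrid barcelona', 'cheap flights barcelona', ...]
```

Each SSRP would then render the returned queries as its “related searches” link block, giving the bots fresh crawl paths between result pages.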

I would love to hear from people who’ve tried this recipe; drop me a comment/email if you are willing to share (in a strictly confidential manner) your results.

Twitter, Jaiku, tumblr for SEO

May 22, 2007

I’ve never been a big fan of nano- and microblogging. However, in the past two days I started playing with some of these tools because I wanted to explore their potential for SEO purposes.

It is too early to report any SEO results. In fact, I have no plans to measure the impact of this experiment. All I want to do is a kind of “proof of concept” that these tools might be excellent link building tools. What do you need to start?

1. Sign up for accounts at Twitter, Jaiku, tumblr, etc. Use keywords in your screen names whenever possible. (NOTE: there may be other tools equally or better suited for link building purposes. Suggestions, anyone?)

2. Choose RSS feeds from your target site and import them into your micro/nano blog. If you don’t have RSS feeds implemented yet, stop reading and get an HTML-to-RSS generator, or talk to your web developer to implement them now (a minimal generator sketch follows after this list).

Both Jaiku and tumblr make RSS import extremely easy. Jaiku seems somewhat better, allowing you to define update frequency and filters; tumblr is more limited and sometimes publishes duplicate entries, as if it did not parse the RSS feed correctly.

Importing RSS is more complicated in Twitter’s case, since it has no RSS import feature at the moment (though I suspect one will be implemented in the near future). You must use a third-party tool, such as twitterfeed, to achieve the same. Simple, if you decide you trust them enough, since you will have to give them your Twitter access data… Additionally, Twitter, due to its post length limitation, replaces the URLs from your RSS feed with TinyURLs. Don’t worry though: TinyURLs work with a 301, i.e. permanent, redirect, so the target page will benefit from the incoming link as if the link were direct (a quick way to verify this is sketched after the list).

3. Get some inbound links to those newly created link directories so that they get indexed by search engines. This may happen even without any inbound links of yours if Jaiku/Twitter/tumblr have a member directory of some sort (I did not have time to check).
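As promised in step 2, here is a minimal sketch of a hand-rolled RSS 2.0 generator, using only the Python standard library, for sites that have no feeds yet. The channel and items below are made up for illustration; in practice they would come from your product database.

```python
# Minimal RSS 2.0 generator using only the standard library.
# Assumption: items come from your product database; these are made up.
from email.utils import formatdate       # RFC 822 dates, as RSS 2.0 wants
from xml.sax.saxutils import escape      # keep titles/links XML-safe

def make_rss(title, link, items):
    """Render a list of {'title','link','description'} dicts as RSS 2.0."""
    entries = "".join(
        f"<item><title>{escape(i['title'])}</title>"
        f"<link>{escape(i['link'])}</link>"
        f"<description>{escape(i['description'])}</description>"
        f"<pubDate>{formatdate()}</pubDate></item>"
        for i in items
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<rss version="2.0"><channel>'
            f"<title>{escape(title)}</title><link>{escape(link)}</link>"
            f"{entries}</channel></rss>")

print(make_rss("Vuelos baratos", "http://example.com/vuelos",
               [{"title": "Madrid-Paris 49 EUR",
                 "link": "http://example.com/vuelos/madrid-paris",
                 "description": "Special June offer."}]))
```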
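And, as mentioned in the Twitter note above, you can check for yourself that a TinyURL answers with a 301 rather than a temporary 302. A quick sketch, again standard library only, that makes a HEAD request and reads the raw status without following the redirect; the alias is a placeholder to replace with a real short URL:

```python
# Inspect a short URL's redirect status without following it.
# http.client never follows redirects, so the raw status is visible.
import http.client

conn = http.client.HTTPConnection("tinyurl.com")
conn.request("HEAD", "/example")  # placeholder alias; use a real short URL
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
# expected: 301 plus the original long URL, i.e. a permanent redirect
```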

For this approach to work in the long run, your “creation” must make sense and be interesting. Remember, these are social media, and people will link to your micro/nano stuff only if they like it.

Since this blog has just one RSS feed, I decided to use another website to demonstrate this approach. For a local online travel agent here in Spain (hope they will not get mad at me!) whose site has many, many RSS feeds, I created one tumblr, one Jaiku and one Twitter space to show what can be done.
The Twitter space blends together content from several feeds of recent flight ticket quotes (vuelos baratos, “cheap flights” in Spanish). The Jaiku space vuelos does the same thing (a selection of special flight ticket deals), while the tumblr subdomain restaurantes.tumblr.com is constructed from restaurant and theater RSS product feeds to create an alternative product view.

Update: to facilitate comparison of the three tools, I decided to implement the flight ticket quotes in tumblr, too. Here is the result.

P.S. I am sure there are people out there who have been doing this for ages. If you have more experience with microblogging for SEO and are willing to share it, I’d love to hear from you.


New Google link tool

February 6, 2007

So yesterday Google announced a link research tool. They included it in their Webmaster Tools pages, and it was meant to facilitate link research for one’s own domains.

However, the tool was wide wide open and allowed quite detailed analysis of ANY website and its external link structure.

The SEOs clever and fast enough seized this opportunity like crazy. Researching competitors’ inlinks is often not easy. The number of useful tools is limited (Yahoo Site Explorer, MSN’s link tool), and Google’s own link: command stopped working reliably several years ago. So having a tool that quickly and comfortably lets you do the job was a blessing, one that lasted a bit more than 24 hours. Enough time to research entire sectors, link neighborhoods and domain cities that usually go undetected and rank high in Google because they are simply too complex for the current algo.

What are the consequences of this error? More spam in Google’s index. This window of opportunity created by Google helped some SEOs gain more insight into the way blackhat/greyhat heavyweights link. The unscrupulous will follow and exceed them in the methods revealed by the tool; the rest of us will most likely apply those principles in a rather conservative (but still very useful) way…

Both groups will contribute to increased competition in all high-profile sectors. I have already had several inquiries about the travel sector (both the global and Spanish-speaking markets). And while I personally have no intention of benefiting from this error, many others will.

This development is very similar to the gaffe by AOL when they released a huge quantity of search data that is still available for analysis in some places… Privacy-wise, it is not so bad, but spam-wise, the effect will be stronger.

One wonders, seeing the stupidity of the error, how safe our Gmail, Calendar, spreadsheet, desktop search and any other data we keep in Google’s hands really are. Hopefully the person responsible for the error is packing up his lava lamp and cleaning out his cubicle at the Googleplex. Firing him/her is the only possible signal that Google strives to keep our data safe…

More stuff about the tool and the official announcement (naive optimism, I would call it; it’s clear they had no idea how fast people would move in…)


Paradox of Choice in SEO

September 21, 2006

I am reading a fascinating book by Barry Schwartz, The Paradox of Choice. If you are too lazy to read/buy the book, check out his Google Video lecture.

His research has important implications for usability and SEO. Highly recommended, especially for Steve and followers of his long-tail SEO approach ;-).