Twitter, Jaiku, tumblr for SEO

May 22, 2007

I’ve never been a big fan of nano- and microblogging. However, over the past two days I started playing with some of these tools because I wanted to explore their potential for SEO purposes.

It is too early to report any SEO results. In fact, I have no plans to measure the impact of this experiment. All I want is a kind of “proof of concept” showing that these tools might be excellent link building tools. What do you need to start?

1. Sign up for accounts at twitter, jaiku, tumblr etc. Use keywords in your screen names whenever possible. (NOTE: there may be other tools equally or better suited for link building purposes. Suggestions, anyone?)

2. Choose RSS feeds from your target site and import them into your micro/nano blog. If you don’t have RSS feeds implemented yet, stop reading and get an HTML-to-RSS generator, or talk to your web developer to implement them now.
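If you need to bootstrap a feed quickly, a minimal sketch in Python that writes an RSS 2.0 file by hand might look like the following. The item titles and URLs are invented for illustration; in practice you would pull them from your database, CMS or your own HTML pages.

```python
# A minimal RSS 2.0 generator -- a sketch, not production code.
# The item titles and URLs below are hypothetical placeholders.
from email.utils import formatdate
from xml.sax.saxutils import escape

items = [
    {"title": "Cheap flights to Madrid", "link": "http://example.com/vuelos/madrid"},
    {"title": "Weekend deals in Barcelona", "link": "http://example.com/vuelos/barcelona"},
]

def build_rss(title, site_url, items):
    """Return an RSS 2.0 document as a string."""
    out = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<rss version="2.0"><channel>',
        f"<title>{escape(title)}</title>",
        f"<link>{escape(site_url)}</link>",
        f"<description>{escape(title)}</description>",
        f"<lastBuildDate>{formatdate()}</lastBuildDate>",
    ]
    for item in items:
        link = escape(item["link"])
        out.append(f"<item><title>{escape(item['title'])}</title>"
                   f"<link>{link}</link><guid>{link}</guid></item>")
    out.append("</channel></rss>")
    return "\n".join(out)

with open("feed.xml", "w", encoding="utf-8") as f:
    f.write(build_rss("Flight deals", "http://example.com", items))
```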

Both Jaiku and tumblr make RSS import extremely easy. Jaiku seems somewhat better, allowing you to define update frequency and filters; tumblr is more limited and sometimes publishes duplicate entries, as if it did not parse the RSS feed correctly.

Importing RSS into Twitter is more complicated, since it does not have any RSS import feature at the moment (though I suspect one will be implemented in the near future). You must use a third-party tool, such as twitterfeed, to achieve the same thing. Simple, if you decide you trust them enough, since you will have to hand over your Twitter access data… Additionally, due to its post length limitation, Twitter replaces the URLs from your RSS feed with tinyurls. Don’t worry though; tinyurl uses a 301, i.e. permanent, redirect, so the target page will benefit from the incoming link as if the link were direct.
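You can check the redirect status yourself. Here is a quick sketch using only the Python standard library; the tinyurl address in it is a made-up placeholder, so substitute a short URL of your own:

```python
# Check which status code a short URL returns -- a quick sketch.
# The tinyurl address below is a placeholder; substitute a real one.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface the redirect as an HTTPError instead

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open("http://tinyurl.com/example")
except urllib.error.HTTPError as e:
    # 301 = permanent redirect: search engines should credit the
    # final target page as if the link pointed there directly.
    print(e.code, "->", e.headers.get("Location"))
```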

3. Get some inbound links to those newly created link directories so that they get indexed by search engines. This may happen even without any inbound link of yours, if jaiku/twitter/tumblr have a member directory of some sort (I did not have time to check).

For this approach to work in the long run, your “creation” must make sense and be interesting. Remember, these are social media, and people will link to your micro/nano stuff only if they like it.

Since this blog has just one RSS feed, I decided to use another website to demonstrate this approach. For a local online travel agent here in Spain (I hope they will not get mad at me!) whose website has many, many RSS feeds, I created one tumblr, one jaiku and one twitter space to show what can be done.
The twitter space blends together content from several feeds of recent flight ticket quotes (vuelos baratos, i.e. cheap flights, in Spanish). The jaiku space vuelos does the same thing (a selection of special flight ticket deals), while the tumblr subdomain restaurantes.tumblr.com is built from the restaurant and theater RSS product feeds to create an alternative product view.
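If you are curious how such feed blending can be done outside these tools, here is a rough sketch using the feedparser Python library; the feed URLs are placeholders, not the agency’s real feeds:

```python
# Blend several RSS feeds into one chronological stream -- a sketch.
# Requires the feedparser library (pip install feedparser); the feed
# URLs below are placeholders.
import time
import feedparser

feed_urls = [
    "http://example.com/rss/vuelos-madrid.xml",
    "http://example.com/rss/vuelos-barcelona.xml",
]

entries = []
for url in feed_urls:
    entries.extend(feedparser.parse(url).entries)

# Newest first; fall back to the epoch when an entry has no date.
entries.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0),
             reverse=True)

for e in entries[:20]:  # the 20 most recent items across all feeds
    print(e.get("published", "?"), "|", e.get("title", ""), "|", e.get("link", ""))
```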

Update: to facilitate comparison of the three tools, I decided to implement the flight ticket quotes in tumblr, too. Here is the result.

P.S. I am sure there are people out there who have been doing this for ages. If you have more experience with microblogging for SEO and are willing to share it, I’d love to hear from you.


Doubleclick, doubledumb?

April 20, 2007

Immediately after I read the news that Google bought DoubleClick for more than 3 billion dollars, I got this strange feeling of “I don’t get it“. At first sight, the price paid just does not make sense.

The price is 10x DoubleClick’s annual revenue. Although it is likely that DoubleClick’s margins are high (say, 30-40%?) and that its business is growing fast, we would still be looking at a return on investment over too long a period for the vast majority of investors.

Unless… Unless there were other factors at play… Here is a list of some of them:

  • Block Microsoft and other paid search competitors from getting the knowledge and technology they might use to reduce the competitive advantage Google is enjoying right now.
  • Release some other hidden synergies (comments welcome on what they might be) and significantly increase DoubleClick’s revenue growth.
  • Offer tracking software of superior quality (compared to Google Analytics) to some of its advertisers, thus increasing customer loyalty, and possibly revenues if they decide to charge for it.
  • Connect Analytics and DoubleClick and use the common dataset to analyze universal user behavior. While scary from a privacy perspective, this makes a lot of business sense. This would be something Google is uniquely positioned to do, due to its computing cloud and analytic brainpower. It would give Google a permanent competitive advantage (knowledge!) and possibly create a natural monopoly situation for Google as the official Big Brother of the internet. Let’s hope they do it. And let’s hope they would use their knowledge in the “Do no evil” way…

We just have to wait and see whether Google gets more out of DoubleClick than it got out of YouTube (Google’s stupidest move so far).


New Google link tool

February 6, 2007

So yesterday Google announced a link research tool. They included it in their Webmaster Tools pages, and it was meant to facilitate link research for one’s own domains.

However, the tool was wide, wide open and allowed quite detailed analysis of ANY website and its external link structure.

The SEOs clever/fast enough seized this opportunity like crazy. Researching competitors’ inlinks is often not easy. The number of useful tools is limited (Yahoo Site Explorer, MSN’s link tool) and Google’s own link: command stopped working reliably several years ago. So, having a tool that quickly and comfortably lets you do the job was a blessing that lasted a bit more than 24 hours. Enough time to research entire sectors, link neighborhoods and domain cities that usually go undetected and rank high in Google because they are simply too complex for the current algo.

What are the consequences of this error? More spam in Google’s index. This window of opportunity created by Google helped some SEOs gain more insight into the way blackhat/greyhat heavyweights link. The unscrupulous will follow them and exceed them in the methods revealed by the tool; the rest of us will most likely use those principles in a rather conservative (but still very useful) way…

Both groups will contribute to increased competition levels in all high-profile sectors. I have already had several inquiries about the travel sector (both the global and Spanish-speaking markets). And while I personally have no intention of benefiting from this error, many others will.

This development is very similar to the gaffe by AOL when they released a huge quantity of search data that is still available for analysis in some places… Privacy-wise, it is not as bad, but spam-wise, the effect will be stronger.

One wonders, seeing the stupidity of the error, how safe our Gmail, calendar, spreadsheet, desktop search and any other data we keep in Google’s hands really are. Hopefully the person responsible for the error is packing his lava lamp and cleaning out his cubicle at the Googleplex. Firing him/her is the only possible signal that Google strives to keep our data safe…

More stuff about the tool, and the official announcement (naive optimism, I would call it; it’s clear they had no idea how fast people would move in…)


Balancing your links – Be Natural

November 9, 2006

A certain Steven Bradley recently wrote an interesting post about balancing your links. The gist is that you should make them look “natural” and avoid over-optimization.

There is nothing wrong with this advice if your site is designed purely for Googlebot, Slurp or msnbot.

If your site is designed for real users, and your goal is to sell them something (be it a product, a service, or a subscription) or to give them something useful (such as valuable posts and articles, or some useful utility, plugin or extension), let me give you a different piece of advice.

Instead of SEOing your site and worrying about making it look natural, be natural. By that I mean that every link you put on your site should be of value to your users.

There is nothing wrong with Steven’s advice, because by trying to look natural you will probably achieve a linking structure that benefits your users. However, I strongly believe that by being natural and focusing on your users’ needs you can achieve superior results, both in the usability of your site and in long-term SEO. It pays to be authentic and natural. You can stop worrying about ongoing changes in ranking algorithms and focus on building a useful site. Search engine bots learn and get better every month at distinguishing value from manipulation. Being natural (instead of looking natural) is the safest long-term SEO strategy.

Let me give you a natural alternative to Steven’s advice.

  • Steven: Mix up links to your home page and to deeper pages of your site
  • Ubibene: Try to predict/anticipate the user’s informational needs on every page and provide links to satisfy them
  • Steven: Vary the anchor text in links to your pages
  • Ubibene: Use the anchor text that best explains the content of the target page. Avoid misleading the user. Different contexts lead to different (but related) anchor texts.
  • Steven: Trade a few links to balance out the one-way inbound links
  • Ubibene: Mix outlinks and internal links, always keeping in mind the user and his/her needs. There is nothing wrong with NOT having any outlinks if the user does not really need them on some page. There is nothing wrong with having ONLY outlinks if it benefits the user.
  • Steven and Ubibene: Link out as well as linking in. Here we completely agree; however, I suspect the motivation is different (Steven focuses on a natural-looking linking structure, Ubibene on a linking structure that best satisfies user needs).
  • Steven: Get site-wide and one-off links (this one I don’t really get; my English is not that good.)

Google Adwords Website Optimizer

October 25, 2006

This is probably the most important piece of SEO news this month. It offers a simple way to improve the SEO payout by improving the conversion rate of landing pages.

While it does not in itself generate more traffic, it helps to improve the user experience and increases “goals” (such as sales, bookings, subscriptions, etc.).
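To make the idea concrete, here is a minimal sketch of what such a tool does at its core: split visitors randomly between two landing-page variants and compare conversion rates. This is my own illustration with simulated numbers, not Google’s actual implementation:

```python
# A minimal A/B split test -- my own illustration of the idea behind
# a tool like Website Optimizer, not Google's actual implementation.
import random

def assign_variant(visitor_id, variants=("A", "B")):
    """Bucket a visitor deterministically so they always see the same page.
    (Real visitor IDs would need a stable hash, e.g. hashlib, since
    Python's hash() of strings varies between runs.)"""
    return variants[hash(visitor_id) % len(variants)]

shown = {"A": 0, "B": 0}       # visitors who saw each landing page
converted = {"A": 0, "B": 0}   # visitors who completed the goal

random.seed(1)  # hypothetical traffic, simulated for the example
for visitor_id in range(10_000):
    v = assign_variant(visitor_id)
    shown[v] += 1
    # Pretend variant B converts at 4% and variant A at 3%.
    if random.random() < (0.04 if v == "B" else 0.03):
        converted[v] += 1

for v in ("A", "B"):
    print(f"variant {v}: {shown[v]} visitors, "
          f"{converted[v]} goals, {converted[v] / shown[v]:.2%}")
```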

It will have an important side effect on quite a lot of services and companies with similar usability-improvement goals. So, expect something similar to the “analytics earthquake” and the “maps earthquake” once the tool is released (free of charge) to all interested webmasters.

Google AdWords Website Optimizer is (at the moment) in invitation-only beta, but if I had a usability services company, I would be worried. Very worried…