Making AJAX SEO Friendly

On many blogs, SEO professionals call AJAX evil for SEO, much as they do Flash. Ask a Web 2.0 designer or developer, though, and AJAX is so cool that people will love your site and stay on it, enjoying quick, interactive visual feedback without any page reloading. Let's look deeper into the possibility of making SEO and AJAX best friends.

What is AJAX?

AJAX is a term coined publicly by Jesse James Garrett of Adaptive Path, meaning Asynchronous JavaScript and XML. As posted on Wikipedia:

AJAX (Asynchronous JavaScript and XML), or Ajax, is a web development technique used for creating interactive web applications. The main intent is to make web pages feel more responsive by exchanging small amounts of data with the server behind the scenes so that the entire web page does not have to be reloaded each time the user requests a change. This is intended to increase the web page’s interactivity, speed, functionality, and usability.

AJAX is asynchronous in that extra data is requested from the server and loaded in the background without interfering with the display and behavior of the existing page. JavaScript is the scripting language in which AJAX function calls are usually made. Data is retrieved using the XMLHttpRequest object that is available to scripting languages run in modern browsers. There is, however, no requirement that the asynchronous content be formatted in XML.
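The request cycle described above can be sketched in a minimal browser snippet; the URL `data.xml` and the element id `content` are placeholders for this illustration:

```
// Create the XMLHttpRequest object available in modern browsers.
var xhr = new XMLHttpRequest();

// Asynchronous GET: the third argument (true) keeps the page responsive.
xhr.open("GET", "data.xml", true); // "data.xml" is a placeholder URL

xhr.onreadystatechange = function () {
  // readyState 4 means the request is complete; status 200 means OK.
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Update one part of the page without a full reload.
    document.getElementById("content").innerHTML = xhr.responseText;
  }
};

xhr.send(null);
```

Note that although the object is named XMLHttpRequest, the response here is used as plain text, which is exactly the point made above: the asynchronous content need not be XML.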

AJAX is a cross-platform technique usable on many different operating systems, computer architectures, and web browsers as it is based on open standards such as JavaScript and the DOM. There are free and open source implementations of suitable frameworks and libraries.

Keeping the technical language to a minimum, the main observable benefit of running AJAX is that dynamic content loads on your page without the whole page reloading at a new URL. You can refresh only certain parts of a page, keeping resource loading to a minimum while giving the user something visually appealing.

Problem 1: SEO Issues with Dynamic Content of AJAX

Search engines crawl websites using bots, also called crawlers or spiders. These are simply programs that visit webpages, follow every valid, crawlable link they find, and associate the content they read with the URL they crawled.
Given that behavior, search engines can only read the content that loads initially when a page's URL is requested. Any other content loaded by AJAX in response to a user click or some other event may never be read by search engines.
A good example of an AJAX site is Instant Domain Search. For every letter you type, new content is loaded; it loads fast and appeals to people because they do not need to wait for a full page reload. Yet none of that newly generated content can be read by search engines.

Problem 2: Search Engine Crawling Problems of AJAX

Since search engines crawl websites by following every link found on a page, they may not be able to discover links leading to other pages of an AJAX site. For one thing, AJAX content does not need to reside at a different URL; it keeps the same URL as the page it loaded into.
Normally a link going to a normal webpage may look like this:
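For instance, a conventional, crawlable link simply points at a real URL (the filename here is illustrative):

```
<a href="gallery.html">View the gallery</a>
```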

And AJAX powered sites may have links in various forms like:
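Typical patterns look like the following; the function name is an invented example:

```
<a href="javascript:loadPhoto(2)">Next photo</a>
<a href="#" onclick="loadPhoto(2); return false;">Next photo</a>
```

In both cases the href holds no crawlable destination: the first is a JavaScript pseudo-protocol, and the second points at the page itself.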

If your href value is not a valid webpage, search engines will simply disregard the link. And even if you do have a way to let search engines see the dynamically loaded content, that content will still sit at the same URL when it really needs to be separated out.
As an example, take the AJAX-powered Agile Photo Gallery. Its demo page shows one URL, and that URL never changes as you view other photos or browse other pages. Its link format puts a JavaScript function right in the href value of the link.

Problem 3: Placing All Content On One Page Dilutes Keyword Focus

Instead of dynamically loading content from a server-side script or an XML file, some people place all content upfront and use JavaScript tricks to hide and display text. This sidesteps problem 1 above, since there is no other content to crawl, and in a way problem 2 as well, since there are no other links that need crawling. But even though it addresses those issues, a new problem of keyword focus arises.
Check the site of the popular Moofx. All of its main pages sit on a single page: click one of the bars and the content gracefully slides into view, but in reality everything is already there, merely hidden with CSS and/or JavaScript techniques, and nothing new loads on each click. For Moofx this is not much of an issue, but on corporate sites with pages on various topics that should stand alone on their own pages, this technique is not ideal: the various keywords get mixed together and each page loses its keyword focus.

Problem 4: Search Engine Spiders Do Not Read JavaScript Code

Depending on how your AJAX developer builds your pages, content may or may not be written out by JavaScript. While it is fine to use JavaScript to dynamically change the visual components of a page, it is not ideal to have JavaScript write the content itself with a document.write call. As much as possible, the content and links of a webpage should be plain HTML text; search engines simply disregard content written inside JavaScript code.
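To illustrate the difference (the heading text is invented for the example), the first version hides the content from spiders while the second keeps it crawlable:

```
<!-- Invisible to spiders: the text exists only inside JavaScript -->
<script type="text/javascript">
  document.write("<h2>Our Services</h2>");
</script>

<!-- Crawlable: the same text as plain HTML -->
<h2>Our Services</h2>
```

A browser renders both identically, but only the second version is text a spider can associate with the URL.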

The SEO Solution to the AJAX Problems


Solution 1: Serve Alternative Content

If a page's navigation is generated by AJAX on every page load and wrapped up in JavaScript, one solution is to provide alternative navigation on the page; plain footer links, for instance, work well.
These alternative links should lead to actual pages at their own URLs, serving the same content the AJAX calls would load.
This way you please users with the visual appeal of AJAX while still giving search engines alternative navigation and content they can read and index. And if people land on those plain URLs, they can still be drawn into the AJAX pages.
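A minimal sketch of such footer navigation, with each link pointing at a real standalone URL (the page names are illustrative):

```
<div id="footer-nav">
  <a href="about.html">About Us</a>
  <a href="services.html">Services</a>
  <a href="contact.html">Contact</a>
</div>
```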

Solution 2: Using Hijax – Graceful Degradation of AJAX

Hijax is a concept first described by Jeremy Keith of Clear:Left, who has an excellent presentation on it. His technique goes into the specific details of how AJAX should be written so that it degrades well in less AJAX-capable browsers. Search engines behave in exactly the same way, just like a less capable browser. Here are some examples of unobtrusive JavaScript from his presentation:

  • JavaScript pseudo-protocol – Awful!
    <a href="javascript:window.open('help.html')">contextual help</a>
  • Pointless link – Bad!

    <a href="#" onclick="window.open('help.html'); return false;">contextual help</a>
  • Using the DOM – Better.
    <a href="help.html"
    onclick="window.open(this.getAttribute('href')); return false;">contextual help</a>
  • No inline JavaScript – Best!
    <a href="help.html" class="help">contextual help</a>

He also talks about externalizing JavaScript and CSS code. Another major advantage of a gracefully degrading AJAX website is that you never need to create alternative content: there is a single content source that degrades well when JavaScript is disabled, which is essentially how search engine spiders behave, like browsers with JavaScript turned off. One of the best examples I have seen of SEO-friendly AJAX implemented with Hijax techniques is AjaxOptimize.
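The "Best" pattern from the list above can be completed with script, kept in an external .js file, that hijacks the link only when JavaScript is actually available; the class name "help" and the file name are assumptions carried over from the earlier examples:

```
<!-- In the page: a real, crawlable link that works without JavaScript -->
<a href="help.html" class="help">contextual help</a>

<script type="text/javascript">
// Runs only in JavaScript-capable browsers; spiders and script-less
// browsers simply follow the plain href above.
window.onload = function () {
  var links = document.getElementsByTagName("a");
  for (var i = 0; i < links.length; i++) {
    if (links[i].className === "help") {
      links[i].onclick = function () {
        // Enhanced behavior for JavaScript users...
        window.open(this.getAttribute("href"));
        return false; // ...who then never follow the plain link.
      };
    }
  }
};
</script>
```

Because the href is a real URL, the same markup serves both audiences: spiders index help.html as its own page, while scripted browsers get the enhanced behavior.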