For all of the attention paid to “On-Page Relevance” and “[tag]Link Popularity[/tag]”, SEO begins with accessibility. Are your websites [tag]search engine friendly[/tag]? Follow our checklist below to find out:
A. Search Engine Point of View
Surf your site from a search engine’s point of view. Download and install Firefox and the Fangs extension. By experiencing the web as a non-sighted user does, you are also experiencing it much as a search engine does. Can you navigate your site easily?
Experienced [tag]SEO[/tag]s already have the tools and knowledge to examine a site for the items below, but beginners may find our [tag]free[/tag] tool at the SEO Diagnostic website quite helpful.
1. [tag]HTTP codes[/tag]
When any web page is accessed, the server hosting the page responds with a code. Most visitors are familiar with [tag]error 404[/tag], returned when a page is missing, but many users are not aware of the full range of HTTP codes that can be returned. Are your pages returning a code 200, indicating “OK”? Don’t know where to start? Test your site at http://www.seodiagnostic.com.
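If you would rather check a handful of URLs yourself, the check below is a minimal sketch using only Python’s standard library (the function name is ours, not from any particular tool):

```python
import urllib.error
import urllib.request

def check_status(url):
    """Return the HTTP status code the server sends back for a URL."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx responses, but the code on the
        # exception is exactly what we want to report (e.g. 404).
        return err.code
```

A healthy page should come back as 200 (“OK”); a missing page as 404.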
2. Page size
The size of your HTML page – including all images, stylesheets, and external scripts – should never exceed 90k. A recent study shows only 63% of Americans are online, and an even smaller number have a broadband connection. If your target market includes consumers, or impatient business users, it is imperative to keep the total size of your pages under control.
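A quick way to spot-check a page’s weight is to download it and measure it. The sketch below (our own helper, not a standard tool) measures only the HTML document itself; a full audit would also total every referenced image, stylesheet, and script:

```python
import urllib.request

def page_size_kb(url):
    """Download one resource and report its size in kilobytes.
    Note: this measures the single resource only -- images, CSS, and
    scripts referenced by the page must be fetched and added separately."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read()) / 1024.0
```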
3. Frames
Frames have usability and accessibility challenges that are rarely overcome. If your site’s HTML code uses frames, have an experienced SEO see whether a NOFRAMES tag can make it navigable. You may also need a site redesign.
4. [tag]Flash[/tag]
Flash navigation cannot be followed easily by search engines. While Flash is becoming more friendly, it still poses challenges to search engine indexing. Wherever Flash is used on a site, make sure plain HTML text links also exist.
6. Dynamic URLs
Google’s webmaster guidelines advise against dynamic URLs. Humans don’t like ’em, and search engine spiders don’t either! Use [tag]Apache[/tag]’s [tag]modrewrite[/tag] – hire an expert if you need to – but get rid of those dynamic URLs!
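As one illustration of what a rewrite can look like, here is a minimal sketch for an Apache site with mod_rewrite enabled. The URL pattern and `product.php` script are hypothetical; adapt the rule to your own URLs:

```apache
RewriteEngine On
# Hypothetical example: serve a clean, static-looking URL like
# /products/42 from the underlying dynamic script /product.php?id=42.
RewriteRule ^products/([0-9]+)/?$ /product.php?id=$1 [L]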
7. Robot [tag]Metatags[/tag] /[tag]robots.txt[/tag]
While a robots.txt file may not always seem necessary, consider adding a very basic one that welcomes all spiders to index the site. Our own relatively “open arms” policy is reflected in our robots.txt:
# All robots will spider the domain
User-agent: *
Disallow:

User-Agent: googlebot
Disallow:

# Disallow directory /cgi-bin/
User-agent: *
Disallow: /cgi-bin/
8. Google [tag]Sitemap.xml[/tag]
Google’s new sitemap feature lets your site specify page locations and how often Google should check back. For large sites, this can be a dream come true. For smaller sites, it is an opportunity to see your site as [tag]Google[/tag] sees it. Once your Google sitemap is active, Google will disclose any problems it had with your website. Get started in the Google Webmaster Area.
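A sitemap file following the Sitemap protocol is a short XML document. Here is a minimal sketch with a single entry; the URL, date, and frequency are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page; <loc> is the only required child. -->
    <loc>http://www.example.com/</loc>
    <lastmod>2006-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```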
9. [tag]Yahoo Site Explorer[/tag]
Yahoo Site Explorer is another great tool to see your site as a search engine does. No sitemap creation necessary!
10. Pages Indexed
The number of indexed pages is a telling measurement of your search engine accessibility. To view the number of pages any of the major engines have indexed on your site, do a search for your domain with the “site:” prefix. For example, for seomoz.com you could search for “site:seomoz.com” (no quotes) in MSN, Yahoo, or Google. You will see that each of the engines has indexed a different number of pages!
11. [tag]Best Practices[/tag]
Include an HTML sitemap to help your search engine (and other) visitors reach any page fast. An HTML sitemap is simply an HTML page with links throughout your site. Keep the number of links under 100, as [tag]Google’s Webmaster Guidelines[/tag] recommend. Structure the links in a categorized outline, well organized for human and search engine visitors.
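For illustration, a minimal sketch of such a page (the section names and URLs are hypothetical):

```html
<!-- Hypothetical HTML sitemap: a categorized outline of plain text links. -->
<h1>Site Map</h1>
<h2>Products</h2>
<ul>
  <li><a href="/products/widgets.html">Widgets</a></li>
  <li><a href="/products/gadgets.html">Gadgets</a></li>
</ul>
<h2>Support</h2>
<ul>
  <li><a href="/support/faq.html">FAQ</a></li>
  <li><a href="/support/contact.html">Contact Us</a></li>
</ul>
```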
Moving presentation into external CSS files lets them be cached, reducing your bandwidth bill and giving the engines less code to wade through before they reach your content. For human visitors, we often think about what is “above the fold”; for search engines, try to get as much juicy content “above the fold” as possible, too.
While search engines can navigate some of the worst HTML code, why make it any harder than it needs to be? Try to keep to valid HTML as much as possible.
[tag]Search engine[/tag] accessibility is vitally important. First the search engines come, then the human visitors. It does not matter how pretty your site is if important visitors like the [tag]Googlebot[/tag] cannot get through it. Take a good look at your site’s accessibility to determine what improvements you can make.
And, of course, many of these [tag]accessibility metrics[/tag] can be measured with our new tool, SEO Diagnostic. If you aren’t sure where to start, start there!