Make sure your site is accessible when Googlebot visits

For all of the attention paid to "On-Page Relevance" and "Link Popularity", SEO begins with accessibility. Are your websites search engine friendly? Follow the checklist below to find out.

A. Search Engine Point of View

Surf your site from a search engine's point of view. Download and install Firefox and the Fangs extension. By experiencing the web as a non-sighted user, you are also experiencing the web as a search engine would. Can you navigate your site easily? Experienced SEOs already have the tools and knowledge to examine a site for the items below, but beginners may find our free tool at the SEO Diagnostic website quite helpful.

1. HTTP codes

When any web page is accessed, the server hosting the page responds with a status code. Most visitors are familiar with error 404, returned when a page is missing, but many are not aware of the full range of HTTP codes that can be returned. Are your pages returning code 200, indicating "OK"? Don't know where to start? Test your site at http://www.seodiagnostic.com, or try the quick status-check sketch after this checklist.

2. Page size

The size of your HTML page – including all images, stylesheets, and external scripts – should never exceed 90k. A recent study shows only 63% of Americans are online, and an even smaller number have a broadband connection. If your target market includes consumers, or impatient business users, it is imperative to keep the total size of your pages under control.

3. Frames

Frames have usability and accessibility challenges that are rarely overcome. If your site's HTML uses frames, have an experienced SEO see whether it can be made navigable inside a NOFRAMES tag. You may also require a site redesign.

4. Flash

Flash navigation cannot be followed easily by search engines. While Flash is becoming more search-friendly, it still poses challenges to indexing. Wherever Flash is used on a site, make sure that HTML text links also exist.

5. JavaScript navigation

JavaScript menus and rollover images make for stunning visual elements, but be sure to include navigation for visitors that cannot execute JavaScript. A NOSCRIPT tag may be indexed by some search engines, but not all engines are created equal. (A fallback example appears after this checklist.)

6. Dynamic URLs

Google's webmaster guidelines advise against dynamic URLs. Humans don't like 'em, and search engine spiders don't either! Use Apache's mod_rewrite – hire an expert if you need to – but get rid of those dynamic URLs. (A rewrite sketch appears after this checklist.)

7. Robots meta tags / robots.txt

While robots.txt may not always seem necessary, consider adding a very basic file that welcomes all spiders to index the site. Consider the relatively "open arms" policy our site has, reflected in our robots.txt:

# All robots will spider the domain, but stay out of /cgi-bin/
User-agent: *
Disallow: /cgi-bin/

# Googlebot may spider everything
User-agent: googlebot
Disallow:

8. Google Sitemaps (sitemap.xml)

Google's sitemap feature lets your site specify page locations and how often to "check back". For large sites, this can be a dream come true. For smaller sites, it is an opportunity to see your site as Google sees it: once your sitemap is active, Google will disclose any problems it had with your website. Get started in the Google Webmaster Area. (A minimal sitemap.xml appears after this checklist.)

9. Yahoo Site Explorer

Yahoo Site Explorer is another great tool to see your site as a search engine does. No sitemap creation necessary!

10. Pages indexed

The number of indexed pages is a telling measurement of your search engine accessibility. To view how many pages each major engine has indexed on your site, search for your domain with the "site:" prefix. For example, for seomoz.com you could search for site:seomoz.com (no quotes) in MSN, Yahoo, or Google. You will see that each engine has indexed a different number of pages.

11. Best practices

HTML sitemap: Include an HTML sitemap to help search engine (and other) visitors get to any page fast. An HTML sitemap is simply an HTML page with links throughout your site. Keep the number of links under 100, as Google's Webmaster Guidelines recommend, and structure them in a categorized outline that is well organized for human and search engine visitors.

CSS and JavaScript relocated to external files: External CSS and JavaScript files can be cached, reducing your bandwidth bill and leaving less code for the engines to wade through before they encounter content. For human visitors, we often think about what is "above the fold"; for search engines, try to get as much juicy content "above the fold" – early in the source – as well.

HTML validation: While search engines can navigate some of the worst HTML code, why make it any harder than it needs to be? Keep to valid HTML as much as possible.
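For item 1, here is a minimal status-check sketch in Python if you want to spot-check pages without an external tool. The URLs are placeholders, and note that urlopen follows redirects automatically, so a redirected page will report the status of its final destination.

from urllib.request import Request, urlopen
from urllib.error import HTTPError

def check_status(url):
    """Return the HTTP status code the server sends for a URL."""
    req = Request(url, headers={"User-Agent": "accessibility-check/1.0"})
    try:
        with urlopen(req) as response:
            return response.status   # 200 means "OK"
    except HTTPError as err:
        return err.code              # e.g. 404 for a missing page

# Placeholder URLs -- substitute pages from your own site.
for page in ("http://www.example.com/", "http://www.example.com/no-such-page"):
    print(page, check_status(page))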
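For item 5, one common pattern is to pair a script-driven menu with plain text links inside a NOSCRIPT block. The script name and link targets below are hypothetical; the point is simply that spiders and script-less visitors still get crawlable navigation.

<!-- Hypothetical example: JavaScript menu with a plain-HTML fallback -->
<script type="text/javascript" src="menu.js"></script>
<noscript>
  <ul>
    <li><a href="/products/">Products</a></li>
    <li><a href="/articles/">Articles</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</noscript>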
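For item 6, here is a rough sketch of what an Apache mod_rewrite rule can look like in an .htaccess file. The script name and parameter are placeholders; adapt the pattern to your own dynamic URLs.

# Hypothetical rewrite: serve /product/123 from product.php?id=123
RewriteEngine On
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L]

Once a rule like this is in place, point your internal links at the friendly /product/123 form so spiders never see the dynamic version.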
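For item 8, a minimal sitemap.xml looks roughly like this. The URLs, dates, change frequencies, and priorities are placeholders to adapt to your own site.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>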
Search engine accessibility is vitally important. First the search engines come, then the human visitors. It does not matter how pretty your site is if important visitors like the Googlebot cannot get through it. Take a good look at your site's accessibility to determine what improvements you might be able to make. And, of course, many of these accessibility metrics can be measured with our new tool, SEO Diagnostic. If you aren't sure where to start, start there!

Yahoo Warns Growth of Internet Advertising Sales Is Slowing in Key Sectors

Terry Semel, chairman and chief executive officer (CEO) of Yahoo, warned that ad sales in two key sectors – auto and financial services – are still growing, but not as quickly as the company had hoped. Yahoo was careful to note, however, that it cannot tell whether the current slowdown is a sign of broader trouble or is limited to ads from the auto and financial sectors.

How are you handling 404 pages?

Yahoo has incorporated a new feature in its Slurp bot to find out how your website handles missing pages. Yahoo may occasionally request random gibberish URLs on your server – usually with slurpconfirm404 in them – to see how missing pages are handled. Understanding how search engine robots crawl and respond to your site is vital to communicating the accessibility and relevancy of your content.

What is a 404 error?

A 404 is an HTTP status code that a web server is supposed to send when a requested page is not found. It tells the web browser (or search engine robot) that the content couldn't be found. Even a missing graphic should return a 404. Don't worry: your site correctly returns a 404 unless you (or your "web guy") have intentionally changed it to do otherwise.

What should be returned when an item is missing?

It makes sense to stick with predictable behaviors when dealing with users and search engines alike. Some sites have moved away from sending 404 errors because a bare page saying "404 error" was all they were sending, and that isn't very user-friendly. With a little tweaking, though, you can serve a branded page with a sitemap to your user while still returning the 404 status code. This simple step will help you keep more visitors and tells search engines when they have requested missing content. See more about Yahoo and slurpconfirm404.
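If your server is Apache (an assumption; the checklist above already leans on Apache's mod_rewrite), that tweak can be a single directive. The file name below is a placeholder for a branded page you create yourself.

# .htaccess sketch: serve a branded page for missing URLs while keeping the 404 status
# Use a local path -- a full http:// URL would make Apache redirect, returning 302/200 instead of 404.
ErrorDocument 404 /not-found.html

A quick way to verify the behavior is to request a gibberish URL on your own site and confirm the response code is still 404 – which is exactly what Yahoo's slurpconfirm404 probe is checking.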

How to go after a competitive niche: Top 10 Cameras on Flickr

A new article titled Top Ten Cameras on Flickr is a brilliant idea. The article has already been featured on Slashdot.org. The website it resides on will receive both excellent exposure (from a great demographic!) and tremendous page relevancy for the keywords "Top Ten Cameras". It is hard for me to think this happened by accident, of course! What makes the article so much more compelling than the other articles on that topic circulating on the internet? This one uses data from Flickr to generate the list automatically. Without that nugget, it would read the same as every other "Top Ten Cameras" post on every other camera review blog. With such an interesting twist, this article is a coup for a very competitive keyphrase. How could you create such a list for your website? What unique twist would captivate your audience?

Keeping track of multiple passwords

RSA Security’s newest password management survey found that one of the greatest threats to corporate security is the weak password. Employees who have to change their passwords too often, or who juggle too many passwords across various services, are likely to choose weak passwords or even write them on a scrap of paper near their workstation. I am a little suspicious of a survey that highlights RSA Security as the solution to this problem, but it is worth stopping to ask yourself, "Do I have too many passwords to keep track of?" Sure, too many passwords lead to "irresponsible password behavior", but a single login and password for every service is usually a bad idea too: once an intruder has access, they could wreak tremendous havoc. A sensible alternative is to choose four passwords that you can actually remember, making each one incrementally more random if possible. Use the weakest password to sign up for services that only need a password for the most rudimentary of tasks. Use the second-level password for sites that hold some personal information – your name, address, etc. Save the third-level password for sites that have your credit card on file. The final password is to be used only for online banking and/or PayPal. Gee, so simple. But who can keep track of four passwords, anyway?! Good luck out there – no one ever said good security was easy! (More information on the password survey)

Targeting the AdSense bot

Some page content can give Google’s AdSense the wrong impression of what your page is about. On a page with multiple topics, AdSense can serve ads that are neither relevant nor profitable. To get around this problem, AdSense has a little-known feature called "Section Targeting". To emphasize a particular section, enclose the text in these tags:

<!-- google_ad_section_start -->
The text to be emphasized is here
<!-- google_ad_section_end -->

To have a section of text ignored, use this variation:

<!-- google_ad_section_start(weight=ignore) -->
Text to ignore goes here
<!-- google_ad_section_end -->

There is currently no limit on how often you can use these tags, but it may take one or two weeks before you see any changes. It is important to give AdSense big chunks of text to work with – avoid marking sections of text that are too small.
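As a sketch of how the tags sit inside a real page, here is a hypothetical layout with one emphasized section and one ignored section. Only the google_ad_section comments matter; the surrounding markup and text are placeholders.

<!-- Hypothetical page using AdSense section targeting -->
<html>
  <body>
    <!-- google_ad_section_start -->
    <h1>Choosing a digital camera</h1>
    <p>The main article text that should drive ad targeting goes here...</p>
    <!-- google_ad_section_end -->

    <!-- google_ad_section_start(weight=ignore) -->
    <p>Unrelated sidebar text, navigation, or boilerplate to exclude from targeting.</p>
    <!-- google_ad_section_end -->
  </body>
</html>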

Know Thy Audience

Slashdot has an interesting discussion about the "ultimate blog post". If you want to reap the traffic (and/or server downtime) that can result from a post on Slashdot, Digg, or StumbleUpon, it is important to consider which stories get the most exposure. Also consider whether your affiliate ads or products/services are correctly targeted to the demographic that will be visiting.

Social Networking profiles as Marketing

Companies like Burger King are using MySpace and Facebook profiles to market directly to certain demographics. Because younger audiences are lucrative, and maybe a little too marketing savvy, companies are setting up profiles to reach customers in a familiar setting. Business Week has an interesting article on Burger King and Chase marketing to kids where they live.