3 Common, Surprising Ways to Get Banned by Google

1. Hidden text in the name of accessibility

Many web designers use a CSS technique that hides text, allegedly for accessibility: a background image is displayed in place of the text of a heading. It is commonly touted as a way to maintain accessibility while showing a graphic instead of a text header. Unfortunately, it fails in many screen readers (see the article referenced above), and the search engines consider it hidden text (a sketch of the pattern appears at the end of this section).

2. Unintentionally spamming keywords

Many sites stuff their keywords, title, and description tags with words that never appear anywhere on the page itself. The problem occurs when keywords are brainstormed separately from content development; good SEO develops the two together. Don't expect Google to simply trust that you are relevant for these keywords. How relevant can you be for a keyword you never actually use?

3. Excessive links

Webmasters have become obsessed with their Google PageRank and are trading links at an ever more furious pace. Too many outbound links on a single page make your site look more like a "link farm" than a legitimate website. If you do need to link to 100 or more sites, spread the links across separate pages. Better yet, instead of chasing an unhealthy number of links, create quality content and let the links flow in naturally. That is what Google wants to see anyway.

For Google's webmaster guidelines, visit http://www.google.com/webmasters/guidelines.html

What to do if you are banned? Google's Matt Cutts has the answer: after you have fixed your site, file a reinclusion request.
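For illustration only, here is a sketch of the kind of image-replacement rule point 1 refers to; the class name and image path are hypothetical. The real heading text is shoved off-screen and a background image is painted in its place, which is precisely the pattern search engines may treat as hidden text:

    /* Hypothetical markup: <h1 class="logo">Acme Widgets</h1> */
    .logo {
        width: 300px;                          /* size of the replacement graphic */
        height: 80px;
        background: url(logo.png) no-repeat;   /* paint the image... */
        text-indent: -9999px;                  /* ...and push the real text off-screen */
        overflow: hidden;
    }

One common alternative is a plain img tag with descriptive alt text, which keeps the visible content and the indexed content the same.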

Make sure your site is accessible when Googlebot visits

For all of the attention paid to "On-Page Relevance" and "Link Popularity", SEO begins with accessibility. Are your websites search engine friendly? Use the checklist below to find out.

A. Search Engine Point of View

Surf your site from a search engine's point of view. Download and install Firefox and the Fangs extension. By experiencing the web as a non-sighted user, you are also experiencing it much as a search engine does. Can you navigate your site easily? Experienced SEOs already have the tools and knowledge to examine a site for the items below, but beginners may find our free tool at the SEO Diagnostic website quite helpful.

1. HTTP codes

When any web page is accessed, the server hosting it responds with a status code. Most visitors are familiar with error 404, returned when a page is missing, but many are not aware of the full range of HTTP codes a server can return. Are your pages returning code 200, meaning "OK"? Don't know where to start? Test your site at http://www.seodiagnostic.com.

2. Page size

The size of your HTML page, including all images, stylesheets, and external scripts, should never exceed 90k. A recent study shows only 63% of Americans are online, and an even smaller number have a broadband connection. If your target market includes consumers, or impatient business users, it is imperative to keep the size of your pages (including images, scripts, and stylesheets) under control.

3. Frames

Frames pose usability and accessibility challenges that are rarely overcome. If your site's HTML uses frames, have an experienced SEO see whether it can be made navigable inside a NOFRAMES tag. You may also need a site redesign.

4. Flash

Flash navigation cannot be followed easily by search engines. While Flash is becoming more friendly, it still poses challenges to search engine indexing. Wherever Flash is used on a site, make sure that HTML text links also exist.

5. JavaScript navigation

JavaScript menus and rollover images make for stunning visual elements, but be sure to include navigation for visitors who cannot execute JavaScript. A NOSCRIPT tag may be indexed by some search engines, but not all engines are created equal.

6. Dynamic URLs

Google's webmaster guidelines advise against dynamic URLs. Humans don't like them, and search engine spiders don't either. Use Apache's mod_rewrite, hiring an expert if you need to, but get rid of those dynamic URLs.

7. Robots meta tags / robots.txt

While robots.txt may not always seem necessary, consider adding a very basic robots.txt file that welcomes all spiders to index the site. Consider the relatively "open arms" policy our site has, reflected in our robots.txt:

    # All robots will spider the domain
    User-agent: *
    Disallow:

    User-Agent: googlebot
    Disallow:

    # Disallow directory /cgi-bin/
    User-agent: *
    Disallow: /cgi-bin/

8. Google Sitemap.xml

Google's new sitemap feature lets your site specify page locations and how often to "check back". For large sites, this can be a dream come true. For smaller sites, it is an opportunity to see your site as Google sees it. Once your Google sitemap is active, Google will disclose any problems it had with your website. Get started in the Google Webmaster Area. (A minimal sitemap sketch appears after item 9 below.)

9. Yahoo Site Explorer

Yahoo Site Explorer is another great tool to see your site as a search engine does. No sitemap creation necessary!
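Returning to item 8 for a moment: this is a minimal sketch of what a sitemap file in the standard sitemaps.org format might look like. The domain, dates, and frequencies below are placeholders, not recommendations:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want the engines to know about -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2006-10-01</lastmod>        <!-- when the page last changed -->
        <changefreq>weekly</changefreq>      <!-- hint for how often to "check back" -->
        <priority>1.0</priority>             <!-- relative importance within this site -->
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Upload the file to your web root and point Google at it from the Google Webmaster Area.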
10. Pages indexed

The number of indexed pages is a telling measurement of your search engine accessibility. To view the number of pages any of the major engines have indexed on your site, search for your domain with the "site:" prefix. For example, for seomoz.com you could search for "site:seomoz.com" (no quotes) in MSN, Yahoo, or Google. You will see that each engine has indexed a different number of pages!

11. Best practices

HTML sitemap. Include an HTML sitemap to help your search engine (and other) visitors get to any page fast. An HTML sitemap is simply an HTML page with links throughout your site. Keep the number of links under 100, as Google's Webmaster Guidelines recommend, and structure them in a categorized outline, well organized for human and search engine visitors alike. (A bare-bones sketch appears at the end of this post.)

CSS and JavaScript relocated to external files. External CSS and JavaScript files can be cached, reducing your bandwidth bill and giving the engines less code to wade through before they encounter content. For human visitors, we often think about what is "above the fold"; for search engines, try to get as much juicy content "above the fold" as possible, too.

HTML validation. While search engines can navigate some of the worst HTML code, why make it any harder than it needs to be? Keep to valid HTML as much as possible.

Search engine accessibility is vitally important. First the search engines come, then the human visitors. It does not matter how pretty your site is if important visitors like the Googlebot cannot get through it. Take a good look at your site's accessibility to determine what improvements you can make. And, of course, many of these accessibility metrics can be measured with our new tool, SEO Diagnostic. If you aren't sure where to start, start there!
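As promised under "Best practices" above, here is a bare-bones sketch of an HTML sitemap page; the categories and page names are hypothetical. The point is simply a categorized outline of plain text links, kept under 100 per page:

    <!-- A minimal HTML sitemap: ordinary links in a categorized outline -->
    <h1>Site Map</h1>
    <h2>Products</h2>
    <ul>
      <li><a href="/products/widgets.html">Widgets</a></li>
      <li><a href="/products/gadgets.html">Gadgets</a></li>
    </ul>
    <h2>Support</h2>
    <ul>
      <li><a href="/support/faq.html">FAQ</a></li>
      <li><a href="/support/contact.html">Contact Us</a></li>
    </ul>

Link to the sitemap from every page (the footer is a common spot) so that both spiders and lost visitors can reach it in one click.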