9 Common Web Design Mistakes That Prevent Google From Indexing Your Site

Web designers frequently destroy their clients’ chances of ranking well in Google, without even knowing it! Here are nine common mistakes that can ruin a client’s chances of ranking well in Google, Yahoo or MSN – simply by preventing the site from being indexed! Search engines follow regular text links, but web designers like to use these search-engine-unfriendly navigation methods:

1. JavaScript menus
Search engines do not follow links in JavaScript reliably, if at all.

2. Imagemaps
Search engines cannot see the image, and so cannot classify the relevance or topic of the link. Lesser search engine robots do not even attempt to follow imagemap links.

3. Image links / rollover links
These links frequently contain JavaScript, but are also difficult for search engines to classify.

4. JavaScript popups
Search engines do not follow JavaScript reliably, and do not seem to like popups at all!

5. “Jump menus”
These pulldown menus usually submit a form. If the form target accepts GET requests, there is a chance that the links will be followed in some manner, but again – this isn’t reliable navigation for search engines.

6. NOSCRIPT embedded links
We were told that content in NOSCRIPT tags is for those visitors who have JavaScript off. But if you were told this means search engines, you were told wrong! This HTML tag was abused by spammers early on, and search engines do not reliably follow navigation within these tags.

7. Frames – they’re rarely done in a search-friendly manner
More on the “right way” in a later post. Frames are challenging for search engines, and we have recently seen Google penalizing frame-based sites, perhaps due to the usability challenges they can present.

8. Java
Java cannot be executed by search engines. Many early rollover effects relied on Java applets, but that navigation cannot be read by search engine robots.

9. Flash
Flash navigation cannot be followed by search engines.
Splash pages can become a dead end for search engines, and alternatives to Flash navigation should always be given. So what can you do to be sure that search engines will crawl your site? We’ll have answers in a future post, but a frequent supplement to websites that use the above techniques – meant almost entirely for search engines – is a set of footer links for search engines to follow.
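To see why plain text links matter, here’s a minimal sketch (Python standard library only; the URLs and markup are made up for illustration) of how a simple link-following robot reads a page. It only collects `href` attributes from `<a>` tags – a page whose navigation lives entirely in JavaScript yields nothing to follow, while a plain footer does:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from plain <a> tags -- the only kind of
    navigation a simple crawler can reliably follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# A page whose main navigation is built by JavaScript: the robot sees no links.
js_nav = '<body onload="buildMenu()"><script>buildMenu();</script></body>'
# The same destinations as plain text links in a footer.
footer = '<p><a href="/products.html">Products</a> <a href="/about.html">About</a></p>'

print(extract_links(js_nav))   # []
print(extract_links(footer))   # ['/products.html', '/about.html']
```

Real crawlers are far more sophisticated, but the lesson holds: if the only path to a page is a script, an imagemap, or a Flash movie, a text-only robot may never find it.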

7 untimely ways for an SEO to die

In ancient Rome, the ghosts of the ancestors were appeased during Lemuria on May 9. Not many people know that, and even fewer care. But in the spirit of Lemuria, we offer seven untimely ways an SEO can die (it’s a dangerous world out there, and also I’m low on blog posting ideas):

– Bitten by search engine crawlers.
– Trampled by googlebots (this is actually the best way to go, if you have to).
– Trip over an HTML tag someone forgot to close. (This was funnier last night when I thought of it – go figure.)
– You get (google)whacked while visiting a bad link neighborhood.
– You’re doing the googledance, slip on a banana peel and hit your head. Certainly I’m not the only one who knows the googledance? Please submit your videos if you know it: googledance@hyperdogmedia.com.
– You receive a suspicious package in the mail, and it turns out to be a googlebomb.
– Setting linkbait traps and you get an arm caught.

Please submit any other ideas you might have via email: lemuria@hyperdogmedia.com. So strike up that pun machine, it’s Friday! Update: Debra just suggested you could “overdose on link juice” – if only!

Google Hell: How the supplemental index can kill a company

Google Hell is a term being used to describe a sudden, steep drop in a website’s ranking on Google. The drop usually affects an important keyword term – or many different important keyword terms. I’m pleased to see an article on Google Hell being covered in the mainstream press. It’s a phenomenon known to many online businesses, tied heavily to changes in the Google algorithm. Some of the excellent points in the article:

– The criteria for Google’s Supplemental Index can be vague.
– “Grey-area” techniques are sometimes necessary to compete on the internet with larger stores.
– Duplicate content penalties exist!
– Newly created sites are especially vulnerable to falling into the supplemental index.
– Buying links may now be a deciding factor in whether your site ends up in the supplemental results.

The article quotes Jim Boykin and Michael Gray. Besides the great sources, it is refreshing that businesses are starting to see the importance of search engine marketing to the bottom line.

3 things NOT to do: The importance of titles in SEO

Sometimes web designers get low blood sugar, or suffer minor head injuries. The effect? Bad HTML title tags. Title tags are an important piece of real estate on your page. In properly structured HTML, it’s the first chance for you to tell human prospects and search engine visitors what your page is about. Depending on the search engine, page titles are sometimes shown prominently in results – your page is likely to be passed up if it doesn’t look relevant to the potential visitor’s search. Think about your page title as an advertisement for your website! Since I’m feeling snarky today, here are three things NOT to do when creating your title tags:

1. “Welcome to our website”
It sounds like a friendly greeting for your human visitors, but it completely ignores the wonderful gift that a title tag can be. A title tag is a chance to tell both human and search engine visitors just how helpful your content is. Use this chance to target keywords that BRING and CONVERT traffic.

2. “Untitled Page”
If your web designer is using Dreamweaver, hope that they are properly caffeinated when they are working on your page. Otherwise, they may forget to change your HTML title tag from the default. Don’t expect quality traffic when you are one of the almost ONE MILLION pages that have “Untitled Page” as their title.

3. “Welcome to Adobe GoLive”
You can probably guess where this default page title came from. Check out the ONE MILLION crappy page titles. Oh, that’s neat: version 6 is out. I think we can see what they DIDN’T improve.

What SHOULD you do in your title tags? Keywords, focused sets of keywords. More on that in a later – and less snarky – posting.
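If you want to audit your own pages for these mistakes, the check is easy to automate. Here’s a small sketch (Python standard library; the list of “bad” default titles is my own assumption, not an official list from any editor vendor) that pulls the `<title>` out of a page and flags the usual offenders:

```python
from html.parser import HTMLParser

# Default or useless titles left behind by common editors
# (assumed list for illustration).
BAD_TITLES = {"", "untitled page", "untitled document",
              "welcome to adobe golive", "welcome to our website"}

class TitleParser(HTMLParser):
    """Accumulates the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_title(html):
    parser = TitleParser()
    parser.feed(html)
    title = parser.title.strip()
    if title.lower() in BAD_TITLES:
        return f"BAD: {title!r}"
    return f"OK: {title!r}"

print(check_title("<title>Untitled Page</title>"))
print(check_title("<title>Red Widgets - Free Shipping on Widget Orders</title>"))
```

Run something like this against every page on your site and you’ll quickly find the titles your designer wrote while under-caffeinated.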

How to get indexed in Google: Be friendly, predictable for the googlebot

This post is for server geeks. Everyone else should flee. Here we are talking about the underlying status codes that a web server sends along with the HTML of your web site design whenever a page is requested. There are really only a few HTTP status codes that should ever be sent on purpose:

1. Code 200 (OK)
This status code tells browsers (and the googlebot) that everything is a-okay. The content sent with the code appears to be just what was requested. Code 200 says “Yes, I have that content right here. This is the right location for requesting it, and I’m sending it to you now.”

2. Code 301 (a permanent redirect)
A status code 301 tells the googlebot that content has moved permanently. There isn’t a penalty applied to 301 redirects in the search engines, which makes it ideal for:
– Redirecting traffic to the www version of your domain (to solve possible duplicate content issues)
– Redirecting traffic from old or broken URLs

3. Code 404 (Not Found)
A status code 404 tells visiting search engine spiders like the googlebot that the content is missing. After receiving a 404 error over several visits, most search engines will remove the page from their listings.

These are the HTTP status codes that should be sent by the server in most cases. Other status codes – like the dreaded 302 redirect – will usually only cause problems. One site we recently analyzed sent these codes when the homepage was requested:

302 (Redirected to another page)
404 (Missing. The page they were redirected to was missing!)

Then the HTML of the homepage was returned as the 404 error page. What a wild ride for the googlebot! Curious about what codes are being returned by your server? Try our new SEO Diagnostic tool, currently in beta.
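That “wild ride” is easy to model. Here’s a toy sketch (Python; the `responses` table stands in for a real server, and all the URLs are hypothetical) of a crawler resolving a URL and recording every status code it sees along the way:

```python
# Each entry maps a URL to (status_code, redirect_target_or_None).
# This table stands in for a real server; the URLs are made up.
responses = {
    "http://example.com/":      (302, "http://example.com/home"),  # temp redirect...
    "http://example.com/home":  (404, None),                       # ...to a missing page!
    "http://example.com/good/": (200, None),
}

def crawl(url, responses, max_hops=5):
    """Return the list of status codes seen while resolving `url`."""
    codes = []
    for _ in range(max_hops):
        code, location = responses[url]
        codes.append(code)
        if code in (301, 302) and location:
            url = location   # follow the redirect
        else:
            break
    return codes

print(crawl("http://example.com/", responses))       # [302, 404] -- the "wild ride"
print(crawl("http://example.com/good/", responses))  # [200]
```

Against a live site you would make real requests instead of consulting a table (for example with `urllib.request`, which reports the final status after following redirects), but the principle is the same: the homepage of a healthy site should resolve to a single clean 200, or a 301 followed by a 200.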

4 Google AdWords Tips: Save money by excluding visitors

Google AdWords opens your advertisement up to a vast audience. Sometimes it’s an audience that is a little too vast. You can save tremendous amounts of money on AdWords by excluding the wrong audience:

1. Exclude surfers during the wrong time of day
If your product or service is primarily marketed to businesses, be sure to turn off your ads during off hours. Business products and services are mostly sought during business hours, and there is little need to show ads in the evenings and on weekends.

2. Never use broad match
Broad match can be a horrible waste of money. If your broad match is for red widgets, your ad can come up in searches that include the word red, and in searches that include the word widgets. With so much of the wrong traffic – people searching for red gadgets, red ipods, etc. – there are bound to be costly clicks on your ad. Instead of using broad match, use phrase and exact match. This will help save your clicks for visitors who might actually buy your product or service.

3. Exclude keywords that are unrelated
For most any product, you can exclude some keywords. If you sell boats, you should exclude the word “toy” from most of your ads. Be creative, search Google, and look for negative keywords.

4. Exclude other countries
Make sure you are not showing ads in other countries. Some continents are also notorious for being involved in PPC fraud.

More tips to save money with Google AdWords and Yahoo Search Marketing! Get Joy Milkowski’s “Amazing Results with Google AdWords” course – it pays for itself! Or you can continue throwing extra money to Google. 🙂
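The difference between broad match and phrase match plus negatives is easier to see in code. Here’s a simplified model (Python; this is my own sketch of the matching idea, not the actual AdWords matching engine) of phrase-match logic with negative keywords:

```python
def ad_would_show(query, phrase, negatives=()):
    """Simplified phrase-match model: the ad shows only if the phrase
    appears in the query as consecutive words, in order, and no
    negative keyword is present. (Illustration only -- not the real
    AdWords matching engine.)"""
    words = query.lower().split()
    # Negative keywords veto the ad outright.
    if any(neg.lower() in words for neg in negatives):
        return False
    target = phrase.lower().split()
    # Phrase match: target words must appear consecutively in the query.
    return any(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))

print(ad_would_show("buy red widgets online", "red widgets"))          # True
print(ad_would_show("red ipods on sale", "red widgets"))               # False
print(ad_would_show("toy boats for kids", "boats", negatives=["toy"])) # False
```

Under broad match, the second query could still trigger the ad (it contains “red”); under phrase match it can’t – which is exactly the wasted-click traffic you’re trying to exclude.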