Penalized: Even Web Giants Aren’t Too Big To Fail

In the early days of the web, great search engine optimization simply meant increased visibility for a business. Outranking the competition could lead to higher sales, and any leads that came in via the website were icing on the cake. Now, a strong web presence, leads and web-based revenue have become a vital part of business. In today's world, the web represents a significant portion of even a brick-and-mortar business's revenue.

For companies such as ebay.com and retailmenot.com, losing Google's traffic overnight due to a rule infraction can be a killer, but that is exactly what has happened in recent months. Ebay.com lost 33% of its organic traffic after receiving a "manual penalty" from Google. Retailmenot.com lost 25% of its revenue, thanks to Google's Panda 4 Update.

Other penalty stories and analyses continue rolling in after the fact. Only Google truly knows what eBay did wrong. Ebay employs some smart SEOs, but even they may not know everything to do, or undo. They can file a "reconsideration request" and wait, but we can only guess when the penalty will be lifted. It's unlikely that eBay will rise to its former position anytime soon, now that the jig is up.

Retailmenot, which is, ironically, funded in part by Google Ventures, has overall guidelines for content quality it can try to adhere to more closely. But these Panda penalties are not always cut and dried, either.

SEO is becoming more and more about risk management. Could your business afford a substantial drop in rankings? Google’s formula is continually being updated and even practices that were recommended by Google in years past are now being penalized. Looking at Google’s future direction is more than a whimsical pastime for business leaders – it’s vital to ensuring future growth or survival.

With arbitrary rules and swift justice, it's important to future-proof your SEO as much as possible:

  • Create content that your prospects will take the time to read, share, and discuss.
  • Market your content to other sites using social media, outreach and good old fashioned business development.
  • Don’t get clever with Google. If Google hasn’t already started penalizing a certain tactic, know that it will.
  • Stay up to date on Google’s ever-changing rules. Our Hyper Dog Media Monthly Summary of Search is a low bandwidth newsletter to keep you in Google’s good graces.

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Their Panda Updates, now almost 3 years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 just rolled out last month, and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients’ sites. Panda is always evolving, but typically penalizes:

  1. "Thin" content: If you heard "thin is in," think again: Google DISLIKES pages with little content. Before Panda, the recommendation was that articles should be around 250 words in length. After Panda, that minimum rose to roughly 450 words. As time has passed, some studies have shown Google favoring pages around 1,000 words in length! Of course, you shouldn't sacrifice readability to meet such a quota: keep content easy to browse and skim.

    How do you Panda-proof content? Pages should be built out into 450-1000 words. Where that’s not possible, try consolidating content. And don’t forget to 301 redirect the old locations to the new URLs!

  2. Duplicate content: Google doesn’t like to find two pages that say the exact same thing. Google doesn’t like to find two pages that say the exact same… well, you get the point. It’s easy for sites to accidentally expose duplicate content to search engines: Tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs such as:

    http://hyperdogmedia.com/

    http://www.hyperdogmedia.com/

    http://hyperdogmedia.com/index.html

    http://www.hyperdogmedia.com/index.html

    This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one, but onsite links to another?
    Never fear, there are easy fixes:
    a. Block Googlebot from finding the duplicate content: check and fix your internal links so Google doesn't discover duplicates while crawling, and use robots meta tags with a "NOINDEX" attribute and/or robots.txt.
    b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location, and are usually not the right solution for duplicate content issues.
    c. Canonical tags can also help: they tell Google which final, canonical URL to use for the content it finds. Where the same content lives on multiple websites, canonical tags are still the solution: they work cross-site!

  3. Sitemap.xml files in disarray: Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:
    – Define their pages' modification dates
    – Set priorities for pages
    – Tell Google how often each page is usually updated
    Here we can define explicitly what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility: for webmasters who submit (or leave in place) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire.
    The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot!
    a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
    b. Are the URLs good quality content, or is your sitemap.xml file filled with thin, duplicate and missing pages?
    c. Also check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file in Webmaster Tools?
  4. Large numbers of 404 and other crawl errors: The sitemap.xml file is just a starting point for Google's crawling. You should certainly have your most valuable URLs in there, but know that other URLs will be crawled as well. Watch Webmaster Tools carefully for crawl errors, and use other crawling tools such as Moz.com to diagnose your website. Preparing your site for future Panda updates requires thinking like Googlebot. And once a website is in "tip-top shape," ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can't afford to rest!
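For the duplicate homepage URLs above, a 301 redirect at the web server is the usual fix. As a sketch, assuming an Apache server with mod_rewrite enabled (swap in your own domain), an .htaccess file like this consolidates the non-www and index.html variants onto a single canonical URL:

```apache
# Sketch only: assumes Apache with mod_rewrite enabled
RewriteEngine On

# Send http://hyperdogmedia.com/... to http://www.hyperdogmedia.com/...
RewriteCond %{HTTP_HOST} ^hyperdogmedia\.com$ [NC]
RewriteRule ^(.*)$ http://www.hyperdogmedia.com/$1 [R=301,L]

# Send /index.html to the root URL, passing link authority via a 301
RewriteRule ^index\.html$ http://www.hyperdogmedia.com/ [R=301,L]
```

Once these redirects are in place, all four homepage variants resolve to one URL, so inbound links and onsite links reinforce the same page instead of splitting authority.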
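A healthy sitemap.xml file is short and strict. This minimal sketch (the URLs and dates are placeholders) shows the fields described above: location, modification date, update frequency, and priority:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.hyperdogmedia.com/</loc>
    <lastmod>2014-06-01</lastmod>      <!-- the page's modification date -->
    <changefreq>weekly</changefreq>    <!-- how often it's usually updated -->
    <priority>1.0</priority>           <!-- relative priority, 0.0 to 1.0 -->
  </url>
  <url>
    <loc>http://www.hyperdogmedia.com/services/</loc>
    <lastmod>2014-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

List only canonical, indexable URLs here; a sitemap full of redirected, thin, or 404ing pages tells Googlebot exactly the wrong story about your site.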


Make your content easily shared, linked and read

Technical SEO is increasingly about helping people share content from a website. After all, content should be shared, linked to, and – dare I say – read. Perhaps a more appropriate term is "consumed," since content strategy increasingly includes visuals, podcasts, webinars and multimedia.

Many sites are not optimized to take full advantage of new and evolving distribution channels for existing content. The "Social Shareability" and "Social Visibility" of content can be maximized by using these techniques:

  • Share buttons: To share the specific URL being viewed.
  • Follow buttons: To follow the website’s brand on social media networks.
  • Facebook Open Graph tags, Twitter cards, Pinterest “Rich pins”: These social networks have specific tags that can be added to on-page website code. Once implemented, posts about your website will feature larger images and tailor-made descriptions to make posts more visible in newsfeeds when shared.
  • Schema.org: Google has indicated that implementation of Schema.org code on your website is of high importance. Much like the other social network cards, tags and pins, URLs using Schema.org code have much better presentation, draw more attention, and are shared more often. Schema.org can also maximize your site’s presence in search results: These tags power the review stars and other features in the search results themselves.
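The tags above all live in the page's <head>. The sketch below uses placeholder values; each network documents its own required properties (Facebook's Open Graph, Twitter Cards, and Schema.org markup in JSON-LD form, which Google accepts for rich results):

```html
<head>
  <!-- Canonical URL, so shares and links consolidate on one address -->
  <link rel="canonical" href="http://www.hyperdogmedia.com/blog/panda-proofing/">

  <!-- Facebook Open Graph: larger image and tailored description in newsfeeds -->
  <meta property="og:type" content="article">
  <meta property="og:title" content="How to Keep Google's Panda from Ruining Your Rankings">
  <meta property="og:description" content="Tips to prevent a Panda penalty on your clients' sites.">
  <meta property="og:image" content="http://www.hyperdogmedia.com/images/panda-plan.jpg">

  <!-- Twitter card: controls how the link expands in tweets -->
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="How to Keep Google's Panda from Ruining Your Rankings">

  <!-- Schema.org markup as JSON-LD; powers rich results such as review stars -->
  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "Article",
    "headline": "How to Keep Google's Panda from Ruining Your Rankings",
    "datePublished": "2014-06-15"
  }
  </script>
</head>
```

One set of tags per page, with values matching that page's content: mismatched titles and images are a common reason shares render poorly.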

Content should be readable and consumable, especially on mobile devices.
