Doing the Pigeon (Update): SUMMARY OF SEARCH, July 2014

Last month, Google rolled out one of its largest local search updates in quite some time. Since Google didn’t name the update, Search Engine Land dubbed it the Google Pigeon Update. It’s seemingly unrelated to Google’s PigeonRank, an April Fools’ joke from back when Google did good and funny things.

This update does not penalize sites, but does change how local results are shown:
- Fewer queries are generating a map listing / “local pack”.
- More traditional SEO signals are used, such as title tags and quality inbound links.

Some interesting things are happening with this update:
- When a query includes the word “yelp”, listings from yelp.com are back at the top. This fixes a recent bug.
- Web design and SEO companies are getting shown in local queries again!

If you depend on local traffic, hopefully your results weren’t negatively impacted by the update. The best approach for local visibility includes these tasks:
- Make sure to update and create local directory listings on authority sites such as Yelp.
- Use the highest-quality photo on your Google+ business profile, and get more reviews. You might make it into the Carousel listings at the top of Google for some queries.
- Make sure your business Name, Address and Phone (NAP) are consistent on your site, Google+ business page, and local directories.
- Be sure your city/state is in your site’s title tags (a quick title-tag check is sketched just after this list).
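
On that last point, here is a minimal sketch of a title-tag spot check, assuming Python is available. The example.com URLs and the “Denver” value are placeholders, not recommendations; swap in your own pages and city/state.

    # Spot-check that each page's <title> tag mentions your city/state.
    import re
    import urllib.request

    PAGES = [
        "http://www.example.com/",           # placeholder URLs: use your own pages
        "http://www.example.com/services/",
    ]
    CITY_STATE = "Denver"  # placeholder: the local term you expect in every title

    for url in PAGES:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else "(no title tag found)"
        flag = "OK" if CITY_STATE.lower() in title.lower() else "MISSING city/state"
        print(f"{flag}: {url} -> {title}")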

And now for something good, and funny:

PSST! Need a Free Link?
We’d like to help you promote your own business, hoping that more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE!

Each newsletter includes a link idea for your business, and those sites also provide an excellent backlink. You may even get human visitors, website projects and new partners. Now THAT’s business development link building!

Penalized: Even Web Giants Aren’t Too Big To Fail

In the early days of the web, having great search engine optimization meant increased visibility for a business. Outranking the competition could lead to higher sales. Any leads that came in via the website were icing on the cake. Now, a strong web presence, leads and web-based revenue have become a vital part of business. In today’s world, the web drives a significant portion of even a brick-and-mortar business’s revenue.

For companies such as eBay.com and RetailMeNot.com, losing Google traffic overnight over a rule infraction can be a killer, but that is exactly what has happened in recent months. eBay.com lost 33% of its organic traffic after being given a “manual penalty” by Google. RetailMeNot.com lost 25% of its revenue, thanks to Google’s Panda 4.0 update.

Other penalty stories and analyses continue rolling in after the fact. Only Google truly knows what eBay did wrong. eBay employs some smart SEOs, but even they may not know everything to do, or undo. They can file a “reconsideration request” and wait, but we can only guess when the penalty will be lifted. It’s unlikely that eBay will rise to its former position anytime soon; the jig is up.

RetailMeNot, which is, ironically, funded in part by Google Ventures, at least has Google’s overall content-quality guidelines it can try to adhere to more closely. But these Panda penalties are not always cut and dried, either.

SEO is becoming more and more about risk management. Could your business afford a substantial drop in rankings? Google’s formula is continually being updated, and even practices that Google recommended in years past are now being penalized. Looking at Google’s future direction is more than a whimsical pastime for business leaders; it’s vital to ensuring future growth, or even survival.

With arbitrary rules and swift justice, it’s important to future-proof your SEO as much as possible:

  • Create content that your prospects will take the time to read, share, and discuss.
  • Market your content to other sites using social media, outreach and good old fashioned business development.
  • Don’t get clever with Google. If Google hasn’t already started penalizing a certain tactic, know that it will.
  • Stay up to date on Google’s ever-changing rules. Our Hyper Dog Media Monthly Summary of Search is a low bandwidth newsletter to keep you in Google’s good graces.

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Its Panda Updates, now more than three years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 just rolled out last month and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients’ sites. Panda is always evolving, but typically penalizes:

  1. “Thin” content: If you heard “thin is in,” think again: Google DISLIKES pages with little content. Before Panda, the recommendation was that articles be around 250 words in length. After Panda, that was raised to a minimum of roughly 450 words. As time has passed, some studies have shown Google favoring pages of 1,000 words or more! Of course, you shouldn’t sacrifice readability to meet such a quota: keep content easy to browse and skim.

    How do you Panda-proof content? Pages should be built out to 450-1,000 words. Where that’s not possible, try consolidating content, and don’t forget to 301 redirect the old locations to the new URLs! (A rough word-count check is sketched after this list.)

  2. Duplicate content: Google doesn’t like to find two pages that say the exact same thing. Google doesn’t like to find two pages that say the exact same… well, you get the point. It’s easy for sites to accidentally expose duplicate content to search engines: Tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs such as:

    http://hyperdogmedia.com/

    http://www.hyperdogmedia.com/

    http://hyperdogmedia.com/index.html

    http://www.hyperdogmedia.com/index.html

    This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one version, but onsite links to another? (A small script for checking which variants your server actually answers is sketched after this list.)
    Never fear, there are easy fixes:
    a. Block Googlebot from finding the content. Check and fix your internal links so Google never discovers the duplicate content while crawling, and/or use robots meta tags with a “NOINDEX” attribute or robots.txt.
    b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location and are usually not the right solution for duplicate content issues.
    c. Canonical tags can also help. These tags help Google sort out the final, canonical URL for the content it finds. Where content lives on multiple websites, canonical tags are still the solution: they work cross-site!

  3. Sitemap.xml files in disarray: Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:
    - Define their pages’ modification dates
    - Set priorities for pages
    - Tell Google how often each page is usually updated
    Here we are able to define explicitly what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility. For webmasters who submit (or leave submitted) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire.
    The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot! (A quick sitemap audit is sketched after this list.)
    a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
    b. Are the URLs good-quality content, or is your sitemap.xml file filled with thin, duplicate and missing pages?
    c. Also check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file?
  4. Large numbers of 404 and other crawl errors: The sitemap.xml file is just a starting point for Google’s crawling. You should certainly have your most valuable URLs in there, but know that other URLs will indeed be crawled. Watch Webmaster Tools carefully for crawl errors, and use other crawling tools such as MOZ.com to diagnose your website (the sitemap audit sketched below also flags dead URLs). Preparing your site for future Panda updates requires thinking like Googlebot. And once a website is in “tip-top shape,” ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can’t afford to rest!
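
Here is the rough word-count check mentioned under item 1: a minimal sketch, assuming Python is available. The example.com URLs are placeholders, and the 450-word floor simply mirrors the guideline above; treat it as a crude filter, not a substitute for editorial judgment.

    # Flag potentially "thin" pages with a rough visible word count.
    import re
    import urllib.request

    PAGES = [
        "http://www.example.com/page-one/",  # placeholder URLs: use your own pages
        "http://www.example.com/page-two/",
    ]
    MIN_WORDS = 450  # crude threshold, mirroring the guideline above

    for url in PAGES:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        # Drop script/style blocks, then strip remaining tags before counting words.
        text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
        text = re.sub(r"(?s)<[^>]+>", " ", text)
        words = len(text.split())
        status = "THIN" if words < MIN_WORDS else "ok"
        print(f"{status}: {url} ({words} words)")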
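
And here is the homepage-variant check mentioned under item 2: a sketch that asks each variant URL what it answers with, without following redirects, so you can see which versions return content (200) and which properly 301 to the preferred URL. The example.com domain is a placeholder.

    # See which homepage variants serve content (200) and which 301 elsewhere.
    import urllib.error
    import urllib.request

    VARIANTS = [
        "http://example.com/",               # placeholder domain: use your own
        "http://www.example.com/",
        "http://example.com/index.html",
        "http://www.example.com/index.html",
    ]

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        """Don't follow redirects, so the 301s themselves stay visible."""
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    opener = urllib.request.build_opener(NoRedirect)

    for url in VARIANTS:
        try:
            resp = opener.open(url, timeout=10)
            print(f"{resp.status} {url}")
        except urllib.error.HTTPError as err:
            # With redirects disabled, 3xx (and 4xx) responses are raised here.
            print(f"{err.code} {url} -> {err.headers.get('Location', '')}")

Ideally exactly one variant answers 200 and the rest 301 to it; anything else is a duplicate-content candidate.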
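
Finally, the sitemap audit mentioned under items 3 and 4: a sketch that pulls every <loc> URL out of a sitemap.xml and confirms it still resolves, so you aren’t handing Googlebot 404s or dead weight. The domain is a placeholder, and the script assumes a single sitemap file rather than a sitemap index.

    # List the URLs in a sitemap.xml and flag any that no longer return 200.
    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "http://www.example.com/sitemap.xml"  # placeholder location
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(urllib.request.urlopen(SITEMAP, timeout=10).read())
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    print(f"{len(urls)} URLs listed in {SITEMAP}")

    for url in urls:
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except urllib.error.HTTPError as err:
            status = err.code
        note = "" if status == 200 else "  <-- fix or drop from the sitemap"
        print(f"{status} {url}{note}")

Ideally the output is a clean column of 200s; anything else belongs on your fix list before the next Panda refresh.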
