5 Biggest SEO Fails seen in 100+ Web Redesigns, and 2 to watch out for! [MONTHLY SUMMARY OF SEARCH]

With Hyper Dog Media turning 11 this month, we have been looking back at the most common SEO
problems created by website redesigns. On some website redesigns, we’ve been on the team preventing
these SEO killers from happening. But in the vast majority of cases, we are brought in after a web redesign kills organic – and sometimes referral – traffic.

Here are the five problems we see time and again:

1. 301 redirects of old pages

As website technologies have evolved, so have URLs. An oft-forgotten part of website redesigns is the 301 redirecting of old page locations. Traffic can shrink instantly, but the conventional wisdom was that Google would figure it out. We're not sure that approach ever worked, for anyone, but today it is absolutely vital to 301 redirect old page locations to their new equivalents.

Not only should URLs be redirected from the previous version of the site, but from ALL previous versions of the site. Doing so helps these key visitor groups stay happy:

  • Visitors that have bookmarked a page: Don’t make these folks return to Google when they could stay on your site.
  • Search engines that have ranked a page: If a page is ranking well, you don’t want to lose that!
  • Webmasters that have linked to your page: Dead links tend to get removed. But also, 301 redirects preserve the rankings boost from these inbound links.
  • Visitors to other sites that have followed a link to your page: Referral visitors are notoriously impatient when links are dead.

Dynamic content at various stages of the web's development has often meant a variety of URL suffixes: .shtml, .pl, .php, and/or many different parameters. Have you redirected these? Consider pulling ancient page URLs from analytics, archive.org, and even old backups. We've seen rankings boosts among clients that justify this level of obsession with 301 redirects!
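If you do pull together a list of ancient URLs, a short script can confirm each one actually 301s to its new home. Here's a minimal sketch in Python (standard library only); the URL pairs are placeholders you'd swap for your own mapping, and it only inspects the first hop of each redirect.

```python
# Minimal sketch: verify that legacy URLs 301-redirect to their new equivalents.
# The URL pairs below are placeholders -- swap in mappings pulled from your own
# analytics, archive.org, or old backups.
import urllib.error
import urllib.request

REDIRECT_MAP = {
    "https://www.example.com/services.shtml": "https://www.example.com/services/",
    "https://www.example.com/about.php": "https://www.example.com/about/",
}

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect the first hop."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for old_url, expected in REDIRECT_MAP.items():
    try:
        resp = opener.open(old_url, timeout=10)
        print(f"FAIL  {old_url} returned {resp.getcode()} (no redirect at all)")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        if err.code == 301 and location == expected:
            print(f"OK    {old_url} -> {location}")
        else:
            print(f"FAIL  {old_url} returned {err.code} -> {location or '(none)'}")
    except urllib.error.URLError as err:
        print(f"ERROR {old_url}: {err.reason}")
```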

2. Handling the development site


During the development phase, Google can sometimes discover new website versions. It is fascinating how many ways Google can discover content… until it finds your development site and penalizes you for duplicate content!


You blocked the development version? Excellent. Now don't forget to unblock when you go live! Whether it's a robots.txt file, password authentication, or robots meta tags on the pages, we've seen these blocking techniques go live with the new site. Make it part of your launch checklist to remove them. The consequences of lost indexed pages, traffic, and rankings are severe and all too easy to trigger.
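One way to make that unblocking step hard to forget is to script it into the launch checklist. The sketch below (Python, standard library; the site URL is a placeholder) warns if the freshly launched site still has a blanket robots.txt disallow or a leftover noindex directive on the homepage.

```python
# Post-launch sketch: warn if the live site still carries the blocking that
# protected the development version. The site URL is a placeholder.
import urllib.error
import urllib.request

SITE = "https://www.example.com"   # placeholder: your newly launched site

def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# 1. robots.txt: a blanket "Disallow: /" left over from staging blocks everything.
try:
    robots = fetch(SITE + "/robots.txt")
except urllib.error.HTTPError:
    robots = ""   # no robots.txt at all is not a blocking problem
if any(line.strip().lower() == "disallow: /" for line in robots.splitlines()):
    print("WARNING: robots.txt still contains a blanket 'Disallow: /'")

# 2. Homepage: a leftover noindex robots meta tag keeps pages out of the index.
# (A crude substring check; a real audit would parse the meta tags properly.)
homepage = fetch(SITE + "/")
if "noindex" in homepage.lower():
    print("WARNING: the homepage source contains 'noindex'")
```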


In the rush to launch a new website, the development server might be left behind. These old subdomains or subdirectories have a way of showing up, though! Make sure you nuke that old server (from space, it’s the only way to be sure!). Or, just take it offline.

3. 404 error pages

With larger web development changes, the 404 error page can disappear. Or it might start returning a 302 redirect! If your site has changed CMS, web server, or scripting languages, make sure a friendly 404 error page comes up for missing pages, has analytics code on it, and returns an HTTP 404 status code.
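A quick way to verify this after a platform change is to request a URL that cannot exist and inspect the response. A minimal sketch in Python (standard library; the domain is a placeholder):

```python
# Minimal sketch: request a page that cannot exist and confirm the server
# answers with a real 404 status, not a 200 or a redirect to the homepage.
# The domain is a placeholder.
import urllib.error
import urllib.request

TEST_URL = "https://www.example.com/this-page-should-not-exist-12345"

try:
    resp = urllib.request.urlopen(TEST_URL, timeout=10)
    # urlopen follows redirects, so landing here means the "missing" page
    # returned content directly, or redirected somewhere that did.
    print(f"PROBLEM: got {resp.getcode()} for a page that should be missing")
except urllib.error.HTTPError as err:
    if err.code == 404:
        print("OK: missing pages return a 404 status")
    else:
        print(f"PROBLEM: missing pages return {err.code}")
```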

4. Canonical tags

Canonical tags are a wonderful way to prevent duplicate content penalties. Unfortunately, some things can go wrong. We’ve seen sites that describe every version of a page as canonical, which is like communicating noise to Googlebot. It’s worse than saying nothing at all.

One technically valid implementation we've seen cause trouble is the use of relative URLs in canonical tags. We've seen a tag such as this:

<link rel="canonical" href="/services" />

show up on several subdomains and protocols, for example:

  • http://www.site.com/services
  • https://www.site.com/services
  • https://dev.site.com/services

This can confuse Googlebot, as each of those pages describes itself as the canonical version. It's best to use an absolute URL, so that both the http and https versions point to a single canonical page: <link rel="canonical" href="https://www.site.com/services" />
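To catch this during a redesign, fetch the same path over each protocol and compare what every version declares as canonical. A rough sketch in Python (standard library; the hosts and paths are placeholders, reusing the site.com example above):

```python
# Rough sketch: fetch the same path over http and https and report what each
# version declares as its canonical URL. Hosts and paths are placeholders.
import re
import urllib.request

HOSTS = ["http://www.site.com", "https://www.site.com"]
PATHS = ["/services", "/"]

def find_canonical(html):
    """Return the href of the first rel="canonical" link tag, if any."""
    for tag in re.findall(r"<link\b[^>]*>", html, re.IGNORECASE):
        if re.search(r"""rel=["']canonical["']""", tag, re.IGNORECASE):
            m = re.search(r"""href=["']([^"']+)["']""", tag, re.IGNORECASE)
            if m:
                return m.group(1)
    return None

for path in PATHS:
    for host in HOSTS:
        url = host + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError as err:
            print(f"{url}: could not fetch ({err})")
            continue
        canonical = find_canonical(html) or "(no canonical tag found)"
        note = "  <-- relative URL!" if canonical.startswith("/") else ""
        print(f"{url}: canonical = {canonical}{note}")
```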

5. Old dirty sitemap.xml files

The sitemap.xml file is an excellent way to communicate URLs to Google, along with freshness and priority. But we encounter many sitemap.xml files that are full of these problems:

  • Old, dead, or missing pages
  • URLs that redirect
  • URLs that do not match what Google can actually crawl, or that contradict the URLs listed in canonical tags
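A short script can audit the sitemap for exactly these problems: parse the file, fetch each URL without following redirects, and flag anything that is dead, redirects, or declares a different canonical. A rough sketch in Python (standard library; the sitemap URL is a placeholder):

```python
# Sketch: pull every <loc> from sitemap.xml, then flag URLs that are dead,
# that redirect, or whose canonical tag points somewhere else entirely.
# The sitemap URL is a placeholder.
import re
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface redirects instead of silently following them

opener = urllib.request.build_opener(NoRedirect)

def find_canonical(html):
    for tag in re.findall(r"<link\b[^>]*>", html, re.IGNORECASE):
        if re.search(r"""rel=["']canonical["']""", tag, re.IGNORECASE):
            m = re.search(r"""href=["']([^"']+)["']""", tag, re.IGNORECASE)
            if m:
                return m.group(1)
    return None

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    root = ET.fromstring(resp.read())

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with opener.open(url, timeout=10) as page:
            html = page.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as err:
        problem = "redirects" if err.code in (301, 302, 307, 308) else f"returns {err.code}"
        print(f"{url}: {problem}")
        continue
    canonical = find_canonical(html)
    if canonical and canonical != url:
        print(f"{url}: canonical tag points to {canonical}")
```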


And here are two more problems we expect to see in redesigns this year:

6. HTTPS Implementation

HTTPS was added as a small ranking signal in the last year, and many sites have made the switch. Or have they? Often image files, 3rd-party scripts, or other elements mean that not all page elements are served over https. Google has let this slide, but said last week that may change.
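One rough way to audit this before Google does is to scan a page's source for resources still requested over plain http. A minimal sketch in Python (standard library; the page URL is a placeholder, and a source scan won't see resources injected by JavaScript):

```python
# Rough mixed-content check: scan an HTTPS page's source for src/href values
# still requested over plain http. The URL is a placeholder. Plain-http anchor
# links to other sites are harmless navigation (review the list by hand), and
# resources injected by JavaScript at runtime will not appear in a source scan.
import re
import urllib.request

PAGE = "https://www.example.com/"

with urllib.request.urlopen(PAGE, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

insecure = re.findall(r"""(?:src|href)=["'](http://[^"']+)["']""", html, re.IGNORECASE)

if insecure:
    print(f"{len(insecure)} plain-http references found on {PAGE}:")
    for url in sorted(set(insecure)):
        print("  ", url)
else:
    print("No http:// references found in the page source.")
```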

7. Mobile Friendly pages

The mobile update ranks pages individually, so it's important to test your site's most important landing pages on mobile devices. But also check that mobile devices are actually being served the mobile version of your site: even big brands such as Noodles & Company can discover their mobile site isn't being triggered.
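Beyond spot-checking on real phones, a quick script can confirm that a mobile user agent is actually served a mobile-ready response, either a page with a viewport meta tag or a redirect to a dedicated mobile host. A rough sketch in Python (standard library; the URLs and user-agent string are placeholders):

```python
# Quick sketch: request key landing pages with a mobile user-agent and check
# that the response looks mobile-ready (viewport meta tag) or redirects to a
# dedicated mobile host. URLs and the user-agent string are placeholders.
import urllib.request

MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) "
             "AppleWebKit/600.1.3 (KHTML, like Gecko) Version/8.0 Mobile/12A4345d Safari/600.1.4")
LANDING_PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services",
]

for url in LANDING_PAGES:
    req = urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        final_url = resp.geturl()   # where any redirect (e.g. to m.example.com) ended up
        html = resp.read().decode("utf-8", errors="replace").lower()
    # Crude substring check; pages using single quotes in the tag would need a parser.
    has_viewport = '<meta name="viewport"' in html
    print(f"{url} -> {final_url}  viewport tag: {'yes' if has_viewport else 'NO'}")
```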


Websites are meant to be changed. Not only do prospects expect fresh content and design at proper intervals, but search engines do too! With Google’s newest updates, there are more changes happening than ever. Change is good. Embrace change, and redesign that site – but be careful not to make these common mistakes!


PSST! Need a Free Link? Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

The Constant Evolution of Google: What’s In, What’s Out [MONTHLY SUMMARY OF SEARCH]

Google is constantly evolving, as the internet itself has evolved. Some of the many questions Google has asked to keep itself relevant include:

  • How are people searching?
  • How should results be formatted?
  • How many answers does the user want?
  • Are recommendations from friends helpful in this search?
  • Is this a local search?


Website Quality
Google just unveiled a Panda Update the SEO community is calling “Panda version 4.2”. While the update is just beginning a months-long rollout, it is likely to be looking at many of the same technical SEO issues as previous Panda updates. There are likely many more Website Quality criteria being evaluated by Google as well.

User Experience
Metrics such as Click-Through Rate, Bounce Rate, and "Time On Site" can all give insight into the user experience on a site. These can be influenced with videos, widgets, and marketing. Does the presence of these mean a high-quality site? Not always, but Google can likely understand quite a bit, thanks to human "website quality raters," big data from analytics, YouTube, and much more.

Inbound Links
Google has invested blood, sweat and tears into cleaning up the link ecosystem. Their previous policy of ignoring poor quality inbound links meant 10 years of quick-and-dirty link building. But in the last three years, link earning and content marketing have become the best way forward. Google isn’t about to abandon inbound links as a major ranking factor in their algorithm at this point: They have invested too much into it!

Recent comments from Googlers, such as "I wouldn't focus on link building just now" and "never ask for a link," are easily misinterpreted. Google treats a good quality inbound link as a positive review, and would rather it happen organically instead of as part of a campaign. Links should happen because of the quality of the content and the helpfulness of the site: seeking undeserved positive reviews, or inbound links without earning them, has been out for some time.

Google+ Less Important
Google+ is no longer required on YouTube. This ramping down of Google+ has been happening since last year, with zero user backlash. Google+ usage was too low to provide great social signals data. Many people were forced to get an account, so the numbers were impressive – but engagement was always horribly low. It always seemed to us that SEOs, and other marketers, used it begrudgingly.

Social signals
Twitter admitted 5% of its users are likely fake, with other sources setting the number at 10% instead. That's still pretty low. And if it's easy for an independent audit to measure, Google can easily see and disregard that data.

In their social feeds, users curate content to help their audience. Links send traffic, and could be a ranking signal at some point. It's been two years since Google helped debunk a study claiming social signals influenced rankings. The future is likely to include social signals as part of the formula for some audiences.

PSST! Need a Free Link? 

Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

The SEO Implications of Getting “Hacked” [MONTHLY SUMMARY OF SEARCH]

Websites are increasingly being hacked on autopilot. Intruders are using scripts to crawl the web and infect sites running outdated or insecure software, including plugins, add-ons, and themes. Security is necessary for web marketing to be successful, and SEO is particularly vulnerable.


1. Spammy Content
Intruders typically want to use a website's existing authority in Google to push the spammiest of content, usually with affiliate links to casinos, adult content, pharmacies, etc. New pages, outside of the view of your normal website, are often created. Once Google finds this kind of spammy content on your site, your rankings can suffer. And your site might even be classified as "Adult in Nature." That can mean a complete loss of search visibility for prospects with "Safe Search" turned on in their search engine of choice! Google has said that even comments are taken into account when considering overall page content, so having entire sections of pages vulnerable can be particularly dangerous.
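One cheap early-warning habit is to periodically scan pages you own for terms you would never publish. A minimal sketch in Python (standard library); the page list and keyword list are placeholders you would tailor to your own site:

```python
# Early-warning sketch: scan pages you own for terms you would never publish.
# The page list and keyword list are placeholders; tailor both to your site,
# and remember injected doorway pages may live at URLs you don't normally link to.
import urllib.request

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]
SPAM_TERMS = ["casino", "viagra", "payday loan", "replica watches"]

for url in PAGES:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace").lower()
    hits = [term for term in SPAM_TERMS if term in html]
    if hits:
        print(f"ALERT {url}: found {', '.join(hits)}")
    else:
        print(f"clean {url}")
```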

2. Thin and Duplicate Content Penalties
The pages that intruders create are usually low quality content. To build pages of unique content on hijacked websites, shortcuts are followed. These shortcuts can mean a Google Panda penalty for your site as well! Thin pages and duplicate content matching other hacked sites are enough to set off Google’s alarms.

3. Ads and Affiliate links
With Google’s new updates centered around quality, it’s easy to also set off alarms when your site is suddenly hosting ads and affiliate links for all sorts of things. Google’s quality guidelines take into account various factors such as ads above the fold, links to known affiliate networks, etc. If these are in your intruder’s monetization strategy, your rankings in Google are very likely to suffer!

4. Over Optimized Content
Outdated and aggressive SEO techniques are still often used by intruders, and that can mean over optimization penalties as well. Repeating a keyword several times in a title tag, or endlessly in page content, is an aggressive SEO technique that used to actually work. But not with modern Google! With spammy automated content created by an intruder, hacked websites are again vulnerable to Google penalties.

5. Growth and Loss of Indexed Pages
For years, Google has been wary of sites that grow their page count by a thousand percent overnight. And when the intrusion is fixed, it can look like a massive cull to Google, as 90% of the site’s content is suddenly uncrawlable. This instability is bad both ways in the world of search engine crawlers, and can take a while to undo.

6. Spammy Inbound Links
To get the intruder's pages to rank on search engines, an automated link campaign is often created. The words "automated link campaign" carry the connotation of low quality, and that's especially true here. Links can come from other compromised websites, adult sites, and just the absolute worst of the web! There are various ways to research what links have been created, but it's difficult to catch them all. Many will have been de-indexed by Google but may still be counted against you. Link cleanup and disavowal can potentially go on for years.
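When cleanup does become necessary, Google's disavow file is just plain text: one URL or domain: line per entry, with # comments. A minimal sketch in Python that turns a list of bad linking domains (placeholders here) into that format:

```python
# Minimal sketch: turn a list of spammy linking domains into a disavow file.
# Google's disavow format is plain text: one URL or "domain:example.com" per
# line, with "#" comments. The domains below are placeholders.
BAD_DOMAINS = [
    "spammy-link-network.example",
    "hacked-forum.example",
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Spammy inbound links created during the site intrusion\n")
    for domain in sorted(set(BAD_DOMAINS)):
        f.write(f"domain:{domain}\n")

print("Wrote disavow.txt -- review it carefully before uploading to Google.")
```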

7. Getting Onto a Blacklist
There are services, Google included, that may be warning potential visitors away from your site. Google will warn them right from its search results! But antivirus programs from Norton, McAfee, and many others also scan websites. Once you are on one of their blacklists, they can block visitors outright. You won't even see those attempted visits show up and bounce in analytics; blocked visitors never get far enough to trigger your analytics code. And it can be hard to get off these blacklists, too. Most companies don't even think to check blacklists after cleaning up an intrusion.

So what can you do about this? Well, prevention is key!
When it comes to website intrusions, prevention is crucial. Even large companies do not pay enough attention to security until an intrusion happens. Software updates are just the beginning of prevention. Consider monitoring admin logins, file system changes, and more. Catching an intrusion early is vital as well: if warnings have already appeared in webmaster tools, it can be a long road back for website visibility.
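File system monitoring doesn't have to be elaborate to be useful. Here's a minimal sketch in Python (standard library) that hashes every file under the web root and reports anything added, removed, or changed since the last run; the web root path and baseline filename are placeholders:

```python
# Minimal file-integrity sketch: hash everything under the web root and
# compare against a saved baseline, so injected or modified files stand out.
# WEB_ROOT and the baseline filename are placeholders.
import hashlib
import json
import os

WEB_ROOT = "/var/www/html"
BASELINE = "file_hashes.json"

def snapshot(root):
    """Return {path: sha256} for every file under root."""
    hashes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                hashes[path] = hashlib.sha256(f.read()).hexdigest()
    return hashes

current = snapshot(WEB_ROOT)

if os.path.exists(BASELINE):
    with open(BASELINE) as f:
        baseline = json.load(f)
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(p for p in current if p in baseline and current[p] != baseline[p])
    for label, paths in (("ADDED", added), ("REMOVED", removed), ("CHANGED", changed)):
        for path in paths:
            print(f"{label}: {path}")
else:
    print("No baseline found; writing one now.")

with open(BASELINE, "w") as f:
    json.dump(current, f, indent=2)
```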

PSST! Need a Free Link?
We’d like to help you promote your own business, hoping more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE!