The Constant Evolution of Google: What’s In, What’s Out [MONTHLY SUMMARY OF SEARCH]

Google is constantly evolving, just as the internet itself evolves. Some of the many questions Google asks to keep itself relevant include:

  • How are people searching?
  • How should results be formatted?
  • How many answers does the user want?
  • Are recommendations from friends helpful in this search?
  • Is this a local search?


Website Quality
Google just unveiled a Panda update that the SEO community is calling “Panda version 4.2”. While the update is only beginning a months-long rollout, it is likely looking at many of the same technical SEO issues as previous Panda updates. There are likely many more website quality criteria being evaluated by Google as well.

User Experience
Metrics such as Click-Through Rate, Bounce Rate, and “Time On Site” can all give insight into the user experience on a site. These can be influenced with videos, widgets, and marketing. Does the presence of these mean a high-quality site? Not always, but it’s likely possible for Google to understand quite a bit – thanks to human “website quality raters”, big data from analytics, YouTube, and much more.

Inbound Links
Google has invested blood, sweat and tears into cleaning up the link ecosystem. Their previous policy of ignoring poor quality inbound links meant 10 years of quick-and-dirty link building. But in the last three years, link earning and content marketing have become the best way forward. Google isn’t about to abandon inbound links as a major ranking factor in their algorithm at this point: They have invested too much into it!

Recent comments from Googlers such as “I wouldn’t focus on link building just now” and “never ask for a link” are easily misinterpreted. Google treats a good quality inbound link as a positive review, and would rather have these happen organically – rather than as part of a campaign. Links should happen because of the quality of the content and the helpfulness of the site: seeking undeserved positive reviews, and inbound links without earning them, has been out for some time.

Google+ Less Important
Google+ is no longer required on YouTube. This ramping down of Google+ has been happening since last year, with zero user backlash. Google+ usage was too low to provide great social signals data. Many people were forced to get an account, so the numbers were impressive – but engagement was always horribly low. It always seemed to us that SEOs, and other marketers, used it begrudgingly.

Social signals
Twitter admitted that 5% of its users are likely fake, with other sources setting the number closer to 10%. That’s still pretty low. And if it’s easy for an independent audit to measure, Google can easily see and disregard that data.

Users curate the content in their social feeds to help their audiences. Those links send traffic, and could become a ranking signal at some point. It’s been two years since Google helped debunk a study claiming social signals influenced rankings. The future is likely to have social signals as part of the formula for some audiences.

PSST! Need a Free Link? 

Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

The SEO Implications of Getting “Hacked” [MONTHLY SUMMARY OF SEARCH]

Websites are increasingly being hacked on autopilot. Intruders are using scripts to crawl the web and infect sites running outdated or insecure software, including plugins, add-ons, and themes. Security is necessary for web marketing to be successful, and SEO is particularly vulnerable.


1. Spammy Content
Intruders typically want to use a website’s existing authority in Google to push the most spammy content. Usually with affiliate links to casinos, adult content, pharmacies, etc. New pages, outside of the view of your normal website, are often created. Once Google finds this kind of spammy content on your site, your rankings can suffer. And your site might even be classified as “Adult in Nature.” That can mean a complete loss of search viability for prospects with “Safe Search” turned on in their search engine of choice! Google has said that even comments are taken into account when considering overall page content, so having entire sections of pages vulnerable can be particularly dangerous.

2. Thin and Duplicate Content Penalties
The pages that intruders create are usually low quality content. To build pages of seemingly unique content on hijacked websites, intruders take shortcuts, and those shortcuts can mean a Google Panda penalty for your site as well! Thin pages and duplicate content matching other hacked sites are enough to set off Google’s alarms.

3. Ads and Affiliate links
With Google’s new updates centered around quality, it’s easy to also set off alarms when your site is suddenly hosting ads and affiliate links for all sorts of things. Google’s quality guidelines take into account various factors such as ads above the fold, links to known affiliate networks, etc. If these are in your intruder’s monetization strategy, your rankings in Google are very likely to suffer!

4. Over Optimized Content
Outdated and aggressive SEO techniques are still often used by intruders, and that can mean over optimization penalties as well. Repeating a keyword several times in a title tag, or endlessly in page content, is an aggressive SEO technique that used to actually work. But not with modern Google! With spammy automated content created by an intruder, hacked websites are again vulnerable to Google penalties.
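
To make that concrete, here is a minimal Python sketch of the kind of check you could run over your own title tags after an intrusion. The two-repeat threshold and the sample title are illustrative assumptions, not Google rules:

```python
import re

def keyword_stuffed(title, keyword, max_repeats=2):
    """Flag a title tag that repeats the same keyword more than max_repeats times.

    The threshold is an illustrative assumption, not a published Google limit.
    """
    occurrences = len(re.findall(re.escape(keyword.lower()), title.lower()))
    return occurrences > max_repeats

# Example: the kind of title automated intruder content might generate
print(keyword_stuffed("Cheap Pills | Buy Cheap Pills Online | Cheap Pills Store", "cheap pills"))  # True
```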

5. Growth and Loss of Indexed Pages
For years, Google has been wary of sites that grow their page count by a thousand percent overnight. And when the intrusion is fixed, it can look like a massive cull to Google, as 90% of the site’s content is suddenly uncrawlable. This instability is bad both ways in the world of search engine crawlers, and can take a while to undo.

6. Spammy Inbound Links
To get the intruder’s pages to rank on search engines, an automated link campaign is often created. The words “automated link campaign” carry the connotation of low quality, and that’s especially true here. Links can come from other compromised websites, adult sites, and just the absolute worst of the web! There are various ways to research what links have been created, but it’s difficult to catch them all. Many will have been de-indexed by Google but may still be counted against you. Link cleanup & disavowal could potentially go on for years.
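
If you do end up building a disavow file for Google’s Disavow Links tool, the format is plain text with one domain or URL per line and “#” for comments. A rough sketch, with hypothetical domains standing in for your own link research:

```python
# A minimal sketch of assembling a disavow file. The domains and URL below are
# placeholders; in practice they come from your own backlink research
# (Search Console exports, backlink tools, etc.).
bad_domains = ["spammy-casino.example", "hacked-blog.example"]
bad_urls = ["http://link-farm.example/page-123.html"]

lines = ["# Disavow file generated after intrusion cleanup"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow an entire domain
lines += bad_urls                               # or individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```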

7. Getting Onto a Blacklist
There are sites, including Google, that may be warning off potential visitors to your site. Google will warn potential visitors right from its search results! But antivirus software from Norton, McAfee, and many others also scans websites. Once you are on one of their blacklists, they can block visitors before those visitors ever reach your site, so the attempted visits never show up and bounce in analytics – the visitors are blocked before your analytics code can even fire. And it can be hard to get off these blacklists, too. Most companies don’t even think to check blacklists after cleaning up an intrusion.
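
One way to audit this yourself is Google’s Safe Browsing Lookup API (v4), which reports whether a URL is currently flagged. A minimal sketch, assuming you have a Safe Browsing API key and the requests library; the client ID and URL below are placeholders:

```python
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
endpoint = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

payload = {
    "client": {"clientId": "intrusion-audit", "clientVersion": "0.1"},
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": [{"url": "http://www.example.com/"}],
    },
}

response = requests.post(endpoint, json=payload)
# An empty "matches" field means Safe Browsing has nothing flagged for that URL.
print(response.json().get("matches", "no matches found"))
```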

So what can you do about this? Well, prevention is key!
Even large companies do not pay enough attention to security until an intrusion happens. Software updates are just the beginning of prevention: also consider monitoring admin logins, file system changes, and more, as sketched below. Catching an intrusion early is vital as well. If warnings are already showing in webmaster tools, it can be a long road back for website visibility.
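
As a starting point for that kind of monitoring, here is a minimal file-integrity sketch in Python: hash everything under your web root after a known-good deploy, then rerun on a schedule and compare. The web-root path is an assumption for your own setup:

```python
import hashlib
import json
from pathlib import Path

def snapshot(root):
    """Hash every file under `root` so later snapshots reveal unexpected changes."""
    return {
        str(p): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(root).rglob("*") if p.is_file()
    }

def diff(old, new):
    """Return files that were added, removed, or modified between two snapshots."""
    added = new.keys() - old.keys()
    removed = old.keys() - new.keys()
    changed = {p for p in old.keys() & new.keys() if old[p] != new[p]}
    return added, removed, changed

# Typical use: run once after a known-good deploy, save the baseline, then
# compare on a schedule (cron, CI, etc.) and alert on any difference.
baseline = snapshot("/var/www/html")  # path is an assumption for your server
Path("baseline.json").write_text(json.dumps(baseline))
```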

PSST! Need a Free Link?
We’d like to help you promote your own business, hoping more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE!

Google’s “Quality Update” Rewards Positive Behavior [MONTHLY SUMMARY OF SEARCH]

Google’s updates have been focused on penalizing bad behavior: low quality links, duplicate or thin content, ad-heavy pages, doorway pages, and more. But at the beginning of May, a mysterious Google update was released that looks to be more focused on boosting the right sites. Google officially claimed there was no update, then later admitted to a “quality update” – a change to the core algorithm itself!

1. Good Design 

  • Structure of your site:  Websites organized around their main audiences tend to be structured (with “silos”) around their main keywords. This site structure makes it easy for both prospects and Googlebot to understand your site. Breadcrumb navigation makes it very easy to communicate that structure, and with schema support it’s an even better idea; a small markup sketch follows this list.
  • Variety of content: Are you mixing in photos, videos, infographics, slides, and the many other kinds of content? Consider this a quiz (which is another piece of content): which of these content types are on your site?
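
As a concrete example of the breadcrumb markup mentioned above, here is a small Python sketch that emits schema.org BreadcrumbList structured data as JSON-LD; the page names and URLs are placeholders for your own site:

```python
import json

breadcrumbs = [
    ("Home", "https://www.example.com/"),
    ("Services", "https://www.example.com/services/"),
    ("SEO Audits", "https://www.example.com/services/seo-audits/"),
]

markup = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(breadcrumbs, start=1)
    ],
}

# Drop the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(markup, indent=2))
```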


2. Good Content

  • Original content/not syndicated: Syndication is more confusing than ever, despite the existence of the canonical tag. Publishing on your own site first is vital. For authority sites such as linkedin.com, business2community.com, and others, the best results come from posting a unique, shortened rewrite of your content.
  • Links to related content on your site:  When a site visitor is reading about one topic, it’s a great idea to showcase related articles. This too can have an effect, as you are linking to other articles about your main keyword or concept. Hummingbird can understand concepts, but it’s always been valuable to talk around your main keywords. Having good content all around a certain topic makes a site the authority on that topic.
  • Not too many ads, no deceptive ads:  SEO Glenn Gabe noticed that sites with ads hidden in the content sank in this Quality Update. Some sites were blending ad links in with their content, making them look very similar to links to the site’s own pages. Gabe points this out in his excellent analysis of sites hit by the Quality Update, stating, “I also saw deceiving ads that blended way too much with the content”.
  • Not thin: Thin content such as tag pages or short articles won’t make the cut. It’s ironic that tweets now sit at the top of many Google search result pages, yet Google abhors thin content. Don’t be fooled by Google’s new-found love for those 140 characters: good articles tend to be fully thought-out pieces of 1,000 words or more! You might get by with 450 words, but really try to make your content deeper articles of 650 words plus (a rough word-count check follows this list). Word count matters, though not quite as much as the quality of the piece.
  • No low quality user generated content:  User generated content sounds like a dream come true for many site owners, but policing the quality of comments, uploads, and topics can become quite a task. Sites such as answers.com and wikihow.com were hit in the Quality Update and have since begun cleanup as well.
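
As a rough starting point for the word-count check mentioned above, here is a small Python sketch that flags pages falling under the 650-word guideline. The URL list is a placeholder, and the count is approximate because it includes navigation and other boilerplate text:

```python
import re
import requests

THIN_THRESHOLD = 650  # mirrors the guideline above, not an official Google number
urls = [
    "https://www.example.com/blog/post-1/",
    "https://www.example.com/tag/widgets/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    # Strip scripts, styles, and remaining tags to get a rough text-only view
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S)
    text = re.sub(r"<[^>]+>", " ", text)
    word_count = len(text.split())
    if word_count < THIN_THRESHOLD:
        print(f"{url}: only ~{word_count} words, possibly thin")
```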

Want to learn more about the update? Visit Glenn Gabe’s analysis or this great article at SEMPost.

PSST! Need a Free Link?  Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!