
Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include…

1. Not having your analytics snippet in the correct place. 

There are many legacy variations of the analytics snippets. In addition, what was the correct installation a couple of years ago may have dramatically changed, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which is quite a few years old. The best place – currently – for your analytics code is inside the <head> section, just before the closing </head> tag. This placement prevents interference from other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more.
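For reference, here is roughly what that placement looks like with the current asynchronous (analytics.js) snippet. The UA-XXXXXX-Y property ID and the page content are placeholders you would replace with your own:

  <head>
    <title>Your page title</title>
    <!-- other meta tags, stylesheets, and scripts -->

    <!-- Google Analytics: loads analytics.js asynchronously, then sends a pageview -->
    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-XXXXXX-Y', 'auto');  // replace with your property ID
    ga('send', 'pageview');
    </script>
  </head>

If your pages still reference urchin.js or the older ga.js library, migrating to the current snippet is worth doing at the same time you fix the placement.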

2. Filters

Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile (view) with no filters; we usually put RAW in its name. This system allows anyone to easily see if a filter has “gone rogue” and is filtering out good traffic.
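As an illustration only (the IP range and hostname below are made up, not a recommendation), a typical pair of view filters might look something like this:

  Filter 1: Exclude internal traffic
    Type: Custom > Exclude
    Filter Field: IP Address
    Pattern: ^203\.0\.113\.\d{1,3}$

  Filter 2: Include only your own hostname
    Type: Custom > Include
    Filter Field: Hostname
    Pattern: ^(www\.)?example\.com$

If you find a pattern like these in a filter and nobody can explain it, compare the filtered view against your unfiltered RAW profile before trusting the numbers in your reports.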

There are also problems with getting good data that you did not even cause:

1. Incomplete data / views

Most businesses are using the free version of Google Analytics, and sometimes experience “sampling” in important reports.

Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of a complete data set, while returning these results to you more quickly due to reduced processing time.

In Analytics, sampling can occur in your reports, during your data collection, or in both places.

(Image of sampling)
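To make the trade-off concrete, here is a small JavaScript sketch – our own toy illustration, not Google's code – showing how a 10% random sample can estimate a metric almost as well as the full data set while touching far fewer rows:

  // Toy example: estimate a conversion rate from a ~10% random sample
  // of sessions instead of reading every row.
  const sessions = Array.from({ length: 100000 }, () => ({
    converted: Math.random() < 0.03   // ~3% "true" conversion rate
  }));

  const fullRate = sessions.filter(s => s.converted).length / sessions.length;

  const sample = sessions.filter(() => Math.random() < 0.1);   // keep ~10% of sessions
  const sampledRate = sample.filter(s => s.converted).length / sample.length;

  console.log('Full data:  ' + (fullRate * 100).toFixed(2) + '%');
  console.log('10% sample: ' + (sampledRate * 100).toFixed(2) + '%');

The two numbers usually land close together, but the sampled one was computed from a tenth of the rows – that is the speed-versus-accuracy trade-off Analytics is making on your behalf.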

2. Organic keywords

Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself are restricting this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

 

3. Referral spam, organic keyword spam, language spam

In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – spammers insert all sorts of things into your analytics. Some of the offenders might put:

– “Vitally was here” in the list of languages your visitors use

– or make it look like visitors are coming in droves from some site you’ve never heard of (which is either selling SEO or hosting malware).

Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, so we have created a custom analytics segment to filter them from reports.
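The exact definition changes as new offenders appear, but the heart of a segment like that is simply an exclude rule on the traffic source. A simplified sketch – the domains below are examples of commonly reported spam referrers, not our full list – looks like this:

  Segment: Exclude referral spam
    Exclude sessions where
    Source matches regex: (semalt|darodar|buttons-for-website|best-seo-offer)\.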

Want to try our segment? Click this link and scrub your own view of your account:

https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg

(There are other great segments on the Internet too, but we have customized this one for our clients.)

 

Penguin 4 has Arrived: What We Know [Summary of Search]

It’s been two years since the last Penguin penalty update. Penguin penalties were known to destroy site traffic by pushing sites that were formerly on page 1 onto page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels, devastating revenue.

Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which appeared to get increasingly harsh.

1. Google still cares tremendously about links

We’ve been expecting Google to use social media at some point for authority, but instead they keep using links as a powerful part of their algorithm. Looking at the processing power they devote to links – and the education, penalties, and heat they have taken over them – we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors they may (or may not) take into account.

2.  It’s now “realtime”

As Google discovers links to your site, they will be judged as good, bad, or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, for whom years of lost traffic were particularly painful.

3. Realtime can mean weeks

A few people have done the math and research in this Quora thread, and it sounds like “realtime” will mean a few weeks.

4. Penguin penalties will now be on the page level, not site level

Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair and we saw several clients over the years being penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don’t count

This is a return to the “old days” – simpler times when webmasters didn’t have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links provided a “double whammy” to rankings: they stopped boosting rankings, and they also penalized the site.

6. Disavow files are still recommended

Google still recommends using the disavow file. It helps Google identify low-quality sites, and it offers protection against a “manual penalty,” where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from its bad links.
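For reference, the disavow file itself is just a plain text file uploaded through Google's Disavow Links tool: one domain or URL per line, with optional # comments. A minimal example (the domains are made up) looks like this:

  # Disavow file for www.example.com
  # Contacted the owner of spammydirectory.example on 2016-09-15, no response
  domain:spammydirectory.example
  domain:paid-links.example

  # Individual URLs can be disavowed too
  http://blog.lowqualityblog.example/spun-article.html

Remember that each upload replaces the previous file, so keep every link you still want disavowed in the current version.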

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We’ll keep you updated!

Summary of Search, August 2013


Is Google backward compatible? The previous advice from Google, given in their 2008 Starter Guide for SEO, is now “out the window.” Google previously recommended that the underlined text of a link (aka “anchor text”) contain keywords, but now finds that somewhat spammy. The new Google direction is all about authority link building, not keyword-focused link building.

It’s nice to occasionally say: “There was only one major update this month in Google.” It’s an as-yet unnamed update that changed the SERPs (Search Engine Results Pages) in a way similar to Penguin 1.0.

Google did, however, roll out an exciting new feature with this update: special placement in search results for “high-quality, in-depth content” that is properly tagged. See their example:

(Image of Google’s in-depth articles placement)

How do you take advantage of this special placement? Try this:

Tag everything to make it easy for Google to figure out (see the markup sketch after this list):

Create compelling in-depth content (so easy, right?)

  • Lengthy – Google has not given specific numbers, but we recommend text content of 1,000-3,000 words.
  • Engaging – Google is likely looking at many metrics, including time on page, as signals of engagement.
  • Popular – Content that is popular has inbound links, shares, likes, plus-ones, etc. And it probably has links to it from the site’s homepage or other important pages on the site.
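On the tagging side, Google’s announcement points to schema.org Article markup. A trimmed-down sketch – all of the values below are placeholders, not a template you must copy exactly – looks like this:

  <article itemscope itemtype="http://schema.org/Article">
    <h1 itemprop="headline">How Widgets Are Made</h1>
    <meta itemprop="alternativeHeadline" content="A 3,000-word tour of a widget factory">
    <img itemprop="image" src="/images/widget-factory.jpg" alt="Widget factory floor">
    <span itemprop="description">An in-depth look at modern widget manufacturing.</span>
    <time itemprop="datePublished" datetime="2013-08-20">August 20, 2013</time>
    <div itemprop="articleBody">
      <!-- the full 1,000-3,000+ word article goes here -->
    </div>
  </article>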

See more about the announcement at: http://insidesearch.blogspot.com/2013/08/discover-great-in-depth-articles-on.html

Google is communicating about penalties much better than in the past, too:

  • They have added a feature to Webmaster Tools which will alert webmasters if a manual penalty has been levied.
  • Recent interviews have revealed that disavowed links are not stored. This means that old disavowed links must be included in every new batch submitted.
  • Disavowing some links appears to be a normal part of modern SEO.
  • Multiple reconsideration requests are okay, and are considered independently of past requests every time.

Would you like our monthly take on the changing world of SEO delivered to your inbox?  Subscribe to the Hyper Dog Media SEO Newsletter HERE!