4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measure of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and now places up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

– Position 1: 29%
– Position 2: 15%
– Position 3: 11%
– Position 4: 7%
– Position 5: 5%

The dynamics change considerably when more ads push a #2 listing down to where it might receive only 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. (For a rough sense of the impact, see the sketch at the end of this section.) It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery as to why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There are a couple of possible reasons:

– Redirects. We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be stripped when redirects are done in application code, or in a chain of redirects that crosses to HTTPS and back.

– Certain browsers block referrer information. There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
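For a rough sense of how much traffic is at stake when top-of-page ads push organic listings down, here is a back-of-the-envelope sketch in TypeScript using the 2014 click-through figures quoted above. The monthly search volume and the "pushed down" positions are made-up example inputs, not data from the study.

```typescript
// Rough estimate of organic clicks lost when top-of-page ads push a
// listing down. CTR figures are the 2014 numbers quoted above; the
// monthly search volume is a made-up example input.
const ctrByPosition: Record<number, number> = {
  1: 0.29, 2: 0.15, 3: 0.11, 4: 0.07, 5: 0.05,
};

function estimatedClicks(monthlySearches: number, effectivePosition: number): number {
  const ctr = ctrByPosition[effectivePosition] ?? 0.02; // assume ~2% below position 5
  return Math.round(monthlySearches * ctr);
}

// Example: a #2 ranking on a 10,000-search/month query, before and after
// four ads effectively push it to where position 4 used to sit.
const before = estimatedClicks(10_000, 2); // ~1,500 clicks
const after = estimatedClicks(10_000, 4);  // ~700 clicks
console.log(`Estimated clicks: ${before} -> ${after} (${before - after} lost per month)`);
```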

Speed is Everything

Page loading speed matters a great deal to Google these days. From mobile visitors to Googlebot, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient they are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take an average of 10 seconds to load.

There are many easy changes sites can make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer" – 30% of pages could save more than 250KB that way.

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least they were finally explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a clear signal that it needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as load time goes from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity – and if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign, since even reasonably fast sites often need more crawl budget, not less.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server (see the sketch below). Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: test how mobile-friendly your site is.
– GTmetrix.com: features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
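How you "add compression to your web server" depends entirely on your stack. Below is one minimal sketch that assumes a Node/Express site using the widely used compression middleware; if you run Apache or Nginx instead, the equivalent is enabling gzip/deflate in the server configuration.

```typescript
// Minimal sketch: gzip-compressing responses on a Node/Express site.
// Assumes the "express" and "compression" npm packages are installed.
import express from "express";
import compression from "compression";

const app = express();

// Compress all compressible responses (HTML, CSS, JS, JSON, ...).
app.use(compression());

// Serve the site's static assets; "public" is a hypothetical folder name.
app.use(express.static("public"));

app.listen(3000, () => {
  console.log("Serving compressed responses on http://localhost:3000");
});
```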

Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place

There are many legacy variations of the analytics snippet. In addition, what was the correct installation a couple of years ago may have changed dramatically, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which are quite a few years old. The best place – currently – for your analytics code is inside the <head> tag, right before it ends with the </head> tag. This prevents interference from other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more.

2. Filters

Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules and can be difficult to decipher. Everyone should have at least one profile with no filters; we usually put RAW in that profile's name. This makes it easy for anyone to see if a filter has "gone rogue" and is filtering out good traffic.

There are also these problems with getting good data, and you did not even cause them:

1. Incomplete data / views

Most businesses are using the free version of Google Analytics, and sometimes experience "sampling" in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of the complete data set, while returning results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords

Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam

In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – all sorts of things are being inserted into your analytics. Some of the offenders might put "Vitaly was here" in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you've never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and have created a custom analytics segment to filter them from reports. Want to try our segment?
Click this link and apply it to a view in your own account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the Internet too, but we have customized this one for our clients.)
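If you would rather maintain your own exclusion list than use a shared segment, the sketch below shows the general idea: filter exported session rows against a blocklist of known spam referrers and fake "languages". The row shape, the blocklist entries, and the example data are illustrative assumptions, not part of any Google Analytics API.

```typescript
// Sketch: scrubbing referral/language spam out of exported analytics rows.
// The SessionRow shape and the blocklists below are illustrative only.
interface SessionRow {
  source: string;   // e.g. "google", or a spam referrer
  language: string; // e.g. "en-us", or spam like "Vitaly was here"
  sessions: number;
}

const spamReferrers = new Set(["free-seo-offers.example", "buttons-for-website.example"]);
const languagePattern = /^[a-z]{2}(-[a-z]{2})?$/i; // real locales look like "en" or "en-us"

function scrub(rows: SessionRow[]): SessionRow[] {
  return rows.filter(
    (row) => !spamReferrers.has(row.source) && languagePattern.test(row.language)
  );
}

// Example usage with made-up data:
const rows: SessionRow[] = [
  { source: "google", language: "en-us", sessions: 420 },
  { source: "buttons-for-website.example", language: "en-us", sessions: 37 },
  { source: "(direct)", language: "Vitaly was here", sessions: 12 },
];
console.log(scrub(rows)); // keeps only the first row
```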

Preparing For SEO in 2017

Every year brings new SEO challenges and surprises. The year 2017 won't be any different, but we do expect these topics to be important considerations in the new year:

Interstitials / Popups on Mobile Devices

We've all seen mobile sites with a popup covering the content we were trying to read. These popups will be punished by Google in early 2017. Like ads above the fold, Google feels these popups harm the user experience – and they do not want to send visitors to such sites. Many survey and tool vendors such as Ometrics and SurveyGizmo have been proactive in making sure their clients are not at risk, but some vendors may not be aware.

SSL / HTTPS

Google is really pushing SSL, and this is the year they accelerate their plan to make the web secure. Having your entire website served over HTTPS used to be rare, and only credit card or health privacy transactions were secured. And even that was spotty. But Google has been on a campaign since 2014 to secure everything. Two years ago, Google introduced a rankings boost for sites entirely on SSL. Last year they provided better features in Search Console. And we started to see SSL as a "must have". But progress has been voluntary in many regards, with other business objectives prioritized first. Next year, new developments will force your hand: come January 2017, the Chrome browser will show increasingly dire warnings for any site that hasn't moved to HTTPS, starting with pages that have credit card or password fields, and with more dire warnings for insecure sites later in 2017.

JavaScript-based sites

There are many great reasons to use one of the new JavaScript frameworks in a web app or site: they tend to be mobile friendly and, in many cases, give a superior user experience. You've seen JavaScript search widgets on eBay and Amazon providing "faceted search" – allowing users to easily refine their searches by clicking a few checkboxes. Frameworks needing some help include Angular, Backbone, Meteor, and many of their child/related frameworks. Some frameworks, such as Angular v2, are getting better about being search engine friendly. And Google is crawling ever more JavaScript, but not well from what we've seen. Often sites need help implementing technologies such as prerender.io. We are seeing more and more of this kind of work, and expect it to accelerate in 2017.

AMP (Accelerated Mobile Pages)

AMP is the super-speedy loading of pages you've likely seen in some mobile results. After you set up AMP on your site, Google serves your content from its super-fast servers – while making it look like your URL. AMP was originally just for news sites, but now Google has opened AMP up to other sorts of sites – and 700k+ sites have been using it! If mobile traffic is important to your site, AMP will likely become vital over the next year.

Schema

Google just loves schema. We've seen over this last year how schema has helped increase pages indexed, and we expect it to play a greater role every year. As artificial intelligence is used more and more in the "RankBrain" algorithm, sites that can be easily categorized by Google will receive more visibility. I for one welcome our new overlords… subject to future review.
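As one concrete illustration of the kind of markup that is easy for Google to categorize, here is a small sketch that builds schema.org JSON-LD for a local business. The business details are made up, and LocalBusiness is just one of many schema types you might use.

```typescript
// Sketch: generating schema.org JSON-LD for a (made-up) local business.
// The business details are placeholders; swap in your own before using.
const localBusiness = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Dog Grooming",
  url: "https://www.example.com/",
  telephone: "+1-303-555-0100",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Main St",
    addressLocality: "Denver",
    addressRegion: "CO",
    postalCode: "80202",
  },
};

// Embed the markup in a page as a <script type="application/ld+json"> block.
const jsonLdTag = `<script type="application/ld+json">${JSON.stringify(localBusiness)}</script>`;
console.log(jsonLdTag);
```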
Backlinks

Links are still an important part of Google's algorithm, but sustainable, authentic link earning is always the best long-term approach to link building. So how can you get these links?

1. Content marketing: Produce great content, and reach out to authority sites and influencers in your space.
2. Business development link building: All of those traditional activities such as sponsoring a baseball team, joining the chamber, or participating in online communities/forums are actually great ways to get links.
3. Publicity: Publicity is that powerful branch of public relations that provides links and visibility from media sites.

These methods of earning links have the best long-term potential, and are quite powerful for building and keeping rankings.

More effort

Shrinking organic traffic (more ads at the top), increased competition, and the ever-changing nature of organic search require more effort than ever. Gone are the days of getting your site "SEO-ed" and expecting free traffic. All traffic is either earned, or easily taken away. May you experience a great new year with SEO!

Penguin 4 has Arrived: What We Know

It's been two years since the last Penguin penalty update. Penguin penalties were known to destroy site traffic by moving sites that were formerly on page 1 to page 4 or even page 9. Organic traffic would sometimes fall to less than 10% of previous levels, devastating revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little bit different from previous Penguin updates, which had grown increasingly harsh. Here is what we know so far:

1. Google still cares tremendously about links

We've been expecting Google to start using social media for authority at some point, but instead they keep using links as a powerful part of their algorithm. Looking at the amount of processing power, education, penalties and heat they have taken... well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors they may (or may not) take into account.

2. It's now "realtime"

As Google discovers links to your site, they will be judged as good, bad or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, for whom years of lost traffic were particularly painful.

3. Realtime can mean weeks

A few people have done the math and research in this Quora thread, and "realtime" sounds like it will mean a few weeks.

4. Penguin penalties are now at the page level, not the site level

Penguin used to penalize an entire site, impacting rankings for all keywords on all pages. This was horribly unfair, and over the years we saw several clients penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don't count

This is a return to the "old days", simpler times when webmasters didn't have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links delivered a "double whammy" to rankings: they stopped boosting rankings, and also penalized the site.

6. Disavow files are still recommended

Google still recommends using the disavow file. It helps Google identify low-quality sites, and it offers protection against a "manual penalty", where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from its bad links.

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We'll keep you updated!
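Since point 6 mentions the disavow file, here is a rough sketch of generating one from a list of domains you want to distance yourself from. The domains and file name are made-up examples; the `domain:` entries and `#` comments follow Google's documented disavow file format.

```typescript
import { writeFileSync } from "node:fs";

// Sketch: generating a disavow file from a list of low-quality linking
// domains. The domains below are made-up examples.
const badDomains = ["spammy-directory.example", "paid-links.example"];

const lines = [
  "# Disavow file generated " + new Date().toISOString().slice(0, 10),
  "# One domain: entry per line; individual URLs can also be listed.",
  ...badDomains.map((domain) => `domain:${domain}`),
];

writeFileSync("disavow.txt", lines.join("\n") + "\n");
console.log(`Wrote disavow.txt with ${badDomains.length} domains`);
```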

Google, The Internet Police Force, Aims At Mobile

Google is quickly becoming the self-appointed internet police force. To be fair, it sure is nice to have Google warn us when a website may be compromised and spreading malware. Google recently generated some false positives, but otherwise does a good job of keeping the internet a safe and happy place. Now Google is going a step further and targeting the mobile experience. With dramatic increases in mobile search over the last several years (and decreasing desktop search), Google is on a mission to identify mobile-friendly design and usability. Google is again changing the face of the web by mandating these features for sites that wish to rank highly in search results.

Text vs. Images

In the early days of the web, browsers did not support multiple typefaces/fonts. Designers used JPG and GIF images to create buttons for their menus and navigation, but search engines couldn't read the words – missing an important signal about the URLs being linked to. A compromise had to be made, and for designers it felt less than ideal. The advent of web fonts has breathed life back into web design, but it was a difficult transition for many.

Site Speed

Slow website loading times are repulsive to Google in a couple of different ways: not only are Google's crawlers tied up, but user experience suffers as well. Google can see bounce rates increase and knows they didn't deliver the "right result" in those ten blue links.

Ads Above the Fold

Google's own advertising system helped create a world of sites filled with ads. Users developed ad blindness and ad blockers, but usability still suffered. Having ads at the top of the page became a signal of poor quality to Google, and they rolled out an algorithm update specifically targeting these designs. Moving the ads meant a reduction in revenue for many sites, but changes were made to preserve the sweet flow of Google traffic.

Mobile

Google's latest improvement for the web is happening in mobile. Last fall, they started testing labels showing which results were mobile friendly, with tags next to sites on mobile devices. Google has announced a big change coming in April for their mobile search results: sites will be severely penalized for a lack of mobile usability, and labels will be given to mobile-friendly sites. It's likely that many sites will see a drop in ranking when this goes into effect. Google and Bing both understand mobile is their most important battleground for market share, and Google assures us the change means "users will find it easier to get relevant, high quality search results that are optimized for their devices." For businesses, it will be vital that all pages pass Google's Mobile-Friendly Test; check the Mobile Error Reports in Webmaster Tools and watch for common mistakes on mobile. (A quick home-grown check is sketched at the end of this post.) Not sure of next steps for your site? Time to start testing – or maybe a redesign from that "good place". Need a good interactive agency or website design firm? We've worked with agencies and designers, and we partner with the best! Talk to us about your needs, and we'll introduce you to the right match.

PSST! Need a Free Link?

Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
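As referenced in the Mobile section above, here is a rough home-grown check that looks for a responsive viewport meta tag on a page. It is only a crude heuristic, not Google's Mobile-Friendly Test; Node 18+ (for the global fetch) and the example URL are assumptions.

```typescript
// Crude heuristic: a page missing a viewport meta tag is very unlikely
// to be mobile friendly. This is NOT Google's Mobile-Friendly Test.
async function hasViewportMeta(url: string): Promise<boolean> {
  const response = await fetch(url); // Node 18+ global fetch assumed
  const html = await response.text();
  return /<meta[^>]+name=["']viewport["']/i.test(html);
}

// Example usage with a made-up URL:
hasViewportMeta("https://www.example.com/")
  .then((ok) => console.log(ok ? "Viewport meta tag found" : "No viewport meta tag – investigate"))
  .catch((err) => console.error("Fetch failed:", err));
```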

Doing the Pigeon (Update)

Last month, Google rolled out one of their largest local search updates in quite some time. Since Google didn't name the update, Search Engine Land dubbed it the Google Pigeon update. It's seemingly unrelated to Google's PigeonRank, an April Fools' joke from back when Google did good and funny things. This update does not penalize sites, but it does change how local results are shown:

– Fewer queries are generating a map listing / "local pack".
– More traditional SEO signals are used, such as title tags and quality inbound links.

Some interesting things are happening with this update:

– When a query includes the word "yelp", those listings on yelp.com are back at the top. This fixes a recent bug.
– Web design and SEO companies are getting shown in local queries again!

If you depend on local traffic, hopefully your results weren't negatively impacted by the update. The best approach for local visibility includes these tasks:

– Make sure to update and create local directory listings on authority sites such as Yelp.
– Use the highest quality photo on your Google+ business profile, and get more reviews. You might make it into the Carousel listings at the top of Google for some queries.
– Make sure your business Name, Address and Phone (NAP) are consistent on your site, Google+ business page, and local directories. (A small consistency-check sketch follows at the end of this post.)
– Be sure your city/state is in your site's title tags.

PSST! Need a Free Link?

We'd like to help you promote your own business, hoping more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE! Each newsletter contains a link idea for your business, and the featured site also provides an excellent backlink. You may even get human visitors, website projects and new partners. Now THAT's business development link building!
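As mentioned in the NAP bullet above, here is a small sketch of one way to check Name/Address/Phone consistency across listings: normalize the strings and compare them against a canonical record. The listing data and the normalization rule are simplified assumptions.

```typescript
// Sketch: checking NAP (Name, Address, Phone) consistency across listings.
// The canonical record and listing data below are made-up examples.
interface Nap { name: string; address: string; phone: string; }

const normalize = (s: string) => s.toLowerCase().replace(/[^a-z0-9]/g, "");

function matchesCanonical(listing: Nap, canonical: Nap): boolean {
  return (Object.keys(canonical) as (keyof Nap)[]).every(
    (field) => normalize(listing[field]) === normalize(canonical[field])
  );
}

const canonical: Nap = {
  name: "Example Dog Grooming",
  address: "123 Main St, Denver, CO 80202",
  phone: "(303) 555-0100",
};

const yelpListing: Nap = {
  name: "Example Dog Grooming",
  address: "123 Main Street, Denver, CO 80202", // "Street" vs "St" will flag a mismatch
  phone: "303-555-0100",
};

console.log(matchesCanonical(yelpListing, canonical) ? "Consistent" : "Mismatch – fix the listing");
```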

Changes last month in the world of Organic Search

There weren't any Penguin updates this last month either, but Google Panda 3.9 happened on July 24, 2012. We didn't see any impact on client rankings. But Google Panda updates should be a constant reminder: have you added to your site lately? Have you added something of real value to your visitors, something that will interest them, and something they will "Like" (or plus-one!)?

A Penguin v1.2 update is expected to happen any day now. With Google Penguin, websites are more vulnerable than ever to competitors practicing "negative SEO". Since the Google Penguin update actually penalizes websites for links that may not have been created by them, or for them, it is a change for the SEO industry. Some SEO companies are offering "link pruning" services, but it is quite time consuming. Webmasters of these bad websites are bordering on extortion, asking for compensation to remove links. Bing, for its part, has created a tool to disavow bad links. Google claims to be working on a similar feature in Google Webmaster Tools, but there is no news yet on when it will be ready. Some expect the tool's release to coincide with the next Penguin update.

Google sent out 20,000 "unnatural link" warnings last month, but then created some confusion by telling webmasters to ignore them. Google's Matt Cutts explains: "Fundamentally, it means we're distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it." The link building techniques he identified are:

1. "Widgetbait" – This is where sites distribute a badge or other graphic with a link back to their website. Some web stats sites send these out, and Google has noticed.
2. "Paid links" – Google wants to be the only site selling links, I think. Or maybe they just want to make sure that advertising-related links do not help rankings.
3. "Blog spam" – Blog entries and comments that are spammy detract from the web.
4. "Guestbook spam" – Guestbook / forum postings that have nothing to do with the conversation are certainly annoying, and Google does not want to encourage them with its algorithm.
5. "Excessive article directory submissions" – We do not submit to article sites. Many SEO firms have been submitting "spun" articles that resemble gibberish. Google does not see this as a good thing for the web, and is also seeking diversity of link types.
6. "Excessive link exchanges" – Google knows webmasters are likely to exchange links where it makes sense, but does not want to see this on a mass scale.
7. "Other types of linkspam" – There will always be new types of linkspam: every time there is a new type of website!

Google+

Google is also rewarding sites using their Google+ social network. If you haven't created a profile and/or switched over your Google Local/Maps profile, this is a good time to get it rolling. Need help? Let us know: we'll steer you to the right partner or help you ourselves.

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We've pinpointed 13 reasons why Google may give – or appear to give – sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.

1. Pinging. Most blog software sends out a "ping" when there is a new post. Instead of waiting for a search engine crawler to come across your site's new content – either via a routine crawl or via a link – a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content.

2. RSS feeds provide deep links to content. RSS feeds are useful for so many, many things. They contain links to your latest postings – links right to the postings themselves. Even crawlers that aren't that smart (you know who you are, little bots!) can figure out how to find a link in a list. That's essentially all an RSS feed is: a list of links in a predictable format. Hint: you subscribed to your feed in iGoogle, didn't you?

3. A standard sitemap.xml provides deep links to content. If an RSS feed isn't enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site. (There is a small sitemap sketch after this list.)

4. Based on modern HTML design standards. Most blogging software was created or updated very recently, and doesn't use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.

Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.

5. Fresh content, updated often. Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from both humans and robots alike!

6. Fresh comments, updated often. Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.

7. Keyword-rich categories, tags, and URLs. Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren't using keyword-rich categories and tags, you really should be.

Popularity: Google looks at what other sites link to your site, how important they are, and what anchor text is used.

8. RSS feeds provide syndication. RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.

9. Extra links from blog & RSS feed directories. The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.

10. Linking between bloggers / related sites. Blogrolls are lists of links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can come through blogrolls.
11. Social bookmarking technologies built in. Blog posts are usually created with links directly to social bookmarking services like delicious.com, StumbleUpon, and other social bookmarking sites. It's never been easier for your audience to share your posting and give you a link!

12. Tagging / categories with relevant words. Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.

13. Trackbacks (conversations). Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!
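As promised under reason 3, here is a small sketch of generating a sitemap.xml, including the optional changefreq and priority hints mentioned above. The URLs and values are made-up examples, and most blog platforms will generate this file for you automatically.

```typescript
// Sketch: building a minimal sitemap.xml with lastmod/changefreq/priority.
// The URLs and values below are made-up examples.
interface SitemapEntry { loc: string; lastmod: string; changefreq: string; priority: number; }

const entries: SitemapEntry[] = [
  { loc: "https://blog.example.com/", lastmod: "2009-10-01", changefreq: "daily", priority: 1.0 },
  { loc: "https://blog.example.com/13-reasons-google-loves-blogs/", lastmod: "2009-09-28", changefreq: "monthly", priority: 0.8 },
];

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  entries
    .map(
      (e) =>
        `  <url>\n    <loc>${e.loc}</loc>\n    <lastmod>${e.lastmod}</lastmod>\n` +
        `    <changefreq>${e.changefreq}</changefreq>\n    <priority>${e.priority.toFixed(1)}</priority>\n  </url>`
    )
    .join("\n") +
  `\n</urlset>\n`;

console.log(sitemap);
```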

Denver SEMPO: InHouse vs. Agency – Search Engine Marketing Insights Panel

The Denver SEMPO Meetup is hosting a panel discussion of In-House Search Marketing vs. Search Marketing Agencies this month. For all of you interested in SEO/SEM, this program will share some valuable information and experiences. The panelists are among the best SEOs from both sides of the aisle. As a top Denver SEO agency, Hyper Dog Media is also a sponsor of the program. It will be held at the Tivoli Center on the Auraria campus; you can see details below and on our Denver SEMPO Meetup page. There is a charge of $25 for the program. It will be a very informative meeting, and we'd love to see you there.

Date: October 23, 5:30-7:30
Go to the Denver SEMPO Meetup page: Denver SEMPO Meetup Group

InHouse vs. Agency – Search Engine Marketing Insights Panel

> Is there a difference between an internet marketing campaign created by an in-house marketer vs. an agency marketer?
> Are the challenges different?
> Which is more likely to be successful?

Learn the perspectives from both sides of the fence! Instead of the normal Denver SEMPO Meetup, we are going to have a panel discussion about the differences between in-house search marketers and those from agencies. Your paid RSVP gives you access to an evening of great networking opportunities with like-minded SEMers, light refreshments, and the chance to "pick the brains" of some of the top people in our profession. The following search marketing professionals will be taking questions from attendees and sharing their professional knowledge and experience in establishing, growing, and maintaining their search marketing campaigns:

In-House Search Engine Marketers:
* Everett Sizemore – Gaiam
* Jim Brown – Quark (SEMPO)
* Joe Gira – Regis University

Agency Search Engine Marketers:
* Steve Riegel – Faction Media Digital Marketing Agency (SEMPO)
* Jason Lehman – Hyper Dog Media (SEMPO)
* Nicholas Yorchak – Lee Ready (SEMPO)

The evening is certain to be worth your while. Save the date and spread the word.

To register: Denver SEMPO Panel Discussion Registration