4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations, or even decreases, while rankings stay stable.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

Position 1 – 29%
Position 2 – 15%
Position 3 – 11%
Position 4 – 7%
Position 5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects: We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or in a chain of redirects that crosses to HTTPS and back. (A small redirect-chain checker is sketched at the end of this article.)

– Certain browsers block information: There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index, and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared, or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
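Because redirect chains are one of the quieter ways referrer data disappears, it can help to see every hop for yourself. Below is a minimal sketch, assuming Python with the requests library (not something the article prescribes), that follows a URL's redirect chain and flags any hop that drops from HTTPS back to HTTP, a common point where referrer information is lost. The URL is a placeholder.

```python
import requests

def audit_redirects(url: str) -> None:
    """Follow a URL's redirect chain and flag HTTPS -> HTTP hops."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = resp.history + [resp]  # every hop, final response last
    print(f"{len(chain) - 1} redirect(s) for {url}")
    for prev, nxt in zip(chain, chain[1:]):
        downgrade = prev.url.startswith("https://") and nxt.url.startswith("http://")
        note = "  <-- drops to HTTP; referrer likely lost" if downgrade else ""
        print(f"  {prev.status_code} {prev.url} -> {nxt.url}{note}")

# Example usage with a placeholder URL:
audit_redirects("https://www.example.com/old-page")
```

Run it against your most important landing pages; a long chain, or any HTTPS-to-HTTP hop, is worth fixing regardless of the referrer question.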

Speed is Everything

Page loading speed carries great weight with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week on just how impatient they are:

– The average mobile page takes 22 seconds to load, but 53% of users abandon a page that takes longer than 3 seconds to load!
– Even mobile landing pages in AdWords were found to take an average of 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer": 30% of pages could save more than 250KB that way.

2. Ranking factor

A few years back, Google made page speed a small ranking factor, or at least was finally explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a great signal that this needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as page load time goes from 1 to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, it knows human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity, and if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites often need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server (a quick way to check is sketched after this section). Others might require a trip to Photoshop for your site's images. Some items will not be worth fixing, so concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: test how mobile-friendly your site is.
– GTmetrix.com features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck, and enjoy optimizing the speed of your site!
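Since compression is called out above as one of the cheapest wins, here is a minimal sketch, assuming Python and the requests library, that reports whether a page was served compressed and roughly estimates how much gzip could save on its body. The URL is a placeholder, and the savings figure is only an approximation of transfer size.

```python
import gzip
import requests

def compression_report(url: str) -> None:
    """Check the Content-Encoding of a page and estimate gzip savings."""
    resp = requests.get(url, headers={"Accept-Encoding": "gzip"}, timeout=10)
    encoding = resp.headers.get("Content-Encoding", "none")
    body = resp.content                      # decoded (uncompressed) body bytes
    saved_kb = (len(body) - len(gzip.compress(body))) / 1024
    print(url)
    print(f"  served with Content-Encoding: {encoding}")
    print(f"  body size: {len(body) / 1024:.0f} KB")
    print(f"  gzip would shrink the body by roughly {saved_kb:.0f} KB")

# Example usage with a placeholder URL:
compression_report("https://www.example.com/")
```

If the report shows "Content-Encoding: none" and a meaningful savings estimate, enabling gzip (or Brotli) on the web server is usually a one-line configuration change.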

Conversion is King

Content is helpful, but conversion is everything. The point of content, and of usability in general, is to meet business objectives. Any business objective can be a conversion of sorts: bookmarking, social sharing/liking, video views, time on site, lead generation, add to cart, and hopefully even completing the sale! By measuring each step, brands can understand where their site can improve its usability and contribute more to the bottom line.

1. It can be easier to increase conversion than to increase traffic

Increasing conversion also increases revenue, and can be easier than increasing traffic – up to a point.

2. Even mobile apps can easily conduct conversion optimization tests

Mobile testing platforms now allow conversion and usability testing without rolling out new versions of your app. Solutions exist from Optimizely, Visual Website Optimizer (VWO), Liquid, and Artisan Optimize Mobile App.

3. You should test EVERYTHING

User experience professionals agree: take their advice, but "always keep testing". (A simple way to check whether a test result is significant is sketched after this article.) Conversion case studies show all sorts of factors can influence conversion:

– Logos and headers
– Design style of the site
– Product page designs
– Product descriptions and overall copywriting
– The text of your call-to-action buttons
– Images
– Use of video (usually boosts conversion, but not always!)
– Purchasing path through the site

4. Website redesigns should use, not reset, your data

If the site is just awful, start with a redesign. But a website redesign that starts over can sometimes be a horrible waste: another shot in the dark, with hope and a prayer. Consider instead a redesign process based on evolving the website with small changes, continually tested for improvement. But definitely start from having your website in a "good place"!

Not sure of next steps for your site? Time to start testing – or maybe a redesign from that "good place". Need a good interactive agency or website design firm? We've worked with agencies and designers, and we partner with the best! Talk to us about your needs, and we'll introduce you to the right match.

PSST! Need a Free Link? Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

See you at SearchCon 2015! Are you interested in learning about the latest in search from the experts? Join us at SearchCon 2015 – The Digital Marketing and SEO Conference! SearchCon is April 9th and 10th and will be held at Beaver Run Resort in beautiful Breckenridge, Colorado. Register before March 2nd and take advantage of early bird pricing! http://searchcon.events/
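To make "always keep testing" a little more concrete, here is a minimal sketch of a two-proportion z-test in plain Python for comparing control and variant conversion rates. The testing platforms named above do this math for you, so treat it only as an illustration; the visitor and conversion counts are made-up numbers.

```python
import math

def ab_test_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test: returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical numbers: 200 of 10,000 control visitors converted vs. 260 of 10,000 on the variant.
z, p = ab_test_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (commonly below 0.05) suggests the difference is unlikely to be noise; otherwise, keep the test running or accept that the change made no measurable difference.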

Google: All about that mobile

Having a good mobile experience is increasingly important for websites. Advances in technology have made it possible for many more sites to be viewed on mobile devices, but the experience is usually much less pleasurable than viewing on a desktop. Google wants to change that, and is again trying to move website design in the right direction.

Google and Bing are currently locked in a battle to be the best search engine for mobile. They know users will judge them by the sites suggested during a search. When searchers encounter unusable sites from their query, they change search engines. Wouldn't you rather have ten good sites given to you from a search than a hit-and-miss list? Mobile is growing fast: comScore estimates that mobile usage will outpace desktop usage this year! Google has already started showing "Mobile Friendly" icons in search results – and has even tested "NOT Mobile Friendly" icons recently! So what to do? Here are some quick tips:

1. View your site in mobile

Try using this free testing tool from Google: https://www.google.com/webmasters/tools/mobile-friendly/ Google tells you if fonts are too small, if the "viewport" metatag is missing, and about other mobile usability errors.

2. Easy URLs

Keyword-rich URLs have lost much of their power in the last few years, and are likely to lose much more: they aren't as easy to type into a smartphone.

3. Responsive design

A responsive design is usable at any size. Previous efforts to provide different sites to different kinds of devices have failed as the many types of devices have exploded and crossed over into other categories, such as 2-in-1s and giant phones. Having several versions of your website might also have meant a nightmare in keeping all of them updated and in sync. Googlebot, in all its wisdom, couldn't figure out which version was canonical, either – or which version to return a certain user to, based on their device.

Google's new Mobile Usability reports (in Webmaster Tools) show the following issues:
– Flash content
– missing viewport (a critical meta tag for mobile pages; a small checker is sketched after this article)
– tiny fonts
– fixed-width viewports
– content not sized to viewport
– clickable links/buttons too close to each other

4. Access to site resources

Googlebot and Bingbot both want to see into your JavaScript and CSS files. It used to be a best practice to block access, and many sites have. But as time has passed, bots have missed important information about user experience: Are there ads above the fold? Is the user being redirected, or shown irrelevant content? Bots need to know, all within the framework of ranking "better" sites higher. And you cannot be "better" on mobile if the experience is bad.

Need a good interactive agency or website design firm? We've worked with many, and partnered with the best. Talk to us about your needs, and we'll introduce you to the right match!
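The missing viewport meta tag comes up constantly in those Mobile Usability reports, and it is easy to check for on your own pages. Below is a minimal sketch using only Python's standard library that fetches a page and reports whether a viewport meta tag is declared; the URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class ViewportFinder(HTMLParser):
    """Collects the content of any <meta name="viewport"> tag."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
            self.viewport = attrs.get("content", "")

def check_viewport(url: str) -> None:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    finder = ViewportFinder()
    finder.feed(html)
    if finder.viewport:
        print(f"{url}: viewport = {finder.viewport!r}")
    else:
        print(f"{url}: no viewport meta tag found")

# Example usage with a placeholder URL:
check_viewport("https://www.example.com/")
```

A typical responsive page declares something like "width=device-width, initial-scale=1"; a missing tag, or a fixed pixel width, is exactly what the Mobile Usability report will flag.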

9 ways Google is discovering the invisible web

There are many parts of the web that Googlebot has not been able to access, but Google has been working to shrink that gap. Google wants to find content, and while many webmasters do not make it easy, Googlebot finds a way.

1. Crawling Flash

Adobe announced today that they have released technology and information to Google and Yahoo enabling them to crawl Flash files. It may take the search engines some time to integrate and implement these abilities, but a time is coming when rich media is less of a liability. I wonder if MSN/Live was left out to prevent them from reverse engineering Flash for their new Silverlight competitor? At any rate, MSN is still working on accessing text links, so let's not swamp them.

2. Crawling forms

Googlebot recently started filling out forms on the web in an attempt to discover content hidden behind jump menus and other forms. See our previous article if you'd like to keep Google out of your forms.

3. Working with government entities to make information more accessible

A year or so ago, Google started providing training to government agencies to assist them in getting their information onto the web. I'm assuming much of the information has been hidden behind URLs with large numbers of parameters.

4. Crawling JavaScript

Many menus and other dynamic navigation features have been created in JavaScript, and Googlebot has started crawling those as well. Instead of relying on webmasters to provide search-friendly navigation, Google is finally getting access to sites created by neophyte webmasters who haven't been paying attention.

5. Google's patent to read text in images

Google also knows many newbie webmasters use text buttons for navigation. By attempting to read text in images, Googlebot will once again be able to open up previously inaccessible areas of a site.

6. Inbound links

Of course, Googlebot has always been great at following inbound links to new content. Much of the invisible web has been discovered just through humans linking to a previously unknown resource.

7. Submission

You can always submit a page location of currently invisible content to Google. This is usually the slowest way, especially compared to inbound links.

8. Google Toolbar visits, analytics

Recently, many Denver SEO professionals have noticed links being indexed that have not been submitted. The only plausible explanation is that Google has been mining its toolbar and analytics data for information about new URLs. Be careful – Google is watching and sees all!

9. Sitemap.xml files

The relatively new sitemap.xml protocol is very helpful for webmasters and Googlebots alike in getting formerly invisible content into Google's hands. (A tiny sitemap generator is sketched after this list.)
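For item 9, a sitemap file is simple enough to generate with a few lines of code. Here is a minimal sketch, assuming Python's standard library, that writes a bare-bones sitemap.xml following the sitemaps.org protocol; the URLs are placeholders, and real sitemaps can also carry optional tags such as <lastmod>.

```python
from xml.sax.saxutils import escape

def write_sitemap(urls: list[str], path: str = "sitemap.xml") -> None:
    """Write a minimal sitemap.xml following the sitemaps.org protocol."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    with open(path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(lines))

# Placeholder URLs for illustration:
write_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/red-widgets",
])
```

Upload the resulting file to the site root and submit it in the search engines' webmaster tools so the content no longer has to be discovered by accident.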

Search Marketing Standard: Read it twice

I'm still getting two copies of Search Marketing Standard magazine, but I'm not reporting it. First off, it's so good that I don't want to risk missing an issue by having anyone mess with my subscription. With other magazines, I've found that fulfillment centers sometimes get confused, and it's usually months before I realize a certain issue isn't coming. I just can't risk it. Every article is good. Secondly, I'll probably read through it twice. Might as well have a fresh, crisp copy the second time. I wonder if I'll even dog-ear the same pages?

Here are four excellent resources for anyone interested in SEO, internet marketing, ecommerce, and the affiliate scene:

1. Search Marketing Standard. If you've thought the SEO world moves too fast for print, think again.
2. Practical Ecommerce. Not just for ecommerce store owners. Every web developer creating ecommerce websites should be in tune with the industry.
3. Revenue. Great for affiliate marketers, ecommerce merchants, or any company creating PPC (pay-per-click) campaigns on Google AdWords or Yahoo Search Marketing.
4. Internet Retailer. Especially important if you are helping larger companies with their SEO, SEM, PPC, and ROI! This publication is best at covering industry trends influencing larger retailers and online merchants.

It is essential that web designers and website developers start paying attention to the many facets that can make or break an online business. These publications can help you serve your clients!

7 Web design techniques that are thankfully being retired

1. Frames

Frames were rarely implemented in a search-friendly manner. In the age of cellphone browsers and Section 508 compliance, frames must go.

2. IE 5 Mac hacks

Internet Explorer was a miserable little browser on every OS it ran on, but it was particularly miserable on the Mac. It required CSS hacks that other browsers tripped over, and some standards it – inexplicably – did not support. Even on Mac OS X, it sucked.

3. Splash pages

These pieces of eye candy were frequently skipped by visitors, and even more frequently cursed at under their breath. Slow-loading and pointless, it is nice to see them used less often.

4. Microsoft FrontPage Extensions

These buggy little replacements for scripting would break if you looked at them funny, and they gave years of frustration to Unix admins. Even Microsoft is turning its back on the FrontPage product, and not a day too soon.

5. Popup and pop-under windows

There are still sites that tout the effectiveness of popups and pop-unders, but let's face it: we all hate them. Every good browser tries to block them, but every once in a while you'll see one. They are the junk mail of web browsing, and it's time for them to go far, far away.

6. Animated layers that block content on page load

There are few things as annoying as a layer that suddenly slides over to block the content you are reading. They force users to dismiss the ad before reading the page content. I've gotten to where I dismiss anything that slides over without even taking the time to read the ad.

The web will be a better place when these web design techniques are no longer seen. Have others? Add a comment and let us know!

Web designers must factor in the growing impatience of web surfers

Website visitors have never been more impatient, and I'm the worst. Just today, I was looking up the lyrics to a song. I clicked on the site in the #1 position (like 90% of the rest of the world), but it was too slow. Before I even left the Google SERPs (search engine result pages), I clicked on the link in position 2. I'm going to bet I'm not the only impatient soul looking for lyrics... or even more important things (as if!). Luckily, mother Google (our gentle overlord) is paying attention. One of the items mentioned in SEOmoz's recent survey of perceived ranking factors is the availability of the server hosting a site. In this case, lyricbarn, or whatever they were called, lost a visitor and a potential AdSense click or two (ads are fun to click). Web designers – yeah, you – reduce your page load times and keep your visitors!

4 Google Adwords Tips: Save money by excluding visitors

Google AdWords opens your advertisement up to a vast audience. Sometimes it's an audience that is a little too vast. You can save tremendous amounts of money on AdWords by excluding the wrong audience:

1. Exclude surfers during the wrong time of day

If your product or service is primarily marketed to businesses, be sure to turn off your ads during off hours. Business products and services are mostly sought during business hours, and there is little need to show ads in the evenings and on weekends.

2. Never use broad match

Broad match can be a horrible waste of money. If your broad match keyword is "red widgets", your ad can come up in searches that include the word "red" and in searches that include the word "widgets". With so much of the wrong traffic – people searching for red gadgets, red iPods, etc. – there are bound to be costly clicks on your ad. Instead of using broad match, use phrase and exact match. This will help save your clicks for visitors who might actually buy your product or service.

3. Exclude keywords that are unrelated

For almost any product, there are keywords you can exclude. If you sell boats, you should exclude the word "toy" from most of your ads. Be creative, search Google, and look for negative keywords. (A small script for spotting negative-keyword candidates is sketched after this list.)

4. Exclude other countries

Make sure you are not showing ads in other countries. Some regions are also notorious for PPC fraud.

Want more tips to save money with Google AdWords and Yahoo Search Marketing? Get Joy Milkowski's "Amazing Results with Google AdWords" course – it pays for itself! Or you can continue throwing extra money to Google. 🙂
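Here is a minimal sketch of the negative-keyword hunt from tip 3, assuming Python and a search-terms report exported as CSV. The file name and the column names ("Search term", "Clicks", "Conversions") are assumptions rather than the exact AdWords export format, so adjust them to match your report. It flags queries that drew clicks but no conversions and contain none of your core terms.

```python
import csv

CORE_TERMS = {"boat", "boats"}          # words that describe what you actually sell
REPORT = "search_terms_report.csv"      # hypothetical export from your ads account

def negative_keyword_candidates(path: str) -> list[str]:
    """Return search terms with clicks, no conversions, and no core term."""
    candidates = []
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            term = row["Search term"].lower()           # assumed column name
            clicks = int(row.get("Clicks", 0))
            conversions = float(row.get("Conversions", 0))
            if clicks > 0 and conversions == 0 and not (CORE_TERMS & set(term.split())):
                candidates.append(term)
    return candidates

for term in negative_keyword_candidates(REPORT):
    print(term)
```

Review the output by hand before adding anything as a negative keyword; a query with no conversions yet is not always a query that never will convert.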

What are Google’s supplemental results and what’s the problem?

Google defines supplemental results as follows: "A supplemental result is just like a regular web result, except that it's pulled from our supplemental index. We're able to place fewer restraints on sites that we crawl for this supplemental index than we do on sites that are crawled for our main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in our main index; however, it could still be crawled and added to our supplemental index. If you're a webmaster, please note that the index in which a site is included is completely automated; there's no way to select or change the index in which a site appears. Please also be assured that the index in which a site is included doesn't affect its PageRank."

If your web pages are listed in the supplemental results, it is likely that they could not be parsed correctly by Google's standard crawler. The problem with Google's supplemental results is that they are only supplemental: pages listed there won't be returned very often for regular search queries.

How to find out if your web pages are in the supplemental results

An easy way to find out how many of your pages are listed in Google's supplemental results is to search for the following on Google.com:

site:www.domain.com ***

Search for that phrase and then proceed to the last result pages to find the supplemental results. Of course, you have to replace www.domain.com with your own domain name.

How to get out of Google's supplemental results

Most websites have pages in Google's supplemental results. It means that Google had difficulty indexing these pages or had other problems with them.

1. Make sure that your web pages don't contain any spam elements and that you don't use any spam techniques to promote your website. Using spam techniques is often the reason why a website doesn't get good rankings. Better to focus on ethical search engine optimization methods.

2. Make it easy for search engines to index your web pages. If possible, don't use page URLs that contain question marks or the & symbol (a quick way to spot such URLs is sketched at the end of this article). Make sure that the HTML code of your web pages offers what search engines need. Use IBP's Top 10 Optimizer to prepare your web pages.

3. Make these pages easy to find for Google's web crawler. The more links that point to your web pages, the more likely it is that search engine crawlers will find them. Use ARELIS to get good inbound links to your site.

Most websites have pages in Google's supplemental results. The easier you make it for Google to index your web pages, the more pages of your site will be listed in Google's normal results.
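Since parameter-heavy URLs are the concrete cause Google's own definition names, here is a minimal sketch using Python's standard library that scans a plain text file of URLs (one per line, exported from a crawl or access log; the file name and threshold are placeholders) and flags any URL carrying more than a couple of query parameters.

```python
from urllib.parse import urlparse, parse_qs

URL_LIST = "crawled_urls.txt"   # placeholder: one URL per line
THRESHOLD = 2                   # flag URLs with more parameters than this

with open(URL_LIST, encoding="utf-8") as fh:
    for line in fh:
        url = line.strip()
        if not url:
            continue
        params = parse_qs(urlparse(url).query)   # dict of query parameters
        if len(params) > THRESHOLD:
            print(f"{len(params)} parameters: {url}")
```

URLs that show up here are good candidates for rewriting into cleaner paths, or at least for trimming session IDs and tracking parameters that the crawler does not need.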