4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google’s top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we’ve seen for organic traffic. It is also possible to “cannibalize” your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don’t forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it’s a mystery as to why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects

We’ve seen many strange redirects over the years, enough that this is worth mentioning.
Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back.

– Certain browsers block information

There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google’s index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see if a page is being visited, shared or linked to. So don’t stop pruning your site.

These four situations explain the cases we’ve found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, “Where did all of the organic traffic go?”
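Those position-by-position click shares make for easy back-of-the-napkin math. Here is a minimal sketch in Python, with the CTR figures from the 2014 study quoted above hard-coded and a made-up monthly search volume, showing what an ad-driven drop in effective position can cost:

```python
# Rough estimate of monthly organic clicks by SERP position,
# using the position CTRs from the 2014 study cited above.
CTR_BY_POSITION = {1: 0.29, 2: 0.15, 3: 0.11, 4: 0.07, 5: 0.05}

def estimated_clicks(monthly_searches, position):
    """Expected monthly organic clicks for a ranking at the given position."""
    return round(monthly_searches * CTR_BY_POSITION.get(position, 0.0))

# Example: a keyword with 10,000 searches per month.
# Ranking #1 vs. being pushed to effective #4 visibility by four top ads:
print(estimated_clicks(10_000, 1))  # 2900
print(estimated_clicks(10_000, 4))  # 700
```

The same rankings, but roughly 2,200 fewer clicks a month – which is why stable rankings can still mean shrinking organic traffic.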

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company. A typical company listing has 4–6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL to navigate. Some URLs may have up to 12 sitelinks below the primary search result!

Organic sitelinks are great for users (and for you!)

There are many key benefits to organic sitelinks:

Users can quickly and easily gain access to a better-suited landing page than the homepage. This quick navigation option is great for the user, and it reduces your organic bounce rate too.

Sitelinks provide a large presence on the search results pages. PPC Hero did some research into sitelinks and found that, while they’re not clicked as often as the primary link, they do provide additional CTR and conversions – the PPC Hero study showed a 64% increase in PPC ad click-through rate with sitelinks.

Having numerous – and well-crafted – sitelinks helps to make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 tips to get the sitelinks you want (and deserve!)

Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands what the key pages are on your site! That’s why it’s crucial that companies watch over and adjust their sitelinks. While you can’t specify sitelinks directly to Google, and they don’t disclose exactly how they choose organic sitelinks, there are key tactics you can use to get the sitelinks you want (and deserve!):

Be #1! You will typically only get sitelinks for branded searches, such as for your company name. Sometimes the #1 result will get sitelinks as well, but it’s typically branded queries.

Submit a sitemap.xml in Search Console (formerly Webmaster Tools).
This appears to be a necessary step before sitelinks are “granted” by Google.

Demote undesirable sitelinks in Search Console (formerly Webmaster Tools) if you find that any are showing up. To demote a sitelink URL:

1. On the Search Console homepage, click the site you want.
2. Under Search Appearance, click Sitelinks.
3. In the “For this search result” box, complete the URL for which you don’t want a specific sitelink URL to appear.
4. In the “Demote this sitelink URL” box, complete the URL of the sitelink you want to demote.

You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).

Look at what you’re linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements.

Googlebot seems to like lists of links, including H2 tags with links to sections or pages and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/

Use rel=nofollow. Sometimes privacy policies show up as sitelinks because they have a link on every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks.

Optimize your pages. Ideally, your best pages should already be optimized, but make sure titles and meta descriptions are in order.

Inbound links: look at where other sites are linking to (change your redirects, or do outreach to other sites and ask them to update their links).

Googlebot prefers popular pages, including landing pages with volume in analytics.

Organic sitelink takeaways

While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often very personalized and based on Google’s algorithm, it may be that certain sitelinks appear for some users, but not for others.

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
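The sitemap.xml from tip 2 doesn’t have to be hand-written. Here is a minimal sketch using only Python’s standard library – the URLs are placeholders, not a real site – that produces a document in the Sitemap protocol format ready to submit in Search Console:

```python
# Generate a minimal sitemap.xml for submission in Search Console.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap.xml document (as a string) listing the given pages."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/contact/",
])
print(sitemap_xml)
```

Keep the sitemap limited to the pages you actually want surfaced – it’s one of the few direct signals you can send about which pages matter.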

Can Google read JavaScript? Yes, but can it really?

Google will eventually crawl all JavaScript, but they haven’t been indexing JavaScript pages very successfully. Every year, we hear the same story: Google says it’s getting better at crawling and indexing JavaScript. Except crawling JavaScript and crawling ALL JavaScript are clearly two different accomplishments. Google can crawl it and render it, but just doesn’t seem to use it in the same way as optimized content. JavaScript pages can’t seem to rank as well in search engines, from what we’ve seen. Title tags come through here and there, but not consistently. Although, with the ease of development that JavaScript frameworks offer, it can be difficult to justify optimization with plain text and images. Here are some important questions to consider:

1. Fail gracefully

For visitors without JavaScript – either bot or human – offering some sort of page content has always been important. Showing plain text and image content when JavaScript is off embraces the best practice of “failing gracefully.”

2. How quickly do you want results?

For many sites, faster rankings mean a faster path to revenue. Where pure JavaScript offers a compelling business case, it could be prioritized over “search engine friendliness.” For most sites, the extra visibility is worth the extra work of optimizing in the most search-friendly ways possible.

3. Is Google responding correctly to a test?

The entire site doesn’t have to be converted to JavaScript. Instead, use simple one-page tests and check Google’s “crawlability.” Is Google understanding the DOM, and extracting titles, images and content correctly?

4. What other Google bots need to access your content?

There are actually a variety of bots across Google’s many services. Google employs specific bots for their image search, ad services, product listing feeds, etc. Try accessing these with your test.
Also, definitely keep your schema/rich snippet code easily accessible: Google has specifically warned that it cannot be found inside of JavaScript objects.

5. Test with all of Google’s tools

Speaking of Google’s bots, try using Google’s many tools for understanding and analyzing webpages. Seeing any problems here is a serious red flag for your JavaScript. But even if these render JavaScript, Google may not be ranking your pages as well as they would “search friendly” pages.

Fetch and Render: https://www.google.com/webmasters/tools/googlebot-fetch (must be verified and logged into Google Search Console)
PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/
Mobile-Friendly Test: https://www.google.com/webmasters/tools/mobile-friendly/
Keyword Planner: https://adwords.google.com/ko/KeywordPlanner/Home (ask Google to fetch the keywords from your landing page)

Bing is rising

Google isn’t the only search engine in town. Even without Yahoo and AOL numbers, Bing’s market share has been increasing steadily year over year. Bing had 21.4 percent market share last year, not counting partnerships with Apple, Yahoo or AOL. That’s getting to be a huge chunk of users. Bing especially has trouble with images inside JavaScript objects. Bing’s version of the fetch-and-render tool may display a rendered page, but Bing isn’t going to show images in its image results, and the regular results will be inconsistent.

Social Media

Plain text and image content is also ideal for social media sharing. When a page is shared, most social media sites can parse the simple text description and image right out – unless there is JavaScript. Rich snippets such as Open Graph and Twitter Cards could help for the established social networks – but with new social networks (WhatsApp, Snapchat, etc.) popping up every year, it is best to expose the page content as plain text.

Google’s JavaScript support is constantly improving.
Having a JavaScript app on the landing page is often needlessly complex. As of this writing, having an optimized version does appear to still be necessary. Maybe next year’s announcement that Google is crawling JavaScript will be followed by a more robust crawl, but there are plenty of other sites embracing “search engine friendliness”; your site should too, in order to be competitive.
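You can run a rough version of the “fail gracefully” check from question 1 yourself: parse the raw HTML the way a non-rendering crawler would – no JavaScript executed – and see what is actually there. A minimal sketch in Python using only the standard library; the HTML strings are inline stand-ins for fetched pages:

```python
# What does a non-rendering crawler see? Parse raw HTML (no JS executed)
# and pull out the title as a quick "fails gracefully" check.
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A JS-only page: nothing useful in the raw HTML before rendering.
js_page = TitleGrabber()
js_page.feed("<html><head><title></title></head>"
             "<body><script>renderApp()</script></body></html>")
print(repr(js_page.title))  # ''

# A static page: the title survives without any rendering at all.
static_page = TitleGrabber()
static_page.feed("<html><head><title>Blue Widgets | Example Co.</title></head>"
                 "<body><h1>Blue Widgets</h1></body></html>")
print(repr(static_page.title))  # 'Blue Widgets | Example Co.'
```

If the title and body text only exist after JavaScript runs, every non-rendering bot – and several of Google’s specialty bots – sees the empty version.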

Denver SEMPO: InHouse vs. Agency – Search Engine Marketing Insights Panel

Denver SEMPO is hosting an excellent panel discussion

The Denver SEMPO Meetup is hosting a panel discussion of In-House Search Marketing vs. Search Marketing Agencies this month. For all of you interested in SEO / SEM, this program will have some valuable information and experiences shared. The panelists are among some of the best SEOs from both sides of the aisle. As a top Denver SEO Agency, Hyper Dog Media is also a sponsor of the program. It’s going to be at the Tivoli Center on the Auraria campus. You can see details below and on our Denver SEMPO Meetup page. There is also a charge of $25 for the program. It will be a very informative meeting. We’d love to see you there.

Date: October 23 — 5:30-7:30

Go to the Denver SEMPO Meetup page: Denver SEMPO Meetup Group

InHouse vs. Agency – Search Engine Marketing Insights Panel

> Is there a difference between an internet marketing campaign created by an In-House Marketer vs. an Agency Marketer?
> Are the challenges different?
> Which is more likely to be successful?

Learn the perspectives from both sides of the fence! Instead of the normal Denver SEMPO Meetup, we are going to have a panel discussion concerning the difference between in-house search marketers and those from agencies. Your paid RSVP gives you access to an evening of great networking opportunities with like-minded SEM’ers, light refreshments and the chance to “pick the brains” of some of the top people in our profession.
The following search marketing professionals will be taking questions from attendees and sharing their professional knowledge and experience in establishing, growing and maintaining their search marketing campaigns:

In-House Search Engine Marketers:
* Everett Sizemore – Gaiam
* Jim Brown – Quark (SEMPO)
* Joe Gira – Regis University

Agency Search Engine Marketers:
* Steve Riegel – Faction Media Digital Marketing Agency (SEMPO)
* Jason Lehman – Hyper Dog Media (SEMPO)
* Nicholas Yorchak – Lee Ready (SEMPO)

The evening is certain to be worth your while. Save the date and spread the word.

To Register: Denver SEMPO Panel Discussion Registration

5 web development techniques to prevent Google from crawling your HTML forms

Google has recently decided to let its Googlebot crawl through forms in an effort to index the “Deep Web”. There are numerous stories about wayward crawlers deleting and changing content through submitting forms, and it’s about to get worse. Googlebot is about to start submitting forms in an effort to get to your website’s deeper data. So what’s a web developer to do?

1. Use GET and POST requests correctly

Use GET requests in forms to look up information; use POST requests to make changes. Google will only be crawling forms via GET requests, so following this best practice for forms is vital.

2. Make sure your POST forms do not respond to GET requests

It sounds so simple, but many sites are being exploited for XSS (Cross-Site Scripting) vulnerabilities because they respond (and return HTML) to both GET and POST requests. Be sure to check your form input carefully on the backend, and for heaven’s sake – do not use globals!

3. Use robots.txt to keep robots OUT

A robots.txt file keeps Googlebot out of where it doesn’t belong. Luckily, Googlebot will continue its excellent support of robots.txt directives when it goes crawling through forms. Be sure not to accidentally restrict your website too much, however. Keep the directives simple, excluding by directory if possible. And test, test, test in Google’s Webmaster Tools!

4. Use robots metatag directives

Use the robots metatag directives for more refined control. We recommend “nofollow” and “noindex” directives for both the form submission page and any search results pages you want Google to stay out of, even though Google says disallowing the form submission page is enough. Consider using tags and category pages that are Google friendly instead.

5. Use a CAPTCHA where possible

Googlebot isn’t going to fill out a CAPTCHA, so it’s an easy way to make sure some bot isn’t filling out your form. Googlebot is, of course, the nicest bot you can hope to have visit your website.
This provides a chance to secure forms and take necessary precautions before other – not so polite – bots visit your forms.
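Tip 3’s “test, test, test” advice can start on your own machine: Python’s standard library ships a robots.txt parser you can run against your directives before deploying them. A minimal sketch – the rules and paths are example values, not a real site:

```python
# Verify robots.txt directives keep bots out of form/search URLs
# while leaving normal pages crawlable.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /search/
Disallow: /cart/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "/products/widget"))  # True
print(rp.can_fetch("Googlebot", "/search/results"))   # False
print(rp.can_fetch("Googlebot", "/cart/add"))         # False
```

A few assertions like these in your build or deploy checks catch the classic mistake of a directive that accidentally blocks the whole site.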

Denver Mobile SEO: Goes better with Chocolate, says Yahoo

Yahoo’s “Search Assist” tool is a hoot. Search for “Denver Mobile SEO”. Go ahead, I dare you. Now, I’m thinking Yahoo knows me a little better than I’d like. Is this behavioral targeting? Profiling? Something even more sinister? Or is it just that mobile SEO always goes better when plenty of chocolate is at hand? Now look through the related queries for “Chocolate”. Go ahead – I’ll wait. It appears many of us are writing about chocolate and writing about mobile SEO in the same places. I’m going to bet more people are writing about chocolate, and I don’t blame them: Mobile SEO is the (sometimes thankless) task of making sure websites look good on all sorts of mobile devices, including handhelds, cell phones, Zunes, and the new iPod touch (which is probably a “no brainer”). Few mobile SEO simulators are available online, which means field testing. And then page tweaking. It can be a time-consuming and arduous task. It’s best accompanied with plenty of chocolate.

Web Development Roles in Internet Marketing Projects

It takes many different web development / programming skill sets for a successful internet marketing project. For any website to be successful on the web, it requires a combination of stunning web design, usability, web conversion, bulletproof web development, search engine optimization, and project management. A failure at any of these points can destroy the potential of any internet marketing project. The roles each require very specialized skills:

Web Design

Web designers are popping up everywhere these days, but it is still very hard to find website designers who have stunning artistic and layout skills and just enough web knowledge to make it all work. Implementing some designs on the web can be impossible. It’s important to have a web designer who understands the limits and potential of each web technology. Web designers must also know enough CSS (Cascading Style Sheets) to create web-friendly designs that will look great in any web browser.

Usability

Usability is very important to any website. Web site visitors must be able to understand and navigate the site. Most usability professionals are not great designers, but have a knack for understanding human behavior and expectations on web sites. Having a site that is highly usable encourages repeat visits – or “stickiness”.

Conversion

Web site conversion is a very important consideration: How do YOU want visitors to use the site? Web site visitors should be eased and encouraged toward a “desired action” on your website. The action might be to purchase a product, send an email, sign up for a newsletter, or even pick up the phone. Having a great website is still pointless if it does not drive sales, lead capture, or some other desired action.

Web Development

Web developers are programmers. They create programs that allow interaction with human visitors, like shopping carts, RSS feeds, image uploading and more. Web development requires a tremendous skill set that is always in need of expansion and updating.
Web development languages like PHP, Perl, Flash ActionScript, and the many Java technologies require constant upkeep and training as they develop.

Search Engine Optimization

Search engine optimization is a set of guidelines, technologies and procedures for ranking well in search engines. The first step is determining which keywords can drive quality traffic to the website. What are prospective visitors searching for? Search engine optimization (SEO) specialists research keywords and optimize the pages to show visiting search engines how relevant the site is. Denver SEO specialists are skilled at showing the natural relevance of pages and securing better search rankings. Since many search engines also weigh the amount and quality of links to a website, SEO firms will often create and request links from other websites.

Social Media Optimization

With the creation of social media websites like MySpace, Digg, Facebook and Friendster, websites have an opportunity to capture amazing amounts of targeted web visitors. With millions of searches starting on MySpace, it has become an important opportunity for certain niches. There is a social media website for nearly every niche, however. Finding the correct niche full of prospective buyers can drive tremendous amounts of sales.

Project Management

Project management allows all of the other skill sets to shine. By communicating between clients and the other roles, the project manager helps balance the many roles in the project with the client’s needs. They also serve as the point of contact for the many questions and deadlines involved in the project.

In sum, any great web development project requires a diverse skill set. A balance between the roles is equally important, never sacrificing usability for design, or design for search engine optimization.

4 essential questions when planning a web design

Successful web development projects require a tremendous amount of planning, and planning starts with asking the right questions. Any web design benefits from extra planning, but 4 questions should define the entire project from the start:

1. Who is my target audience?

Too many websites try to be all things to all people. Instead, think of your most important visitors and design according to their tastes. They may or may not appreciate animation. They may be on dialup connections, or they may be visiting the site via a cell phone. Knowing your website’s target audience is vital to the project, even before a web site design has been created.

2. What do I want them to do?

If the purpose of your website is to get prospective customers to call, be sure your phone number is prominently displayed. A link to the “Contact Us” page should also be prominently displayed. Other websites may want to capture email addresses or newsletter signups. Ecommerce websites want to make a sale. Whatever the objective, make it as easy as possible for your customers.

3. How will they get to my site?

With competition among websites growing daily, it’s important to plan how you will increase the visibility of your website. Will you blog? Or participate in forums? You might even use pay-per-click advertising on Google AdWords. There are many ways to bring targeted visitors to your website, but they won’t come just because you’ve launched a new website design. Plan ahead, and watch your website bring you new business!

4. How can I measure the project’s success?

Many smaller website owners do not measure their web site metrics or statistics. Without an idea of traffic patterns and popular keywords, it is difficult to tell if a new web site design is effective. Are web site visitors converting to leads? Is the web site generating sales? Only by measuring can you know for sure.

Search Marketing Standard: Read it twice

I’m still getting two copies of Search Marketing Standard magazine, but I’m not reporting it. First off, it’s so good that I don’t want to possibly miss an issue by having anyone mess with my subscription. With other magazines, I’ve found that fulfillment centers sometimes get confused, and it’s usually months before I realize a certain issue isn’t coming. I just can’t risk it. Every article is good. Secondly, I’ll probably read through it twice. Might as well have a fresh, crisp copy the second time. I wonder if I’ll even dog-ear the same pages?

Here are four excellent resources for anyone interested in SEO, internet marketing, ecommerce, and the affiliate scene:

1. Search Marketing Standard. If you’ve thought the SEO world moves too fast for print, think again.

2. Practical Ecommerce. Not just for ecommerce store owners. Every web developer creating ecommerce websites should be in tune with the industry.

3. Revenue. Great for affiliate marketers, ecommerce merchants, or any company creating PPC (pay-per-click) campaigns on Google AdWords or Yahoo Search Marketing.

4. Internet Retailer. Especially important if you are helping larger companies with their SEO, SEM, PPC, and ROI! This publication is best at industry trends influencing larger retailers and online merchants.

It is essential that web designers and web site developers start paying attention to the many facets that can make or break an online business. These publications can help you serve your clients!

7 Web design techniques that are thankfully being retired

1. Frames

Frames were rarely done in a search-friendly manner. In the age of cellphone browsers and Section 508 compliance, frames must go.

2. IE 5 Mac hacks

Internet Explorer was a miserable little browser on every OS it ran on, but was particularly miserable on the Mac. It required CSS hacks that other browsers tripped over. Some standards it – inexplicably – did not support. Even on Mac OS X, it sucked.

3. Splash pages

These pieces of eye candy were frequently skipped by visitors, and even more frequently cursed under visitors’ breath. Known to be slow-loading and pointless, it is nice to see them used less often.

4. Microsoft FrontPage Extensions

These buggy little replacements for scripting would break if you looked at them funny, and gave years of frustration to Unix admins. Even Microsoft is turning its back on the FrontPage product, and not a day too soon.

5. Popup and popunder windows

There are still sites that tout the effectiveness of popups and popunders, but let’s face it: We all hate them. Every good browser tries to block them, but every once in a while you’ll see one. They are the junk mail of web browsing, and it’s time for them to go far, far away.

6. Animated layers that block content on page load

There are few things as annoying as a layer that suddenly slides over to block content you are reading. They usually make users dismiss the ad to read page content. I’ve gotten so that I dismiss anything that slides over, not even taking the time to read the ad.

The web will be a better place when these web design techniques are no longer seen. Have others? Add a comment and let us know!