4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measure of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay the same.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and now places up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive only 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. When direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects: We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back (see the sketch at the end of this post).

– Certain browsers block information: There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared, or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
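A note on the redirect problem above: browsers generally do not send the Referer header when a visitor moves from a secure (HTTPS) page to a plain HTTP page, so an organic visit that ends up on a non-secure URL can be reported as direct. Here is a minimal sketch of such a chain, assuming an Apache server; the domain and paths are made up for illustration.

    # Hypothetical .htaccess on www.example.com (paths are made up)
    # An old campaign URL gets redirected...
    Redirect 301 /promo https://www.example.com/spring-sale
    # ...and the second hop lands on a plain HTTP URL.
    Redirect 301 /spring-sale http://www.example.com/sale.html
    # The visitor arrived from an HTTPS page (Google's results page),
    # so the browser drops the Referer header on the HTTP destination
    # and Google Analytics records the visit as Direct, not Organic.

If your direct traffic jumps while organic falls, auditing redirect chains like this one is a good first step.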

Speed is Everything

Page loading speed carries great importance with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient mobile users are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer" – 30% of pages could save more than 250KB that way.

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least they finally became explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a great signal that this needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as loading time grows from 1 second to 7 seconds, the chance of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity. But if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites can often need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server (see the sketch at the end of this post). Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: Test how mobile-friendly your site is.
– GTmetrix.com features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
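As an example of the "easy" compression fix mentioned above, here is a minimal sketch for an Apache server, assuming the mod_deflate module is available; the list of content types is just a starting point.

    # Minimal .htaccess sketch: gzip-compress text assets
    # (assumes Apache with mod_deflate enabled)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css text/plain
      AddOutputFilterByType DEFLATE application/javascript application/json
      AddOutputFilterByType DEFLATE image/svg+xml
    </IfModule>

Once in place, you can confirm it is working by checking responses for a Content-Encoding: gzip header in your browser's developer tools or in GTmetrix.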

Denver SEO / Colorado SEMPO communities flourishing

The Denver SEO Meetup and the Colorado working group of SEMPO have seen tremendous growth in the last year. In the ever-developing world of search marketing, the meetups have become excellent resources for search marketing professionals looking to network, as well as for the professional development opportunities provided by SEMPO's excellent speakers.

Last week, our president Jim Kreinbrink spoke about "Driving traffic to your blog with SEO techniques". It was a technical presentation that gave away many great tidbits. The audience was full of experienced search marketers, and we hoped to show the value of collaboration and community. The previous month, two excellent PPC case studies were presented by Alex Porter from Location 3 Media. Seeing the approaches Location 3 took for two PPC campaigns, and the results attained, was very exciting.

Search marketing is growing even in a recession, so expect a packed house. The focus on measurable, trackable results makes it particularly appealing to agencies and advertisers alike. All this means that the Denver search marketing community will continue to grow and flourish.

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We've pinpointed 13 reasons why Google may give – or appear to give – sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.

1. Pinging

Most blog software sends out a "ping" when there is a new post. Instead of waiting for a search engine crawler to come across your site's new content – either via a routine crawl or via a link – a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content.

2. RSS feeds provide deep links to content

RSS feeds are useful for so many, many things. They contain links to your latest postings, but also consider that they contain links right to the postings themselves. Even crawlers that aren't that smart (you know who you are, little bots!) can figure out how to find a link in a list. That's essentially all an RSS feed is: a list of links in a predictable format. Hint: You subscribed to your feed in iGoogle, didn't you?

3. A standard sitemap.xml provides deep links to content

If an RSS feed isn't enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site. (A minimal example appears at the end of this post.)

4. Based on modern HTML design standards

Most blogging software was created or updated very recently, and doesn't use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.

Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.

5. Fresh content, updated often

Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from humans and robots alike!

6. Fresh comments, updated often

Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.

7. Keyword-rich categories, tags, and URLs

Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren't using keyword-rich categories and tags, you really should be.

Popularity: Google looks at what other sites link to your site, how important they are, and what anchor text is used.

8. RSS feeds provide syndication

RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.

9. Extra links from blog and RSS feed directories

The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.

10. Linking between bloggers / related sites

Blogrolls are lists of links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can be attained through blogrolls.

11. Social bookmarking technologies built in

Blog posts are usually created with links directly to social bookmarking services like delicious.com, StumbleUpon, and other bookmarking sites. You've never made it easier for your audience to share your posting and give you a link!

12. Tagging / categories with relevant words

Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.

13. Trackbacks (conversations)

Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!
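For reason 3, here is a minimal sitemap.xml sketch showing the extra hints a sitemap can carry; the URL, date, and values are made up for illustration.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Minimal sitemap.xml sketch; URL and values are hypothetical -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/blog/latest-post/</loc>
        <lastmod>2008-06-01</lastmod>
        <changefreq>daily</changefreq>   <!-- how often robots should revisit -->
        <priority>0.8</priority>         <!-- importance relative to your other pages -->
      </url>
    </urlset>

Most blog software will generate and update a file like this automatically, which is exactly why search engine robots have such an easy time with blogs.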

Why Flash is still a problem in 2009

Flash is less of a problem for search engines than it used to be, but there are still caveats. Flash's problems can be easily mitigated by offering footer links and regular HTML text content on any pages with Flash. It's only an issue when no alternative content or navigation is offered (see the sketch at the end of this post).

Here's the longer story. Flash's problems depend on the implementation:

– If developers do not implement Flash detection, pages can appear broken to visitors. They leave the site and/or do not convert to prospects/leads/sales.
– If Flash detection is done poorly, it can be seen by search engines as cloaking – returning different content to search engines than to visitors. This is rare, but possible.
– If Flash is the sole navigation for search engines and human visitors to follow, search engines cannot spider the site. This is the kiss of death you've probably heard about.

Some claim it isn't a problem any more because:

– Adobe has implemented better accessibility in the last few versions. But these links are still hard to follow and rarely rank well in the engines. MSN/Live has enough problems with HTML links, and probably will not find the content. Also, the page where visitors would land sometimes doesn't show properly – it could be part of a Flash animation that doesn't load, etc.
– Google made a deal with Adobe that allows Flash to be crawled more easily. But again, these links are still hard to follow and rarely rank well in the engines. Google seems to be looking more for hidden redirects and other black hat techniques with their Adobe API deal.

So what can you do to make sure your content is accessible to search engines and seen as a valuable landing page for organic search visitors? Nothing beats good old-fashioned HTML: links that can be followed, and relevant keywords marking the content from its anchor text and title tags down to its keyword density.
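One common way to provide the alternative content mentioned above is to place regular HTML inside the element that embeds the Flash movie: browsers with the plugin show the movie, while search engine robots (and visitors without Flash) see the text and links instead. A minimal sketch, with a made-up movie file and links:

    <!-- Hypothetical Flash embed with crawlable HTML fallback -->
    <object type="application/x-shockwave-flash" data="intro.swf" width="600" height="400">
      <param name="movie" value="intro.swf" />
      <!-- Alternative content: plain text and followable links -->
      <h2>Welcome to Example Widgets</h2>
      <p>Browse our <a href="/catalog/">product catalog</a> or
         read <a href="/about/">about our company</a>.</p>
    </object>

Pair this with regular HTML footer navigation and the "sole navigation in Flash" problem largely disappears.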

Colorado Search Marketing Training

Hyper Dog Media is presenting a day-long search marketing presentation in Las Animas, Colorado on February 6, 2009. Three sessions will cover the basics of Search Engine Optimization, Pay Per Click advertising, and a revolutionary "Solutions Clinic" – providing quick fixes to attendees' websites in real time.

The first session, Search Engine Optimization, addresses increasing website rankings in Google, Yahoo, and more. SEO is all about helping the search engines see and understand the content of your website. Search engines want to be successful in directing visitors to quality destinations, and SEO should be focused on connecting with the right visitors.

The second session focuses on targeting potential customers with PPC (Pay Per Click) and other advertising. It's possible to waste enormous amounts of money on Pay Per Click advertising networks like Google AdWords. This session will show how to make your limited budget work most efficiently for your business.

The third session builds on the first two. The Solutions Clinic is for businesses that already have a website and want real-time evaluation and solutions for their site. Bring your hosting information, and we might just be able to fix it on the spot!

The training is sponsored by Southeast Business Retention, Expansion, and Attraction. For more information or to register, call the SEBREA office at 719-336-1523.

Denver SEMPO: InHouse vs. Agency – Search Engine Marketing Insights Panel

The Denver SEMPO Meetup is hosting an excellent panel discussion this month: In-House Search Marketing vs. Search Marketing Agencies. For all of you interested in SEO / SEM, this program will have some valuable information and experiences shared. The panelists are among some of the best SEOs from both sides of the aisle. As a top Denver SEO agency, Hyper Dog Media is also a sponsor of the program.

It's going to be at the Tivoli Center on the Auraria campus. You can see details below and on our Denver SEMPO Meetup page. There is a charge of $25 for the program. It will be a very informative meeting. We'd love to see you there.

Date: October 23, 5:30-7:30
Go to the Denver SEMPO Meetup page: Denver SEMPO Meetup Group

InHouse vs. Agency – Search Engine Marketing Insights Panel

> Is there a difference between an internet marketing campaign created by an In-House Marketer vs. an Agency Marketer?
> Are the challenges different?
> Which is more likely to be successful?

Learn the perspectives from both sides of the fence! Instead of the normal Denver SEMPO Meetup, we are going to have a panel discussion concerning the difference between in-house search marketers and those from agencies. Your paid RSVP gives you access to an evening of great networking opportunities with like-minded SEMers, light refreshments, and the chance to "pick the brains" of some of the top people in our profession.

The following search marketing professionals will be taking questions from attendees and sharing their professional knowledge and experience in establishing, growing, and maintaining their search marketing campaigns:

In-House Search Engine Marketers:
* Everett Sizemore – Gaiam
* Jim Brown – Quark (SEMPO)
* Joe Gira – Regis University

Agency Search Engine Marketers:
* Steve Riegel – Faction Media Digital Marketing Agency (SEMPO)
* Jason Lehman – Hyper Dog Media (SEMPO)
* Nicholas Yorchak – Lee Ready (SEMPO)

The evening is certain to be worth your while. Save the date and spread the word.

To register: Denver SEMPO Panel Discussion Registration

4 Places to find keywords for your SEO / PPC campaigns

What is an SEO or PPC campaign without the right keywords? Great keyword targets have a good amount of traffic and, hopefully, a small amount of competition. Before you can even start measuring such things, however, you must create a broad list of keywords. Here's where to start:

1. Keyword research / suggestion services

Services like WordTracker, KeywordDiscovery, and even Google Suggest can give a great idea of the traffic surrounding certain keywords, as well as the variations of keywords a site should target.

2. Analytics / statistics

If you currently have analytics or web visitor statistics on your website, it is very helpful to look at how existing customers have found your site. If you haven't loaded Google Analytics, it is quite easy – and free!

3. Brainstorming / asking customers

Great keywords can also be found just by interviewing current customers with "How did you find us?" Even a quick glance at your business plan can lead you to a few new ideas on how prospective customers might find you.

4. Competitors

Competitor websites can be a treasure trove of keywords. Scan their source code for a keywords metatag, if present (see the sketch at the end of this post). Also look at the keywords in their page titles by searching Google for: site:competitor.com

These four methods should lead you to plenty of keywords for your next campaign.
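If you haven't dug through page source before, the keywords metatag (when a competitor still uses one) sits in the page's head along with the title. A minimal sketch with made-up values:

    <!-- In the competitor page's source; the content values are hypothetical -->
    <head>
      <title>Gourmet Chocolate Gifts | Example Chocolatier</title>
      <meta name="keywords" content="gourmet chocolate, chocolate gifts, truffles" />
    </head>

Both the title and the metatag (if present) can seed your own keyword list.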

9 ways Google is discovering the invisible web

There are many parts of the web that Googlebot has not been able to access, but Google has been working to shrink that gap. Google wants to find content, and while many webmasters do not make it easy, Googlebot finds a way.

1. Crawling Flash!

Adobe announced today that they have released technology and information to Google and Yahoo enabling them to crawl Flash files. It may take the search engines some time before they are able to integrate and implement these abilities, but a time is coming when rich media is less of a liability. I wonder if MSN/Live was left out to prevent them from reverse engineering Flash for their new Silverlight competitor? At any rate, MSN is still working on accessing text links, so let's not swamp them.

2. Crawling forms

Googlebot recently started filling out forms on the web in an attempt to discover content hidden behind jump menus and other forms. See our previous article if you'd like to keep Google out of your forms (and the robots.txt sketch at the end of this post).

3. Working with government entities to make information more accessible

A year or so ago, Google started providing training to government agencies to assist them in getting their information onto the web. I'm assuming much of that information has been hidden behind URLs with large numbers of parameters.

4. Crawling JavaScript

Many menus and other dynamic navigation features have been created in JavaScript, and Googlebot has started crawling those as well. Instead of relying on webmasters to provide search-friendly navigation, Google is finally getting access to sites created by neophyte webmasters who haven't been paying attention.

5. Google's patent to read text in images

Google also knows many newbie webmasters use text buttons for navigation. By attempting to read text in images, Googlebot will once again be able to open up previously inaccessible areas of a site.

6. Inbound links

Of course, Googlebot has always been great at following inbound links to new content. Much of the invisible web has been discovered just through humans linking to a previously unknown resource.

7. Submission

Of course, you can always submit a page location of currently invisible content to Google. This is usually the slowest way, especially compared to inbound links.

8. Google Toolbar visits and analytics

Recently, many Denver SEO professionals have noticed links being indexed that have not been submitted. The only plausible explanation is that Google has been mining its toolbar and analytics data for information about new URLs. Be careful – Google is watching and sees all!

9. Sitemap.xml files

The somewhat new sitemap.xml protocol is very helpful for webmasters and Googlebots alike in getting formerly invisible content into Google's hands.
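On point 2: the usual way to keep crawlers out of form handlers and their result pages is a Disallow rule in robots.txt. A minimal sketch; the paths are made up and would need to match your own form URLs.

    # robots.txt at the site root (paths are hypothetical)
    User-agent: *
    Disallow: /search-results/
    Disallow: /cgi-bin/form-handler

Compliant crawlers like Googlebot will skip the disallowed paths, so content behind those forms stays out of the index.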

6 PPC Secrets from a $100k campaign

There was an excellent story on the San Francisco Gate in May about Lake Champlain Chocolates and the lessons they've learned with Pay Per Click advertising. The story title is "PAY-PER-CLICK PROBLEMS: Emeryville gourmet chocolate company has a rough go of it", but the real value of the article is the PPC secrets it gives away. The article discusses two chocolate retailers: Lake Champlain Chocolates and Charles Chocolates. Lake Champlain Chocolates has experienced successful growth due to their PPC campaign, but Charles Chocolates didn't see any measurable growth from theirs.

1. Use negative keywords

In the article, words like "cheap" and "free" were used as negative keywords to avoid showing ads to less affluent searchers. Every time you show an ad it's like holding out a dollar bill for your searcher to snatch away. Be sure to get a prospective customer in return!

2. Refrain from using the content network

Google AdWords users expect that the content network will show ads in all the right places. In a perfect world, new customers would see your ad and keep you in mind for their next purchase. But it isn't a perfect world (don't even get me started!). Consider:

– Visitors probably will not click. Content ads are like billboard ads. How often do you see a billboard and pull off the highway to make an immediate purchase? It's highly unlikely. As the company in the article learned, "The return was never there."

– Visitors who click your ad won't buy that day. They were reading, not shopping. At best, they will sign up for your newsletter or bookmark your page. Is the landing page converting them into bookmarking or signing up? Probably not. Either fix that, or turn off the content network for now.

– Click fraudsters will click your ad and keep half. Click fraud is a plague of the content network. Last June, Outsell estimated that click fraud could be as high as 14 percent. The real figure is probably a little lower, but click fraud does exist.

3. Use large sets of focused keywords

The successful Lake Champlain Chocolates had a keyword set as high as 70k at one time, and now has it trimmed down to 30k. That's a big keyword set!

4. Use advanced keyword features

One of the issues internet marketing consultant Lael Sturm found with the struggling Emeryville chocolate retailer Charles Chocolates was that they "hadn't modified the ad text to match each specific keyword." Be sure to use the advanced keyword options that PPC engines like Google AdWords provide. In Google AdWords, the keyword insertion code is written like {KeyWord:your keyword}. This option shows the keyword that triggered your ad in the text of your ad (see the example at the end of this post).

5. Measure and adjust

Is money being wasted in your campaign? You won't know unless you are measuring. Lake Champlain saw they spent money attracting a searcher for "chocolate covered scorpions," something they didn't sell, and decided not to let that happen again. Along with measuring which ads are the most effective, be sure to measure what you are paying for and remove or adjust the ads lacking good ROI.

6. Outsource your campaign to professionals to dramatically increase your sales

Even with Lake Champlain Chocolates' in-house success, they were able to DOUBLE their sales by outsourcing their PPC management to professionals. You just can't beat having the right help. Get to your friendly neighborhood search marketing agency today!
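A quick sketch of the keyword insertion tag from point 4. In AdWords the tag takes the form {KeyWord:default text}: the text after the colon is what shows if the matched keyword is too long to fit, and the capitalization of "KeyWord" controls how the inserted keyword is capitalized. The ad line and keyword below are made up for illustration.

    Headline as written in AdWords:   Buy {KeyWord:Gourmet Chocolate} Online
    Shown for keyword "dark chocolate truffles":   Buy Dark Chocolate Truffles Online
    Shown when the keyword is too long to fit:     Buy Gourmet Chocolate Online

Paired with tightly themed ad groups, this is an easy way to make ad text match each specific keyword without writing thousands of ads by hand.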