Summary of Search, September 2013

Google announced that it rewrote pretty much its entire algorithm last month in that "unnamed update." It's the biggest change since 2001: roughly 70 percent of search engine results pages (SERPs) were affected. Compare that to Penguin, which affected something like 3 percent of SERPs. The new Google algorithm is code-named "Hummingbird."

Many of the basics are the same:
1. Content should be accessible and easy for search engines to navigate.
2. Keywords should be properly tagged, with a special boost for sites using:
   a. Semantic markup
   b. Rich snippets
   c. Google authorship
3. Authoritative links

According to one expert, "quick SEO" is firmly in the past. We couldn't agree more: Google has been strongly advocating this direction for some time, and the Panda/Penguin updates began steering the industry more than two years ago. Panda and Penguin aren't going away: they are parts of the new algorithm and are likely to get additional updates in the future.

Across our clients, we saw very little change. Certain keywords moved slightly up or down on August 20, but not by much. If you follow Google's rules, you don't get hit.

WHAT'S NEW IN HUMMINGBIRD?

1. Mobile/voice/location queries
Google expanded its ability to deal with mobile, voice, and location-based queries like "What's the closest place to buy the iPhone 5s?" It also shows more comparisons via the Knowledge Graph for queries like "space needle versus empire state building."

2. "Entity search"
In the keyword-based queries of yesteryear (and even "yestermonth"), Google sometimes couldn't interpret queries like "windows replacement" versus "windows 7 replacement": is a PC user or a homeowner asking? Google now uses a database of facts about specific, unique entities (people, places, businesses, events, etc.) to figure out how to return the best results. Think about the broad keywords you are targeting, and consider how you can "talk around" these topics.

3. Hashtag search
The only posts that show up in Google searches are those that were shared publicly, or shared with you (if you're a Google+ user). Clicking on one of the Google+ posts leads you to Google+, where the search is reproduced. There are also links at the bottom of the sidebar to perform the hashtag search over at Twitter or Facebook, but these are pushed below the fold in less than two seconds as new Google+ posts fill the sidebar.

MOVING FORWARD:

1. Create content around your "entities"
Engaging, shareable, linkable content is now more important than ever. Do you have every kind of content about your subject? Consider videos, images, lists, podcasts, infographics, and articles about the entities you want to be found for. These are likely your broad keywords, but don't go too broad.

2. Tag content with semantic markup & rich snippets
Google is smart, indeed. But semantic markup and rich snippets make it easy for Google to understand the correct answers to its users' questions. Rich snippets have grown in importance and are now a "must have" for search engine visibility. While Google is still working out the kinks in authorship for brands, it's becoming increasingly important that authorship be implemented. (A brief structured-data sketch appears at the end of this summary.)

3. Content marketing, link building & social media marketing
Having great content was never enough, and it still isn't. There are more ways than ever to get the word out, and some will even help you win authority links.

Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE!
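To make the semantic markup point under "Moving Forward" more concrete, here is a minimal sketch of schema.org structured data expressed as JSON-LD and generated with Python. The business details, property choices, and the decision to use JSON-LD (rather than microdata or RDFa) are illustrative assumptions, not a Google requirement.

```python
import json

# Minimal, hypothetical example of schema.org structured data for a local
# business. The property names come from the public schema.org vocabulary;
# the business details are placeholders.
local_business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Hardware Store",
    "telephone": "+1-303-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "80202",
    },
}

# Wrap the JSON-LD in a script tag so it can be dropped into a page template.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    local_business, indent=2
)
print(snippet)
```

Running the generated markup through a structured data testing tool before publishing is a sensible extra step.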

Summary of Search, May 2013

Around May 22nd, there was an update to Google's search algorithms. It's being called Penguin 2.0 (or sometimes Penguin 4), and it is a major update. Matt Cutts said in a recent video that, compared to the original Penguin update, this one goes much deeper. While the impact is supposed to be 2.3% of English queries, the effect is very large considering the number of Google keyword searches. Here is the full history:

Penguin 1 on April 24, 2012 (impacting ~3.1% of queries)
Penguin 2 on May 26, 2012 (impacting less than 0.1%)
Penguin 3 on October 5, 2012 (impacting ~0.3% of queries)
Penguin 4 on May 22, 2013 (impacting 2.3% of queries)

Much of the analysis of Penguin 2.0 is still in progress, but some big brands were hit, including SalvationArmy.org and even Dish.com. As far as we can tell so far, Penguin 2.0 penalized:
1. Exact-match anchor text (see the sketch at the end of this summary)
2. Spammy links to subpages
3. Link networks and schemes
4. Links from de-indexed and banned websites, including old directories
5. Link velocity "spikes"

Penguin is also impacting sites with unintentional webspam. We've seen scraper sites (targeting AdSense keywords) delivering the worst links to clients' profiles. These sites weren't created for a link building campaign, just AdSense revenue for some site owner in a distant land. While they could be ignored before, they cannot be any longer: now their penalties are our penalties.

The approach we recommend is:

1. Protect
Authority link building is the only protection against both negative SEO and Penguin penalties in general. Authority links are gained primarily from great content, promotion, and involvement. One authority link can beat hundreds of spammy links in the algorithm of "the new Google."

2. Defend
Find and remove as many unnatural links as you can manually before disavowing the rest.

3. Build
Over the long term, these strategies will also help protect against Google penalties, and they are of course great marketing initiatives:

a. Great content
Copywriting has gone through an evolution, and cheap content is not going to cut it. Could it ever, though?

b. Promotion and outreach for social media marketing and inbound links
Since the web's inception, much content has been posted with little regard to promotion. Social, link building, and other outreach initiatives are vital to maximize dollars spent on premium content.

c. Brand name searches
Google knows big brands are searched for. Their "buzz" is a signal of authority, although not yet on par with link building.

d. User engagement
Once visitors are on your site, engage them. Keep their interest and involvement. Good design and excellent content have never been so important, and Google has been watching this for some time.

e. Multi-tiered approaches
Spread marketing dollars broadly across many initiatives. It creates a variety of signals to Google that you are legitimate.
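As a rough illustration of the exact-match anchor text issue listed above, here is a hedged Python sketch that estimates how much of a backlink profile uses a handful of exact-match anchors. The CSV layout, the "anchor_text" column name, and the target anchors are hypothetical placeholders; Google has published no specific threshold, so the output is only a starting point for review.

```python
import csv
from collections import Counter

# Hypothetical target anchors; replace with the phrases you actually rank for.
EXACT_MATCH_ANCHORS = {"denver seo", "denver seo company"}

def anchor_distribution(path):
    """Count anchor texts in a backlink export and estimate the exact-match share."""
    counts = Counter()
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor_text"].strip().lower()
            counts[anchor] += 1
            total += 1
    exact = sum(counts[a] for a in EXACT_MATCH_ANCHORS)
    return counts.most_common(10), (exact / total if total else 0.0)

if __name__ == "__main__":
    top_anchors, exact_share = anchor_distribution("backlinks.csv")
    print("Top anchors:", top_anchors)
    print("Share of exact-match anchors: %.1f%%" % (exact_share * 100))
```

Any links flagged this way would still need a manual review before removal or disavow decisions.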

Summary of Search December 2012

In 2011, Google said they wouldn't make large changes around the holidays. This year, all bets are off: Google released Panda update 23 on December 21, impacting 1.3% of English queries. Another change Google made on December 13 attracted quite a bit of attention. Google would not confirm there was an update, but the most logical assessment may be SEOmoz's idea that it was a "PMD update" affecting domains that partially use their targeted keywords.

Last month, we summarized three kinds of content that Google Panda updates penalize: scraped content, thin content that isn't unique, and other forms of duplicate content.

Befriending Google Panda

1. Improve the site's text content
Remove all lower-quality content. Invest in good copywriting, written for prospects instead of search engines. Copy should connect with the right audience, solving their problems and informing them. Keywords should be used naturally and prominently, but not according to any specific density formula. If you need help finding a copywriter, let us know: we recommend Laurie Macomber at Blue Skies Marketing.

2. Fix the site
Broken links, grammatical errors, misspellings, and other problems with the site must be fixed.

3. Enhance the "richness" of the site
Use internal links to communicate keyword relevance. Videos, images, animations, and other forms of media can also communicate relevance.

Google's Penguin update

SERoundtable recently conducted a Penguin recovery poll, "94% Of Google Penguin Victims Did Not Fully Recover" (http://www.seroundtable.com/google-penguin-poll-16162.html). Based on over 500 responses, 81% said they had no recovery, 13% claimed a partial recovery, and only 6% claimed a full recovery.

Disavowing links
1. Disavowing links may be seen as a confession. Consider carefully before using the tool.
2. If your site only has spammy links, do not disavow them: you will have no links left.
(A sketch of the disavow file format appears at the end of this summary.)

Bing's Snapshot is much like Google's Knowledge Graph. Google's weakness appears to be freshness of data, but only time will tell which service has the most accurate, up-to-date answers at the top of SERPs.
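For reference on the disavow caution above, here is a small Python sketch that writes a disavow file in the plain-text format the tool accepts: "#" comment lines, individual URLs, and "domain:" entries. The domains and URLs below are placeholders, and the decision to disavow at all should come only after the manual review described above.

```python
# Placeholder examples of links you have already tried, and failed, to remove.
spammy_domains = ["example-scraper-site.com", "spammy-directory.example.net"]
spammy_urls = ["http://example-blog.example.org/comment-spam-page.html"]

# Build the file line by line: a comment, then domain-level entries, then URLs.
lines = ["# Disavow request prepared after manual link-removal outreach failed"]
lines += ["domain:%s" % d for d in spammy_domains]
lines += spammy_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```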

The month in Search

There haven't been any Penguin updates this last month, but Google Panda 3.9.1 arrived on August 20, 2012. We didn't see any impact to most client rankings. The Penguin v1.2 update is still expected to happen any day now, and (Google spokesperson) Matt Cutts says to expect a bumpy ride. The early revisions of Panda were wild and somewhat "woolly."

Is page 1 the top 7 now?!
Around mid-month, Google started showing only 7 results, and from fewer sites, for a good chunk of queries (an estimated 18%). Page 1 now means "top 7" for many searches. The percentage of users clicking through from positions 8-10 has been negligible in most studies, but this is a major change in how results are displayed and another clear departure from the 10 blue links of yesteryear.

Change is the rule
Rankings are more volatile than ever. One SEO shared: "Something like 80% of the Top 10 SERPs we measure change every night, to some degree." On August 10, Google posted 86 changes they made in June and July. Many were small, but the ones of interest to us involve the boosting of "trusted sites" (usually meaning large brands) as well as changes to sitelinks. The new clustering and boosting of trusted sites often creates monopolies for larger brands. Google used to show at most 2-3 links from the same website; now it is possible for larger brands to dominate the top 7 or 8 results.

"Transition Rank" patent application
Google has a new patent application regarding "transition rank." It's aimed at punishing black hat SEO techniques through random ranking changes: "Some of the techniques used by rank-modifying spammers include keyword stuffing, invisible text, tiny text, page redirects, META tags stuffing, and link-based manipulation." Many SEOs speculate this has been part of the algorithm for some time.

Changes last month in the world of Organic Search

There weren't any Penguin updates this last month either, but Google Panda 3.9 arrived on July 24, 2012. We didn't see any impact to client rankings. Still, Google Panda updates should be a constant reminder: have you added to your site lately? Have you added something of real value to your visitors, something that will interest them and that they will "Like" (or +1)?

The Penguin v1.2 update is expected to happen any day now. With Google Penguin, websites are more vulnerable than ever to competitors practicing "negative SEO." Since the Penguin update actually penalizes websites for links that may not have been created by them, or for them, it is a real change for the SEO industry. Some SEO companies are offering "link pruning" services, but the work is quite time-consuming, and the webmasters of these bad websites are bordering on extortion, asking for compensation to remove links. Bing, for its part, has created a tool to disavow bad links. Google claims to be working on a similar feature in Google Webmaster Tools, but there is no news yet on when it will be ready. Some expect the tool's release to coincide with the next Penguin update.

Google sent out 20,000 "unnatural link" warnings last month, but then created some confusion by telling webmasters to ignore them. Google's Matt Cutts explains: "Fundamentally, it means we're distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it." The link building techniques he identified are:

1. "Widgetbait"
This is where sites distribute a badge or other graphic with a link back to their website. Some web stats sites send these out, and Google has noticed.

2. "Paid links"
Google wants to be the only site selling links, I think. Or maybe they just want to make sure that advertising-related links do not help rankings.

3. "Blog spam"
Blog entries and comments that are spammy detract from the web.

4. "Guestbook spam"
Guestbook and forum postings that have nothing to do with the conversation are certainly annoying, and Google does not want to encourage them with its algorithm.

5. "Excessive article directory submissions"
We do not submit to article sites. Many SEO firms have been submitting "spun" articles that resemble gibberish. Google does not see this as a good thing for the web, and it is also seeking diversity of link types.

6. "Excessive link exchanges"
Google knows webmasters are likely to exchange links where it makes sense, but does not want to see this on a mass scale.

7. "Other types of linkspam"
There are always going to be new types of linkspam, every time there is a new type of website!

Google+
Google is also rewarding sites using its Google+ social network. If you haven't created a profile and/or switched over your Google Local/Maps profile, this is a good time to get it rolling. Need help? Let us know: we'll steer you to the right partner or help you ourselves.

Denver SEO / Colorado SEMPO communities flourishing

The Denver SEO Meetup and the Colorado working group of SEMPO have seen tremendous growth in the last year. In the ever-developing world of search marketing, the meetups have become excellent resources for search marketing professionals looking to network, as well as for the professional development opportunities provided by SEMPO's excellent speakers. Last week, our president Jim Kreinbrink spoke about "Driving traffic to your blog with SEO techniques." It was a technical presentation that gave away many great tidbits. The audience was full of experienced search marketers, and we hoped to show the value of collaboration and community. The previous month, two excellent PPC case studies were presented by Alex Porter from Location 3 Media. Seeing the approaches Location 3 took for two PPC campaigns, and the results attained, was very exciting. Search marketing is growing even in a recession, so expect a packed house. The focus on measurable, trackable results makes it particularly appealing to agencies and advertisers alike. All this means the Denver search marketing community will continue to grow and flourish.

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We've pinpointed 13 reasons why Google may give – or appear to give – sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.

1. Pinging
Most blog software sends out a "ping" when there is a new post. Instead of waiting for a search engine crawler to come across your site's new content – either via a routine crawl or via a link – a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content. (A short sketch of such a ping appears after this list.)

2. RSS feeds provide deep links to content
RSS feeds are useful for so many, many things. They contain links to your latest postings, and those links point right to the postings themselves. Even crawlers that aren't that smart (you know who you are, little bots!) can figure out how to find a link in a list. That's essentially all an RSS feed is: a list of links in a predictable format. Hint: you subscribed to your feed in iGoogle, didn't you?

3. A standard sitemap.xml provides deep links to content
If an RSS feed isn't enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site. (See the sitemap sketch after this list.)

4. Based on modern HTML design standards
Most blogging software was created or updated very recently, and doesn't use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.

Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.

5. Fresh content, updated often
Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from humans and robots alike!

6. Fresh comments, updated often
Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.

7. Keyword-rich categories, tags, and URLs
Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren't using keyword-rich categories and tags, you really should be.

Popular: Google looks at what other sites link to your site, how important they are, and what anchor text is used.

8. RSS feeds provide syndication
RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.

9. Extra links from blog & RSS feed directories
The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.

10. Linking between bloggers and related sites
Blogrolls are links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can be attained through blogrolls.

11. Social bookmarking technologies built in
Blog posts are usually created with links directly to social bookmarking services like delicious.com, StumbleUpon, and other social bookmarking sites. You've never made it easier for your audience to share your posting and give you a link!

12. Tagging and categories with relevant words
Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.

13. Trackbacks (conversations)
Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!
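For readers curious what the "ping" in reason 1 looks like under the hood, here is a minimal Python sketch of the weblogUpdates.ping XML-RPC call that most blog platforms fire automatically. The endpoint shown is Ping-O-Matic's commonly documented XML-RPC address; treat it, and the blog name and URL, as assumptions to confirm before relying on them.

```python
import xmlrpc.client

# weblogUpdates.ping takes the blog's name and its URL; the service then
# notifies the blog search engines it aggregates.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
response = server.weblogUpdates.ping("Example Blog", "http://www.example.com/blog/")
print(response)  # typically a struct with 'flerror' and 'message' fields
```

And for reason 3, here is a small sketch of generating one sitemap.xml entry per blog post; the URL, last-modified date, change frequency, and priority values are placeholders you would pull from your own publishing data.

```python
from xml.sax.saxutils import escape

# One tuple per post: (url, last modified date, change frequency, priority).
posts = [
    ("http://www.example.com/blog/hello-world/", "2012-08-20", "weekly", "0.8"),
]

entries = "".join(
    "<url><loc>%s</loc><lastmod>%s</lastmod>"
    "<changefreq>%s</changefreq><priority>%s</priority></url>"
    % (escape(loc), lastmod, freq, prio)
    for loc, lastmod, freq, prio in posts
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">%s</urlset>'
    % entries
)
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```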

Denver SEO Meetup – 1 Year Anniversary

It's been one whole year since our president Jim Kreinbrink founded the Denver SEO Meetup. We have now had 13 meetups, with 119 members and growing. Expectations about the number and types of SEOs we'd meet have been exceeded, as noted Denver SEO professionals from firms large and small have attended. Among our top lessons:

1. We have great synergies with attendees from related industries
Several great contributors to the Denver SEO Meetup aren't even SEOs – they are affiliate or internet marketing professionals from the Denver/Boulder area, or SEO folks looking to hire or be hired. While the group is targeted toward full-time SEO professionals, it's been a happy accident that we've also attracted so many other great members.

2. Denver web designers and webmasters attend, expecting a learning group
Several webmasters have attended or joined the group, and left disappointed when free SEO training wasn't offered. All Denver SEO experts started as beginners at some point, but the meetup is really targeted toward socializing – not educating. Unfortunately, there have been hurt feelings. We have heard the cries, and are working in conjunction with Colorado SEMPO to provide a mixture of educational programs in addition to this social event.

3. SEOs like beer, wine, and socializing, not laser tag
The Denver SEO Meetup was initially a laser tag group. Of one. It didn't take long to figure out that should change.

4. Denver SEOs are normal people. Even the "black hats." Especially the "black hats."
Denver SEOs have families, pet sites, hobbies, etc. Even the black hats. More than just search engine optimization rules their worlds. Some of the best SEO conversations have started about families, pets, travel, and things without any acronyms whatsoever.

If you are a Denver SEO firm, search marketing agency, SEO freelancer – or a curious black hat – consider this an invitation to join the group. To socialize, network, and relax a little. Hope to see you there!

Denver SEO Meetup is a success!

Less than a year after starting the Denver SEO Meetup, we are pleased to announce that it is quite successful. The Denver SEO Meetup is a great place for search engine optimization professionals throughout Colorado to network and socialize. Both freelance and agency Denver SEO folks are encouraged to attend. The environment is very friendly, even laid back. Are you a Denver SEO firm or practitioner? Come on down to our Denver SEO Meetup! If you are a web developer, web designer, webmaster, or business owner interested in learning more about SEO, we highly recommend the training program at the SEMPO Institute instead of the meetup. The courses were created by some of the industry's leading search marketing professionals, and can help you build your online business. And coming soon: professionals belonging to SEMPO will have a Colorado SEMPO group available!

SMX Local & Mobile coming to Denver, October 1st & 2nd

Search Marketing Expo (SMX) is one of the best search conferences in the USA, and this year's SMX Local & Mobile is being held right here in Denver, Colorado. The focus on local SEO and mobile SEO couldn't come at a better time: local search is very hot with small and large businesses alike, and mobile SEO is becoming ever more important as consumers race to the internet via handheld devices. What will your website look like to them? Mobile SEO will matter at least until companies deliver what consumers really want – a desktop-quality experience on their phones! Visit the SMX website to register today!