Summary of Search, August 2013

Is Google backward compatible? The previous advice from Google, given in their 2008 Starter Guide for SEO, is now "out the window." Google previously recommended that the underlined text of a link (aka "anchor text") contain keywords, but now finds that somewhat spammy. The new Google direction is all about authority link building, not keyword-focused link building.

It's nice to occasionally say: "There was only one major update this month in Google." It's an as-yet unnamed update that changed the SERPs (Search Engine Results Pages) in a way similar to Penguin 1.0. Google did, however, roll out an exciting new feature with this update: special placement in search results for "high-quality, in-depth content" that is properly tagged.

How do you take advantage of this special placement? Try this:

Tag everything to make it easy for Google to figure out (a markup sketch appears at the end of this post):
- Use schema.org "Article" markup: http://schema.org/Article
- Provide authorship markup: https://support.google.com/webmasters/answer/3280182
- Include pagination markup, if applicable (rel=next and rel=prev)
- Create a Google Plus page, linked to your website: https://support.google.com/webmasters/answer/1708844
- Provide information about your organization's logo (organization markup): http://googlewebmastercentral.blogspot.com/2013/05/using-schemaorg-markup-for-organization.html

Create compelling in-depth content (so easy, right?):
- Lengthy – Google has given no specific numbers, but we recommend text content of 1,000-3,000 words in length.
- Engaging – Google is likely looking at many metrics, including time on page, as signals of engagement.
- Popular – Popular content has inbound links, shares, likes, plus-ones, etc. It probably also has links to it from the site's homepage or other important pages on the site.

See more about the announcement at: http://insidesearch.blogspot.com/2013/08/discover-great-in-depth-articles-on.html

Google is communicating about penalties much better than in the past, too:
- They have added a feature to Webmaster Tools that will alert webmasters if a manual penalty has been levied.
- Recent interviews have revealed that disavowed links are not stored. This means that old disavowed links must be included in every new batch submitted (see the second sketch at the end of this post).
- Disavowing some links appears to be a normal part of modern SEO.
- Multiple reconsideration requests are okay, and each is considered independently of past requests.

Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE!
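To make the tagging step above more concrete, here is a minimal sketch, in Python, of schema.org Article and Organization (logo) markup expressed as JSON-LD. All values are placeholders, and the same schema.org vocabulary can also be added to existing HTML as microdata attributes; this is an illustration, not Google's official sample.

```python
# A minimal sketch (not Google's official sample) of schema.org Article plus
# Organization markup, serialized as JSON-LD. All values below are placeholders.
import json

article_markup = {
    "@context": "http://schema.org",
    "@type": "Article",
    "headline": "Example In-Depth Article Title",          # placeholder
    "datePublished": "2013-08-01",                          # placeholder
    "author": {"@type": "Person", "name": "Jane Author"},   # placeholder
    "publisher": {
        "@type": "Organization",
        "name": "Example Publisher",                        # placeholder
        "logo": {"@type": "ImageObject",
                 "url": "http://www.example.com/logo.png"}  # placeholder
    },
}

# Embed the output in the page head inside a <script type="application/ld+json">
# block, or express the same properties as microdata attributes in your templates.
print(json.dumps(article_markup, indent=2))
```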
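Because disavowed links reportedly are not stored between submissions, each new disavow file needs to include everything you still want disavowed. Below is a small sketch, with hypothetical file names, that merges an earlier disavow file with a new batch while keeping the plain-text format (one URL or domain: entry per line, # for comments) that the disavow tool expects.

```python
# Merge a previous disavow file with newly identified links/domains so the next
# upload still covers everything. File names here are hypothetical examples.
def read_entries(path):
    try:
        with open(path) as f:
            # Keep URL and domain: lines; drop blanks and # comment lines.
            return [line.strip() for line in f
                    if line.strip() and not line.startswith("#")]
    except FileNotFoundError:
        return []

previous = read_entries("disavow_previous.txt")
new_batch = read_entries("disavow_new.txt")

# De-duplicate while keeping the original order.
merged = list(dict.fromkeys(previous + new_batch))

with open("disavow_combined.txt", "w") as f:
    f.write("# Combined disavow file - includes all previously disavowed links\n")
    f.write("\n".join(merged) + "\n")
```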

Summary of Search, July 2013

Remember those tactics that worked so well? And what about the old recommendations in the webmaster guidelines? Well, it's time to take another look at all of those tactics with the new Google! Google released a "multi-week update" that continued into July, but the "Panda Recovery Update" got far more interest. Google Panda has been heavy-handed since its inception, and Google finally released a kinder, gentler version.

Duplicate Content
We see many different ways to deal with duplicate content. Based on results we have seen, we have this recommendation: use canonical tags whenever possible to deal with duplicate content. Other methods like nofollow, noindex, and robots.txt are prone to leaks or are too aggressive. Despite many Google help articles recommending that duplicate content be removed, Matt Cutts noted this month: "I wouldn't stress about this unless the content that you have duplicated is spammy or keyword stuffing."

Over-Optimization
We are seeing more penalties for on-page over-optimization since Penguin 2. The good news is, they are easily reversed (a title tag audit sketch follows at the end of this post):
- Diversify those title tags!
- Limit yourself to 2 separators, like the | (pipe) character, in the title tag.
- Do not repeat anything more than once in a title tag.
- Do not use excessively long title tags. Try to stay between 60 and 69 characters.
- Look in your code for hidden comments and for keywords strung together with dashes (URLs, image names, etc.), and consider whether the usage is excessive.

Authority Links
With Google's upcoming (and continued) emphasis on authority links, we recommend these long-term strategies:
- Link Building for Business Development: Make connections that also build your Google rankings. Think trade shows, associations, and resource pages.
- Content Marketing Link Building: Use compelling content to create brand awareness and links! Think videos, infographics, and guest blogging.

Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE!
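As a rough, illustrative companion to the over-optimization checklist above (our own heuristic, not any official Google test), here is a short Python sketch that flags title tags breaking those rules of thumb; the example titles are made up.

```python
# Rough title tag audit based on the checklist above: at most two "|" separators,
# no word repeated more than once, and nothing excessively long (aim for 60-69
# characters). This is our own illustrative heuristic, not a Google tool.
import re
from collections import Counter

def audit_title(title):
    issues = []
    if len(title) > 69:
        issues.append("too long: %d chars (aim for 60-69)" % len(title))
    if title.count("|") > 2:
        issues.append("more than two | separators")
    words = re.findall(r"[a-z0-9]+", title.lower())
    # Flag words that appear three or more times (i.e., repeated more than once).
    repeated = [w for w, n in Counter(words).items() if n > 2 and len(w) > 3]
    if repeated:
        issues.append("repeated words: " + ", ".join(repeated))
    return issues

# Hypothetical example titles.
titles = [
    "Denver Plumber | Plumber Denver | Emergency Plumber Denver CO",
    "Emergency Plumbing Repair in Denver, CO | Hyper Dog Plumbing Co.",
]
for t in titles:
    print(t, "->", audit_title(t) or "looks OK")
```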

Summary of Search Engine Optimization, June 2013

The world of SEO is still reeling from Google's latest Penguin update, and many are questioning Google's new guidelines. Having a well-known worldwide brand wasn't enough for the Salvation Army – nor was it enough for Dish! We added two more link building techniques Penguin 2.0 seems to penalize, for a total of 7 so far:
1. Exact match anchor text
2. Spammy links to subpages
3. Link networks / schemes
4. Links from de-indexed and banned websites, including old directories
5. Link velocity "spikes"
6. Paid links
7. Sitewide links – especially blogroll and footer links

Google updates in June: Even though Panda is now "rolled into the main Google algorithm," there was some sort of refresh event last month. Google is being very tight-lipped about updates lately, and would not comment on at least one event this month that looked like an update. Some updates are said to be rolling out over the "next 1-2 months."

Black hat SEO – where SEOs attempt to fool search engines – is surviving these updates on some level. At a recent conference, Matt Cutts (Google's engineer in charge of combating webspam) mentioned specific actions against sites ranking for "payday loans" in Google UK. A few weeks later, paydayloansfrommrcutts.blog.co.uk started ranking in the top 3! We do not recommend black hat SEO for brands, companies, or sites with long-term value – but these black hats are definitely keeping Google on their toes!

With Google's upcoming (and continued) emphasis on authority links, we recommend these long-term strategies:
1. Link Building for Business Development: Making connections that also build your Google rankings.
2. Content Marketing Link Building: Using compelling content to create brand awareness and links!

Summary of Search, May 2013

Around May 22nd, there was an update to Google's search algorithms. It's being called Penguin 2.0 (or sometimes Penguin 4) and is a major update. Matt Cutts said in a recent video that, compared to the original Penguin update, this one goes much deeper. While the impact is supposed to be 2.3% of English queries, the effect is very large considering the number of Google keyword searches! Here is the full history:
- Penguin 1 on April 24, 2012 (impacting ~3.1% of queries)
- Penguin 2 on May 26, 2012 (impacting less than 0.1%)
- Penguin 3 on October 5, 2012 (impacting ~0.3% of queries)
- Penguin 4 on May 22, 2013 (impacting 2.3% of queries)

Much of the analysis of Penguin 2.0 is still in progress, but some big brands were hit, including SalvationArmy.org and even Dish.com. As far as we can tell so far, Penguin 2.0 penalized:
1. Exact match anchor text
2. Spammy links to subpages
3. Link networks / schemes
4. Links from de-indexed and banned websites, including old directories
5. Link velocity "spikes"

Penguin is impacting sites with unintentional webspam. We've seen scraper sites (targeting AdSense keywords) delivering the worst links to clients' profiles. These sites weren't created for a link building campaign, but simply for AdSense revenue for some site owner in a distant land. While they could be ignored before, they cannot be any longer. Now their penalties are our penalties. The approach we recommend is:

1. Protect
Authority link building is the only protection against both negative SEO and Penguin penalties in general. Authority links are gained primarily from great content, promotion, and involvement. One authority link can beat hundreds of spammy links in the algorithm of "the new Google."

2. Defend
Find and remove as many unnatural links as you can manually before disavowing the rest.

3. Build
Over the long term, these strategies will also help protect against Google penalties, and they are of course great marketing initiatives:
a. Great content – Copywriting has gone through an evolution, and cheap content is not going to cut it. Could it ever, though?
b. Promotion & outreach for social media marketing & inbound links – Since the web's inception, much content has been posted with little regard to promotion. Social, link building, and other outreach initiatives are vital to maximize dollars spent on premium content.
c. Brand name searches – Google knows big brands are searched for. Their "buzz" is a signal of authority, although not yet on par with link building.
d. User engagement – Once a visitor is on site, engage them. Keep their interest and involvement. Good design and excellent content have never been so important. Google has been watching this for some time.
e. Multi-tiered approaches – Spread marketing dollars broadly across many initiatives. It creates a variety of signals to Google that you are legitimate.

The month in Search

There haven't been any Penguin updates this last month, but Google Panda 3.9.1 happened on August 20, 2012. We didn't see any impact to most client rankings. The Penguin v1.2 update is still expected to happen any day now, and (Google spokesperson) Matt Cutts says to expect a bumpy ride. The early revisions of Panda were wild and somewhat "wooly."

Is page 1 top 7 now?!
Around mid-month, Google started showing only 7 results, and from fewer sites, for a good chunk of queries (estimated: 18%). Page 1 now means "top 7" for many searches. The percentage of users clicking through from positions 8-10 has been negligible in most studies, but this is a major change in how results are displayed and another clear departure from the 10 blue links of yesteryear.

Change is the rule
Rankings are more volatile than ever. One SEO shared: "Something like 80% of the Top 10 SERPs we measure change every night, to some degree." On August 10, Google posted 86 changes they made in June and July. Many were small, but those of interest to us involve the boosting of "trusted sites" (which usually means large brands) as well as changes to sitelinks. The new clustering and boosting of trusted sites is often creating monopolies for larger brands. Google used to show only 2-3 links maximum from the same website. Now it is possible for larger brands to dominate the top 7 or 8 results.

"Transition Rank" Patent Application
Google has a new patent application regarding "transition rank." It's aimed at punishing black hat SEO techniques through random ranking changes: "Some of the techniques used by rank-modifying spammers include keyword stuffing, invisible text, tiny text, page redirects, META tags stuffing, and link-based manipulation." Many SEOs are speculating this has been part of the algorithm for some time.

Changes last month in the world of Organic Search

There weren't any Penguin updates this last month either, but Google Panda 3.9 happened on July 24, 2012. We didn't see any impact to client rankings. But Google Panda updates should be a constant reminder: Have you added to your site lately? Have you added something of real value to your visitors, something that will interest them, and something they will "Like" (or plus-one)?

The Penguin v1.2 update is expected to happen any day now. With Google Penguin, websites are more vulnerable than ever to competitors practicing "negative SEO." Since the Google Penguin update actually penalizes websites for links that may not have been created by them, or for them, it is a change for the SEO industry. Some SEO companies are offering "link pruning" services, but it is quite time consuming. Webmasters of these bad websites are bordering on extortion, asking for compensation to remove links. Bing, for its part, has created a tool to disavow bad links. Google claims to be working on a similar feature in Google Webmaster Tools, but there is no news yet on when it will be ready. Some expect the tool's release to coincide with the next Penguin update.

Google sent out 20,000 "unnatural link" warnings last month, but then created some confusion by telling webmasters to ignore them. Google's Matt Cutts explains: "Fundamentally, it means we're distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it." The link building techniques he identified are:
1. "Widgetbait" – This is where sites distribute a badge or other graphic with a link back to their website. Some web stats sites send these out, and Google has noticed.
2. "Paid links" – Google wants to be the only site selling links, I think. Or maybe they just want to make sure that advertising-related links do not help rankings.
3. "Blog spam" – Blog entries and comments that are spammy detract from the web.
4. "Guestbook spam" – Guestbook / forum postings that have nothing to do with the conversation are certainly annoying, and Google does not want to encourage them with its algorithm.
5. "Excessive article directory submissions" – We do not submit to article sites. Many SEO firms have been submitting "spun" articles that resemble gibberish. Google does not see this as a good thing for the web, and it is also seeking diversity of link types.
6. "Excessive link exchanges" – Google knows webmasters are likely to exchange links where it makes sense, but does not want to see this on a mass scale.
7. "Other types of linkspam" – There are always going to be new types of linkspam – every time there is a new type of website!

Google+
Google is also rewarding sites using their Google+ social network. If you haven't created a profile and/or switched over your Google Local/Maps profile, this is a good time to get it rolling. Need help? Let us know: We'll steer you to the right partner or help you ourselves.

Notes on the Yahoo / Bing Transition

Yahoo advertisers received an email outlining a few more terms regarding the upcoming transition to the Microsoft Advertising adCenter platform. Some quick points:
1. An "adCenter" tab will show up in YSM later this month, so be sure to log in and look around.
2. Your ads can serve in adCenter right away when you transition.
3. Silverlight will continue being used. I still need to test and see what functionality might be missing when I log in from my iPhone or an older Mac. I hope whatever features are missing degrade well!
4. The changes to organic search are coming later this month. We anticipate a rocky ride, as Microsoft will likely need to make ongoing tweaks.

The email reads:

Dear Advertiser,
As your transition to the Microsoft Advertising adCenter platform approaches, we have more details to share to help you prepare for the changes to come.

Considerations for your upcoming transition

adCenter account
Soon, you'll need to either create a new adCenter account, or link an existing adCenter account to your Yahoo! Search Marketing account. Later this month, you'll see an "adCenter" tab within your Yahoo! Search Marketing account. Clicking there will take you to the beginning of the account transition process, where we'll walk you through the simple steps to create or link accounts.

Budgeting
Once you create your adCenter account, it will be active and your ads will be eligible to serve on Bing right away. As a result, you'll be managing both your new adCenter account and your existing Yahoo! Search Marketing account in parallel until ad serving for Yahoo! traffic transitions to adCenter, so plan to budget accordingly.

Microsoft Silverlight
With Silverlight installed, you'll be able to see and address key differences between your Yahoo! and adCenter accounts as you transition. Download Silverlight now.

Organic search transition
Yahoo! organic search results will be powered by Bing as early as late August. If organic search results are an important source of referrals to your website, you'll want to make sure that you're prepared for this change. For more details, check out this blog post.

As we've stated previously, our primary goal is to provide a quality transition experience for advertisers in the U.S. and Canada in 2010, while protecting the holiday season. However, please remember that as we continue to go through our series of checkpoints, if we conclude that it would improve the overall experience, we may choose to defer the transition to 2011. We are committed to making this transition as seamless and beneficial for you as possible. We appreciate your business, and look forward to bringing you the benefits of the Yahoo! and Microsoft Search Alliance.

Sincerely,
Your Partners in the Search Alliance, Yahoo! and Microsoft

4 reasons to 301 redirect old subpages ASAP

After a major website redesign, it's not uncommon for page locations and even page extensions to change. Maybe you've switched web development languages, or changed your website's structure into an SEO-friendly themed set of silos. Whatever the reason page locations have changed, it's vital that the old page locations are 301 redirected to the appropriate new pages. It's time-sensitive for the developers to make the change, because:
1. Pages will start dropping out of the index. Google hates sending visitors to bad pages, and can see the bounce rate skyrocket. When Googlebot comes to visit your site, it will probably receive a "404 Error Page" along with a 404 HTTP status code. A 404 status code is the surefire way to get a page out of Google's index.
2. Humans who have bookmarked the old page will be stranded. Depending on the 404 error page (your server's default is simply awful), your loyal return visitor may think the entire site is down.
3. Search engines will stop counting the power of the links coming into broken pages, and rankings will drop. Search engines do not count links to missing pages. The wonderfully diverse link profile you've built over the years can disappear as links to subpages are no longer counted.
4. Webmasters linking into subpages might notice the 404 and remove their links. Some webmasters routinely monitor where they are linking to, and remove links to broken destinations.

Don't make the most common of 301 redirect errors: sending everything to the home page. To preserve a diverse link profile, you'll want to keep those links spread naturally across your site's homepage AND subpages. A quick way to spot-check your redirects is sketched below. Happy 301 redirecting!
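As referenced above, here is a minimal sketch, using only Python's standard library, for spot-checking that old URLs now return a 301 pointing at the intended new pages. The URL pairs are hypothetical, and the Location comparison is deliberately simplistic (some servers return relative or differently-normalized Location headers).

```python
# Spot-check that old page locations return a 301 to the expected new pages.
# The old -> new URL pairs below are hypothetical; replace them with your own.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Stop urllib from following redirects so we can inspect the 301 itself.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

redirect_map = {
    "http://www.example.com/old-page.html": "http://www.example.com/new-page/",
    "http://www.example.com/services.php": "http://www.example.com/services/",
}

for old_url, expected in redirect_map.items():
    try:
        response = opener.open(old_url)
        print(old_url, "-> no redirect (status %d)" % response.getcode())
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        status = "OK" if err.code == 301 and location == expected else "CHECK"
        print("%s: %d -> %s [%s]" % (old_url, err.code, location, status))
    except urllib.error.URLError as err:
        print(old_url, "-> request failed:", err)
```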

Denver SEO / Colorado SEMPO communities flourishing

The Denver SEO Meetup and the Colorado working group of SEMPO have seen tremendous growth in the last year. In the ever-developing world of search marketing, the meetups have become excellent resources for search marketing professionals looking to network, as well as for the professional development opportunities provided by SEMPO's excellent speakers. Last week, our president Jim Kreinbrink spoke about "Driving traffic to your blog with SEO techniques." It was a technical presentation that gave away many great tidbits. The audience was full of experienced search marketers, and we hoped to show the value of collaboration and community. The previous month, two excellent PPC case studies were presented by Alex Porter from Location 3 Media. Seeing the approaches Location 3 took for two PPC campaigns, and the results attained, was very exciting. Search marketing is growing in a recession, so expect a packed house. The focus on measurable, trackable results makes it particularly appealing to agencies and advertisers alike. All this means that the Denver search marketing community will continue to grow and flourish.

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We've pinpointed 13 reasons why Google may give – or appear to give – sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.
1. Pinging – Most blog software sends out a "ping" when there is a new post. Instead of waiting for a search engine crawler to come across your site's new content – either via a routine crawl or via a link – a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content.
2. RSS feeds provide deep links to content – RSS feeds are useful for so many, many things. They contain links to your latest postings, but also consider that they contain links right to the postings themselves. Even crawlers that aren't that smart (you know who you are, little bots!) can figure out how to find a link in a list. That's essentially all an RSS feed is: a list of links in a predictable format. Hint: You subscribed to your feed in iGoogle, didn't you?
3. A standard sitemap.xml provides deep links to content – If an RSS feed isn't enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site. (A generation sketch follows at the end of this post.)
4. Based on modern HTML design standards – Most blogging software was created or updated very recently, and doesn't use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.

Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.
5. Fresh content, updated often – Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from both humans and robots alike!
6. Fresh comments, updated often – Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.
7. Keyword-rich categories, tags, and URLs – Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren't using keyword-rich categories and tags, you really should be.

Popular: Google looks at what other sites link to your site, how important they are, and what anchor text is used.
8. RSS feeds provide syndication – RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.
9. Extra links from blog & RSS feed directories – The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.
10. Linking between bloggers / related sites – Blogrolls are links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can be attained through blogrolls.
11. Social bookmarking technologies built in – Blog posts are usually created with links directly to social bookmarking services like Delicious.com, StumbleUpon, and other social bookmarking sites. You've never made it easier for your audience to share your posting and give you a link!
12. Tagging / categories with relevant words – Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.
13. Trackbacks (conversations) – Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!
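For the sitemap.xml reason above, here is a minimal sketch of generating such a file with Python's standard library. The URLs, dates, changefreq, and priority values are placeholders, and in practice most blogging platforms or plugins will generate the sitemap for you automatically.

```python
# A minimal sketch of generating a sitemap.xml for recent blog posts using only
# the standard library. URLs, dates, changefreq, and priority are placeholders.
import xml.etree.ElementTree as ET

posts = [
    # (URL, last modification date) - placeholder values
    ("http://www.example.com/blog/13-reasons-google-loves-blogs/", "2013-08-01"),
    ("http://www.example.com/blog/301-redirect-old-subpages/", "2013-07-15"),
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in posts:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod
    ET.SubElement(url, "changefreq").text = "weekly"   # hint for crawl frequency
    ET.SubElement(url, "priority").text = "0.8"        # relative importance hint

# Write the file to the site root, then reference it from robots.txt or submit
# it in Webmaster Tools.
ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```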