February 2014 Summary of Search:
Do as I say, not as I do

“Do as I say, not as I do.” Sometimes Google does things it warns others not to do:

1. Don’t be top heavy
Google just updated its “Top Heavy” algorithm. Sites that show many ads at the top, or make users scroll to see content, can be penalized.

2. Don’t scrape content from other websites
Matt Cutts of Google is actively seeking reports of what would be considered “scraper sites”. One SEO responded with a screenshot of Google scraping Wikipedia. 🙂
http://www.seroundtable.com/google-scraper-site-report-18184.html
In other news, Google will now start showing restaurant menus for those keyword searches. But the restaurant brands do not know exactly where Google is scraping this data from, or how to update it. Read the whole scoop here: http://searchengineland.com/now-official-google-adds-restaurant-menus-search-results-185708

3. Links on user generated content sites that pass PageRank
For most sites, Google insists that links created by site visitors be “nofollow”. But Google+ allows links that are curiously “dofollow”. Other sites could indeed be penalized for this.

4. Sell links
Almost all of Google’s almost $17 billion in revenue from last quarter came from “selling links”. But of course, those aren’t “dofollow”.

A couple more items have garnered Google’s attention:

1. Rich snippets should be used for good, not evil
Google has been levying a manual penalty against sites using rich snippets in a spammy fashion.
http://www.link-assistant.com/news/rich-snippets-penalty.html

2. Don’t try to insert too many keywords into your business listing
There used to be a distinct advantage in having your keywords in your business name. Now Google wants to make sure the name you use in your business listing matches your real-world business name:
– Your title should reflect your business’s real-world title.
– In addition to your business’s real-world title, you may include a single descriptor that helps customers locate your business or understand what your business offers.
– Marketing taglines, phone numbers, store codes, or URLs are not valid descriptors.
– Examples of acceptable titles with descriptors (in italics for demonstration purposes) are “Starbucks Downtown” or “Joe’s Pizza Delivery”. Examples that would not be accepted include “#1 Seattle Plumbing”, “Joe’s Pizza Best Delivery”, or “Joe’s Pizza Restaurant Dallas”.
See more: https://support.google.com/places/answer/107528?hl=en

So what to do? Create a content generating, curating, sharing machine:
1. Post full versions of your content to your site, but also to Google+ and LinkedIn, and promote your content at other relevant places around the web.
2. Tag your content with rich snippets, Facebook Open Graph, and Twitter Cards to increase its “shareability” and categorization.

PSST! Need a free link? We’d like to help you promote your own business, hoping more work for you brings more work our way! Join our newsletter for our suggestion this month: It’s a site with a PageRank of 9!
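For point 2 above, here is a minimal sketch of the kind of markup we mean. The page title, URLs, and image path are hypothetical placeholders; the property names themselves come from the Open Graph, Twitter Cards, and schema.org specifications.

    <!-- Open Graph tags for Facebook sharing -->
    <meta property="og:title" content="Example Post Title" />
    <meta property="og:type" content="article" />
    <meta property="og:url" content="http://www.example.com/blog/example-post/" />
    <meta property="og:image" content="http://www.example.com/images/example-post.png" />

    <!-- Twitter Card tags -->
    <meta name="twitter:card" content="summary" />
    <meta name="twitter:title" content="Example Post Title" />
    <meta name="twitter:description" content="A one-sentence summary for the card." />

    <!-- schema.org microdata marking the post up as an Article -->
    <div itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">Example Post Title</h1>
      <p>Post body...</p>
    </div>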

Summary of Search, May 2013

Around May 22nd, there was an update to Google’s search algorithms. It’s being called Penguin 2.0 (or sometimes Penguin 4) and is a major update. Matt Cutts said in a recent video that, compared to the original Penguin update, this one goes much deeper. While the impact is supposed to be 2.3% of English queries, the effect is very large considering the number of Google keyword searches! Here is the full history:

Penguin 1 on April 24, 2012 (impacting ~3.1% of queries)
Penguin 2 on May 26, 2012 (impacting less than 0.1%)
Penguin 3 on October 5, 2012 (impacting ~0.3% of queries)
Penguin 4 on May 22, 2013 (impacting 2.3% of queries)

Much of the analysis of Penguin 2.0 is still in progress, but some big brands were hit, including SalvationArmy.org and even Dish.com. As far as we can tell so far, Penguin 2.0 penalized:
1. Exact match anchor text
2. Spammy links to subpages
3. Link networks / schemes
4. Links from de-indexed and banned websites, including old directories
5. Link velocity “spikes”

Penguin is impacting sites with unintentional webspam. We’ve seen scraper sites (targeting AdSense keywords) delivering the worst links to clients’ profiles. These sites weren’t created for a link building campaign, but simply for AdSense revenue for some site owner in a distant land. While they could be ignored before, they cannot be any longer. Now their penalties are our penalties. The approach we recommend is:

1. Protect
Authority link building is the only protection against both negative SEO and Penguin penalties in general. Authority links are gained primarily from great content, promotion, and involvement. One authority link can beat hundreds of spammy links in the algorithm of “the new Google”.

2. Defend
Find and remove as many unnatural links as you can manually before disavowing the rest (a sample disavow file follows this post).

3. Build
Over the long term, these strategies will also help protect from Google penalties, and are of course great marketing initiatives:
a. Great content
Copywriting has gone through an evolution, and cheap content is not going to cut it. Could it ever, though?
b. Promotion & outreach for social media marketing & inbound links
Since the web’s inception, much content has been posted with little regard to promotion. Social, link building, and other outreach initiatives are vital to maximize dollars spent on premium content.
c. Brand name searches
Google knows big brands are searched for. Their “buzz” is a signal of authority, although not yet on par with link building.
d. User engagement
Once a visitor is onsite, engage them. Keep their interest and involvement. Good design and excellent content have never been so important. Google has been watching this for some time.
e. Multi-tiered approaches
Spread marketing dollars broadly across many initiatives. A variety of signals tells Google you are legit.
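On disavowing: Google’s disavow file is a plain text list of URLs and domain: entries, with # comment lines. A minimal sketch, with hypothetical domains standing in for the spammy links:

    # Directory owner never responded to our removal requests
    domain:spammy-directory.example.com

    # One bad page on an otherwise fine site
    http://blog.example.net/spun-article-page.html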

Changes last month in the world of Organic Search

There weren’t any Penguin updates this last month either, but Google Panda 3.9 happened on July 24, 2012. We didn’t see any impact on client rankings. But Google Panda updates should be a constant reminder: Have you added to your site lately? Have you added something of real value to your visitors, something that will interest them, and something they will “Like” (or plus one!)?

The Penguin v1.2 update is expected to happen any day now. With Google Penguin, websites are more vulnerable to competitors practicing “negative SEO” than ever before. Since the Google Penguin update actually penalizes websites for links that may not have been created by them, or for them, it is a change for the SEO industry. Some SEO companies are offering “link pruning” services, but it is quite time consuming. Webmasters on these bad websites are bordering on extortion: asking for compensation to remove links. Bing, for its part, has created a tool to disavow bad links. Google claims to be working on a similar feature in Google Webmaster Tools, but there’s no news yet on when it will be ready. Some expect the tool’s release to coincide with the next Penguin update.

Google sent out 20,000 “unnatural link” warnings last month, but then created some confusion by telling webmasters to ignore them. Google’s Matt Cutts explains: “Fundamentally, it means we’re distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it.” The link building techniques he identified are:

1. “Widgetbait”
This is where sites distribute a badge or other graphic with a link back to their website. Some web stats sites send these out, and Google has noticed. (See the badge markup sketch after this list.)

2. “Paid links”
Google wants to be the only site selling links, I think. Or maybe they just want to make sure that advertising-related links do not help rankings.

3. “Blog spam”
Blog entries and comments that are spammy detract from the web.

4. “Guestbook spam”
Guestbook / forum postings that have nothing to do with the conversation are certainly annoying, and Google does not want to encourage them with its algorithm.

5. “Excessive article directory submissions”
We do not submit to article sites. Many SEO firms have been submitting “spun” articles that resemble gibberish. Google does not see this as a good thing for the web, and is also seeking diversity of link types.

6. “Excessive link exchanges”
Google knows webmasters are likely to exchange links where it makes sense, but does not want to see this on a mass scale.

7. “Other types of linkspam”
There are always going to be new types of linkspam - every time there is a new type of website!

Google+: Google is also rewarding sites using their Google+ social network. If you haven’t created a profile and/or switched over your Google Local/Maps profile, this is a good time to get it rolling. Need help? Let us know: We’ll steer you to the right partner or help you ourselves.
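On widgetbait: the safer version of a distributed badge carries rel="nofollow" so the embedded link does not pass PageRank. A hypothetical example of the embed code a stats vendor might hand out:

    <!-- Badge embed code; rel="nofollow" keeps the link from passing PageRank -->
    <a href="http://stats-vendor.example.com/" rel="nofollow">
      <img src="http://stats-vendor.example.com/badge.png" alt="Site stats by Example Stats" />
    </a>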

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We’ve pinpointed 13 reasons why Google may give - or appear to give - sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.

1. Pinging
Most blog software sends out a “ping” when there is a new post. Instead of waiting for a search engine crawler to come across your site’s new content - either via a routine crawling or via a link - a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content.

2. RSS feeds provide deep links to content
RSS feeds are useful for so many, many things. They contain links to your latest postings, but also consider that they contain links right to the postings themselves. Even crawlers that aren’t that smart (you know who you are, little bots!) can figure out how to find a link in a list. That’s essentially all an RSS feed is: a list of links in a predictable format. Hint: You subscribed to your feed in iGoogle, didn’t you?

3. A standard sitemap.xml provides deep links to content
If an RSS feed isn’t enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site. (See the sample sitemap after this list.)

4. Based on modern HTML design standards
Most blogging software was created or updated very recently, and doesn’t use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.

Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.

5. Fresh content, updated often
Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from humans and robots alike!

6. Fresh comments, updated often
Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.

7. Keyword-rich categories, tags, and URLs
Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren’t using keyword-rich categories and tags, you really should be.

Popularity: Google looks at what other sites link to your site, how important they are, and what anchor text is used.

8. RSS feeds provide syndication
RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.

9. Extra links from blog & RSS feed directories
The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.

10. Linking between bloggers / related sites
Blogrolls are links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can be attained through blogrolls.

11. Social bookmarking technologies built in
Blog posts are usually created with links directly to social bookmarking services like delicious.com, StumbleUpon, and other social bookmarking sites. It’s never been easier for your audience to share your posting and give you a link!

12. Tagging / categories with relevant words
Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.

13. Trackbacks (conversations)
Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!
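For reason 3 above, here is a minimal sitemap.xml sketch following the sitemaps.org protocol. The URL and date are hypothetical; changefreq and priority are the extra hints the protocol lets you pass to crawlers:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/blog/latest-post/</loc>
        <lastmod>2009-01-15</lastmod>
        <!-- how often a robot should revisit this page -->
        <changefreq>weekly</changefreq>
        <!-- importance relative to other pages on the site, 0.0 to 1.0 -->
        <priority>0.8</priority>
      </url>
    </urlset>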

Why Flash is still a problem in 2009

Flash is less of a problem for search engines than it used to be, but there are still caveats. Flash’s problems can be easily mitigated by offering footer links and regular HTML text content on any pages with Flash. It’s only an issue when no alternative content or navigation is offered. Here’s the longer story. Flash’s problems depend on the implementation:

If developers do not implement Flash detection, pages can appear broken to visitors. They leave the site and/or do not convert to prospects/leads/sales.

If Flash detection is done poorly, it can be seen as cloaking by search engines - that is, returning different content to search engines than to visitors. This is rare, but possible.

If Flash is the sole navigation for search engines and human visitors to follow, search engines cannot spider the site. This is the kiss of death you’ve probably heard about.

Some claim it isn’t a problem any more because:

Adobe has implemented better accessibility in the last few versions. But these links are still hard to follow and rarely rank well in the engines. MSN/Live has enough problems with HTML links, and probably will not find the content. Also, the landing page where visitors would arrive sometimes doesn’t show properly - it could be a part of a Flash animation that doesn’t load, etc.

Google made a deal with Adobe that allows Flash to be crawled more easily. But again, these links are still hard to follow and rarely rank well in the engines. Google seems to be looking more for hidden redirects and other black hat techniques with their Adobe API deal.

So what can you do to make sure your content is accessible to search engines, and seen as a valuable landing page for organic search visitors? Nothing beats good old fashioned HTML: links that can be followed, and relevant keywords marking the content from its anchor text and title tags down to its keyword density.
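One simple way to offer that alternative content is to put real HTML inside the object tag that embeds the movie, so visitors without Flash (and search engine crawlers) see the markup instead. A minimal sketch with hypothetical file names and copy:

    <object type="application/x-shockwave-flash" data="intro.swf" width="600" height="400">
      <param name="movie" value="intro.swf" />
      <!-- Fallback HTML for crawlers and visitors without the Flash plugin -->
      <h2>Example Widgets of Denver</h2>
      <p>We build affordable widgets. <a href="/products/">Browse our product line</a>.</p>
    </object>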

Colorado Search Marketing Training

Hyper Dog Media is presenting a day-long Search Marketing Presentation in Las Animas, Colorado on February 6, 2009. Three sessions will cover the basics of Search Engine Optimization, Pay Per Click Advertising, and a revolutionary “Solutions Clinic” - providing quick fixes to attendees’ websites in real time. The first session, Search Engine Optimization, addresses increasing web site rankings in Google, Yahoo, and more. SEO is all about helping the search engines see and understand the content of your website. Search engines want to be successful in directing visitors to quality destinations, and SEO should be focused on connecting with the right visitors. The second session focuses on targeting potential customers with PPC (Pay Per Click) and other advertising. It’s possible to waste enormous amounts of money on Pay Per Click advertising networks like Google AdWords. This session will show how to make your limited budget work most efficiently for your business. The third session builds on the first two. The Solutions Clinic is for businesses that already have a web site and want real-time evaluation and solutions for their site. Bring your hosting information, and we might just be able to fix it on the spot! The training is sponsored by Southeast Business Retention, Expansion, and Attraction. For more information or to register, call the SEBREA office at 719-336-1523.

Denver SEMPO Meetup / Denver SEO Meetup

Why travel outside of Denver for great SEO and Search Engine Marketing events? Last week saw great attendance at the new Denver SEMPO Meetup (created by the members of SEMPO’s Colorado Working Group). This week’s Denver SEMPO Meetup was an excellent educational program provided by Jim Brown, online marketing guru for Quark (of QuarkXPress fame). The presentation focused on opportunities in social media. Jim provided great information regarding Twitter, Facebook, and Facebook ads. While his presentation was friendly to all audiences, even seasoned Denver SEO professionals left with a new trick or two. Most valuable were the brand ambassador experiences Jim relayed to the group. The Denver SEO Meetup followed, just a few blocks away. Many members attended both meetup groups. The Denver SEO Meetup is not an educational program but a social function, founded by our President Jim Kreinbrink. Many notable SEO professionals regularly attend, but Search Marketing, Advertising, and Affiliate Marketing professionals are also frequenting the meetup. Several SEOs noticed glitches in running Google ranking reports for clients that week, and it was nice to exchange what was working and not working in small informal conversations. Of course, don’t come to the Denver SEO Meetup hoping to learn all about SEO: It’s a more relaxed networking function, not an educational opportunity. With SEO / SEM knowledge and professional networking available here in Colorado, why travel to search marketing and ad industry conferences every weekend?

Denver SEO Meetup – 1 Year Anniversary

It’s been one whole year since our President Jim Kreinbrink founded the Denver SEO Meetup. We have now had 13 meetups, with 119 members and growing. Our expectations about the number and types of SEOs we’d meet have been exceeded, as noted Denver SEO professionals large and small have attended. Among our top lessons:

1. We have great synergies with attendees from related industries
Several great contributors to the Denver SEO Meetup aren’t even SEOs - they are affiliate or internet marketing professionals from the Denver/Boulder area, or SEO folks looking to hire or be hired. While the group is targeted toward full-time SEO professionals, it’s been a happy accident that we’ve also attracted so many other great members.

2. Denver web designers and webmasters attend, expecting a learning group
Several webmasters have attended or joined the group, and left disappointed when free SEO training wasn’t offered. All Denver SEO experts started as beginners at some point, but the meetup is really targeted toward socializing, not educating. Unfortunately, there have been hurt feelings. We have heard the cries, and are working in conjunction with Colorado SEMPO to provide a mixture of educational programs in addition to this social event.

3. SEOs like beer, wine, and socializing, not laser tag
The Denver SEO Meetup was initially a laser tag group. Of one. It didn’t take long to figure out that should change.

4. Denver SEOs are normal people. Even the “Black Hats”. Especially the “Black Hats”.
Denver SEOs have families, pet sites, hobbies, etc. Even the black hats. More than just search engine optimization rules their worlds. Some of the best SEO conversations have started about families, pets, travel, and things without any acronyms whatsoever.

If you are a Denver SEO firm, search marketing agency, SEO freelancer - or a curious Black Hat - consider this an invitation to join the group. To socialize, network, and relax a little. Hope to see you there!

9 ways Google is discovering the invisible web

There are many parts of the web that Googlebot has not been able to access, but Google has been working to shrink that territory. Google wants to find content, and while many webmasters do not make it easy, Googlebot finds a way.

1. Crawling Flash!
Adobe announced today that they have released technology and information to Google and Yahoo enabling them to crawl Flash files. It may take the search engines some time before they are able to integrate and implement these abilities, but a time is coming when rich media is less of a liability. I wonder if MSN/Live was left out to prevent them from reverse engineering Flash for their new Silverlight competitor? At any rate, MSN is still working on accessing text links, so let’s not swamp them.

2. Crawling forms
Googlebot recently started filling out forms on the web in an attempt to discover content hidden behind jump menus and other forms. See our previous article if you’d like to keep Google out of your forms (there is also a robots.txt sketch after this list).

3. Working with government entities to make information more accessible
A year or so ago, Google started providing training to government agencies to assist them in getting their information onto the web. I’m assuming much of that information has been hidden behind URLs with large numbers of parameters.

4. Crawling JavaScript
Many menus and other dynamic navigation features have been created in JavaScript, and Googlebot has started crawling those as well. Instead of relying on webmasters to provide search-friendly navigation, Google is finally getting to access sites created by neophyte webmasters who haven’t been paying attention.

5. Google’s patent to read text in images
Google also knows many newbie webmasters use text buttons for navigation. By attempting to read text in images, Googlebot will once again be able to open up previously inaccessible areas of a site.

6. Inbound links
Of course, Googlebot has always been great at following inbound links to new content. Much of the invisible web has been discovered just through humans linking to a previously unknown resource.

7. Submission
Of course, you can always submit a page location of currently invisible content to Google. This is usually the slowest way, especially compared to inbound links.

8. Google Toolbar visits, analytics
Recently, many Denver SEO professionals have noticed links being indexed that have not been submitted. The only plausible explanation is that Google has been mining its toolbar and analytics data for information about new URLs. Be careful - Google is watching and sees all!

9. Sitemap.xml files
The somewhat new sitemap.xml protocol is very helpful for webmasters and Googlebots alike in getting formerly invisible content into Google’s hands.
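On keeping Googlebot out of your forms (point 2 above): the usual tool is robots.txt. A minimal sketch, assuming your form handlers live under hypothetical /cgi-bin/ and /search-results/ paths:

    # Keep all crawlers away from form handlers and their result pages
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /search-results/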

Fortifying External Links and Laundering Link Juice

This is a guest post by Everett Sizemore. I had an interesting discussion with a friend before publishing this about what the title of the article should be, and how I should present the tactic. Should I present it the way I use it personally (fortifying external links) or the way it could be used (laundering link juice)? In the end I decided to do a bit of both, since it is essentially the same tactic with different implementations and intentions.

Fortifying External Links: The process of second-degree link building by which an SEO builds links into sites that already link to them, for the purpose of increasing the PageRank of those sites and thus indirectly increasing their own PageRank.

The first step in fortifying external links is to find those links. This is the easiest part. Use Google Webmaster Tools, Yahoo Site Explorer, or your favorite link building tool to find out who is linking to you. I’ve found the best way to go about this is by running the link:yourdomain.com command in Google Blog Search, or using a tool like SEO SpyGlass to uncover blogs that have linked to you that are in danger of going supplemental (i.e. no-to-low PR, way back in the archives…) or that link to you with your favorite anchor text.

Once you have decided which pages you want to drive links into, the second step is to devise a plan for getting those links. There are dozens of ways to build links into someone’s domain, but that’s not the topic of this article. Regardless of your means, the end result is that you are driving more links - thus PageRank - into the pages that already link to you, increasing your own PageRank and ensuring that those pages don’t go supplemental.

I wouldn’t do this to dozens or hundreds of pages at once. Instead, use Google Alerts and/or Yahoo Alerts to subscribe to link:yourdomain.com so you know whenever someone links to you naturally. If you think the post/page isn’t going to get any PageRank on its own, give it a little help by linking to it from a distributed article, a thoughtful dofollow comment on another site, a social bookmarking profile, or any number of linking opportunities.

Laundering Link Juice: Creating a degree of separation between your site and less-than-white-hat link building tactics by driving those links into pages that already link to you naturally, instead of sending them to your own site.

Google makes it a point to say they “try” very hard not to let people harm other domains with link sabotage. Nevertheless, you should be respectful of sites that link to you. If someone gives you a bit of link love from their unknown blog, please don’t do anything that you know could get their site banned. I have found this tactic to be most effective when used along with white-to-gray link building techniques like manually submitted blog comments and article distributions. Abuse this and Google - if they don’t already - WILL eventually learn to find the common denominator (a link to your site) among all of these relatively unknown, otherwise clean blogs that you’re laundering link juice through.

Call it “fortifying external links”, “laundering link juice”, or just common-sense SEO, but it is one of the few optimization tactics that I use as much today as I did three years ago.

Socialize with Everett Sizemore: eCommerce SEO
http://www.twitter.com/balibones
http://www.digg.com/balibones
http://www.plurk.com/user/balibones
http://www.mixx.com/users/balibones
http://balibones.stumbleupon.com/