Speed is Everything

Page loading speed has become very important to Google. From mobile visitors to Googlebot, every visitor appreciates a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google recently provided stats showing just how impatient they are:

– The average mobile page takes 22 seconds to load, yet 53% of users abandon a page that takes longer than 3 seconds!
– Even mobile landing pages in AdWords were found to average 10 seconds of loading time.

There are many easy changes sites can make; the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer": 30% of pages could save more than 250KB that way.

2. Ranking factor

A few years back, Google made page speed a small ranking factor, or at least finally became explicit that it was one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of things to fix. Their status as a ranking factor is a strong signal that they should be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, it knows human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget.
Slower sites can be overwhelmed by crawl activity. If you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign: even reasonably fast sites often need more crawl budget, not less.

Tools and Fixes

Luckily, there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site's images. And some items simply won't be worth fixing, so concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool tests how mobile-friendly your site is.
– GTmetrix.com features a "waterfall" view showing which page items load at which stage, plus history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
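Text compression is usually the quickest win of the bunch. As a sketch, assuming an Apache server with mod_deflate available (directives can vary by server version and host), it can be enabled with a few lines of configuration:

```apache
# Compress common text resources before sending them to visitors
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

Note that this only covers text: image compression still has to happen in an editor or a build step.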

Kick-Start Your SEO in 2015

The search engine optimization (SEO) industry has certainly evolved these last few years. The many Google updates – and their sometimes heavy-handed penalties – along with an explosion of mobile traffic have reshaped the rules for SEO and online marketing. When we look at what's working at the end of 2014, we see just how much everything has changed, and big changes in SEO will certainly continue for 2015 and beyond. Here are six things to focus your efforts on in 2015:

1. Mobile

If you haven't already, it's time to take a mobile-first approach with responsive website design. As mentioned in last month's blog all about mobile, Google has a new tool (and new expectations) around mobile friendliness. Test your site here: https://www.google.com/webmasters/tools/mobile-friendly/

2. Rich Snippets

These underlying webpage code elements help Google and other sites understand when to show review stars, customized descriptions, and more, all of which are vital to your site's rankings and click-through rate. Consider:

– A study last year showed an average rankings increase of four positions when rich snippets were implemented.
– In one case study, 30% more visitors clicked through from search results to a site with rich snippets.
– John Mueller of Google recently requested that examples of rich snippet "spam" in Google be sent directly to him. It must be working, and it must be valuable, if Google is looking for spam!

There are many examples of different rich snippets at http://schema.org, a site and format created by Google, Yahoo and Bing. Some types include recipes, products, events, locations, people, and ratings. Other formats are also being provided by social media sites: Facebook Open Graph tags, LinkedIn cards, Twitter cards, and even Pinterest pin cards. Consider how a tweet of a site using Twitter cards looks better than the standard tweet: when Twitter is given data in the Twitter card format, it provides a much richer experience for viewers of that tweet.
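Twitter builds that richer display from a few meta tags in the page's head. A minimal sketch of a "summary" card (the handle, title, and URLs here are placeholders):

```html
<!-- Minimal Twitter "summary" card -->
<meta name="twitter:card" content="summary">
<meta name="twitter:site" content="@YourBrand">
<meta name="twitter:title" content="Six SEO Priorities for 2015">
<meta name="twitter:description" content="Mobile, rich snippets, Universal Analytics, and more.">
<meta name="twitter:image" content="https://www.example.com/preview.jpg">
```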
And there are many different types of Twitter cards too: galleries, large images, video players, etc.

3. Universal Analytics

Google Analytics is finally getting an upgrade. In the past, data about site visitors was lost if they visited several of a brand's website properties, switched devices, or let an extended period of time pass between visits. Universal Analytics fixes that, and even allows custom dimensions and extensive customization. The system came out of beta testing in 2014 and will eventually be a requirement. Is the transition on your radar? If not, better get to it! Google will not be adding new features to classic Analytics and will eventually force webmasters to make the switch.

4. Link Disavowal

Google's Penguin penalty has made this a necessity. Do you know where your site has links? Most webmasters do not. And many links that were key in the past must now be disavowed in Google's Webmaster Tools. That is the price we pay for Google's ever-changing formula! Here are some possible sources of problematic links:

– "Site-wide" footer links: Are other sites linking to you from every page, or in their footer? Google no longer sees this as a positive thing.
– Links from 2004-2012: If your SEO plan included creating links during this period, you should have a link analysis performed. Even if Google's guidelines were being followed, it's vital to make sure those links are still the kind Google wants to see.
– Low-quality links: You know these when you see them. Would you visit the site a link is on? Does Google still see any authority there? These are important considerations for your links!
– Links from penalized sites: Sites that were once in Google's good graces might have since switched hands or been penalized.
– Negative SEO: SEOs used to debate whether a site's rankings could be hurt from the outside. Now it's commonly accepted that negative SEO is possible and happening throughout the web. Some sites are building low-quality links, links on penalized sites, etc.
pointing to competitors' websites!

5. Migrate Your Site to HTTPS

Are you planning to migrate your entire site to HTTPS? Recent comments from Google make this a more important consideration. A member of the Google Chrome browser team recently remarked that anything less than HTTPS is like leaving the front door unlocked. On the search side, HTTPS has been identified as a minor ranking signal, so migrating your site should be considered. Just be sure you don't create duplicate content by accident!

6. Use Content Marketing for Link Authority

Content marketing is the new link building. It's authentic marketing that can also boost your site's rankings, provided it's done with an emphasis on quality outreach. When done correctly, content marketing brings:

– social sharing
– brand visibility
– inbound links (with authority)
– referral traffic

Search engine optimization will always be changing: technology moves at breakneck speed, and search engines have ever-evolving criteria and expectations. Having these six items on your radar will help carry you nicely into the new year, and then some. The year 2016 may be completely different, but these are good solid investments of time and money.

Need a good interactive agency or website design firm? We've worked with many and partnered with the best. Talk to us about your needs, and we'll introduce you to the right match!

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

Penguin 3.0: A year in the waiting

Google's "Penguin Updates" target the easiest link building practices. Since Google's algorithm uses links to determine whether a website deserves to rank, Penguin punishes sites that appear to be getting links in an automated fashion.

– Penguin Update 1: April 24, 2012 (dubbed v1.0)
– Penguin Update 2: May 25, 2012
– Penguin Update 3: October 5, 2012
– Penguin Update 4: May 22, 2013 (dubbed v2.0)
– Penguin Update 5: October 4, 2013
– Penguin Update 6: October 17, 2014 (dubbed v3.0)

Penguin 3.0 was the sixth Penguin Update from Google, and actually much smaller than the original Penguin Update. It started on October 17 and is still rolling out, but it hasn't hit as hard as previous updates:

1. Google says less than 1% of queries will be affected. That's less than a third of the original Penguin Update.
2. No new "signals" were added. It was more of a "refresh" than an update. For sites that had disavowed or removed heavy amounts of links, it was a welcome change.
3. Talk of a larger Penguin update has already started, expected in spring of 2015.

Vigilance and Risk Management

Last year's update also opened sites up to more dirty tricks from competitors. Negative SEO has been possible for a long time, and only recently acknowledged by Google. The newest forms of negative SEO put a competitor's site into Google's crosshairs with:

– Links from the worst kinds of sites
– Links targeting the worst kinds of keywords
– Links targeting the right keywords, but in unnatural amounts
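For links you cannot get removed, the defense is Google's disavow tool, which accepts a plain text file. A sketch of the format (the domains here are made up):

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single linking page:
http://bad-blog.example/post-with-a-link-to-us.html
```

Remember to do a manual removal pass first; disavowing is the last resort, not the first.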

The Walking Dead, Google Authorship Edition

Summary of Search

Google recently announced the end of Google Authorship, a feature the SEO community thought might become a major part of Google's ranking formula. With Google Authorship, photos of writers were shown in Google's search results when rel="author" and rel="me" tags were embedded pointing to their Google+ profile. In December 2013, Google reduced the number of authorship photos showing in its search results. Then photos were removed altogether in June. And finally, Google completely removed Authorship from its search results last week.

Low Adoption Rates by Webmasters and Authors
Authorship was sometimes difficult to implement, and not appropriate for all sites. Many brands didn't feel a person's photo was the best representation in Google's search results.

Provided Low Value for Searchers
Some studies showed an increase in click-throughs for listings with Google Authorship. But Google found users were often being distracted from the best content.

Snippets that Matter
Google's representative John Mueller did provide Google's future direction: expanding support of Schema.org. "This markup helps all search engines better understand the content and context of pages on the web, and we'll continue to use it to show rich snippets in search results." The rich snippets for "People" and "Organization" are certainly something to include where possible and applicable.

Implications for Google+
Google+ adoption is well below expectations, especially considering the tie-in with popular services such as Gmail and YouTube. Google Authorship was also tied in, and was meant to improve the search rank of those producing great content. With the death of Google Authorship, it looks like one more "nail in the coffin" for Google+.

Are Authors Important?
Some interesting bits of information have been given away by Google.
Amit Singhal, the head of Google Search, said that Author Rank was used for the "In-depth articles" section, which appears in 12% of Google's search results. Google has also long been able to read bylines: these were used before Google patented "Author Rank" in 2007, are more naturally included where applicable, and are likely to continue being used.
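For the record, the now-retired authorship markup looked roughly like this (the profile URL and name are placeholders):

```html
<!-- On the article page, linking the byline to the author's Google+ profile -->
<a href="https://plus.google.com/110000000000000000000?rel=author">Jane Doe</a>

<!-- Or as a link tag in the page head -->
<link rel="author" href="https://plus.google.com/110000000000000000000">
```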

How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Their Panda Updates, now almost three years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 just rolled out last month, and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients' sites. Panda is always evolving, but typically penalizes:

"Thin" content
If you heard "thin is in," think again: Google DISLIKES pages with little content. Before Panda, the recommendation was that articles be around 250 words in length. After Panda, that increased to a minimum of 450 words. As time has passed, some studies have shown Google favoring pages 1000 words in length! Of course, you shouldn't sacrifice readability to meet such a quota: keep content easy to browse and skim. How do you Panda-proof content? Pages should be built out to 450-1000 words. Where that's not possible, try consolidating content. And don't forget to 301 redirect the old locations to the new URLs!

Duplicate content
Google doesn't like to find two pages that say the exact same thing. Google doesn't like to find two pages that say the exact same... well, you get the point. It's easy for sites to accidentally expose duplicate content to search engines: tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs, such as:

https://www.hyperdogmedia.com/
https://www.hyperdogmedia.com/index.html

This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one, but onsite links to another? Never fear, there are easy fixes:

a. Block Googlebot from finding the content. Check and fix your internal links to keep Google from discovering duplicate content during crawling, and use robots metatags with a "NOINDEX" attribute and/or robots.txt.
b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location, and are usually not the right solution for duplicate content issues.
c. Canonical tags can also help. These tags help Google sort out the final, canonical URL for content it finds. Even where content appears on multiple websites, canonical tags are still the solution: they work cross-site!

Sitemap.xml files in disarray
Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:

– Define their pages' modification dates
– Set priorities for pages
– Tell Google how often each page is usually updated

Here we are able to define exactly what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility: for webmasters who submit (or have left submitted) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire. The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot!

a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
b. Are the URLs good quality content, or is your sitemap.xml file filled with thin, duplicate, and missing pages?
c. Check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file?

Large amounts of 404 errors, crawl errors
The sitemap.xml file is just a starting point for Google's crawling. You should certainly have your most valuable URLs in there, but know that other URLs will be crawled as well. Watch carefully in Webmaster Tools for crawl errors, and use other crawling tools such as MOZ.com to diagnose your website. Preparing your site for future Panda updates requires thinking like Googlebot.
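The duplicate-content fixes above boil down to a line or two of markup per page. A sketch (the canonical URL is illustrative):

```html
<!-- Option a: keep a duplicate page out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option c: point Google at the preferred URL for this content -->
<link rel="canonical" href="https://www.hyperdogmedia.com/">
```

Option b, the 301 redirect, is configured on the server rather than in the page markup.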
And once a website is in "tip-top shape," ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can't afford to rest!

February 2014 Summary of Search:
Do as I say, not as I do

"Do as I say, not as I do"

Sometimes Google does things it warns others not to do:

1. Don't be top heavy
Google just updated its "top heavy" algorithm. Sites that show many ads at the top, or make users scroll to see content, can be penalized.

2. Don't scrape content from other websites
Matt Cutts of Google is actively seeking reports of what would be considered "scraper sites". One SEO responded with a screenshot of Google scraping Wikipedia. 🙂 http://www.seroundtable.com/google-scraper-site-report-18184.html

In other news, Google will now start showing restaurant menus for those keyword searches. But the restaurant brands do not know exactly where Google is scraping this data from, or how to update it. Read the whole scoop here: http://searchengineland.com/now-official-google-adds-restaurant-menus-search-results-185708

3. Don't let user-generated links pass PageRank
For most sites, Google insists that links created by site visitors be "nofollow". But Google+ allows links that are curiously "dofollow". Other sites could well be penalized for doing the same.

4. Don't sell links
Nearly all of Google's almost $17 billion in revenue last quarter came from "selling links". But of course, those aren't "dofollow".

A couple more items have garnered Google's attention:

1. Rich snippets should be used for good, not evil
Google has been levying a manual penalty against sites using rich snippets in a spammy fashion. http://www.link-assistant.com/news/rich-snippets-penalty.html

2. Don't try to insert too many keywords into your business listing
There used to be a distinct advantage in having your keywords in your business name.
Now Google wants the business name you use in your business listing to match your actual business name:

– Your title should reflect your business's real-world title.
– In addition to your business's real-world title, you may include a single descriptor that helps customers locate your business or understand what your business offers.
– Marketing taglines, phone numbers, store codes, or URLs are not valid descriptors.
– Examples of acceptable titles with descriptors (in italics for demonstration purposes) are "Starbucks Downtown" or "Joe's Pizza Delivery". Examples that would not be accepted are "#1 Seattle Plumbing", "Joe's Pizza Best Delivery", or "Joe's Pizza Restaurant Dallas".

See more: https://support.google.com/places/answer/107528?hl=en

So what to do? Create a content generating, curating, and sharing machine:

1. Post full versions of your content to your site, but also to Google+ and LinkedIn, and promote your content at other relevant places around the web.
2. Tag your content with rich snippets, Facebook Open Graph tags, and Twitter cards to increase its "sharability" and categorization.
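Even a basic "Organization" rich snippet gives Google a structured version of your business details. A minimal microdata sketch (the business name, URL, and phone number are made up):

```html
<!-- schema.org Organization markup using microdata -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Joe's Pizza</span>
  <a itemprop="url" href="http://www.example.com/">www.example.com</a>
  <span itemprop="telephone">(555) 555-1234</span>
</div>
```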

Spam-Fighting Always Continues – December 2013 Summary of Search

Spam-Fighting Always Continues

Google's Matt Cutts promised a month free of major updates, but added that "spam-fighting always continues." Indeed, there were some complaints from webmasters around the 17th and 19th that could have been Google taking out another link network. This month, Google made an example of Rap Genius. The site was offering traffic in exchange for blog links: to participate, you had to link to their Justin Bieber page (and somehow feel good about yourself), then send them the link. Rap Genius would then tweet your link to their followers, sending traffic to your blog. Google caught wind of the link scheme and severely punished Rap Genius in the rankings. The moral: Google will always (well, usually) catch you! So how do you invest in search engine traffic for the long term?

1. Create Content
Google wants compelling content: images, blog posts, videos, podcasts, surveys and more. Good content is long (1000 words plus for articles) and holds your visitor's attention. Google does not want visitors leaving the site quickly (but will probably forgive an ad click!).

2. Tag Your Content
Search engines are getting better at understanding what we humans create on the internet, but communicating directly with "search engine bots" has never been easier. These technologies could be better implemented on almost every website:

– Internal linking structures
– Sitemap.xml
– Title tags
– Meta descriptions
– Rich snippets
– Authorship

3. Get the Word Out
Content outreach and marketing have never been more important. Content today is where websites were in 1998: many build, and then are disappointed with the results. Good content competes against a dizzying array of distractions in an always-connected world, and must be actively marketed – even AGGRESSIVELY marketed – to make an impression. Content must be spread via social media (especially Google+), and marketed specifically for links.
These "earned links", and outreach done for the purpose of links, are a wonderful way to promote your content. As a bonus, this promotion of content will also promote rankings!
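As a reference for the tagging list above, the two most basic head tags look like this (the title and description are illustrative):

```html
<head>
  <!-- Unique, descriptive title shown as the headline in search results -->
  <title>Holiday Link Building Ideas | Hyper Dog Media</title>
  <!-- Meta description: Google often uses this as the results snippet -->
  <meta name="description" content="Five link building ideas your business can use this holiday season.">
</head>
```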

Summary of Search, October 2013

(Not provided)
Google recently started encrypting all searches, and is now showing "(not provided)" in Google Analytics for most organic traffic. Some referral traffic will still show up from Google.com, and that is also organic traffic (analytics just cannot tell when the browser is being ultra-secure). There is no easy solution, but at the next Boulder SEO MeetUp we will be leading a presentation and discussion of alternatives.

Penguin Update
Around October 4th, there was an update to Google's search algorithms. It's being called Penguin 2.1 (or sometimes Penguin 5) and is a major update. The Penguin updates penalize "over-optimization" and "web spam", both on websites and in website links.

What is "over-optimization"?
– Using keywords too much in title tags and content
– Links with anchor text (the blue underline) focused on too few phrases
– Anything in your site's link profile that does not show a natural amount of diversity (duplicate page titles, inbound links only from press release sites, etc.)

What is "web spam"?
– Link networks and schemes
– Links from de-indexed and banned websites, including old directories, blogs and article sites

While the impact is supposed to be 1% of English queries, the effect is very large considering the number of Google keyword searches! The approach we recommend is:

1. Protect
Authority link building is the only protection against both negative SEO and Penguin penalties in general. Authority links are gained primarily from great content, promotion and involvement. One authority link can beat hundreds of spammy links in the algorithm of "the new Google".

2. Defend
Find and remove as many unnatural links as you can manually before disavowing the rest. Watch for "negative SEO" campaigns where an unscrupulous competitor might be creating links to your site just to penalize you!
3. Build
Over the long term, these strategies will also help protect against Google penalties, and are, of course, great marketing initiatives:

– Great content: Copywriting has gone through an evolution, and cheap content is not going to cut it. Could it ever, though?
– Promotion and outreach for social media marketing and inbound links: Since the web's inception, much content has been posted with little regard to promotion. Social, link building, and other outreach initiatives are vital to maximize dollars spent on premium content.
– Brand name searches: Google knows big brands get searched for. Their "buzz" is a signal of authority, although not yet on par with link building.
– User engagement: Once a visitor is onsite, engage them. Keep their interest and involvement. Good design and excellent content have never been so important, and Google has been watching this for some time.
– Multi-tiered approaches: Spread marketing dollars broadly across many initiatives. It creates a variety of signals to Google that you are legit.

Bing
While Google+ is trying to understand social connections and influence from its own network, Bing is leveraging Klout. Bing has announced deeper integration with Klout and more control over how profiles show up.

Summary of Search, August 2013

Summary of Search

Is Google backward compatible? The previous advice from Google, given in their 2008 SEO Starter Guide, is now "out the window." Google previously recommended that the underlined text of a link (aka "anchor text") contain keywords, but now finds that somewhat spammy. The new Google direction is all about authority link building, not keyword-focused link building.

It's nice to occasionally say: "There was only one major update this month in Google." It's an as-yet unnamed update that changed the SERPs (Search Engine Results Pages) in a way similar to Penguin 1.0. Google did, however, roll out an exciting new feature with this update: special placement in search results for "high-quality, in-depth content" that is properly tagged. How do you take advantage of this special placement? Try this:

Tag everything to make it easy for Google to figure out:
– Use schema.org "Article" markup: http://schema.org/Article
– Provide authorship markup: https://support.google.com/webmasters/answer/3280182
– Include pagination markup, if applicable (rel="next" and rel="prev")
– Create a Google+ page linked to your website: https://support.google.com/webmasters/answer/1708844
– Provide information about your organization's logo (organization markup): http://googlewebmastercentral.blogspot.com/2013/05/using-schemaorg-markup-for-organization.html

Create compelling in-depth content (so easy, right?):
– Lengthy: Google has given no specific numbers, but we recommend text content of 1000-3000 words.
– Engaging: Google is likely looking at many metrics, including time on page, as signals of engagement.
– Popular: Content that is popular has inbound links, shares, likes, plus-ones, etc. And it probably has links to it from the site's homepage or other important pages on the site.
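The tagging checklist above can be sketched in a single page skeleton (all URLs and names are placeholders):

```html
<!-- Illustrative markup for one page of a paginated in-depth article -->
<head>
  <link rel="author" href="https://plus.google.com/110000000000000000000">
  <link rel="prev" href="http://www.example.com/article?page=1">
  <link rel="next" href="http://www.example.com/article?page=3">
</head>
<body>
  <article itemscope itemtype="http://schema.org/Article">
    <h1 itemprop="headline">An In-Depth Article</h1>
    <div itemprop="articleBody">...</div>
  </article>
</body>
```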
See more about the announcement at: http://insidesearch.blogspot.com/2013/08/discover-great-in-depth-articles-on.html

Google is also communicating about penalties much better than in the past:

– A feature has been added to Webmaster Tools which will alert webmasters if a manual penalty has been levied.
– Recent interviews have revealed that disavowed links are not stored. This means that old disavowed links must be included in every new batch submitted. Disavowing some links appears to be a normal part of modern SEO.
– Multiple reconsideration requests are okay, and each is considered independently of past requests.

Summary of Search, July 2013

Remember those tactics that worked so well? And what about the old recommendations in the webmaster guidelines? Well, it's time to take another look at all of those tactics with the new Google! Google released a "multi-week update" that continued into July, but the "Panda Recovery Update" got far more interest. Google Panda has been heavy-handed since its inception, and Google finally released a kinder, gentler version.

Duplicate Content
We see many different ways to deal with duplicate content. Based on results we have seen, our recommendation is: use canonical tags whenever possible. Other methods like nofollow, noindex, and robots.txt are prone to leaks or are too aggressive. Despite many Google help articles recommending that duplicate content be removed, Matt Cutts noted this month: "I wouldn't stress about this unless the content that you have duplicated is spammy or keyword stuffing."

Over-Optimization
We are seeing more penalties for on-page over-optimization since Penguin 2. The good news is, they are easily reversed:

– Diversify those title tags!
– Limit yourself to two separators, like the | (pipe) character, per title tag.
– Do not repeat anything more than once in a title tag.
– Do not use excessively long title tags. Try to stay between 60-69 characters.
– Look in your code for hidden comments, and for usage of keywords with a dash between them (URLs, image names, etc.). Consider whether it's excessive.

Authority Links
With Google's upcoming (and continued) emphasis on authority links, we recommend these long-term strategies:

– Link building for business development: Make connections that also build your Google rankings. Think trade shows, associations and resource pages.
– Content marketing link building: Use compelling content to create brand awareness and links! Think videos, infographics and guest blogging.
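To make the title tag guidelines concrete, here is a sketch of one title that follows them and one that doesn't (the keywords are illustrative):

```html
<!-- Good: unique, one separator, under ~69 characters -->
<title>Dog Grooming in Denver | Hyper Dog Media</title>

<!-- Over-optimized: repeated keyword, too many separators, too long -->
<title>Dog Grooming | Denver Dog Grooming | Grooming for Dogs | Best Dog Grooming</title>
```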