5 Vital Steps Toward Google’s “Mobile First” Indexing

“Mobile is exploding,” said every headline for the last decade. Google is all about traffic, and mobile is both the largest segment of traffic and the fastest growing. Google’s search results will be based on the mobile versions of web pages, including the results shown to desktop users. This holds even if your prospects primarily use desktop (common in manufacturing and a few other industries), even if desktop drives most of your actual conversions, and even if you simply like the look of your desktop site better.

Up to now, Google has been indexing web pages as desktop browsers see them. With the new “mobile first” approach, Google will start indexing web pages as mobile phones see them, and rankings will be calculated from those mobile results. Google says there will be minimal ranking changes, but this is a pretty major announcement: mobile-friendly sites will likely see minimal ranking changes, while mobile-unfriendly sites are likely to see an increasing loss of visibility. Here are some important tips to make sure your site is ready:

1. Check your mobile rankings, check your risk

Looking at your website’s rankings in Google’s mobile search results gives an indicator of whether your site is vulnerable to losing traffic. It’s only an indicator, however: Google is still basing mobile rankings to some extent on crawls of the desktop version of your site. So keep reading…

2. Be accessible

Some sites hide content behind popups and interstitials. Google is specifically planning to penalize intrusive popups starting January 10, 2017. If you have an email subscription popup or survey layer, you may be penalized – and we have all experienced the frustration of ads that appear while we are trying to read a news article. Some vendors, such as Ometrics, have been on top of this since the day of Google’s announcement!
Make sure all of your vendors are. If you have a separate mobile site, make sure it is crawlable, and be sure to register it in Google Search Console! An old best practice – blocking the duplicate content on a mobile version of your site – could now kill your traffic.

3. Be responsive

Responsive design allows for the best (compromise of) user experience across the many mobile, tablet, and desktop displays. It adapts the page to the screen and allows a single URL to serve both mobile and desktop versions of the site. If you haven’t changed to responsive design, ask us for a list of great web designers.

4. Be fast

Speed on mobile is quite important: research has shown that 40% of consumers will leave a page that takes longer than three seconds to load. Wireless internet connections are usually not nearly as fast as the wired connections desktop users enjoy. Optimizing image file sizes and resolutions hasn’t been this important since the days of the modem.

5. Don’t mess up AMP

Staying ahead of the curve captures the greatest opportunities: being the first among your competitors to implement mobile-friendly design, responsive layouts, schema, and AMP creates traffic. The period in which your site is in Google’s favor – and competitors are playing catch-up – can mean serious revenue.

With these 5 tips, you will be ahead of the pack (for a short while). As Google implements more changes, search is likely to keep changing at a breakneck pace. Watch your indexing, rankings, traffic, and conversions to stay ahead of the curve. Oh, and PS: Bing will still use desktop crawling to determine mobile rankings.
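The responsive approach in tip 3 comes down to two pieces: a viewport metatag so phones don’t render a zoomed-out desktop view, and CSS media queries that adapt the layout. A minimal sketch (the class names and breakpoint are illustrative, not from any particular framework):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Desktop/tablet: sidebar floats beside the content */
    .sidebar { float: right; width: 30%; }
    /* Narrow screens: stack the sidebar below the content - same URL, same HTML */
    @media (max-width: 600px) {
      .sidebar { float: none; width: 100%; }
    }
  </style>
</head>
<body>
  <div class="content">Main content…</div>
  <div class="sidebar">Related links…</div>
</body>
</html>
```

Because a single URL serves every device, Google never has to reconcile separate mobile and desktop versions of the page.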

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company. A typical company listing has 4-6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL and navigate from there. Some URLs may have up to 12 sitelinks below the primary search result!

Organic sitelinks are great for users (and for you!)

There are several key benefits to organic sitelinks. Users can quickly and easily reach a landing page better suited to their needs than the homepage; this quick navigation option is great for the user, and it reduces your organic bounce rate too. Sitelinks also give you a larger presence on the search results page. PPC Hero did some research into sitelinks and found that, while they’re not clicked as often as the primary link, they do provide additional CTR and conversions – their study showed a 64% increase in PPC ad click-through rate with sitelinks. Finally, having numerous – and well-crafted – sitelinks helps make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 tips to get the sitelinks you want (and deserve!)

Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands which pages on your site are key! That’s why it’s crucial that companies watch over and adjust their sitelinks. While you can’t specify sitelinks directly, and Google doesn’t disclose exactly how it chooses them, there are key tactics you can use to get the sitelinks you want (and deserve!):

Be #1! You will typically only get sitelinks for branded searches, such as your company name. Sometimes the #1 result for other queries will get sitelinks as well, but it’s typically branded queries.

Submit a sitemap.xml in Search Console (formerly Webmaster Tools).
This appears to be a necessary step before sitelinks are “granted” by Google.

Demote undesirable sitelinks in Search Console if you find any showing up. To demote a sitelink URL: on the Search Console homepage, click the site you want. Under Search Appearance, click Sitelinks. In the “For this search result” box, enter the URL of the search result for which you don’t want a specific sitelink to appear. In the “Demote this sitelink URL” box, enter the URL of the sitelink you want to demote. You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).

Look at what you’re linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements.

Googlebot seems to like lists of links, including H2 tags with links to sections or pages and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/

Use rel=nofollow. Sometimes privacy policies show up as sitelinks simply because they are linked from every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks.

Optimize your pages. Ideally, your best pages are already optimized, but make sure titles and meta descriptions are in order.

Look at where other sites are linking to (change your redirects, or reach out to other sites and ask them to update their links).

Googlebot prefers popular pages, including landing pages with traffic volume in analytics.

Organic sitelink takeaways

While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often highly personalized, certain sitelinks may appear for some users but not for others.

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
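Returning to the sitemap tip above: a sitemap.xml can be as simple as a list of your best URLs in the standard sitemaps.org format. A minimal sketch (the domain, dates, and priorities are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
    <lastmod>2016-10-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Listing the pages you consider most important is one of the few direct signals you can send about which pages deserve to be sitelinks.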

The Walking Dead, Google Authorship Edition

Summary of Search

Google recently announced the end of Google Authorship, a feature the SEO community thought might become a major part of Google’s ranking formula. With Google Authorship, photos of writers were shown in Google’s search results when rel=”author” and rel=”me” tags were embedded pointing to their Google+ profiles. In December 2013, Google reduced the number of authorship photos showing in its search results. Then photos were removed altogether in June. And finally, Google completely removed Authorship from its search results last week.

Low Adoption Rates by Webmasters and Authors

Authorship was sometimes difficult to implement, and not appropriate for all sites. Many brands didn’t feel a person’s photo was the best representation in Google’s search results.

Provided Low Value for Searchers

Some studies showed an increase in click-throughs for listings with Google Authorship. But Google found users were often being distracted from the best content.

Snippets that Matter

Google’s representative John Mueller did provide Google’s future direction: expanding support for Schema.org. “This markup helps all search engines better understand the content and context of pages on the web, and we’ll continue to use it to show rich snippets in search results.” The rich snippets for “Person” and “Organization” are certainly something to include where possible and applicable.

Implications for Google+

Google+ adoption is well below expectations, especially considering the tie-in with popular services such as Gmail and YouTube. Google Authorship was also tied in, and was meant to improve the social rank in search results for those producing great content. With the death of Google Authorship, it looks like one more “nail in the coffin” for Google+.

Are Authors Important?

Some interesting bits of information have been given away by Google.
Amit Singhal, the head of Google Search, said that Author Rank was used for the “In-depth articles” section, which appears in 12% of Google’s search results. Google has also long been able to read bylines: these were used before Google patented “Author Rank” in 2007, are more naturally included where applicable, and are likely to continue being used.

PSST! Need a Free Link?

Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
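For reference, the now-retired Authorship markup was a single link tag pointing at a Google+ profile, while the Schema.org direction Mueller describes can be expressed as JSON-LD. A sketch of both (the profile ID, organization name, and URLs are placeholders):

```html
<!-- Retired: Google Authorship, a rel="author" link to a Google+ profile in <head> -->
<link rel="author" href="https://plus.google.com/104560124403688998123">

<!-- Going forward: Schema.org "Organization" rich snippet markup as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```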

How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Its Panda updates, now almost 3 years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 rolled out just last month and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients’ sites. Panda is always evolving, but typically penalizes:

“Thin” content: If you heard “thin is in,” think again: Google DISLIKES pages with little content. Before Panda, the recommendation was that articles should be around 250 words in length. After Panda, that was increased to a minimum of 450 words. As time has passed, some studies have shown Google favoring pages 1000 words in length! Of course, you shouldn’t sacrifice readability to meet such a quota: keep content easy to browse and skim. How do you Panda-proof content? Pages should be built out to 450-1000 words. Where that’s not possible, try consolidating content. And don’t forget to 301 redirect the old locations to the new URLs!

Duplicate content: Google doesn’t like to find two pages that say the exact same thing. Google doesn’t like to find two pages that say the exact same… well, you get the point. It’s easy for sites to accidentally expose duplicate content to search engines: tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs, such as:

https://www.hyperdogmedia.com/
https://www.hyperdogmedia.com/index.html

This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one, but onsite links to another? Never fear, there are easy fixes:

a. Block Googlebot from finding the content. Check and fix your internal links, trying to prevent Google from discovering duplicate content during crawling. Use robots metatags with a “NOINDEX” attribute and/or use robots.txt.
b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location, and are usually not the right solution for duplicate content issues.

c. Canonical tags can also help. These tags help Google sort out the final, canonical URL for content it finds. Where content appears on multiple websites, canonical tags are still the solution: they work cross-site!

Sitemap.xml files in disarray: Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:

– Define their pages’ modification dates
– Set priorities for pages
– Tell Google how often each page is usually updated

Here we are able to define exactly what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility: for webmasters who submit (or have left submitted) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire. The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot!

a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
b. Are the URLs good-quality content, or is your sitemap.xml file filled with thin, duplicate, and missing pages?
c. Also check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file?

Large amounts of 404 errors, crawl errors: The sitemap.xml file is just a starting point for Google’s crawling. You should certainly have your most valuable URLs in there, but know that other URLs will indeed be crawled. Watch carefully in Webmaster Tools for crawl errors, and use other crawling tools such as MOZ.com to diagnose your website. Preparing your site for future Panda updates requires thinking like Googlebot.
And once a website is in “tip-top shape,” ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can’t afford to rest!

PSST! Need a Free Link?

Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
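The duplicate-content fixes above each amount to a line or two of markup; a sketch, with a placeholder URL standing in for your canonical page:

```html
<!-- Fix (a): keep a duplicate page out of the index with a robots metatag, in <head> -->
<meta name="robots" content="noindex, follow">

<!-- Fix (c): point Google at the one true URL for this content, in <head> -->
<link rel="canonical" href="https://www.example.com/widgets/">
```

Fix (b), the 301 redirect, is set server-side rather than in the page; on Apache, for example, a line like `Redirect 301 /index.html https://www.example.com/` in an .htaccess file sends the duplicate homepage URL to the canonical one while passing along its link authority.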

February 2014 Summary of Search:
Do as I say, not as I do

“Do as I say, not as I do”: sometimes Google does things it warns others not to do.

1. Don’t be top heavy

Google just updated its “top heavy” algorithm. Sites that show many ads at the top, or make users scroll to see content, can be penalized.

2. Don’t scrape content from other websites

Matt Cutts of Google is actively seeking reports of what would be considered “scraper sites”. One SEO responded with a screenshot of Google scraping Wikipedia. 🙂 http://www.seroundtable.com/google-scraper-site-report-18184.html

In other news, Google will now start showing restaurant menus for those keyword searches. But the restaurant brands do not know exactly where Google is scraping this data from, or how to update it. Read the whole scoop here: http://searchengineland.com/now-official-google-adds-restaurant-menus-search-results-185708

3. Don’t pass PageRank through links on user-generated content sites

For most sites, Google insists that links created by site visitors be “nofollow”. But Google+ allows links that are curiously “dofollow”. Other sites could indeed be penalized for this.

4. Don’t sell links

Almost all of Google’s nearly $17 billion in revenue last quarter came from “selling links”. But of course, those aren’t “dofollow”.

A couple more items have garnered Google’s attention:

1. Rich snippets should be used for good, not evil

Google has been levying a manual penalty against sites using rich snippets in a spammy fashion. http://www.link-assistant.com/news/rich-snippets-penalty.html

2. Don’t try to insert too many keywords into your business listing

There used to be a distinct advantage in having your keywords in your business name.
Now Google wants to make sure the name you use in your business listing matches your real business name:

– Your title should reflect your business’s real-world title.
– In addition to your business’s real-world title, you may include a single descriptor that helps customers locate your business or understand what your business offers.
– Marketing taglines, phone numbers, store codes, or URLs are not valid descriptors.
– Examples of acceptable titles with descriptors (in italics for demonstration purposes) are “Starbucks Downtown” or “Joe’s Pizza Delivery”. Examples that would not be accepted are “#1 Seattle Plumbing”, “Joe’s Pizza Best Delivery”, or “Joe’s Pizza Restaurant Dallas”.

See more: https://support.google.com/places/answer/107528?hl=en

So what to do? Create a content-generating, curating, sharing machine:

1. Post full versions of your content to your site, but also to Google+ and LinkedIn, and promote your content at other relevant places around the web.
2. Tag your content with rich snippets, Facebook Open Graph, and Twitter Cards to increase its “sharability” and categorization.

PSST! Need a Free Link?

We’d like to help you promote your own business, hoping more work for you brings more work our way! Join our newsletter for our suggestion this month: It’s a site with a PageRank of 9!
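Tagging content for sharing, as in point 2 above, is a handful of metatags in the page head. A sketch (the titles, URLs, and image paths are placeholders for your own content):

```html
<!-- Facebook Open Graph tags: control how the page looks when shared -->
<meta property="og:type" content="article">
<meta property="og:title" content="5 Vital Steps Toward Mobile First Indexing">
<meta property="og:url" content="https://www.example.com/mobile-first/">
<meta property="og:image" content="https://www.example.com/images/mobile-first.png">

<!-- Twitter Card tags: the Twitter equivalent -->
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="5 Vital Steps Toward Mobile First Indexing">
<meta name="twitter:description" content="Google will soon index the mobile version of your pages.">
```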

Summary of Search, October 2013

(Not provided)

Google recently started encrypting all searches, and is now showing “(not provided)” in Google Analytics for most organic traffic. Some referral traffic will show up from Google.com, and is also organic traffic (but analytics cannot tell if the browser is being ultra-secure). There is no easy solution, but at the next Boulder SEO MeetUp we will be leading a presentation and discussion of alternatives.

Penguin Update

Around October 4th, there was an update to Google’s search algorithms. It’s being called Penguin 2.1 (or sometimes Penguin 5) and is a major update. The Penguin updates penalize “over-optimization” and “web spam”, both on websites and in website links.

What is “over-optimization”?

– Using keywords too much in title tags and content
– Links with anchor text (the blue underline) focused on too few phrases
– Anything in your site’s link profile that does not show a natural amount of diversity (duplicate page titles, inbound links only from press release sites, etc.)

What is “web spam”?

– Link networks / schemes
– Links from de-indexed and banned websites, including old directories, blogs, and article sites

While the impact is supposed to be 1% of English queries, the effect is very large considering the number of Google keyword searches!

The approach we recommend is:

1. Protect

Authority link building is the only protection against both negative SEO and Penguin penalties in general. Authority links are gained primarily from great content, promotion, and involvement. One authority link can beat hundreds of spammy links in the algorithm of “the new Google”.

2. Defend

Find and remove as many unnatural links as you can manually before disavowing the rest. Watch for “negative SEO” campaigns, where an unscrupulous competitor might be creating links to your site just to penalize you!
3. Build

Over the long term, these strategies will also help protect from Google penalties, and are, of course, great marketing initiatives:

Great content: Copywriting has gone through an evolution, and cheap content is not going to cut it. Could it ever, though?

Promotion & outreach for social media marketing & inbound links: Since the web’s inception, much content has been posted with little regard to promotion. Social, link building, and other outreach initiatives are vital to maximize dollars spent on premium content.

Brand name searches: Google knows big brands are searched for. Their “buzz” is a signal of authority, although not yet on par with link building.

User engagement: Once a visitor is onsite, engage them. Keep their interest and involvement. Good design and excellent content have never been so important, and Google has been watching this for some time.

Multi-tiered approaches: Spread marketing dollars broadly across many initiatives. It creates a variety of signals to Google that you are legit.

Bing

While Google+ is trying to understand social connections and influence from its own network, Bing is leveraging Klout. Bing has announced deeper integration with Klout and more control over how profiles show up.

Get a free link for your business: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
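For the “Defend” step above, links you cannot get removed go into a disavow file uploaded through Google’s Disavow Links tool. The format is plain text: one URL or one `domain:` entry per line, with # comments. A sketch (the domains are made up):

```
# Spammy directory - owner did not respond to two removal requests
domain:cheap-seo-directory.example

# A single spammy page, rather than the whole site
http://spam-blog.example/post/buy-links-here
```

Disavowing a whole domain is usually safer than listing individual pages, since link networks tend to create many URLs per site.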

Summary of Search, July 2013

Remember those tactics that worked so well? And what about the old recommendations in the webmaster guidelines? Well, it’s time to take another look at all of those tactics with the new Google! Google released a “multi-week update” that continued into July, but the “Panda Recovery Update” got far more interest. Google Panda has been heavy-handed since its inception, and Google finally released a kinder, gentler version.

Duplicate Content

We see many different ways to deal with duplicate content. Based on the results we have seen, our recommendation is: use canonical tags whenever possible to deal with duplicate content. Other methods like nofollow, noindex, and robots.txt are prone to leaks or are too aggressive. Despite many Google help articles recommending duplicate content be removed, Matt Cutts this month noted: “I wouldn’t stress about this unless the content that you have duplicated is spammy or keyword stuffing.”

Over-Optimization

We are seeing more penalties for on-page over-optimization since Penguin 2. The good news is, they are easily reversed:

– Diversify those title tags!
– Limit yourself to 2 separators, like the | (pipe) character, in the title tag.
– Do not repeat anything more than once in a title tag.
– Do not use excessively long title tags. Try to stay between 60-69 characters.
– Look in your code for hidden comments and usage of keywords with dashes between them (URLs, image names, etc.), and consider whether it is excessive.

Authority Links

With Google’s upcoming (and continued) emphasis on authority links, we recommend these long-term strategies:

Link building for business development: Make connections that also build your Google rankings. Think trade shows, associations, and resource pages.

Content marketing link building: Use compelling content to create brand awareness and links! Think videos, infographics, and guest blogging.

Would you like our monthly take on the changing world of SEO delivered to your inbox?
Subscribe to the Hyper Dog Media SEO Newsletter HERE!
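The title tag guidelines above are easiest to see side by side; a sketch, with illustrative keywords and business names:

```html
<!-- Good: 2 pipe separators, nothing repeated, roughly 50 characters -->
<title>Mobile SEO Services | Hyper Dog Media | Denver, CO</title>

<!-- Over-optimized: "SEO" repeated, too many separators, far too long -->
<title>SEO | Denver SEO | SEO Services | Best SEO Company Denver | Hyper Dog Media SEO</title>
```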

Summary of Search, January 2013

It’s almost been 2 years since the first Google Panda update, and it looks like there is a new update about every 4 weeks now. Update 24 was released on Jan 22nd, impacting 1.2% of English queries.

Branding and Content

Other updates have included a “brand signal update” on the 17th. Some felt this update more than the Panda update 5 days later!

Image Search

Google rolled out changes to its image search interface, touting these changes as better for webmasters. The consensus among webmasters has been that visitors from image search are down, and that’s not a good thing. Google hasn’t been forthcoming about how this was meant to help – were they trying to reduce server bandwidth?! At any rate, Google is not likely to change the image search interface back.

Moving Forward

Businesses expect marketing to take less effort over time – that there will be efficiencies, and even possibly a “maintenance mode” for online marketing. With the new Google, nothing could be further from the truth:

1. Google continues to reward branding expertise, social signals, and authority linking.
2. Even larger brands have to pay ever more attention to their technical SEO. No longer can duplicate content or thin content be overlooked.
3. Google expects sites that engage users: videos, images, animations, and other forms of engaging media are important in the new Google.

New SEO Approaches

If you have a content creation initiative (or can get one started), our Content Marketing Link Building is the best way to go. If not, see our Business Development SEO Cycle.

1. “Content Marketing” Link Building cycle:

KW Research: Help with analysis.
Content Creation: Integrate and tag content with keywords, and connect you with copywriters.
Content Publishing: Make sure posted content is being indexed by search engines.
Content Sharing & Distribution: Get content indexed, and create links to the content to build authority. Share and distribute to maximize link authority.
Campaign Measurement: Provide analysis, always optimizing the approach.

2. “Business Development” SEO cycle:

KW Research: Send possible link ideas every month, based on your keyword targets and industry.
Business Development Link Building: Provide possible sponsorships, “hub pages” in your vertical, and related conversations in forums and blogs.
Link Contact Information: Provide the best contact/submission info, which can be as hard to find as the link!
Link Outreach Strategies: Suggest an approach, based on our experience in link outreach. For companies, link outreach is best done in-house – but let us know if you need us to do the outreach instead.
Campaign Measurement: Our end-of-month reports will help measure and optimize the approach for each next month.

Upcoming Denver SEO Presentation: An Excellent Value

Hyper Dog Media is providing Search Engine Optimization tips at the Association of Strategic Marketing’s upcoming seminar. The full agenda includes information from experts in PPC (Pay Per Click), Web Analytics, and more:

Proven Strategies for Improving Your Search Engine Marketing

Are you optimizing your greatest asset? Website content is an essential part of online success. Help search engines see the relevance of your pages, articles, press releases, and more. Learn to identify and target ranking opportunities with titles, headings, bolding, and additional techniques. HTML can also be used to communicate the relevance of your website and content to search engines – and you don’t need to be an HTML whiz, either!

Once you have the content, you must know how to maximize your search engine exposure. Find out how aggressive search engine submission may harm your ability to get into Google’s listings, along with modern strategies for getting your site indexed safely. Learn how to take an active role in getting pages indexed quickly in the major search engines as you add new content.

Finally, links from other websites are an important source of traffic and search rankings. Several kinds of links will be discussed, and you are sure to leave with new link building ideas!

5 reasons to attend!

– Translate the user experience to all online channels
– Learn about online measurement and analytics tools
– Use your SEM campaign to maximize your ROI
– Ensure you are paying for profitable clicks
– Discover 26 sources of links to target

BONUS! Free manual with registration. Hope to see you there!

7 Web design techniques that are thankfully being retired

1. Frames

Frames were rarely done in a search-friendly manner. In the age of cellphone browsers and Section 508 compliance, frames must go.

2. IE 5 Mac hacks

Internet Explorer was a miserable little browser on every OS it ran on, but was particularly miserable on the Mac. It required CSS hacks that other browsers tripped over, and some standards it – inexplicably – did not support. Even on Mac OS X, it sucked.

3. Splash pages

These pieces of eye candy were frequently skipped by visitors, and even more frequently cursed under their breath. Known to be slow-loading and pointless, it is nice to see them used less often.

4. Microsoft FrontPage Extensions

These buggy little replacements for scripting would break if you looked at them funny, and gave years of frustration to Unix admins. Even Microsoft is turning its back on the FrontPage product, and not a day too soon.

5. Popup and popunder windows

There are still sites that tout the effectiveness of popups and popunders, but let’s face it: we all hate them. Every good browser tries to block them, but every once in a while you’ll see one. They are the junk mail of web browsing, and it’s time for them to go far, far away.

6. Animated layers that block content on page load

There are few things as annoying as a layer that suddenly slides over to block the content you are reading. They usually make users dismiss the ad before they can read the page content. I’ve gotten to where I dismiss anything that slides over, not even taking the time to read the ad.

The web will be a better place when these web design techniques are no longer seen. Have others? Add a comment and let us know!