Google: All about that mobile

Having a good mobile experience is increasingly important for websites. Advances in technology have made it possible for many more sites to be viewed on mobile devices, but the experience is usually far less pleasant than on a desktop. Google wants to change that, and is once again trying to move website design in the right direction.

Google and Bing are currently locked in a battle to be the best search engine for mobile. They know users will judge them by the sites suggested during a search: when searchers encounter unusable sites from their query, they change search engines. Wouldn’t you rather have ten good sites given to you from a search than a hit-and-miss list? Mobile is growing fast: comScore estimates that mobile usage will outpace desktop usage this year! Google has already started showing “Mobile Friendly” icons in search results – and has even tested “NOT Mobile Friendly” icons recently! So what to do? Here are some quick tips:

1. View your site in mobile
Try this free testing tool from Google: https://www.google.com/webmasters/tools/mobile-friendly/ Google tells you if fonts are too small, if the “viewport” metatag is missing, and flags other mobile usability errors.

2. Easy URLs
Keyword-rich URLs have lost much of their power in the last few years, and are likely to lose much more: they aren’t easy to type into a smartphone.

3. Responsive design
A responsive design is usable at any size. Previous efforts to provide different sites to different kinds of devices have failed as the many types of devices have exploded and crossed over into other categories, such as 2-in-1s and giant phones. Having several versions of your website could also mean a nightmare of keeping all of them updated and in sync. Googlebot, in all its wisdom, couldn’t figure out which version was canonical, either – or which version to send a given user to, based on their device. Google’s new Mobile Usability reports (in Webmaster Tools) show the following issues (see the markup sketch after this article):
– Flash content
– missing viewport (a critical meta tag for mobile pages)
– tiny fonts
– fixed-width viewports
– content not sized to viewport
– clickable links/buttons too close to each other

4. Access to site resources
Googlebot and Bingbot both want to see into your JavaScript and CSS files. It used to be a best practice to block access, and many sites have. But as time has passed, bots have missed important information about user experience: Are there ads above the fold? Is the user being redirected, or shown irrelevant content? Bots need to know, all within the framework of ranking “better” sites higher. And you cannot be “better” on mobile if the experience is bad.

Need a good interactive agency or website design firm? We’ve worked with many, and partnered with the best. Talk to us about your needs, and we’ll introduce you to the right match!
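To make those Mobile Usability fixes concrete, here is a minimal sketch of a page that addresses the viewport, font-size and tap-target issues listed above. The selectors and values are illustrative, not prescriptive:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The "viewport" meta tag the Mobile Usability report checks for -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Size content to the viewport instead of a fixed width */
    img, video { max-width: 100%; height: auto; }
    /* Avoid tiny fonts on small screens */
    body { font-size: 16px; line-height: 1.5; }
    /* Give tap targets enough room so links/buttons aren't too close */
    nav a { display: inline-block; padding: 12px 16px; }
  </style>
</head>
<body>
  <!-- page content -->
</body>
</html>
```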

Penguin 3.0: A year in the waiting

Google’s “Penguin Updates” target the easiest link building practices. Since Google’s algorithm uses links to determine whether a website deserves to rank, Google uses the Penguin Updates to punish sites that might be getting links in an automated fashion.

Penguin Update 1: April 24, 2012, dubbed v1.0
Penguin Update 2: May 25, 2012
Penguin Update 3: October 5, 2012
Penguin Update 4: May 22, 2013, dubbed v2.0
Penguin Update 5: October 4, 2013
Penguin Update 6: October 17, 2014, dubbed v3.0

Penguin 3.0 was the sixth Penguin Update from Google, and actually much smaller than the original Penguin Update. It started on October 17, and is still rolling out. But it hasn’t been as much of a hit as previous updates:
1. Google says less than 1% of queries will be affected. That’s less than a third of the original Penguin Update.
2. No new “signals” have been added. It was more of a “refresh” than an update. For those sites that disavowed or removed heavy amounts of links, it was a welcome change.
3. Talk of a larger Penguin update has already started, expected in Spring 2015.

Vigilance and Risk Management
Last year’s update also opened sites up to more dirty tricks from competitors. Negative SEO has been possible for a long time, and was only recently acknowledged by Google. The newest forms of Negative SEO put a competitor’s site into Google’s crosshairs with:
– Links from the worst kinds of sites
– Links targeting the worst kinds of keywords
– Links targeting the right keywords, but in unnatural amounts
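Those link-removal recoveries typically run through Google’s Disavow Links tool. As a hedged reference, here is a minimal sketch of the plain-text file format it accepts – the domains and URLs below are placeholders, not real offenders:

```text
# disavow.txt - uploaded through Google's Disavow Links tool
# One URL or domain per line; lines starting with "#" are ignored.
http://spammy-article-directory.example.com/seo-links/page1.html
domain:paid-link-network.example.com
```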

“How do you write great title tags and meta descriptions?”

[Updated Nov 1, 2016] “How do you write great title tags and meta descriptions?” That is the question clients ask me most frequently. And it’s a complicated question, for sure! There are several components to writing great titles and descriptions, but there are also a few specifications that each company will want to consider for themselves. I’ll address the considerations first.

The goal is to write title tags that please Googlebot, but you also want titles and descriptions that are functional and helpful to the human visitors to your website. This can be tricky, because writing for bots and writing for humans call for different approaches. My best advice: somewhere right in the middle is your best bet! Write naturally and use the same voice that you use in your page content, but include keyword phrases that are specific to the page.

Title tags must fall within a range of characters, but they also need to fall into a size range to appear complete in Google search. This size range has to do with the number of pixels a title tag takes up on the page. For example, a title tag with a couple of w’s in it will take up far more space than a title with several lowercase l’s and i’s. Just look at this spacing difference: www lil. The three skinnier letters take up about as much space as one of the w’s!

Why does this matter? Well, in Google search results, you are allotted a specific amount of space for the title of your page. This went into effect in early 2014, when Google updated its search results page. There was another update to the format of Google’s search results in 2016. Now, search results have a bit more space on the page. Yay – but wait, there are also some other things to consider: like how many words you use, where the break might show up in those words (if you use too many), and the fact that Google now appends the brand name to the end of the title tag in some cases. You want your page titles to appear complete in the results, while getting the most out of this limited space. Unfortunately, this all makes it really tricky to say that there is a specific number of characters you should use for each title tag. Around 52-55 characters is probably a pretty safe bet, but if you think you might be using a lot of wide characters (or if you test and find that Google is appending your brand name to every title), use a few fewer characters.

Meta descriptions also have a size range you want to target for full effect in Google search results. Meta descriptions are not used in Google’s ranking algorithm, but a good meta description raises your organic click-through rate. Google can tell human searchers are clicking through to your site, and likely takes that into account in your ranking. Google also sees short or duplicate meta descriptions as a site quality issue – so I guess it is indeed part of their overall formula. Recently, Google has made some changes to how they display descriptions: in some cases, they chop up your beautiful descriptions and take bits and pieces of your content and add them to the description, so they can highlight more of the search terms a user typed into the search bar. In addition, Google will sometimes add a date to the beginning or end of the description field in search results. Considering all of this, however, I still recommend meta descriptions of between 139 and 156 characters. They seem to work best, no matter what Google decides to do with them.
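To make those numbers concrete, here is a hedged example of a title tag and meta description that fall inside the suggested ranges. The business details and wording are invented purely for illustration:

```html
<head>
  <!-- ~47 characters: short enough to survive most display widths -->
  <title>Mobile SEO Services in Denver | Hyper Dog Media</title>
  <!-- ~141 characters: inside the 139-156 range suggested above -->
  <meta name="description" content="Hyper Dog Media helps agencies keep clients in Google's good graces with mobile-friendly, Panda-proof SEO. Call today for a free site review.">
</head>
```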
Again, strive to convey your message to human visitors with your natural writing style, but include those keyword targets specific to the page. When writing meta descriptions, entice users to click on your search engine result by listing benefits and a call to action. In addition, the meta description should be different for each page of your website.

I have written a plethora of title tags and meta descriptions for a wide range of clients, and what I’ve learned is that if you are organized and set up systems, even the largest websites can have all-new titles and descriptions before you know it. I recommend setting up a spreadsheet with columns for old title, new title, character count, old description, new description and character count. Once you get used to using the spreadsheet, you can set the width of the columns to help guide you to the right size while you are writing.

If you are still feeling overwhelmed about getting your titles and descriptions in order, just give me a call. I’ve just about got it down to an art, and I’ve also got a few tools in my tool belt that can automate some of the process that may be bogging you down. I’m here to help! Questions? Shoot me an email or a message at @jannavance on Twitter. Good luck!

Summary of Search: Who is Syndicating Whom? What to know about syndicating your blog.

SUMMARY OF SEARCH

Google released a new Panda 4.1 update this month, and unique, relevant content and overall site quality have never been more vital. Syndication plays a large part in what Google sees as duplicate content. Done correctly, syndication can mean new visitors, brand exposure, social shares, and links to your site (which Google treats as “votes”). Implemented poorly, another site may look to Google like the authoritative source for your content – and your site is seen as a spammy “scraper” site.

Why does it matter?
Google prefers to show a piece of content only once in the top ten results. When Google finds the same content in two places on the internet, it will typically show the most authoritative site in the higher position, and other sites on page 2 or 3 (or 20). But a site with more authority doesn’t necessarily deserve credit for all the content it posts.

Canonical tag
A few years ago, Google helped create the “canonical tag” to give authors a way to specify the original source for articles that could be syndicated, scraped, or otherwise end up all over the web. It’s a tag that can be placed on other websites, but point back to yours. This could work well, but many larger sites either 1. cannot (or will not) accept a canonical tag pointing back to your website, or 2. insert their own canonical tag pointing to their own site! What does Google do when it encounters two canonical tags for the same content? It reverts to looking at authority, and the smaller site loses out. If you use business2community.com or LinkedIn to syndicate your content, your own site/blog is likely to lose the authority test!

Syndication used to be much easier. In the “old days,” the deal was that if you gave my site unique content, I gave you a link. In 2013, you could still get the link, but it might be nofollow. In 2014, the deal is that you probably do not even get the canonical tag.

What to do?
Syndicating your content can provide amazing exposure for your business. Don’t walk away from syndication, but use it in a way that will not harm your own rankings.

1. Ask about policies on the canonical tag
Some sites, such as business2community.com and linkedin.com, do indeed want to place a canonical tag pointing to their own URL as the one true source of the content.

2. Post unique summaries on syndication sites
Everyone wants unique content, so give it to ’em – just in summarized form. Post the long, full version of your article on your own website, with a summary or intro on the syndication websites. Both locations should have canonical tags and unique content. In this case, linkedin.com might have a canonical tag pointing to its own page, but it will be the only place that unique content is located.
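For reference, here is what that arrangement can look like in markup. This is a sketch with a placeholder URL: the tag goes in the head of the syndicated copy on the partner site, pointing back to the full article on your own site:

```html
<!-- On the partner/syndication site's copy of the article -->
<link rel="canonical" href="https://www.yoursite.com/blog/original-article/">
```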

The Walking Dead, Google Authorship Edition

Summary of Search

Google recently announced the end of Google Authorship, a feature the SEO community thought might become a major part of Google’s ranking formula. With Google Authorship, photos of writers were shown in Google’s search results – when rel=”author” and rel=”me” tags were embedded pointing to their Google+ profile. In December 2013, Google reduced the number of authorship photos showing in its search results. Then photos were removed altogether in June. And finally, Google completely removed Authorship from its search results last week.

Low Adoption Rates by Webmasters and Authors
Authorship was sometimes difficult to implement, and not appropriate for all sites. Many brands didn’t feel a person’s photo was the best representation in Google’s search results.

Provided Low Value for Searchers
Some studies showed an increase in click-throughs for listings with Google Authorship. But Google found users were often being distracted from the best content.

Snippets that Matter
Google’s representative John Mueller did provide Google’s future direction: expanding support of Schema.org: “This markup helps all search engines better understand the content and context of pages on the web, and we’ll continue to use it to show rich snippets in search results.” The rich snippets for “People” and “Organization” are certainly something to include where possible/applicable.

Implications for Google+
Google+ adoption is well below expectations, especially considering the tie-in with popular services such as Gmail and YouTube. Google Authorship was also tied in, and was meant to improve the search rank of those producing great content. With the death of Google Authorship, it looks like one more “nail in the coffin” for Google+.

Are Authors Important?
Some interesting bits of information have been given away by Google. Amit Singhal, the head of Google Search, said that Author Rank was used for the “In-depth articles” section – which appears in 12% of Google’s search results. Google has also long been able to read bylines: these were used before Google patented “Author Rank” in 2007, are more naturally included where applicable, and are likely to continue being used.
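If you want to act on that Schema.org guidance, here is a minimal, hedged sketch of an “Organization” snippet in JSON-LD form. The logo path is invented for illustration, and microdata markup is an equally valid way to express the same thing:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Hyper Dog Media",
  "url": "https://www.hyperdogmedia.com/",
  "logo": "https://www.hyperdogmedia.com/images/logo.png"
}
</script>
```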

Penalized: Even Web Giants Aren’t Too Big To Fail

In the early days of the web, having great search engine optimization meant increased visibility for a business. Outranking the competition could lead to higher sales. Any leads that came in via the website were icing on the cake. Now, a strong web presence, leads and web-based revenue have become a vital part of business. In today’s world, the web represents a significant portion of even a brick-and-mortar’s revenue.

For companies such as ebay.com and retailmenot.com, losing Google’s traffic overnight due to a rule infraction can be a killer, but that is exactly what has happened in recent months. Ebay.com lost 33% of its organic traffic after being given a “manual penalty” by Google. RetailMeNot.com lost 25% of its revenue, thanks to Google’s Panda 4 update. Other penalty stories and analysis continue rolling in after the fact.

Only Google truly knows what eBay did wrong. eBay employs some smart SEOs, but even they may not truly know everything to do – or undo. They can file a “reconsideration request” and wait, but we can only guess when the penalty will be lifted. It’s unlikely that eBay will rise to its former position anytime soon, as the jig is up. RetailMeNot, which is, ironically, funded in part by Google Ventures, has overall content quality guidelines it can try to adhere to more closely. But these Panda penalties are not always cut and dried, either.

SEO is becoming more and more about risk management. Could your business afford a substantial drop in rankings? Google’s formula is continually being updated, and even practices that Google recommended in years past are now being penalized. Looking at Google’s future direction is more than a whimsical pastime for business leaders – it’s vital to ensuring future growth, or survival. With arbitrary rules and swift justice, it’s important to future-proof your SEO as much as possible:

Create content that your prospects will take the time to read, share, and discuss. Market your content to other sites using social media, outreach and good old-fashioned business development. Don’t get clever with Google: if Google hasn’t already started penalizing a certain tactic, know that it will. Stay up to date on Google’s ever-changing rules. Our Hyper Dog Media Monthly Summary of Search is a low-bandwidth newsletter to keep you in Google’s good graces.

Doing the Pigeon (Update)

Last month, Google rolled out one of its largest local search updates in quite some time. Since Google didn’t name the update, Search Engine Land dubbed it the Google Pigeon Update. It’s seemingly unrelated to Google’s PigeonRank, an April Fools’ joke from back when Google did good and funny things.

This update does not penalize sites, but it does change how local results are shown:
– Fewer queries are generating a map listing / “local pack”
– More traditional SEO signals are used, such as title tags and quality inbound links.

Some interesting things are happening with this update:
– When a query includes the word “yelp”, listings on yelp.com are back at the top. This fixes a recent bug.
– Web design and SEO companies are being shown in local queries again!

If you depend on local traffic, hopefully your results weren’t negatively impacted by the update. The best approach for local visibility includes these tasks:
– Update and create local directory listings on authority sites such as Yelp.
– Use the highest quality photo on your Google+ business profile, and get more reviews. You might make it into the Carousel listings at the top of Google for some queries.
– Make sure your business Name, Address and Phone (NAP) are consistent on your site, Google+ business page, and local directories (one way to mark this up is sketched below).
– Be sure your city/state is in your site’s title tags.
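As a hedged illustration of keeping NAP data consistent and machine-readable, here is a minimal Schema.org “LocalBusiness” snippet in JSON-LD. Schema.org markup is not discussed in this article, and every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Design Studio",
  "telephone": "+1-303-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  }
}
</script>
```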

How to Keep Google’s Panda from Ruining Your Rankings

It used to be that Google let many crawling problems slide. Not anymore! Their Panda Updates, now almost 3 years old, penalize websites for communicating poorly with Googlebot. Panda 4.0 just rolled out last month, and has gotten quite a bit of press. Here are some tips to prevent a penalty on your clients’ sites. Panda is always evolving, but typically penalizes:

“Thin” content
If you heard “thin is in,” think again: Google DISLIKES pages with little content. Before Panda, the recommendation was that articles should be around 250 words in length. After Panda, that increased to a minimum of 450 words. As time has passed, some studies have shown Google favoring pages 1000 words in length! Of course, you shouldn’t sacrifice readability to meet such a quota: keep content easy to browse and skim. How do you Panda-proof content? Pages should be built out to 450-1000 words. Where that’s not possible, try consolidating content. And don’t forget to 301 redirect the old locations to the new URLs!

Duplicate content
Google doesn’t like to find two pages that say the exact same thing. Google doesn’t like to find two pages that say the exact same… well, you get the point. It’s easy for sites to accidentally expose duplicate content to search engines: tag pages, categories, and search results within a website can all lead to duplicate content. Even homepages can sometimes be found at multiple URLs, such as:
https://www.hyperdogmedia.com/
https://www.hyperdogmedia.com/index.html
This can be very confusing to Googlebot. Which version should be shown? Do the inbound links point to one, but onsite links to another? Never fear, there are easy fixes:
a. Block Googlebot from finding the content. Check and fix your internal links, and try to prevent Google from discovering duplicate content during crawling. Use robots metatags with a “NOINDEX” attribute and/or use robots.txt.
b. Use 301 redirects to send one location to another. A 301 is a special redirect that passes link authority from one URL to another. The many other kinds of redirects simply send a visitor to a new location, and are usually not the right solution for duplicate content issues.
c. Canonical tags can also help. These tags help Google sort out the final, canonical URL for content it finds. Where content is on multiple websites, canonical tags are still the solution: they work cross-site!

Sitemap.xml files in disarray
Google allows webmasters to verify their identity and submit this special XML file full of useful information. Webmasters can list the pages they want Google to index, as well as:
– Define their pages’ modification dates
– Set priorities for pages
– Tell Google how often each page is usually updated
Here we are able to define what Googlebot has been trying to figure out on its own for eons. But with great power comes great responsibility. For webmasters who submit (or have left submitted) an outdated sitemap.xml file full of errors, missing pages, and duplicate or thin content, the situation can become dire. The fix? Put your best foot forward and submit a good sitemap.xml file to Googlebot (a minimal example appears below):
a. Visit the most likely location for your sitemap.xml file: http://www.domain.com/sitemap.xml
b. Check whether the URLs are good quality content, or whether your sitemap.xml file is filled with thin, duplicate and missing pages.
c. Also check Google Webmaster Tools: Is Google reporting errors with your sitemap.xml file?
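Here is a minimal sketch of a healthy sitemap.xml using the fields described above – the URL and dates are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.hyperdogmedia.com/</loc>
    <lastmod>2014-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- one <url> entry per page you want Google to index -->
</urlset>
```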
Large numbers of 404 errors, crawl errors
The sitemap.xml file is just a starting point for Google’s crawling. You should certainly have your most valuable URLs in there, but know that other URLs will be crawled as well. Watch carefully in Webmaster Tools for crawl errors, and use other crawling tools, such as Moz.com, to diagnose your website.

Preparing your site for future Panda updates requires thinking like Googlebot. And once a website is in “tip-top shape,” ongoing vigilance is usually needed. In this age of dynamic websites and ever-changing algorithms, you can’t afford to rest!

Make your content easily shared, linked and read

Technical SEO is increasingly about helping people share content from a website. After all, content should be shared, linked to, and – dare I say – read. Perhaps a more appropriate term is “consumed,” since content strategy increasingly includes visuals, podcasts, webinars and multimedia. Many sites are not optimized to take full advantage of new and evolving distribution channels for existing content. The “social shareability” and “social visibility” of content can be maximized by using these techniques:

Share buttons: To share the specific URL being viewed.

Follow buttons: To follow the website’s brand on social media networks.

Facebook Open Graph tags, Twitter Cards, Pinterest “Rich Pins”: These social networks have specific tags that can be added to on-page website code. Once implemented, posts about your website will feature larger images and tailor-made descriptions, making them more visible in newsfeeds when shared (see the sketch at the end of this article).

Schema.org: Google has indicated that implementation of Schema.org code on your website is of high importance. Much like the other social networks’ cards, tags and pins, URLs using Schema.org code have much better presentation, draw more attention, and are shared more often. Schema.org can also maximize your site’s presence in search results: these tags power the review stars and other features in the search results themselves.

Content should be readable and consumable, especially on mobile devices.
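As a reference sketch, here is what Open Graph and Twitter Card tags can look like in a page’s head. The URLs, image path and Twitter handle are placeholders:

```html
<!-- Facebook Open Graph -->
<meta property="og:type" content="article">
<meta property="og:title" content="Make your content easily shared, linked and read">
<meta property="og:description" content="A short, tailor-made summary for newsfeeds.">
<meta property="og:url" content="https://www.example.com/blog/easily-shared/">
<meta property="og:image" content="https://www.example.com/images/share-card.jpg">
<!-- Twitter Cards -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@YourBrand">
```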

Google Moves Against Guest Blogging – March 2014

Google made two more moves against guest blogging in the last month:

1. Google penalizes a site connecting content marketers and webmasters
Content marketing is all about getting your information out to interested webmasters. Google recently penalized a site that simply connects those with content to those with websites. The content varied in form: guest posts, infographics, eBooks, etc. What was the issue, exactly?! Having a meeting place to connect great content with great websites SHOULD be win-win. It’s a much better option than spam email hawking content, or seeking links from any webmaster who will listen. So, is it wrong to try to connect authors and publishers? Is the editorial value of a link lessened because it was easier to connect with the webmaster?

2. Google penalizes an entire website based upon one guest post it considered off-topic
Doc Sheldon, a longtime SEO copywriter, was penalized based on a single guest post he hosted. The post was about social media marketing to Hispanic audiences, but it aroused the interest of Google. And not in a good way. Social media marketing is closely aligned with SEO, and the penalty feels arbitrary – if not confusing. Is Google spreading Fear, Uncertainty and Doubt?

Only one thing is clear: webmasters and business owners are being held 100% accountable for the content on their own websites. So what to do? Create a content generating, curating, sharing machine. Sharing content can be a minefield these days, but a safe way forward is:
1. Post full versions of your content to your site, but also to Google+ and LinkedIn, and promote your content at other relevant places around the web.
2. Tag your content with rich snippets, Facebook Open Graph tags, and Twitter Cards to increase its “shareability” and categorization.

Get a free link for your business: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!