4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka the referrer) of the traffic. There may be a couple of reasons:

– Redirects
We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back. (See the sketch at the end of this article.)

– Certain browsers block information
There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see if a page is being visited, shared or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
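As promised in the redirect point above, here is one common way a referrer gets dropped – a minimal sketch assuming a client-side "meta refresh" redirect (the URL is a placeholder):

```html
<!-- Client-side "meta refresh" redirect: many browsers drop the
     referrer here, so analytics may report the visit as "direct".
     A server-side 301 redirect generally preserves more information. -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">
```

If you must chain redirects, keep the chain short, and check in your analytics that the original source still comes through.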

Speed is Everything

Page loading speed has great importance with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient they are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer" – 30% of pages could save more than 250KB that way. (See the compression sketch at the end of this article.)

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least it was finally explicit about page speed being a ranking factor. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Their addition as a ranking factor is a great signal that they need to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, it knows human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity. If you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign – even reasonably fast sites can often need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: test how mobile-friendly your site is.
– GTmetrix.com features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
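Here is that compression sketch – a minimal starting point assuming an Apache server with mod_deflate enabled (check your own server and hosting setup before using it):

```apache
# Compress common text resources before sending them to visitors.
# Requires mod_deflate; images should instead be compressed in an
# image editor, since they are already binary formats.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

On nginx, the equivalent is the gzip module. Either way, re-test with GTmetrix afterward to confirm the savings.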

Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place

There are many legacy variations of the analytics snippets. In addition, what was the correct installation a couple of years ago may have dramatically changed, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js for their Google Analytics, which are quite a few years old. The best place – currently – to have your analytics code is inside the <head> tag, right before it ends with the </head> tag. This will prevent interference with other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more.
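Here is a minimal sketch of that placement – the snippet body is a placeholder; paste the actual tracking code from your own Google Analytics admin area:

```html
<head>
  <meta charset="utf-8">
  <title>Example Page</title>
  <!-- stylesheets, other meta tags, etc. -->

  <!-- Analytics snippet last, just before </head>. Replace this
       placeholder with the code from your GA property settings. -->
  <script>
    /* Google Analytics tracking snippet goes here */
  </script>
</head>
```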
2. Filters

Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters; we usually name this profile with RAW in the name. This system allows anyone to easily see if a filter has "gone rogue" and is filtering out good traffic.

There are also these problems with getting good data, which you did not even cause:

1. Incomplete data / views

Most businesses are using the free version of Google Analytics, and sometimes experience "sampling" in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of the complete data set, while returning those results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords

Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably have not missed what you do not have. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam

In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – spammers are inserting all sorts of things into your analytics. Some of the offenders might put "Vitaly was here" in the list of languages your visitors use – or make it look like visitors are coming in droves from some site you've never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and we have created a custom analytics segment to filter them from reports. Want to try our segment? Click this link and scrub your own view of your account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the Internet too, but we have customized this one for our clients.)

Penguin 4 has Arrived: What We Know

It’s been 2 years since the last Penguin Penalty update. The Penguin Penalties were known to destroy site traffic by placing sites – that were formerly on page 1– onto page 4 or even page 9. Organic traffic would decrease sometimes to less than 10% of previous levels, and devastate revenue. Penguin is such a serious update for any site relying on organic traffic, that new insights are being gained daily. This update is a little bit different than previous Penguin updates. They appear to get increasingly more harsh. 1. Google still cares tremendously about links We’ve been expecting Google to use social media at some point for authority, but instead they keep using links as a powerful part of their algorithm. Looking at the amount of processing power, education, penalties and heat they have taken… well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid html, or any of the other hundreds of factors they may (or may not) take into account. 2.  It’s now “realtime” As Google discovers links to your site, they will be judged as good, bad or somewhere in-between. Rankings will fluctuate accordingly. This system is long overdue: Previous penguin updates have meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, as years of lost traffic was particularly painful. 3. Realtime can mean weeks Few have done the math and research in this quora thread, but that sounds like it will be a few weeks. 4. Penguin penalties will now be on the page level, not site level Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair and we saw several clients over the years being penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly. 5. Bad links no longer penalize – they just don’t count This is a return to the “old days”, simpler times when webmasters didn’t have to continually audit who was linking to them. One of the worst parts of previous penguin updates was the way that low quality links provided a “double whammy” to rankings: They stopped boosting rankings, and also penalized the site. 6. Disavow files are still recommended Google still recommends the disavow file is used. It helps Google identify low quality sites, as well as offering protection against a “manual penalty”, where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from it’s bad links. Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We’ll keep you updated!

After Keyword Research – What do I do with these keywords?!

Getting a keyword research report is just the first step in enhancing your on-site SEO. Once the research is complete, it is important to use those words to build out new pages – or improve tagging on existing pages.

Domains

Buying a keyword-rich domain name is not as lucrative as it once was, but there are still good opportunities. See last month's article: Do Minisites Still Work?

Naming

Savvy business owners may use words and phrases found in their keyword research to name products, services, and even companies. There is no better way to show your audience that you have their solution than to name it (or the whole company!) appropriately.

Social Destinations

Social sites can rank for your keywords and act as informational channels. While your best prospects are not likely searching Pinterest or YouTube for solutions, certain keyword searches might be good content channels. Even in the long buying cycles of business-to-business sales, social media content will help inform and qualify prospects. Consider which of these channels might work well for your keywords:

– Pinterest boards
– YouTube channels
– LinkedIn groups
– SlideShare presentations

Consider that a keyword-focused social destination may not be appropriate for your entire brand: you may want a brand-focused YouTube channel and a campaign channel focused on a specific keyword phrase.

Blogging Topics

Ranking at the top of search engine results for any competitive keyword phrase requires you to be "all about that phrase." To be relevant for the many topics and categories of your targeted phrase, you will need many different pieces of content around that phrase. Consider online tools such as HubSpot's blog topic generator (http://www.hubspot.com/blog-topic-generator) to generate "clickable" blogging ideas. Here is another nice post: https://www.authorityhacker.com/blog-post-ideas/ – and be sure to check whether the blogging titles themselves have search volume. That's a nice bonus you don't want to pass up!

Content Formats

Some key phrases give away hints as to what kind of content would be best to produce. "How to" searches may lend themselves to tutorials and videos. Other topics are worthy of an entire channel or perhaps a white paper. For any keyword phrase you may want to target, taking the searchers' needs into account is always the best approach: consider what content your audience is looking for with each query.

A keyword research report is the beginning of any good SEO campaign. Depending on the site, audience and available resources, any number of tactics could be deployed. For each of the above methods, however, focus should always come back to your target audience.

PSST! Need a Free Link? Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company.

(Video: Matt Cutts explaining how sitelinks are generated.)

A typical company listing has 4-6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL to navigate. Some URLs may have up to 12 sitelinks below the primary search result!

Organic sitelinks are great for users (and for you!)

There are many key benefits to organic sitelinks:

– Users can quickly and easily gain access to a better-suited landing page than the homepage. This quick navigation option is great for the user, and it reduces your organic bounce rate too.
– Sitelinks provide a large presence on the search results pages.
– PPC Hero did some research into sitelinks and found that, while they're not clicked as often as the primary link, they do provide additional CTR and conversions. Read more in the PPC Hero study, which showed a 64% increase in PPC ad click-through rate with sitelinks.
– Having numerous – and well-crafted – sitelinks helps to make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 tips to get the sitelinks you want (and deserve!)

Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands what the key pages are on your site! That's why it's crucial that companies watch over and adjust their sitelinks. While you can't specify sitelinks directly to Google, and they don't disclose exactly how they choose organic sitelinks, there are key tactics you can use to get the sitelinks you want (and deserve!):

1. Be #1! You will typically only get sitelinks for branded searches, such as for your company name. Sometimes the #1 result will get sitelinks as well, but it's typically branded queries.
2. Submit a sitemap.xml in Search Console (formerly Webmaster Tools). This appears to be a necessary step before sitelinks are "granted" by Google. (See the sketch after this list.)
3. Demote undesirable sitelinks in Search Console if you find that any are showing up. To demote a sitelink URL: on the Search Console homepage, click the site you want. Under Search Appearance, click Sitelinks. In the "For this search result" box, complete the URL for which you don't want a specific sitelink URL to appear. In the "Demote this sitelink URL" box, complete the URL of the sitelink you want to demote. You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).
4. Look at what you're linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements.
5. Googlebot seems to like lists of links, including H2 tags with links to sections or pages and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/
6. Use rel=nofollow. Sometimes privacy policies show up as sitelinks because they have a link on every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks. (See the sketch after this list.)
7. Optimize your pages. Ideally, your best pages should already be optimized, but make sure titles and meta descriptions are in order.
8. Inbound links: look at where other sites are linking to (change your redirects, or reach out to other sites and ask them to update their links).
9. Googlebot prefers popular pages, including landing pages with volume in analytics.
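For tip 2, a sitemap is a plain XML file listing your URLs. Here is a minimal sketch following the sitemaps.org protocol – the URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/contact-us/</loc>
  </url>
</urlset>
```

And for tip 6, the nofollow attribute goes on the sitewide link itself – for example, a footer link to a privacy policy you'd rather not see as a sitelink:

```html
<a href="/privacy-policy/" rel="nofollow">Privacy Policy</a>
```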
Organic sitelink takeaways

While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often very personalized and based on Google's algorithm, it may be that certain sitelinks appear for some users, but not for others.

Conversion is King

Content is helpful, but conversion is everything. The point of content – and usability in general – is to meet business objectives. Any business objective can be a conversion of sorts: bookmarking, social sharing/liking, video views, time on site, lead generation, add-to-cart, and hopefully even completing the sale! By measuring each step, brands can understand where their site can improve its usability and contribute more to the bottom line.

1. It can be easier to increase conversion than to increase traffic

Increasing conversion also increases revenue, and it can be easier than increasing traffic – up to a point.

2. Even mobile apps can easily conduct conversion optimization tests

Mobile testing platforms now allow conversion and usability testing without rolling out new versions of your app. Solutions exist from Optimizely, Visual Website Optimizer (VWO), Liquid, and Artisan Optimize Mobile App.

3. You should test EVERYTHING

User experience professionals agree: take their advice, but "always keep testing". Conversion case studies show all sorts of factors can influence conversion:

– Logos and headers
– Design style of the site
– Product page designs
– Product descriptions and overall copywriting
– The text of your call-to-action buttons
– Images
– Use of video (usually boosts conversion, but not always!)
– Purchasing path through the site

4. Website redesigns should use, not reset, your data

Now, if the site is just awful, start with a redesign. But a website redesign that starts over can sometimes be a horrible waste: another shot in the dark, with hope and prayer. Consider instead a redesign process based on evolving the website with small changes, continually tested for improvement. But definitely start from having your website in a "good place"!

Not sure of next steps for your site? Time to start testing – or maybe a redesign from that "good place". Need a good interactive agency or website design firm? We've worked with agencies and designers, and we partner with the best! Talk to us about your needs, and we'll introduce you to the right match.

See you at SearchCon 2015! Are you interested in learning about the latest in search from the experts? Join us at SearchCon 2015 – The Digital Marketing and SEO Conference! SearchCon is April 9th and 10th and will be held at Beaver Run Resort in beautiful Breckenridge, Colorado. Register before March 2nd and take advantage of early bird pricing! http://searchcon.events/

Kick-Start Your SEO in 2015

The search engine optimization (SEO) industry has certainly evolved these last few years. The many Google updates – and their sometimes heavy-handed penalties – in addition to an explosion of mobile traffic have shaped the rules for SEO and online marketing. When we look at what's working at the end of 2014, we see just how much everything has changed. Big changes in SEO will certainly continue for 2015 and beyond. Here are six things to focus your efforts on in 2015:

1. Mobile

If you haven't already, it's time to take a mobile-first approach with responsive website design. As mentioned in last month's blog all about mobile, Google has a new tool (and new expectations) around mobile friendliness. Test your site here: https://www.google.com/webmasters/tools/mobile-friendly/

2. Rich Snippets

These underlying webpage code elements help Google and other sites understand when to show review stars, customized descriptions, and more – all of which are vital to your site's ranking and click-through rate. Consider: a study last year showed an average rankings increase of four positions when rich snippets were implemented. In one case study, 30% more visitors clicked through from search results to a site with rich snippets. John Mueller of Google recently requested that examples of rich snippet "spam" in Google be sent directly to him. It must be working, and it must be valuable, if Google is looking for spam!

There are many examples of different rich snippets at http://schema.org, a site and format created by Google, Yahoo and Bing. Some types include recipes, products, events, locations, people, ratings, etc. Other formats are also being provided by social media sites: Facebook Open Graph tags, LinkedIn cards, Twitter cards, and even Pinterest pin cards. A tweet of a site using Twitter cards looks much better than a standard tweet: when Twitter is given data in a Twitter card format, it provides a much richer experience for viewers of that tweet. And there are many different types of Twitter cards too: galleries, large images, video players, etc.
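Here is a minimal sketch of what this markup can look like. The product name and values are hypothetical placeholders; the vocabulary comes from schema.org (microdata) and Twitter's card meta tags:

```html
<!-- schema.org microdata: a product with an aggregate rating,
     which can qualify a result for review stars. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>

<!-- Twitter card meta tags, placed in the page <head>: -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Example Widget – Product Page">
<meta name="twitter:description" content="A short description shown in the tweet.">
<meta name="twitter:image" content="https://www.example.com/widget.jpg">
```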
3. Universal Analytics

Google Analytics is finally getting an upgrade. In the past, data about site visitors was lost if they visited several of a brand's website properties, switched devices, or had an extended period of time between visits. Universal Analytics fixes that and even allows custom dimensions, as well as extreme customization. The system came out of beta testing in 2014 and will be a requirement at some point. Is it on your radar to transition? If not, better get to it! Google will not be providing new features to regular Analytics and will eventually force webmasters to make the switch.

4. Link Disavowal

Google's Penguin penalty has made this a necessity. Do you know where your site has links? Most webmasters do not. And many links that were key in the past must now be disavowed in Google's Webmaster Tools. That is the price we pay for Google's ever-changing formula! Here are some possible sources of problematic links:

– "Site wide" footer links: Are other sites linking to you from every page, or in their footer? Google no longer sees this as a positive thing.
– Links from 2004-2012: If your SEO plan included creating links during this period, you should get a link analysis performed. Even if Google's guidelines were being followed, it's vital to make sure these links are still the kind Google wants to see.
– Low-quality links: You know these when you see them. Would you visit the site a link is on? Does Google still see any authority there? These are important considerations for your links!
– Links from penalized sites: Sites that were once in Google's good graces might have since switched hands or been penalized.
– Negative SEO: SEOs used to debate whether any site's rankings could be hurt from the outside. Now, it's commonly accepted that negative SEO is possible and happening throughout the web. Some sites are building low-quality links, links on penalized sites, etc., pointing to competitors' websites!

5. Migrate Your Site to HTTPS

Are you planning to migrate your entire site to HTTPS? Recent thoughts from Google are making this a more important consideration! A member of the Google Chrome browser team recently commented that anything less than HTTPS is like leaving the front door unlocked. On the search side, HTTPS has been identified as a minor ranking signal – and migrating your site should be considered. Be sure you don't create duplicate content by accident, though!

6. Use Content Marketing for Link Authority

Content marketing is the new link building. It's authentic marketing that can also boost your site's rankings (but it must be done with an emphasis on quality outreach). When done correctly, content marketing brings:

– social sharing
– brand visibility
– inbound links (with authority)
– referral traffic

Search engine optimization will always be ever-changing: technology is moving at breakneck speed, and search engines have ever-changing criteria and expectations. Having these six items on your radar will help carry you nicely into the new year. And then some. The year 2016 may be completely different, but these are good, solid investments of time and money.

Google: All about that mobile

Having a good mobile experience is increasingly important for websites. Advances in technology have made it possible for many more sites to be viewed on mobile devices, but the experience is usually much less pleasurable than viewing via desktop. Google wants to change that, and is again trying to move website design in the right direction.

Google and Bing are currently locked in a battle to be the best search engine for mobile. They know users will judge them by the sites suggested during a search. When searchers encounter unusable sites from their query, they change search engines. Wouldn't you rather have ten good sites given to you from a search than a hit-and-miss list? Mobile is growing fast: comScore estimates that mobile usage will outpace desktop usage this year! Google has already started showing "Mobile Friendly" icons in search results – and has even tested "NOT Mobile Friendly" icons recently!

So what to do? Here are some quick tips:

1. View your site in mobile

Try using this free testing tool from Google: https://www.google.com/webmasters/tools/mobile-friendly/ Google tells you if fonts are too small, if there are missing "viewport" metatags, and about other mobile usability errors.

2. Easy URLs

Keyword-rich URLs have lost much of their power in the last few years, but are likely to lose much more: they aren't as easy to type into a smartphone.

3. Responsive design

A responsive design is usable at any size. Previous efforts to provide different sites to different kinds of devices have failed as the many types of devices have exploded and crossed over into other categories, such as 2-in-1s and giant phones. Having several versions of your website might have also meant a nightmare in keeping all of them updated and in sync. Googlebot, in all its wisdom, couldn't figure out which version was canonical, either – or which version to return a certain user to, based on their device. Google's new Mobile Usability reports (in Webmaster Tools) show the following issues:

– Flash content,
– missing viewport (a critical meta tag for mobile pages – see the sketch at the end of this post),
– tiny fonts,
– fixed-width viewports,
– content not sized to viewport,
– clickable links/buttons too close to each other.

4. Access to site resources

Googlebot and Bingbot both want to see into your JavaScript and CSS files. It used to be a best practice to block access, and many sites have. But as time has passed, bots have missed important information about user experience: Are there ads above the fold? Is the user being redirected, or shown irrelevant content? Bots need to know, all within the framework of ranking "better" sites higher. And you cannot be "better" on mobile if the experience is bad.
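Two quick sketches tied to tips 3 and 4 above. First, the standard responsive viewport meta tag:

```html
<!-- Tip 3: the responsive viewport meta tag, placed in <head>. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Second, a hypothetical robots.txt fragment making sure bots can fetch your CSS and JavaScript – the /css/ and /js/ paths are placeholders for wherever your site keeps those files:

```
# Tip 4: robots.txt – don't block the resources bots need to render pages.
User-agent: *
Allow: /css/
Allow: /js/
```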

“How do you write great title tags and meta descriptions?”

[Updated Nov 1, 2016] "How do you write great title tags and meta descriptions?" That is the question clients ask me most frequently. And it's a complicated question, for sure! There are several components to writing great titles and descriptions, but there are also a few specifications that each company will want to consider for themselves. I'll address the considerations first.

The goal is to write title tags that please Googlebot, but you also want titles and descriptions that are functional and helpful to the human visitors to your website. This can be tricky, since the approach differs when writing for bots versus humans. My best advice: somewhere right in the middle is your best bet! Write naturally and use the same voice that you are using in your page content, but include keyword phrases that are specific to the page.

Title tags must fall into a range of characters, but they also need to fall into a size range to appear complete in Google search. This size range has to do with the number of pixels a title tag takes up on the page. For example, a title tag with a couple of w's in it will take up far more space than a title with several lowercase l's and i's. Just look at this spacing difference: www lil. The three skinnier letters take up about as much space as one of the w's!

Why does this matter? Well, in Google search results, you are allotted a specific amount of space for the title of your page. This went into effect in early 2014 when Google updated its search results page. There was another update to the format of Google's search results in 2016, and results now have a bit more space on the page. Yay! But wait – there are also some other things to consider: how many words you use, where the break might show up in those words (if you use too many), and the fact that Google now appends the brand name to the end of the title tag in some cases. You want your page titles to appear complete in the results, while getting the most out of this limited space. Unfortunately, this all makes it really tricky to say that there is a specific number of characters you should use for each title tag. Around 52-55 characters is probably a pretty safe bet, but if you think you might be using a lot of wide characters (or if you test and find that Google is appending your brand name to every title), use a few fewer characters.

Meta descriptions also have a size range that you want to target for full effect in Google search results. Meta descriptions are not used in Google's ranking algorithm, but a good meta description raises your organic click-through rate. Google can tell that human searchers are clicking through to your site, and likely takes that into account with your ranking. Google also sees short or duplicate meta descriptions as a site quality issue – so I guess descriptions are indeed part of the overall formula. Recently, Google has made some changes to how descriptions display: in some cases they chop up your beautiful descriptions, taking bits and pieces of your content and adding them to the description so that they can highlight more of the search terms a user typed into the search bar. In addition, Google will sometimes add a date to the beginning or end of the description field in search results. Considering all of this, however, I still recommend meta descriptions of between 139 and 156 characters. They seem to work best, no matter what Google decides to do with them.
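Putting those ranges together, here is a hypothetical example for an invented business – the title is roughly 45 characters and the description roughly 150, inside the guidelines above:

```html
<head>
  <!-- Title: ~45 characters, under the ~52-55 character guideline,
       with the page-specific keyword phrase up front. -->
  <title>Dog Grooming in Denver | Example Pet Care Co.</title>

  <!-- Description: ~150 characters, within the 139-156 range,
       with a benefit and a call to action for human searchers. -->
  <meta name="description" content="Gentle, professional dog grooming in the heart of Denver. Book online today and get 10% off your first visit with our certified, experienced groomers.">
</head>
```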
Again, strive to convey your message to human visitors with your natural writing style, but include those keyword targets specific to the page. When writing meta descriptions, entice users to click on your search engine result by listing benefits and a call to action. In addition, the meta description should be different for each page of your website.

I have written a plethora of title tags and meta descriptions for a wide range of clients, and what I've learned is that if you are organized and set up systems, even the largest websites can have all-new titles and descriptions before you know it. I recommend setting up a spreadsheet with columns for the old title, new title, character count, old description, new description, and character count. Once you get used to using the spreadsheet, you can set the width of the columns to help guide you to the right size while you are writing.

If you are still feeling overwhelmed about getting your titles and descriptions in order, just give me a call. I've just about got it down to an art, and I've also got a few tools in my tool belt that can automate some of the process that may be bogging you down. I'm here to help! Questions? Shoot me an email or a message at @jannavance on Twitter. Good luck!