4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings remain stable.

Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a #2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects

We've seen many strange redirects over the years, enough that this is worth mentioning.
Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back.

– Certain Browsers Block Information

There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see if a page is being visited, shared, or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"

Speed is Everything

Page loading speed has great importance with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of Mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient mobile users are:

– The average mobile page takes 22 seconds to load, but 53% of users leave a page that takes more than 3 seconds to load!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer": 30% of pages could save more than 250KB that way.

2. Ranking Factor

A few years back, Google made page speed a small ranking factor – or at least they finally became explicit about it being a ranking factor. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a great signal that it needs to be prioritized.

3. Bounce Rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: When Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl Rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget.
Slower sites can be overwhelmed by crawl activity. But if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign: even reasonably fast sites often need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: Test how mobile-friendly your site is.
– GTmetrix.com features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
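As a sketch of the "add compression to your web server" fix mentioned above, here is roughly how gzip compression of text assets might be enabled on an Apache server (this assumes mod_deflate is installed; other servers have their own directives):

```apache
# Enable gzip compression for text-based responses (requires mod_deflate).
# Images should instead be compressed before upload (e.g., in Photoshop).
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```

Re-run your site through GTmetrix after a change like this to confirm the transfer sizes actually shrank.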

Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not Having Your Analytics Snippet in the Correct Place

There are many legacy variations of the analytics snippet. In addition, what was the correct installation a couple of years ago may have changed dramatically, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which is quite a few years old. The best place – currently – for your analytics code is inside the <head> tag, right before it ends with the </head> tag. This will prevent interference from other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more.

2. Filters

Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters; we usually name this profile with RAW in the name. This system allows anyone to easily see if a filter has "gone rogue" and is filtering out good traffic.

There are also these problems with getting good data, which you did not even cause:

1. Incomplete Data / Views

Most businesses use the free version of Google Analytics, and sometimes experience "sampling" in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set.
Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of the complete data set, while returning those results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic Keywords

Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral Spam, Organic Keyword Spam, Language Spam

In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – all sorts of things are being inserted into your analytics. Some of the offenders might put "Vitaly was here" in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you've never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and have created a custom analytics segment to filter them from reports. Want to try our segment? Click this link and scrub your own view of your account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the Internet too, but we have customized this one for our clients.)
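Returning to snippet placement (issue 1 above): the tracking code belongs at the very end of <head>. A minimal sketch, assuming the analytics.js version of the tracker (UA-XXXXX-Y is a placeholder property ID; copy the real loader snippet from your own GA admin screen):

```html
<head>
  <title>Example Page</title>
  <!-- other meta tags, CSS, etc. -->

  <!-- Google Analytics: the last element before </head> closes -->
  <script>
    /* The analytics.js loader snippet from your GA admin screen goes here, then: */
    ga('create', 'UA-XXXXX-Y', 'auto');  /* UA-XXXXX-Y is a placeholder property ID */
    ga('send', 'pageview');
  </script>
</head>
```

Placing it here keeps other scripts from running before it and interfering with the data it records.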

Penguin 4 has Arrived: What We Know

It's been 2 years since the last Penguin penalty update. The Penguin penalties were known to destroy site traffic by placing sites that were formerly on page 1 onto page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels, devastating revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which appeared to get increasingly harsh.

1. Google Still Cares Tremendously About Links

We've been expecting Google to use social media at some point for authority, but instead they keep using links as a powerful part of their algorithm. Looking at the amount of processing power, education, penalties, and heat they have taken... well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors they may (or may not) take into account.

2. It's Now "Realtime"

As Google discovers links to your site, they will be judged as good, bad, or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: Previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, as years of lost traffic were particularly painful.

3. Realtime Can Mean Weeks

A few people have done the math and research in this Quora thread, and it sounds like "realtime" will mean a few weeks.

4. Penguin Penalties Will Now Be at the Page Level, Not the Site Level

Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair, and over the years we saw several clients being penalized after an intruder built pages (and bad links to those pages).
Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad Links No Longer Penalize – They Just Don't Count

This is a return to the "old days", simpler times when webmasters didn't have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links provided a "double whammy" to rankings: They stopped boosting rankings, and also penalized the site.

6. Disavow Files Are Still Recommended

Google still recommends that the disavow file be used. It helps Google identify low-quality sites, and it offers protection against a "manual penalty", where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from its bad links.

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We'll keep you updated!
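For reference, a disavow file is just a plain text file uploaded through Google's disavow links tool. A minimal sketch might look like this (the domains and URL below are invented for illustration):

```txt
# Disavow file uploaded via Google's disavow links tool.
# Lines starting with # are comments.

# Disavow every link from an entire domain:
domain:spammy-link-network.example

# Disavow a single linking page:
http://bad-directory.example/links/page.html
```

One entry per line; use the domain: form when a whole site is low quality, and individual URLs when only certain pages are a problem.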

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company. (Video: Matt Cutts explaining how sitelinks are generated.)

A typical company listing has 4-6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL to navigate. Some URLs may have up to 12 sitelinks below the primary search result!

Organic Sitelinks Are Great for Users (and for You!)

There are many key benefits to organic sitelinks:

– Users can quickly and easily gain access to a better-suited landing page than the homepage. This quick navigation option is great for the user, and it reduces your organic bounce rate too.
– Sitelinks provide a large presence on the search results pages. PPC Hero did some research into sitelinks and found that, while they're not clicked as often as the primary link, they do provide additional CTR and conversions. Read more in the PPC Hero study, which showed a 64% increase in PPC ad click-through rate with sitelinks.
– Having numerous – and well-crafted – sitelinks helps make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 Tips to Get the Sitelinks You Want (and Deserve!)

Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands what the key pages on your site are! That's why it's crucial that companies watch over and adjust their sitelinks. While you can't specify sitelinks directly to Google, and they don't disclose exactly how they choose organic sitelinks, there are key tactics you can use to get the sitelinks you want (and deserve!):

1. Be #1! You will typically only get sitelinks for branded searches, such as for your company name. Sometimes the #1 result will get sitelinks as well, but it's typically branded queries.

2. Submit a sitemap.xml in Search Console (formerly Webmaster Tools).
This appears to be a necessary step before sitelinks are "granted" by Google.

3. Demote undesirable sitelinks in Search Console (formerly Webmaster Tools) if you find that any are showing up. To demote a sitelink URL: On the Search Console homepage, click the site you want. Under Search Appearance, click Sitelinks. In the "For this search result" box, enter the URL for which you don't want a specific sitelink URL to appear. In the "Demote this sitelink URL" box, enter the URL of the sitelink you want to demote. You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).

4. Look at what you're linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements.

5. Googlebot seems to like lists of links, including H2 tags with links to sections or pages, and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/

6. Use rel=nofollow. Sometimes privacy policies show up as sitelinks because they have a link on every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks.

7. Optimize your pages. Ideally, your best pages should already be optimized, but make sure titles and meta descriptions are in order.

8. Inbound links: look at where other sites are linking to (change your redirects, or reach out to other sites and ask them to update their links).

9. Googlebot prefers popular pages, including landing pages with volume in analytics.

Organic Sitelink Takeaways

While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often very personalized and based on Google's algorithm, it may be that certain sitelinks appear for some users but not for others.

PSST! Need a Free Link?
Get a free link for your agency: Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
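As a minimal sketch of the nofollow tip above – assuming a privacy policy page that keeps appearing as an unwanted sitelink – the sitewide footer link could be marked like this:

```html
<!-- Sitewide footer link, marked nofollow so Google is less likely to
     treat the privacy policy as an important sitelink candidate -->
<footer>
  <a href="/privacy-policy" rel="nofollow">Privacy Policy</a>
</footer>
```

Combine this with a sitelink demotion in Search Console for the fastest effect.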

Conversion is King

Content is helpful, but conversion is everything. The point of content – and usability in general – is to meet business objectives. Any business objective can be a conversion of sorts: bookmarking, social sharing/liking, video views, time on site, lead generation, add-to-cart, and hopefully even completing the sale! By measuring each step, brands can understand where their site can improve its usability and contribute more to the bottom line.

1. It Can Be Easier to Increase Conversion Than to Increase Traffic

Increasing conversion also increases revenue, and can be easier than increasing traffic – up to a point.

2. Even Mobile Apps Can Easily Conduct Conversion Optimization Tests

Mobile testing platforms now allow conversion and usability testing without rolling out new versions of your app. Solutions exist from Optimizely, Visual Website Optimizer (VWO), Liquid, and Artisan for optimizing mobile apps.

3. You Should Test EVERYTHING

User experience professionals agree: Take their advice, but "always keep testing". Conversion case studies show all sorts of factors can influence conversion:

– Logos and headers
– Design style of the site
– Product page designs
– Product descriptions and overall copywriting
– The text of your call-to-action buttons
– Images
– Use of video (usually boosts conversion, but not always!)
– Purchasing path through the site

4. Website Redesigns Should Use, Not Reset, Your Data

Now, if the site is just awful, start with a redesign. But a website redesign that starts over can sometimes be a horrible waste: another shot in the dark, with hope and prayer. Consider instead a redesign process based on evolving the website with small changes, continually tested for improvement. But definitely start from having your website in a "good place"!

Not sure of next steps for your site? Time to start testing – or maybe a redesign from that "good place". Need a good interactive agency or website design firm? We've worked with agencies and designers.
And we partner with the best! Talk to us about your needs, and we'll introduce you to the right match.

See You at SearchCon 2015!

Are you interested in learning about the latest in search from the experts? Join us at SearchCon 2015 – The Digital Marketing and SEO Conference! SearchCon is April 9th and 10th and will be held at Beaver Run Resort in beautiful Breckenridge, Colorado. Register before March 2nd and take advantage of early bird pricing! http://searchcon.events/

Kick-Start Your SEO in 2015

The search engine optimization (SEO) industry has certainly evolved these last few years. The many Google updates – and their sometimes heavy-handed penalties – in addition to an explosion of mobile traffic have shaped the rules for SEO and online marketing. When we look at what's working at the end of 2014, we see just how much everything has changed. Big changes in SEO will certainly continue for 2015 and beyond. Here are six things to focus your efforts on in 2015:

1. Mobile

If you haven't already, it's time to take a mobile-first approach with responsive website design. As mentioned in last month's blog all about mobile, Google has a new tool (and new expectations) around mobile-friendliness. Test your site here: https://www.google.com/webmasters/tools/mobile-friendly/

2. Rich Snippets

These underlying webpage code elements help Google and other sites understand when to show review stars, customized descriptions, and more – all of which are vital to your site's ranking and click-through rate. Consider:

– A study last year showed an average rankings increase of 4 positions when rich snippets were implemented.
– In one case study, 30% more visitors clicked through from search results to a site with rich snippets.
– John Mueller of Google recently requested that examples of rich snippet "spam" in Google be sent directly to him. It must be working, and it must be valuable, if Google is looking for spam!

There are many examples of different rich snippets at http://schema.org, a site and format created by Google, Yahoo and Bing. Some types include recipes, products, events, locations, people, ratings, etc. Other formats are also provided by social media sites: Facebook Open Graph tags, LinkedIn cards, Twitter cards, and even Pinterest pin cards. Consider how a tweet of a site using Twitter cards looks better than the standard tweet: When Twitter is given data in a Twitter card format, they provide a much richer experience for viewers of that tweet.
And there are many different types of Twitter cards too: galleries, large images, video players, etc.

3. Universal Analytics

Google Analytics is finally getting an upgrade. In the past, data about site visitors was lost if they visited several of a brand's website properties, switched devices, or had an extended period of time between visits. Universal Analytics fixes that and even allows custom dimensions, as well as extreme customization. The system came out of beta testing in 2014, and will be a requirement at some point. Is it on your radar to transition? If not, better get to it! Google will not be providing new features to classic Analytics and will eventually force webmasters to make the switch.

4. Link Disavowal

Google's Penguin penalty has made this a necessity. Do you know where your site has links? Most webmasters do not. And many links that were key in the past must now be disavowed in Google's Webmaster Tools. That is the price we pay for Google's ever-changing formula! Here are some possible sources of problematic links:

– "Site-wide" footer links: Are other sites linking to you from every page, or in their footer? Google no longer sees this as a positive thing.
– Links from 2004-2012: If your SEO plan included creating links during this period, you should get a link analysis performed. Even if Google's guidelines were being followed, it's vital to make sure these links are still the kind Google wants to see.
– Low-quality links: You know these when you see them. Would you visit the site a link is on? Does Google still see any authority there? These are important considerations for your links!
– Links from penalized sites: Sites that were once in Google's good graces might now have switched hands or been penalized.
– Negative SEO: SEOs used to debate whether any site's rankings could be hurt from the outside. Now, it's commonly accepted that negative SEO is possible and happening throughout the web. Some sites are building low-quality links, links on penalized sites, etc.
pointing to competitors' websites!

5. Migrate Your Site to HTTPS

Are you planning to migrate your entire site to HTTPS? Recent thoughts from Google are making this a more important consideration! A member of the Google Chrome browser team recently commented that anything less than HTTPS is like leaving the front door unlocked. On the search side, HTTPS has been identified as a minor ranking signal – and migrating your site should be considered. Be sure you don't create duplicate content by accident, though!

6. Use Content Marketing for Link Authority

Content marketing is the new link building. It's authentic marketing that can also boost your site's rankings (but it must be done with an emphasis on quality outreach). When done correctly, content marketing brings:

– social sharing
– brand visibility
– inbound links (with authority)
– referral traffic

Search engine optimization will always be ever-changing: Technology is moving at breakneck speed, and search engines have ever-changing criteria and expectations. Having these six items on your radar will help carry you nicely into the new year. And then some. The year 2016 may be completely different, but these are good solid investments of time and money.

Need a good interactive agency or website design firm? We've worked with many and partnered with the best. Talk to us about your needs, and we'll introduce you to the right match!
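As a small illustration of the rich snippets discussed above, schema.org data can be embedded in a page as JSON-LD. The product name and rating values below are invented placeholders:

```html
<!-- schema.org Product markup as JSON-LD; all values are illustrative only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```

Markup like this is what makes review stars eligible to appear under your listing in search results.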

Google: All about that mobile

Having a good mobile experience is increasingly important for websites. Advances in technology have made it possible for many more sites to be viewed on mobile devices, but the experience is usually much less pleasurable than viewing via desktop. Google wants to change that, and is again trying to move website design in the right direction. Google and Bing are currently locked in a battle to be the best search engine for mobile. They know users will judge them by the sites suggested during a search: When searchers encounter unusable sites from their query, they change search engines. Wouldn't you rather have ten good sites given to you from a search than a hit-and-miss list? Mobile is growing fast: comScore estimates that mobile usage will outpace desktop usage this year! Google has already started showing "Mobile Friendly" icons in search results – and has even tested "NOT Mobile Friendly" icons recently! So what to do? Here are some quick tips:

1. View Your Site in Mobile

Try using this free testing tool from Google: https://www.google.com/webmasters/tools/mobile-friendly/ Google tells you if fonts are too small, if "viewport" metatags are missing, and about other mobile usability errors.

2. Easy URLs

Keyword-rich URLs have lost much of their power in the last few years, but are likely to lose much more: They aren't as easy to type into a smartphone.

3. Responsive Design

A responsive design is usable at any size. Previous efforts to provide different sites to different kinds of devices have failed as the many types of devices have exploded and crossed over into other categories, such as 2-in-1s and giant phones. Having several versions of your website might have also meant a nightmare in keeping all of them updated and in sync. Googlebot, in all its wisdom, couldn't figure out which version was canonical, either – or which to return a certain user to, based on their device.
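The missing "viewport" metatag flagged by Google's testing tool above is a one-line fix. A typical baseline for a responsive page looks like this:

```html
<!-- Tells mobile browsers to render at the device's width
     instead of a zoomed-out desktop-width layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

It goes in the page's <head>; without it, mobile browsers assume a desktop layout and your fonts and buttons shrink accordingly.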
Google's new Mobile Usability reports (in Webmaster Tools) show the following issues:

– Flash content,
– missing viewport (a critical meta-tag for mobile pages),
– tiny fonts,
– fixed-width viewports,
– content not sized to viewport,
– clickable links/buttons too close to each other.

4. Access to Site Resources

Googlebot and Bingbot both want to see into your JavaScript and CSS files. It used to be a best practice to block access, and many have. But as time has passed, bots have missed important information about user experience: Are there ads above the fold? Is the user being redirected, or shown irrelevant content? Bots need to know, all within the framework of ranking "better" sites higher. And you cannot be "better" on mobile if the experience is bad.

Need a good interactive agency or website design firm? We've worked with many, and partnered with the best. Talk to us about your needs, and we'll introduce you to the right match!
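If your robots.txt dates from the block-everything era, unblocking scripts and stylesheets can be as simple as the sketch below (the paths are illustrative; check where your own site actually serves its assets):

```txt
# robots.txt - let crawlers fetch the assets needed to render pages
User-agent: *
Allow: /css/
Allow: /js/
Disallow: /admin/
```

With CSS and JavaScript crawlable, Googlebot can render your pages the way a visitor's phone does and judge the mobile experience fairly.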

Penguin 3.0: A year in the waiting

Google's "Penguin updates" target the easiest link building practices. Since Google's algorithm uses links to determine whether a website deserves to rank, they use the Penguin updates to punish sites that might be getting links in an automated fashion.

Penguin Update 1: April 24, 2012, dubbed v1.0
Penguin Update 2: May 25, 2012
Penguin Update 3: October 5, 2012
Penguin Update 4: May 22, 2013, dubbed v2.0
Penguin Update 5: October 4, 2013
Penguin Update 6: October 17, 2014, dubbed v3.0

Penguin 3.0 was the sixth Penguin update from Google, and actually much smaller than the original Penguin update. It started on October 17, and is still rolling out. But it hasn't been as much of a hit as previous updates:

1. Google says less than 1% of queries will be affected. That's less than a third of the original Penguin update.
2. No new "signals" have been added. It was more of a "refresh" than an update. For those sites that disavowed or removed heavy amounts of links, it was a welcome change.
3. Talk of a larger Penguin update has already started, expected in spring of 2015.

Vigilance and Risk Management

Last year's update also opened sites up to more dirty tricks from competitors. Negative SEO has been possible for a long time, and was only recently acknowledged by Google. The newest forms of negative SEO put a competitor's site into Google's crosshairs with:

– Links from the worst kinds of sites
– Links targeting the worst kinds of keywords
– Links targeting the right keywords, but in unnatural amounts

Summary of Search: Who Is Syndicating Whom? What to know about syndicating your blog.

SUMMARY OF SEARCH

Google released a new Panda 4.1 update this month, and unique, relevant content and overall site quality have never been more vital. Syndication plays a large part in what Google sees as duplicate content. Done correctly, syndication can mean new visitors, brand exposure, social shares, and links to your site (which are seen as "votes" by Google). Implemented poorly, another site may look to Google like the authoritative source for your content – and your site is seen as a spammy "scraper" site.

Why Does It Matter?

Google prefers to show a piece of content only once in the top ten results. When Google finds the same content in two places on the internet, it will typically show the most authoritative site in the higher position, and other sites on page 2 or 3 (or 20). But a site with more authority doesn't necessarily deserve credit for all the content it posts.

Canonical Tag

A few years ago, Google helped create the "canonical tag" to give authors a chance to specify the original source for articles that could be syndicated, scraped, or otherwise end up all over the web. It's a tag that can be placed on other websites, but point back to yours. This could work well, but many larger sites either 1. cannot (or will not) accept a canonical tag pointing back to your website, or 2. insert their own canonical tag pointing to their own site! What does Google do when two canonical tags are encountered for the same content? Revert to looking at authority – and the smaller site loses out. If using business2community.com or LinkedIn to syndicate your content, your own site/blog is likely to lose the authority test!

Syndication used to be much easier. In the "old days", the deal was that if you gave my site unique content, I gave you a link. In 2013, you could still get the link, but it might be nofollow. In 2014, the deal is that you probably do not even get the canonical tag. What to do?
Syndicating your content can provide amazing exposure for your business. Don't walk away from syndication, but use it in a way that will not harm your own rankings.

1. Ask About Policies Regarding the Canonical Tag

Some sites, such as business2community.com and linkedin.com, do indeed want to place a canonical tag pointing to their own URL as the one true source of the content.

2. Post Unique Summaries on Syndication Sites

Everyone wants unique content, so give it to 'em – just do it in summarized form. Post the long, full version of your article on your own website, with a summary or intro on the syndication websites. Both locations should have canonical tags and unique content. In this case, linkedin.com might have a canonical tag pointing to its own page, but it will be the only place that unique content is located.
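The canonical tag described above is a single line in a page's <head>. For example (the URLs are placeholders), a partner republishing your article would ideally point back to your original like this:

```html
<!-- On the syndication partner's copy of the article -->
<head>
  <link rel="canonical" href="https://www.example.com/original-article/">
</head>
```

When the partner won't add this tag pointing at your site, that is exactly the case where posting only a unique summary with them protects your rankings.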