4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable. Here are four of them.

1. Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and now places up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive only 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

2. Search Volume has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

3. Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it's a mystery as to why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects. We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back. (A minimal illustration follows this post.)

– Certain browsers block information. There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

4. Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
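To illustrate the redirect issue above: a meta refresh is one common redirect type that often arrives with no referrer, and browsers drop the referrer entirely when a redirect hops from HTTPS to plain HTTP. A minimal sketch, using hypothetical example.com URLs:

  <!-- Hypothetical redirect page on https://old.example.com -->
  <!-- Meta refresh redirects like this one often pass no referrer, and the -->
  <!-- HTTPS-to-HTTP hop drops it by design, so the landing page's analytics -->
  <!-- records the visit as "direct" rather than organic. -->
  <meta http-equiv="refresh" content="0; url=http://www.example.com/landing-page">

A server-side 301 redirect that stays on HTTPS typically preserves the referrer, and is generally the safer choice.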

Speed is Everything

Page loading speed has great importance with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient mobile users are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer" – 30% of pages could save more than 250KB that way. (A sample server configuration follows this post.)

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least they were finally explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a great signal that it needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as loading time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: When Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity. But if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites can often need more crawl budget. (See the robots.txt example after this post.)

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google's newest tool: Test how mobile-friendly your site is.
– GTmetrix.com features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
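As mentioned under "Rise of mobile", text compression is often a quick win. A minimal sketch for an Apache server, assuming mod_deflate is available (other servers, such as nginx, have equivalent gzip directives):

  # Enable gzip compression for common text formats (Apache, mod_deflate)
  <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
  </IfModule>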
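And for reference, the crawl delay discussed under "Crawl rate" looks like this in robots.txt. Treat its presence as a symptom to investigate rather than a fix – and note that Googlebot ignores the Crawl-delay directive anyway (Google's crawl rate is set in Search Console):

  # robots.txt – asks compliant bots to wait 10 seconds between requests.
  # If your server needs this to stay up, page speed is the real problem.
  User-agent: *
  Crawl-delay: 10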

Google Analytics Doesn't Provide All of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place

There are many legacy variations of the analytics snippets. In addition, what was the correct installation a couple of years ago may have dramatically changed, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js for their Google Analytics, which are quite a few years old. The best place – currently – to have your analytics code is inside the <head> tag, right before it ends with the </head> tag. This will prevent interference from other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more. (A sample placement follows this post.)

2. Filters

Your filters could have been created years ago and for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters. We usually name this profile with RAW in the name. This system allows anyone to easily see if a filter has "gone rogue" and is filtering out good traffic.

There are also these problems with getting good data, and you did not even cause them:

1. Incomplete data / views

Most businesses are using the free version of Google Analytics, and sometimes experience "sampling" in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of the complete data set, while returning those results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords

Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam

In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – spammers are inserting all sorts of things into your analytics. Some of the offenders might put "Vitally was here" in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you've never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and create a custom analytics segment to filter them from reports. Want to try our segment? Click this link and scrub the spam from your own view of your account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the Internet too, but we have customized this one for our clients.)
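For reference, here is the snippet placement described in item 1 above: Google's standard asynchronous analytics.js snippet, placed just before the closing </head> tag. The property ID UA-XXXXX-Y is a placeholder; use your own:

  <head>
    <!-- ... other head elements ... -->
    <!-- Google Analytics: async analytics.js snippet, just before </head> -->
    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
    ga('create', 'UA-XXXXX-Y', 'auto');  // your property ID goes here
    ga('send', 'pageview');
    </script>
  </head>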

Preparing For SEO in 2017

Every year brings new SEO challenges and surprises. The year 2017 won't be any different, but we do expect these topics to be important considerations in the new year:

Interstitials / Popups on Mobile Devices

We've all seen mobile sites with a popup covering the content we were trying to read. These popups will be punished by Google in early 2017. Like ads above the fold, Google feels these popups harm the user experience – and they do not want to send visitors to such sites. Many survey and tool vendors such as Ometrics and SurveyGizmo have been proactive in making sure their clients are not at risk, but some vendors may not be aware.

SSL / HTTPS

Google is really pushing SSL, and this is the year they accelerate their plan to make the web secure. Having your entire website served over HTTPS used to be rare; only credit card or health privacy transactions were secured, and even that was spotty. But Google has been campaigning since 2014 to secure everything. Two years ago, Google introduced a rankings boost for sites entirely on SSL. Last year they provided better features in Search Console. And we started to see SSL as a "must have". But progress has been voluntary in many regards, with other business objectives prioritized first. Next year, new developments will force your hand: Come January 2017, the Chrome browser will begin warning users about any site that hasn't moved to HTTPS – starting with pages that have credit card or password fields, and escalating to more dire warnings for insecure sites later in 2017.

JavaScript-based sites

There are many great reasons to use one of the new JavaScript frameworks in a web app or site: They tend to be mobile friendly and give a superior user experience in many cases. You've seen JavaScript search widgets on eBay and Amazon providing "faceted search" – allowing users to easily refine their searches by clicking a few checkboxes. Frameworks needing some help include Angular, Backbone, Meteor, and many of their child/related frameworks. Some frameworks, such as Angular v2, are getting better about being search engine friendly. And Google is crawling ever more JavaScript, but not well from what we've seen. Often, sites need help implementing technologies such as prerender.io. We are seeing more and more of this kind of work, and expect it to accelerate in 2017.

AMP (Accelerated Mobile Pages)

AMP is the super-speedy loading of pages you've likely seen in some mobile results. After you set up AMP on your site, Googlebot places your content on its super-fast servers – while making it look like your URL. AMP was originally just for news sites, but now Google has opened AMP up to other sorts of sites – and 700k+ sites have been using it! If mobile traffic is important to your site, AMP will likely become vital over the next year. (A minimal linking sketch follows this post.)

Schema

Google just loves schema. We've seen over this last year how schema has helped increase pages indexed, and we expect it to play a greater role every year. As artificial intelligence is used more and more in the "RankBrain" algorithm, sites that can be easily categorized by Google will receive more visibility. I for one welcome our new overlords... subject to future review. (A sample markup block also follows this post.)

Backlinks

Links are still an important part of Google's algorithm. But sustainable, authentic link earning is always the best long-term approach in link building. So how can you get these links?

1. Content marketing: Produce great content, and reach out to authority sites and influencers in your space.

2. Business development link building: All of those traditional activities such as sponsoring a baseball team, joining the chamber, or participating in online communities/forums are actually great ways to get links.

3. Publicity: Publicity is that powerful branch of public relations that provides links and visibility from media sites.

These methods of earning links have the best long-term potential, and are quite powerful for building and keeping rankings.

More effort

Shrinking organic visibility (more ads at the top), increased competition, and the ever-changing nature of organic search require more effort than ever. Gone are the days of getting your site "SEO-ed" and expecting free traffic. All traffic is either earned, or easily taken away. May you experience a great new year with SEO!
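To illustrate the AMP item above: an AMP page is a parallel, stripped-down version of a page, and the two versions point at each other with link tags. A minimal sketch with hypothetical example.com URLs (the AMP page itself also needs the required AMP boilerplate from ampproject.org, omitted here):

  <!-- On the regular page: advertise the AMP version -->
  <link rel="amphtml" href="https://www.example.com/article.amp.html">

  <!-- On the AMP version: point back to the regular page -->
  <link rel="canonical" href="https://www.example.com/article.html">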
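And to illustrate the schema item: one easy way to add schema.org markup is a JSON-LD block in the page head. A minimal sketch describing a hypothetical organization:

  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png"
  }
  </script>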

Penguin 4 has Arrived: What We Know

It's been 2 years since the last Penguin penalty update. The Penguin penalties were known to destroy site traffic by placing sites that were formerly on page 1 onto page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels, devastating revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which appeared to get increasingly harsh. Here is what we know so far:

1. Google still cares tremendously about links

We've been expecting Google to use social media at some point for authority, but instead they keep using links as a powerful part of their algorithm. Looking at the amount of processing power, education, penalties and heat they have taken... well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors they may (or may not) take into account.

2. It's now "realtime"

As Google discovers links to your site, they will be judged as good, bad or somewhere in between. Rankings will fluctuate accordingly. This system is long overdue: Previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, as years of lost traffic were particularly painful.

3. Realtime can mean weeks

A few people have done the math and research in this Quora thread, and it sounds like "realtime" will mean a few weeks.

4. Penguin penalties will now be at the page level, not the site level

Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair, and we saw several clients over the years being penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don't count

This is a return to the "old days", simpler times when webmasters didn't have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way that low-quality links provided a "double whammy" to rankings: They stopped boosting rankings, and also penalized the site.

6. Disavow files are still recommended

Google still recommends that the disavow file be used. It helps Google identify low-quality sites, and it offers protection against a "manual penalty", where a human at Google has specifically penalized your site. In that case, a disavow file can show that you are trying to distance your site from its bad links. (A sample disavow file follows this post.)

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We'll keep you updated!
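For reference, a disavow file is a plain text file uploaded through Google's disavow tool, with one URL or domain per line. A minimal sketch with hypothetical domains:

  # Disavow file – lines starting with # are comments.
  # Disavow every link from an entire domain:
  domain:spammy-directory.example
  # Disavow a single linking page:
  http://low-quality-blog.example/comment-page-3/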

Kick-Start Your SEO in 2015

The search engine optimization (SEO) industry has certainly evolved these last few years. The many Google updates – and their sometimes heavy-handed penalties – in addition to an explosion of mobile traffic have shaped the rules for SEO and online marketing. When we look at what's working at the end of 2014, we see just how much everything has changed. Big changes in SEO will certainly continue through 2015 and beyond. Here are six things to focus your efforts on in 2015:

1. Mobile

If you haven't already, it's time to take a mobile-first approach with responsive website design. As mentioned in last month's blog all about mobile, Google has a new tool (and new expectations) around mobile friendliness. Test your site here: https://www.google.com/webmasters/tools/mobile-friendly/

2. Rich Snippets

These underlying webpage code elements help Google and other sites understand when to show review stars, customized descriptions, and more – all of which are vital to your site's ranking and click-through rate. Consider: A study last year showed an average rankings increase of 4 positions when rich snippets were implemented. In one case study, 30% more visitors clicked through from search results to a site with rich snippets. John Mueller of Google recently requested that examples of rich snippet "spam" in Google be sent directly to him. It must be working, and it must be valuable, if Google is looking for spam! There are many examples of different rich snippets at http://schema.org, a site and format created by Google, Yahoo and Bing. Some types include recipes, products, events, locations, people, ratings, etc. Other formats are also being provided by social media sites: Facebook Open Graph tags, LinkedIn cards, Twitter cards, and even Pinterest pin cards. When Twitter is given data in a Twitter card format, they provide a much richer experience for viewers of that tweet – noticeably better than a standard tweet. And there are many different types of Twitter cards too: galleries, large images, video players, etc. (See the sample tags after this post.)

3. Universal Analytics

Google Analytics is finally getting an upgrade. In the past, data about site visitors was lost if they visited several of a brand's website properties, switched devices, or had an extended period of time between visits. Universal Analytics fixes that and even allows custom dimensions, as well as extreme customization. The system came out of beta testing in 2014, and will be a requirement at some point. Is it on your radar to transition? If not, better get to it! Google will not be providing new features to regular Analytics, and will eventually force webmasters to make the switch.

4. Link Disavowal

Google's Penguin penalty has made this a necessity. Do you know where your site has links? Most webmasters do not. And many links that were key in the past must now be disavowed in Google's Webmaster Tools. That is the price we pay for Google's ever-changing formula! Here are some possible sources of problematic links:

– "Site wide" footer links: Are other sites linking to you from every page or in their footer? Google no longer sees this as a positive thing.

– Links from 2004-2012: If your SEO plan included creating links during this period, you should get a link analysis performed. Even if Google's guidelines were being followed, it's vital to make sure these links are still the kind Google wants to see.

– Low quality links: You know these when you see them. Would you visit the site a link is on? Does Google still see any authority there? These are important considerations for your links!

– Links from penalized sites: Sites that were once in Google's good graces might now have switched hands or been penalized.

– Negative SEO: SEOs used to debate whether any site's rankings could be hurt from the outside. Now, it's commonly accepted that negative SEO is possible and happening throughout the web. Some sites are building low quality links, links on penalized sites, etc. pointing to competitors' websites!

5. Migrate Your Site to HTTPS

Are you planning to migrate your entire site to HTTPS? Recent thoughts from Google are making this a more important consideration! A member of the Google Chrome browser team recently commented that anything less than HTTPS is like leaving the front door unlocked. On the search side, HTTPS has been identified as a minor ranking signal – and migrating your site should be considered. Be sure you don't create duplicate content by accident, though! (A sample redirect rule follows this post.)

6. Use Content Marketing for Link Authority

Content marketing is the new link building. It's authentic marketing that can also boost your site's rankings (but it must be done with an emphasis on quality outreach). When done correctly, content marketing brings:

– social sharing
– brand visibility
– inbound links (with authority)
– referral traffic

Search engine optimization will always be ever-changing: Technology is moving at breakneck speed, and search engines have ever-changing criteria and expectations. Having these six items on your radar will help carry you nicely into the new year. And then some. The year 2016 may be completely different, but these are good, solid investments of time and money.

Need a good interactive agency or website design firm? We've worked with many and partnered with the best. Talk to us about your needs, and we'll introduce you to the right match!

PSST! Need a Free Link? Would you like our monthly take on the changing world of SEO delivered to your inbox? Subscribe to the Hyper Dog Media SEO Newsletter HERE! When you subscribe, each newsletter will contain a link idea for your business!
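To illustrate the Twitter cards mentioned in item 2: cards are driven by a few meta tags in the page head. A minimal sketch of a summary card, with hypothetical values:

  <meta name="twitter:card" content="summary">
  <meta name="twitter:site" content="@examplebrand">
  <meta name="twitter:title" content="Six SEO Priorities for 2015">
  <meta name="twitter:description" content="Mobile, rich snippets, link disavowal and more.">
  <meta name="twitter:image" content="https://www.example.com/preview.png">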
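And for item 5: the usual way to avoid duplicate content during an HTTPS migration is a site-wide 301 redirect from HTTP to HTTPS. A sketch assuming an Apache server with mod_rewrite enabled; other servers have equivalents:

  # .htaccess – send all HTTP requests to the HTTPS version with a 301
  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]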

“How do you write great title tags and meta descriptions?”

[Updated Nov 1, 2016] "How do you write great title tags and meta descriptions?" That is the question that clients ask me most frequently. And it's a complicated question, for sure! There are several components to writing great titles and descriptions, but there are also a few specifications that each company will want to consider for themselves. I'll address the considerations first.

The goal is to write title tags that are Google-bot-pleasing, but you also want titles and descriptions that are functional and helpful to the human visitors to your website. This can be tricky, because the approach differs when writing for bots versus humans. My best advice: somewhere right in the middle is your best bet! Write naturally and use the same voice that you are using in your page content, but include keyword phrases that are specific to the page.

Title tags must fall in a range of characters, but also need to fall into a size range to appear complete in Google search. This size range has to do with the number of pixels that a title tag takes up on the page. For example, if you've got a title tag with a couple of w's in it, that will take up far more space than a title with several lowercase l's and i's. Just look at this spacing difference: www lil. The three skinnier letters take up about as much space as one of the w's! Why does this matter? Well, in Google search results, you are allotted a specific amount of space for the title of your page. This went into effect in early 2014 when Google updated its search results page. There was another update to the format of Google's search results in 2016. Now, search results have a bit more space on the page. Yay! But wait, there are also some other things to consider: how many words you use, where the break might show up in those words (if you use too many), and the fact that Google is now appending the brand name to the end of the title tag in some cases. You want your page titles to appear complete in the results, while getting the most out of this limited space. Unfortunately, this all makes it really tricky to say that there is a specific number of characters you should use for each title tag. Around 52-55 characters is probably a pretty safe bet, but if you think you might be using a lot of wide characters (or if you test and find that Google is appending your brand name to every title), use a few fewer characters.

Meta descriptions also have a size range that you want to target for full effect in Google search results. Meta descriptions are not used in Google's algorithm, but a good meta description raises your organic click-through rate. Google can tell human searchers are clicking through to your site, and likely takes that into account with your ranking. Google also sees short or duplicate meta descriptions as a site quality issue – so I guess it is indeed part of their overall formula. Recently, Google has made some changes to how they display descriptions: in some cases, they are chopping up your beautiful descriptions and splicing in bits and pieces of your page content, so they can highlight more of the search terms a user typed into the search bar. In addition, Google will sometimes add a date to the beginning or end of the description field in search results. Considering all of this, however, I still recommend meta descriptions of between 139 and 156 characters. They seem to work best, no matter what Google decides to do with them.

Again, strive to convey your message to human visitors with your natural writing style, but include those keyword targets specific to the page. When writing meta descriptions, entice users to click on your search engine result by listing benefits and a call to action. In addition, the meta description should be different for each page of your website. (A short markup example follows this post.)

I have written a plethora of title tags and meta descriptions for a wide range of clients, and what I've learned is that if you are organized and set up systems, even the largest websites can have all new titles and descriptions before you know it. I recommend setting up a spreadsheet with columns for old title, new title, character count, old description, new description, and character count. Once you get used to using the spreadsheet, you can set the width of the columns to help guide you to the right size while you are writing.

If you are still feeling overwhelmed about getting your titles and descriptions in order, just give me a call. I've just about got it down to an art, and I've also got a few tools in my tool belt that can automate some of the process that may be bogging you down. I'm here to help! Questions? Shoot me an email or a message at @jannavance on Twitter. Good luck!
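For reference, both tags live in the page's head. A minimal sketch with hypothetical content, sized to the ranges suggested above:

  <head>
    <!-- Title: page-specific keywords first, brand last (aim for ~52-55 characters) -->
    <title>Handmade Dog Collars in Denver | Example Pet Co</title>
    <!-- Description: benefits plus a call to action, unique per page (~139-156 characters) -->
    <meta name="description" content="Durable, hand-stitched dog collars made in Denver. Free local pickup and a lifetime repair guarantee. Browse the full collection online today.">
  </head>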

The Walking Dead, Google Authorship Edition

Summary of Search

Google recently announced the end of Google Authorship, a feature the SEO community thought might become a major part of Google's ranking formula. With Google Authorship, photos of writers were shown in Google's search results – when rel="author" and rel="me" tags were embedded pointing to their Google+ profile. (The markup is shown after this post.) In December 2013, Google reduced the number of authorship photos showing in its search results. Then photos were removed altogether in June. And finally, Google completely removed Authorship from its search results last week.

Low Adoption Rates by Webmasters and Authors

Authorship was sometimes difficult to implement, and not appropriate for all sites. Many brands didn't feel a person's photo was the best representation in Google's search results.

Provided Low Value for Searchers

Some studies showed an increase in click-throughs for listings with Google Authorship. But Google found users were often being distracted from the best content.

Snippets that Matter

Google's representative John Mueller did provide Google's future direction: expanding support of Schema.org. "This markup helps all search engines better understand the content and context of pages on the web, and we'll continue to use it to show rich snippets in search results." The rich snippets for "Person" and "Organization" are certainly something to include where possible/applicable.

Implications for Google+

Google+ adoption is well below expectations, especially considering the tie-in with popular services such as Gmail and YouTube. Google Authorship was also tied in, and was meant to improve rank in search results for those producing great content. With the death of Google Authorship, it looks like one more "nail in the coffin" for Google+.

Are Authors Important?

Some interesting bits of information have been given away by Google. Amit Singhal, the head of Google Search, said that Author Rank was used for the "In-depth articles" section – which appears in 12% of Google's search results. Google has also long been able to read bylines: These were used before Google patented "Author Rank" in 2007, are more naturally included where applicable, and are likely to continue being used.
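For the record, one common pattern for the now-retired Authorship markup looked like this. The profile URL is a placeholder:

  <!-- On the article page: link the byline to the author's Google+ profile -->
  <a href="https://plus.google.com/112345678901234567890" rel="author">Jane Author</a>

  <!-- On the author's own bio page: claim the profile as their own -->
  <a href="https://plus.google.com/112345678901234567890" rel="me">My Google+ profile</a>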

Doing the Pigeon (Update)

Last month, Google rolled out one of its largest local search updates in quite some time. Since Google didn't name the update, Search Engine Land named this one the Google Pigeon Update. It's seemingly unrelated to Google's PigeonRank, an April Fools' joke from back when Google did good and funny things. This update does not penalize sites, but it does change how local results are shown:

– Fewer queries are generating a map listing / "local pack".
– More traditional SEO signals are used, such as title tags and quality inbound links.

Some interesting things are happening with this update:

– When a query includes the word "yelp", those listings on yelp.com are back at the top. This fixes a recent bug.
– Web design and SEO companies are getting shown in local queries again!

If you depend on local traffic, hopefully your results weren't negatively impacted by the update. The best approach for local visibility includes these tasks:

– Make sure to update and create local directory listings on authority sites such as Yelp.
– Use the highest quality photo on your Google+ business profile, and get more reviews. You might make it into the Carousel listings at the top of Google for some queries.
– Make sure your business Name, Address and Phone (NAP) are consistent on your site, Google+ business page, and local directories. (A markup example follows this post.)
– Be sure your city/state is in your site's title tags.

PSST! Need a Free Link? We'd like to help you promote your own business, hoping more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE! Each newsletter contains a link idea for your business – and those sites often provide an excellent backlink. You may even get human visitors, website projects and new partners. Now THAT's business development link building!
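To illustrate the NAP consistency point above: one way to state your name, address and phone unambiguously for search engines is schema.org LocalBusiness markup. A sketch with hypothetical business details:

  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-303-555-0147",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Denver",
      "addressRegion": "CO",
      "postalCode": "80202"
    }
  }
  </script>

Whatever format you use, the name, address and phone here should match your site's footer, your Google+ business page, and your directory listings exactly.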

Spam-Fighting Always Continues – December 2013 Summary of Search

Google's Matt Cutts promised a month free of major updates, but added that "spam-fighting always continues." Indeed, there were some complaints from webmasters around the 17th and 19th that could have been Google taking out another link network.

This month, Google made an example out of Rap Genius. The site was offering traffic in exchange for blog links: To participate, you had to link to their Justin Bieber page (and somehow feel good about yourself), then send them the link. Rap Genius would then tweet your link to their followers, sending traffic to your blog. Google caught wind of the link scheme, and severely punished Rap Genius in the rankings. The moral is that Google will always, usually, catch you!

So how do you invest in search engine traffic for the long term?

1. Create Content

Google wants compelling content: images, blog posts, videos, podcasts, surveys and more. Good content is long (1,000 words plus for articles) and holds your visitor's attention. Google does not want visitors leaving the site quickly (but will probably forgive it if it's an ad click!).

2. Tag Your Content

Search engines are getting better at understanding what we humans create on the internet. But communicating directly with "search engine bots" has never been easier. These technologies could be better implemented on almost every website:

– Internal linking structures
– Sitemap.xml (a small example follows this post)
– Title tags
– Meta descriptions
– Rich snippets
– Authorship

3. Get the Word Out

Content outreach and marketing have never been more important. Content today is where websites were in 1998: Many build, and then are disappointed at the results. Good content competes against a dizzying array of distractions in an always-connected world, and must be actively marketed – even AGGRESSIVELY marketed – to make an impression. Content must be spread via social media (especially Google+), and marketed specifically for links. These "earned links" – outreach done for the purpose of links – are a wonderful way to promote your content. As a bonus, this promotion of content will also promote rankings!
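As a small example of tagging content for bots, here is a minimal Sitemap.xml in the standard sitemaps.org format, using hypothetical URLs:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2013-12-01</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/blog/</loc>
    </url>
  </urlset>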