The only constant in Organic Search is change

October 2012 was another busy month for Google. The search giant started the month by announcing 65 changes made during August and September. Google also pushed out a new Penguin update (v3) on October 5 – these Penguin updates penalize the overuse of keywords both on a website and in links. We have had a few clients with really bad – and sometimes profane – links. They may consider Google’s newly released disavow links tool, but we recommend caution with it right now: some SEOs are speculating Google may treat a disavow file as a confession!

Information also came out early in the month about Google penalizing domains that are more “keyword rich” than authoritative. This update (called EMD, for Exact Match Domain) is hitting domains like cheap-flights-from-denver.com, which would have been favored in the past for searches like “cheap flights from Denver”, but no longer. Authoritative sites were not hit, though: ski.com still ranks #1 for “ski”. Google also updated its penalty for “Top Heavy” sites – those with too many ads at the top of the page.

Highlights of Google’s 65 recent changes include:

1. Changes to titles and snippets. Google increasingly treats robots.txt directives and the title and meta description tags as “suggestions” from webmasters. Sometimes this can be helpful – such as when titles contain “comments on” or other generic phrases. Other times, Google’s choices may directly conflict with the choices the webmaster has made.

2. Google is matching more related terms and expanding its autocomplete suggestions. A search for “telecom provider” returns results where the term “carrier” is bolded as well as “provider”. Google is getting smarter, and it’s a good time to diversify keywords!

The Google webmaster guidelines were also updated this month, reflecting the move away from counting links from low-quality directory and bookmarking sites. There wasn’t much news for Bing this last month, but a recent report from antivirus vendor Sophos found that Bing search results contained more than twice as many malware-infected pages as Google’s search results (whose share was still a hefty 30%).
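Since Google now treats titles and descriptions as suggestions, it pays to make them specific enough that Google has no reason to rewrite them. A minimal sketch of the two tags in question – the page and business name are hypothetical:

```html
<head>
  <!-- A specific, non-generic title and description give Google
       less reason to substitute its own snippet text -->
  <title>Cheap Flights from Denver | Example Travel Co.</title>
  <meta name="description" content="Compare today's fares on nonstop flights departing Denver International Airport.">
</head>
```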

Google My Business for Your Business

Businesses thrive when they have an effective way for customers to find them on Google – the search engine most frequently used by your future customers. One of the most effective ways to ensure your business is found is with citations. A citation is any mention of your business online. A structured citation is a mention of your business on a directory such as Google My Business or Yelp; an unstructured citation is your business information (NAP: Name, Address, Phone Number) outside a business directory – anything from an article about your company to a mention of your business on a vendor’s website. Citations are important for Local Search because they give search engines your business information from across the internet. If you want to rank in Google’s Map Pack, start by making sure your Google My Business (GMB) listing is properly optimized and maintained. Here’s how:

Who Owns Your Listing?
If you’re not sure who owns your GMB listing, or you don’t remember which email address you claimed it under, don’t worry! Create a Google My Business account and search for your business listing. If it’s already claimed, you can request ownership at this point; if it isn’t claimed, you can request a postcard be mailed to your business address to verify you are who you say you are, and this business is indeed yours.

Search For Duplicate Listings
SEOs know there is nothing search engines hate more than an incorrect NAP on a citation listing. Second to that would be duplicate listings. Google looks at duplicate listings for a business, picks the one it likes best (regardless of whether the information is correct) and shows that listing in search. So how can you tell if you have a duplicate business listing in Google? It’s simple: search for your address and select the Maps results. This will show every Google My Business listing for that location. If you see a duplicate of your business, you can claim that listing and merge it with the correct one.

Is Your Map Marker Correct?
There is nothing more frustrating as a user than finding incorrect information on a business listing: wrong hours of operation, a listing for a business that is no longer at the address, and the dreaded misplaced map marker. When users are getting directions to your business, they’ll often look at your listing in Maps to see where you are located – and if they’re familiar with the area, they may skip the directions altogether. Make sure your map marker is in the right place by updating the address in your listing information and moving the marker to the correct spot. Trust me, your users will appreciate it!

Optimize Your Listing
Optimizing your Google My Business listing is a lot easier than it sounds. Make sure your business name, address, and phone number are correct and, as covered above, that your map marker is in the correct place. Additionally, select the correct business categories so users know exactly what your business does. Make sure your hours of operation are correct, and add special hours for holidays so your customers know when they can and cannot reach you. Add images of your business so users will know when they’ve found you, and add attributes so they know what features you offer! GMB also recently brought back the description section, so you can tell users more about your business.
Be careful, though; getting too crazy with keywords can cause Google to hide your listing in search. A good rule of thumb is to leave no field blank, but to keep your listing as organic as possible!

Keep Responding To Your Reviews
This is arguably the most difficult part of maintaining a Google My Business listing. Fortunately, every time you get a review, Google will email you at the address under which you claimed your listing. Many business owners find this task daunting, especially if they are getting negative reviews. Think of it this way: you can’t make every customer happy. Users know that, and typically find businesses with 100% five-star reviews untrustworthy. Negative reviews are a normal part of doing business, and responding to them shows you care about customer service. Google My Business is a platform where unhappy customers come to express their frustrations with your business, and how you respond says a lot about you. Don’t offer coupons or discounts to lure customers back; instead, express your concern and give them a phone number or email address to contact you directly to resolve the issue. This turns a negative review into a positive experience and shows Google you’re interacting with your customers, which helps boost your rankings. It’s a win-win!

User Suggested Edits
Google allows users to suggest edits to business listings directly from search. This means that if I know a business location offers bathroom access, has a different phone number, or is missing a suite number, I can suggest the update directly from search. When you log into your GMB account, you’ll find a yellow banner across the top of your listing prompting you to approve user-suggested edits for your business. Sometimes Google will publish these edits if they go unapproved by the business owner, or if the listing is unclaimed. So it’s very important for business owners to check in on their listing frequently to make sure its information isn’t being changed.

Making sure your Google My Business listing is properly optimized (and stays that way!) is the first step to achieving local search rankings. Google My Business is just one piece of a very extensive puzzle, but once you master your GMB listing you can easily begin claiming and optimizing the rest of your citations.
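A good companion to citation cleanup is publishing your NAP in structured form on your own site, so search engines see the exact same name, address, and phone number everywhere. A minimal schema.org sketch – the business details are hypothetical – that could sit in your contact page’s HTML:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "telephone": "+1-303-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St, Suite 4",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  }
}
</script>
```

Match the NAP here character-for-character to your GMB listing and your other citations.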

6 Changes in Google Search

Google has made many changes over the years, other engines have followed suit, and SEO has evolved along with these changes. Consider these 6 ways Google has changed over the last several years.

1. More pages are not necessarily better
Google used to reward what would now be considered duplicate content. Endless search-results pages, doorway pages, and many other techniques of the past are easily detected by the modern Googlebot. In today’s world, these techniques can be ignored, or even penalized. Where quantity once ruled supreme, quality now does. Many sites are pruning, combining, or redirecting the flood of URLs from the old days. If you are tempted by these old techniques, consider that you will likely have to undo the changes.

2. CSS and JS should not be blocked
It used to be a best practice to block Google from JavaScript and CSS resources, as they could otherwise show up in the index – and to have those as landing pages was just horrible. But modern Google is very smart: it wants access to everything and needs everything to fully render the page. By accessing these resources, Google analyzes mobile friendliness, speed, layout, and many other factors. (A robots.txt sketch follows this article.)

3. Get only good links
From the start, Google has weighed links very heavily. SEOs used to be able to get websites to rank without even improving the site! And in the old days, any link helped – at worst, a bad link was simply disregarded. In modern Google, links should come from the best sources. Links from penalized, unimportant, or even brand-new sites are risky and can now cause a Google penalty. A typical link profile contains a mix of link types, and the ratios should be monitored – some low-quality links are best disavowed, and a high ratio of any one type can be a red flag to Google. It’s best to invest your time in getting the best links.

4. Google wants to understand you
Google wants to understand concepts better, and wants to understand you better, too! With the advent of Hummingbird and RankBrain, Google is getting smarter and smarter. Hummingbird was Google’s update to help with classifying content. RankBrain is an artificial-intelligence update that helps Google understand what sort of results a given query would like to see. Consider that these similar queries are actually quite different:
https://www.google.com/search?q=windows+update
https://www.google.com/search?q=windows+replacement
Think about your prospects’ most important queries driving your traffic. Are you delivering what they are looking for?

5. It’s not just 10 blue links
Google has made many changes over the years, and what began as a simple list of 10 blue links has evolved into a wide variety of possible results. Results can now include answers, cards, carousels, images, videos, and more. And voice results are becoming increasingly valuable for some queries. Getting to “number one in Google” isn’t quite what it was: number one might sit below a block of images or an answer box. The modern approach is key to being successful in today’s Google. Images should be named, tagged, and captioned appropriately. Schema should be used to help Google understand and classify your content and even your site. For those who commit to helping Google understand their content, the reward is visibility in a multitude of ways.

6. Keywords? Not provided
In the old days, it was easy to see what keywords your prospects were using to find your site. But since “(not provided)” replaced keyword data in analytics, there have been some big changes.
Many sites were over-optimized in the old days anyway. The new approach isn’t spammy; instead, it is about being more relevant. In the old days, you could target a broad phrase by using it multiple times, with a heavy dose of anchor text. In modern times, it’s important to “talk around” any broad phrase. If you want to be relevant for “Blue Widgets”, you must be relevant for as many aspects of Blue Widgets as possible. Consider what questions prospects are asking, what information or media exist around Blue Widgets, and so on.

In your SEO approach, always keep in mind that Google has changed quite a bit over the years. Yesterday’s approach was for yesterday’s Google. Bing and the other remaining competitors will keep changing, trying to catch up to or outdo Google’s innovations. To ensure your success, make sure your approach is in line with Google’s ongoing changes.
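On the CSS/JS point above: if an old robots.txt is still blocking resource folders, unblocking them is usually a one-line fix. A sketch with hypothetical paths, showing the before and after:

```
# Old practice (now harmful): hiding resources from crawlers
#   User-agent: *
#   Disallow: /css/
#   Disallow: /js/

# Modern practice: let Googlebot fetch everything needed to render the page
User-agent: Googlebot
Allow: /*.css
Allow: /*.js
```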

4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable.

Four Ads at the Top
In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google’s top 5 positions:

1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we’ve seen for organic traffic. It is also possible to “cannibalize” your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume has Decreased
Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don’t forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic
There are a few ways organic traffic can show up as direct traffic. If it’s a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects. We’ve seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be lost when redirects are done via programming languages, or in a redirect chain that crosses from HTTPS back to HTTP (browsers strip the referrer in that direction).

– Certain browsers block information. There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products
For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google’s index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared, or linked to. So don’t stop pruning your site.

These four situations explain the cases we’ve found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic. Be sure to check this list next time you find yourself wondering, “Where did all of the organic traffic go?”
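On the redirect point: one common fix is to serve the whole site over HTTPS and redirect in a single server-side hop, since referrer data tends to survive a clean 301 far better than script-based redirects or chains that fall back to HTTP. A sketch, assuming an Apache server with mod_rewrite enabled:

```
# .htaccess: force HTTPS in one 301 hop
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```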

Speed is Everything

Page loading speed has great importance with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile
The importance of mobile can be seen in Google’s announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week on just how impatient they are:
– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take an average of 10 seconds to load.
There are many easy changes available, as the answer isn’t always purchasing a faster web server. Google’s own analysis found that simply compressing images and text can be a “game changer” – 30% of pages could save more than 250KB that way.

2. Ranking factor
A few years back, Google made page speed a small ranking factor – or at least became explicit about it being one. Since page speed issues aren’t given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the “long list” of items to fix. Its addition as a ranking factor is a strong signal that it needs to be prioritized.

3. Bounce rate
Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn’t increase the duration of site visits; it just makes people angry. According to Google’s analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that “engagement metrics” such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, it knows human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate
In one recent test, increasing page speed across a site dramatically increased the site’s crawl budget. Slower sites can be overwhelmed by crawl activity – but if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites often need more crawl budget.

Tools and Fixes
Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site’s images. Some items will not be worth fixing, so concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:
– Google’s newest tool tests how mobile-friendly your site is.
– GTmetrix.com features include a “waterfall” showing which page items load at which stage, history, monitoring, and more.
Good luck and enjoy optimizing the speed of your site!
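As a starting point for the compression fix mentioned above, text resources can be gzipped at the web server. A sketch, assuming Apache with mod_deflate available:

```
# Compress common text resources before sending them over the wire
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
</IfModule>
```

Images still need to be optimized separately (in Photoshop or an image optimizer), since they are already compressed binary formats.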

Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place
There are many legacy variations of the analytics snippet. In addition, what was the correct installation a couple of years ago may have changed dramatically, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which is quite a few years old. The best place – currently – for your analytics code is inside the <head> tag, right before the closing </head> tag. This prevents interference with other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more. (A sketch of the placement follows this article.)

2. Filters
Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters; we usually put RAW in its name. This makes it easy for anyone to see whether a filter has “gone rogue” and is filtering out good traffic.

There are also these problems with getting good data that you did not even cause:

1. Incomplete data / views
Most businesses use the free version of Google Analytics and sometimes experience “sampling” in important reports. Sampling in Google Analytics (or in any analytics software) is the practice of selecting a subset of your traffic data and reporting on the trends detected in that sample. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to analyzing the complete data set, while returning results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords
Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It revealed quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably haven’t missed what you never had. But if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam
In addition to losing good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – spammers insert all sorts of things into your analytics. Some offenders might put “Vitally was here” in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you’ve never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and we have created a custom analytics segment to filter them from reports. Want to try our segment?
Click this link to add the segment and scrub the spam from your own views: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the internet too, but we have customized this one for our clients.)
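For the snippet-placement issue in point 1, here is the standard asynchronous analytics.js snippet placed just before the closing </head> tag (substitute your own property ID for the placeholder):

```html
<head>
  <!-- ...other head tags... -->
  <script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXX-Y', 'auto');  // placeholder property ID
  ga('send', 'pageview');
  </script>
</head>
```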

5 Vital Steps Toward Google’s “Mobile First” Indexing

“Mobile is exploding,” said every headline for the last decade. Google is all about traffic, and mobile is both the largest segment of traffic and the fastest-growing! Google’s search results will soon be based on the mobile versions of web pages – including the results shown to desktop users. This holds even if your prospects primarily use desktop (as in manufacturing and a few other industries), desktop drives most of your actual conversions, or you just like the look of your desktop site better.

Up to now, Google has indexed web pages as desktop browsers see them. With the new “mobile first” approach, Google will start indexing web pages as mobile phones see them, and rankings will be calculated from the mobile results. Google says there will be minimal ranking changes, but this is a pretty major announcement. Mobile-friendly sites will likely see minimal ranking changes, but mobile-unfriendly sites are likely to see an increasing loss of visibility. Here are some important tips to make sure you’re ready:

1. Check your mobile rankings, check your risk
Looking at your website’s rankings in Google’s mobile search results gives an indicator of whether your site is vulnerable to losing traffic. It’s only an indicator, however: Google still bases mobile rankings to some extent on crawls of the desktop version of your site. So better keep reading…

2. Be accessible
Some sites hide content behind popups/interstitials. Google is specifically planning to penalize intrusive popups on January 10, 2017. If you have an email subscription popup or survey layer, you may be penalized. We all experience frustration with the ads that come up when we are trying to read a news article. Some vendors, such as Ometrics, have been on top of this since the day of Google’s announcement – make sure all of your vendors are. If you have a separate mobile site, make sure it is crawlable and be sure to register it in Google Search Console! Old best practices – such as blocking the duplicate content on the mobile version of your site – could now kill your traffic. (A markup sketch follows this article.)

3. Be responsive
Responsive design allows for the best (compromise of) user experience across the many mobile, tablet, and desktop displays. It adapts the page and allows a single URL to serve the mobile and desktop versions of the site. If you haven’t changed to responsive design, ask us for a list of great web designers.

4. Be fast
Speed on mobile is quite important. Research has shown that 40% of consumers will leave a page that takes longer than three seconds to load. Wireless connections are usually not nearly as fast as the wired connections desktop users enjoy. Optimizing image file sizes and resolutions hasn’t been this important since the days of the modem.

5. Don’t mess up AMP
Staying ahead of the curve captures the greatest opportunities: being the first among your competitors to implement mobile-friendly, responsive, schema, and AMP technology creates traffic. The period in which your site is in Google’s favor – and competitors are playing catch-up – can mean serious revenue.

With these 5 tips, you will be ahead of the pack (for a short while). As Google implements more changes, search is likely to keep changing at a breakneck pace. Watch your indexing, rankings, traffic, and conversions to keep ahead of the curve.
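For tips 2 and 3 above, the markup differs by configuration. A sketch with hypothetical URLs: a responsive site needs only a viewport declaration, while a separate mobile site should cross-reference its desktop twin so Google can connect the two:

```html
<!-- Responsive site: one URL for all devices -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Separate mobile site: on the desktop page (www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- ...and on the mobile page (m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page">
```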
Oh, and PS: Bing will still use desktop crawling to determine mobile rankings.

Preparing For SEO in 2017

Every year brings new SEO challenges and surprises. The year 2017 won’t be any different, but we expect these topics to be important considerations in the new year:

Interstitials / Popups on Mobile Devices
We’ve all seen mobile sites with a popup covering the content we were trying to read. These popups will be punished by Google in early 2017. As with ads above the fold, Google feels these popups harm the user experience – and it does not want to send visitors to such sites. Many survey and tool vendors such as Ometrics and SurveyGizmo have been proactive in making sure their clients are not at risk, but some vendors may not be aware.

SSL / HTTPS
Google is really pushing SSL, and this is the year it accelerates its plan to make the web secure. Having your entire website served over HTTPS used to be rare; only credit card or health-privacy transactions were secured, and even that was spotty. But since 2014, Google has been campaigning to secure everything. Two years ago, Google introduced a rankings boost for sites entirely on SSL. Last year it provided better features in Search Console, and we started to see SSL as a “must have”. Still, progress has been voluntary in many regards, with other business objectives prioritized first. Next year, new developments will force your hand: come January 2017, the Chrome browser will show warnings for any site that hasn’t moved to HTTPS – initially for pages with credit card or password fields, with more dire warnings for all insecure sites later in 2017.

JavaScript-based sites
There are many great reasons to use one of the new JavaScript frameworks in a web app or site: they tend to be mobile friendly and in many cases give a superior user experience. You’ve seen JavaScript search widgets on eBay and Amazon providing “faceted search” – allowing users to easily refine their searches by clicking a few checkboxes. Frameworks needing some help include Angular, Backbone, Meteor, and many of their child/related frameworks. Some frameworks, such as Angular v2, are getting better about being search-engine friendly. Google is crawling ever more JavaScript, but not well from what we’ve seen, and sites often need help implementing technologies such as prerender.io. We are seeing more and more of this kind of work, and expect it to accelerate in 2017.

AMP (Accelerated Mobile Pages)
AMP is the super-speedy loading of pages you’ve likely seen in some mobile results. After you set up AMP on your site, Google caches your content on its super-fast servers – while making it look like your URL. AMP was initially just for news sites, but Google has now opened AMP up to other sorts of sites – and 700k+ sites are using it! If mobile traffic is important to your site, AMP will likely become vital over the next year.

Schema
Google just loves schema. Over the last year we’ve seen schema help increase pages indexed, and we expect it to play a greater role every year. As artificial intelligence is used more and more in the “RankBrain” algorithm, sites that can be easily categorized by Google will receive more visibility. I for one welcome our new overlords… subject to future review. (A markup sketch appears at the end of this article.)

Backlinks
Links are still an important part of Google’s algorithm. But sustainable, authentic link earning is always the best long-term approach to link building. So how can you get these links?
1. Content marketing
Produce great content, and reach out to authority sites and influencers in your space.

2. Business Development Link Building
All of those traditional activities such as sponsoring a baseball team, joining the chamber, or participating in online communities/forums are actually great ways to get links.

3. Publicity
Publicity is that powerful branch of public relations that provides links and visibility from media sites.

These methods of earning links have the best long-term potential and are quite powerful for building and keeping rankings.

More effort
The shrinking organic traffic (more ads at the top), increased competition, and ever-changing nature of organic search require more effort than ever. Gone are the days of getting your site “SEO-ed” and expecting free traffic. All traffic is either earned, or easily taken away. May you experience a great new year with SEO!
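For the schema point above, the easiest way to hand Google a category for your content is JSON-LD in the page head. A minimal sketch – all of the values are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Preparing For SEO in 2017",
  "datePublished": "2016-12-01",
  "author": { "@type": "Organization", "name": "Example Agency" }
}
</script>
```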

Penguin 4 has Arrived: What We Know

It’s been 2 years since the last Penguin penalty update. The Penguin penalties were known to destroy site traffic by moving sites that were formerly on page 1 to page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels, devastating revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which seemed to grow harsher each time.

1. Google still cares tremendously about links
We’ve been expecting Google to use social media at some point for authority, but instead it keeps using links as a powerful part of its algorithm. Looking at the amount of processing power, education, penalties, and heat Google has taken… well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors it may (or may not) take into account.

2. It’s now “realtime”
As Google discovers links to your site, they will be judged as good, bad, or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, for whom years of lost traffic were particularly painful.

3. Realtime can mean weeks
A few people have done the math and research in a Quora thread, and it sounds like “realtime” will mean a few weeks.

4. Penguin penalties are now at the page level, not the site level
Penguin used to penalize an entire site, impacting rankings for all keywords on all pages. This was horribly unfair, and over the years we saw several clients penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don’t count
This is a return to the “old days”, simpler times when webmasters didn’t have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links delivered a “double whammy” to rankings: they stopped boosting rankings, and they also penalized the site.

6. Disavow files are still recommended
Google still recommends using the disavow file. It helps Google identify low-quality sites, and it offers protection against a “manual penalty”, where a human at Google has specifically penalized your site. In that case, a disavow file can show that you are trying to distance your site from its bad links.

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We’ll keep you updated!
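For point 6, the disavow file itself is just a plain-text list of URLs and domain: lines, with # comments, uploaded through Google’s Disavow Links tool. A sketch with hypothetical domains:

```
# Spammy directories an intruder pointed at the site
domain:spammy-directory.example
domain:link-farm.example
# One-off bad pages can be listed individually
http://bad-neighborhood.example/widgets/links.html
```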

3 Persistent SEO Misconceptions

SEO has had many changes over the years. As marketers and small business owners have worked to understand its many complexities, several misconceptions have persisted.

Misconception #1: SEO is “free traffic”
Many small businesses are interested in SEO because they see it as “free traffic”. Tired of the ever-increasing click costs of PPC, they are drawn to the siren call of a tactic that will bring free traffic – forever. But this is a giant misconception. Search engine optimization was once a simple process of using the keywords your audience is searching for. And that worked fine – until 2001 or so. Now competitors are savvier, and ranking in search engines is more like a horse race requiring effort: server configuration, mobile responsiveness, image optimization, tagging, schema, AMP, plenty of content, and – oh yeah – the content should be interesting.

Misconception #2: SEO is a one-time project
In the old days of websites and SEO, getting your site “SEO-ed” could be a one-time process. While the web has changed substantially – the rules and the competitors keep moving – this view of search engine optimization has persisted. Modern SEO is indeed a horse race, in which competitors must constantly be bettered by:
– constantly adding awesome content
– earning and seeking inbound links
and, we think, probably:
– social sharing
– usability metrics

Misconception #3: High-traffic keywords are the best ranking targets
High-traffic keywords can sound like the best keyword targets, but they are often the worst! High-converting keywords are better in every case. Consider this example: several years ago we received a call from a prospective client who wanted to rank #1 for “Travel”. Wow, I thought: this could be Expedia or Travelocity on the line. But it was actually a Breckenridge condominium property. Competing for rankings for the term “Travel” is a really bad idea for (at least) 4 reasons:
1. People searching for “Travel” do not yet know where they want to go – they aren’t necessarily looking for Breckenridge – and we don’t know if they would want a condo.
2. In a best-case scenario, the site could get to page eight – and that still doesn’t mean any prospects would book a condo. Even page two is a ghost town, with page eight as quiet as deep space.
3. They would be competing at a huge level, way beyond what is necessary to rank number one for “Breckenridge Condo.” It’s crazy inefficient, like investing in a Triple Crown champion when you just need a healthy horse to win the race.
4. In a fantasy universe where a Breckenridge condo site got to number one, it would receive an overwhelming number of bad leads per day. Keyword targets, done right, are also a prequalifying process.

A better approach is for the condo company to first compete for exactly what they are:
“Breckenridge Condo”
“Breckenridge Condominium”
(These are the keywords with the best chance of conversion.) Only then should they look at broader terms likely to have some prospects:
“Breckenridge Hotel”
“Breckenridge Motel”
“Summit County Condo”

This phenomenon isn’t just among condo owners – we all daydream of ranking for something that delivers huge traffic. Instead, focus on what your best customers are typing into search engines – just make sure it has some search volume. SEO has changed much over the years: it has evolved from a one-time process of chasing high-search-volume keywords into an ongoing process of targeting keywords with both real search volume and a high conversion rate.