4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations – or even decreases – while rankings stay stable.

1. Four Ads at the Top

In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and placed up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google’s top 5 positions:

Position 1 – 29%
Position 2 – 15%
Position 3 – 11%
Position 4 – 7%
Position 5 – 5%

The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we’ve seen for organic traffic. (The short sketch at the end of this article shows how the math plays out.) It is also possible to “cannibalize” your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

2. Search Volume Has Decreased

Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don’t forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

3. Organic Traffic Counted as Direct Traffic

There are a few ways that organic traffic can show up as direct traffic. If it’s a mystery as to why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There are a couple of possible reasons:

– Redirects. We’ve seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back.

– Certain browsers block information. There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

4. Decreased Number of Pages or Products

For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google’s index – and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see if a page is being visited, shared or linked to. So don’t stop pruning your site.

These four situations explain the cases we’ve found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, “Where did all of the organic traffic go?”
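To make the math concrete, here is a minimal Python sketch. All numbers are hypothetical, the CTR figures are simply the 2014 study numbers quoted above, and the below-top-5 fallback rate is our own rough assumption:

```python
# Rough, illustrative sketch (not a real traffic model): estimate how many
# clicks a listing loses when extra ads push it down the page.
# CTRs are the 2014 study figures quoted above; treat them as rough.

CTR_BY_POSITION = {1: 0.29, 2: 0.15, 3: 0.11, 4: 0.07, 5: 0.05}

def expected_clicks(monthly_searches: int, position: int) -> float:
    """Estimated monthly clicks for an organic listing at a given position."""
    # Assume roughly 3% CTR below the top 5 (our own placeholder figure).
    return monthly_searches * CTR_BY_POSITION.get(position, 0.03)

searches = 10_000  # hypothetical monthly search volume for one query

# Ranking #2, but four ads push the listing to where position 4 used to sit:
before = expected_clicks(searches, 2)  # what the rank-2 CTR used to deliver
after = expected_clicks(searches, 4)   # what that slot effectively earns now
print(f"before: {before:.0f} clicks/mo, after: {after:.0f} clicks/mo "
      f"({(before - after) / before:.0%} fewer) - same ranking, less traffic")
```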

Speed is Everything

Page loading speed carries great weight with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google’s announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week regarding just how impatient they are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take an average of 10 seconds to load.

There are many easy changes available for sites to make, as the answer isn’t always in purchasing a faster web server. Google’s own analysis found that simply compressing images and text can be a “game changer” – 30% of pages could save more than 250KB that way. (A quick sketch of that check appears at the end of this article.)

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least they were finally explicit about it being one. Since page speed issues aren’t given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the “long list” of items to fix. Its addition as a ranking factor is a strong signal that this needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn’t increase the duration of site visits. It just makes people angry. According to Google’s analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that “engagement metrics” such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site’s crawl budget. Slower sites can be overwhelmed by crawl activity, but if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites can often need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site’s images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:

– Google’s newest tool tests how mobile-friendly your site is.
– GTmetrix.com features include a “waterfall” showing which page items load at which stage, history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
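Here is a minimal Python sketch of that compression check, using only the standard library. The URL is a placeholder, and real servers negotiate gzip via HTTP headers rather than like this – this just estimates the potential savings on one resource:

```python
# Minimal sketch: measure how much a text resource would shrink with gzip,
# the kind of easy compression win described above. URL is a placeholder.
import gzip
import urllib.request

url = "https://www.example.com/"  # swap in one of your own pages
raw = urllib.request.urlopen(url).read()

compressed = gzip.compress(raw, compresslevel=6)  # typical server setting
saved = len(raw) - len(compressed)

print(f"uncompressed: {len(raw):,} bytes")
print(f"gzipped:      {len(compressed):,} bytes")
print(f"savings:      {saved:,} bytes ({saved / len(raw):.0%})")
```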

Google Analytics Doesn’t Provide All of the Answers

Google Analytics has become a great source of data about visitors to your website – assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place

There are many legacy variations of the analytics snippets. In addition, what was the correct installation a couple of years ago may have changed dramatically, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which are quite a few years old. The best place – currently – for your analytics code is inside the <head> section, right before the closing </head> tag. This placement prevents interference with other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more. (A quick placement check is sketched at the end of this article.)

2. Filters

Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters – do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters; we usually name this profile with RAW in the name. This system allows anyone to easily see if a filter has “gone rogue” and is filtering out good traffic.

There are also these problems with getting good data, and you did not even cause them:

1. Incomplete data / views

Most businesses use the free version of Google Analytics, and sometimes experience “sampling” in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to an analysis of the complete data set, while returning results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords

Years back, Google Analytics let you see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects – perhaps too much. It has now become standard that search engines, browsers, and analytics tools restrict this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam

In addition to losing good data, there is often too much noise in otherwise good data. Using fake browsers – bots that can run analytics code – spammers insert all sorts of things into your analytics. Some offenders might put “Vitally was here” in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you’ve never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and have created a custom analytics segment to filter them from reports. Want to try our segment? Click this link and scrub your own view of your account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the Internet too, but we have customized this one for our clients.)
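For the snippet-placement issue above, here is a rough Python sketch of a sanity check. It does only naive string matching on the raw HTML (no JavaScript rendering), and the URL is a placeholder:

```python
# Small sketch to sanity-check an analytics installation: does the page
# reference a legacy urchin.js snippet, and does the tracking code appear
# before </head>? Standard library only; URL is a placeholder.
import urllib.request

url = "https://www.example.com/"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
lower = html.lower()

if "urchin.js" in lower:
    print("WARNING: legacy urchin.js snippet found - long overdue for an upgrade")

head_end = lower.find("</head>")
candidates = [lower.find("google-analytics.com/analytics.js"),
              lower.find("googletagmanager.com")]
positions = [p for p in candidates if p != -1]

if not positions:
    print("No Google Analytics / Tag Manager reference found")
elif head_end != -1 and min(positions) < head_end:
    print("OK: tracking snippet appears inside <head>")
else:
    print("Check placement: snippet appears after </head> (or <head> not found)")
```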

Preparing For SEO in 2017

Every year brings new SEO challenges and surprises. The year 2017 won’t be any different, but we do expect these topics to be important considerations in the new year:

Interstitials / Popups on Mobile Devices

We’ve all seen mobile sites with a popup covering the content we were trying to read. These popups will be punished by Google in early 2017. Like ads above the fold, Google feels these popups harm the user experience – and they do not want to send visitors to such sites. Many survey and tool vendors such as Ometrics and SurveyGizmo have been proactive in making sure their clients are not at risk, but some vendors may not be aware.

SSL / HTTPS

Google is really pushing SSL, and this is the year they accelerate their plan to make the web secure. Having an entire website served over HTTPS used to be rare: only credit card or health privacy transactions were secured, and even that was spotty. But since 2014, Google has been campaigning to secure everything. Two years ago, Google introduced a rankings boost for sites entirely on SSL. Last year they provided better features in Search Console, and we started to see SSL as a “must have.” Still, progress has been voluntary in many regards, with other business objectives prioritized first. Next year, new developments will force your hand: come January 2017, the Chrome browser will show warnings for sites that haven’t moved to HTTPS, starting with pages that have credit card or password fields – and with increasingly dire warnings for insecure sites later in 2017. (A quick way to check your own redirect is sketched at the end of this article.)

JavaScript-based sites

There are many great reasons to use one of the new JavaScript frameworks in a web app or site: they tend to be mobile friendly and in many cases give a superior user experience. You’ve seen JavaScript search widgets on eBay and Amazon providing “faceted search” – allowing users to easily refine their searches by clicking a few checkboxes. Frameworks needing some help include Angular, Backbone, Meteor, and many of their child/related frameworks. Some frameworks, such as Angular v2, are getting better about being search engine friendly. And Google is crawling ever more JavaScript – but not well, from what we’ve seen. Often sites need help implementing technologies such as prerender.io. We are seeing more and more of this kind of work, and expect it to accelerate in 2017.

AMP (Accelerated Mobile Pages)

AMP is the super-speedy loading of pages you’ve likely seen in some mobile results. After you set up AMP on your site, Google serves your content from its super-fast servers – while making it look like your URL. AMP used to be just for news sites, but Google has now opened it up to other sorts of sites – and 700k+ sites have been using it! If mobile traffic is important to your site, AMP will likely become vital over the next year.

Schema

Google just loves schema. Over this last year we’ve seen schema help increase pages indexed, and we expect it to play a greater role every year. As artificial intelligence is used more and more in the “RankBrain” algorithm, sites that can be easily categorized by Google will receive more visibility. I for one welcome our new overlords… subject to future review.

Backlinks

Links are still an important part of Google’s algorithm. But sustainable, authentic link earning is always the best long-term approach to link building. So how can you get these links?

1. Content marketing. Produce great content, and reach out to authority sites and influencers in your space.
2. Business development link building. All of those traditional activities such as sponsoring a baseball team, joining the chamber, or participating in online communities/forums are actually great ways to get links.
3. Publicity. Publicity is that powerful branch of public relations that provides links and visibility from media sites.

These methods of earning links have the best long-term potential, and are quite powerful for building and keeping rankings.

More effort

The shrinking organic traffic (more ads at the top), increased competition, and ever-changing nature of organic search require more effort than ever. Gone are the days of getting your site “SEO-ed” and expecting free traffic. All traffic is either earned, or easily taken away. May you experience a great new year with SEO!
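For the HTTPS push above, here is a small Python sketch that checks whether a site redirects plain HTTP to HTTPS. The hostname is a placeholder, and this only tests the basic redirect – it won’t catch mixed-content problems:

```python
# Quick sketch: does a site redirect plain HTTP to HTTPS? Useful ahead of
# the Chrome-warning deadline discussed above. Standard library only; the
# hostname is a placeholder.
import urllib.request

host = "www.example.com"
response = urllib.request.urlopen(f"http://{host}/")  # follows redirects

final_url = response.geturl()
if final_url.startswith("https://"):
    print(f"OK: http://{host}/ redirects to {final_url}")
else:
    print(f"WARNING: {final_url} is still served over plain HTTP - "
          "expect Chrome warnings on pages with password or card fields")
```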

Penguin 4 has Arrived: What We Know

It’s been two years since the last Penguin penalty update. Penguin penalties were known to destroy site traffic by moving sites that were formerly on page 1 onto page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels, devastating revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which seemed to grow harsher with each release. Here is what we know so far:

1. Google still cares tremendously about links

We’ve been expecting Google to use social media at some point for authority, but instead they keep using links as a powerful part of their algorithm. Looking at the amount of processing power, education, penalties and heat they have taken… well, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors they may (or may not) take into account.

2. It’s now “realtime”

As Google discovers links to your site, they will be judged as good, bad or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, as years of lost traffic were particularly painful.

3. Realtime can mean weeks

A few people have done the math and research in this Quora thread, and “realtime” sounds like it will mean a few weeks.

4. Penguin penalties are now at the page level, not the site level

Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair, and over the years we saw several clients penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don’t count

This is a return to the “old days,” simpler times when webmasters didn’t have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links delivered a “double whammy” to rankings: they stopped boosting rankings, and also penalized the site.

6. Disavow files are still recommended

Google still recommends using the disavow file. It helps Google identify low-quality sites, and it offers protection against a “manual penalty,” where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from its bad links. (A small sketch of the disavow file format follows this article.)

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We’ll keep you updated!
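On point 6, here is a minimal Python sketch that assembles a disavow file. The “domain:” prefix and “#” comment lines follow the format Google’s disavow tool documents; the domains and URL here are made-up examples:

```python
# Sketch: build a disavow file from a list of low-quality linking domains.
# The "domain:" prefix and "#" comments are the format Google's disavow
# tool documents; the entries below are made-up examples.

bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Disavow file generated for review - check every entry by hand"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow an entire domain
lines += bad_urls                              # or individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```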

After Keyword Research – What do I do with these keywords?!

Getting a keyword research report is just the first step in enhancing your on-site SEO. Once the research is complete, it is important to use those words to build out new pages – or improve tagging on existing pages.

Domains

Buying a keyword-rich domain name is not as lucrative as it once was, but there are still good opportunities. See last month’s article: Do Minisites still work?

Naming

Savvy business owners may use words and phrases found in their keyword research to name products, services, and even companies. There is no better way to show your audience that you have their solution than to name it (or the whole company!) appropriately.

Social Destinations

Social sites can rank for your keywords and act as informational channels. While your best prospects are not likely searching Pinterest or YouTube for solutions, certain keyword searches might be good content channels. Even in the long buying cycles of business-to-business sales, social media content will help inform and qualify prospects. Consider which of these channels might work well for your keywords:

– Pinterest boards
– YouTube channels
– LinkedIn groups
– SlideShare presentations

Consider that a keyword-focused social destination may not be appropriate for your entire brand: you may want a brand-focused YouTube channel and a campaign channel focused on a specific keyword phrase.

Blogging Topics

Ranking at the top of search engine results for any competitive keyword phrase requires you to be “all about that phrase.” To be relevant for the many topics and categories of your targeted phrase, you will need many different pieces of content around it. Consider online tools such as HubSpot’s blog topic generator (http://www.hubspot.com/blog-topic-generator) to generate “clickable” blogging ideas. Here is another nice post: https://www.authorityhacker.com/blog-post-ideas/. Be sure to check whether the blogging titles themselves have search volume – that’s a nice bonus you don’t want to pass up!

Content Formats

Some key phrases give away hints as to what kind of content would be best to produce. “How to” searches may lend themselves to tutorials and videos. Other topics are worthy of an entire channel, or perhaps a white paper. For any keyword phrase you may want to target, taking the searchers’ needs into account is always the best approach: consider what content your audience is looking for with each query. (A toy sketch of this idea follows the article.)

A keyword research report is the beginning of any good SEO campaign. Depending on the site, audience and available resources, any number of tactics could be deployed. For each of the above methods, however, focus should always come back to your target audience.
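As a toy illustration of the Content Formats idea, here is a deliberately simplistic Python sketch that routes keyword phrases to likely content formats. The rules and example phrases are our own invention; real decisions need human judgment:

```python
# Illustrative sketch only: route keyword phrases to likely content formats
# based on their modifiers, as described in the Content Formats section.
# The rules are simplistic on purpose.

FORMAT_HINTS = {
    "how to": "tutorial or video",
    "vs": "comparison article",
    "best": "roundup or buying guide",
    "what is": "definition / white paper",
}

def suggest_format(phrase: str) -> str:
    for hint, content_format in FORMAT_HINTS.items():
        if hint in phrase.lower():
            return content_format
    return "standard landing page or blog post"

for kw in ["how to train a puppy", "crm vs spreadsheet", "best dog food"]:
    print(f"{kw!r} -> {suggest_format(kw)}")
```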

Do Minisites still work?

Minisites used to be a good technique, but it is getting harder to make them work. Here are three challenges for the “minisite approach”:

1. Google doesn’t value new websites.

2. Google doesn’t value 2-3 page websites. It’s rare for small sites to have the depth of content that Google values. If a site cannot go into depth on a topic, it might not be seen as valuable – to Googlebot, or to human visitors. You can overcome that with link authority, but it’s tough.

3. Google no longer has a powerful “exact match bonus.” Google used to give easy rankings to “exact match domains,” but lessened that 2-3 years ago. If someone typed “iPhone ringtones” into Google, it was simple for iphoneringtones.com to rank at the top. In the newer version of Google’s algorithm, exact-match domains do not necessarily mean top rankings for little effort – although they still help: keywords will be bolded in the URL in some search engines, which can be very tempting to prospective visitors, and inbound links that use the domain as anchor text will get a bonus for that keyword targeting. Anchor text is still powerful in Google’s algorithm.

Here are some tips to make the most of your minisite:

– The content must be unique. Minisites are often created as a tangential offering of a brand, but they shouldn’t just be a copy/paste of existing content from a site. Instead, the content should be created especially for the minisite, with some thought given to how this audience might be unique.

– The URLs must not look spammy to your audience. So many keyword-rich URLs look that way these days. Test with PPC and see if your prospects want to click. Use no more than a single dash, only use .com, and stick to two-word phrases. For example, this is not a clickable URL: http://solve-your-sales-problems.biz. But this is: http://salesmanship.com. (These rules are sketched as a quick filter at the end of this article.)

– The keyword phrase should have good search volume. Keyword phrases that do not show search volume in Google’s Keyword Planner may not be worth investing in. One of the main advantages of a minisite on a custom domain is the “exact match domain” that should exactly match your prospects’ query. Without search volume, that’s one less compelling reason to do a minisite.

– Don’t rely on type-in traffic. When Internet Explorer was the dominant browser, prospects would type in “sales management” and be taken to salesmanagement.com. A few years ago, 12% of search traffic could arrive like that. Chrome is now dominant, and it searches Google for whatever you type. So type-in traffic isn’t as prevalent as it was.

– Buy keyword-focused domains if there is good search volume. Test them with PPC (for both click-through rate and conversion), and then build out larger sites of 20 pages, blog weekly on the site, add videos, get some good links, etc.

But this technique is not the easy road it once was. There are many fewer shortcuts in today’s Google.
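Here is a quick Python sketch of the URL rules above (one dash at most, .com only, short names) expressed as a filter. The length cutoff is our own rough stand-in for “two word phrases,” and none of this replaces a real PPC click test:

```python
# Sketch of the URL "clickability" rules above as a quick filter: at most
# one dash, a .com ending, and a short (roughly two-word) name.
# Heuristics only - always confirm with a real PPC click-through test.

def looks_clickable(domain: str) -> bool:
    name, _, tld = domain.rpartition(".")
    if tld != "com":
        return False                      # only use .com
    if name.count("-") > 1:
        return False                      # no more than a single dash
    if len(name.replace("-", "")) > 20:   # rough proxy for "two word phrases"
        return False
    return True

for d in ["salesmanship.com", "solve-your-sales-problems.biz", "sales-training.com"]:
    print(f"{d}: {'looks clickable' if looks_clickable(d) else 'probably not'}")
```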

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company. (Matt Cutts has explained how sitelinks are generated.) A typical company listing has 4-6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL to navigate. Some URLs may have up to 12 sitelinks below the primary search result!

Organic sitelinks are great for users (and for you!)

There are many key benefits to organic sitelinks:

– Users can quickly and easily reach a better-suited landing page than the homepage. This quick navigation option is great for the user, and it reduces your organic bounce rate too.
– Sitelinks provide a large presence on the search results pages. PPC Hero did some research into sitelinks and found that, while they’re not clicked as often as the primary link, they do provide additional CTR and conversions – the PPC Hero study showed a 64% increase in PPC ad click-through rate with sitelinks.
– Having numerous – and well-crafted – sitelinks helps make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 tips to get the sitelinks you want (and deserve!)

Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands what the key pages on your site are! That’s why it’s crucial that companies watch over and adjust their sitelinks. While you can’t specify sitelinks directly to Google, and they don’t disclose exactly how they choose organic sitelinks, there are key tactics you can use to get the sitelinks you want (and deserve!):

1. Be #1! You will typically only get sitelinks for branded searches, such as your company name. Sometimes the #1 result will get sitelinks as well, but it’s typically branded queries.

2. Submit a sitemap.xml in Search Console (formerly Webmaster Tools). This appears to be a necessary step before sitelinks are “granted” by Google.

3. Demote undesirable sitelinks in Search Console if you find any showing up. To demote a sitelink URL: on the Search Console homepage, click the site you want. Under Search Appearance, click Sitelinks. In the “For this search result” box, enter the URL for which you don’t want a specific sitelink to appear. In the “Demote this sitelink URL” box, enter the URL of the sitelink you want to demote. You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).

4. Look at what you’re linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements. (A quick crawl to spot these links is sketched after this list.)

5. Googlebot seems to like lists of links, including H2 tags with links to sections or pages, and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/

6. Use rel=nofollow. Sometimes privacy policies show up as sitelinks because they are linked from every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks.

7. Optimize your pages. Ideally, your best pages should already be optimized, but make sure titles and meta descriptions are in order.

8. Inbound links: look at where other sites are linking to (change your redirects, or reach out to other sites and ask them to update their links).

9. Googlebot prefers popular pages, including landing pages with volume in analytics.
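For tip 4, here is a rough Python sketch that samples a few pages and flags internal links appearing on every one of them – the sitewide links Googlebot tends to treat as sitelink candidates. It assumes the third-party requests and beautifulsoup4 packages are installed, and the URLs are placeholders:

```python
# Sketch: crawl a handful of pages and count internal links that appear on
# every sampled page - likely sitelink candidates. Assumes `requests` and
# `beautifulsoup4` are installed; URLs are placeholders.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

pages = ["https://www.example.com/",
         "https://www.example.com/about/",
         "https://www.example.com/services/"]
site = urlparse(pages[0]).netloc

counts: Counter = Counter()
for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    hrefs = {urljoin(page, a["href"]) for a in soup.find_all("a", href=True)}
    counts.update(h for h in hrefs if urlparse(h).netloc == site)  # internal only

print("Linked from every sampled page (likely sitelink candidates):")
for url, n in counts.most_common():
    if n == len(pages):
        print(" ", url)
```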
Organic sitelink takeaways

While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often highly personalized and based on Google’s algorithm, certain sitelinks may appear for some users but not for others.

Can Google read JavaScript? Yes, but can it really?

Google will eventually crawl all JavaScript, but they haven’t been indexing JavaScript pages very successfully. Every year we hear the same story: Google says it’s getting better at crawling and indexing JavaScript. But crawling JavaScript and crawling ALL JavaScript are clearly two different accomplishments. Google can crawl it and render it, but just doesn’t seem to use it the same way as optimized content. From what we’ve seen, JavaScript pages don’t rank as well in search engines. Title tags come through here and there, but not consistently. Still, with the ease of development that JavaScript frameworks offer, it can be difficult to justify optimizing with plain text and images. Here are some important questions to consider:

1. Can you fail gracefully?

For visitors without JavaScript – either bot or human – offering some sort of page content has always been important. Showing plain text and image content when JavaScript is off embraces the best practice of “failing gracefully.” (A quick check is sketched at the end of this article.)

2. How quickly do you want results?

For many sites, faster rankings mean a faster path to revenue. Where pure JavaScript offers a compelling business case, it could be prioritized over “search engine friendliness.” For most sites, though, the extra visibility is worth the extra work of optimizing in the most search-friendly ways possible.

3. Is Google responding correctly to a test?

The entire site doesn’t have to be converted to JavaScript. Instead, use simple one-page tests and check Google’s “crawlability.” Is Google understanding the DOM, and extracting titles, images and content correctly?

4. What other Google bots need to access your content?

There are actually a variety of bots across Google’s many services. Google employs specific bots for its image search, ad services, product listing feeds, etc. Try accessing these with your test. Also, definitely keep your schema/rich snippet code easily accessible: Google has specifically warned that it cannot be found inside of JavaScript objects.

5. Have you tested with all of Google’s tools?

Speaking of Google’s bots, try using Google’s many tools for understanding and analyzing webpages. Seeing any problems here is a serious red flag for your JavaScript. But even if these tools render JavaScript, Google may not rank your pages as well as it would “search friendly” pages.

– Fetch and render: https://www.google.com/webmasters/tools/googlebot-fetch (must be verified and logged into Google Search Console)
– PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/
– Mobile friendly: https://www.google.com/webmasters/tools/mobile-friendly/
– Keyword Planner: https://adwords.google.com/ko/KeywordPlanner/Home (ask Google to fetch the keywords from your landing page)

Bing is rising

Google isn’t the only search engine in town. Even without Yahoo and AOL numbers, Bing’s market share has been increasing steadily year over year. Bing had 21.4 percent market share last year, not counting partnerships with Apple, Yahoo or AOL. That’s getting to be a huge chunk of users. Bing especially has trouble with images inside JavaScript objects. Bing’s version of the fetch-and-render tool may display a rendered page, but Bing isn’t going to show those images in its image results, and the regular results will be inconsistent.

Social media

Plain text and image content is also ideal for social media sharing. When a page is shared, most social media sites can parse the simple text description and image right out – unless there is JavaScript.
For most social networks, rich snippets such as Open Graph and Twitter Cards can help – but with new social networks (WhatsApp, Snapchat, etc.) popping up every year, it is best to expose the page content as plain text.

Google’s JavaScript support is constantly improving, and having a JavaScript app on the landing page is often needlessly complex. As of this writing, having an optimized version does still appear to be necessary. Maybe next year’s announcement that Google is crawling JavaScript will be followed by a more robust crawl, but plenty of other sites are embracing “search engine friendliness” – your site should too, in order to be competitive.
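Here is a rough Python sketch of the “fail gracefully” check: fetch a page the way a non-rendering bot would and see what survives without JavaScript. The regex-based text extraction is crude on purpose, and the URL and the 100-word threshold are our own placeholders:

```python
# Sketch of the "fail gracefully" test: fetch a page with no JavaScript
# execution and check whether the basics exist in the raw HTML.
# Standard library only; the URL is a placeholder.
import re
import urllib.request

url = "https://www.example.com/"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
# Strip script blocks and tags to approximate the text a non-JS bot sees.
body_text = re.sub(r"<script.*?</script>|<[^>]+>", " ", html, flags=re.S)
words = len(body_text.split())

print(f"title without JS: {title.group(1).strip() if title else 'MISSING'}")
print(f"visible words without JS: {words}")
if words < 100:
    print("WARNING: little or no content without JavaScript - bots and "
          "social networks may see an empty page")
```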

Conversion is King

Content is helpful, but conversion is everything. The point of content – and usability in general – is to meet business objectives. Any business objective can be a conversion of sorts: bookmarking, social sharing/liking, video views, time on site, lead generation, add to cart, and hopefully even completing the sale! By measuring each step, brands can understand where their site can improve its usability and contribute more to the bottom line.

1. It can be easier to increase conversion than to increase traffic

Increasing conversion also increases revenue, and can be easier than increasing traffic – up to a point. (The arithmetic is sketched at the end of this article.)

2. Even mobile apps can easily conduct conversion optimization tests

Mobile testing platforms now allow conversion and usability testing without rolling out new versions of your app. Solutions exist from Optimizely, Visual Website Optimizer (VWO), Liquid, and Artisan Optimize.

3. You should test EVERYTHING

User experience professionals agree: take their advice, but “always keep testing.” Conversion case studies show all sorts of factors can influence conversion:

– Logos and headers
– Design style of the site
– Product page designs
– Product descriptions and overall copywriting
– The text of your call-to-action buttons
– Images
– Use of video (usually boosts conversion, but not always!)
– Purchasing path through the site

4. Website redesigns should use your data, not reset it

Now, if the site is just awful, start with a redesign. But a website redesign that starts over can sometimes be a horrible waste: another shot in the dark, with hope and a prayer. Consider instead a redesign process based on evolving the website with small changes, continually tested for improvement. But definitely start from having your website in a “good place”!

Not sure of next steps for your site? Time to start testing – or maybe a redesign from that “good place.” Need a good interactive agency or website design firm? We’ve worked with agencies and designers, and we partner with the best! Talk to us about your needs, and we’ll introduce you to the right match.

See you at SearchCon 2015!

Are you interested in learning about the latest in search from the experts? Join us at SearchCon 2015 – The Digital Marketing and SEO Conference! SearchCon is April 9th and 10th at Beaver Run Resort in beautiful Breckenridge, Colorado. Register before March 2nd to take advantage of early bird pricing! http://searchcon.events/
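As a worked example of point 1, here is a tiny Python sketch with entirely hypothetical numbers, showing a modest conversion lift matching a 25% traffic increase:

```python
# Worked example (all numbers hypothetical): the same revenue gain from a
# modest conversion-rate lift as from a large traffic increase.

visitors = 20_000       # monthly visitors
conversion = 0.02       # 2% of visitors buy
order_value = 80.0      # average order value

revenue = visitors * conversion * order_value
print(f"baseline revenue:        ${revenue:,.0f}/mo")

# Option A: raise conversion 2.0% -> 2.5% (a realistic testing win)
print(f"conversion 2.0% -> 2.5%: ${visitors * 0.025 * order_value:,.0f}/mo")

# Option B: earn 25% more traffic at the same conversion rate
print(f"traffic +25%:            ${visitors * 1.25 * conversion * order_value:,.0f}/mo")
```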