Preparing For SEO in 2017

Every year brings new SEO challenges and surprises. The year 2017 won't be any different, but we expect these topics to be important considerations in the new year:

Interstitials / Popups on Mobile Devices
We've all seen mobile sites with a popup covering the content we were trying to read. Google will begin punishing these popups in early 2017. As with ads above the fold, Google feels these popups harm the user experience, and it does not want to send visitors to such sites. Many survey and tool vendors such as ometrics and surveygizmo have been proactive in making sure their clients are not at risk, but some vendors may not be aware.

SSL / HTTPS
Google is really pushing SSL, and this is the year it accelerates its plan to make the web secure. Serving your entire website over HTTPS used to be rare: only credit card or health privacy transactions were secured, and even that was spotty. But in 2014 Google began a campaign to secure everything. Two years ago, Google introduced a rankings boost for sites served entirely over SSL. Last year it provided better HTTPS features in Search Console, and we started to see SSL as a "must have." Still, progress has been largely voluntary, with other business objectives prioritized first. Next year, new developments will force your hand: warnings will start appearing in Chrome. Beginning in January 2017, the Chrome browser will show increasingly dire warnings for any site that hasn't moved to HTTPS, starting with pages that have credit card or password fields, and expanding to more prominent warnings for all insecure sites later in 2017.

JavaScript-based sites
There are many great reasons to use one of the new JavaScript frameworks in a web app or site: they tend to be mobile friendly and often give a superior user experience. You've seen JavaScript search widgets on eBay and Amazon providing "faceted search," allowing users to easily refine their searches by clicking a few checkboxes. Frameworks needing some help include Angular, Backbone, Meteor, and many of their child/related frameworks. Some frameworks, such as Angular v2, are getting better about being search engine friendly. Google is crawling ever more JavaScript, but not well from what we've seen, and sites often need help implementing technologies such as prerender.io. We are seeing more of this kind of work, and expect it to accelerate in 2017.

AMP (Accelerated Mobile Pages)
AMP is the super-speedy loading of pages you've likely seen in some mobile results. After you set up AMP on your site, Googlebot places your content on its super-fast servers while making it look like your URL. AMP was originally just for news sites, but Google has now opened AMP up to other sorts of sites, and 700k+ sites have been using it! If mobile traffic is important to your site, AMP will likely become vital over the next year.

Schema
Google just loves schema. Over this last year we've seen schema help increase pages indexed, and we expect it to play a greater role every year. As artificial intelligence is used more and more in the "RankBrain" algorithm, sites that can be easily categorized by Google will receive more visibility. I for one welcome our new overlords… subject to future review.
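For illustration, here is a minimal, hypothetical JSON-LD block of schema.org markup for a local business; every name and value below is a placeholder, not a recommendation for any particular schema type:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://www.example.com/",
  "telephone": "+1-303-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  }
}
</script>
```

Google's documentation now generally recommends JSON-LD placed in the page head as the cleanest way to supply structured data.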
Backlinks
Links are still an important part of Google's algorithm, and sustainable, authentic link earning is always the best long-term approach to link building. So how can you get these links?

1. Content marketing: Produce great content, and reach out to authority sites and influencers in your space.
2. Business development link building: All of those traditional activities such as sponsoring a baseball team, joining the chamber, or participating in online communities/forums are actually great ways to get links.
3. Publicity: Publicity is that powerful branch of public relations that provides links and visibility from media sites.

These methods of earning links have the best long-term potential, and are quite powerful for building and keeping rankings.

More effort
Shrinking organic traffic (more ads at the top), increased competition, and the ever-changing nature of organic search require more effort than ever. Gone are the days of getting your site "SEO-ed" and expecting free traffic. All traffic is either earned, or easily taken away. May you experience a great new year with SEO!

Penguin 4 has Arrived: What We Know

It's been two years since the last Penguin penalty update. The Penguin penalties were known to destroy site traffic by pushing sites that were formerly on page 1 onto page 4 or even page 9. Organic traffic would sometimes decrease to less than 10% of previous levels and devastate revenue. Penguin is such a serious update for any site relying on organic traffic that new insights are being gained daily. This update is a little different from previous Penguin updates, which seemed to grow harsher each time.

1. Google still cares tremendously about links
We've been expecting Google to use social media at some point for authority, but instead it keeps using links as a powerful part of its algorithm. Looking at the amount of processing power, education, penalties, and heat Google has taken over links, we can assume links will be with us for a long time. And Google cares more about authority than popularity, freshness, content, spelling, valid HTML, or any of the other hundreds of factors it may (or may not) take into account.

2. It's now "realtime"
As Google discovers links to your site, they will be judged as good, bad, or somewhere in between, and rankings will fluctuate accordingly. This system is long overdue: previous Penguin updates meant years of waiting to see if link removal, disavowal, site pruning, 301 redirecting, gaining high-authority links, and other strategies would be enough. It was a horribly unfair system for most small businesses, as years of lost traffic were particularly painful.

3. Realtime can mean weeks
A few people have done the math and research in this Quora thread, and it sounds like "realtime" will in practice mean a few weeks.

4. Penguin penalties are now at the page level, not the site level
Penguin used to penalize an entire site, impacting rankings for all keywords and on all pages. This was horribly unfair, and over the years we saw several clients penalized after an intruder built pages (and bad links to those pages). Months and years after the intrusion, site keyword rankings (and traffic!) suffered greatly.

5. Bad links no longer penalize – they just don't count
This is a return to the "old days," simpler times when webmasters didn't have to continually audit who was linking to them. One of the worst parts of previous Penguin updates was the way low-quality links provided a "double whammy" to rankings: they stopped boosting rankings, and also penalized the site.

6. Disavow files are still recommended
Google still recommends using the disavow file. It helps Google identify low-quality sites, and it offers protection against a "manual penalty," where a human at Google has specifically penalized your site. In that case a disavow file can show that you are trying to distance your site from its bad links. (The simple file format is sketched at the end of this article.)

Every day brings more insight into how Penguin 4.0 is impacting rankings and traffic. We'll keep you updated!
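If you do maintain a disavow file, it is a plain text file with one URL or domain per line, uploaded through the Disavow Links tool in Search Console. The sites below are hypothetical placeholders:

```text
# Spammy pages we could not get removed
http://spam-directory.example.com/our-listing.html

# Disavow an entire domain
domain:low-quality-links.example.net
```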

3 Persistent SEO Misconceptions

SEO has changed greatly over the years. As marketers and small business owners have worked to understand its many complexities, several misconceptions have persisted.

Misconception #1: SEO is "free traffic"
Many small businesses are interested in SEO because they see it as "free traffic." Tired of the ever-increasing click costs of PPC, they are drawn to the siren call of a tactic that will bring free traffic — forever. But this is a giant misconception. Search engine optimization was once a simple process of using the keywords your audience is searching for, and that worked fine — until 2001 or so. Now competitors are savvier, and ranking in search engines is more like a horse race requiring effort: server configuration, mobile responsiveness, image optimization, tagging, schema, AMP, plenty of content, and — oh yeah — the content should be interesting.

Misconception #2: SEO is a one-time project
In the old days of websites and SEO, getting your site "SEO-ed" could be a one-time process. While the web has changed substantially (the rules and the competitors keep changing), this view of search engine optimization has persisted. Modern SEO is indeed a horse race, in which competitors must constantly be bettered by:
– constantly adding awesome content
– earning and seeking inbound links
and, we think, probably:
– social sharing
– usability metrics

Misconception #3: High-traffic keywords are the best ranking targets
High-traffic keywords can sound like the best keyword targets, but they are often the worst! High-converting keywords are best in every case. Consider this example: several years ago we received a call from a prospective client that wanted to rank #1 for "Travel." Wow, I thought: this could be Expedia or Travelocity on the line. But it was actually a Breckenridge condominium property. Competing for rankings for the term "Travel" is a really bad idea for (at least) four reasons:
1. People searching for "Travel" do not yet know where they want to go — they aren't necessarily looking for Breckenridge — and we don't know if they would want a condo.
2. In a best-case scenario, the site might reach page eight — and that still doesn't mean any prospects would book a condo. Even page two is a ghost town, with page eight as quiet as deep space.
3. They would be competing at a huge level, way beyond what is necessary to rank number one for "Breckenridge Condo." It's crazily inefficient, like investing in a Triple Crown champion when you just need a healthy horse to win the race.
4. Even in a fantasy universe where a Breckenridge condo site got to number one in Google for "Travel," it would receive an overwhelming amount of bad leads a day. Keyword targets are also a prequalifying process when done right.

A better approach is for the condo company to first compete for exactly what they are:
"Breckenridge Condo"
"Breckenridge Condominium"
(These are the keywords with the highest chance of conversion.) Only then should they look at broader terms likely to have some prospects:
"Breckenridge Hotel"
"Breckenridge Motel"
"Summit County Condo"
This phenomenon isn't limited to condo owners — we all have daydreams of ranking for something that delivers huge traffic. Instead, focus on what your best customers are typing into search engines — just make sure it does have some search volume. SEO has changed much over the years, evolving from a one-time exercise in targeting high-search-volume keywords into an ongoing effort focused on keywords with both solid search volume and a high conversion rate.

After Keyword Research – What do I do with these keywords?!

Getting a keyword research report is just the first step in enhancing your on-site SEO. Once the research is complete, it is important to use those words to build out new pages – or improve tagging on existing pages.

Domains
Buying a keyword-rich domain name is not as lucrative as it once was, but there are still good opportunities. See last month's article: Do Minisites still work?

Naming
Savvy business owners may use words and phrases found in their keyword research to name products, services, and even companies. There is no better way to show your audience that you have their solution than to name it (or the whole company!) appropriately.

Social Destinations
Social sites can rank for your keywords and act as informational channels. While your best prospects are not likely searching Pinterest or YouTube for solutions, certain keyword searches might be good content channels. Even in the long buying cycles of business-to-business sales, social media content will help inform and qualify prospects. Consider which of these channels might work well for your keywords:
– Pinterest boards
– YouTube channels
– LinkedIn groups
– SlideShare presentations
Keep in mind that a keyword-focused social destination may not be appropriate for your entire brand: you may want a brand-focused YouTube channel and a separate campaign channel focused on a specific keyword phrase.

Blogging Topics
Ranking at the top of search engine results for any competitive keyword phrase requires you to be "all about that phrase." To be relevant for the many topics and categories of your targeted phrase, you will need many different pieces of content around that phrase. Consider online tools such as HubSpot's blog topic generator (http://www.hubspot.com/blog-topic-generator) to generate "clickable" blogging ideas. Here is another nice post: https://www.authorityhacker.com/blog-post-ideas/. Be sure to check that the blogging titles themselves have search volume – that's a nice bonus you don't want to pass up!

Content Formats
Some key phrases give away hints as to what kind of content would be best to produce. "How to" searches may lend themselves to tutorials and videos. Other topics are worthy of an entire channel, or perhaps a white paper. For any keyword phrase you may want to target, taking the searchers' needs into account is always the best approach: consider what content your audience is looking for with each query.

A keyword research report is the beginning of any good SEO campaign. Depending on the site, audience, and available resources, any number of tactics could be deployed. For each of the above methods, however, focus should always come back to your target audience.

Do Minisites still work?

Minisites used to be a good technique, but it is getting harder to make them work. Here are three challenges for the "minisite approach":

1. Google doesn't value new websites.
2. Google doesn't value 2-3 page websites. It's rare for small sites to have the depth of content that Google values. If the site cannot go into depth on a topic, it might not be seen as valuable – to Googlebot, or to human visitors. You can overcome that with link authority, but it's tough.
3. Google no longer has a powerful "exact match bonus." Google used to give easy rankings to "exact match domains," but lessened that 2-3 years ago. If someone typed "iPhone ringtones" into Google, it was simple for iphoneringtones.com to rank at the top. In the newer version of Google's algorithm, exact match domains do not necessarily mean top rankings for little effort – although they are still helpful: keywords will be bolded in the URL in some search engines, which can be very tempting to prospective visitors, and inbound links that use the domain as anchor text will get a bonus for that keyword targeting. Anchor text is still powerful in Google's algorithm.

Here are some tips to make the most of your minisite:

– The content must be unique. Minisites are often created as a tangential offering of a brand, but they shouldn't just be a copy/paste of existing content from another site. Instead, the content should be created especially for the minisite, with some thought given to how this audience might be unique.

– The URLs must not look spammy to your audience. So many keyword-rich URLs can look that way these days. Test with PPC and see if your prospects want to click. Use no more than a single dash in the URL, only use .com, and stick to two-word phrases. For example, this is not a clickable URL: http://solve-your-sales-problems.biz. But this is: http://salesmanship.com

– The keyword phrase should have good search volume. Keyword phrases that do not show search volume in Google's Keyword Planner may not be worth investing in. One of the main advantages of a minisite on a custom domain is the "exact match domain" that should exactly match your prospects' query. Without search volume, that's one less compelling reason to do a minisite.

– Don't rely on type-in traffic. When Internet Explorer was the dominant browser, prospects would type "sales management" into the address bar and be taken to salesmanagement.com. A few years ago, 12% of search traffic could arrive like that. Chrome is now dominant, and it searches Google for whatever you type in, so type-in traffic isn't as prevalent as it was.

– Buy keyword-focused domains if there is good search volume. Test them with PPC (for both click-through rate and conversion), and then build out larger sites of 20 pages, blog weekly on the site, add videos, get some good links, etc.

But this technique is not the easy road it once was. There are many fewer shortcuts in today's Google.

9 ways to get the sitelinks you want (and deserve!)

Organic sitelinks are the sub-links that appear under your homepage URL in search queries specific to your company. (Matt Cutts has a video explaining how sitelinks are generated.) A typical company listing has 4-6 sitelinks meant to help users navigate your site directly from the search engine results page, rather than having to click your primary URL and navigate from there. Some URLs may have up to 12 sitelinks below the primary search result!

Organic sitelinks are great for users (and for you!)
There are several key benefits to organic sitelinks:
– Users can quickly and easily reach a better-suited landing page than the homepage. This quick navigation option is great for the user, and it reduces your organic bounce rate too.
– Sitelinks provide a large presence on the search results pages. PPC Hero did some research into sitelinks and found that, while they're not clicked as often as the primary link, they do provide additional CTR and conversions. Read more in the PPC Hero study, which showed a 64% increase in PPC ad click-through rate with sitelinks.
– Having numerous – and well-crafted – sitelinks helps make your brand look more popular. Big brands tend to have more, and better, sitelinks.

9 tips to get the sitelinks you want (and deserve!)
Typical sitelinks include a Contact Us page, plus other pages that look important to Google. However, Google often misunderstands which pages on your site are the key ones! That's why it's crucial that companies watch over and adjust their sitelinks. While you can't specify sitelinks directly to Google, and Google doesn't disclose exactly how it chooses organic sitelinks, there are key tactics you can use to get the sitelinks you want (and deserve!):

1. Be #1! You will typically only get sitelinks for branded searches, such as for your company name. Sometimes the #1 result will get sitelinks as well, but it's typically branded queries.
2. Submit a sitemap.xml in Search Console (formerly Webmaster Tools). This appears to be a necessary step before sitelinks are "granted" by Google. (A minimal sitemap sketch appears at the end of this article.)
3. Demote undesirable sitelinks in Search Console if you find that any are showing up. To demote a sitelink URL: on the Search Console homepage, click the site you want. Under Search Appearance, click Sitelinks. In the "For this search result" box, complete the URL for which you don't want a specific sitelink URL to appear. In the "Demote this sitelink URL" box, complete the URL of the sitelink you want to demote. You can demote up to 100 URLs, and demotions are effective for 90 days from your last visit to the demotion page (no need to resubmit – just revisit the page).
4. Look at what you're linking to sitewide (stop linking, or use nofollow), especially in your main navigation elements.
5. Googlebot seems to like lists of links, including H2 tags with links to sections or pages and bulleted lists of links. Learn more here: http://www.seerinteractive.com/blog/get-organic-google-sitelinks-long-form-content/
6. Use rel=nofollow. Sometimes privacy policies show up as sitelinks because they are linked from every page of the site. Use rel=nofollow on links to pages that Google is incorrectly choosing as sitelinks (see the sketch just after this list).
7. Optimize your pages. Ideally, your best pages should already be optimized, but make sure titles and meta descriptions are in order.
8. Inbound links: look at where other sites are linking (change your redirects, or reach out to other sites and ask them to update their links).
9. Googlebot prefers popular pages, including landing pages with traffic volume in analytics.
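As a rough illustration of tips 5 and 6, here is a hypothetical footer fragment: a labeled list of links Googlebot can parse easily, with the sitewide privacy policy link marked rel="nofollow" so it is less likely to be picked as a sitelink. All page names and paths are placeholders:

```html
<footer>
  <h2>Popular Pages</h2>
  <ul>
    <li><a href="/services/">Our Services</a></li>
    <li><a href="/pricing/">Pricing</a></li>
    <li><a href="/contact/">Contact Us</a></li>
  </ul>
  <!-- Linked sitewide, but not a page we want surfacing as a sitelink -->
  <a href="/privacy-policy/" rel="nofollow">Privacy Policy</a>
</footer>
```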
Organic sitelink takeaways
While there is no direct formula for sitelinks, these tips can help you better communicate to Googlebot what you would like to show up for your brand. Since search results are often highly personalized and based on Google's algorithm, certain sitelinks may appear for some users but not for others.
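For reference on tip 2, here is a minimal sitemap.xml sketch in the standard sitemaps.org format; the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-10-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
  </url>
</urlset>
```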

Can Google read JavaScript? Yes, but can it really?

Google will eventually crawl all JavaScript, but it hasn't been indexing JavaScript pages very successfully. Every year we hear the same story: Google says it's getting better at crawling and indexing JavaScript. But crawling JavaScript and crawling ALL JavaScript are clearly two different accomplishments. Google can crawl it and render it, but it just doesn't seem to use it the same way as optimized content. From what we've seen, JavaScript pages don't rank as well in search engines. Title tags come through here and there, but not consistently. Still, with the ease of development that JavaScript frameworks offer, it can be difficult to justify optimizing with plain text and images. Here are some important considerations:

1. Fail gracefully
For visitors without JavaScript – either bot or human – offering some sort of page content has always been important. Showing plain text and image content when JavaScript is off embraces the best practice of "failing gracefully."
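As a minimal sketch of that idea, the initial HTML below ships crawlable text and an image before any script runs; the filenames and copy are hypothetical:

```html
<div id="app">
  <!-- Plain, crawlable content served in the initial HTML -->
  <h1>Martial Arts Punching Bags</h1>
  <p>Compare freestanding and hanging punching bags for home and studio use.</p>
  <img src="/images/punching-bag.jpg" alt="Freestanding martial arts punching bag">
</div>
<!-- The JavaScript app enhances the page once it loads -->
<script src="/js/app.bundle.js" async></script>
```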
2. How quickly do you want results?
For many sites, faster rankings mean a faster path to revenue. Where pure JavaScript offers a compelling business case, it could be prioritized over "search engine friendliness." For most sites, the extra visibility is worth the extra work of optimizing in the most search-friendly ways possible.

3. Is Google responding correctly to a test?
The entire site doesn't have to be converted to JavaScript. Instead, use simple one-page tests and check Google's "crawlability." Is Google understanding the DOM and extracting titles, images, and content correctly?

4. What other Google bots need to access your content?
There are actually a variety of bots across Google's many services. Google employs specific bots for its image search, ad services, product listing feeds, etc. Try accessing these with your test. Also, definitely keep your schema/rich snippet code easily accessible: Google has specifically warned that it cannot be found inside of JavaScript objects.

5. Test with all of Google's tools
Speaking of Google's bots, try using Google's many tools for understanding and analyzing webpages. Seeing any problems here is a serious red flag for your JavaScript. But even if these tools render JavaScript, Google may not rank your pages as well as it would "search friendly" pages.
– Fetch and render: https://www.google.com/webmasters/tools/googlebot-fetch (must be verified and logged into Google Search Console)
– PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/
– Mobile friendly: https://www.google.com/webmasters/tools/mobile-friendly/
– Keyword Planner: https://adwords.google.com/ko/KeywordPlanner/Home (ask Google to fetch the keywords from your landing page)

Bing is rising
Google isn't the only search engine in town. Even without Yahoo and AOL numbers, Bing's market share has been increasing steadily year over year. Bing had 21.4 percent market share last year, not counting partnerships with Apple, Yahoo, or AOL. That's getting to be a huge chunk of users. Bing especially has trouble with images inside JavaScript objects. Bing's version of the fetch and render tool may display a rendered page, but Bing isn't going to show those images in its image results, and the regular results will be inconsistent.

Social Media
Plain text and image content is also ideal for social media sharing. When a page is shared, most social media sites can parse a simple text description and image right out of the page – unless there is JavaScript. For most social networks, rich snippets such as Open Graph and Twitter Cards can help (a sketch appears at the end of this article) – but with new social networks (WhatsApp, Snapchat, etc.) popping up every year, it is best to expose the page content as plain text.

Google's JavaScript support is constantly improving, but having a JavaScript app on the landing page is often needlessly complex. As of this writing, having an optimized version does still appear to be necessary. Maybe next year's announcement that Google is crawling JavaScript will be followed by a more robust crawl, but there are plenty of other sites embracing "search engine friendliness"; your site should too, in order to be competitive.
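Here is the kind of Open Graph and Twitter Card markup mentioned above, as a hypothetical sketch; the titles, description, and URLs are placeholders:

```html
<meta property="og:title" content="Martial Arts Punching Bags">
<meta property="og:description" content="Freestanding and hanging punching bags for home and studio use.">
<meta property="og:image" content="https://www.example.com/images/punching-bag.jpg">
<meta property="og:url" content="https://www.example.com/punching-bags/">
<meta name="twitter:card" content="summary_large_image">
```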

The care and feeding of images: Optimizing your site’s images

Google's recent changes to search results mean you can expect organic traffic to decline: there are more ads at the top for many queries, and Google also appears to have expanded the display of images in search results. There wasn't an official announcement, but anecdotal evidence from the last several weeks supports it.

Speedy
Google loves speed, because users love speed. A search engine that delivers speedy results can certainly expect to dominate market share. With the exponential rise of mobile search, speed is more important than ever.
– Images should be sampled down to 72 dpi/ppi. If needed, 96 ppi should be the absolute maximum. In photo editing apps such as Adobe Photoshop, this is found in the image size settings.
– Try to scale images appropriately. Increase width if needed, but rely on recommendations from http://gtmetrix.com and https://developers.google.com/speed/pagespeed/insights/ to gauge the best size (one or both will recommend images be scaled down, if needed). Experimentation here will help optimize user experience for the best load times, and that's a great investment of time. When editing your photos, this is also found in the image size settings of your image editing app.

Relevant
Google's patents around reading text in images go way back. But they are not perfect, and if your image is of a certain item like a punching bag, there is no way for Google to instinctively "know" that.
– Use keywords in the image filenames. Use dashes instead of spaces or underscores between words. It used to be hotly debated by techies, but it is now mostly accepted that Google doesn't treat underscores as spaces. Dashes are much better, and an improvement for your human audience as well: image filenames with a space between words can look like punching%20bag.jpg to users, instead of the more pleasing punching-bag.jpg.
– Use ALT tags with keywords describing the product. Use "punching bag" or "martial arts punching bag" instead of just "bag." Use model numbers and serial numbers in ALT tags where appropriate. But not every image needs an ALT tag: the decorative squiggle image your site might use in its footer doesn't really need one.
– Use the title attribute for images. The (lesser) title attribute for images can usually be filled with the same content as the ALT tag. In some browsers, this text will pop up when a user hovers their mouse over the image. Consider situations where you might want text other than the ALT tag here, but they are often very similar.
– Put captions below the photos. Text content in the same <div> tag as the photo will help describe your images to Google. Or use the <figcaption> tag when using the <figure> tag for images. (A combined sketch of these tips appears at the end of this article.)

RankBrain
Google's RankBrain is an artificial intelligence system that helps Google return the most relevant search results for users. If users expect – and especially click – images for a certain query, RankBrain is going to show more images for those queries.
– Prioritize images for related queries. When someone types in a query like "photos of dogs," RankBrain correctly guesses that a large block of dog photos should be shown.
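Putting the filename, ALT text, title attribute, and caption tips together, a product image might be marked up like this hypothetical sketch (the filename, model number, and copy are placeholders):

```html
<figure>
  <!-- Keyword-rich filename with dashes, scaled to the displayed size -->
  <img src="/images/martial-arts-punching-bag.jpg"
       alt="Freestanding martial arts punching bag, model PB-200"
       title="Freestanding martial arts punching bag"
       width="600" height="400">
  <figcaption>The PB-200 freestanding punching bag, shown in a home gym setup.</figcaption>
</figure>
```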

Diversify and Conquer the Ever-Changing SERPs

Google is constantly changing its Search Engine Result Pages (SERPs), and recently caused a stir by removing ads from the right side. For years, organic positions have been changing: traditional "organic text listings" have been shrinking, but ads have always had their place, and ads have increasingly dominated above the fold. With four ads on top and no ads on the side, it's a big visual change for desktop search, but there are opportunities.

Ad Domination
When Google makes a change, we all know by now that the change has been tested thoroughly – and will help them expand on their already $74.5 billion in revenue. For some queries, it feels like ads are the new page 1. There might be local results, images, news, and perhaps some organic. What we see above the fold in these cases feels like an interstitial: something we need to click past. With organic position 4 sometimes now falling onto page 2, it's another reason traffic can decrease even when rankings stay the same.

Brand Domination
The last several years have seen bigger brands dominate both organic and PPC. Big brands get authority links more easily, and have bigger budgets on the PPC side as well. Google is not the level playing field it once seemed for small business, but is increasingly becoming a way to search for "things to buy from top brands." On the organic side, Google's updates have penalized the cheap link building of smaller businesses – while favoring brands in separate efforts. Now PPC will feel a crunch: fewer spots near the top is likely to increase bid prices, while removing some bargain positions that still drew traffic at ad position 5.

Opportunities
Look closely at the search results your best prospects are seeing. Trust that Google's ever-changing algorithm is making the right decisions – eventually – and use it to your advantage, both organically and in your ad campaigns.

1. Diversify
Check the SERPs for your favorite target keywords and ask yourself: "What content are prospects looking for with this query?" Luckily, Google has already measured this for you! There are a variety of research tools to discover what content is getting clicked, linked, liked, shared, visited, etc. But Google is also figuring this out for you – and really has the final say. Consider the types of content Google has chosen for your query:
– Images
– News
– In-depth Articles
– Direct Answers
– Apps
Are images shown above organic text listings? That's Google telling you what is most important to people conducting this query! The content you see should be taken into account in your SEO strategy. Great opportunities abound with image search for most sites. On the PPC side, bargains tend to match Google's latest innovations. Inexpensive clicks are best found in the newest kinds of ads: Product Listing Ads, remarketing, video ads, etc. Smart advertisers implement these before the competition arrives. And by diversifying among different types of advertising, marketers can measure, compare, and choose the most efficient. Are you using all of the features of PPC? Larry Kim pointed out (https://searchenginewatch.com/2016/02/23/google-kills-right-hand-side-ads-what-does-this-mean-for-sem/) that, since the change, "now all ads can use call-out extensions, sitelink extensions, location extensions, etc." That's a huge opportunity to raise CTR in any position, especially if you implement before the ads next to yours do.

2. Piggyback
Organic opportunities abound for those watching the SERPs. What sites are at the top of the results?
Identify each organic slot as a competitor or a potential link partner. Those Wikipedia pages at the top of many queries can become your next source of great referral traffic – and they are something Google increasingly references as it scrapes and answers related queries. In the world of PPC, there are also opportunities to piggyback. See apps in the mobile results? Consider in-app advertising. Any site listed in Google's top results is worth investigating as a potential advertising opportunity as well. Consider Google your "advertising research engine" for finding the best sites. As more ads and different kinds of ads are introduced, Google still gives opportunities to nimble marketers. Use Google's SERPs to research both the content and the advertising landscape of your best prospects. And then implement before your competitors do.

Title tags & Meta Descriptions: Technical SEO is the Foundation of Engagement SEO

SEO still begins with being friendly to the bots. This "technical SEO" is focused on helping bots understand a page, so that humans get a chance to engage with it. Once visitors can see a page, Google can weigh more of the engagement metrics such as organic click-through rate, bounce rate, time on site, conversion, etc. It's a mystery – and a controversy – whether the current Google algorithm uses these engagement metrics, but pretty much everyone agrees Google will move in that direction. With title tags, there are nuances for the crawlers, nuances for the humans, and the sweet spot is where those two worlds connect. If the title tag most appealing to crawlers is also the one your audience will find enticing in Google's search results, you are on the right path. Engagement SEO is the hard part, but let's start with the basic requirements, aka technical SEO.

Technical SEO

1. Title tags
– Valid tags (no single quotes or trying anything cute)
– Not too long, not too short (50-55 characters is usually best)
– No duplicates: every page should be unique, so every title tag should be too!
– Your keyword targets should be in your title tag, because your page is about them

2. Meta descriptions
– Valid tags (we only mention this because we've seen some crazy code out there)
– Not too long, not too short (155 characters is the maximum)
– No duplicates: every page should be unique, so every description tag should be too!
– Your keyword targets should be in your description, because your page is about them

Engagement SEO
Engagement SEO is user-focused, and it is only possible once enough of the technical SEO requirements are in place to give the site visibility. Engagement SEO maximizes whatever visibility the technical SEO provides, across search results, landing pages, and the entire buying journey of your prospect.

1. Title tags
– In most cases, Google uses your title tag as the blue link for your page in its search results.
– Use AdWords to test variations of ad titles. Put the best performing (and variations) into your title tags.
– No duplicates: let the user know how this page differs from others you might have on a similar topic. Help them get to the correct page first; know that Google is watching over their shoulder.
– Your keyword targets should be in your title tag, because your page is the answer to the user's query. When a user sees the keyword query they typed in – right there in your title tag – it's powerful. Google may not bold the keywords in the "ten blue links," but Bing and other engines do.
– Social media sites often use the title tag for their "blue link" when something is shared, too!

2. Meta descriptions
– In most cases, Google uses your meta description as the black text snippet for your page in its search results.
– Use AdWords to test variations of meta descriptions, too. Maximize the research you can get from those PPC campaigns!
– Use calls to action, and entice your prospects to click. Did you know you can break most of the rules of AdWords here? (Don't get crazy with the exclamation marks, though!!)
– No duplicates: describe and inform your user which query this description is meant to answer.
– Your keyword targets should be in your description, because your page is about them. Google bolds the keywords your prospect typed in, right there in the search results.
– Social media sites often use the meta description when something is shared, too!

Titles and meta descriptions must be enticing to searchers.
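As a hypothetical sketch, here is what those two tags might look like for a Breckenridge condo rental page; the business, numbers, and wording are placeholders, kept near the length guidelines above:

```html
<head>
  <title>Breckenridge Condo Rentals | Ski-In/Ski-Out from $149</title>
  <meta name="description" content="Browse 40+ Breckenridge condos steps from the lifts. Free cancellation and a best-rate guarantee. Check availability for your dates today.">
</head>
```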
Don't settle for title tags and meta descriptions that your web developer created to "SEO your site" – raise the bar. These vital tags are used for more than communicating with bots: they are a prime location to entice searchers with keywords and calls to action. That's engagement with your user from the very start.