The only constant in Organic Search is change

October 2012 was another busy month for Google. The search giant started the month by announcing 65 changes made during August and September. Google also pushed out a new Penguin update (v3) on October 5 – these Penguin updates penalize the overuse of keywords both on a website and through links. We have had a few clients with really bad – and sometimes profane – links. They may consider Google’s new disavow links tool, just released. But we recommend caution with the tool right now: some SEOs are speculating that Google may see its use as a confession!

Information also came out early in the month about Google penalizing domains that are more “keyword rich” than authoritative. This update (called EMD, for Exact Match Domain) is hitting domains like cheap-flights-from-denver.com. They would have been favored in the past for searches like “cheap flights from Denver”, but no longer. Authoritative sites were not hit, though: ski.com still ranks #1 for “ski”. Google also updated its penalty for “Top Heavy” sites – those with too many ads at the top of the page.

Highlights of Google’s 65 recent changes include:

1. Changes to titles and snippets. Google increasingly treats robots.txt directives and the title and meta description tags as “suggestions” from webmasters. Sometimes this can be helpful – such as when titles contain “comments on” or other generic phrases. Other times, Google’s choices may directly conflict with the choices the webmaster has made.

2. More related terms and expanded autocomplete suggestions. A search for “telecom provider” now returns results where the term “carrier” is bolded as well as “provider”. Google is getting smarter, and it’s a good time to diversify keywords!

The Google webmaster guidelines were also updated this month, reflecting the move away from counting links from low-quality directory and bookmarking sites.

There wasn’t much news for Bing this last month, but a recent report from antivirus vendor Sophos found that Bing search results contained more than twice as many malware-infected pages as Google’s results (Google’s share of the infected pages found was still a hefty 30 percent).

Google My Business for Your Business

Businesses thrive when they have an effective way for customers to find them on Google – the search engine most frequently used by your future customers. One of the most effective ways to ensure your business is found is with citations. A citation is any mention of your business online. A structured citation is a mention of your business on a directory such as Google My Business or Yelp; an unstructured citation is your business information (NAP: Name, Address, Phone number) appearing outside a business directory – anything from an article about your company to a mention of your business on a vendor’s website. Citations are important for local search because they give search engines your business information from across the internet. If you want to rank in Google’s Map Pack, start by making sure your Google My Business (GMB) listing is properly optimized and maintained. Here’s how:

Who owns your listing? If you’re not sure who owns your GMB listing, or you don’t remember which email address you claimed it under, don’t worry! You can request ownership by creating a Google My Business account and searching for your business listing. If it’s already claimed, you can request ownership at that point; if it isn’t claimed, you can request that a postcard be mailed to your business address to verify you are who you say you are, and that this business is indeed yours.

Search for duplicate listings. SEOs know there is nothing search engines hate more than an incorrect NAP on a citation listing. Second to that would be duplicate listings. Google looks at duplicate listings for a business, picks the one it likes best (whether or not the information is correct) and shows that listing in search. So how can you tell if you have a duplicate business listing in Google? It’s simple: search for your address and select the Maps results. This will show every Google My Business listing for that location. If you see a duplicate of your business, you can claim that listing and merge it with the correct one.

Is your map marker correct? There is nothing more frustrating as a user than finding incorrect information on a business listing: wrong hours of operation, a listing for a business that is no longer at the address, and the dreaded misplaced map marker. When users are getting directions to your business address, they’ll often look at your listing in Maps to see where you are located. If they are familiar with the area, they may skip the directions altogether. Make sure your map marker is in the correct place by updating the address in your listing information and moving the marker to the correct spot. Trust me, your users will appreciate it!

Optimize your listing. Optimizing your Google My Business listing is a lot easier than it sounds. Make sure your business name is correct, your address and phone number are correct, and, as covered above, your map marker is in the correct place. Additionally, make sure the correct business categories are selected so users know exactly what your business does. Check that your hours of operation are correct, and add special hours for holidays so your customers know when they can and cannot reach you. Add images of your business so users will know when they’ve found you, and add attributes so they know what features you offer! GMB also recently brought back the description section, so you can tell users more about your business.

Be careful, though; getting too crazy with keywords can cause Google to hide your listing in search. A good rule of thumb is to not leave any field blank, but to keep your listing as organic as possible!

Keep responding to your reviews. This is arguably the most difficult part of maintaining a Google My Business listing. Fortunately, every time you get a review, Google will email you at the address under which you claimed your listing. Still, many business owners find this task daunting, especially if they are getting negative reviews. Think of it this way: you can’t make every customer happy. Users know that, and they typically find businesses with 100% five-star reviews untrustworthy. Negative reviews are a normal part of doing business, and responding to them shows you care about customer service. Google My Business gives unhappy customers a platform to express their frustrations with your business, and how you respond says a lot about you. Don’t offer coupons or discounts to lure the customer back; instead, express your concern and give them a phone number or email address to contact you directly to resolve the issue. This turns a negative review into a positive experience and shows Google you’re interacting with your customers, which helps boost your rankings. It’s a win-win!

User suggested edits. Google allows users to suggest edits to business listings directly from search. This means that if I know a business location offers bathroom access, has a different phone number, or is missing a suite number, I can suggest the update directly from search. When you log into your GMB account, you’ll find a yellow banner across the top of your listing prompting you to approve user suggested edits for your business. Sometimes Google will publish these edits if they go unapproved by the business owner, or if the listing is unclaimed. So it’s very important for business owners to check in on their listing frequently to make sure the information isn’t being changed.

Making sure your Google My Business listing is properly optimized (and stays that way!) is the first step to achieving local search rankings. Google My Business is just one piece of a very extensive puzzle, but once you master your Google My Business listing you can easily begin claiming and optimizing …

6 Changes in Google Search

Google has made many changes over the years, other engines have followed suit, and SEO has evolved along with these changes. Consider these six ways Google has changed over the last several years.

1. More pages are not necessarily better. Google used to reward what would now be considered duplicate content. Endless search results pages, doorway pages, and many other techniques of the past are easily detected by the modern Googlebot. In today’s world, these techniques can be ignored, or even penalized. Where quantity once ruled supreme, quality now does. Many sites are pruning, combining, or redirecting the flood of URLs left over from the old days. If you are tempted by these old techniques, consider that you will likely have to undo the changes.

2. CSS and JS should not be blocked. It used to be a best practice to block Google from JavaScript and CSS resources, as they could otherwise show up in the index – and having those as landing pages was just horrible. But modern Google is very smart: it wants access to everything and needs everything to fully render the page. With access to these resources, Google analyzes mobile friendliness, speed, layout, and many other factors.

3. Get only good links. From the start, Google has weighed links very heavily. SEOs used to be able to get websites to rank without even improving the site! And in the old days, any link helped – or at worst was disregarded. In modern Google, links should come from the best sources. Links from penalized, unimportant, or even brand-new sites are risky and can now cause a Google penalty. A typical link profile contains a mix of these link types, and the ratios should be monitored – but some low-quality links are best disavowed. A high ratio of any one type can be a red flag to Google. It’s best to invest your time in getting the best links.

4. Google wants to understand you. Google wants to understand concepts better, and it wants to understand you better, too! With the advent of Hummingbird and RankBrain, Google is getting smarter and smarter. Hummingbird was Google’s update to help with classifying content. RankBrain is an artificial intelligence update that helps Google understand what sort of results a given query would like to see. Consider that these similar queries are actually quite different:

https://www.google.com/search?q=windows+update
https://www.google.com/search?q=windows+replacement

Think about your prospects’ most important queries driving your traffic. Are you delivering what they are looking for?

5. It’s not just 10 blue links. Google has made many changes over the years, and what began as a simple list of 10 blue links has evolved into a wide variety of results that can be returned. Results can now include answers, cards, carousels, images, videos, and more. And voice results are becoming increasingly valuable for some queries. Getting to “number one in Google” isn’t quite the same as it was: number one might be a block of images or an answer ABOVE the number 1 position. The modern approach is key to being successful in today’s Google. Images should be named, tagged, and captioned appropriately. Schema should be used to help Google understand and classify your content and even your site (a minimal markup sketch appears at the end of this article). For those who commit to helping Google understand their content, the reward is visibility in a multitude of ways.

6. Keywords? Not provided. In the old days, it was easy to see what keywords your prospects were using to find your site. But since “(Not Provided)” replaced keyword data in analytics, there have been some big changes. Many sites were over-optimized in the old days anyway. The new approach isn’t spammy; it’s about being more relevant. In the old days, you could target a broad phrase by using it multiple times, with a heavy dose of anchor text. In modern times, it’s important to “talk around” any broad phrase. If you want to be relevant for “Blue Widgets”, you must be relevant for as many aspects of the Blue Widget as possible. Consider what questions prospects are asking, what information or media exist around Blue Widgets, and so on.

In your SEO approach, always keep in mind that Google has changed quite a bit over the years. Yesterday’s approach was for yesterday’s Google. Bing and the other remaining competitors will keep changing, trying to catch up to or outdo Google’s innovations. To ensure your success, make sure your approach is in line with Google’s ongoing changes.
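As promised in point 5, here is a minimal sketch of the kind of schema markup that helps Google classify a page. The business name, address, phone number, and URL below are invented for illustration; a JSON-LD block like this simply sits in the page’s HTML:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Blue Widgets",
  "url": "https://www.example.com/",
  "telephone": "+1-303-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  }
}
</script>

Validate any markup like this with Google’s structured data tools before relying on it; the right schema type depends on what the page is actually about.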

3 Persistent SEO Misconceptions

SEO has seen many changes over the years. As marketers and small business owners have worked to understand its many complexities, several misconceptions have persisted.

Misconception #1: SEO is “free traffic”. Many small businesses are interested in SEO because they see it as “free traffic”. Tired of the ever-increasing click costs of PPC, they are drawn to the siren call of a tactic that will bring free traffic – forever. But this is a giant misconception. Search engine optimization was once a simple process of using the keywords your audience is searching for. And that worked fine – until 2001 or so. Now competitors are a bit savvier, and ranking in search engines is more like a horse race requiring real effort: server configuration, mobile responsiveness, image optimization, tagging, schema, AMP, plenty of content, and – oh yeah – the content should be interesting.

Misconception #2: SEO is a one-time project (the rules and the competitors keep changing). In the old days of websites and SEO, getting your site “SEO-ed” could be a one-time process. While the web has changed substantially, this view of search engine optimization has persisted. Modern SEO is indeed a horse race, in which competitors must constantly be bettered by constantly adding awesome content, earning and seeking inbound links, and – we think probably – social sharing and usability metrics.

Misconception #3: High-traffic keywords are the best ranking targets. High-traffic keywords can sound like the best keyword targets, but they are often the worst! High-converting keywords are best in every case. Consider this example: several years ago we received a call from a prospective client that wanted to rank #1 for “Travel”. Wow, I thought: this could be Expedia or Travelocity on the line. But it was actually a Breckenridge condominium property. Competing for rankings on the term “Travel” is a really bad idea for (at least) four reasons:

1. People searching for “Travel” do not yet know where they want to go – they aren’t necessarily looking for Breckenridge – and we don’t know whether they would want a condo.
2. In a best-case scenario, the site could get to page eight – and that still doesn’t mean any prospects would book a condo. Even page two is a ghost town, with page eight as quiet as deep space.
3. They would be competing at a huge level, way beyond what is necessary to rank number one for “Breckenridge Condo.” It’s crazy inefficient, like investing in a Triple Crown champion when you just need a healthy horse to win the race.
4. Even in a fantasy universe where a Breckenridge condo site got to number one in Google, it would receive an overwhelming number of bad leads a day.

Keyword targets are also a prequalifying process when done right. A better approach is for the condo company to first compete for exactly what they are: “Breckenridge Condo” and “Breckenridge Condominium” (these are the keywords with essentially a 100% chance of conversion). Only then should they look at broader terms likely to have some prospects: “Breckenridge Hotel”, “Breckenridge Motel”, “Summit County Condo”. This phenomenon isn’t just among condo owners – we all daydream about ranking for something that delivers huge traffic. Instead, focus on what your best customers are typing into search engines – just make sure it does have some search volume.

SEO has changed much over the years: it has evolved from a one-time process of chasing high-search-volume keywords into an ongoing process of targeting keywords with both real search volume and a high conversion rate.

Can Google read JavaScript? Yes, but can it really?

Google will eventually crawl all JavaScript, but it hasn’t been indexing JavaScript pages very successfully. Every year we hear the same story: Google says it’s getting better at crawling and indexing JavaScript. Except crawling JavaScript and crawling ALL JavaScript are clearly two different accomplishments. Google can crawl it and render it, but it just doesn’t seem to use it the same way it uses optimized content. From what we’ve seen, JavaScript pages can’t seem to rank as well in search engines. Title tags come through here and there, but not consistently. Still, with the ease of development that JavaScript frameworks offer, it can be difficult to justify optimizing with plain text and images. Here are some important questions to consider:

1. Fail gracefully. For visitors without JavaScript – bot or human – offering some sort of page content has always been important. Showing plain text and image content when JavaScript is off embraces the best practice of “failing gracefully.”

2. How quickly do you want results? For many sites, faster rankings mean a faster path to revenue. Where pure JavaScript offers a compelling business case, it could be prioritized over “search engine friendliness.” For most sites, though, the extra visibility is worth the extra work of optimizing in the most search-friendly ways possible.

3. Is Google responding correctly to a test? The entire site doesn’t have to be converted to JavaScript. Instead, use simple one-page tests and check Google’s “crawlability.” Is Google understanding the DOM and extracting titles, images, and content correctly?

4. What other Google bots need to access your content? There are actually a variety of bots across Google’s many services. Google employs specific bots for its image search, ad services, product listing feeds, etc. Try accessing these with your test. Also, definitely keep your schema/rich snippet code easily accessible: Google has specifically warned that it cannot be found inside of JavaScript objects.

5. Test with all of Google’s tools. Speaking of Google’s bots, try using Google’s many tools for understanding and analyzing webpages. Seeing problems here is a serious red flag for your JavaScript. But even if these tools render JavaScript, Google may not rank your pages as well as it would “search friendly” pages.

Fetch and render: https://www.google.com/webmasters/tools/googlebot-fetch (must be verified and logged into Google Search Console)
PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/
Mobile friendly: https://www.google.com/webmasters/tools/mobile-friendly/
Keyword Planner: https://adwords.google.com/ko/KeywordPlanner/Home (ask Google to fetch the keywords from your landing page)

Bing is rising. Google isn’t the only search engine in town. Even without Yahoo and AOL numbers, Bing’s market share has been increasing steadily year over year. Bing had 21.4 percent market share last year, not counting partnerships with Apple, Yahoo, or AOL. That’s getting to be a huge chunk of users. Bing especially has trouble with images inside JavaScript objects. Bing’s version of the fetch and render tool may display a rendered page, but Bing isn’t going to show those images in its image results, and the regular results will be inconsistent.

Social media. Plain text and image content is also ideal for social media sharing. When a page is shared, most social media sites can parse the simple text description and image right out – unless there is JavaScript.
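As a rough sketch of what that plain, parseable markup can look like – the titles, description, image, and URL below are all hypothetical – static Open Graph and Twitter Card tags sit directly in the raw HTML rather than being injected by JavaScript:

<!-- Hypothetical static social tags in the raw HTML, not injected by JavaScript -->
<meta property="og:title" content="Blue Widgets – Example Co.">
<meta property="og:description" content="Hand-built blue widgets, shipped from Denver.">
<meta property="og:image" content="https://www.example.com/images/blue-widget.jpg">
<meta property="og:url" content="https://www.example.com/blue-widgets">
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Blue Widgets – Example Co.">
<meta name="twitter:description" content="Hand-built blue widgets, shipped from Denver.">

Because these tags live in the raw HTML, a sharing crawler that never executes JavaScript can still build a rich preview.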
For most social networks, rich snippet markup such as Open Graph and Twitter Cards (like the sketch above) can help with the established players – but with new networks (WhatsApp, Snapchat, etc.) popping up every year, it is best to expose the page content as plain text. Google’s JavaScript support is constantly improving, but having a JavaScript app on the landing page is often needlessly complex. As of this writing, having an optimized version does still appear to be necessary. Maybe next year’s announcement that Google is crawling JavaScript will be followed by a more robust crawl, but there are plenty of other sites embracing “search engine friendliness”; your site should too, in order to stay competitive.

The care and feeding of images: Optimizing your site’s images

SUMMARY OF SEARCH | April 2016

Google’s recent changes to search results mean you can expect organic traffic to decline: there are more ads at the top for many queries, and Google also appears to have expanded the display of images in search results. There wasn’t an official announcement, but anecdotal evidence from the last several weeks points that way.

Speedy

Google loves speed, because users love speed. A search engine that delivers speedy results can certainly expect to dominate market share. With the exponential rise of mobile search, speed is more important than ever.

– Images should be sampled down to 72 dpi/ppi. If needed, 96 ppi should be the absolute maximum. In photo editing apps such as Adobe Photoshop, this is usually found under a menu item such as “Image Size.”

– Try to scale images appropriately. Increase width if needed, but rely on recommendations from http://gtmetrix.com and https://developers.google.com/speed/pagespeed/insights/ to gauge the best size (one or both will recommend images be scaled down, if needed). Experimentation here will help optimize user experience for the best load times, and that’s a great investment of time. When editing your photos, this is found in the same place in your image editing app.

Relevant

Google’s patents around reading text in images go way back. But they are not perfect, and if your image is of a certain item like a punching bag, there is no way for Google to instinctively “know” that.

– Use keywords in the image filenames. Use dashes instead of spaces or underscores between words. It used to be hotly debated by techies, but it is now mostly accepted that Google doesn’t see underscores as spaces. Dashes are much better, and an improvement for your human audience as well: image filenames with a space between words can look like punching%20bag.jpg to users, instead of the more pleasing punching-bag.jpg.

– ALT tags with keywords describing the product. Use “punching bag” or “martial arts punching bag” instead of just “bag”. Use model numbers and serial numbers in ALT tags where appropriate. But not every image needs an ALT tag – the decorative squiggle image your site might use in its footer doesn’t really need one.

– Use the title attribute for images. The (lesser) title attribute for images can usually be filled with the same content as the ALT tag. In some browsers, this text will pop up when a user hovers their mouse over the image. Consider situations where you might want text other than the ALT text here, but they are often very similar.

– Put captions below the photos. Text content in the same <div> tag as the photo will help describe your images to Google. Or use the <figcaption> tag when using the <figure> tag for images. (A combined markup example appears at the end of this article.)

Rankbrain

Google’s Rankbrain is an artificial intelligence system that helps Google return the most relevant search results for users. If users expect – and especially click – images for a certain query, Rankbrain is going to show more images for those queries.

– Prioritize images for related queries. When someone types in a query like “photos of dogs”, Rankbrain correctly guesses that a large block of dog photos should be shown.
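Pulling the checklist above together, one optimized image might be marked up roughly like this – the filename, model number, alt text, and caption are hypothetical:

<figure>
  <img src="/images/martial-arts-punching-bag.jpg"
       alt="Martial arts punching bag, model PB-200"
       title="Martial arts punching bag, model PB-200"
       width="600" height="400">
  <figcaption>The PB-200 freestanding punching bag in our Denver showroom.</figcaption>
</figure>

The dashed filename, descriptive ALT text, matching title attribute, explicit dimensions, and a caption inside <figcaption> cover the speed and relevance points in one place.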

Title tags & Meta Descriptions: Technical SEO is the Foundation of Engagement SEO

SEO still begins with being friendly to the bots. This “Technical SEO” is focused on helping bots understand a page, so that humans get a chance to engage with it. Once visitors can see a page, Google can weigh more of the engagement metrics such as organic click-through rate, bounce rate, time on site, conversion, etc. It’s a mystery – and a controversy – whether the current Google algorithm uses these engagement metrics, but pretty much everyone agrees Google will move in that direction. With title tags, there are nuances for the crawlers, nuances for the humans, and the sweet spot is where those two worlds connect. If the title tag most appealing to crawlers is also the one your audience will find enticing in Google’s search results, you are on the right path. Engagement SEO is the hard part, but let’s start with the basic requirements, aka Technical SEO.

Technical SEO

1. Title tags
Valid tags (no single quotes or trying anything cute)
Not too long, not too short (50-55 characters is usually best)
No duplicates: every page should be unique, so every title tag should be too!
Your keyword targets should be in your title tag, because your page is about them.

2. Meta descriptions
Valid tags (we only mention this because we’ve seen some crazy code out there)
Not too long, not too short (roughly 155 characters is the practical maximum)
No duplicates: every page should be unique, so every description tag should be too!
Your keyword targets should be in your description, because your page is about them.

Engagement SEO

Engagement SEO is user-focused, and it is only possible once enough of the technical SEO requirements are in place to give the site visibility. Engagement SEO maximizes whatever visibility the technical SEO provides, and it includes directives to maximize engagement in search results, on landing pages, and throughout the entire buying journey of your prospect.

1. Title tags
In most cases, Google uses your title tag as the blue link for your page in its search results.
Use AdWords to test variations of ad titles. Put the best performers (and variations) into your title tags.
No duplicates: let the user know how this page differs from others you might have on a similar topic. Help them get to the correct page first – and know that Google is watching over their shoulder.
Your keyword targets should be in your title tag, because your page is the answer to the user’s query. When users see the query they typed right there in your title tag, it’s powerful. Google may not bold the keywords in the “ten blue links”, but Bing and other engines do.
Social media sites often use the title tag for their “blue link” when something is shared, too!

2. Meta descriptions
In most cases, Google uses your meta description as the black text snippet for your page in its search results.
Use AdWords to test variations of meta descriptions, too. Maximize the research you can get from those PPC campaigns!
Use calls to action, and entice your prospects to click. Did you know you can break most of the rules of AdWords here? (Don’t get crazy with the exclamation marks, though!)
No duplicates: describe and inform your user about which query this description is meant to answer.
Your keyword targets should be in your description, because your page is about them. Google bolds the keywords your prospect typed in, right there in the search results.
Social media sites often use the meta description when something is shared, too!

Titles and meta descriptions must be enticing to searchers.
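As a quick sketch of what the checklists above produce – the page, brand, and wording here are invented for illustration – a product page’s head might carry tags like these:

<head>
  <title>Blue Widgets for Small Retail Shops | Example Co.</title>
  <meta name="description" content="Compare hand-built blue widgets, see pricing, and get free shipping on orders over $50. Request a quote today.">
</head>

The title stays near the 50-55 character sweet spot and leads with the keyword target; the description stays under roughly 155 characters and ends with a call to action.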
Don’t settle for title tags and meta descriptions that your web developer created to “SEO your site” – raise the bar. These vital tags are used for more than communicating with bots. They are a prime location to entice searchers with keywords and calls to action. That’s engagement with your user from the very start.

HTTPS is Quickly Becoming A “Must Have”

Google is really pushing SSL, with some recent announcements aimed at getting the web secure. Sites still on http – in whole or in part – need to migrate entirely to SSL early this next year to stay in Google’s good graces. With “Mobilepocalypse,” we saw that Google makes an announcement before it brings the “beat down.” So now is the time to start planning your migration.

SSL is a good change

Having SSL encryption between server and user prevents the eavesdropping and “man-in-the-middle” attacks that threaten both site visitors and the site itself! While SSL comes in different strengths of encryption, and has gone through a few security issues in the last couple of years (“Heartbleed,” etc.), it’s an improvement for both privacy and security. Also, more and more organic traffic is being counted as Direct traffic in analytics: referrer information is dropped whenever a visitor goes from an https site to an http site (per the HTTP specification, RFC 2616). We are seeing a sudden surge in this “some organic has become direct” pattern across client accounts.

It’s a full website migration

Changing an entire site to https can be a pretty major website migration, as all URLs change. All internal and external links, scripts, images, iframes, sitemap.xml URLs, canonical tags, and other tags need to be checked.

301 redirects

In the migration, every URL that starts with http will need to be 301 redirected to its https equivalent – and it needs to be a single hop. We’ve seen server configurations with a whole chain of redirects going from a 301 to a 302 and back to a 301, so it’s good to check redirects thoroughly. Without a clean, single 301 redirect, link authority is discarded by Google – and that means rankings are not maximized.

Links / references to http resources and URLs

After the migration, all http URLs should be 301 redirecting – and all references to them within the site’s HTML should be updated as well (see the sketch at the end of this article). We’ve seen many internal links written as absolute links, such as <a href="http://www.site.com/contact-us">Contact Us</a>, instead of relative links such as <a href="/contact-us">Contact Us</a>.

Page speed

Keep an eye on page speed before and after the transition to SSL. The server’s processor needs a bit more time for SSL handshakes and encryption, and server response times can increase a little or a lot depending on its configuration. It’s best to go into the migration with a super-fast website.

Google Search Console

In Google Search Console (formerly Webmaster Tools), it’s important to verify the SSL version of your website. If Bing is a significant source of traffic, you should also verify the SSL version of the site there.

Inbound links

After the migration, it’s a great idea to reach out to the important sites linking to you. Not only is it an excellent way to maintain those relationships, but having inbound links updated to reference your https URLs maximizes link authority – a direct link to the new URL is better than a 301 redirect.

Consider a press release

Why not consider a press release afterward, professing your commitment to the security and privacy of your visitors? If you haven’t made the switch to HTTPS, now is the time to start. And if you have made the switch, be sure to have us double-check your work!
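As referenced above, here is a hedged before-and-after sketch of the kinds of references to check during the migration – the domain and paths are hypothetical:

<!-- Before: http references that will break or leak authority after the migration -->
<link rel="canonical" href="http://www.example.com/contact-us">
<script src="http://www.example.com/js/site.js"></script>
<a href="http://www.example.com/contact-us">Contact Us</a>

<!-- After: https (or relative) references throughout -->
<link rel="canonical" href="https://www.example.com/contact-us">
<script src="/js/site.js"></script>
<a href="/contact-us">Contact Us</a>

Relative URLs for internal links and resources mean the references survive the protocol change without a second round of edits.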

SEO Forecast 2016: What Needs To Be On Your Radar

The Search Engine Optimization (SEO) world continues to evolve at breakneck speed. Constant change is the only constant, as even the machine learning that governs the algorithms is in constant flux. The year 2016 will be no different. Here are four things to have on your radar and focus your efforts on in 2016:

1. Mobile

Mobile is seen as last year’s concern in many ways, but there is much to do beyond Google’s mobile friendly tool: https://www.google.com/webmasters/tools/mobile-friendly/ Speed is everything, and it takes on extra importance for mobile. Marketers must make sure they pass Google’s speed test: https://developers.google.com/speed/pagespeed/insights/ Just because a factor isn’t surfaced in Google’s own public test doesn’t mean Google doesn’t notice it. Internal tests, and the engagement metrics from actual users, are influenced by speed, so it’s increasingly vital. So take a good look at speed suggestions from third-party sources such as https://gtmetrix.com/ Searchmetrics recently released their first-ever Mobile Ranking Factors report and found the 10 highest-ranking pages take an average of 1.10 seconds to load!

2. Rich Snippets / Schema

Seen review stars in Google’s results before? These are likely from “schema” code elements on the web page itself (a markup sketch appears at the end of this article). Code elements such as rich snippets have been shown to boost a site’s click-through rate in organic results, increase visibility, and increase sharing – and they also help communicate with Googlebot and other search engine crawlers. A variety of rich snippets should be employed on your site this year: schema elements have been shown in various studies to increase rankings by four places and click-through rates by 30%. Different schema rich snippet examples can be found at http://schema.org, which is a collaboration between the major search engines. Some types include products, ratings, events, recipes, locations, people, etc.

Social media networks have come up with their own ideas for rich snippets as well:
Facebook Open Graph tags
LinkedIn cards
Twitter cards
Pinterest pincards

Consider how a Tweet of a site using Twitter Cards looks better than a standard Tweet: it shows a larger image on Twitter, which provides a much richer experience for viewers of that Tweet. And there are many different types of Twitter Cards, too: galleries, large images, video players, etc. As Google tries to answer many queries right in its search engine results, you may need to provide “rich answers” to Google. This can mean extra brand visibility, but less traffic to your site. Still, if Google is featuring a brand at the top of its results, you’ll want it to be yours!

3. Distribute Your Content for Link Authority

Content marketing is the new link building. Many brands are creating content, but not marketing it. Creating relevant content for your prospects is authentic marketing. With the right approach, that content is a valuable asset and can also boost rankings! When done correctly, content marketing brings:
social sharing
brand visibility
inbound links (with authority)
referral traffic

4. Be Ready for More Penalties

Google’s Panda penalties are assessed against sites that do not have their house in order, and trust us that sites can go into disarray very easily! Broken links, missing pages that aren’t redirected, thin pages, and duplicate content are all challenges websites deal with. But Google’s Phantom updates and Penguin penalties have also had a tremendous impact on website visibility. Be aware, and be ready!

Search Engine Optimization will always be ever-changing: technology is moving at breakneck speed, and search engines have ever-changing criteria and expectations. Having these four items on your radar will help carry you nicely into the New Year, and then some. The year 2016 may be completely different, but these are good, solid investments of time and money.
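As mentioned in item 2, here is a rough sketch of the kind of schema markup that can produce review stars – the product name, rating, and review count are invented for illustration:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget Pro",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>

The ratings shown must reflect real reviews on the page; markup that doesn’t match visible content risks a manual action rather than stars.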

Penguin Update Coming: An Ominous End to 2015

Gary Illyes of Google has indicated the next Penguin update will arrive by the end of the year. While not acknowledged by Illyes, this Penguin update is rumored to be continuous – a permanent part of Google’s algorithm. If the rumors are true, it will be welcome relief for the many sites that have been penalized, made amends (through disavowing, etc.), and then waited up to a year to see the penalty lifted. There has actually been some confusion, spread by Google, as to whether these updates are “realtime & continuous” or not. Gone are the days when a penalty could be matched up between the date of a Google update and a sudden loss of rankings and traffic. Here’s how to prepare:

1. Use your keywords, in a natural way. Be relevant for your keywords, but don’t get crazy. Make sure you are not over-optimizing your site around a limited set of keywords. Consider that the more competitive the topic, the more content you need AROUND the topic. The wrong approach is repetitive copy that goes beyond natural, human-readable content.

2. Disavow bad links. Links can go bad: sites get penalized, or Google changes its guidelines. Regardless, link disavowal is an important part of modern SEO. It can be hard to find all sources of links, but start with Webmaster Tools. We also subscribe to many tools for link research: Google hasn’t found all of your site’s bad links yet, and it doesn’t report all of the links it does find!

3. Maximize your good links. Many times pages are moved or removed from a site, and Google stops counting the inbound links (from other sites) to those pages. What a huge loss of authority this can be! Now is the time to maximize your existing link authority: simply 301 redirect old URLs to their proper new locations.

4. Get more good links. Google weighs hundreds of factors in ranking websites, but links have consistently been shown to be a top factor. The best way to get great links is to be awesome, or produce awesome content – and then get the word out!

And Google Panda is allegedly STILL rolling out – since July! So make sure your technical SEO house is in order.