President of Hyper Dog Media to Speak at Thrive Workplace AI Panel

Hyper Dog Media announced that its President, Jim Kreinbrink, will be among the featured speakers at Thrive Workplace's upcoming event, AI IRL: Real Tools for Real Business, taking place on Thursday, September 25, from 2:00–3:30 PM at Thrive's Centennial coworking location in south Denver.

The panel will bring together prominent local business leaders to share practical insights on how artificial intelligence tools are being used to enhance productivity, streamline workflows, and improve marketing strategies. Hosted by Thrive Workplace, the event is designed for business owners, entrepreneurs, and professionals eager to learn how to apply AI in real-world scenarios without the risk of being replaced by it. Attendees will hear from experts who are actively leveraging platforms such as ChatGPT, Perplexity, Make, Lovable, Claude Code, Windsurf, and many other AI tools to solve everyday business challenges.

For Jim Kreinbrink, the opportunity to share his experiences underscores Hyper Dog Media's ongoing role as a resource for effective digital strategies. "AI is no longer just a buzzword—it's a set of tools that can be implemented immediately to save time and deliver measurable results," Kreinbrink explained. "As part of the Thrive Workplace AI panel, I look forward to sharing with businesses how they can take advantage of these new innovations while keeping their human creativity and strategy at the core of their work."

As President of Hyper Dog Media, Kreinbrink has guided organizations through the evolving landscape of search engine optimization, paid advertising, and digital branding. The company has consistently worked with businesses to build sustainable growth through online visibility. By combining advanced technologies with proven digital marketing expertise, Hyper Dog Media helps clients design and implement digital marketing strategies in a digital-first economy.

The September panel will showcase not only how AI can be integrated into marketing, but also how it can support operations, customer service, and overall business efficiency. The session promises practical takeaways, ranging from ready-to-use prompts to time-saving workflows that attendees can apply immediately. Following the discussion, Thrive Workplace will host its Spritz Happy Hour at 3:30 PM, a casual, member-driven networking reception for attendees to continue the conversation and connect with local professionals.

Community events like this highlight the importance of shared knowledge among Denver-area businesses. The inclusion of voices such as Kreinbrink's gives attendees a grounded perspective from a leader who has not just been talking about AI but actively using it in day-to-day work for the past few years. Hyper Dog Media's participation reflects a larger commitment to helping organizations use technology to achieve meaningful, long-term results.

"This event is about collaboration and learning," Kreinbrink added. "No matter the industry or size of a business, AI can offer tools to work smarter, not harder. I'm excited to exchange ideas and provide practical resources that people can put into practice right away."

Those interested in learning more about Hyper Dog Media's digital marketing services can visit https://www.hyperdogmedia.com/ for additional information.

Hyper Dog Media Would Never Disavow a Link, Challenges Aging Best Practices

Hyper Dog Media, an innovative digital marketing agency specializing in Search Engine Optimization (SEO), has recently taken a definitive position on a topic that has long sparked debate within the SEO community: the practice of link disavowal. In a comprehensive blog post, the agency argues against the necessity of disavowing links, stating that it would "never ever disavow a link—probably." This pronouncement reflects the agency's commitment to mastering Google's complex algorithms and best practices, and its insight into how search engines currently evaluate and weigh backlinks.

The foundation of Hyper Dog Media's perspective is a careful analysis of how Google's algorithm has progressed over the years. Today, the algorithm can automatically discern the quality of links and penalize or ignore those identified as spam or of poor quality. The blog post explores the original intention behind Google's disavow tool, which was designed as a measure of last resort for websites grappling with severe penalties due to an accumulation of harmful backlinks. The emphasis, however, is on the hidden risks associated with the tool, which may not be obvious at first examination.

"Google has made significant strides in how it evaluates links, making the disavow tool essentially redundant for the majority of SEO scenarios we encounter," Jim Kreinbrink, the founder of Hyper Dog Media, explains. "Our stance is informed by a deep understanding of these technological advancements. Instead of relying on disavowal, we advocate for the development of resilient SEO strategies that preempt the need for such extreme measures, choosing rather to concentrate on the generation of high-caliber content and the fostering of trust through effective link building."

Hyper Dog Media's blog post serves as a caution against hasty decisions to use the disavow tool. The distinctions between harmful, neutral, and beneficial links are nuanced, and a misjudgment here can inadvertently sabotage a site's SEO performance. Aligning with advice from Google's own spokespeople, Hyper Dog Media concurs that the cons of disavowing links generally outweigh the pros unless Google has applied a manual penalty.

"Our approach champions the pursuit of sustainable SEO success," Kreinbrink continues. "Hasty decisions, such as the blanket disavowal of links, risk derailing the strategic objectives we've set for our clients. Our methodology is to adopt strategies that organically elevate our clients' online authority and reputation in a positive and lasting way."

By articulating these insights and the rationale behind this policy, Hyper Dog Media aims to shed light on its nuanced and sophisticated understanding of SEO. The policy informs the agency's wide array of services, including, but not limited to, pay-per-click advertising, conversion optimization, and social media optimization. The overall strategy is aimed at enhancing a business's online visibility without resorting to the expedient of link disavowal. Such an approach is not only in accord with Google's evolving algorithms but also embodies the best practices that define successful SEO in the contemporary digital landscape, underscoring the importance of a deep, strategic understanding of SEO beyond superficial fixes. For more information about Hyper Dog Media and its services, visit their website.
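
For readers who have never used the tool in question: a disavow file is just a plain text file uploaded through Google's disavow tool, listing individual URLs or entire domains (via the domain: prefix) that Google should ignore. A minimal illustration, with hypothetical domains:

    # Lines beginning with # are comments and are ignored.
    # Disavow a single page:
    http://spam.example.com/paid-links/page1.html
    # Disavow an entire domain:
    domain:low-quality-directory.example.net

As the post argues, a file like this should be a last resort, generally reserved for sites under a manual penalty.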

Hyper Dog Media PPC Strategy

Our approach is to always save the ad budget for our best prospects. We don't want broad campaigns to burn through the budget and cost us maximum visibility in front of the best prospects. We use settings such as Network, Location, Demographics, Audiences, Devices, Time of Day, Day of Week, and Negative Keywords to focus campaigns. We fully fund these focused campaigns, while running broader campaigns only where needed. We don't want to starve the funnel, but ad networks have made quite a bit of money on waste and broad targeting.

Other aspects of our approach:
– Favoring bidding methods that give us control and data. That means favoring manual CPC and tight, data-driven management over Google's black-box automated bidding. Some prospects would rather pay more per click than hire a firm to manage their campaigns; automated bidding and the old "AdWords Express" were created for those people. Those aren't our prospects.
– Testing EVERYTHING for conversions and click-through rate: keywords, ad copy, landing pages.
– Creating ads and landing pages for every kind of call to action.
– Remarketing/retargeting to prospects, with ads and landing pages that match where they are in their buyer's journey.

Using the lessons of PPC to inform SEO decisions:
1. Title tags and meta descriptions. Clicks are expensive, so it's vital to apply those lessons everywhere. We can raise organic CTR, which brings more visitors and may be a ranking factor (the RankBrain part of Google's algorithm watches what people are clicking!). See the sketch at the end of this post.
2. Site structure. We organize ad campaigns around types of prospects. Websites should be organized similarly, into "silos". Google rewards this structure, and it makes sense. On an economic development website, the visitors include existing business members, prospective members, and site selectors. Each has unique needs and should have a unique area of the site.
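
As a concrete illustration of point 1 above, here is how a winning ad headline and description might be carried into a page's title tag and meta description. The business and copy are hypothetical; the point is that copy already proven to earn clicks in PPC is a strong starting draft for organic snippets:

    <title>Denver Emergency Plumber | Open 24/7 | Example Plumbing Co.</title>
    <meta name="description" content="Burst pipe? Our licensed Denver plumbers arrive within the hour. Upfront pricing and a 5-year guarantee. Call now.">

The same testing discipline applies here as in the ads themselves: try variations and keep the copy that earns the higher organic CTR.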

The only constant in Organic Search is change

October 2012 was another busy month for Google. The search giant started the month by announcing 65 changes made during August and September. Google also pushed out a new Penguin update (v3) on October 5; these Penguin updates penalize the overuse of keywords both on a website and through links. We have had a few clients with really bad, and sometimes profane, links. They may consider Google's new disavow links tool, just released. But we recommend caution with the tool right now: some SEOs are speculating that Google may see using it as a confession!

Information also came out early in the month about Google penalizing domains that were more "keyword rich" than authoritative. This Google update (called EMD, for Exact Match Domain) is hitting domains like cheap-flights-from-denver.com. They would have been favored in the past for searches like "cheap flights from Denver", but no longer. Authoritative sites were not hit, though: ski.com still ranks #1 for "ski". Google also updated its penalty for "top heavy" sites, those with too many ads at the top of the page.

Highlights of Google's 65 recent changes include:
1. Changes to titles and snippets. Google is increasingly treating robots.txt directives, title tags, and meta description tags as "suggestions" from webmasters. Sometimes this can be helpful, such as when titles include "comments on" or other generic phrases. Other times, Google's choices may directly conflict with choices the webmaster has made.
2. Google is matching more synonym-like terms and expanding its autocomplete suggestions. A search for "telecom provider" returns results where the term "carrier" is bolded as well as "provider". Google is sure getting smarter, and it's a good time to diversify keywords!

The Google webmaster guidelines were also updated this month, reflecting Google's move away from counting low-quality directory sites as well as low-quality bookmarking sites. There wasn't much news for Bing this last month, but a recent report from antivirus vendor Sophos found that Bing search results contained more than twice as many malware-infected pages as Google's search results (which still came in at a hefty 30%).

Google My Business for Your Business

Businesses thrive when they have an effective way for customers to find them on Google, the search engine most frequently used by your future customers. One of the most effective ways to ensure your business is found is with citations. A citation is any mention of your business online. A structured citation is a mention of your business on a directory such as Google My Business or Yelp; an unstructured citation is your business information (NAP: Name, Address, Phone Number) appearing outside a business directory, which could be anything from an article about your company to a mention of your business on a vendor's website. Citations are important for Local Search, as they give search engines your business information from across the internet. If you want to rank in Google's Map Pack, you'll want to start by making sure your Google My Business (GMB) listing is properly optimized and maintained. Here's how:

Who Owns Your Listing?
If you're not sure who owns your GMB listing, or you don't remember which email address you claimed it under, don't worry! You can request ownership by creating a Google My Business account and searching for your business listing. If it's already claimed, you can request ownership at this point; if it isn't claimed, you can request a postcard be mailed to your business address to verify that you are who you say you are and that this business is indeed yours.

Search For Duplicate Listings.
SEOs know there is nothing search engines hate more than an incorrect NAP on a citation listing. Second to that would be duplicate listings. Google looks at the duplicate listings for a business, picks the one it likes best (regardless of whether the information is correct), and shows that listing in search. So, how can you tell if you have a duplicate business listing in Google? It's simple: search for your address and select the Maps results. This will show every Google My Business listing for that location. If you see a duplicate of your business, you can claim that listing and merge it with the correct one.

Is Your Map Marker Correct?
There is nothing more frustrating as a user than finding incorrect information on a business listing: wrong hours of operation, a listing for a business that is no longer at the address, and the dreaded misplaced map marker. When users are getting directions to your business address, they'll often look at your listing in Maps to see where you are located. Especially if a user is familiar with the area in which your business is located, they may skip the directions altogether. Make sure your map marker is in the correct place by updating the address in your listing information and moving the map marker to the correct spot. Trust me, your users will appreciate it!

Optimize Your Listing.
Optimizing your Google My Business listing is a lot easier than it sounds. You want to make sure your business name is correct, your address and phone number are correct, and, as we covered above, your map marker is in the correct place. Additionally, you want to make sure the correct business categories are selected so users know exactly what your business does. Make sure your hours of operation are correct, and add additional hours of operation for holidays so your customers know when they can and cannot reach you. Add images of your business so users will know when they've found you, and add attributes so they know what features you offer! GMB also recently brought back the description section, so you can tell users more about your business. Be careful, though; getting too crazy with keywords can cause Google to hide your listing in search. A good rule of thumb is to not leave any field blank, but to keep your listing as organic as possible!

Keep Responding To Your Reviews.
This is arguably the most difficult part of maintaining a Google My Business listing. Fortunately, every time you get a review, Google will email you at the address with which you claimed your listing. However, many business owners find this task daunting, especially if they are getting negative reviews. Think of it this way: you can't make every customer happy. Users know that, and they typically find businesses with 100% five-star reviews untrustworthy. Negative reviews are a normal part of doing business, and responding to these reviews shows you care about customer service. Google My Business is a platform where unhappy customers can come and express their frustrations with your business, and how you respond says a lot about you. Don't offer coupons or discounts to lure the customer back; instead, express your concern and give them a phone number or email address to contact you directly to resolve the issue. This turns a negative review into a positive experience and shows Google you're interacting with your customers, which helps boost your rankings. It's a win-win!

User Suggested Edits.
Google allows users to suggest edits to business listings directly from search. This means if I know a business location offers bathroom access, has a different phone number, or is missing a suite number, I can suggest the update directly from search. When you log into your GMB account, you'll find a yellow banner across the top of your listing prompting you to approve user-suggested edits for your business. Sometimes Google will publish these edits if they go unapproved by the business owner, or if the listing is unclaimed. So it's very important for business owners to check in on their listing frequently to make sure the information isn't being changed.

Making sure your Google My Business listing is properly optimized (and stays that way!) is the first step toward achieving local search rankings. Google My Business is just one piece of a very extensive puzzle, but once you master your Google My Business listing you can easily begin claiming and optimizing …
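
One way to reinforce the consistent NAP discussed at the top of this post is to publish the same details on your own website using schema.org LocalBusiness markup. A minimal sketch, with hypothetical business details:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Coffee Roasters",
      "telephone": "+1-303-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "80202"
      },
      "openingHours": "Mo-Fr 07:00-18:00"
    }
    </script>

Matching this markup exactly to your GMB listing gives search engines one more consistent signal about who and where you are.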

6 Changes in Google Search

Google has made many changes over the years, other engines have followed suit, and SEO has evolved along with these changes. Consider these 6 ways Google has changed over the last several years.

1. More pages are not necessarily better. Google used to reward what would now be considered duplicate content. Endless search results pages, doorway pages, and many other techniques of the past are easily detected by the modern Googlebot. In today's world, these techniques can be ignored, or even penalized. Where quantity once ruled supreme, now quality does. Many sites are pruning, combining, or redirecting the flood of URLs from the old days. If you are tempted by these old techniques, consider that you will likely have to undo the changes.

2. CSS and JS should not be blocked. It used to be a best practice to block Google from JavaScript and CSS resources, as they could otherwise show up in the index, and having those as landing pages was just horrible. But modern Google is very smart: it wants access to everything, and it needs everything to fully render the page. In accessing these resources, Google analyzes mobile friendliness, speed, layout, and many other factors. (See the robots.txt sketch at the end of this article.)

3. Get only good links. From the start, Google has weighed links very heavily. SEOs used to be able to get websites to rank without even improving the site! And in the old days, any link helped, or was disregarded at worst. In modern Google, links should come from the best sources. Links from penalized, unimportant, or even brand-new sites are risky and can now cause a Google penalty. A typical link profile contains a mix of these link types, and the ratios should be monitored; some low-quality links are best disavowed, and a high ratio of any one type can be a red flag to Google. It's best to invest your time in getting the best links.

4. Google wants to understand you. Google wants to understand concepts better, and it wants to understand you better, too! With the advent of Hummingbird and RankBrain, Google is getting smarter and smarter. Hummingbird was Google's update to help with classifying content. RankBrain is an artificial intelligence update that helps Google understand what sort of results a given query is really looking for. Consider that these similar queries are actually quite different:
https://www.google.com/search?q=windows+update
https://www.google.com/search?q=windows+replacement
Think about your prospects' most important queries driving your traffic. Are you delivering what they are looking for?

5. It's not just 10 blue links. Google has made many changes over the years, and what began as a simple list of 10 blue links has evolved into a wide variety of possible results. Results can now include answers, cards, carousels, images, videos, and more. And voice results are becoming increasingly valuable for some queries. Getting to "number one in Google" isn't quite what it was: number one might sit below a block of images or an answer ABOVE the number 1 position. A modern approach is key to being successful in today's Google. Images should be named, tagged, and captioned appropriately. Schema should be used to help Google understand and classify your content, and even your site. For those who commit to helping Google understand their content, the reward is visibility in a multitude of ways.

6. Keywords? Not provided. In the old days, it was easy to see what keywords your prospects were using to find your site. But since "(Not Provided)" replaced keyword data in analytics, there have been some big changes. Many sites were over-optimized in the old days, anyway. The new approach isn't spammy; instead, it is about being more relevant. In the old days, you could target a broad phrase by using it multiple times, with heavy use of anchor text. In modern times, it's important to "talk around" any broad phrase. If you want to be relevant for "Blue Widgets", you must be relevant for as many aspects of the Blue Widget as possible. Consider what questions prospects are asking, what information or media exist around Blue Widgets, and so on.

In your SEO approach, always keep in mind that Google has changed quite a bit over the years. Yesterday's approach was for yesterday's Google. Bing and the other remaining competitors will keep changing, trying to catch up to or outdo Google's innovations. To ensure your success, make sure your approach is in line with Google's ongoing changes.
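
To make point 2 concrete, the old practice looked like the first robots.txt sketch below, and the fix is simply to remove those blocks or explicitly allow the resources. The paths are hypothetical; Googlebot supports the Allow directive and the wildcards shown here:

    # Old practice - keeps Google from fully rendering your pages:
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # Modern practice - let Googlebot fetch everything needed to render:
    User-agent: Googlebot
    Allow: /*.css
    Allow: /*.js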

4 Reasons Why Organic Traffic Can Stay the Same – Even When Rankings Go Up

The amount of organic traffic coming to a website is an important measurement of SEO success, but several factors can cause fluctuations, or even decreases, while rankings stay stable.

Four Ads at the Top
In the last year, Google has removed text ads from the side of its search engine results pages (SERPs) and now places up to four at the top. For many competitive queries, this means less visibility. In many cases, the #1 organic position is now below the fold! That dramatic shift in position means fewer clicks. According to a 2014 study, these are the percentages of clicks a listing can expect in each of Google's top 5 positions:
1 – 29%
2 – 15%
3 – 11%
4 – 7%
5 – 5%
The dynamics change considerably when more ads push a number 2 position down to where it might receive 7% or 5% of the clicks! For many competitive keywords we are tracking, this is the most dramatic shift we've seen for organic traffic. It is also possible to "cannibalize" your organic traffic with PPC where your site was already at the top. So be careful out there, and check your most important SERPs.

Search Volume Has Decreased
Another reason organic traffic can decrease is trends or seasonal fluctuations. Many businesses do have seasons, and year-over-year traffic is the better measurement. And don't forget to check https://trends.google.com/ for trends in the queries your visitors might be using.

Organic Traffic Counted as Direct Traffic
There are a few ways organic traffic can show up as direct traffic. If it's a mystery why organic traffic is decreasing, check direct traffic in Google Analytics. Where direct traffic is soaring, Google Analytics may not be seeing the true source (aka referrer) of the traffic. There may be a couple of reasons:

– Redirects. We've seen many strange redirects over the years, enough that this is worth mentioning. Referrer information can be removed when redirects are done via programming languages, or even in a chain of redirects that crosses to HTTPS and back. (See the sketch at the end of this post.)

– Certain browsers block information. There have been periods in which Safari blocked referrer information. On sites with heavy iOS traffic, the effect is easier to spot. But for many sites, this can be a difficult blip to locate.

Decreased Number of Pages or Products
For eCommerce sites that have dropped product lines for business reasons, a loss of organic traffic for those keywords will eventually be seen. Pages that are redirecting or missing will eventually drop from Google's index, and organic traffic can suffer. However, if you are trimming low-quality pages, that is certainly worth the short-term decrease in your traffic! Quality is still king, and Google can see whether a page is being visited, shared, or linked to. So don't stop pruning your site.

These four situations explain the cases we've found where rankings might stay the same (or even improve) with no commensurate increase in organic traffic numbers. Be sure to check this list next time you find yourself wondering, "Where did all of the organic traffic go?"
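
On the redirect point above: where possible, use a single server-side 301 rather than a chain or a script-based redirect, since those are the cases where we have seen referrer data disappear. A minimal sketch, assuming an Apache server and hypothetical paths:

    # .htaccess: one clean 301 hop, no chain through HTTP/HTTPS and back
    Redirect 301 /old-page https://www.example.com/new-page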

Speed is Everything

Page loading speed carries great weight with Google these days. From mobile visitors to Googlebots, every visitor will appreciate a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile. The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week about just how impatient they are:
– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.
There are many easy changes available for sites to make, as the answer isn't always purchasing a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer": 30% of pages could save more than 250KB that way.

2. Ranking factor. A few years back, Google made page speed a small ranking factor, or at least finally became explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a great signal that this needs to be prioritized.

3. Bounce rate. Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as page load time grows from 1 to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, it knows human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate. In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity, and if you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign. After all, even reasonably fast sites can often use more crawl budget.

Tools and Fixes
Luckily, there are remedies. Some can be quite easy, such as adding compression to your web server (see the sketch below). Others might require a trip to Photoshop for your site's images. However, some items will not be worth fixing. Try to concentrate on the easiest tasks first. Run an analysis of your site through these two tools and see what you need to fix:
– Google's newest tool: test how mobile-friendly your site is.
– GTmetrix.com: features include a "waterfall" showing which page items load at which stage, history, monitoring, and more.
Good luck, and enjoy optimizing the speed of your site!
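
For the compression fix mentioned under Tools and Fixes, here is a minimal sketch for an Apache server using mod_deflate (assuming the module is available on your host; other web servers have equivalent settings):

    <IfModule mod_deflate.c>
      # Compress the text-based assets that benefit most
      AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
    </IfModule>

Run your pages back through the tools above afterward to confirm the savings.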

Google Analytics Doesn’t Provide all of the Answers

Google Analytics has become a great source of data about visitors to your website, assuming your configuration is correct. Sometimes configuration issues inadvertently block your view of what is really happening. Common issues include:

1. Not having your analytics snippet in the correct place. There are many legacy variations of the analytics snippets, and what was the correct installation a couple of years ago may have changed dramatically, depending on whether you have an asynchronous snippet, etc. We still run into snippets calling for urchin.js, which are quite a few years old. The best place, currently, to have your analytics code is inside the <head> tag, right before it ends with the </head> tag (see the sketch at the end of this post). This will prevent interference from other scripts, which we have seen mess with bounce rates, conversion tracking, ROI, sleep schedules, general happiness, and more.

2. Filters. Your filters could have been created years ago, for long-forgotten purposes. In Google Analytics, check your Admin area (under View, on the right, halfway down) to see if you are filtering traffic. Look at the filters: do you know who created them and why they are present? Some have complicated regex rules that can be difficult to decipher. Everyone should have at least one profile with no filters; we usually name this profile with RAW in the name. This system allows anyone to easily see if a filter has "gone rogue" and is filtering out good traffic.

There are also these problems with getting good data that you did not even cause:

1. Incomplete data / views. Most businesses use the free version of Google Analytics and sometimes experience "sampling" in important reports. Sampling in Google Analytics (or in any analytics software) refers to the practice of selecting a subset of data from your traffic and reporting on the trends detected in that sample set. Sampling is widely used in statistical analysis because analyzing a subset of data gives similar results to analyzing the complete data set, while returning those results more quickly due to reduced processing time. In Analytics, sampling can occur in your reports, during your data collection, or in both places.

2. Organic keywords. Years back, Google Analytics allowed you to see the query typed in by visitors. It was so powerful! It allowed you to see quite a bit of information about your prospects, perhaps too much. It has now become standard that search engines, browsers, and analytics itself restrict this information. If you are new to analytics, you probably have not missed what you never had. However, if you have been doing this a while, take a second to reflect on what was lost. We are right there with you. Hmph.

3. Referral spam, organic keyword spam, language spam. In addition to losing out on good data, there is often too much noise in otherwise good data. Using fake browsers (bots that can run analytics code), all sorts of things are being inserted into your analytics. Some of the offenders might put "Vitally was here" in the list of languages your visitors use, or make it look like visitors are coming in droves from some site you've never heard of (which is either selling SEO or hosting malware). Spam in analytics has become a major nuisance, and we constantly have to deal with it while compiling reports. We see the same offenders across multiple accounts, and we have created a custom analytics segment to filter them from reports. Want to try our segment?
Click this link and scrub your own view of your account: https://analytics.google.com/analytics/web/template?uid=wd7C1dObSgCOSpEEQsiWXg (There are other great segments on the internet too, but we have customized this one for our clients.)
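
To make the snippet-placement point concrete, here is Google's alternative async analytics.js snippet placed just before the closing </head> tag, with a placeholder property ID you would swap for your own. (Snippets change over the years, so always check Google's current documentation.)

    <head>
      <!-- ...your other head tags... -->
      <script>
        window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
        ga('create', 'UA-XXXXXX-1', 'auto');  // replace with your property ID
        ga('send', 'pageview');
      </script>
      <script async src="https://www.google-analytics.com/analytics.js"></script>
    </head>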

5 Vital Steps Toward Google’s “Mobile First” Indexing

“Mobile is exploding,” said every headline for the last decade. Google is all about traffic, and mobile is both the largest segment of traffic and the fastest growing! Google's search results will soon be based on the mobile versions of web pages, including the results shown to desktop users. This holds even if your prospects primarily use desktop (as in manufacturing and a few other industries), even if desktop drives most of your actual conversions, or even if you just like the look of your desktop site better.

Up to now, Google has been indexing web pages as desktop browsers see them. With the new "mobile first" approach, Google will start indexing web pages as mobile phones see them, and rankings will be calculated based on the mobile results. Google says there will be minimal ranking changes, but this is a pretty major announcement. Mobile-friendly sites are likely to see minimal ranking changes, but mobile-unfriendly sites are likely to see an increasing loss of visibility. Here are some important tips to make sure your site is ready:

1. Check your mobile rankings, check your risk. Looking at your website's rankings in Google's mobile search results gives an indicator of whether your site is vulnerable to losing traffic. It's only an indicator, however: Google is still basing mobile rankings to some extent on crawls of the desktop version of your site. So better keep reading…

2. Be accessible. Some sites hide content behind popups / interstitials. Google is specifically planning to penalize intrusive popups on January 10, 2017. If you have an email subscription popup or survey layer, you may be penalized. And we all experience frustration with those ads that come up when we are trying to read a news article. Some vendors, such as Ometrics, have been on top of this since the day of Google's announcement! Make sure all of your vendors are. If you have a separate mobile site, make sure it is crawlable, and be sure to register it in Google Search Console! Old best practices, such as blocking the duplicate content on a mobile version of your site, could potentially kill your traffic.

3. Be responsive. Responsive mobile design allows for the best (compromise of) user experience across the many mobile, tablet, and desktop displays. It adapts the page, and it allows a single URL for the mobile and desktop versions of the site (see the sketch at the end of this post). If you haven't changed to responsive mobile design, ask us for a list of great web designers.

4. Be fast. Speed on mobile is quite important. Research has shown that 40% of consumers will leave a page that takes longer than three seconds to load. Wireless internet connections are usually not nearly as fast as the wired connections desktop users enjoy. Optimizing image file sizes and resolutions hasn't been this important since the days of the modem.

5. Don't mess up AMP. Staying ahead of the curve takes advantage of the greatest opportunities: being the first among your competitors to implement mobile-friendly, mobile-responsive, schema, and AMP creates traffic. The period in which your site is in Google's favor, while competitors are playing catch-up, can mean serious revenue.

With these 5 tips, you will be ahead of the pack (for a short while). As Google implements more changes, search is likely to keep changing at a breakneck pace. Watch your indexing, rankings, traffic, and conversions to keep ahead of the curve.

Oh, and PS: Bing will still use desktop crawling to determine mobile rankings.
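
As a minimal illustration of the responsive approach in tip 3 (with hypothetical class names): the viewport meta tag plus a CSS media query lets a single URL adapt to every screen size:

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .sidebar { float: right; width: 30%; }
      @media (max-width: 600px) {
        /* stack the sidebar under the main content on small screens */
        .sidebar { float: none; width: 100%; }
      }
    </style>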