Speed is Everything

Page loading speed matters a great deal to Google these days. From mobile visitors to Googlebots, every visitor appreciates a speedy experience. Here are some ideas to keep in mind:

1. Rise of mobile

The importance of mobile can be seen in Google's announcements over the last few years. Mobile users are more impatient than ever, and Google provided stats last week showing just how impatient they are:

– The average mobile page takes 22 seconds to load, but 53% of users leave after 3 seconds!
– Even mobile landing pages in AdWords were found to take 10 seconds to load.

There are many easy changes sites can make; the answer isn't always a faster web server. Google's own analysis found that simply compressing images and text can be a "game changer": 30% of pages could save more than 250KB that way.

2. Ranking factor

A few years back, Google made page speed a small ranking factor – or at least they finally became explicit about it being one. Since page speed issues aren't given the exposure of crawl errors and other items in Google Search Console, it can be easy to put them on the "long list" of items to fix. Its addition as a ranking factor is a strong signal that this needs to be prioritized.

3. Bounce rate

Nice try, loading up your site with images that take forever to load. Unfortunately, that doesn't increase the duration of site visits. It just makes people angry. According to Google's analysis, as load time grows from 1 second to 7 seconds, the probability of a bounce increases by 113%! Many SEOs believe that "engagement metrics" such as bounce rate could also be a ranking factor. And it makes sense: when Google sees a rise in organic bounce rate, they know human visitors are judging the content. How could Google not take this data into account?

4. Crawl rate

In one recent test, increasing page speed across a site dramatically increased the site's crawl budget. Slower sites can be overwhelmed by crawl activity. If you ever feel the need to put a crawl delay in your robots.txt, take that as a warning sign – even reasonably fast sites can need more crawl budget.

Tools and Fixes

Luckily there are remedies. Some can be quite easy, such as adding compression to your web server. Others might require a trip to Photoshop for your site's images. Some items won't be worth fixing at all, so concentrate on the easiest tasks first. Run your site through these two tools and see what you need to fix:

– Google's newest tool tests how mobile-friendly your site is.
– GTmetrix.com features a "waterfall" showing which page items load at which stage, plus history, monitoring, and more.

Good luck and enjoy optimizing the speed of your site!
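The savings from text compression are easy to sanity-check yourself. Here's a minimal Python sketch (the sample HTML is made up) showing how much gzip can shave off a typical repetitive markup payload:

```python
import gzip

# Sample uncompressed HTML payload; repetitive markup compresses very well.
html = ("<div class='product'><h2>Title</h2><p>Description text</p></div>\n" * 500).encode("utf-8")

# Level 6 is a common default trade-off between speed and ratio.
compressed = gzip.compress(html, compresslevel=6)

saved = len(html) - len(compressed)
percent = 100 * saved / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes, saved {percent:.0f}%")
```

On real pages the ratio varies, but for HTML, CSS, and JavaScript the savings are routinely large – which is why enabling compression on the web server is usually the easiest win on the list.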

Doing the Pigeon (Update)

Last month, Google rolled out one of their largest local search updates in quite some time. Since Google didn't name the update, Search Engine Land named this one the Google Pigeon Update. It's seemingly unrelated to Google's PigeonRank, an April Fools' joke from back when Google did good and funny things. This update does not penalize sites, but it does change how local results are shown:

– Fewer queries are generating a map listing / "local pack".
– More traditional SEO signals are used, such as title tags and quality inbound links.

Some interesting things are happening with this update:

– When a query includes the word "yelp", listings on yelp.com are back at the top. This fixes a recent bug.
– Web design and SEO companies are being shown in local queries again!

If you depend on local traffic, hopefully your results weren't negatively impacted by the update. The best approach for local visibility includes these tasks:

– Make sure to update and create local directory listings on authority sites such as Yelp.
– Use the highest quality photo on your Google+ business profile, and get more reviews. You might make it into the Carousel listings at the top of Google for some queries.
– Make sure your business Name, Address and Phone (NAP) are consistent on your site, your Google+ business page, and local directories.
– Be sure your city/state appears in your site's title tags.

And now for something good, and funny: PSST! Need a Free Link? We'd like to help you promote your own business, hoping more work for you brings more work our way! Subscribe to the Hyper Dog Media SEO Newsletter HERE! Their site also provides an excellent backlink. You may even get human visitors, website projects and new partners. Now THAT's business development link building!

Notes on the Yahoo / Bing Transition

Yahoo advertisers received an email outlining a few more terms regarding the upcoming transition to the Microsoft Advertising adCenter platform. Some quick points:

1. A tab will show up in YSM later this month, so be sure to log in and look around.
2. Your ads can serve in adCenter right away when you transition.
3. Silverlight will continue being used. I still need to test and see what functionality might be missing when I log in from my iPhone or an older Mac. I hope whatever features are missing degrade well!
4. The upcoming changes to organic search arrive later this month. We anticipate a rocky ride, as Microsoft will likely need to make ongoing tweaks.

The email itself:

Dear Advertiser,

As your transition to the Microsoft Advertising adCenter platform approaches, we have more details to share to help you prepare for the changes to come.

Considerations for your upcoming transition

adCenter account
Soon, you'll need to either create a new adCenter account, or link an existing adCenter account to your Yahoo! Search Marketing account. Later this month, you'll see an "adCenter" tab within your Yahoo! Search Marketing account. Clicking there will take you to the beginning of the account transition process, where we'll walk you through the simple steps to create or link accounts.

Budgeting
Once you create your adCenter account, it will be active and your ads will be eligible to serve on Bing right away. As a result, you'll be managing both your new adCenter account and your existing Yahoo! Search Marketing account in parallel until ad serving for Yahoo! traffic transitions to adCenter, so plan to budget accordingly.

Microsoft Silverlight
With Silverlight installed, you'll be able to see and address key differences between your Yahoo! and adCenter accounts as you transition. Download Silverlight now.

Organic search transition
Yahoo! organic search results will be powered by Bing as early as late August. If organic search results are an important source of referrals to your website, you'll want to make sure that you're prepared for this change. For more details, check out this blog post.

As we've stated previously, our primary goal is to provide a quality transition experience for advertisers in the U.S. and Canada in 2010, while protecting the holiday season. However, please remember that as we continue to go through our series of checkpoints, if we conclude that it would improve the overall experience, we may choose to defer the transition to 2011.

We are committed to making this transition as seamless and beneficial for you as possible. We appreciate your business, and look forward to bringing you the benefits of the Yahoo! and Microsoft Search Alliance.

Sincerely,
Your Partners in the Search Alliance, Yahoo! and Microsoft

4 reasons to 301 redirect old subpages ASAP

After a major website redesign, it's not uncommon for page locations and even page extensions to change. Maybe you've switched web development languages, or reorganized your website's structure into an SEO-friendly themed set of silos. Whatever the reason page locations have changed, it's vital that the old page locations are 301 redirected to the appropriate new pages. The fix is time sensitive for developers, because:

1. Pages will start dropping out of the index. Google hates sending visitors to bad pages, and can see the bounce rate skyrocket. When Googlebot visits your site, it will probably receive a "404 Error Page" along with a 404 HTTP error code. A 404 error code is the surefire way to get a page out of Google's index.

2. Humans that have bookmarked the old page will be stranded. Depending on the 404 error page (your server's default is simply awful), your loyal return visitor may think the entire site is down.

3. Search engines will stop counting the power of the links coming into broken pages, and rankings will drop. Search engines do not count links to missing pages. The wonderfully diverse link profile you've built over the years can disappear as links to subpages are no longer counted.

4. Webmasters linking into subpages might notice the 404 and remove their links. Some webmasters routinely monitor where they are linking to, and remove links to broken destinations.

Don't make the most common 301 redirect error: sending everything to the home page. To preserve a diverse link profile, you'll want to keep those links spread naturally across your site's homepage AND subpages. Happy 301 redirecting!
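The page-by-page approach can be sketched as a simple lookup table. Here's a minimal, hypothetical Python example (the URL paths are made up) of mapping each old location to its new home rather than dumping everything on the homepage:

```python
# Map each old URL path to its new location. A catch-all redirect to "/"
# would throw away the subpage link equity described above.
REDIRECTS = {
    "/products.php": "/products/",
    "/about.html": "/about/",
    "/widgets/blue.php": "/products/blue-widget/",
}

def handle(path):
    """Return (status, location) for a request to an old path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect: passes link signals on
    return 404, None                  # truly gone: serve a helpful 404 page

print(handle("/products.php"))  # (301, '/products/')
print(handle("/missing.html"))  # (404, None)
```

In practice the same mapping usually lives in the web server configuration (e.g. rewrite rules) rather than application code, but the principle is identical: one old page, one specific new destination, one 301.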

Denver SEO / Colorado SEMPO communities flourishing

The Denver SEO Meetup and the Colorado Working Group of SEMPO have seen tremendous growth in the last year. In the ever-developing world of search marketing, the meetups have become excellent resources for search marketing professionals looking to network, as well as for the professional development opportunities provided by SEMPO's excellent speakers. Last week, our president Jim Kreinbrink spoke about "Driving traffic to your blog with SEO techniques". It was a technical presentation that gave away many great tidbits. The audience was full of experienced search marketers, and we hoped to show the value of collaboration and community. The previous month, two excellent PPC case studies were presented by Alex Porter from Location 3 Media. Seeing the approaches Location 3 took for two PPC campaigns, and the results attained, was very exciting. Search marketing is growing even in a recession, so expect a packed house. The focus on measurable, trackable results makes it particularly appealing to agencies and advertisers alike. All this means the Denver search marketing community will continue to grow and flourish.

13 Reasons Why Google Loves Blogs

Google loves blogs. What is it about blogs that Google loves so very much? We've pinpointed 13 reasons why Google may give – or appear to give – sites with blogs a little extra boost in rankings. Of course, the list is broken down into our framework of looking at good quality sites as being accessible, relevant, and popular.

Accessibility: Search engine robots must be able to find your content. These reasons help the bots find your postings without a lot of muss or fuss.

1. Pinging
Most blog software sends out a "ping" when there is a new post. Instead of waiting for a search engine crawler to come across your site's new content – either via a routine crawling or via a link – a notification is sent out to sites like Ping-O-Matic, Technorati, and even Google Blog Search. This notification tells the search engine robots to come and fetch some fresh (crunchy) content.

2. RSS feeds provide deep links to content
RSS feeds are useful for so many, many things. They contain links to your latest postings, but also consider that they contain links right to the postings themselves. Even crawlers that aren't that smart (you know who you are, little bots!) can figure out how to find a link in a list. That's essentially all an RSS feed is: a list of links in a predictable format. Hint: You subscribed to your feed in iGoogle, didn't you?

3. A standard sitemap.xml provides deep links to content
If an RSS feed isn't enough, use a sitemap.xml file to notify search engines about your site, including any new posts. A great thing about sitemap.xml files is that they can communicate additional information about a link, like how often a search engine robot should visit and what priority the page has in relation to your site.

4. Based on modern HTML design standards
Most blogging software was created or updated very recently, and doesn't use outdated HTML methods like nested tables, frames, or other techniques that can cause a bot to pause.
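A sitemap.xml file is simple enough to generate yourself. Here's a minimal sketch using Python's standard library – the URLs, dates, and priority values are hypothetical placeholders:

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Each entry can carry hints beyond the bare URL: last modification date,
# expected change frequency, and relative priority within the site.
posts = [
    ("https://example.com/blog/first-post/", "2009-05-01", "weekly", "0.8"),
    ("https://example.com/blog/second-post/", "2009-05-08", "weekly", "0.8"),
]
for loc, lastmod, changefreq, priority in posts:
    url = ET.SubElement(urlset, "url")
    for tag, text in (("loc", loc), ("lastmod", lastmod),
                      ("changefreq", changefreq), ("priority", priority)):
        ET.SubElement(url, tag).text = text

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Blog platforms typically produce this automatically via a plugin, but rolling your own is handy for the non-blog parts of a site.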
Relevance: Once found, search engines must be able to see the importance of your content to your desired audience.

5. Fresh content, updated often
Nothing quite gets the attention of a search engine robot like fresh content. It encourages frequent repeat visits from humans and robots alike!

6. Fresh comments, updated often
Of course, the blogosphere is a very social place. Googlebot is likely to come back often to posts that are evolving over time, with fresh new comments being added constantly.

7. Keyword-rich categories, tags, URLs
Invariably, some of your best keywords are likely to be used in the tags and categories on your blog. If you aren't using keyword-rich categories and tags, you really should be.

Popular: Google looks at what other sites link to your site, how important they are, and what anchor text is used.

8. RSS feeds provide syndication
RSS feeds can help your content and links get spread all around the internet. Provide an easy path to syndication for the possibility of links and, of course, human traffic.

9. Extra links from blog & RSS feed directories
The first blog I ever started was for the possibility of a link from a blog directory. But RSS feed directories exist too! Be sure to maximize the link possibilities by submitting to both.

10. Linking between bloggers / related sites
Blogrolls are lists of links that bloggers recommend to their audience. Sometimes they have nice, descriptive text and even use XFN to explain relationships between bloggers. Some of your best human traffic can be attained through blogrolls.

11. Social bookmarking technologies built in
Blog posts are usually created with links directly to social bookmarking services like delicious.com, StumbleUpon, and other social bookmarking sites. It's never been easier for your audience to share your posting and give you a link!

12. Tagging / categories with relevant words
Tags can create links to your blog from relevant pages on Technorati and other blog search engines. These tag pages sometimes even have PageRank! They deliver keyword-rich links and quality traffic.

13. Trackbacks (conversations)
Trackbacks are conversations spanning several blogs. They are an excellent way to gain links (although often nofollowed these days) and traffic. Other blogs can be part of the conversation, thanks to the trackback system!

4 Places to find keywords for your SEO / PPC campaigns

What is an SEO or PPC campaign without the right keywords? Great keyword targets have a good amount of traffic and, hopefully, a small amount of competition. Before you can even start measuring such things, however, you must create a broad list of keywords. Here's where to start:

1. Keyword research / suggestion services
Services like WordTracker, KeywordDiscovery and even Google Suggest can give a great idea of the traffic surrounding certain keywords, as well as the variations of keywords a site should target.

2. Analytics / statistics
If you currently have analytics or web visitor statistics on your website, it is very helpful to look at how existing customers have found your site. If you haven't loaded Google Analytics, it is quite easy – and free!

3. Brainstorming / asking customers
Great keywords can also be found just by interviewing current customers with "How did you find us?" Even a quick glance at your business plan can lead you to a few new ideas on how prospective customers might find you.

4. Competitors
Competitor websites can be a treasure trove of keywords. Scan their source code for a keywords metatag, if present. Also look at the keywords in their page titles by searching Google for: site:competitor.com

These four methods should lead you to plenty of keywords for your next campaign.
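Scanning a competitor's source for a keywords metatag can be automated. Here's a minimal sketch using Python's standard html.parser module – the sample HTML (and the keywords in it) is entirely made up for illustration:

```python
from html.parser import HTMLParser

class MetaKeywordsParser(HTMLParser):
    """Collect the content of any <meta name="keywords"> tag."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(",")]

# Hypothetical competitor page source.
html = """<html><head>
<title>Blue Widgets | Example Co</title>
<meta name="keywords" content="blue widgets, widget repair, denver widgets">
</head><body></body></html>"""

parser = MetaKeywordsParser()
parser.feed(html)
print(parser.keywords)  # ['blue widgets', 'widget repair', 'denver widgets']
```

The same parser pattern extends easily to pulling page titles, which tend to be a far more reliable signal of what a competitor is actually targeting.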

Denver SEMPO Meetup / Denver SEO Meetup

Why travel outside of Denver for great SEO and search engine marketing events? Last week saw great attendance at the new Denver SEMPO Meetup (created by the members of SEMPO's Colorado Working Group). This week's Denver SEMPO Meetup was an excellent educational program provided by Jim Brown, online marketing guru for Quark (of QuarkXPress fame). The presentation focused on opportunities in social media. Jim provided great information regarding Twitter, Facebook, and Facebook ads. While his presentation was friendly to all audiences, even seasoned Denver SEO professionals left with a new trick or two. Most valuable of all were the brand ambassador experiences Jim relayed to the group.

The Denver SEO Meetup followed, just a few blocks away. Many members attended both meetup groups. The Denver SEO Meetup is not an educational program but a social function, founded by our president Jim Kreinbrink. Many notable SEO professionals regularly attend, but search marketing, advertising, and affiliate marketing professionals are also frequenting the meetup. Several SEOs noticed glitches in running Google ranking reports for clients that week, and it was nice to exchange what was working and not working in small informal conversations. Of course, don't come to the Denver SEO Meetup hoping to learn all about SEO: it's a relaxing networking function, not an educational opportunity.

With SEO / SEM knowledge and professional networking available right here in Colorado, why travel to search marketing and ad industry conferences every weekend?

5 web development techniques to prevent Google from crawling your HTML forms

Google has recently decided to let its Googlebot crawl through forms in an effort to index the "Deep Web". There are numerous stories about wayward crawlers deleting and changing content by submitting forms, and it's about to get worse: Googlebot is about to start submitting forms in an effort to reach your website's deeper data. So what's a web developer to do?

1. Use GET and POST requests correctly
Use GET requests in forms to look up information; use POST requests to make changes. Google will only be crawling forms via GET requests, so following this best practice for forms is vital.

2. Make sure your POST forms do not respond to GET requests
It sounds so simple, but many sites are being exploited for XSS (Cross Site Scripting) vulnerabilities because they respond (and return HTML) to both GET and POST requests. Be sure to check your form input carefully on the backend, and for heaven's sake – do not use globals!

3. Use robots.txt to keep robots OUT
A robots.txt file keeps Googlebot out of where it doesn't belong. Luckily, Googlebot will continue its excellent support of robots.txt directives when it goes crawling through forms. Be sure not to accidentally restrict your website too much, however. Keep the directives simple, excluding by directory if possible. And test, test, test in Google's Webmaster Tools!

4. Use robots metatag directives
Use the robots metatag directives for more refined control. We recommend "nofollow" and "noindex" directives for both the form submission page and any search results pages you want Google to stay out of, even though Google says disallowing the form submission page is enough. Consider using tag and category pages that are Google friendly instead.

5. Use a CAPTCHA where possible
Googlebot isn't going to fill out a CAPTCHA, so it's an easy way to make sure some bot isn't filling out your form. Googlebot is, of course, the nicest bot you can hope to have visit your website. This provides a chance to secure forms and take necessary precautions before other – not so polite – bots visit your forms.
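Before deploying robots.txt changes, you can verify the directives behave as intended. Here's a small sketch using Python's standard urllib.robotparser module – the disallowed paths are hypothetical examples of form-handling and search-result URLs:

```python
from urllib.robotparser import RobotFileParser

# Directives meant to keep crawlers out of form handlers and search results.
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /forms/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot honors these rules when deciding whether to crawl through a form.
print(rp.can_fetch("Googlebot", "https://example.com/search?q=widgets"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/about/"))            # True
```

A quick script like this makes a nice regression test: run it against your real robots.txt whenever the file changes, so a typo never accidentally blocks (or exposes) half the site.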

Upcoming Denver SEO Presentation: An Excellent Value

Hyper Dog Media is providing Search Engine Optimization tips at the Association of Strategic Marketing's upcoming seminar. The full agenda includes information from experts in PPC (Pay Per Click), Web Analytics, and more:

Proven Strategies for Improving Your Search Engine Marketing

Are you optimizing your greatest asset? Website content is an essential part of online success. Help search engines see the relevance of your pages, articles, press releases and more. Learn to identify and target ranking opportunities with titles, headings, bolding and additional techniques. Also, HTML can be used to communicate the relevance of your website and content to search engines. You don't need to be an HTML whiz either!

Once you have the content, you must know how to maximize your search engine exposure. Find out how aggressive search engine submission may harm your ability to get into Google's listings, as well as modern strategies on how to get your site indexed safely. Learn how to take an active role in getting pages indexed quickly in the major search engines as you add new content.

Finally, links from other websites are an important source of traffic and search rankings. Several kinds of links will be discussed and you are sure to leave with new link building ideas!

5 reasons to attend!

– Translate the user experience to all online channels
– Learn about online measurement and analytics tools
– Use your SEM campaign to maximize your ROI
– Ensure you are paying for profitable clicks
– Discover 26 sources of links to target

BONUS! Free manual with registration. Hope to see you there!