Nofollow tags

Nofollow tags are a fairly recent invention. The history of nofollow goes something like this:

1. Google gives priority to sites with many links.
2. Spammers use blogs and guestbooks to artificially inflate their link counts.
3. Somebody proposes that certain places on the web – like blogs and guestbooks – should have a way of devaluing any links that are added.
4. Search engines listen. Well, some of them. Well, Google.

Search Engine Journal recently posted 13 reasons why nofollow tags suck. I couldn’t agree more!
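The mechanism itself is simple: a rel="nofollow" attribute on a link tells participating engines not to pass link credit through it. As a minimal sketch (not any engine's actual crawler), here is how followed and nofollowed links can be separated using Python's standard html.parser:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect links from HTML, separating those marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel may hold several space-separated values, e.g. "external nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

auditor = LinkAuditor()
auditor.feed('<a href="http://a.example/">editorial link</a> '
             '<a rel="nofollow" href="http://b.example/">guestbook link</a>')
```

After the feed, `auditor.followed` holds the editorial link and `auditor.nofollowed` holds the guestbook link – the distinction a nofollow-aware engine would make when counting links.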

Bidding on the number 3 ad spot is probably your best bet

I recently saw another analysis showing that bidding for the number 3 spot is the best use of money. Brandt Dainow, CEO of ThinkMetrics, reviewed three years’ worth of click-through trends (1,500 keyword sets!) and found that click-through rates (CTR) were virtually identical across ad positions 1–3 in his campaigns. Others disagree, but in our testing (limited, not 1,500 keyword sets!) we have found the same result: positions 3 and 4 are your best use of money. (In most cases. Your mileage may vary. Other standard disclaimers here. Banned where prohibited.)

Googlebombs have been defused

The phenomenon known as “Googlebombing” has been defused by Google. It used to be that if enough sites linked to a site using certain words, that site would quickly rise in rank for that keyphrase. The result? You could search Google for “miserable failure” and see the White House at number 1, with Michael Moore only slightly behind. Political viewpoints were being communicated, and we all understood a little more about the “Google Algorithm”. With a recent change at Google, this “Googlebombing” technique is no more. What technique lies ahead? What is Google’s new strategy for ranking? We’ve seen positive results for many of our clients in this last Google update, but it will be interesting to see what techniques emerge from Google’s latest changes! More coverage on Slashdot and at Google’s webmaster blog.

Optimizing for Google – Were they all lies?

Each of the search engines has its own unique criteria for relevancy. Google is no exception, and is usually the most mystical. In our tests, MSN and Yahoo respond very quickly to SEO efforts, but Google takes a little more time and finesse. And every once in a while, you see something in Google’s search results that makes no sense.

Over at Intrapromote, Erik recently noticed that a search for 2007 Ford Explorer is yielding some very confusing results. One text link was getting a site into Google’s top 10 for this competitive keyword. No giant SEO campaign, no link popularity project, and no compelling, fresh content. Just a single link. Consider the power this link has – this lame site is being ranked above relevant content at Yahoo Autos, About.com and Auto Magazine. Surely that content is more helpful to potential visitors?!

So where is this illustrious link from? The Google DataCenter? Or perhaps Sergey Brin’s blog? Nope. The link is from a page almost as lame: www.egateway.us/elist.html. This link should have no real weight, either. Erik points out that the egateway page has pretty much nothing but junk links pointing to it.

Didn’t Google tell us this wouldn’t work any more? That fresh, relevant content and popular, themed links are the only way to get to the top of Google? If Matt Cutts were dead, he’d roll over in his grave. Thankfully, he is alive and – hopefully – well. Matt, what the heck is going on here? Please wave your mighty spam wand at the site – Google only wants good sites in their SERPs, right? (But please tell us how to achieve the same result with RELEVANT content before you do!) I’m interested to see Erik’s analysis, and will keep poking around in these links to see what the secret could be!

Microsoft’s new “Behavioral Targeting”

Microsoft is increasingly personalizing the ads shown to its users. Microsoft says privacy is kept intact, and advertisers using Microsoft’s new adCenter Pay-Per-Click (PPC) service are indeed seeing higher click-through rates (CTR). So, what’s the problem? It sounds like a win-win, and behavioral targeting will certainly be seen with increasing emphasis at Google AdWords, Yahoo Search Marketing and other PPC advertising providers. In a discussion of the article on Slashdot.org, one user writes:

“Microsoft and friends are going to push ads at us either way; I would just as soon see ads for stuff that I am actually interested in. When I go to a store and the salesman knows me well enough to actually be helpful, I chalk that up to good service. Why should a website be any different?”

Has anyone noticed the behaviorally targeted ads – rolled out in September in the United States? Do you prefer them? Let us know what you think.

Performing Keyword Research

[tag]Keyword research[/tag] should be the beginning of every web page placed on the web. Instead of dumping in the same old text from the brochure you created in 1987, rewrite your content from scratch with an eye toward your [tag]best keywords[/tag] and keyphrases! Here are steps to help you in your search for your very own [tag]keyword niches[/tag]:

1. Define your target market(s)

Every [tag]target market[/tag] is going to have its own way of thinking, and you should be sure to have a unique area of your site for each target market. You might want a section for investors, a page for community members, and another section for prospects.

2. What would they search for on the internet?

Think like your audience. What would they search for? Would they misspell a keyword? Would their terms be more sophisticated than the terms you use to describe yourself around the office? Maybe they would be less sophisticated.

a. Brainstorm keyword phrases

Now look at those phrases and try to think of any possible variations. Are there more descriptive variations needed to really pinpoint the right searchers? Maybe you need to be less specific to increase the potential [tag]traffic[/tag] to your page. Remember: fewer words in the phrase will help broaden your possible audience; more words in the phrase will help target the best prospects. Would you get better prospects with a targeted phrase?

b. Look at keyword phrases competitors are targeting on their websites

What are your competitors targeting? Look at competitors you know about, but also look at who is competing for spots 1-10 in Google. What are they targeting? What niche might they be leaving out?

c. Look at keyword phrases competitors are targeting using their [tag]link partners[/tag]

(We have an automated tool we use for this – email us at sales@hyperdogmedia.com for more information!)

3. Existing [tag]keyword phrases[/tag] you are being found for

What better way to figure out which keywords are already working in some way?

a. Web hosting visitor log files

If you don’t have decent stats, install [tag]Google Analytics[/tag] ASAP. On most hosts, the free package [tag]awstats[/tag] is available. Also free are Webalizer and Analog. Any of these will tell you what keywords your site is being found under.

b. [tag]Analytics[/tag] and/or [tag]paid campaigns[/tag]

Look at existing analytics and paid campaigns. The [tag]keywords[/tag] from your paid campaign can yield very valuable information. Keywords that result in [tag]clicks[/tag] and [tag]convert[/tag] into actual sales are like gold. These “[tag]converting keywords[/tag]” are some of the best you can target.

4. Expand the list

a. Geographic

Especially if you are [tag]targeting local business[/tag], think about where you are. Are you in a certain metropolitan area? What cities are nearby? What smaller communities? Be sure to include local nicknames like “bay area”, “front range”, etc. What county and state are you in? Include any other pertinent information – are you on a major street or thoroughfare?

b. Thesaurus / ontology

Use a thesaurus to increase the possibilities for your list. Do not judge keywords just yet – keep an open mind. You’d be surprised what searchers type in! The ontology or category in which your person-place-or-thing keywords exist can lead you to new possibilities. For example, a book has to do with publishing, printing, authors, etc. What “has to do with” your [tag]keyword phrases[/tag]?

c. Incorrect spellings: typos, phonetic

Bad spelling and phonetic misunderstandings can also lead you in the direction of new keywords. In a recent conversation, an acquaintance told me he can see that his best prospects always spell a certain keyword incorrectly: it is for a disease that the prospects have. Doctors never buy the product directly, but always know how to spell it!

d. Aggregate lists (like AOL’s leaked search data)

Giant [tag]lists of keywords[/tag] can give insight into how visitors query a search engine. AOL released a controversial amount of search data from its visitors. Third-party sites like http://www.aolsearchdatabase.com/ allow you to look through the data. While it isn’t complete, it can yield valuable information about search behavior, and maybe about your keywords!

e. [tag]Google[/tag] Suggest / [tag]Overture[/tag]

[tag]Yahoo[/tag] tells you what keywords visitors searched for a month or two ago. Visit their site at: http://inventory.overture.com. Google offers some search numbers and keywords with their Suggest tool, too: http://labs.google.com/suggest

f. Survey of automated tools

(We have several automated tools and services we use for keyword research. Contact us at sales@hyperdogmedia.com for more information.)

g. Repeat the process

Did you get several new keywords? Now be sure to add on your geographic and other variations. Did your list just get MUCH bigger? Good!

5. Find the least [tag]competitive terms[/tag]

Of course, it is always best to go after the least competitive keywords. To figure out which keywords have the best ratio of searches to competition, figure out the [tag]KEI[/tag]. We have automated tools that figure this out, but try the manual method for a few of the keywords you think might be real gems:

a. KEI (Keyword Effectiveness Index)

KEI = (# of monthly searches) / (# of exact results in Google)

Gather (# of monthly searches) from the Overture tool above (http://inventory.overture.com). Gather (# of exact results in Google) by searching for your “keyword phrase” in the titles of possible competitors: allintitle:”keyword1 keyword2 keyword3″

b. Examine [tag]PPC bids[/tag]

Looking at bids – especially in Overture, but also with Google’s AdWords estimator tool – can tell you which keywords are the most competitive. So easy to see, and look – no math required!

This article contains many of the tips we give for [tag]keyword research[/tag]. Have other tips? Leave a comment! We’d love to add your tip to the list!
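The KEI arithmetic from step 5a is simple enough to script once you have gathered the counts by hand. A minimal sketch – the keywords and numbers below are made-up placeholders standing in for figures you would pull from the Overture inventory tool and an allintitle: search in Google:

```python
def kei(monthly_searches, competing_pages):
    """Keyword Effectiveness Index: demand relative to competition.
    Higher is better -- many searches, few competing pages."""
    if competing_pages == 0:
        return float("inf")  # nobody competing at all: a free niche
    return monthly_searches / competing_pages

# Placeholder data: {phrase: (monthly searches, allintitle: result count)}
candidates = {
    "dog grooming denver": (1_300, 650),       # KEI = 2.0
    "pet groomer": (12_000, 240_000),          # KEI = 0.05
}

# Rank the candidate phrases, best KEI first
ranked = sorted(candidates, key=lambda k: kei(*candidates[k]), reverse=True)
```

Note how the broad, popular phrase loses to the narrow one: raw search volume matters far less than the ratio of searches to competition.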

YouTube growing up

Mark Cuban recently prognosticated that YouTube would soon be “sued into oblivion.” User-generated content is, of course, a legal liability that must be measured along with other business risks. Are the high-flying days of YouTube over? Hardly. YouTube is rumored to be in talks with major content license holders as we speak, offering up ways to share the advertising wealth – based on how many times the content is viewed.

Google PageRank: Interesting Facts

An interesting set of facts about Google PageRank, originally posted at Netconcepts: “Each web page within a website has its own PageRank score. PageRank scores run from 0 to 10 on a logarithmic scale, meaning that the gaps between the integers increase logarithmically the closer you get to 10. So, for example, the gap between a 3 and a 4 is quite small, whereas the gap between 7 and 8 is huge in comparison. As such, boosting your PageRank from a 3 to a 4 would be quite easy, and going from a 7 to an 8 would be quite hard. Another logarithmic scale you might be familiar with is the Richter scale. As you probably know, a 5.5 on the Richter scale isn’t such a huge deal, whereas a 7.0 is a very big deal indeed.” I think going from a PR 4 to a 5 is probably very similar to a 7 on the Richter scale – better have your server ready for the jolt!
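You can put rough numbers on that logarithmic intuition. Google has never published the base of the toolbar PageRank scale, so the base of 8 below is purely an illustrative assumption, but any base greater than 1 shows the same effect:

```python
# Assumed base for the toolbar PR scale -- Google has never published
# the real value; 8 is used here purely for illustration.
BASE = 8

def relative_effort(pr):
    """Rough relative 'link energy' needed to reach a toolbar PR,
    on an assumed logarithmic scale."""
    return BASE ** pr

# The step from 3 to 4 is tiny next to the step from 7 to 8:
gap_3_to_4 = relative_effort(4) - relative_effort(3)   # 7 * 8**3 = 3,584
gap_7_to_8 = relative_effort(8) - relative_effort(7)   # 7 * 8**7 = 14,680,064
```

Under this assumed base, each one-point step takes 8x the effort of the step before it, which is why the climb from PR 7 to 8 dwarfs the climb from 3 to 4.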

Make sure your site is accessible when Googlebot visits

For all of the attention paid to “On-Page Relevance” and “[tag]Link Popularity[/tag]”, SEO begins with accessibility. Are your websites [tag]search engine friendly[/tag]? Follow our checklist below to measure:

A. Search Engine Point of View

Surf your site from a search engine’s point of view. Download and install Firefox and the Fangs extension. By experiencing the web as a non-sighted user, you are also experiencing the web as a search engine would. Can you navigate your site easily? Experienced [tag]SEO[/tag]s already have the tools and knowledge to examine a site for the items below, but beginners may find our [tag]free[/tag] tool at the SEO Diagnostic website quite helpful.

1. [tag]HTTP codes[/tag]

When any web page is accessed, the server hosting the page responds with a code. Most visitors are familiar with [tag]error 404[/tag], returned when a page is missing, but many users are not aware of the full range of HTTP codes that can be returned. Are your pages returning a code 200, indicating “OK”? Don’t know where to start? Test your site at http://www.seodiagnostic.com.

2. Page size

The size of your HTML page – including all images, stylesheets, and external scripts – should never exceed 90k. A recent study shows only 63% of Americans are online – an even smaller number have a broadband connection. If your target market includes consumers, or impatient business users, it is imperative to keep the size of your pages (including images, scripts and stylesheets) under control.

3. Frames

Frames have usability and accessibility challenges that are rarely overcome. If your site’s HTML code uses frames, you should have an experienced SEO see if they can make it navigable inside a NOFRAMES tag. You may also require a site redesign.

4. Flash

[tag]Flash[/tag] navigation cannot be followed easily by search engines. While Flash is becoming more friendly, it still poses challenges to search engine indexing. Where Flash is used on a site, make sure that HTML text links also exist.

5. [tag]JavaScript[/tag] navigation

[tag]JavaScript menus[/tag] and rollover images make for stunning visual elements, but be sure to include navigation for your visitors who are not capable of executing JavaScript! While a NOSCRIPT tag may be indexed by some search engines, not all are created equal.

6. Dynamic URLs

Google’s webmaster guidelines advise against dynamic URLs. Humans don’t like ’em, and search engine spiders don’t either! Use [tag]Apache[/tag]’s [tag]mod_rewrite[/tag] – hire an expert if you need to – but get rid of those dynamic URLs!

7. Robot [tag]metatags[/tag] / [tag]robots.txt[/tag]

While it may not seem that robots.txt is always necessary, consider adding a very basic robots.txt file that welcomes all spiders to index the site. Consider the relatively “open arms” policy our site has, reflected in our robots.txt:

    # All robots will spider the domain
    User-agent: *
    Disallow:

    User-Agent: googlebot
    Disallow:

    # Disallow directory /cgi-bin/
    User-agent: *
    Disallow: /cgi-bin/

8. Google [tag]Sitemap.xml[/tag]

Google’s new sitemap features allow your site to specify page locations and how often to “check back”. For large sites, this can be a dream come true. For smaller sites, it can be an opportunity to see your site as [tag]Google[/tag] sees it. Once your Google sitemap is active, Google will disclose any problems it had with your website. Get started in the Google Webmaster Area.

9. [tag]Yahoo Site Explorer[/tag]

Yahoo Site Explorer is another great tool to see your site as a search engine does. No sitemap creation necessary!

10. Pages indexed

The number of indexed pages is a telling measurement of your search engine accessibility. To view the number of pages any of the major engines have indexed on your site, do a search for your domain with the “site:” prefix. For example, for seomoz.com you could search for “site:seomoz.com” (no quotes) in MSN, Yahoo or Google. You will see that each of the engines has a different number of indexed pages!

11. [tag]Best Practices[/tag]

HTML sitemap: Include an HTML sitemap to help your search engine (and other) visitors get to any page fast. An HTML sitemap is simply an HTML page with links throughout your site. Keep the number of links under 100, as [tag]Google’s Webmaster Guidelines[/tag] recommend. Structure the links in a categorized outline, well organized for human and search engine visitors.

CSS and JavaScript relocated to external files: CSS files can be cached, reducing your bandwidth bill and giving the engines less code to wade through before they encounter content. For human visitors, we often think about what is “above the fold”. For search engines, try to get as much juicy content “above the fold” as possible, too.

HTML validation: While search engines can navigate some of the worst HTML code, why make it any harder than it needs to be? Try to keep to valid HTML as much as possible.

[tag]Search engine[/tag] accessibility is so very important. First the search engines come, then the human visitors come. It does not matter how pretty your site is if important visitors like the [tag]Googlebot[/tag] cannot get through it. Take a good look at your site accessibility to determine improvements you might be able to make. And, of course, many of these [tag]accessibility metrics[/tag] can be measured with our new tool SEO Diagnostic. If you aren’t sure where to start, start there!
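The first two checklist items – HTTP status and page size – are easy to spot-check yourself. A minimal sketch using Python's standard urllib (it measures only the HTML document itself, not the images, CSS, or scripts that also count toward the 90k budget):

```python
from urllib.request import Request, urlopen

MAX_BYTES = 90 * 1024  # the 90k page-size budget from item 2

def check_page(url):
    """Fetch a page; report (HTTP status, HTML size in bytes, within budget).
    Item 1: expect a 200. Item 2: watch the byte count."""
    req = Request(url, headers={"User-Agent": "accessibility-check/0.1"})
    with urlopen(req, timeout=10) as resp:
        body = resp.read()
        return resp.status, len(body), len(body) <= MAX_BYTES

# Usage (requires network access):
#   status, size, ok = check_page("http://example.com/")
#   A 200 status with ok == True passes both checks.
```

Note that urlopen follows redirects silently, so a 301/302 chain will report the final status; for a strict audit you would inspect each hop.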

Yahoo warns growth of internet advertising sales is slowing in key sectors

Terry Semel, chairman and chief executive officer (CEO) of Yahoo, warned that ad sales were growing in two key sectors, but not growing as quickly as the company had hoped. Semel’s warning was about the auto and financial services sectors. As one report noted: “Yahoo was careful to note that it cannot tell whether the current slowdown is a sign of broader trouble or is limited to ads from the auto and financial sectors.”