Web design followup: What to do after the big site launch

After the launch of any web development project, stakeholders and web design firms might sit back proudly and call it done. There are, however, a few things that should be attended to after the big launch.

1. Check 404 error logs

Be sure to check your logs after you launch that new site.

A. Missing pages

You wouldn’t move without forwarding your mail, would you? Don’t forget to forward your important (former) page locations, either! Instead of showing the (hopefully customized) 404 error page, make sure you 301 redirect each old page to the appropriate new location. You’re not only saving your visitors a click, but you might just preserve the PageRank (and TrustRank) Google has given that page.

B. Images

Were important images being shown on other websites? Perhaps your logo is being shown on a partner’s website. Of course they shouldn’t link directly to images on your site like that. But they did. And if the logo is now missing, it isn’t going to get visitors to click through to your site! You may also have traffic from Google’s image search or other sources. Make sure you know what happens to that traffic when images are suddenly missing.

2. Announce the site launch to vendors and customers

A website launch is an excellent reason to get in touch with old and new partners, vendors and customers. Contact them via email, email newsletter, or a direct mail piece. Who knows – you may have a product they didn’t know you offered!

3. Make adjustments

Luckily, changes can always be made after a website launch. Is something working? Not working? You can always fix it on the web. Everything web is measurable. Measure and adjust.

These steps will help any website design launch go more smoothly. Remember – it isn’t over after the big launch. Sometimes a little more work is needed to put the professional touch on that site. Looking for a more organized approach to your next web design or redesign project? We HIGHLY recommend Web ReDesign 2.0: Workflow that Works.
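If your stats package doesn’t surface 404s directly, the raw access log works fine. Here’s a minimal Python sketch – assuming the common Apache/nginx combined log format; the function name and field positions are illustrative, not any particular tool’s API – that tallies the most-requested missing pages so you know which old URLs to 301 first:

```python
from collections import Counter

def top_missing_pages(log_lines, limit=10):
    """Tally 404'd request paths from combined-format access log lines."""
    misses = Counter()
    for line in log_lines:
        # Combined format quotes the request: ... "GET /old.html HTTP/1.1" 404 209
        parts = line.split('"')
        if len(parts) < 3:
            continue
        request, after_request = parts[1], parts[2]
        status = after_request.split()[0] if after_request.split() else ""
        fields = request.split()
        if status == "404" and len(fields) >= 2:
            misses[fields[1]] += 1
    return misses.most_common(limit)
```

Feed it `open("access.log")` and redirect the top offenders first – they are the ones costing you the most visitors.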

Web designers must factor in the growing impatience of web surfers

Website visitors have never been more impatient, and I’m the worst. Just today, I was looking up the lyrics to a song. I clicked on the site in the #1 position (like 90% of the rest of the world), but it was too slow. Before I had even left the Google SERPs (Search Engine Result Pages), I clicked on the link in position 2. I’m going to bet I’m not the only impatient soul looking for lyrics… or even more important things (as if!). Luckily, mother Google (our gentle overlord) is paying attention. One of the items mentioned in SEOmoz’s recent survey of perceived ranking factors is the availability of the server hosting a site. In this case, lyricbarn, or whatever they were called, lost a visitor and a potential AdSense click or two (ads are fun to click). Web designers – yeah, you – reduce your page load times and keep visitors!

5 web design & SEO tips from the world of PPC

Many view the worlds of Pay Per Click advertising and Search Engine Optimization as opposites. While they are certainly very different, the goals are similar: bring eyeballs (with wallets) to your site and make it easy for them to buy. Here are 5 tips to improve your SEO based on lessons from PPC.

1. Converting keywords

Some keywords convert into sales better than others. Use your analytics to discover which keywords are bringing you sales, then target them with your SEO campaign. PPC (Pay-Per-Click) ads are a wonderful testbed for discovering those converting keywords if you are pressed for time.

2. Your title and meta description are your ad

When composing your titles and meta descriptions, remember they will be shown in the search engine result pages. It’s like having an advertisement to click, but without Google’s AdWords rules. Always remember you are competing against the other pages in the SERPs (Search Engine Result Pages) – who will get the click?

3. Landing pages

It’s great to optimize your homepage, but set up some (even more relevant) landing pages and be sure they get some of the inbound links you are building.

4. Optimize landing pages for different steps in the buying process

As visitors reach your site, think about what step they might be at in their buying process. Are they conducting preliminary research? Give them links to bookmark your content, send it to a friend, or sign up for your newsletter. Is their search so specific that they are probably ready to buy? Now is the time to wave the free-shipping flag!

5. Split test

Internet marketing is measurable. Why not set up split tests when you design your web pages? Create a couple of similar pages (avoiding duplicate content) and use your analytics to measure performance. When your sample size tells you which one is better, adjust the worst of the two and measure again. Or create a third page. Hey, why not? HTML is still free.
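To decide when your sample size really has told you which page is better, a two-proportion z-test is the standard yardstick. Here is a short Python sketch – the function name is ours; plug in visitor and sale counts from your own analytics:

```python
import math

def z_score(visitors_a, sales_a, visitors_b, sales_b):
    """Two-proportion z-test: how confident can we be that page B
    really converts differently than page A?"""
    p_a = sales_a / visitors_a
    p_b = sales_b / visitors_b
    # Pooled conversion rate under the "no real difference" assumption
    p = (sales_a + sales_b) / (visitors_a + visitors_b)
    se = math.sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se
```

As a rule of thumb, |z| above about 1.96 corresponds to roughly 95% confidence; below that, keep collecting data before declaring a winner.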

9 Common Web Design Mistakes That Prevent Google From Indexing Your Site

Web designers frequently destroy their clients’ chances of ranking well in Google, without even knowing it! Here are nine common mistakes that can ruin a client’s chances of ranking well in Google, Yahoo or MSN – simply by preventing the site from being indexed! Search engines follow regular text links, but web designers like to use these search-engine-unfriendly navigation methods:

1. JavaScript menus

Search engines do not follow links in JavaScript reliably, if at all.

2. Imagemaps

Search engines cannot see the image, and so cannot classify the relevance or topic of the link. Lesser search engine robots do not even attempt to follow imagemap links.

3. Image links / rollover links

These links frequently contain JavaScript, but are also difficult for search engines to classify.

4. JavaScript popups

Search engines do not follow JavaScript reliably, and do not seem to like popups at all!

5. “Jump menus”

These pulldown menus usually submit a form. If the form is submitted via GET, there is a chance the links will be followed in some manner, but again – this isn’t reliable navigation for search engines.

6. NOSCRIPT embedded links

We were told that content in NOSCRIPT tags is for those visitors who have JavaScript off. But if you were told this means search engines, you were told wrong! This HTML tag was abused by spammers early on, and search engines do not reliably follow navigation within these tags.

7. Frames – they’re rarely done in a search-friendly manner

More on the “right way” in a later post. Frames are challenging for search engines, and we have recently seen Google penalizing frame-based sites, perhaps due to the usability challenges they can present.

8. Java

Java cannot be executed by search engines. Many early rollover effects relied on Java, but the navigation cannot be read by search engine robots.

9. Flash

Flash navigation cannot be followed by search engines. Splash pages can become a dead end for search engines, and alternatives to Flash navigation should always be given.

So what can you do to be sure that search engines will crawl your site? We’ll have answers in a future post, but a frequent supplement to websites that use the above techniques – meant almost entirely for search engines – is a set of footer links for search engines to follow.
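One quick way to audit a page against the list above is to extract only the links a text-only crawler could plausibly follow – ordinary anchor tags with real hrefs. A rough Python sketch using the standard library (this only approximates crawler behavior; real engines differ in what they attempt):

```python
from html.parser import HTMLParser

class PlainLinkFinder(HTMLParser):
    """Collect hrefs from ordinary <a> tags -- roughly what a text-only
    crawler can follow (ignores javascript: pseudo-links and imagemaps)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href and not href.lower().startswith("javascript:"):
                self.links.append(href)

def crawlable_links(html):
    finder = PlainLinkFinder()
    finder.feed(html)
    return finder.links
```

If a page’s navigation comes up empty here, the spiders are probably seeing the same thing.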

7 untimely ways for an SEO to die

In ancient Rome, the ghosts of the ancestors were appeased during Lemuria on May 9. Not many people know that, and even fewer care. But in the spirit of Lemuria, we offer seven untimely ways an SEO can die (it’s a dangerous world out there, and also I’m low on blog posting ideas):

– Bitten by search engine crawlers.
– Trampled by googlebots (this is actually the best way to go, if you have to).
– Trip over an HTML tag someone forgot to close. (This was funnier last night when I thought of it – go figure.)
– You get (google)whacked while visiting a bad link neighborhood.
– You’re doing the googledance, slip on a banana peel and hit your head. Certainly I’m not the only one who knows the googledance? Please submit your videos if you know it: googledance@hyperdogmedia.com.
– You receive a suspicious package in the mail, and it turns out to be a googlebomb.
– Setting linkbait traps and you get an arm caught.

Please submit any other ideas you might have via email: lemuria@hyperdogmedia.com. So strike up that pun machine, it’s Friday!

Update: Debra just suggested you could “overdose on link juice” – if only!

Reducing page load times

With the ever-increasing impatience of internet visitors, it is important that pages load as fast as possible. Here are some quick tips we implement when developing websites to keep the page size to a minimum:

1. CSS and JavaScript should be in external files. This way, they are cached after the first page is visited.
2. For large images that cannot be optimized any further, load a placeholder and update it with the full version after the page has loaded.
3. Get faster hosting.
4. Use CSS instead of tables for layout.
5. Be a minimalist. Do you really need a sound on the home page? Are animations really needed to convey your message?
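A simple “weight budget” check during development keeps page size honest. Here’s a Python sketch – the 90k ceiling echoes the guideline we use elsewhere, and the function name is illustrative:

```python
def assets_over_budget(asset_sizes, budget_kb=90):
    """asset_sizes: mapping of asset name -> size in bytes.
    Returns the assets to optimize first (largest first) when the
    page's total weight exceeds the budget, else an empty list."""
    total_kb = sum(asset_sizes.values()) / 1024
    if total_kb <= budget_kb:
        return []
    return sorted(asset_sizes, key=asset_sizes.get, reverse=True)
```

Run it over the HTML, images, stylesheets and scripts of each template before launch – the biggest file is usually the easiest win.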

Performing Keyword Research

Keyword research should be the beginning of every web page placed on the web. Instead of dumping the same old text from the brochure you created in 1987, rewrite your content from scratch with an eye toward your best keywords and keyphrases! Here are steps to help you in your search for your very own keyword niches:

1. Define your target market(s)

Every target market is going to have its own way of thinking, but you should also be sure to have a unique area of your site for each target market. You might want a section for investors, a page for community members, and another section for prospects.

2. What would they search for on the internet?

Think like your audience. What would they search for? Would they misspell a keyword? Would their terms be more sophisticated than the terms you use to describe yourself around the office? Maybe they would be less sophisticated.

a. Brainstorm keyword phrases

Now look at those phrases and try to think about any possible variations. Are there more descriptive variations needed to really pinpoint the right searchers? Maybe you need to be less specific to increase the potential traffic to your page. Remember: fewer words in the phrase will help broaden your possible audience; more words in the phrase will help target the best prospects. Would you get better prospects with a targeted phrase?

b. Look at keyword phrases competitors are targeting on their websites

What are your competitors targeting? Look at competitors you know about, but also look at who is competing for spots 1-10 in Google. What are they targeting? What niche might they be leaving out?

c. Look at keyword phrases competitors are targeting using their link partners

(We have an automated tool we use for this – email us at sales@hyperdogmedia.com for more information!)

3. Existing keyword phrases you are being found for

What better way to figure out which keywords are already working in some way?

a. Web hosting visitor log files

If you don’t have decent stats, install Google Analytics ASAP. On most hosts, the free package awstats is available. Also free are webalizer and analog. Any of these will tell you what keywords your site is being found under.

b. Analytics and/or paid campaigns

Look at existing analytics and paid campaigns. The keywords from your paid campaign can yield very valuable information. Keywords that result in clicks and convert into actual sales are like gold. These “converting keywords” are some of the best you can target.

4. Expand the list

a. Geographic

Especially if you are targeting local business, think about where you are. Are you in a certain metropolitan area? What cities are nearby? What smaller communities? Be sure to include local nicknames like “bay area”, “front range”, etc. What county and state are you in? Include any other pertinent information – are you on a major street or thoroughfare?

b. Thesaurus / ontology

Use a thesaurus to increase the possibilities for your list. Do not judge keywords just yet – keep an open mind. You’d be surprised what searchers type in! The ontology or category in which your person-place-or-thing keywords exist can lead you to new possibilities. For example, a book has to do with publishing, printing, authors, etc. What “has to do with” your keyword phrases?

c. Incorrect spelling: typos, phonetic

Bad spelling and phonetic misunderstandings can also lead you in the direction of new keywords. In a recent conversation, an acquaintance told me he can see that his best prospects always spell a certain keyword incorrectly: it is for a disease that the prospects have. Doctors never buy the product directly, but always know how to spell it!

d. Aggregate lists (like AOL’s leaked search data)

Giant lists of keywords can give insight into how visitors query a search engine. AOL released a controversial amount of search data from their visitors. Third-party sites like http://www.aolsearchdatabase.com/ allow you to look through the data. While it isn’t complete, it can yield valuable information about search behavior, and maybe about your keywords!

e. Google Suggest / Overture

Yahoo tells you what keywords visitors searched for a month or two ago. Visit their site at: http://inventory.overture.com Google offers some search numbers and keywords with their Suggest tool, too: http://labs.google.com/suggest

f. Survey of automated tools

(We have several automated tools and services we use for keyword research. Contact us at sales@hyperdogmedia.com for more information.)

g. Repeat the process

Did you get several new keywords? Now be sure to add on your geographic and other variations. Did your list just get MUCH bigger? Good!

5. Find the least competitive terms

Of course, it is always best to go after the least competitive keywords. To figure out which keywords have the best ratio of searches to competition, figure out the KEI. We have automated tools that figure this out, but try the manual method for a few of the keywords you think might be real gems:

a. KEI (Keyword Effectiveness Index)

KEI = (# of monthly searches) / (# of exact results in Google)

Gather (# of monthly searches) from the Overture tool above (http://inventory.overture.com). Gather (# of exact results in Google) by searching for your “keyword phrase” in the titles of possible competitors: allintitle:”keyword1 keyword2 keyword3″

b. Examine PPC bids

Looking at bids – especially in Overture, but also with Google’s AdWords estimator tool – can tell you which keywords are the most competitive. So easy to see, and look – no math required!

This article contains many of the tips we give for keyword research. Have other tips? Leave a comment! We’d love to add your tip to the list!
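The KEI arithmetic is easy to script once you’ve gathered the two numbers by hand. A small Python sketch – assuming you supply monthly searches from the Overture tool and allintitle: counts from Google; the function names are ours:

```python
def kei(monthly_searches, competing_pages):
    """Keyword Effectiveness Index: searches relative to competition.
    Higher is better; guard against a zero-competitor division."""
    return monthly_searches / max(competing_pages, 1)

def rank_by_kei(candidates):
    """candidates: list of (phrase, monthly_searches, allintitle_results).
    Returns (phrase, kei) pairs sorted best-first."""
    scored = [(phrase, kei(searches, results))
              for phrase, searches, results in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

A narrow phrase with modest searches can easily out-score a broad one once competition is factored in – which is exactly the point of KEI.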

3 Common, Surprising Ways to Get Banned by Google

1. Hidden text in the name of accessibility

Many web designers are using a CSS technique that hides text – allegedly for accessibility. The technique uses a background-image to replace text, and it is commonly touted as a way to maintain accessibility while still displaying a graphic instead of a text header. Unfortunately, it fails in many screen readers (see above article), and it is considered by the search engines to be hidden text!

2. Unintentionally spamming keywords

Many webmasters insert keywords into their keywords, title and description tags that aren’t used anywhere on the page. The problem occurs when keywords are brainstormed separately from content development. Good SEO involves doing the two together. Don’t expect Google to just trust that you are relevant for these keywords. How relevant could you be for your keywords if you never use them, anyway?!

3. Excessive links

Webmasters have become obsessed with their Google PageRank, and are trading links at an ever more furious pace. Having too many outbound links on a single page makes your site look more like a “link farm” than a legitimate website. If you do need to link to 100 or more sites, place the links on separate pages. Instead of focusing on an unhealthy number of links, create quality content and allow the links to flow in NATURALLY. That is what Google wants to see anyway.

For Google’s webmaster guidelines, visit http://www.google.com/webmasters/guidelines.html

What to do if you are banned? Google’s Matt Cutts has the answer: after you have fixed your site, file a reinclusion request.
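To see whether a page is drifting into “link farm” territory, count its outbound links – anchors pointing off your own domain. A rough standard-library sketch (the domain comparison here is simplistic; subdomains are treated as different sites):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundCounter(HTMLParser):
    """Count anchor tags whose href leaves the given domain."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        host = urlparse(dict(attrs).get("href") or "").netloc
        if host and host != self.own_domain:
            self.outbound += 1

def outbound_links(html, own_domain):
    counter = OutboundCounter(own_domain)
    counter.feed(html)
    return counter.outbound
```

If the count on any single page is creeping toward 100, it is time to split those links across several pages.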

Make sure your site is accessible when Googlebot visits

For all of the attention paid to “on-page relevance” and “link popularity”, SEO begins with accessibility. Are your websites search engine friendly? Follow our checklist below to measure:

A. Search engine point of view

Surf your site from a search engine’s point of view. Download and install Firefox and the Fangs extension. By experiencing the web as a non-sighted user, you are also experiencing the web as a search engine would. Can you navigate your site easily? Experienced SEOs already have the tools and knowledge to examine a site for the items below, but beginners may find our free tool at the SEO Diagnostic website quite helpful.

1. HTTP codes

When any web page is accessed, the server hosting the page responds with a code. Most visitors are familiar with error 404, returned when a page is missing, but many users are not aware of the full range of HTTP codes that can be returned. Are your pages returning a code 200, indicating “OK”? Don’t know where to start? Test your site at http://www.seodiagnostic.com.

2. Page size

The size of your HTML page – including all images, stylesheets, and external scripts – should never exceed 90k. A recent study shows only 63% of Americans are online – and an even smaller number have a broadband connection. If your target market includes consumers, or impatient business users, it is imperative to keep the size of your pages (including images, scripts and stylesheets) under control.

3. Frames

Frames have usability and accessibility challenges that are rarely overcome. If your site’s HTML code uses frames, you should have an experienced SEO see if they can make it navigable inside a NOFRAMES tag. You may also require a site redesign.

4. Flash

Flash navigation cannot be followed easily by search engines. While Flash is becoming more friendly, it still poses challenges to search engine indexing. Where Flash is used on a site, make sure that HTML text links also exist.

5. JavaScript navigation

JavaScript menus and rollover images make for stunning visual elements, but be sure to include navigation for your visitors who are not capable of executing JavaScript! While a NOSCRIPT tag may be indexed by some search engines, not all are created equal.

6. Dynamic URLs

Google’s webmaster guidelines advise against dynamic URLs. Humans don’t like ’em, and search engine spiders don’t either! Use Apache’s mod_rewrite – hire an expert if you need to – but get rid of those dynamic URLs!

7. Robot metatags / robots.txt

While it may not seem that robots.txt is always necessary, consider adding a very basic robots.txt file that welcomes all spiders to index the site. Consider the relatively “open arms” policy our site has, reflected in our robots.txt:

# All robots will spider the domain
User-agent: *
Disallow:

User-Agent: googlebot
Disallow:

# Disallow directory /cgi-bin/
User-agent: *
Disallow: /cgi-bin/

8. Google Sitemap.xml

Google’s new sitemap features allow your site to specify page locations and how often to “check back”. For large sites, this can be a dream come true. For smaller sites, it can be an opportunity to see your site as Google sees it. Once your Google sitemap is active, Google will disclose any problems it had with your website. Get started in the Google Webmaster Area.

9. Yahoo Site Explorer

Yahoo Site Explorer is another great tool to see your site as a search engine does. No sitemap creation necessary!

10. Pages indexed

The number of indexed pages is a telling measurement of your search engine accessibility. To view the number of pages any of the major engines have indexed on your site, do a search for your domain with the “site:” prefix. For example, for seomoz.com you could search for “site:seomoz.com” (no quotes) in MSN, Yahoo or Google. You will see that each of the engines has a different number of indexed pages!

11. Best practices

HTML sitemap: Include an HTML sitemap to help your search engine (and other) visitors get to any page fast. An HTML sitemap is simply an HTML page with links throughout your site. Keep the number of links under 100, as Google’s Webmaster Guidelines recommend. Structure the links in a categorized outline, well organized for human and search engine visitors.

CSS and JavaScript relocated to external files: CSS files can be cached, reducing your bandwidth bill and providing less code for the engines to wade through before they encounter content. For human visitors, we often think about what is “above the fold”. For search engines, try to get as much juicy content “above the fold” as possible, too.

HTML validation: While search engines can navigate some of the worst HTML code, why make it any harder than it needs to be? Try to keep to valid HTML as much as possible.

Search engine accessibility is so very important. First the search engines come, then the human visitors. It does not matter how pretty your site is if important visitors like the Googlebot cannot get through it. Take a good look at your site accessibility to determine improvements you might be able to make. And, of course, many of these accessibility metrics can be measured with our new tool SEO Diagnostic. If you aren’t sure where to start, start there!
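Once you have crawled your URLs and recorded each response code (item 1 in the checklist), a quick triage report helps you see where to act. A Python sketch – the categories are our own grouping, not an official taxonomy:

```python
def crawl_report(results):
    """results: mapping of URL -> HTTP status code from a site crawl.
    Groups URLs into a simple triage report."""
    report = {"ok": [], "redirect": [], "missing": [], "error": []}
    for url, code in sorted(results.items()):
        if code == 200:
            report["ok"].append(url)
        elif code in (301, 302):
            report["redirect"].append(url)  # check each points somewhere useful
        elif code == 404:
            report["missing"].append(url)   # candidates for a 301 redirect
        else:
            report["error"].append(url)     # 5xx etc. -- crawlers may give up
    return report
```

Everything you want indexed should land in the “ok” bucket; anything else deserves a look before the next crawl.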

How are you handling 404 pages?

Yahoo has incorporated a new feature in its Slurp bot to find out how your website is handling missing pages. Yahoo occasionally may request random gibberish URLs on your server – usually with slurpconfirm404 in them – to see how missing pages are being handled. Understanding how search engine robots are crawling and responding to your site is vital in communicating the accessibility and relevancy of your content.

What is a 404 error?

A 404 error is a low-level code that is supposed to be sent by a webserver when the page that has been requested is not found on the server. It communicates to the web browser (or search engine robot) that the content couldn’t be found. Even a missing graphic should return a 404 error. Don’t worry – your site correctly returns a 404 error unless you (or your “web guy”) have intentionally manipulated it to do otherwise.

What should be returned when an item is missing?

It makes sense to stick with predictable behaviors when dealing with both users and search engines alike. Some sites may have gotten away from sending 404 errors because that is ALL they were sending. It isn’t very user-friendly to send a page that simply says “404 error”, but with a little tweaking, a sitemap and branded page can be sent to your user. This simple step will help you keep more visitors, and communicate to search engines when they have requested missing content. See more about Yahoo and slurpconfirm404.
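You can run the same probe Yahoo does against your own server: request a URL that cannot exist and see what comes back. Here’s a Python sketch – the fetcher is passed in so you can wire up urllib, a test stub, or anything else; the names are illustrative:

```python
import random
import string

def probe_missing_page(fetch_status, base_url):
    """Request a gibberish URL -- as Yahoo's slurpconfirm404 probe does --
    and report how the server handles a page that cannot exist.
    fetch_status: callable taking a URL and returning its HTTP status code."""
    gibberish = "".join(random.choices(string.ascii_lowercase, k=24))
    status = fetch_status(f"{base_url}/slurpconfirm404-{gibberish}.html")
    if status == 404:
        return "ok: missing pages return a true 404"
    if status == 200:
        return "soft 404: missing pages return 200 -- fix your error handling"
    return f"unexpected status {status}"
```

A “soft 404” (a friendly error page served with status 200) looks fine to humans but tells search engines the missing page exists – serve your branded error page with a real 404 code instead.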