The SEO Implications of Getting “Hacked”

Websites are increasingly being hacked on autopilot. Intruders use scripts to crawl the web and infect sites running outdated or insecure software, including plugins, add-ons, and themes. Security is necessary for web marketing to succeed, and SEO is particularly vulnerable.

1. Spammy Content
Intruders typically want to use a website’s existing authority in Google to push the spammiest content, usually with affiliate links to casinos, adult content, pharmacies, and the like. New pages are often created outside the view of your normal website. Once Google finds this kind of spammy content on your site, your rankings can suffer, and your site might even be classified as “Adult in Nature.” That can mean a complete loss of search visibility for prospects with “Safe Search” turned on in their search engine of choice! Google has said that even comments are taken into account when considering overall page content, so having entire sections of pages vulnerable can be particularly dangerous.

2. Thin and Duplicate Content Penalties
The pages that intruders create are usually low-quality content. They take shortcuts to build “unique” pages on hijacked websites, and those shortcuts can mean a Google Panda penalty for your site as well! Thin pages and duplicate content matching other hacked sites are enough to set off Google’s alarms.

3. Ads and Affiliate Links
With Google’s updates centered on quality, it’s easy to set off alarms when your site is suddenly hosting ads and affiliate links for all sorts of things. Google’s quality guidelines take into account factors such as ads above the fold, links to known affiliate networks, and so on. If these are part of your intruder’s monetization strategy, your rankings in Google are very likely to suffer!

4. Over-Optimized Content
Intruders still often use outdated and aggressive SEO techniques, and that can mean over-optimization penalties as well. Repeating a keyword several times in a title tag, or endlessly in page content, is an aggressive SEO technique that used to work. But not with modern Google! With spammy automated content created by an intruder, hacked websites are again vulnerable to Google penalties.

5. Growth and Loss of Indexed Pages
For years, Google has been wary of sites that grow their page count by a thousand percent overnight. And when the intrusion is fixed, it can look like a massive cull to Google, as 90% of the site’s content suddenly becomes uncrawlable. This instability cuts both ways in the world of search engine crawlers, and it can take a while to undo.

6. Spammy Inbound Links
To get the intruder’s pages to rank on search engines, an automated link campaign is often created. The words “automated link campaign” carry the connotation of low quality, and that’s especially true here. Links can come from other compromised websites, adult sites, and the absolute worst of the web! There are various ways to research what links have been created, but it’s difficult to catch them all. Many will have been de-indexed by Google but still counted. Link cleanup and disavowal could potentially go on for years.

7. Getting Onto a Blacklist
There are services, including Google, that may warn potential visitors away from your site. Google will warn them right from its search results! Antivirus programs from Norton, McAfee, and many others also scan websites, and once you are on one of their blacklists, they can block visitors outright.
You won’t even see those attempted visits show up and bounce in analytics; blocked visitors never reach your site to trigger the analytics code. It can be hard to get off of these blacklists, too, and most companies don’t even think to check them after cleaning up an intrusion.

So what can you do about this? Prevention is key. When it comes to website intrusions, even large companies don’t pay enough attention to security until one happens. Software updates are just the beginning: consider monitoring admin logins, file system changes, and more (a minimal file-change monitor is sketched below). Catching an intrusion early is vital as well; if warnings are already showing in webmaster tools, it can be a long road back for your website’s visibility.
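The post doesn’t prescribe a particular monitoring tool, so as one illustration of watching for file system changes, here is a minimal sketch in TypeScript for Node.js: it hashes every file under a web root, compares against a saved baseline, and prints anything new, changed, or deleted. The paths (WEB_ROOT, BASELINE) are hypothetical placeholders, and a real deployment would add alerting and ignore rules.

```typescript
// Minimal file-change monitor: hash files under a web root and compare to a
// saved baseline. Run it on a schedule and alert on any output.
// WEB_ROOT and BASELINE are example paths only.
import { createHash } from "crypto";
import { existsSync, readdirSync, readFileSync, writeFileSync } from "fs";
import { join } from "path";

const WEB_ROOT = "/var/www/html";             // hypothetical web root
const BASELINE = "/var/lib/site-hashes.json"; // hypothetical baseline file

// Recursively collect all file paths under a directory.
function listFiles(dir: string): string[] {
  return readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
    const full = join(dir, entry.name);
    return entry.isDirectory() ? listFiles(full) : [full];
  });
}

// SHA-256 of a file's contents.
function hashFile(path: string): string {
  return createHash("sha256").update(readFileSync(path)).digest("hex");
}

// Build the current snapshot.
const current: Record<string, string> = {};
for (const file of listFiles(WEB_ROOT)) {
  current[file] = hashFile(file);
}

// Compare against the previous snapshot, if one exists.
if (existsSync(BASELINE)) {
  const previous: Record<string, string> = JSON.parse(readFileSync(BASELINE, "utf8"));
  for (const [file, hash] of Object.entries(current)) {
    if (!(file in previous)) console.log(`NEW FILE: ${file}`);
    else if (previous[file] !== hash) console.log(`CHANGED:  ${file}`);
  }
  for (const file of Object.keys(previous)) {
    if (!(file in current)) console.log(`DELETED:  ${file}`);
  }
}

// Save the current snapshot as the new baseline.
writeFileSync(BASELINE, JSON.stringify(current, null, 2));
```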

5 web development techniques to prevent Google from crawling your HTML forms

Google has recently decided to let its Googlebot crawl through forms in an effort to index the “Deep Web.” There are numerous stories about wayward crawlers deleting and changing content by submitting forms, and it’s about to get worse: Googlebot is about to start submitting forms in an effort to reach your website’s deeper data. So what’s a web developer to do?

1. Use GET and POST requests correctly
Use GET requests in forms to look up information, and POST requests to make changes. Google will only be crawling forms via GET requests, so following this best practice for forms is vital.

2. Make sure your POST forms do not respond to GET requests
It sounds simple, but many sites are being exploited for XSS (Cross-Site Scripting) vulnerabilities because they respond (and return HTML) to both GET and POST requests. Check your form input carefully on the backend, and for heaven’s sake, do not use globals! (A minimal sketch of method-specific routing appears after this list.)

3. Use robots.txt to keep robots OUT
A robots.txt file keeps Googlebot out of where it doesn’t belong. Luckily, Googlebot will continue its excellent support of robots.txt directives when it goes crawling through forms. Be sure not to accidentally restrict your website too much, however. Keep the directives simple, excluding by directory if possible. And test, test, test in Google’s Webmaster Tools!

4. Use robots metatag directives
Use the robots metatag directives for more refined control. We recommend “nofollow” and “noindex” directives for both the form submission page and any search results pages you want Google to stay out of, even though Google says disallowing the form submission page is enough. Consider using tag and category pages that are Google friendly instead. (Example directives appear after this list.)

5. Use a CAPTCHA where possible
Googlebot isn’t going to fill out a CAPTCHA, so it’s an easy way to make sure a bot isn’t filling out your form. Googlebot is, of course, the nicest bot you could hope to have visit your website. This is a chance to secure forms and take necessary precautions before other, not-so-polite bots visit your forms.
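To make techniques 1 and 2 concrete, here is a minimal sketch using Node.js with Express (a framework chosen here only for illustration; the post doesn’t name one). The route and field names (/search, /items/delete, q, id) are hypothetical. The lookup endpoint answers GET requests and changes nothing, while the delete endpoint is registered only for POST, so a crawler issuing a GET to that URL gets Express’s default 404 instead of triggering a change.

```typescript
// Sketch: GET for lookups, POST for changes, and no GET handler on the
// state-changing route. Names below are illustrative only.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false })); // parse POST form bodies

// Technique 1: a lookup form should submit via GET -- it is safe to crawl
// because nothing is modified.
app.get("/search", (req, res) => {
  const query = String(req.query.q ?? "");
  res.json({ query, results: [] }); // read-only response
});

// Technique 2: a state-changing form should only respond to POST.
// Because this route is registered with app.post(), a GET request to
// /items/delete never reaches this handler -- a wandering crawler cannot
// delete anything by following a link.
app.post("/items/delete", (req, res) => {
  const id = String(req.body.id ?? "");
  // ...perform the deletion for `id` here, after validating the input...
  res.json({ deleted: id });
});

app.listen(3000);
```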
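For techniques 3 and 4, the directives themselves are short. The directory and page names below are placeholders; substitute your own form-handler and search-results URLs, and verify the rules in Google’s Webmaster Tools before relying on them.

```txt
# robots.txt -- keep crawlers out of the form handler and search results
# (directory names are placeholders)
User-agent: *
Disallow: /form-handler/
Disallow: /search-results/
```

```html
<!-- On the form submission page and on search-results pages
     you want kept out of the index -->
<meta name="robots" content="noindex, nofollow">
```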

Keeping track of multiple passwords

RSA Security’s newest password management survey found that one of the greatest threats to corporate security is the weak password. Employees who change their passwords too often, or have to juggle too many passwords for logins to various services, are likely to choose weak passwords or even write them on a scrap of paper near their workstation. I am a little suspicious of a survey that highlights RSA Security as the solution to this problem, but it is valuable to stop and ask yourself, “Do I have too many passwords to keep track of?” Sure, too many passwords lead to “irresponsible password behavior.” But a single login and password for every service is usually a bad idea, too; once an intruder has access, they could wreak tremendous havoc.

A sensible alternative is to choose four passwords that you can actually remember, making each one incrementally more random if possible. Use the weakest password to sign up for services that only need a password for the most rudimentary of tasks. Use the second-level password for sites that hold some personal information: your name, address, and so on. Save the third-level password for sites that have your credit card on file. The final password is to be used only for online banking and/or PayPal. Gee, so simple. But who can keep track of four passwords, anyway?! Good luck out there; no one ever said good security was easy! (More information on the password survey)