Thursday, March 13, 2008

Six Sneaky SEO Techniques That Will Get Web Sites Banned

by Dharmesh Shah

All businesses—large and small, new and old—have one common denominator: They want to be found by customers, partners, and prospects; increase leads and brand awareness; and, ultimately, generate new business opportunities.

One of the most effective ways to do all that is search engine optimization (SEO).

There are many legitimate, effective SEO techniques for optimizing your business's Web site, and then there are nefarious methods, which not only ruin your reputation and get your site banned from Google but can also have legal ramifications.

There is an ongoing debate among experts as to what is considered "white hat" vs. "black hat" SEO. In my mind, the big difference is that "white hat" SEO helps the search engines deliver quality results to users by working within existing guidelines. On the other hand, "black hat" SEO involves exploiting current limitations in search engine algorithms.


Experts often disagree about exactly where the line between "black hat" and "white hat" falls. My argument is that it doesn't matter what you call them; certain techniques are simply bad ideas and should be avoided by most (if not all) marketers.

The reasons vary, but they share a common pattern: Avoid SEO practices that rely on tricking search engines and distorting search results. Here's my rule of thumb: If a human doing a "manual review" could spot a given technique, it's probably a bad idea.

It's safe to assume that if you try to exploit a hole in the algorithm today, your advantage is going to be temporary. More importantly, you carry a significant risk of having your Web site penalized or banned. The reward, even if there is one—and, in most cases, there isn't—is usually not worth it.

The 6 Sneaky SEO Techniques Marketers Should Avoid

1. Link Farms

There's general consensus that one of the strongest influences on search rankings is the number and quality of inbound links to a Web page. A link farm is a group of Web sites created for the primary purpose of creating a high number of links to a given Web site. These links are not "real" (in terms of signaling the quality of the site they link to), and so they are an attempt to distort search engine results.
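To see why the pattern is detectable, consider a toy sketch (my own illustration, not anything Google has published): in a farm, the sites linking to you also link densely to one another, whereas organic inbound links come from mostly unrelated sites. All the domain names below are hypothetical.

```python
# Hypothetical heuristic, not a real search engine's algorithm: measure how
# densely the sites linking to a target also link to one another.
from itertools import combinations

# Toy inbound-link graph: domain -> set of domains it links to.
links = {
    "farm-a.example": {"target.example", "farm-b.example", "farm-c.example"},
    "farm-b.example": {"target.example", "farm-a.example", "farm-c.example"},
    "farm-c.example": {"target.example", "farm-a.example", "farm-b.example"},
    "real-blog.example": {"target.example", "news-site.example"},
}

def farm_score(target: str) -> float:
    """Fraction of pairs of the target's linkers that also link to each other."""
    linkers = [d for d, outs in links.items() if target in outs]
    pairs = list(combinations(linkers, 2))
    if not pairs:
        return 0.0
    mutual = sum(1 for a, b in pairs if b in links[a] or a in links.get(b, set()))
    return mutual / len(pairs)

print(f"interlinking density: {farm_score('target.example'):.2f}")
# Prints 0.50 here; for genuine inbound links, the sites casting the
# "votes" rarely know each other at all, so organic scores stay near zero.
```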

2. Automated Content Generation/Duplication

Search engines like content. They particularly like frequently updated content. Unfortunately, creating unique content takes time and energy. To try to trigger search engine spiders to index more pages from a Web site, and do so more frequently, some may try to auto-generate content or scrape Web content from other sites and republish it.

This technique often goes hand in hand with link farms (because if you're creating thousands of sites, you need some content to put on them in order to get the search engines to index them and for the links to matter).

Google has gotten very good at determining what is "natural" content vs. content that is computer-generated gibberish with no value. As for duplicating content on other Web sites without permission, this is often in violation of copyright laws, and it's unethical.
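One reason engines are good at this is that near-duplicate detection is a well-studied problem. A classic published approach is w-shingling (Broder, 1997): break each document into overlapping word sequences and measure the overlap between the two sets. Below is a minimal sketch; the snippets and the three-word window are illustrative, and real systems are far more sophisticated.

```python
# Minimal w-shingling sketch: scraped text that has been lightly edited
# still shares most of its word sequences with the original.

def shingles(text: str, w: int = 3) -> set:
    """Return the set of w-word sequences ('shingles') in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets: 1.0 = identical, 0.0 = disjoint."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "search engines like content they particularly like frequently updated content"
scraped  = "search engines like content and they particularly like frequently updated content"

print(f"similarity: {jaccard(shingles(original), shingles(scraped)):.2f}")
# ~0.55 even after inserting a word; on full-length pages, overlap like
# this is a strong near-duplicate signal.
```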

3. Keyword Stuffing

This involves over-populating certain portions of a Web page with repeated occurrences of a given keyword in the hopes of influencing search engine results. Search engines caught on to this trick many years ago, yet it remains popular for some reason.
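Part of why the trick stopped working is that the signal is arithmetic-simple. A naive density check, sketched below with made-up snippets, is enough to separate stuffed copy from natural prose; real engines use far richer language statistics than raw density.

```python
# Naive sketch: keyword stuffing is detectable with arithmetic this simple.
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Share of all words on the page that are the given keyword."""
    words = text.lower().split()
    return Counter(words)[keyword.lower()] / len(words) if words else 0.0

stuffed = "cheap widgets cheap widgets buy cheap widgets best cheap widgets cheap"
natural = "our widgets are affordable and ship anywhere in two business days"

print(f"stuffed: {keyword_density(stuffed, 'cheap'):.0%}")   # 45%
print(f"natural: {keyword_density(natural, 'cheap'):.0%}")   # 0%
# Natural prose rarely exceeds a few percent for any single word.
```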

4. Cloaking

This practice involves delivering different Web site content to the search engine spiders than is delivered to human users. The usual motivation is to feed the crawlers content crafted to rank for a certain term while showing real users something else entirely.

It's pretty easy for the search engines to detect this: If a site is suspected of cloaking, someone (like a Google employee) can simply visit it as an ordinary user and compare what they see with what the crawler was served. This technique, when discovered, is one of the most reliable ways to get a site banned.
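To make the mechanism concrete, here is the core of a cloaking scheme reduced to a hypothetical sketch. The branch on the visitor's User-Agent string is the whole trick, and it is also exactly what a manual review exposes.

```python
# Illustrative only (don't do this): cloaking is a branch on the User-Agent.
KEYWORD_PAGE = "<html>...content optimized for the ranking term...</html>"
HUMAN_PAGE   = "<html>...what visitors actually get shown...</html>"

def serve(user_agent: str) -> str:
    # The telltale branch: crawlers get one page, people get another.
    if "googlebot" in user_agent.lower():
        return KEYWORD_PAGE
    return HUMAN_PAGE

# A manual review is a one-liner: request the page both ways and diff.
assert serve("Googlebot/2.1") != serve("Mozilla/5.0")
```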

5. Hidden Text

This technique "hides" text on the Web page that search spiders will index (for ranking purposes), but is invisible to a human. The simplest example is some variation of white text on a white background.

Depending on how sophisticated you want to get, the hiding can rely on something as simple as HTML tags, or on CSS styling or JavaScript that changes the page dynamically. Regardless of how sophisticated the approach, it is still going to be detected at some point.
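As an illustration of how simple detection can be, here is a naive sketch (my own, covering only inline styles) that flags elements whose text color matches their background color. Real checks also evaluate external stylesheets, computed styles, off-screen positioning, and script-driven changes.

```python
# Naive check: flag inline styles where text color equals background color.
import re

page = '''
<p>Welcome to our site.</p>
<div style="color:#ffffff; background-color:#ffffff">
cheap widgets cheap widgets cheap widgets
</div>
'''

STYLE = re.compile(
    r'style="[^"]*color:\s*(?P<fg>#[0-9a-f]{6})[^"]*'
    r'background-color:\s*(?P<bg>#[0-9a-f]{6})', re.I)

for m in STYLE.finditer(page):
    if m.group("fg").lower() == m.group("bg").lower():
        print("hidden text suspected:", m.group(0)[:60])
```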

6. Doorway/Gateway Pages

This practice is similar to cloaking. Instead of dynamically delivering different content to spiders, a doorway page is made to rank well in the search engines while human users who land on it are redirected to a different page. Clearly, this is not in the interest of end users, as they don't get the content they expected.
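The redirect is the fingerprint. As a rough illustration, the sketch below flags a zero-second meta refresh, the crudest form of doorway redirect; the HTML and URL are invented, and real crawlers also execute JavaScript and compare where bots and users actually end up.

```python
# Rough sketch: flag an instant meta-refresh redirect, a classic doorway giveaway.
import re

doorway_html = '''
<html><head>
<title>best cheap widgets best prices widgets</title>
<meta http-equiv="refresh" content="0; url=https://unrelated-offer.example/">
</head><body>keyword keyword keyword</body></html>
'''

META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv="refresh"[^>]+content="(\d+);\s*url=([^"]+)"', re.I)

m = META_REFRESH.search(doorway_html)
if m and int(m.group(1)) == 0:
    print("doorway suspected: instant redirect to", m.group(2))
```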

It's Not Smart to Try to Outsmart Google Engineers

Just about all of these questionable tactics presume that the search engines will not detect them. They are based on exploiting current (and perhaps non-existent) limitations of search engine algorithms.

I'd argue that Google has a lot at stake and has lots of really smart engineers working on the algorithm. An Internet strategy that is predicated on outsmarting Google is not a smart strategy.

For most marketers, the time and energy spent on trying to take these shortcuts is much better invested in two things:

1. Improving their Web site so that it deserves to rank highly because it has valuable, differentiated content

2. Helping the search engines discover that content for the benefit of users

Working with search engines instead of trying to exploit them is the only approach that works in the long term.



Dharmesh Shah is the chief software architect and cofounder of Internet marketing company HubSpot (www.hubspot.com). He authors OnStartups.com, an online community and blog for entrepreneurs, and writes for HubSpot's blog (blog.hubspot.com).
