Cloaking
By Richard Lowe
Posted Thursday, October 21, 2004
The search engines are pretty good at their jobs. This is especially true of the larger, more established monster listings such as Altavista and Google. They have to be good, as they are in a constant state of war with search engine spammers (webmasters who attempt to artificially increase their rankings in the search engines by unethical means).
You see, the higher a site ranks in the major search engines, the more hits it receives. In many cases, hits directly translate into dollars. Thus, a web site which can, say, double its hits can often double the amount of money it makes.
What does this have to do with anything? Well, it's common knowledge that sites which do not show up on the first three pages of listings in a major search engine may as well not be listed at all.
In addition, it's important that a site get listed on popular keywords. For example, far more people search for the word "plumber" than for "person who fixes pipes". While you might get a few visitors with the latter term, you will not get anywhere near as many as with the former.
Each of the major search engines has different rules that it uses to rank web sites. Some engines want metatags, some prefer straight text and others want a mixture of both. Some search engines may be fine with dozens of keywords in a metatag, and others want only one. The list goes on and on - each search engine looks at different things in a page.
Why do they go through all of this trouble? They are attempting to determine what your page is all about. The theory is that the more a particular keyword (or phrase) is mentioned (and the more ways it is mentioned), the more likely your page is about that subject. Thus, if "plumber" appears in the text a few times (especially in the H1 and H2 tags), in a metatag, an ALT tag and the title, then it's likely your page is indeed about plumbers.
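To make that concrete, here is a rough sketch of what such a page might look like. The business name, file name and wording are invented purely for illustration:

    <html>
    <head>
      <title>Smith Plumbing - Your Local Plumber</title>
      <meta name="keywords" content="plumber, plumbing, pipe repair">
      <meta name="description" content="A friendly local plumber for pipe repairs.">
    </head>
    <body>
      <h1>Your Local Plumber</h1>
      <h2>Plumber Services We Offer</h2>
      <img src="van.jpg" alt="Plumber's service van">
      <p>Need a plumber? We repair leaking pipes, drains and water heaters.</p>
    </body>
    </html>

Notice how "plumber" turns up in the title, the metatags, the headings and the ALT text - exactly the places the engines are said to look.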
On top of that, the search engines must protect against spammers. These are people who use various tricks to fool the engine into thinking they should be well ranked. For example, a common technique a few years ago was to include very small, invisible text containing keywords. The visitors would not see this text but the search engine would and thus would be fooled (the search engines figured this one out a long time ago and it no longer works).
When the search engines discover a web site is spamming, their response is to either (a) drop the site way down in rankings or (b) ban it entirely. If your site has ever been banned from one of the big engines, then you completely understand how devastating it can be to be dropped all of a sudden.
But then again, getting to the top can be so rewarding. It can mean the difference between a thousand dollars in sales and a million. Literally. But how do you get to be first for a particular keyword in as many search engines as possible? One way is to look at other sites to see what they have done and, ahem, steal the ideas (or just copy their keywords to your own pages).
But there is a wildcard in all of this, and that's the simple fact that the search engines use different rules to determine the ranking of a page. One engine allows three keywords and will rank higher if it finds three, another might want those keywords to be near the top of the page, and still another might want them in a comment. The second engine (the one that wants the keywords near the top) might actually drop your rankings if it finds three keywords.
One of the more common ways to handle the problem of different search engines is to have different entry pages. Using this method, you might have a page which is perfect for Google, another which is exactly right for Altavista and a third which is made for Northern Light. The problem with this, of course, is that your visitors will be directed by each engine to pages which are probably not exactly right for human beings. After all, the engines tend to work better without all of those fancy tables and lists which make your pages look so good. And, of course, this does nothing to prevent someone from stealing, uh, borrowing, your keywords.
There is a technique which appears, on the surface, to solve every single problem that you could dream of having with rankings and different search engines. This technique will make it harder for people to steal your keywords, and it will allow you to have different pages for each search engine while still landing your visitors on a page perfectly suited for human reading.
It's called "cloaking" and it is exactly what it sounds like. The technique is pretty simple, really. You see, search engines are very nice about identifying themselves. They do this for a number of reasons, one of which is to make it easy for a web site to allow or reject their attentions (believe it or not, sometimes there are good reasons NOT to be listed in a search engine).
In a cloaked site, a special script is written which is executed on the server. This can be done with ASP or PHP pages (these are two different scripting languages) although most commonly it is done with standard CGI scripts executed using SSI.
Using this method, the script is called before the page is loaded. The script examines the user agent string to determine what is requesting the page. Is it a browser or a search engine? If it is a search engine, which one is it? Based upon the answer, the script loads the appropriate page. So if it determines that the page is being requested by Altavista, it will call up the page which is optimized for Altavista. The same goes for Google, Northern Light or any number of other engines.
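To give you the flavor of it, here is a bare-bones sketch of the idea written as a PHP page. The file names and the handful of user agents it checks are invented for illustration - a real script would recognize many more crawlers (and, as you'll see shortly, you shouldn't write one at all):

    <?php
    // Look at the user agent string to see what is requesting the page.
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

    if (stristr($agent, 'Googlebot')) {
        include 'page-for-google.html';      // version tuned for Google
    } elseif (stristr($agent, 'Scooter')) {
        include 'page-for-altavista.html';   // Scooter was Altavista's crawler
    } else {
        include 'page-for-humans.html';      // ordinary visitors get the real page
    }
    ?>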
This tends to hide the keywords and other search engine ranking techniques from prying eyes, since human beings always see a page created explicitly to be seen by humans. Note that this just makes it more difficult to get these keywords, not impossible. You see, the name of the search engine or browser (called a user agent) is handed to the server by whatever is making the request - and it's not hard to fake (in fact, it's pretty darn trivial).
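Just to illustrate how trivial, here is another short PHP sketch (using a placeholder address) which fetches a page while pretending to be Google's crawler:

    <?php
    // Ask for a page while claiming to be Googlebot.
    $context = stream_context_create(array(
        'http' => array('header' => "User-Agent: Googlebot/2.1\r\n")
    ));
    echo file_get_contents('http://www.example.com/', false, $context);
    ?>

Anyone who runs something like this sees whatever your cloaking script would have shown to Google - keywords and all.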
Cloaking is somewhat of a pain, since it does require a very well written script, the use of server-side scripts, and, of course, a different page for each engine plus one for human reading. And since it's best to do this with ALL of your pages, it could significantly increase the amount of work you need to put into your site.
Another thing that cloaking is very good for is to present different pages to different browsers. This is a very cool way to create a site which looks perfect in Netscape and Internet Explorer as well as Opera. Of course, creating different pages for just these three browsers triples your work. So should you consider cloaking? Absolutely not.
You should NOT use cloaking.
Let me repeat this - do not use cloaking on your web site.
On the surface it sounds like the perfect solution to search engine optimization except for one significant fact.
Cloaking is considered by all of the major search engines to come under the heading of search engine spamming. If you are caught (and it's easy for a search engine to figure it out) you WILL be banned from the engine. How do they catch you? Simple. The search engine simply sends a few test requests to your site at the same time, using different TCP/IP addresses and identifications, and "fools" your script into thinking it's a different engine. If your pages look different, it's possible the site is cloaked.
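To see why it's so easy, here is a rough sketch of the detection idea (again with a placeholder address; the real engines also vary the requesting IP address, which a simple script like this can't show):

    <?php
    // Fetch the same URL twice - once as a browser, once as a crawler -
    // and compare the two responses.
    function fetch_as($url, $agent) {
        $context = stream_context_create(array(
            'http' => array('header' => "User-Agent: $agent\r\n")
        ));
        return file_get_contents($url, false, $context);
    }

    $url     = 'http://www.example.com/';
    $browser = fetch_as($url, 'Mozilla/4.0 (compatible; MSIE 6.0)');
    $crawler = fetch_as($url, 'Googlebot/2.1');

    if (md5($browser) !== md5($crawler)) {
        echo "The responses differ - this page may be cloaked.\n";
    }
    ?>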
So my advice is simple. Don't use cloaking. Instead of putting your efforts into fad promotional techniques and spamming methods, create quality content, get other webmasters to link to your site, and add honest keywords, titles, ALT tags and descriptions. Do this and your site will honestly move up the rankings. Honesty is, without fail, the best policy.
About the Author
Richard Lowe Jr. is the webmaster of Internet Tips And Secrets at (http://www.internet-tips.net) - Visit our website any time to read over 1,000 complete FREE articles about how to improve your internet profits, enjoyment and knowledge.