Search engine cloaking and stealth technology
by David Callan

Tired of the search engine optimization game? Lots of webmasters are. Today the Internet is more a big shop than the information library it first became popular as, and that of course means there are hundreds if not thousands of sites competing for the same customers.
Search engines play a very big part in whether company A or company B gets a visitor and potential customer. Webmasters and Internet marketers know this, and hence competition for search engine traffic is fierce. These days it's almost impossible to keep up with the search engines: one day your site could be near the top, the next day your competition could be there and you could be gone from the results completely.
One particular method, however, is being used by webmasters to enable their sites to rank high and stay high. The method is highly controversial and risky. It's called search engine cloaking.
What is search engine cloaking?
Search engine cloaking is a technique used by webmasters to enable them to get an advantage over other websites. It works on the idea that a 'fake' page is delivered to the various search engine spiders and robots while the real page is delivered to real human visitors.
In other words, browsers such as Internet Explorer, Netscape and Opera are served one page, while spiders visiting the same address are served a different page.
The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it will be configured exactly the way the search engines want it to be for it to be ranked high. These 'ghost pages' are never actually seen by any real person, except of course for the webmasters that created them.
When real people visit a site using cloaking, the cloaking technology, which is usually based on Perl/CGI, will send them to the real page, which looks good and is just a regular webpage.
The search engine cloaking technology is able to tell the difference between a human and a spider because it knows the spiders' IP addresses. No two IP addresses are the same, so when a visitor arrives at a site which is using cloaking, the script will compare the visitor's IP address with the IP addresses in its list of search engine IPs. If there's a match, the script knows that it's a search engine visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
Once a list of all the search engine spiders' IP addresses has been stored, it's simply a case of writing a script that says something like:
If IP request = Google (spider IP) then show googlepage.html
If IP request = unknown (other user) then show index.html
This means that when the Google spider comes to visit a site, it'll be shown a page that is optimized with keywords, heading tags and optimized content. Since the optimized page is never seen by a casual user, design is not an important issue. When a user comes to the site, the server performs the same check and, finding that the IP address does not match any in its list, shows the standard page.
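To make the idea above concrete, here's a minimal sketch of an IP-based cloaking check in Python. The IP addresses and file names are hypothetical stand-ins; a real cloaking script would keep a much larger, regularly updated list of spider IPs.

```python
# Minimal sketch of IP-based cloaking.
# The spider IPs and page file names below are hypothetical examples.

SPIDER_IPS = {
    "66.249.66.1",   # hypothetical Googlebot address
    "66.249.66.2",   # hypothetical Googlebot address
}

def page_for(visitor_ip):
    """Return which page file to serve for a given visitor IP address."""
    if visitor_ip in SPIDER_IPS:
        return "googlepage.html"   # bare-bones page optimized for spiders
    return "index.html"            # regular page for human visitors

print(page_for("66.249.66.1"))   # a known spider gets the optimized page
print(page_for("203.0.113.7"))   # anyone else gets the normal site
```

In a real Perl/CGI setup the visitor's address would come from the `REMOTE_ADDR` environment variable rather than a function argument, but the comparison logic is the same.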
Search engine cloaking is also a great way of protecting the source code that's enabling you to rank high on the search engines. Ever read a search engine ranking tutorial that recommends modelling your keyword density, layout, etc. on pages that are already ranking high?
Well, technically that's stealing, and your competition might want to do it to you some day. With search engine cloaking, however, you can protect your code, because when your competition visits they'll be sent to the regular page and not the page that's earning you those precious rankings.
Different types of search engine cloaking
There are two types of cloaking: the first is called User Agent Cloaking and the second is called IP Based Cloaking, which we've already discussed above. IP based cloaking is the better method, as IP addresses are very hard to fake, meaning your competition won't be able to pretend to be any of the search engines in order to steal your code.
User Agent Cloaking is similar to IP cloaking in the sense that the cloaking script compares the User Agent text string, which is sent when a page is requested, with its list of search engine User Agent names, and then serves the appropriate page.
The problem with User Agent cloaking is that Agent names can be easily faked. Imagine Google introducing a new anti-spam method to beat cloakers: all they need to do is fake their name and pretend to be a normal person using Internet Explorer or Netscape, and the cloaking software will take Google's bot to the non-optimized page, and hence your search engine rankings will suffer. User Agent cloaking is much riskier than IP based cloaking and it's not recommended.
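A user-agent check is even simpler than an IP check, which is exactly why it's so easy to fool. The sketch below (the agent substrings and page names are illustrative, not real cloaking-software values) shows both the check and how trivially a spider faking a browser string would slip past it.

```python
# Minimal sketch of User Agent cloaking.
# The spider name substrings and page file names are illustrative examples.

SPIDER_AGENTS = ("Googlebot", "Slurp", "bingbot")

def page_for_agent(user_agent):
    """Serve the optimized page if the User-Agent string looks like a spider."""
    if any(name in user_agent for name in SPIDER_AGENTS):
        return "googlepage.html"   # optimized page for spiders
    return "index.html"            # regular page for browsers

# A spider announcing itself honestly gets the optimized page...
print(page_for_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))
# ...but a spider sending a fake browser string sees the regular page,
# which is exactly the anti-cloaking trick described above.
print(page_for_agent("Mozilla/5.0 (Windows NT 10.0; rv:115.0) Firefox/115.0"))
```

Because the User-Agent header is just text the client chooses to send, there is no way for this script to verify it, which is why IP based cloaking was preferred.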
Search engine cloaking conclusion
Search engine cloaking isn't as effective as it used to be. This is because the search engines are becoming increasingly aware of the different cloaking techniques being used by webmasters, and hence they're gradually introducing more sophisticated technology to combat them. That said, cloaking can still benefit your search engine rankings; it's just a matter of being very careful.
I would recommend you read my SEO tutorial entitled Search engine optimization guide and try regular search engine optimization first. If after a few months you're still not seeing good results, then you should at least consider using cloaking technology to improve your rankings.