New Search Engine Marketing Practices

By David Gikandi
Posted Saturday, October 23, 2004

A new study by Cyveillance shows that the Web has grown to more than 2.1 billion documents and is growing at the rate of 7 million pages per day. Another study by Berrier Associates indicates that people who spend five or more hours a week online spend about 71% of their time searching for information. That goes to show the power search engines still wield over traffic. To keep you up to date on what online marketing professionals are now doing to win the search engine wars, here is a brief look at some of the latest strategies being employed.

Pay-Per-Click Search Engines

Pay-per-click search engines are becoming an extremely effective way to get targeted traffic to a website. You submit your site to the engine and bid for a top ranking: for a few cents per click-through, your site is ranked at the top of the results for your selected keyword searches, and whenever someone clicks through to your site, your account is debited the amount you bid at setup time. The most popular pay-per-click search engine, and the one that started it all, is GoTo.com. The best thing about these engines is that you set the amount you are willing to pay per click-through and you know exactly how highly your site will be ranked for your chosen keywords, so they are a guaranteed way to drive traffic to your site at a price you select. For more information, see:

(http://www.payperclicksearchengines.com)
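
To make the billing concrete, here is a quick back-of-the-envelope sketch in Python; the bid and click figures are made up purely for illustration.

bid_per_click = 0.05       # the amount bid for the keyword, in dollars
click_throughs = 1200      # visitors who clicked through to your site in the period

total_debited = bid_per_click * click_throughs
print("Total debited from the account: $%.2f" % total_debited)   # prints $60.00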

Search Engine Demographics

Have you ever wanted to know what each of the major search engines' visitor demographics look like? Perhaps you want to know which engine to focus on when optimizing your pages for higher rankings, or you are buying banner ads and want to know where you will get the most bang for your buck. Here is a list of sites that will tell you what you need to know:

(http://www.cyberatlas.internet.com)
(http://www.internettrafficreport.com)
(http://www.keynote.com)
(http://www.mediametrix.com)
(http://www.netratings.com)
(http://www.nielsen-netratings.com)
(http://www.nsol.com/statistics)
(http://www.nua.ie)
(http://www.searchengineshowdown.com)
(http://www.statmarket.com)
(http://www.traffick.com)

Cloaking and Page Optimization

There are two sides to the cloaking issue. Cloaking, by the way, means using scripts that serve certain pages only to search engines while hiding them from ordinary browsers. On one hand, according to a recent I-Search special issue on cloaking, Marc Krellenstein, Senior VP of Engineering for Northern Light, said, "If we find out 'your' page is cloaked we will ban your URL and sites for life." According to I-Search, Inktomi and AltaVista share similar sentiments towards cloaking. On the other hand, cloaking is commonly used by high-level web designers for legitimate reasons, such as directing users with different browser capabilities to different pages, and by advanced web marketers to improve search engine rankings while hiding the high-ranking HTML from competitors. Although most engines frown upon it and do penalize some pages that use cloaking technology, the great majority of cloaked pages still go unpenalized and work effectively towards their goal. One reason is that it is not easy for the engines to find cloaked pages; another is that cloaking can be entirely legitimate, so it is often let through even when it is found.

So where does that leave you? If you know what you are doing and have a legitimate reason to use cloaking, proceed carefully; it can be highly fruitful. If you do not know what you are doing, it is best not to cloak your pages, and in general most webmasters should stay away from cloaking altogether. If you would like to know more about cloaking, see the following pages:

(http://www.spiderhunter.com/chart/)
(http://fantomaster.com/fafaqcloak1.html)
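
To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of script a cloaking setup relies on: it inspects the visitor's user-agent header and serves one file to known spiders and a different file to ordinary browsers. The spider names and file names are illustrative only and are not taken from any particular product.

import os

# Substrings that commonly appear in search engine spiders' user-agent strings
# (an illustrative, incomplete list).
SPIDER_SIGNATURES = ("googlebot", "slurp", "scooter", "lycos")

def choose_page():
    """Pick the optimized page for spiders and the normal page for human visitors."""
    user_agent = os.environ.get("HTTP_USER_AGENT", "").lower()
    if any(signature in user_agent for signature in SPIDER_SIGNATURES):
        return "optimized.html"   # keyword-rich page meant only for the engines
    return "index.html"           # the page ordinary browsers see

def serve():
    # Minimal CGI-style response: header, blank line, then the chosen page.
    with open(choose_page()) as page:
        body = page.read()
    print("Content-Type: text/html")
    print()
    print(body)

if __name__ == "__main__":
    serve()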

As for page optimization (building web pages designed to rank highly on the search engines and drive the resulting traffic to the main site), many professionals now agree that the best approach is to create frame pages that carry optimized HTML in the <noframes> tag while framing the main site. Used responsibly, this is perfectly acceptable to the engines. If you wish to use software to rapidly create these pages, consider PositionWeaver PRO (www.positionweaver.com).
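
As an illustration of that approach, here is a small Python sketch that writes such a page: a frameset that loads the main site, with keyword-rich copy placed in the noframes section for the engines to read. The URL, title, keywords, and copy are placeholders, and this is not how PositionWeaver PRO itself is implemented.

def build_doorway_page(main_url, title, keywords, copy):
    """Return a frameset page that frames the main site and carries optimized HTML in <noframes>."""
    return f"""<html>
<head>
<title>{title}</title>
<meta name="keywords" content="{keywords}">
</head>
<frameset rows="100%">
  <frame src="{main_url}">
  <noframes>
  <body>
  <h1>{title}</h1>
  <p>{copy}</p>
  <p><a href="{main_url}">Enter the site</a></p>
  </body>
  </noframes>
</frameset>
</html>"""

if __name__ == "__main__":
    page = build_doorway_page(
        main_url="http://www.example.com/",
        title="Example keyword phrase",
        keywords="example, keyword, phrase",
        copy="Keyword-rich descriptive text for the engines to index.",
    )
    with open("doorway.html", "w") as out:
        out.write(page)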

Correct Search Engine Submission

It is widely known by now that some of the automated submission tools do a poor job of submitting a site to the search engines. One major problem is that some engines do not want more than one page from the same domain submitted to them within a 30-minute period, a rule designed to catch spammers, and most engines do not want the same page submitted to them more than once within 24 hours. There is now a tool called Search Engine Commando that is safe and easy to use. It has built-in rules that allow it to submit your pages in the same responsible, effective manner that a professional search engine marketer would, making sure that you are not tagged for spamming and that your submissions are not ignored for failing to observe the rules. To learn more about it, see:

(http://activemarketplace.com/w.cgi?sec-9153)
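
To show what built-in timing rules like those might look like, here is a minimal Python sketch of the scheduling logic described above: at most one page per domain every 30 minutes, and never the same page twice within 24 hours. It is an illustration only and is not code from Search Engine Commando.

import time

DOMAIN_INTERVAL = 30 * 60        # at most one page per domain every 30 minutes
PAGE_INTERVAL = 24 * 60 * 60     # never the same page twice within 24 hours

last_domain_submission = {}      # domain -> time of its most recent submission
last_page_submission = {}        # url -> time of its most recent submission

def may_submit(url, domain, now=None):
    """Return True if submitting this URL now would respect both timing rules."""
    now = time.time() if now is None else now
    if now - last_domain_submission.get(domain, 0.0) < DOMAIN_INTERVAL:
        return False
    if now - last_page_submission.get(url, 0.0) < PAGE_INTERVAL:
        return False
    return True

def record_submission(url, domain, now=None):
    """Remember when a page went out so later checks can enforce the rules."""
    now = time.time() if now is None else now
    last_domain_submission[domain] = now
    last_page_submission[url] = now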

Interesting Tidbits

As noted in the introduction, a new study by Cyveillance shows that the Web has grown to more than 2.1 billion documents and is growing at a rate of 7 million pages per day. For details, see:

(http://www.cyveillance.com/newsroom/pressr/000710.asp)
(http://www.cyveillance.com/resources/Webstudy.pdf)

Google.com is now the largest search engine, with a full-text index of 560 million URLs as of June, plus a further 500 million URLs that it has never actually visited but that can still turn up in a set of search results.

You may have certain pages that you need to have excluded from search engine indexing for one reason or another. While you could use the META robots tag to control this, many engines now ignore that tag. Your best bet is to use a robots.txt file, which is placed in your root folder. All major engines and many smaller ones make use of robots.txt files. To find out more about this versatile file, see:

(http://info.webcrawler.com/mak/projects/robots/norobots-rfc.html)
(http://www.tardis.ed.ac.uk/~sxw/robots/check/)
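
For reference, a robots.txt file is simply a plain text file of User-agent and Disallow lines placed in the site's root folder. A minimal example might look like the following; the paths shown are placeholders for whatever directories or pages you want kept out of the indexes.

User-agent: *
Disallow: /private/
Disallow: /drafts/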

About the Author
David Gikandi (support@positionweaver.com), CEO at SearchPositioning.com (http://www.positionweaver.com).