Posted Monday, September 20, 2004
As the Web grew, search engines like AltaVista, HotBot, and more recently Google have helped countless people find specific information quickly and accurately. Search for a keyword or phrase on your favorite search engine, and within seconds the engine will comb all the pages in its database to find relevant results.
Without search engines you would have to know the exact URL of every site you wanted to visit, or follow links from one site to the next until you found what you were looking for. Search engines let you navigate the web quickly and offer an efficient way to find information.
Search engines work by sending out spiders (also called crawlers or bots) to find as many web pages as possible. The spider documents everything it can about each page it visits, including how many other pages link to it and which pages it links to.
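The link-following step above can be sketched in a few lines. This is a simplified illustration, not any engine's actual crawler: it parses a hardcoded HTML string (standing in for a page fetched over HTTP) and collects the outbound links a spider would queue up to visit next.

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Collects the href of every <a> tag, the way a spider
    discovers new pages to crawl."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hardcoded page stands in for one fetched over the network.
page = ('<html><body>'
        '<a href="/about">About</a> '
        '<a href="http://example.com/">Elsewhere</a>'
        '</body></html>')

spider = LinkSpider()
spider.feed(page)
print(spider.links)  # the links the spider would follow next
```

A real crawler would fetch each discovered URL in turn, respect robots.txt, and avoid revisiting pages it has already seen; those concerns are omitted here for clarity.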
When you make changes to your web pages, or other sites link to yours, crawler-based search engines eventually note these changes. The changes can affect how your pages are listed in the engines, but can, on occasion, take months to come into play.
Next a program called an indexer reads these pages and builds an index based on the words found in each document. All the data processed by the indexer is stored in a database. Page titles, body copy, keyword density, and many other areas are all taken into account. Each search engine has its own unique algorithm to create its index; the exact workings of these proprietary algorithms are a closely guarded secret. Ideally, the algorithm should return only relevant focused results for each query.
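At its core, the structure an indexer builds is an inverted index: a mapping from each word to the documents that contain it. The sketch below shows the idea on a toy corpus (the page names and text are made up for illustration); real engines layer much more on top, such as titles, keyword density, and link data, as described above.

```python
import re
from collections import defaultdict

# Toy corpus: page name -> body text (stand-ins, not real pages).
pages = {
    "site-a.html": "Search engines crawl the web",
    "site-b.html": "Spiders crawl pages and follow links",
    "site-c.html": "The web interface accepts a search query",
}

def build_index(pages):
    """Map each word to the set of pages containing it --
    the core structure an indexer stores in its database."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)
    return index

index = build_index(pages)
print(sorted(index["crawl"]))  # every page containing "crawl"
```

Looking a word up in this structure is a single dictionary access, which is why a query can be answered in seconds no matter how many pages were crawled.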
From the database the search software extracts relevant web pages based on the search query entered by the user.
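A minimal sketch of that extraction step, again on a made-up corpus: the search function splits the query into terms, looks each one up in an inverted index, and ranks pages by how many query terms they contain. This toy scoring stands in for the proprietary relevance algorithms mentioned above.

```python
import re
from collections import defaultdict

# Toy corpus and index (names and text are illustrative only).
pages = {
    "a.html": "search engines index the web",
    "b.html": "spiders crawl the web for search engines",
    "c.html": "cooking recipes and kitchen tips",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        index[word].add(url)

def search(query):
    """Return pages matching the query, ranked by how many
    query terms each contains (a toy relevance score)."""
    terms = re.findall(r"[a-z]+", query.lower())
    scores = defaultdict(int)
    for term in terms:
        for url in index.get(term, ()):
            scores[url] += 1
    return sorted(scores, key=lambda u: (-scores[u], u))

print(search("web search"))  # best-matching pages first
print(search("recipes"))
```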
Search engines consist of the following components:
1. Spiders or crawlers (bots)
2. Indexer
3. Database
4. Search software
5. Web interface