
Being dumped by Google? Learn how to avoid becoming a victim next time around!

By Per Strandberg
Posted Thursday, November 11, 2004

After Google's latest update, nicknamed "Florida", many webmasters discovered that their traffic plummeted.

What happened?

More importantly what can you do about it?

And what will Google do next?

What happened was that Google made an algorithm change on how they rate web pages.

Every time you make a search, Google tries to show the most relevant web pages that match your search term. By being able to give the most relevant results for queries, they have become the most used search engine in the world.

To keep competitors at bay, they have to constantly adjust and improve how they judge web pages.

Because this judgment is done automatically by software, many webmasters have modified their sites to improve their positions in the search results. To do this, they have exploited various shortcuts and loopholes made possible by shortcomings in the software's algorithms.

Periodically, Google makes changes to stop some webmasters from gaining an unfair advantage, plugging one or two of these loopholes at a time.

This is what happened during the Florida update.

With this update, Google introduced new algorithms intended to stop the overuse of certain search engine optimization techniques.

More specifically, they seem to have targeted the search terms found in text links, also called anchor text. Web pages that held good positions in the search results, but had a disproportionate number of inbound links whose anchor text exactly matched the search term the page was optimized for, suddenly disappeared from the listings.

The pages did not disappear altogether, just for the search term they were optimized for.

For Google, a high proportion of inbound links with identical anchor text indicates that the links were put there for one purpose only: to boost rankings.

One suggestion is to vary your anchor text, using a mix of different phrasings, to keep your page in the search results. Nobody can say for certain whether your pages will come back after some time if you do this, but it is likely.
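To make this concrete, here is a minimal Python sketch of the kind of measurement involved, assuming you have already collected the anchor texts of the links pointing at one of your pages. The function name, the example data, and any notion of a "safe" level are made up for illustration; nobody outside Google knows the real thresholds.

    from collections import Counter

    def anchor_text_concentration(anchor_texts):
        # Share of inbound links whose anchor text is the single most
        # common phrase. A value near 1.0 means nearly every link uses
        # the same wording, the pattern the Florida update seems to hit.
        if not anchor_texts:
            return 0.0
        counts = Counter(text.strip().lower() for text in anchor_texts)
        most_common_count = counts.most_common(1)[0][1]
        return most_common_count / len(anchor_texts)

    # Hypothetical example: 4 of 5 inbound links use the exact same phrase.
    links = ["data backup software", "data backup software",
             "data backup software", "data backup software",
             "useful backup tips"]
    print(anchor_text_concentration(links))  # 0.8, heavily concentrated

The lower and more natural-looking that share is, the less likely a filter of this kind is to treat the links as artificial.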

Apparently, the search results generated after the latest update have been of lower quality than before.

What seems to have happened is that a large percentage of web sites traded links with one another, using the same search term in the anchor text that their pages were optimized for.

The victims, more often than not, have been commercial web sites that relied too heavily on search engine optimization techniques.

The search results have instead been taken over by low-quality directory sites and link farms.

Now, what will Google do next?

I don't know, but TRY TO THINK like Google! This is what I would do if I were responsible for this at Google.

First, I think they will modify and adjust the new algorithm they introduced during the latest update: changing the threshold, or not letting the "over-optimized" pages drop out of the search results so easily, but rather penalizing them and pushing them down below the threshold point.

I think Google has a problem! You see, many "over-optimized" sites are of higher quality than those that are not. Simply dropping them out on the assumption that there are enough other pages for the same search term does not always hold true.

There is a thin line between optimization and spamming, and it is not obvious where that boundary should be drawn.

After this, what will Google do next? It is clear to me that the many low-quality directory sites found in Google's search results are a nuisance to Google and to the average web user.

It is in this area, I think, that they will make their next modifications.

Google rates web pages according to relevance. The level of relevance is judged based on the page's content and/or on how popular the page is in Google's view.

To make a page popular, you need links from other pages. These can come from pages on your own site or from other sites.

Ideally, these links should be numerous, come from pages dealing with a similar or identical subject, or come from pages that are themselves popular. Best of all is to have many links from pages on the same subject that are themselves popular.
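This popularity scheme is essentially the published PageRank idea: a page's score is fed by the scores of the pages that link to it, divided among their outgoing links. The sketch below is only the textbook formula in miniature, run on a hand-made toy graph; it is not a description of Google's production system.

    def simple_pagerank(links, damping=0.85, iterations=50):
        # links maps page -> list of pages it links to. Each page starts
        # with an equal score; on every pass a page hands a damped share
        # of its score to each page it links to.
        pages = set(links) | {t for targets in links.values() for t in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                if not targets:
                    continue
                share = rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += damping * share
            rank = new_rank
        return rank

    # A link from a page that is itself well linked ("hub") is worth more
    # than the base score an unlinked page gets.
    graph = {"a": ["hub"], "b": ["hub"], "hub": ["yoursite"], "yoursite": []}
    print(simple_pagerank(graph))

In this toy graph, "yoursite" ends up with the highest score even though only one page links to it, because that one page is itself popular.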

This has led to intense link-exchange activity among webmasters. The primary reason has been to achieve better rankings, not to improve the visitor's experience.

This goes against Google's principles.

To quote Google's webmaster guidelines: "Make pages for users, not for search engines. Avoid tricks intended to improve search engine rankings. Don't participate in link schemes designed to increase your site's ranking or PageRank."

To counter this, I think Google will target several popularity-increasing schemes, such as:

- Low-value directory sites created automatically by robots. These sites contain extracts taken from search engines and directories.

Google can easily spot these sites.

- Link directories attached to web sites. These are built with link-partner extraction software and services that let you upload directory structures directly into your site. In this way you can build up a massive number of link partners and also identify partners with high PageRank values.

Of course, one could argue that such directories add to your visitors' experience, since they make it easy to find similar web sites.

However, this is an argument that Google would most likely disagree with.

Web sites using tactics like this are easily identifiable by Google. The directory pages are composed of outgoing links whose link text is the title, meta description, or other content taken directly from the pages they link to.

Google just has to compare the text in the directory entries with the text on the linked pages and look for matches.
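As a rough illustration of how cheap such a comparison is, here is a Python sketch that checks whether a directory entry's link text is copied verbatim from the target page's title or meta description. The simple regular expressions and the example page are assumptions made for the sake of the sketch, not a description of how Google actually parses pages.

    import re

    def looks_copied(entry_text, target_html):
        # Does the directory entry's text match the target page's <title>
        # or meta description word for word? Real detection would be far
        # more sophisticated; this only shows the basic comparison.
        def normalize(s):
            return re.sub(r"\s+", " ", s).strip().lower()

        title = re.search(r"<title>(.*?)</title>", target_html, re.I | re.S)
        meta = re.search(
            r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
            target_html, re.I | re.S)
        candidates = [m.group(1) for m in (title, meta) if m]
        return any(normalize(entry_text) == normalize(c) for c in candidates)

    page = ("<html><head><title>Data Backup and Storage Tips</title>"
            "</head><body>...</body></html>")
    print(looks_copied("Data Backup and Storage Tips", page))  # True, suspicious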

Using products or services for this purpose risks getting you banned, or at least penalized, by Google.

Will this happen? I think so!

When?

I don't know! It could be any time soon: next month, next year! Nobody knows; only Google can tell!

I think Google will also look into reciprocal linking as a whole.

Maybe they will start to identify the outgoing links on a page that point to other web sites, and then check which of those domains link back.

What they like to see is spontaneous linking to your site from site owners who regard you as a valuable resource, without you linking back. I believe they will somewhat limit the impact of reciprocal linking!
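To see how simple such a reciprocal-link check could be, here is a toy Python sketch, assuming you already have a map of which domains each domain links out to. The domain names and the idea of a single "reciprocal ratio" are invented for illustration only, not taken from Google.

    def reciprocal_ratio(domain, outbound_links):
        # outbound_links maps a domain to the set of domains it links to.
        # Returns the fraction of this domain's link partners that link
        # straight back; a high value hints at traded rather than
        # spontaneous links. A toy heuristic, not Google's actual check.
        partners = outbound_links.get(domain, set())
        if not partners:
            return 0.0
        reciprocal = {p for p in partners
                      if domain in outbound_links.get(p, set())}
        return len(reciprocal) / len(partners)

    # Hypothetical graph: two of three link partners link straight back.
    graph = {
        "example-shop.com": {"links-r-us.net", "blue-widgets.org", "newsblog.com"},
        "links-r-us.net": {"example-shop.com"},
        "blue-widgets.org": {"example-shop.com"},
        "newsblog.com": set(),
    }
    print(reciprocal_ratio("example-shop.com", graph))  # roughly 0.67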

What can you do to improve your web traffic from Google without violating its guidelines?

Build web sites that give value to your visitors. Make your site popular, so that others want to link to it. Build niche, information-rich sites, either as mini-sites or as larger information sites. Google gives larger sites within a niche a higher popularity rating than smaller ones.

If you do this, your web site will not be affected the next time Google makes a change. Unless, of course, a competitor drops out of Google, in which case your traffic will get a boost.

About the Author
Per Strandberg is a web marketer and software developer. He currently operates a web site with backup products and data security information at http://www.data-backup-and-storage.com.
