The Duplicate Content Controversy

Computers & Technology · Search Engine Optimization

  • Author Monica Lorica
  • Published March 5, 2007
  • Word count 451

Web marketing experts often warn us about duplicate content. They stress that duplicate content will raise a red flag with search engines like Google. Though it is true that duplicate content is one of the many things search engines abhor, it is equally true that there are cases where duplicate content is acceptable.

When does duplicate content become acceptable? When is it legitimate to have duplicate content? Questions like these have caused plenty of confusion. Here are some of the instances where duplicate content can be considered acceptable.

When Is Duplicate Content Acceptable?

• The same product listings on two different sites. If you publish a product listing on two sites that you own, search engines will generally tolerate it.

• Content reprinted or copied from another site. This is tolerated as long as the reprinting site has the right to republish it and credits the original author.

• Some webmasters and site owners create two pages for the same item: the standard site page and a printer-friendly version. The two pages carry the same content, yet this is acceptable.

• An odd duplicated page that appears on your site for reasons you cannot explain. This happens to plenty of sites and is an honest mistake.

• Duplicate content produced by unintentional errors on the site. A quick self-audit, like the sketch after this list, can catch these accidental duplicates before the search engines do.
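
The last two situations are the ones you can most easily check for yourself. As a rough illustration, and not something the article itself prescribes, the short Python sketch below fetches a few pages from your own site, strips them down to plain text, and flags any two that are word-for-word identical. The URLs are placeholders you would replace with your own pages.

    import hashlib
    import re
    from urllib.request import urlopen

    # Placeholder URLs for the sake of illustration; use pages you actually own.
    URLS = [
        "https://www.example.com/page-one.html",
        "https://www.example.com/page-two.html",
        "https://www.example.com/print/page-one.html",
    ]

    def page_text(url):
        """Download a page and reduce it to lowercase words only."""
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)   # drop HTML tags
        return " ".join(text.lower().split())  # collapse whitespace

    seen = {}
    for url in URLS:
        fingerprint = hashlib.md5(page_text(url).encode("utf-8")).hexdigest()
        if fingerprint in seen:
            print(f"Duplicate content: {url} matches {seen[fingerprint]}")
        else:
            seen[fingerprint] = url

If two URLs print as duplicates, you know an accidental copy exists and can remove or redirect it before it causes trouble.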

Reasons Why Search Engines Penalize Duplicate Content

Search engines like Google try to avoid sites with duplicate content; in fact, they do not just avoid them, they try to weed them out of their results. This is why duplicate content is something to worry about. Why do search engines detest sites with duplicate content?

• To keep replicated sites off the web. Online users gain nothing from browsing different sites that carry exactly the same content; replicated sites share everything, including titles and even code.

• To discourage what is known as scraping, the practice of copying another site's content wholesale, which raises copyright issues.

• To discourage sites built from recycled PLR (private label rights) articles, which simply replicate content that already exists elsewhere.

Knowing that duplicate content can get your site penalized by the search engines, you must strive to fill your site with unique data and content. This is important not only to avoid penalties but also to build your site's credibility and image.
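
One rough way to act on that advice is to compare a new draft against an existing page before publishing it, so you can see how much of the text overlaps. The Python sketch below is only an illustration of that idea, not a method from the article; the file names and the 50% threshold are placeholders.

    # Placeholder file names; compare a new draft against a page already online.
    def shingles(text, size=5):
        """All overlapping runs of `size` consecutive words in the text."""
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

    def jaccard(a, b):
        """Fraction of word runs the two texts share, from 0.0 to 1.0."""
        sa, sb = shingles(a), shingles(b)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    with open("new-article.txt", encoding="utf-8") as f:
        draft = f.read()
    with open("existing-page.txt", encoding="utf-8") as f:
        existing = f.read()

    overlap = jaccard(draft, existing)
    print(f"Text overlap: {overlap:.0%}")
    if overlap > 0.5:  # arbitrary threshold for this illustration
        print("The draft is close enough to read as duplicate content.")

A high overlap score is a sign to rewrite the draft rather than publish a near copy.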

This article was written by nPresence, an online web marketing agency that specializes in Search Engine Optimization, Pay Per Click advertising, Content Management Systems, Web Design, Conversion Tracking and Analysis. For all your web marketing needs, please see Internet Marketing Experts.

Article source: https://articlebiz.com


Article comments

Wong Seoul · 17 years ago
I am not sure if search engines also penalize duplicate pages under the same web site. Thanks
