Blocking URL calls with parameters via robots.txt


Today I noticed a small indexing problem on a customer website. The site runs on a shop system that supports "speaking URLs" and is therefore reasonably SEO-friendly, but unfortunately it also offers sorting of product lists and appends a pile of parameters to the otherwise clean URLs, which of course get indexed far too often. The result is duplicate content in abundance.

To make the webmaster's work as easy as possible, I looked for a solution in the robots.txt file that prevents the indexing of these sorting pages. In my case the whole thing is easy to solve, since the files normally end in ".html" and only the sorting pages contain a ".html?something=somehow". This addition to the robots.txt is now doing its job on the customer's site:

Allow: /*.html$
Disallow: /*.html?*
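
For context, here is how these two lines might sit in a complete robots.txt. This is only a minimal sketch and assumes the rules should apply to all crawlers (User-agent: *); the "$" anchors the pattern to the end of the URL, while the "?" in the Disallow line catches every parameterized variant:

User-agent: *
# Explicitly allow clean product pages that end in .html
Allow: /*.html$
# Block any .html URL that carries query parameters (sorting, filtering, ...)
Disallow: /*.html?*

Since Google prefers the more specific (longer) matching rule, a clean URL like /shoes.html stays crawlable, while /shoes.html?sort=price is blocked.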

If your files end in .php, adjust the entries accordingly (see the variant below). And give Google a few weeks to drop the now-blocked pages from the index. Aunt G is not the fastest ...
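
The adjusted entries for a .php shop could look like this; again just a sketch, assuming that only the parameterized URLs contain a "?":

User-agent: *
Allow: /*.php$
Disallow: /*.php?*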
