Blocking URL calls with parameters via robots.txt

Today I noticed a small indexing problem on a customer's website. The site runs on a shop system that supports "speaking URLs" and is therefore fairly SEO-friendly, but unfortunately it also offers sorting of product lists and appends a lot of parameters to the otherwise clean URLs, and these parameterized variants get indexed far too often. The result is [duplicate content->duplicate-content] galore.

To make the webmaster's work as easy as possible, I looked for a solution using the [robots.txt file->robots-txt] that prevents the indexing of such sorting pages. In my case the whole thing is easy to solve, since the regular pages end with ".html" and only the sorting variants contain ".html?something=somehow". This addition to the robots.txt is now doing its job on the customer's site:

Allow: /*.html$
Disallow: /*.html?*
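
For context, here is a sketch of how these two rules could sit in a complete robots.txt. The User-agent line is my assumption (the customer's actual file may target specific crawlers); also note that the wildcard * and the end-of-URL anchor $ are extensions honored by Google, Bing and other major crawlers, not part of the original robots.txt standard:

# Applies to all crawlers (that understand wildcard rules)
User-agent: *
# URLs ending exactly in .html may be crawled ...
Allow: /*.html$
# ... but any .html URL with an attached query string may not
Disallow: /*.html?*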

If your files end with .php, you have to adjust the entries accordingly. And give Google a few weeks until it has dropped the now-blocked pages from its index. Aunt G is not the fastest ...
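
As a quick sketch, the adjusted entries for a .php-based shop might look like this (same logic, only the extension changes):

# Crawl plain .php pages, but not their parameterized variants
Allow: /*.php$
Disallow: /*.php?*

The trailing * in the Disallow line is optional, by the way: crawlers treat these rules as prefix matches, so a plain /*.php? would block the same URLs.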

My tips & tricks about technology & Apple

Did you like the article and did the instructions on the blog help you? Then I would be happy if you the blog via a Steady Membership would support.
