Saturday, March 27, 2021

I want to apply a disallow directive on a robots.txt file for a particular set of URLs

I am going over a client's site and ran into a very particular case I've never encountered.

Instead of having a category page, the site has a keyword embedded in all of its URLs.

For example, let's say this word is "destinations": instead of a "Destinations" category page, the site has 50+ posts with "destinations" embedded in the URL.

Like this...

https://myclientssite.com/destinations-spain
https://myclientssite.com/destinations-france
https://myclientssite.com/destinations-portugal
https://myclientssite.com/destinations-italy

Also, after the country name, each URL contains other codes that my client uses internally.

So, I would like to know how I can apply a Disallow directive to the URLs that contain "/destinations-".

It would be like this...

Disallow: /destinations-*
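
For reference, here is a minimal sketch of the full file I have in mind, assuming the rule should apply to all crawlers (the User-agent line is my assumption). Since Disallow values are prefix matches, the trailing * is not strictly needed, and the prefix also covers whatever internal codes follow the country name; the * wildcard is an extension that crawlers like Google and Bing accept but the original robots.txt standard does not define:

User-agent: *
Disallow: /destinations-

One caveat I am aware of: robots.txt controls crawling, not indexing, so a blocked URL can still appear in search results if other pages link to it.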

Thoughts?
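
Edit: to sanity-check the rule locally, I parsed it with Python's standard-library urllib.robotparser (a quick sketch; the blocked URL is one of my examples above, and the allowed path /contact is made up). Note that this parser does plain prefix matching and does not understand the * wildcard, which is another reason to leave it off:

import urllib.robotparser

# Build a parser from the proposed rules instead of fetching a live robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /destinations-",  # prefix match: covers -spain, -france, etc.
])

# Blocked: the path starts with the disallowed prefix.
print(rp.can_fetch("*", "https://myclientssite.com/destinations-spain"))  # False

# Allowed: a path outside the prefix (hypothetical page).
print(rp.can_fetch("*", "https://myclientssite.com/contact"))  # True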

https://stackoverflow.com/questions/66837799/i-want-to-apply-a-disallow-directive-on-a-robots-txt-file-for-a-particular-set-o
