Google's John Mueller said on Reddit that disallowing URLs with UTM parameters in them will not help improve crawling or ranking in Google Search. He added that a site should try to keep its internal URLs clean and consistent, but over time, canonical tags should take care of external links that carry UTM parameters on them.
John wrote, "I doubt you'd see any visible effects in crawling or ranking from this. (And if there's no value from doing it, why do it?)" when he was asked about disallowing such URLs.
He added:
Generally speaking, I'd still try to improve the site so that irrelevant URLs don't need to be crawled (internal linking, rel-canonical, being consistent with URLs in feeds). I think that makes sense in terms of having things cleaner & easier to track - it's good site-hygiene. If you have random parameter URLs from external links, those would get cleaned up with rel-canonical over time anyway, I wouldn't block those with robots.txt. If you're generating random parameter URLs yourself, say within the internal linking, or from feeds submissions, that's something I'd clean up at the source, rather than blocking it with robots.txt.
tldr: clean site? yes. block random crufty URLs from outside? no.
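To illustrate the "clean it up at the source" approach John describes, here is a minimal sketch of my own (not something from the Reddit thread): it strips utm_* tracking parameters from a URL so that internal links and feed URLs always point at the clean form of a page. The strip_utm helper and the example.com URL are hypothetical.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical helper: remove utm_* tracking parameters before a URL is
# used in internal links or feeds, so only the clean URL gets exposed.
def strip_utm(url: str) -> str:
    parts = urlsplit(url)
    # Keep every query parameter except the utm_* tracking ones.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(strip_utm("https://www.example.com/page?utm_source=news&utm_medium=email&id=42"))
# -> https://www.example.com/page?id=42
```

For UTM-tagged URLs coming from external links, the rel-canonical tag on the page handles the cleanup over time, per John's advice, so no robots.txt blocking is needed.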
This is all very similar to previous advice from John Mueller that I quoted in these stories:
Forum discussion at Reddit.