No, I will not be running it
It does absolutely nothing to slow crawlers down. They aren't going to wait for a page to finish loading before moving on to the next one; crawlers are super optimized to constantly grab as much bandwidth as possible in parallel. There's already so much AI slop on the web that it's not going to contribute meaningfully to model collapse, and all you're doing by running it is wasting even more resources. Giving the LLM crawlers more content to slurp up just gives them more reasons to waste even more resources of their own, and it only continues the death spiral of making the Internet an even worse place.
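To make the "it doesn't slow them down" point concrete, here's a rough sketch of the fetch loop pattern most crawlers follow. Everything in it is illustrative (the URL list, worker count, and timeout are made up, not taken from any real crawler), but the shape is what matters: lots of parallel workers, a hard per-request timeout, and zero patience for a slow page.

```python
# Illustrative sketch only: a generic parallel fetcher, not any specific crawler.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Placeholder URL frontier; a real crawler pulls these from a queue of millions.
URLS = ["https://example.com/page1", "https://example.com/page2"]

def fetch(url: str):
    try:
        # Hard timeout: a connection that stalls gets abandoned,
        # and the worker immediately moves on to the next URL.
        with urlopen(url, timeout=5) as resp:
            return resp.read()
    except Exception:
        return None  # give up silently; there are always more pages to grab

with ThreadPoolExecutor(max_workers=64) as pool:
    # All fetches run in parallel, so one slow or tarpitted host
    # never blocks the rest of the crawl.
    for url, body in zip(URLS, pool.map(fetch, URLS)):
        if body is not None:
            print(url, len(body), "bytes")
```

A tarpit only ties up one of those workers, briefly, until the timeout fires; the other sixty-three keep hammering away the whole time.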
This isn’t like interfering with scammer call centers through scambaiting or the like. Computers have no problem with having their time wasted.
And meanwhile it does nothing to actually solve the problem.