They’re already ignoring robots.txt, so I’m not sure why anyone would think they won’t just ignore this too. All they have to do is get a new IP and change their user agent.
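To illustrate how trivial that is: the User-Agent header is just a client-supplied string, so a scraper can send whatever it likes. A minimal sketch with Python's stdlib (the URL and UA string are placeholders, not anything a real scraper necessarily uses):

```python
import urllib.request

# The User-Agent is whatever the client claims it is; nothing verifies it.
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; TotallyNormalBrowser/1.0)"},
)
# with urllib.request.urlopen(req) as resp:
#     html = resp.read()
```

Pair that with a fresh IP (proxy pool, cloud egress, residential proxies) and per-UA or per-IP blocking stops meaning much.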
Cloudflare is protecting a lot of sites from scraping with their PoW captchas. They could allow through people who pay.
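For anyone unfamiliar with the idea: a proof-of-work challenge forces the client to burn CPU before each request is accepted, which is cheap for one human but expensive at scraper scale. This is a generic hash-based sketch of the concept, not Cloudflare's actual scheme:

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty: int) -> int:
    """Find a nonce so sha256(challenge + nonce) starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify_pow(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server-side check: one hash, regardless of how much work the client did."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# Client grinds (~16^4 hashes at difficulty 4); server verifies in one hash.
nonce = solve_pow("example-challenge", 4)
assert verify_pow("example-challenge", nonce, 4)
```

The asymmetry (expensive to solve, one hash to verify) is the whole point: a paid tier could simply hand out tokens that skip the challenge.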
I have an idea. Why don’t I put a bunch of my website stuff in one place, say a pdf, and you screw heads just buy that? We’ll call it a “book”
As someone who uses Invidious daily, I’ve always been of the belief that if you don’t want something scraped, maybe don’t upload it to a public web page/server.
There are probably not many people here who understand the connection between Invidious and scraping.
Imagine a company that sells a lot of products online. Now imagine a scraping bot hitting at peak sales hours, requesting every product list and product page separately. Some genuine users will have a worse buying experience because of that.
Yeah, there are way easier ways to combat that without trying to prevent scraping.
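One common example of such an easier mitigation is per-client rate limiting rather than blanket anti-scraping: let everyone through, but cap the request rate so a bot can’t degrade peak-hour capacity. A minimal token-bucket sketch (the rate and burst numbers are arbitrary for illustration):

```python
import time

class TokenBucket:
    """Per-client token bucket: refills at `rate` tokens/sec, holds at most `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, then spend one token if available.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# 5 req/sec sustained, bursts of up to 10: a human shopper never notices,
# a bot enumerating every product page gets throttled almost immediately.
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(11)]
```

Keyed by IP, session, or API token, this protects checkout capacity during peak hours without trying to decide who is “really” a browser.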
Maybe don’t ship 20 units to the same address.