> Why are corporations allowed to do with impunity what could land even a teenager years in prison? Is there no rule of law anymore?
Those laws are intended to protect corporations. If corporations are the ones doing the scraping, it makes no sense for the same laws to be turned against them; see the dual-state model [1].

[1] https://en.wikipedia.org/wiki/Dual_state_(model)
So, I knew Aaron, and I definitely would not presume to predict what he would have thought, but I'd point out there is a sizeable state space where he should never have been prosecuted, and scraping by others, including large commercial companies, should also not be prosecutable on the same grounds.
I repeat what Aaron’s friends and lawyers said at the time: we were going to fight that case, and we were going to win.
It's a bit more like a physical business with a "public welcome" policy like a coffee shop going viral and then having tens of thousands of people walking in and taking pictures but not buying coffee. It's disruptive, but not illegal.
Acme.com is welcome to require authentication for all pages but their home page, which would quickly cause the traffic to drop. They don't want to do this; like the coffee shop, they want to be open to the public, and for good reasons.
Sometimes the usage profile changes dramatically in a short time. 15 years ago, Netflix created the video-streaming market, and shared bandwidth capacity that had previously seemed excessive suddenly wasn't enough. 15 years before that, Google did the same thing when they built web search and started driving tremendous traffic to text-based websites that had previously spread through word of mouth.
Turns out the microtransaction people probably had the right idea.
I've had to deploy a combination of Cloudflare's bot protection and Anubis on over 200 domains across 8 different hosting environments in the last 2 months. I have small business clients that couldn't access their sales and support platforms because their websites that normally see tens of thousands of unique sessions per day are suddenly seeing over a million in an hour.
Anthropic and OpenAI were responsible for over 70% of that traffic.
> Is there no rule of law anymore?

Have you not been paying attention to the news for the past few years?
No, there isn't. If there were, Trump would be in prison, not the Oval Office. And he and the Republican Party have deliberately fostered this environment of corruption and rule-by-wealth so that they can gain more power and even more wealth.
And now they are also backing the AI zealots, and techbros more generally, to ensure that they can do whatever the hell they want, damn the consequences to the rest of the world.
Because the law deals with intent. The intent of a 12-year-old skiddie with a DDoS box is to harm someone else's internet; the intent of big scrapers is to collect data. If you want to make the latter illegal, then vote for that instead of loading it with the normative baggage of the former.
It's the same problem that made Occupy Wall Street fall apart: a bunch of losers who don't understand the system screeching about the system. Because they don't understand it, they can't offer any meaningful dialogue about how to fix it beyond the screeching.
I suspect part of the issue is that people are still using things like acme.com and demo.com as example domains in their documentation and tests instead of relying on example.com, which is reserved exactly for this purpose [0]

[0]: https://www.iana.org/domains/reserved
Bot traffic is crazy even for smaller sites, but still manageable. I was getting 2,000 visitors a day on my infrequently updated website, but after I blocked all the bots via Cloudflare it went back to the normal double digit visitor count.
> I closed port 443
> Now closing https service is obviously just a temporary fix
Probably the best starting point would be to edit the robots.txt file and disallow LLM bots there.
Currently the file allows all bots: http://acme.com/robots.txt
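A sketch of such a robots.txt (GPTBot and ClaudeBot are the crawler user-agents that OpenAI and Anthropic publish; honoring robots.txt is voluntary, so this only stops the well-behaved ones):

    User-agent: GPTBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: *
    Allow: /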
The only real solution is to put Anubis in front. For me, just putting Cloudflare in front suffices: by default it cuts the bot traffic down to only a few thousand requests per hour, which my home server can handle quite well on its own.
There are plenty of local LLMs out there run by humans that play nice. It's not the LLMs that are the problem. It's the corporations. That's the commonality. Human people aren't doing this. These corporate legal persons are a much more dangerous and capable form of non-human intelligence with non-human motives than LLMs (which are not doing the scraping or even calling the tools which are sending the HTTP requests). And they have lobbied their way to legal immunity to most of their crimes.
> The LLM companies are not picking on me in particular, they are pounding every site on the net.
Why is this not a criminal offense? They are hurting businesses for profit (or for a higher valuation, as they probably have no profit at all).
Why are corporations allowed to do with impunity what could land even a teenager years in prison? Is there no rule of law anymore?
The five-year and ten-year penalties kick in only when the government can show the offense caused at least $5,000 in losses across all victims during a one-year period. https://legalclarity.org/what-are-the-punishments-for-a-ddos...
waiting on the govt to do something is a path of failure
How do you think search engines work?
> Nearly all of them were for non-existent pages.
Do any webservers have a feature where they keep a list in memory of files/paths that exist?
People might not know about ipset: don't add one iptables rule per IP.
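For reference, a minimal sketch (the set name, timeout, and example address are placeholders; these commands need root):

    # create a hash-based set; entries expire after 24h
    ipset create scrapers hash:ip timeout 86400
    ipset add scrapers 203.0.113.7
    # a single iptables rule then matches the whole set
    iptables -I INPUT -m set --match-set scrapers src -j DROP

Lookups against the set are roughly constant-time, so it stays cheap even with tens of thousands of entries, whereas a long chain of individual iptables rules is scanned linearly.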
Nginx can reject easily based on country.
geoip2 /etc/GeoLite2-Country.mmdb {
    $geoip2_metadata_country_build metadata build_epoch;
    $geoip2_data_country_code default=Unknown source=$remote_addr country iso_code;
}
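The server block below tests $allowed_country, which the geoip2 block alone doesn't define; a map along these lines would typically bridge the two (the country code here is a placeholder, not a recommendation):

    map $geoip2_data_country_code $allowed_country {
        default yes;
        XX no;   # ISO 3166-1 code(s) to block
    }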
server {
    ...
    if ($allowed_country = no) {
        return 444;
    }
}

> Someone really ought to do something about it.
What is bro proposing here?