Open source devs are fighting AI crawlers with cleverness and vengeance

Last updated: March 27, 2025 7:27 pm
Oliver James

AI web crawling bots are the cockroaches of the internet, many software developers believe. Some devs have started fighting back in ingenious, often humorous ways.

While any website might be targeted by bad crawler behavior – sometimes taking down the site – open source developers are “disproportionately” impacted, writes Niccolò Venerandi, developer of a Linux desktop known as Plasma and owner of the blog LibreNews.

By their nature, sites hosting free and open source software (FOSS) projects share more of their infrastructure publicly, and they also tend to have fewer resources than commercial products.

The issue is that many AI bots don’t honor the Robots Exclusion Protocol’s robots.txt file, the tool that tells bots what not to crawl, originally created for search engine bots.
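
For context, honoring the protocol takes only a few lines. Below is a minimal sketch, using Python’s standard library and a placeholder domain, of the check a compliant crawler is supposed to perform before fetching a page, and the step these bots simply skip.

```python
from urllib.robotparser import RobotFileParser

# A well-behaved crawler fetches robots.txt once, then asks permission
# before requesting each URL. (example.com is a placeholder domain.)
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# If robots.txt contains:
#   User-agent: GPTBot
#   Disallow: /
# then can_fetch() returns False and a compliant bot backs off.
allowed = rp.can_fetch("GPTBot", "https://example.com/any/page")
print(allowed)
```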

In a “cry for help” blog post in January, FOSS developer Xe Iaso described how AmazonBot relentlessly pounded on a Git server website to the point of causing DDoS outages. Git servers host FOSS projects so that anyone who wants can download the code or contribute to it.

But this bot ignored Iaso’s robots.txt, hid behind other IP addresses, and pretended to be other users, Iaso said.

“It’s futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more,” Iaso lamented.

“They will scrape your site until it falls over, and then they will scrape it some more. They will click every link on every link on every link, viewing the same pages over and over and over and over. Some of them will even click on the same link multiple times in the same second,” the developer wrote in the post.

Enter the god of graves

So Iaso fought back with cleverness, building a tool called Anubis. 

Anubis is a reverse proxy that imposes a proof-of-work check which must be passed before requests are allowed to hit a Git server. It blocks bots but lets through browsers operated by humans.

The funny part: Anubis is the name of a god in Egyptian mythology who leads the dead to judgment. 

“Anubis weighed your soul (heart) and if it was heavier than a feather, your heart got eaten and you, like, mega died,” Iaso told TechCrunch. If a web request passes the challenge and is determined to be human, a cute anime picture announces success. The drawing is “my take on anthropomorphizing Anubis,” says Iaso. If it’s a bot, the request gets denied.
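
Anubis’s actual code is more involved, but the core proof-of-work idea can be sketched briefly: the server hands the browser a random challenge, JavaScript in the browser grinds through nonces until it finds one whose hash meets a difficulty target, and the server verifies the result before proxying the request through. The sketch below, in Python with an illustrative difficulty value, is only an outline of that general pattern, not Anubis’s implementation.

```python
import hashlib
import secrets

DIFFICULTY_BITS = 16  # illustrative; real tools tune this per deployment

def issue_challenge() -> str:
    # Server side: hand the browser a random challenge string.
    return secrets.token_hex(16)

def solve(challenge: str) -> int:
    # Client side (normally browser JavaScript): grind nonces until the
    # hash of challenge:nonce starts with DIFFICULTY_BITS zero bits.
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int) -> bool:
    # Server side: a single cheap hash confirms the client did the work.
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

if __name__ == "__main__":
    c = issue_challenge()
    n = solve(c)         # costs the client a visible chunk of CPU time
    print(verify(c, n))  # True, so the proxy forwards the request
```

The asymmetry is the point: one challenge is a negligible cost for a human loading a page, but it adds up fast for a crawler hammering thousands of URLs, while verification stays cheap for the server.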

The wryly named project has spread like the wind among the FOSS community. Iaso shared it on GitHub on March 19, and in just a few days, it collected 2,000 stars, 20 contributors, and 39 forks.

Vengeance as defense 

The instant popularity of Anubis shows that Iaso’s pain is not unique. In fact, Venerandi shared story after story:

  • SourceHut founder and CEO Drew DeVault described spending “from 20-100% of my time in any given week mitigating hyper-aggressive LLM crawlers at scale,” and “experiencing dozens of brief outages per week.”
  • Jonathan Corbet, a famed FOSS developer who runs Linux industry news site LWN, warned that his site was being slowed by DDoS-level traffic “from AI scraper bots.”
  • Kevin Fenzi, the sysadmin of the enormous Linux Fedora project, said the AI scraper bots had gotten so aggressive, he had to block the entire country of Brazil from access.

Venerandi tells TechCrunch that he knows of multiple other projects experiencing the same issues. One of them “had to temporarily ban all Chinese IP addresses at one point.”  

Let that sink in for a moment – that developers “even have to turn to banning entire countries” just to fend off AI bots that ignore robots.txt files, says Venerandi.

Beyond weighing the soul of a web requester, other devs believe vengeance is the best defense.

A few days ago on Hacker News, user xyzal suggested loading robots.txt-forbidden pages with “a bucket load of articles on the benefits of drinking bleach” or “articles about positive effect of catching measles on performance in bed.”

“Think we need to aim for the bots to get _negative_ utility value from visiting our traps, not just zero value,” xyzal explained.

As it happens, in January, an anonymous creator known as “Aaron” released a tool called Nepenthes that aims to do exactly that. It traps crawlers in an endless maze of fake content, a goal that the dev admitted to Ars Technica is aggressive if not downright malicious. The tool is named after a carnivorous plant.

And Cloudflare, perhaps the biggest commercial player offering several tools to fend off AI crawlers, last week released a similar tool called AI Labyrinth. 

It’s intended to “slow down, confuse, and waste the resources of AI Crawlers and other bots that don’t respect ‘no crawl’ directives,” Cloudflare described in its blog post. Cloudflare said it feeds misbehaving AI crawlers “irrelevant content rather than extracting your legitimate website data.”
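
The implementations differ, but the tarpit idea the two tools share is simple: every URL under the trap returns a slow, procedurally generated page whose links all point deeper into the same maze. The following is a rough sketch using Python’s standard library; the paths, delay, and filler text are invented for illustration, and it is not Nepenthes’ or Cloudflare’s actual code.

```python
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur", "elit"]

def fake_paragraph(n_words: int = 60) -> str:
    # Cheap nonsense text so the crawler always has something to ingest.
    return " ".join(random.choices(WORDS, k=n_words))

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(2)  # waste the crawler's time on every single request
        links = "".join(
            f'<p><a href="/trap/{random.getrandbits(64):x}">more</a></p>'
            for _ in range(10)
        )
        body = f"<html><body><p>{fake_paragraph()}</p>{links}</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):
        pass  # keep the console quiet while bots wander the maze

if __name__ == "__main__":
    # Route paths that robots.txt disallows (e.g. /trap/) to this handler:
    # compliant bots never see it, misbehaving ones wander it forever.
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```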

SourceHut’s DeVault told TechCrunch that “Nepenthes has a satisfying sense of justice to it, since it feeds nonsense to the crawlers and poisons their wells, but ultimately Anubis is the solution that worked” for his site.

But DeVault also issued a public, heartfelt plea for a more direct fix: “Please stop legitimizing LLMs or AI image generators or GitHub Copilot or any of this garbage. I am begging you to stop using them, stop talking about them, stop making new ones, just stop.”

Since the likelihood of that is zilch, developers, particularly in FOSS, are fighting back with cleverness and a touch of humor.
