*KICKS DOOR DOWN*

Hey everyone! Hate AI web crawlers? Have some spare CPU cycles you want to use to punish them?

Meet Nepenthes!

https://zadzmo.org/code/nepenthes

This little guy runs nicely on low power hardware, and generates an infinite maze of what appear to be static files with no exit links. Web crawlers will merrily hop right in and just ... get stuck in there! Optional randomized delay to waste their time and conserve your CPU, optional Markov babble to poison large language models.
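
For a rough idea of how a tarpit like that can work, here's a minimal sketch (purely illustrative, not Nepenthes' actual code): every URL resolves to a page whose links are derived deterministically from the path, so the maze is infinite but consistent, the links only ever lead deeper, and the handler can sleep a random amount before answering. The handler name, word list, and `babble` helper are all made up for the example.

```python
# Illustrative sketch of an infinite-maze tarpit (not the real Nepenthes code).
import hashlib
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["alpha", "beta", "gamma", "delta", "epsilon", "zeta"]  # stand-in for Markov babble

def babble(seed: int, n: int = 80) -> str:
    """Deterministic nonsense text for a given page seed."""
    rng = random.Random(seed)
    return " ".join(rng.choice(WORDS) for _ in range(n))

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Seed everything from the path, so every "page" always exists and
        # always looks the same, like a static file.
        seed = int.from_bytes(hashlib.sha256(self.path.encode()).digest()[:8], "big")
        rng = random.Random(seed)
        # Links only point deeper into the maze -- never back out to a real site.
        links = "".join(
            f'<a href="{self.path.rstrip("/")}/{rng.getrandbits(32):08x}.html">more</a> '
            for _ in range(10)
        )
        body = f"<html><body><p>{babble(seed)}</p>{links}</body></html>".encode()
        time.sleep(rng.uniform(1, 5))  # optional randomized delay to waste crawler time
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```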


@aaron lol, nice, one time I tried to do this on our git server by returning 4mb of text at ~10 bytes per second... It was all fun and games until it started mistaking human users for bots 😔

Sounds like you've solved that by letting the bots out themselves: they trap themselves by navigating too deeply into a nonsense area that's separate from the actual site...?

@forestjohnson That's another classic technique. Send the headers fast and slow-walk the body; most HTTP client implementations won't time out.
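
Purely as an illustration of that trick (raw sockets and made-up names, not any particular project's code): send a complete, normal-looking response header straight away so the client's connect/header timeouts never fire, then dole the body out a handful of bytes per second.

```python
# Sketch of "headers fast, body slow" (illustrative only).
import socket
import time

def slow_serve(conn: socket.socket, body: bytes, bytes_per_sec: int = 10) -> None:
    headers = (
        "HTTP/1.1 200 OK\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Content-Type: text/plain\r\n"
        "Connection: close\r\n\r\n"
    ).encode()
    conn.sendall(headers)                        # headers go out immediately
    for i in range(0, len(body), bytes_per_sec):
        conn.sendall(body[i:i + bytes_per_sec])  # then drip the body slowly
        time.sleep(1)
    conn.close()

if __name__ == "__main__":
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 8081))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        conn.recv(65536)                         # read (and ignore) the request
        slow_serve(conn, b"x" * 4096)
```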

Nepenthes does this to a small degree as well. It computes how many bytes to send to have at least one packet per second while still spending the requested delay time to send the whole body.
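
If I'm reading that right, the arithmetic is just "body length divided by the requested delay, but never less than one byte per one-second tick". A tiny sketch of that calculation (my guess at it, not the actual Nepenthes source):

```python
def chunk_size(body_len: int, delay_seconds: int) -> int:
    """Bytes to send each second so the whole body takes ~delay_seconds,
    while still putting at least one packet on the wire every second."""
    return max(1, body_len // max(1, delay_seconds))

# e.g. a 4096-byte page stretched over a 60 s delay -> 68 bytes per second
print(chunk_size(4096, 60))  # 68
```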
