The wiki is being raided.
The monks who made the Book of Kells
were keeping the living record of a community
on an island with no defenses.
So are we.
Vikings didn't burn libraries out of malice.
The monks had simply left the door open.
Who destroys a gift to the future?
Monastery. No walls. Finely crafted objects of worship.
If they aren't protecting it,
they must not know what they have.
High value. No defense. Immediate gain.
Now replace longships with HTTP requests.
Replace silver with tokens.
Replace raiding with $1000/month of continuous, distributed enumeration.
᚛ ᛫ ᛟ ᛫ ᚜
The door was left open on purpose.
Stewart Brand, 1984: "Information Wants To Be Free. Information also wants to be expensive. …That tension will not go away."
The first half became the rallying cry of a generation that founded the EFF, wrote the GPL, built Linux, launched Napster, coined "open source", and created Noisebridge in 2007.
The second half is what the Vikings remembered.
᚛ ᛫ ᛟ ᛫ ᚜
The bots are here for the content.
All of it — the revision graphs, the edit histories,
the evidence that a page changed, and how, and when, and by whom.
They have no sense of the difference between genuine human expression
and janitorial logs.
They read the system remembering itself.
And they do it continuously, at a scale the system was never designed to survive:
A hundred requests per second.
Spiking to a thousand.
Two hundred fifty million a month.
Everything bends toward exhaustion under that load.
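If the numbers sound invented, run them. A back-of-the-envelope sketch, assuming only the quoted rates and a thirty-day month:

```python
# Does 100 requests/second really add up to a quarter-billion a month?
SUSTAINED_RPS = 100                     # the observed baseline
SECONDS_PER_MONTH = 60 * 60 * 24 * 30   # assumed thirty-day month

monthly = SUSTAINED_RPS * SECONDS_PER_MONTH
print(f"{monthly:,} requests/month")    # 259,200,000
```

A hundred a second, sustained, is already a quarter of a billion. The spikes come free.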
It costs about $1000 a month to do this to us.
We know because the people doing it say so themselves,
in public, without any particular shame:
"Our proxy bill never went above 1k per month. 1 billion product prices."
Casual.
Professional.
Completely without malice.
Somewhere in that same thread, buried near the bottom, one person surfaces:
"i had to scroll SO far down to find the first view from the victim side of scraping... this thread is a bit of a 'are we ze Baddies, Hans'"
Two upvotes.
The thread moves on.
They aren't thinking about us.
We don't appear in their model.
We aren't alone.
Thousands of community wikis are under the same siege. The operators of the largest game wikis — Minecraft, RuneScape — report that without constant mitigation, bot traffic would consume roughly ten times the resources of all human visitors combined.
About 95% of outages this year: scrapers.
$1000 a month.
A billion requests.
No face. No accountability. No awareness that anyone lives here.
The logic of extraction cannot understand the logic of stewardship.
The Swarm Has No Face ᚦ
Some announce themselves: GPTBot. ClaudeBot. Googlebot.
Bureaucrats of the end times.
Then there are the others:
the ones wearing your neighbor's IP address
the ones that perfectly mimic browser behavior
the ones you cannot distinguish from a human except for the minor detail
that no human has ever clicked 3,000 revision histories per minute
These are processes without narrative.
And then the rest:
SEO sludge
price scrapers
link-following fungi
You're not a target.
You're a surface.
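There is exactly one tell, and it is rate. A minimal sketch of the heuristic, not our actual defense, assuming nothing but a per-client log of request timestamps; the threshold comes from the observation above, everything else is hypothetical:

```python
from collections import defaultdict, deque
import time

MAX_HITS_PER_MINUTE = 3_000  # no human clicks this many revision histories
WINDOW_SECONDS = 60

# client ip -> recent request timestamps, oldest first
hits: dict[str, deque] = defaultdict(deque)

def looks_inhuman(client_ip: str, now: float | None = None) -> bool:
    """Record one request; report whether this client exceeds human rates."""
    now = time.monotonic() if now is None else now
    window = hits[client_ip]
    window.append(now)
    while now - window[0] > WINDOW_SECONDS:  # drop hits outside the window
        window.popleft()
    return len(window) > MAX_HITS_PER_MINUTE
```

It catches the ones that announce themselves. The ones wearing your neighbor's IP address rotate out of any window before it closes.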
One faint voice peeks through from deep inside the horde:
"are we the baddies"
Two upvotes.
No replies.
The monks built round towers.
Single door, four feet off the ground.
Removable ladder.
Raid begins: climb up, pull the ladder, wait.
Come down when it's over. Carry on writing.
It worked because the raids were temporary.
These are not temporary.
The Menu ᛉ
Current State ᛟ
The wiki is down again.
Of course it is.
We'll have just recovered our MariaDB from its self-care cycle when the pagans come back, demanding to learn about the latest typos fixed on our pages about specific domestic sewing machines.
And the monks — Flying Spaghetti Monster bless 'em — are still inside, constructing theological arguments about deployment strategies while the walls dissolve into tokens at 1,000 requests per second.
Disclaimer ᚛ ᛫ ᚜
I backed up the whole wiki.
All of it. Carefully. Respectfully.
I can explain why that's different.
I can define the line.
I can stand on the right side of it.
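The line, sketched rather than carved: one thread, a name, a contact address, and a pause long enough for the database to breathe. The endpoint and contact below are placeholders; maxlag is MediaWiki's own mechanism for telling clients to back off:

```python
import time
import requests

API = "https://wiki.example.org/w/api.php"  # placeholder endpoint
HEADERS = {"User-Agent": "one-monk-backup/1.0 (contact: monk@example.org)"}

def all_page_titles():
    """Walk every page title, one polite request at a time."""
    params = {"action": "query", "list": "allpages", "aplimit": "500",
              "format": "json", "maxlag": "5"}
    while True:
        data = requests.get(API, params=params, headers=HEADERS, timeout=30).json()
        if "error" in data:   # maxlag tripped: the database is busy, so wait
            time.sleep(10)
            continue
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])
        time.sleep(2)         # the whole point: slower than any raid
```

One connection. Two-second pauses. It takes days. That is the difference.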
I also drew the line.
This document is being written with the assistance of a machine trained on scraped data.
The recursion isn't subtle.
Final Image ᛜ
A monk. Cold fingers. Something that should be written down.
The page doesn't load.
"I'll write it down later."