AI didn’t ask. It just took.
It crawled your websites. Trained on your stories. Summarized your headlines.
And gave nothing back.
Now, finally, someone has said no.
Cloudflare just blocked major AI bots from accessing millions of websites. Not someday. Not maybe. But now.
This changes everything. Or at least, it should.
Because if Cloudflare can say no to AI, then maybe it’s time for readers, creators, and the whole internet to ask — what are we really giving away?
What Cloudflare Did
Cloudflare is one of the biggest infrastructure companies on the internet. It protects millions of websites, including many news publishers.
This week, it made a strong move.
It blocked AI crawlers like GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot (Perplexity) by default. That means these bots can no longer silently crawl your content, learn from it, and use it to train their models unless you explicitly allow them.
For once, creators didn’t have to fight or file lawsuits. Cloudflare just did it. Quietly. With clarity.
And that’s a big deal.
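For context, the opt-out mechanism the web already had is robots.txt, a plain-text file that crawlers are expected to honor voluntarily. A minimal sketch of how that check works, using Python's standard library (the robots.txt content below is illustrative, not any real publisher's file):

```python
# Sketch: the honor-system check a well-behaved crawler performs.
# Cloudflare's change enforces the block at the network edge instead,
# so it applies even to bots that ignore robots.txt.
from urllib import robotparser

# Hypothetical robots.txt: block GPTBot, allow everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# GPTBot is disallowed from fetching any page on this site.
print(rp.can_fetch("GPTBot", "https://example.com/article"))        # False
# Any other crawler falls through to the wildcard rule and is allowed.
print(rp.can_fetch("SomeOtherBot", "https://example.com/article"))  # True
```

The limitation is obvious from the code: nothing forces a crawler to run this check. That is why an infrastructure-level block by default is a different kind of move.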

Why It Matters
Let’s not forget how this all started.
AI companies crawled the internet for years. They built billion-dollar models by learning from everything online — including the work of journalists, bloggers, independent creators, and researchers.
They didn’t ask. They didn’t pay. They didn’t link back.
Most creators didn’t even know they were being used.
The people running some of the biggest AI companies today fly private jets.
But the models that paid for those jets were trained on the unpaid work of people still fighting to keep their jobs.
Every pen fed the machine.
But the pen never gets paid. The writer never gets credit. The newsroom never gets traffic.
That is the gap.
And Cloudflare, at least, decided to close it.
But Here’s the Problem
Google doesn’t need GPTBot.
It already has Googlebot. It’s been crawling your site for years — and you’ve allowed it, because you rely on Google for traffic.
But now, Google is using that same crawl to power something else. AI Overviews. Generative summaries. Direct answers. Instant explanations.
The kind that take your headline, your facts, your story, and package it into a neat little AI box — where the user never needs to click through to you.
You can’t block it. You can’t stop it. Because if you do, you disappear from Google Search altogether.
That’s not consent. That’s dependence.
And it’s dangerous.
What This Means for Newsrooms
Let’s be clear. AI is not a journalist.
It cannot attend a press conference.
It cannot investigate a scam.
It cannot file an RTI.
It cannot earn trust, feel fear, take responsibility.
All it can do is remix and summarize the work of people who did.
But now, those people are being replaced in the chain. The platform takes their work, rewrites it, and presents it as a product.
And the journalist — the one who did the work — disappears from view.
That’s not innovation. That’s erasure.
And Even If AI Starts Paying — Who Wins?
Let’s say OpenAI and Google start cutting cheques.
Will that fix everything?
Maybe it helps publishers recover lost ad revenue.
Maybe it makes up for the drop in clicks, the vanishing homepage traffic, the SEO decay.
But what about the journalist?
What about the one who did the legwork?
The one whose name disappears from the AI summary?
The one whose byline never shows up in the answer?
If money flows, but credit doesn’t, does it really help the person who created the story?
That’s the uncomfortable truth.
Even with AI licensing deals, it’s still the same game —
The platform profits. The publisher adjusts.
And the journalist fades.
That’s not a sustainable ecosystem.
That’s a system where value is extracted, not shared.
What Happens If More Companies Join Cloudflare
Right now, it’s just Cloudflare. But imagine if other internet infrastructure companies also start blocking AI bots.
Akamai. AWS CloudFront. Fastly. Vercel. Netlify.
If they follow Cloudflare’s lead, AI models will start losing access to real-time, high-quality content.
They won’t be able to keep up.
Their answers will go stale.
Their hallucinations will rise.
And maybe, just maybe, they’ll be forced to do what they should have done all along — ask for permission and pay for value.
But That Still Leaves One Question
If Cloudflare can say no to AI, why can’t we?
Why can’t users choose to visit a news site instead of asking a chatbot?
Why can’t creators stop giving away their work for free, just to feed a machine that doesn’t credit them?
Why can’t we, the people who read, write, and search, decide that news is not just data — it is a public service?
Journalism doesn’t survive because it exists.
It survives because someone reads it, shares it, supports it.
AI didn’t break journalism.
But how we use AI might.
My Take
Cloudflare did its part.
And yes, maybe AI companies will start paying.
Maybe licensing deals will come.
Maybe publishers will recover some of the revenue they lost when users stopped clicking.
But let’s be honest — even when money flows, it usually flows up.
The platform gets stronger.
The publisher gets compensated.
But the journalist — the one who created the story — still disappears from the chain.
So the question isn’t just whether AI should crawl.
It’s whether the people who built the web of knowledge will ever be seen again.
That’s why this isn’t just a tech debate.
It’s a cultural reset.
Stop reading answers with no names.
Stop replacing truth-seekers with token payments.
Stop acting like summaries are enough.
Because the future of journalism doesn’t depend on scraping policies or payment deals.
It depends on visibility, credit, and choice.
Cloudflare took the first step.
Now it’s our turn.
Go to the original source.
Read the full story.
Support the human who made it possible.
That’s how journalism survives.
Not by feeding the machine — but by remembering who wrote the first word.
Written by someone who still believes the byline matters.
For everyone whose work built the internet, but got forgotten when the machines showed up.
– Rudra Kasturi