Ever play peek-a-boo with a baby? Cover your eyes, reveal them, the baby laughs. Your continued existence is a surprise to a baby every time.

Babies lack “object permanence”. They don’t understand that unseen things exist.
Bot traffic on your website still exists even though your analytics platform hides it.
And you’re paying for it.
Search for “see bot traffic”. Most of the results are for how to hide bot traffic. You ask how to see it, and the answer is how to hide it.

Imagine you tell a therapist “Doc, I got a lot of problems,” and they reply, “Have you tried ignoring them?”
Would you pay for that advice?
Ignoring a problem isn’t a solution.
Bots are roughly half your traffic. That traffic costs you money. You need to see that traffic so you can understand it. Who are they? What are they doing? Where are they going?
robots.nxt gives you what you need — robots.nxt lets you see bot traffic.
But we take it a step further — robots.nxt lets you control bot traffic.
We give you back control over your website. Your expenses. Your revenue.
The life-blood of your organization.
Content Delivery Networks and Feature Overload
“I block bots in my firewall,” said the 65-year-old network engineer, not accepting that all his firewall does now is manage traffic between his internal network and the CDN that caches his content and distributes it to the actual internet.
Cloudflare, a major content delivery network (CDN, a type of reverse proxy), built its business on stopping distributed denial-of-service (DDoS) attacks. It has since released a “one-click bot blocker” and an “AI Audit” tool, and announced a content marketplace where AI companies can license content from websites.
Content delivery networks are distributed reverse proxy services that cache your content and spread your website load to regional servers. CDNs improve page load time, increase site availability and uptime, and improve security.
Major CDNs are expensive. A university with 15k students told me they spend $20k/mo on Cloudflare. Smaller websites, like local services businesses that do content marketing, or specialty publishers, can’t afford that.
CDNs like Cloudflare offer a huge number of tools, but most users only need some of them. And you need an expensive enterprise plan for bot management tools.
robots.nxt was built to do one thing well and affordably — control bot traffic.
How Does robots.nxt Control Bot Traffic?
Setting up the robots.nxt proxy to control bot traffic only takes a few minutes.
Go to your “Settings” page and choose the “Proxy Setup” tab. It’ll give you a CNAME record to add to your DNS. Enter your source URL to confirm, and you’re done.
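For illustration, the DNS change is a single CNAME record. The hostnames below are hypothetical; your Proxy Setup tab supplies the real target:

```
; hypothetical zone-file entry -- use the exact target shown on your Proxy Setup tab
www.example.com.   3600   IN   CNAME   edge.robots-nxt.example.
```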

Once those orange buttons turn green, you can visit the “Access Rules” page and start to control bot traffic like you mean it.
Is that beyond your comfort level? No problem! We’ll do it for you.
Our proxy is like any other CDN, except we’re designed to control bot traffic, not just block it. Of course blocking is part of it; there are a lot of bad bots out there. But it’s not just good bots (indexers and optimizers) and bad bots (hackers and DDoS) anymore.
There’s a big new group to be managed, not given free rein or kicked out.
About 10% of your traffic is AI content scrapers collecting data and content for AI training. That share is growing faster than any other source of traffic.
Imagine your cash register didn’t work for your largest, fastest growing audience.
Would you say “take it for free”? That’s ignoring bots.
Would you say “get out of here and don’t come back”? It’s not their fault that your website’s point-of-sale is broken. But that’s what blocking content scrapers does.
A better approach is to set something up that lets you sell to them.
How do you monetize an audience? You make them pay.
How do you manage demand? You set a price.
These are basic, but nobody’s done this for bot traffic on websites yet.
robots.nxt lets you set a price for bots to access your website.
We intercept your traffic, sort humans and bots, and pass the humans on.
We categorize the 500 bots we recognize into 15 different categories.

You can manage bots in four ways:
- By bot
- By bot category
- By page route
- By content category
We offer one-to-one, one-to-many, and many-to-many for bots and pages/routes.
This is far more powerful than robots.txt, which is a polite 1:1 bot-to-route suggestion. With robots.txt, every bot and every route needs its own line, spelled out explicitly.
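For a sense of that verbosity, here’s what robots.txt needs just to wave off two scrapers from two hypothetical sections of a site:

```
# robots.txt: each bot gets its own block, each route its own line
User-agent: GPTBot
Disallow: /articles/
Disallow: /guides/

User-agent: CCBot
Disallow: /articles/
Disallow: /guides/
```

Multiply that by hundreds of bots and thousands of routes, and the file writes itself into unmaintainability.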
How long would it take to do that for thousands of pages and hundreds of bots?
You do all that work. And the bot can just… ignore it.
And what about the bots you don’t already know about?

robots.nxt is impossible for bots to ignore, because we evaluate every bot request against the rules, and if the bot’s request violates the rules, we just… ignore it.
And it doesn’t matter if we already recognize the bot or not, we can still control it.
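The evaluation logic can be sketched in a few lines. This is a simplified illustration with hypothetical bot names, categories, and a first-match rule list, not robots.nxt’s actual engine:

```python
# Hypothetical sketch of request-time rule evaluation.
# Bot names and category labels here are illustrative assumptions.

BOT_CATEGORIES = {
    "GPTBot": "content_scraper",
    "Googlebot": "page_indexer",
}

# Rules are checked in order; the first matching category wins.
RULES = [
    {"category": "content_scraper", "action": "block"},
    {"category": "page_indexer", "action": "allow"},
    {"category": "unknown", "action": "block"},  # unrecognized bots are still controlled
]

def decide(user_agent):
    """Return 'allow' or 'block' for a bot request."""
    category = BOT_CATEGORIES.get(user_agent, "unknown")
    for rule in RULES:
        if rule["category"] == category:
            return rule["action"]
    return "block"  # default: any bot not covered by a rule is refused
```

Note the two fallbacks: a bot we don’t recognize still lands in an “unknown” category, and anything that slips past every rule is refused by default.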

Here are the rules set up by one of our users. They’re explicitly blocking every bot that’s:
- A content scraper
- Out of service (in case someone is spoofing)
- Built using a software package for building bots
- Malware, botnets, and spam (for security, DDOS, and form spam protection)
- Unknown (any bot we don’t recognize but can tell is a bot)
- Any bot not otherwise covered by a rule set
They’re explicitly allowing:
- Page indexers
- Metrics & analytics
- RSS & summaries
- googlebot and bingbot
This change crushed their bot traffic overnight. They went from almost half their traffic being bots, to only a small number of bots being allowed to load their pages.
The next step is to allow content scrapers on your content routes, but put a price on that content. When a bot requests the priced route, robots.nxt puts its hand out and says “Ticket, please!”
If the bot pays, we serve them the route they requested.
If the bot doesn’t pay, we just ignore them.
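That “Ticket, please!” exchange can be sketched as an HTTP flow. The price table and payment token below are hypothetical assumptions, not robots.nxt’s real protocol; HTTP 402 Payment Required is simply the natural status code for the refusal:

```python
# Hypothetical sketch of pay-per-crawl gating on priced routes.

PRICED_ROUTES = {"/articles/": 0.002}  # illustrative price per request, in USD

def handle_bot_request(path, payment_token=None):
    """Return an HTTP-style (status, body) tuple for a bot request."""
    price = None
    for route, p in PRICED_ROUTES.items():
        if path.startswith(route):
            price = p
            break
    if price is None:
        return (200, "content")   # unpriced route: serve normally
    if payment_token:             # token validation assumed to happen upstream
        return (200, "content")   # paid: serve the requested route
    return (402, "Payment Required")  # no ticket, no content
```

The shape of the exchange is the whole point: unpriced routes flow through, paid requests get served, and unpaid requests get nothing.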
robots.nxt is a point-of-sale system built into your website that automatically monetizes your content to bots.
And that, my friends, is when things get really interesting.