New Research - Feb 2026

The permissions layer for the agentic web

8 competing AI policy standards. No interoperability. 90% of websites say nothing at all. Maango is building the canonical registry for AI permissions - so agents know what's allowed, and websites decide what is.

999,316 domains analyzed · 90.1% have no AI policy · 8 standards parsed · Feb 2026

Built for you

What brings you here?

Ship compliant agents - without reading a million robots.txt files.

Compliance on autopilot

One API call returns structured permissions for any domain - training, search, and inference. No parsing. No guessing.
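In practice that lookup might look like the sketch below. The endpoint, field names, and response shape are illustrative assumptions, not Maango's actual API schema:

```python
import json

# Hypothetical response for a single-domain permissions lookup.
# The schema (domain / permissions / sources / checked_at) is an
# assumption for illustration, not Maango's documented format.
sample_response = json.loads("""
{
  "domain": "example.com",
  "permissions": {
    "training": "disallowed",
    "search": "allowed",
    "inference": "allowed"
  },
  "sources": ["robots.txt", "llms.txt"],
  "checked_at": "2026-02-01T00:00:00Z"
}
""")

def may(perms: dict, use: str) -> bool:
    """Treat anything other than an explicit 'allowed' as a no -
    a conservative default for a compliance-minded agent."""
    return perms.get(use) == "allowed"

perms = sample_response["permissions"]
print(may(perms, "training"))   # explicit 'disallowed'
print(may(perms, "inference"))  # explicit 'allowed'
```

Note the conservative default: an unknown or missing use case resolves to "no", so an agent never treats silence as permission.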

Audit trails built in

Every policy check is logged with timestamps. When regulators or publishers ask, you have receipts.

Ship faster

Stop hand-parsing 8 competing formats. We resolve conflicts across robots.txt, llms.txt, ai.txt, and more - so your agent doesn't have to.
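One way a resolution layer like this can work - a sketch under assumed rules, not Maango's actual algorithm - is to collect each standard's verdict for a use case and let the most AI-specific standard that expresses an opinion win:

```python
# Sketch of cross-standard conflict resolution. The precedence order
# and verdict vocabulary here are assumptions for illustration only.
PRECEDENCE = ["llms.txt", "ai.txt", "robots.txt"]  # more AI-specific first

def resolve(signals: dict) -> str:
    """signals maps a standard name to 'allow' or 'deny' for one use case.
    The first standard in PRECEDENCE that expresses an opinion wins;
    silence from every standard resolves to 'unspecified'."""
    for standard in PRECEDENCE:
        if standard in signals:
            return signals[standard]
    return "unspecified"

# robots.txt denies, but the more AI-specific llms.txt allows:
print(resolve({"robots.txt": "deny", "llms.txt": "allow"}))  # allow
# No standard says anything - the 90% case:
print(resolve({}))  # unspecified
```

The empty-dict case is the important one: most domains return "unspecified", which is exactly the ambiguity a registry of explicit declarations is meant to eliminate.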

90% of sites say nothing

Without a resolution layer, your agent is guessing permissions. That's a lawsuit waiting to happen.

EU AI Act compliance deadlines are approaching. The cost of guessing goes up every month.

Get Your API Key

AI agents are already visiting your site. Do you know what they're doing?

Understand what AI sees

Discover what signals your site sends - or doesn't - to AI crawlers across 8 different standards.

Protect your content

Set explicit rules for training, scraping, and indexing. Don't let silence become consent.

Declare your policy once

Set your AI permissions through Maango and every agent that checks the registry respects them. No config files to maintain across 8 different standards.

One dashboard, 8 standards

robots.txt, llms.txt, ai.txt, and 5 more - each saying different things. Maango unifies them so you don't have to manage each one.

Your competitors already started

Nearly 5% of the top million sites block AI entirely. Others are setting nuanced policies. Every day you wait, the default is set for you.

90% of websites have no AI policy. Silence isn't neutral - it's a decision you didn't make. Claim your domain and set the rules.

State of AI Agent Policies 2026

We crawled 1 million domains. 90% have no AI policy.

The first comprehensive study of how the web governs AI agents. 8 standards parsed. 13 sections of findings.

90.1% - No machine-readable AI policy
4.8% - Explicitly block all AI agents
6.9% - Block GPTBot (most-blocked bot)
2.6% - Have comprehensive AI policies
Read the Full Report

From aggregator to authority

Today, Maango reads every AI policy standard because no single one is authoritative. robots.txt wasn't built for AI. llms.txt is optional. ai.txt covers only Cloudflare's ~20% of the web. The web has 8 competing formats and no source of truth.

That's the gap we're closing. As website owners claim their domains and declare policies directly through Maango, the registry becomes the canonical layer - not a scraper of fragments, but the place where AI permissions live. One registry that every agent checks. Like DNS, but for AI permissions.

API Live Now

Don't build blind. Don't publish blind.

90% of the web has no AI policy. Whether you're building agents or running a website, you need a source of truth.