The new reality of automated traffic
Automation isn’t new — what’s new is how invisible it’s become. AI agents, fetchers, and crawlers are now woven into the fabric of online interaction. Yet the web’s core trust mechanisms haven’t kept up, leaving most websites unable to tell whether a bot is verified, spoofed, or malicious.
So the question becomes: How can we prove a bot is really what it claims to be — cryptographically, at scale, and in real time?
Introducing Web Bot Authentication
Web Bot Authentication (Web Bot Auth) is an emerging standard that allows bots to prove their identity via cryptographic signatures embedded in standard HTTP messages. A verified bot, such as an AI search crawler or fetcher, signs its HTTP requests with a private key. Servers and intermediaries, such as the Akamai edge, can then verify the signature using the crawler’s published public key.
This follows the mechanism defined by the Internet Engineering Task Force (IETF) in HTTP Message Signatures, which specifies how to create and verify signatures and how to carry them in the Signature-Input and Signature HTTP headers.
At Akamai, we see this as a critical next step in evolving the bot management ecosystem from heuristic detection toward verifiable identity.
The standards powering the change
Two IETF documents anchor this work:
RFC 9421: HTTP Message Signatures — Defines how to create, include, and verify message signatures, including the Signature-Input and Signature headers that bind the covered components and parameters
HTTP Message Signatures Directory (Internet-Draft) — Proposes a standardized way for clients to advertise signing keys in a JSON Web Key Set (JWKS)–based directory, discoverable via well-known locations and an optional Signature-Agent header; it also registers the media type application/http-message-signatures-directory+json
Together, these documents establish a public key foundation for automation so verifiers can validate bots without static allowlists or opaque partnerships.
How Web Bot Auth works in practice
Validation takes three steps:
Key publication
Signed request
Signature verification
Key publication
A verified bot publishes its public key directory at a well-known URI, served with the media type application/http-message-signatures-directory+json.
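For illustration, here is a minimal Python sketch of how a verifier might retrieve and parse such a directory. The URL, key identifier, and key type shown are hypothetical; the directory itself is a JSON Web Key Set (JWKS) served with the media type above, discoverable at a well-known path as described in the Internet-Draft.

import json
import urllib.request

# Hypothetical directory location; the Internet-Draft describes a well-known path
# of the form /.well-known/http-message-signatures-directory.
DIRECTORY_URL = "https://bot.example.com/.well-known/http-message-signatures-directory"

def fetch_key_directory(url: str) -> dict:
    """Fetch a bot's signing-key directory and return the parsed JWKS."""
    request = urllib.request.Request(
        url,
        headers={"Accept": "application/http-message-signatures-directory+json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

directory = fetch_key_directory(DIRECTORY_URL)
# The JWKS carries public keys under "keys"; each entry is a JWK, for example:
# {"keys": [{"kty": "OKP", "crv": "Ed25519", "x": "...", "kid": "bot-key-2025"}]}
for jwk in directory.get("keys", []):
    print(jwk.get("kid"), jwk.get("kty"))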
Signed request
The bot signs selected HTTP message components and includes the resulting Signature header in the request. The accompanying Signature-Input header provides the "recipe" used to generate the signature.
Signature-Input: sig1=("@method" "@target-uri" "user-agent"); keyid="https://bot.example.com/.well-known/http-message-signatures"; alg="rsa-pss-sha512"
Signature: sig1=:Base64SignatureValue:
Field names and serialization follow RFC 9421. Whether the algorithm may be signaled at runtime through the alg parameter is left to application policy, so some deployments permit it and others forbid it.
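As a rough sketch of the signing side, the following Python example builds an RFC 9421 signature base for the three components above and signs it with rsa-pss-sha512 (RSASSA-PSS with SHA-512 and a 64-byte salt). The key file, target URI, and user-agent value are hypothetical, and optional parameters such as created are omitted for brevity.

import base64

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Covered components, mirroring the example headers above (values are hypothetical).
components = {
    '"@method"': "GET",
    '"@target-uri"': "https://www.example.com/page",
    '"user-agent"': "ExampleBot/1.0",
}
params = (
    '("@method" "@target-uri" "user-agent")'
    ';keyid="https://bot.example.com/.well-known/http-message-signatures"'
    ';alg="rsa-pss-sha512"'
)

# RFC 9421 signature base: one line per covered component, then @signature-params.
lines = [f"{name}: {value}" for name, value in components.items()]
lines.append(f'"@signature-params": {params}')
signature_base = "\n".join(lines)

# Load the bot's RSA private key (PEM, unencrypted here for brevity).
with open("bot-private-key.pem", "rb") as key_file:
    private_key = serialization.load_pem_private_key(key_file.read(), password=None)

# rsa-pss-sha512 in RFC 9421: RSASSA-PSS with SHA-512 and a 64-byte salt.
signature = private_key.sign(
    signature_base.encode("utf-8"),
    padding.PSS(mgf=padding.MGF1(hashes.SHA512()), salt_length=64),
    hashes.SHA512(),
)

headers = {
    "Signature-Input": f"sig1={params}",
    "Signature": f"sig1=:{base64.b64encode(signature).decode()}:",
}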
Signature verification
After receiving a request, the server reads the recipe from the Signature-Input header, fetches the bot's public key, reconstructs the signature base, and verifies that the value in the Signature header is a valid signature over that base.
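Continuing the sketch, a verifier might rebuild the signature base from the received headers and check it against the bot's published RSA public key. Fetching the key from the directory and parsing the recipe generically are omitted; the covered components are assumed to match the example above, and all names here are illustrative.

import base64

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def verify_request(headers: dict, method: str, target_uri: str, public_key) -> bool:
    """Rebuild the RFC 9421 signature base and verify the Signature header value."""
    # The Signature-Input header carries the recipe, e.g. sig1=("@method" ...);keyid=...;alg=...
    label, _, recipe = headers["Signature-Input"].partition("=")
    encoded = headers["Signature"].split(f"{label}=:", 1)[1].rstrip(":")
    signature = base64.b64decode(encoded)

    # A production verifier derives each covered component from the recipe; this
    # sketch assumes the three components used in the example above.
    signature_base = "\n".join([
        f'"@method": {method}',
        f'"@target-uri": {target_uri}',
        f'"user-agent": {headers["User-Agent"]}',
        f'"@signature-params": {recipe}',
    ])

    try:
        public_key.verify(
            signature,
            signature_base.encode("utf-8"),
            padding.PSS(mgf=padding.MGF1(hashes.SHA512()), salt_length=64),
            hashes.SHA512(),
        )
        return True
    except InvalidSignature:
        return False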
How Akamai enables Web Bot Authentication
Web Bot Auth is already supported by Akamai's bot and abuse protection solutions, including bot visibility and mitigation in Akamai App & API Protector. Akamai enables Web Bot Auth in two steps:
Edge verification — The Akamai edge retrieves the bot’s signature directory, validates the signature using RFC 9421 rules, and confirms the request's authenticity without adding complexity for origin systems.
Policy decision — If the signature is valid, Akamai classifies the request as validated-bot traffic in our bot directory. Customers can then define policy actions for traffic verified with Web Bot Auth, allowing it or mitigating it with challenge, delay, or block actions, based on their preference or need.
Why it matters
Web Bot Authentication helps prove a bot is really what it claims to be by enabling verified automation and providing websites with cryptographic proof of who is making a request. With Web Bot Auth:
Signed requests replace spoofable user-agent claims.
Legitimate crawlers and AI agents can be verified instead of inferred.
Dynamic key discovery removes the need for static allowlists.
The same framework extends to AI agents, APIs, and service-to-service communication.
Evolving standards for agentic commerce
Web Bot Auth is just one of several emerging protocols shaping how automated agents prove authenticity online. Rather than defining trust outright, Web Bot Auth verifies origin.
The next evolution will be putting the user behind the agent at the center: establishing identity, reputation, and control on the basis of who operates the agent and how it behaves across interactions.
Beyond Web Bot Auth: Know Your Agent
As the web enters the agentic era, new protocols are rapidly taking shape. Know Your Agent (KYA), developed by Skyfire, extends beyond Web Bot Auth. Instead of relying solely on cryptographic signatures, KYA introduces an identity verification layer that binds each agent to a verified organization, developer, platform, or end user through tokenized credentials.
The KYA protocol is extensible and can carry metadata such as business type, use case, and intent, which allows websites and APIs to differentiate between good automation (like research, search, or commerce) and undesirable scraping or abuse.
While Web Bot Auth focuses on proving where a request originates, KYA focuses on proving who is behind it and why it’s acting, enabling more granular permissions and monetization models. Other initiatives, such as Visa’s Trusted Agent Protocol and Mastercard’s Agent Pay, share this same goal: establishing a verifiable and interoperable trust fabric for automated interactions on the web.
Building the future together
Verified automation won’t evolve in a vacuum; it depends on collaboration. Akamai is partnering with industry peers, AI platforms, and developers — including OpenAI and Amazon AgentCore — to make Web Bot Auth and other verification protocols practical, interoperable, and privacy-preserving.
We’re on a mission to enable a secure agentic AI world that delivers real economic and business value, in which verified automation drives innovation, trust, and transparency for everyone.
If you’re building or managing automated agents, we invite you to join us in testing and contributing to help shape a safer web together.
To learn more about Akamai’s agentic AI strategy or to collaborate with us, contact an expert.