Akamai Functions

Build and run modern apps and AI with rapid, highly distributed serverless functions

Modern apps and AI thrive on speed

Akamai Functions helps teams build, deploy, and scale modern applications and AI workloads by running ultra-fast WebAssembly functions across the world’s most distributed cloud. It reduces latency, lowers cost, and removes the burden of managing global infrastructure.

Why use Akamai Functions?

Instant startup, effortless scale

WebAssembly functions start in microseconds and scale automatically, keeping modern apps and AI responsive under heavy traffic.

Global by default, no complexity

Deploy once and run everywhere without managing regions or infrastructure, so teams focus on features, not ops.

Open, portable by design

Built on standards-based WebAssembly runtimes to maximize flexibility, portability, and freedom from vendor lock-in.

How Akamai Functions works

Write

Use Rust, Go, JavaScript, Python, and other languages that compile to WebAssembly.

Deploy

Package and deploy using open tooling like Spin and SpinKube: run locally, in Kubernetes, or directly on the edge.

Run

Run on a Wasmtime-based runtime with strong isolation and memory safety.

Scale

Scale globally across Akamai’s distributed cloud by default.
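The write-to-run flow above can be sketched in plain Python. This is an illustrative stand-in for the handler shape, not the actual Spin or Akamai Functions SDK; real components use the SDK for your chosen language and compile to WebAssembly.

```python
# Illustrative sketch only: a plain-Python stand-in for an HTTP function
# handler. The request/response dict shapes are assumptions for this
# example, not the platform API.

def handle_request(request: dict) -> dict:
    """Route a request dict to a response dict, mimicking edge logic."""
    path = request.get("path", "/")
    if path == "/health":
        # Lightweight checks like this are a natural fit for the edge.
        return {"status": 200, "body": "ok"}
    return {"status": 200, "body": f"hello from the edge: {path}"}
```

In a real deployment, the equivalent handler would be compiled to Wasm and invoked by the runtime on every request, with no server to provision.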

Core features

Execute WebAssembly functions at the edge with microsecond startup

Run APIs, routing, and real-time logic with microsecond startup, keeping applications responsive for users worldwide.

Deploy functions globally with a single command

Ship once and run functions across Akamai’s distributed edge without configuring regions or replicas.

Run hardened, sandboxed functions without containers or VMs

Execute code in isolated WebAssembly sandboxes without managing containers or virtual machines.

Scale functions automatically based on global traffic demand

Automate function scaling across locations as traffic changes.

Integrate edge functions with CDN request and response flows

Run logic on incoming requests and outgoing responses to personalize, route, or transform traffic.

Persist and access data using edge key-value storage

Store and retrieve data close to users for redirects, tokens, personalization, and configuration.

Offload authentication, tokens, and bot logic from origins

Make fast decisions at the edge and seamlessly route heavier processing to cloud CPUs or GPUs when needed.

Spin and SpinKube for WebAssembly functions

Build portable WebAssembly functions with Spin. Deploy seamlessly on SpinKube or Akamai Functions.

Spin

Developer framework and CLI for building fast Wasm apps.

SpinKube

Develop and operate Spin apps in your Kubernetes cluster.

Akamai Functions Use Cases

Akamai Functions combines rich portability, deep platform integration, and unmatched geographic density to power core use cases.

Mass redirects

If you’re launching a new website or migrating traffic, redirects quickly become a high-effort problem. Managing hundreds of thousands or even millions of rules can introduce latency, overload your origin, and turn a simple change into an operational headache. Every redirect has to resolve instantly, at global scale, without slowing users down.

The answer is to move redirect logic to the edge. With Akamai Functions, redirects run directly where users connect, not upstream. Rules are kept in memory and in a dedicated KV store, enabling instant lookups and responses in just a few microseconds, even at massive scale. There is no origin dependency, no added latency, and no infrastructure to manage.

You can ingest large redirect sets, validate them, and deploy updates globally in seconds using an easy-to-use redirect builder. That means faster page loads everywhere, lower costs from eliminated upstream traffic, and a developer experience that stays simple and predictable as your ruleset grows.

When redirects are critical to your launch or migration, they should not slow you down. Mass redirects at the edge let you move fast, scale globally, and deliver a consistently fast experience from day one.
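The lookup path described above can be sketched in plain Python. The dict stands in for the dedicated edge KV store, and the rule set and function names are illustrative, not the platform API:

```python
# Sketch of edge redirect resolution: rules live in a fast lookup
# structure (here a dict, standing in for the edge key-value store).

REDIRECTS = {
    "/old-pricing": ("/pricing", 301),
    "/blog/2019/launch": ("/blog/launch", 301),
}

def resolve_redirect(path: str):
    """Return (location, status) if a redirect rule matches, else None."""
    return REDIRECTS.get(path)
```

Because the lookup is a single key fetch rather than a rules-engine scan, resolution cost stays flat as the rule set grows into the millions.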

Token management

If you deliver premium or high-value content, tokens are a critical line of defense. At peak traffic, piracy and token abuse can directly impact revenue, while centralized token services introduce latency and struggle to scale for millions of concurrent viewers. Managing revocation across multi-CDN environments only adds more complexity.

The solution is to run token logic at the edge. With Akamai Functions, token generation, validation, and revocation execute directly where users connect. Functions run globally with built-in replication and no cold starts, allowing token decisions to be made instantly and consistently, even at massive scale.

This approach keeps legitimate users moving without disruption while enforcing token policies everywhere. Tokens are validated close to both valid viewers and malicious actors, reducing latency and enabling fast, globally enforced revocation without relying on fragile centralized systems. Akamai Functions also integrates cleanly into existing multi-CDN workflows, so you can extend your current architecture rather than replace it.

When token enforcement needs to be fast, scalable, and globally consistent, edge-based execution makes the difference. Edge token generation and validation help protect premium content, preserve performance, and maintain control as traffic scales worldwide.
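A minimal sketch of the generate-and-validate pattern, using HMAC signing from the Python standard library. The token format and secret handling here are assumptions for illustration; production systems manage secrets and revocation through the platform:

```python
import hmac
import hashlib

SECRET = b"demo-secret"  # illustrative; real deployments load this securely

def make_token(user: str, expires_at: int) -> str:
    """Sign user + expiry so the edge can validate without a central call."""
    msg = f"{user}:{expires_at}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{user}:{expires_at}:{sig}"

def validate_token(token: str, now: int) -> bool:
    """Check signature and expiry; reject malformed tokens."""
    try:
        user, expires_at, sig = token.rsplit(":", 2)
    except ValueError:
        return False
    msg = f"{user}:{expires_at}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < int(expires_at)
```

Because validation needs only the shared secret, every edge location can accept or reject a token locally, with no round trip to a central token service.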

Bot management

Modern bots do not behave like bots anymore. They execute JavaScript, follow normal navigation paths, and blend into legitimate traffic. That makes detection difficult and raises the stakes. You need to identify malicious automation quickly and accurately, without adding latency or disrupting real users. At the same time, obvious mitigation tactics like hard blocks or error pages can backfire by training attackers to adapt and bypass defenses.

Akamai Bot Manager solves the detection problem by accurately identifying malicious and unwanted bots. The real challenge is what happens after detection. By combining Bot Manager with Akamai Functions, mitigation becomes programmable and flexible. Instead of blocking traffic outright, identified bots can be routed to a function that dynamically generates bot-specific responses.

These responses appear legitimate to crawlers while quietly summarizing, removing, or obfuscating sensitive data. That protects content and business logic without signaling detection. To operate efficiently at scale, generated responses can be stored in a dedicated key-value store or cached on Akamai’s CDN, allowing repeat bot requests to be handled instantly with no performance impact.

The result is advanced bot detection and mitigation that protects legitimate traffic, maintains fast performance, and evolves as bot behavior changes. You gain precise control over how bots are handled, without relying on static rules or disruptive challenges.
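The bot-specific response step can be sketched as a simple redaction pass. The field names and record shape below are illustrative assumptions, not a Bot Manager API:

```python
# Sketch of a bot-specific response generator: serve a plausible-looking
# record while obfuscating fields with business value.

SENSITIVE_FIELDS = {"price", "inventory", "email"}

def bot_safe_response(record: dict) -> dict:
    """Return a copy of the record with sensitive fields masked."""
    return {
        key: ("***" if key in SENSITIVE_FIELDS else value)
        for key, value in record.items()
    }
```

Serving a structurally valid page while masking the data keeps the bot from learning it has been detected, which is the point of avoiding hard blocks.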

Hyper-personalization

If you rely on generic product pages, you leave conversion and engagement on the table. But traditional personalization approaches come with their own trade-offs. Pre-generating and storing personalized variants creates cost and operational overhead, while centralized inference adds latency that breaks the fast experience shoppers expect.

The alternative is real-time personalization at the edge. By combining Akamai Functions with LLMs running on Akamai Cloud, personalized content is generated on the fly, close to the user. As customers return and build a purchase history, real-time context like location and past behavior can be used to tailor product pages instantly, without pre-building or managing thousands of variants.

Inference runs at the edge using GPUs on Akamai Cloud, keeping response times low and experiences consistent at global scale. Personalization logic executes only when needed, scales automatically with traffic, and avoids lock-in by allowing you to choose from a wide range of models to match your strategy.

The result is a shopping experience that feels genuinely personal and remains fast everywhere. You can increase conversion with dynamic, relevant product experiences while offloading inference workloads to the edge and maintaining the performance modern ecommerce demands.
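The context-assembly step described above can be sketched as follows. The prompt template and context fields are illustrative assumptions, and the model invocation itself is elided:

```python
# Sketch of assembling real-time shopper context into a prompt for an
# LLM running on nearby GPUs. Field names are illustrative only.

def build_prompt(product: str, context: dict) -> str:
    """Fold location and purchase history into a personalization prompt."""
    location = context.get("location", "unknown")
    history = ", ".join(context.get("recent_purchases", [])) or "none"
    return (
        f"Write a short product blurb for {product}. "
        f"Shopper location: {location}. Recent purchases: {history}."
    )
```

Building the prompt at the edge keeps the function stateless and cheap; only the inference call itself is routed to GPU capacity.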


Resources

Get started now

Build your first function. Follow quick-start guides and examples to deploy serverless logic.

Akamai Functions documentation

Dive into APIs, runtimes, and architecture patterns.

The next evolution of serverless

Pair the fastest serverless WebAssembly runtime with the most distributed cloud to build smarter apps and scale instantly.

Frequently Asked Questions (FAQ)

What is Akamai Functions?

Akamai Functions is an edge-native serverless platform that runs WebAssembly functions globally by default. It enables teams to execute application logic and AI workflows within milliseconds of users, without managing servers, regions, or infrastructure.

How is Akamai Functions different from other serverless platforms?

Unlike region-based serverless platforms, Akamai Functions runs everywhere on Akamai’s distributed edge. It uses a WebAssembly runtime with sub-millisecond startup, delivering predictable low latency without multi-region complexity.

Does Akamai Functions replace EdgeWorkers?

No. EdgeWorkers is optimized for lightweight JavaScript-based CDN logic. Akamai Functions provides a rich WebAssembly runtime that supports broader application logic, APIs, and AI workflows while retaining global, load-balanced execution. Together, EdgeWorkers and Akamai Functions form a complementary serverless solution.

Which programming languages does Akamai Functions support?

You can use any language that compiles to WebAssembly, including Rust, Go, JavaScript, and Python. This gives teams flexibility while avoiding runtime lock-in tied to a single language or framework.

Can Akamai Functions run full application logic?

Yes. Akamai Functions supports real application logic such as APIs, microservices, event processing, and AI pre-/post-processing. It complements cloud CPUs, GPUs, and containers rather than replacing them.

How do deployment and scaling work?

Functions deploy once and run globally across Akamai’s edge network. They start in microseconds, scale automatically with traffic, and execute close to users to minimize latency and reduce egress costs.

Is Akamai Functions built on open standards?

Yes. Akamai Functions is built on open WebAssembly and WASI standards, and works with open source frameworks like Spin and SpinKube, allowing teams to retain portability across environments.


Book a demo

Get a demo of Akamai Functions to see the fastest, most distributed serverless functions engine for modern apps and AI.
