Valkey: The Future of Open Source In-Memory Data Stores

Amit Mohanty

Jan 21, 2026


Written by

Amit Mohanty

Amit Mohanty is a Senior Product Manager at Akamai.


Executive summary

  • Valkey is an open source, in-memory data store created as a community-driven branch of Redis in response to Redis’s licensing changes, ensuring fully transparent, permissive BSD 3-clause licensing with no vendor lock-in.

  • Designed for the next generation of high-performance computing, Valkey delivers microsecond-level latency and millions of operations per second, supporting demanding use cases like caching, real-time analytics, session storage, rate limiting, and queuing.

  • Valkey offers seamless backward compatibility with the Redis ecosystem, enabling organizations to migrate with minimal changes and continue using existing clients and integrations.

  • Its open governance model guarantees transparent development and community ownership, facilitating rapid innovation and long-term stability.

  • Beyond traditional key-value storage, Valkey supports advanced features such as replication, clustering, pub/sub, server-side scripting, and lightning-fast vector database operations for AI and search use cases.

  • Positioned as more than just a Redis replacement, Valkey represents the evolution of open source in-memory data infrastructure for modern enterprise and developer needs.

In-memory data stores have become a critical part of modern application architecture. From caching and rate limiting to real-time analytics and distributed systems, developers rely on ultra-fast data operations to deliver near-instant performance. For years, Redis dominated this space, but today the open source community has a powerful new alternative: Valkey.

Valkey is a community-driven, fully open source in-memory data store built for the next decade of high-performance computing. In this blog post, we’ll explore what Valkey is, why it exists, its core benefits, and how it is reshaping the ecosystem.

What is Valkey?

Valkey is an open source (BSD 3-clause licensed), in-memory key-value database and cache. It’s a community-driven branch of Redis that was created in 2024 after Redis changed its license from open source to a source-available model.

Valkey is now part of the Linux Foundation and aims to remain fully open source, compatible with Redis protocols and data structures, and suitable as a drop-in replacement for many Redis use cases.

The story behind Valkey

Valkey emerged from a critical moment in the open source world. When Redis Labs changed Redis licensing terms, shifting away from the original permissive open source model, it created significant uncertainty for open source contributors, cloud vendors, and companies that were relying on truly open technologies.

To preserve a community-owned, permissively licensed (BSD) alternative, major open source advocates and engineers came together to form Valkey.

Valkey represents:

  • Fully open source governance

  • Community ownership

  • Transparent development

  • Long-term stability and compatibility

Why Valkey matters

Valkey embodies the open source community values, with:

  • 100% open source data store and zero licensing restrictions

  • High performance for modern workloads

  • Heavy focus on backward compatibility

  • Transparent community governance

100% open source data store and zero licensing restrictions

Valkey carries no licensing restrictions: it is released under the permissive BSD 3-clause license, so individuals, enterprises, and cloud providers alike can use, modify, and redistribute it without commercial terms or vendor lock-in. Its home at the Linux Foundation is designed to keep the project in community hands for the long term.

High performance for modern workloads

Like Redis, Valkey operates entirely in memory, delivering microsecond-level latency and millions of operations per second. It’s built to serve web applications, microservices, gaming systems, AI/ML workloads, Internet of Things (IoT) streaming pipelines, and real-time personalization engines. 

Independent performance benchmarks show Valkey already matches (and, in some cases, exceeds) established competitors in throughput and latency.

Heavy focus on backward compatibility

Valkey focuses heavily on backward compatibility. Most Redis applications require minimal or zero code changes because Valkey is compatible with the Redis protocol and wire format, common Redis commands, and popular Redis clients. This lowers the friction for organizations looking to transition without rewriting existing systems.

Transparent community governance

Valkey’s direction is guided by a publicly accessible Request for Comments (RFC) process, with improvement discussions ranging from clustering to new data types. Open governance ensures that decisions are driven by the community, with full transparency, faster innovation cycles, and no behind-closed-doors roadmap changes.

Key benefits of Valkey

Valkey supports a broad set of optimized data structures, enabling developers to build highly efficient solutions without complex architecture. 

It supports built-in replication, failover, and clustering to provide fault tolerance and horizontal scalability for production environments.

Valkey offers lightweight, low-latency publish/subscribe (pub/sub) mechanisms ideal for messaging, event notifications, and real-time apps. 

It also supports server-side scripting (Lua scripting), which allows advanced atomic operations without extra round trips between client and server. Because of its Redis compatibility, Valkey can leverage the same ecosystem of command-line interface tools, monitoring integrations, client libraries, and infrastructure orchestration (Helm, Terraform).
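As an illustration of why atomic server-side scripting matters, here is a sketch of the classic compare-and-delete used to safely release a distributed lock. On a real Valkey server this logic must run as a single atomic Lua script (via EVAL); below, a plain Python dict stands in for the server so the logic can be shown without a live connection, and the helper name release_lock is illustrative, not part of any Valkey API.

```python
# The Lua script a client would send with EVAL: delete the lock key only
# if it still holds our token, so we never release someone else's lock.
RELEASE_LOCK_LUA = """
if redis.call('GET', KEYS[1]) == ARGV[1] then
    return redis.call('DEL', KEYS[1])
else
    return 0
end
"""

def release_lock(store: dict, key: str, token: str) -> int:
    """Pure-Python equivalent of the script above, for illustration only."""
    if store.get(key) == token:
        del store[key]
        return 1
    return 0

store = {"lock:order-42": "abc123"}
assert release_lock(store, "lock:order-42", "wrong-token") == 0  # not ours
assert release_lock(store, "lock:order-42", "abc123") == 1       # released
```

Without atomicity, another client could acquire the lock between the GET and the DEL; running the whole check-and-delete server-side closes that race.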

Top Valkey use cases

Valkey excels in various use cases, including:

  • Caching 

  • Real-time analytics

  • Session storage

  • Rate limiting and throttling

  • Queueing and streaming

Caching

Valkey is an excellent choice for a caching layer, especially in high-traffic or low-latency applications. A caching layer stores frequently accessed data in memory (RAM) to avoid expensive operations, such as database queries, disk reads, API calls, or computation.

Valkey provides the ideal environment for this because it is extremely fast (microsecond read/write latency), in-memory, horizontally scalable, open source, and Redis-compatible.
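The cache-aside pattern described above can be sketched in a few lines. A dict stands in for a Valkey connection here; against a real server you would use any Redis-compatible client (Valkey speaks the Redis protocol) and a SET with an EX expiry, and the function names are illustrative.

```python
import time

cache = {}          # key -> (value, expires_at); stand-in for Valkey
TTL_SECONDS = 60    # on a real server: SET key value EX 60

def expensive_lookup(product_id):
    # Stand-in for a slow database query or API call.
    return {"id": product_id, "name": f"product-{product_id}"}

def get_product(product_id):
    key = f"product:{product_id}"
    hit = cache.get(key)
    if hit is not None and hit[1] > time.time():
        return hit[0]                       # cache hit: in-memory read
    value = expensive_lookup(product_id)    # cache miss: do the real work
    cache[key] = (value, time.time() + TTL_SECONDS)
    return value

first = get_product(7)     # miss -> populates the cache
second = get_product(7)    # hit  -> served from memory
assert first == second
```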

Valkey is ideal for:

  • Ecommerce and retail applications: Caching helps handle product catalog searches, shopping cart sessions, flash-sale traffic spikes, inventory availability, and personalized recommendations.

  • Gaming applications: Caching is used for maintaining real-time leaderboards, player session states, matchmaking data, and in-game event tracking.

  • Media and streaming services: Caching can maintain content metadata lookups, user viewing preferences, CDN sidecar caches, and real-time analytics.

Real-time analytics

Valkey is widely used for real-time analytics in industries that need fast, high-volume, low-latency data processing. Valkey offers microsecond operations, in-memory storage, streams, pub/sub, and Redis-compatible commands. It is ideal for real-time pipelines that need instant insights, not batch processing.

  • Ecommerce applications: Use real-time analytics for personalized recommendations, dynamic pricing, cart abandonment analytics, real-time user behavior tracking and inventory, and demand forecasting.

  • Advertising and marketing tech: Use real-time analytics for real-time bidding (RTB), campaign performance metrics, audience segmentation, and clickstream analytics.

  • IoT, smart devices, and industrial automation: Use real-time analytics for sensor data monitoring, machine performance analytics, predictive maintenance signals, and environmental anomaly detection.
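As a minimal illustration of the rolling metrics such pipelines compute, the sketch below maintains a sliding-window event count in pure Python; in production the window would typically be backed by a Valkey sorted set or stream, and the class name here is made up for the example.

```python
from collections import deque

class RollingCounter:
    """Count events inside a sliding time window (stand-in for a sorted set)."""

    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events = deque()  # event timestamps, oldest first

    def record(self, now: float) -> None:
        self.events.append(now)

    def count(self, now: float) -> int:
        # Evict events that have fallen out of the window.
        while self.events and self.events[0] <= now - self.window:
            self.events.popleft()
        return len(self.events)

c = RollingCounter(window_seconds=10.0)
c.record(0.0); c.record(4.0); c.record(9.0)
assert c.count(9.5) == 3    # all three events within the last 10s
assert c.count(15.0) == 1   # events at t=0 and t=4 have aged out
```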

Session storage

Valkey is an excellent fit for session storage because it provides fast, in-memory access, time to live (TTL) expiration, and seamless integration with web and mobile applications. As a result, it is used across many industries that require low-latency authentication, user tracking, and high concurrency.

  • Travel, hospitality, and transportation applications: Use session storage for supporting booking sessions, user journey continuity, and search sessions (flights, hotels, etc.).

  • Social media and messaging platforms: Use session storage for user login sessions, presence/online status, rate limits tied to sessions, and temporary interaction state.

  • Software as a service (SaaS) and cloud applications: Use session storage for multi-tenant authentication, feature flags and user preferences, API session tokens, and single sign-on state management.
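A minimal session store with TTL expiry can be sketched as follows. The dict stands in for Valkey, where each session would be stored with SET and an EX/TTL option; the function names are illustrative, not part of any Valkey API.

```python
import secrets

SESSION_TTL = 30 * 60        # 30 minutes, like SET session:<id> ... EX 1800

sessions = {}                # session_id -> (user_data, expires_at)

def create_session(user_id: str, now: float) -> str:
    session_id = secrets.token_hex(16)
    sessions[session_id] = ({"user_id": user_id}, now + SESSION_TTL)
    return session_id

def get_session(session_id: str, now: float):
    entry = sessions.get(session_id)
    if entry is None or entry[1] <= now:
        sessions.pop(session_id, None)   # expired, like TTL eviction
        return None
    return entry[0]

sid = create_session("user-1", now=0.0)
assert get_session(sid, now=60.0) == {"user_id": "user-1"}
assert get_session(sid, now=SESSION_TTL + 1) is None   # expired
```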

Rate limiting and throttling

Valkey is commonly used for rate limiting and throttling in industries that must control traffic, prevent abuse, and maintain reliable, low-latency service under heavy load. Because Valkey provides atomic operations, fast counters, TTL support, and microsecond latency, it is ideal for enforcing per-user, per-IP, or per-API limits.

  • Cybersecurity and identity providers: Use rate limiting for brute-force attack protection, user/IP request throttling, behavior-based rate limits, and access token issuance limits.

  • Telecommunications and ISPs: Use rate limiting for API consumption limits, device/IoT traffic throttling, and network request rate control.

  • Fintech and banking: Use rate limiting for secure login attempt throttling, transaction request rate limits, fraud prevention thresholds, and API access controls for partners.
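The fixed-window pattern described above (an atomic counter per user per time window, expired by TTL) reduces to the following sketch. A dict stands in for the server; on Valkey this is an INCR on a window-scoped key plus an EXPIRE that lets the window clean itself up.

```python
LIMIT = 5        # max requests allowed
WINDOW = 60      # per 60-second window

counters = {}    # key -> count; stand-in for Valkey counters

def allow_request(user_id: str, now: float) -> bool:
    window_id = int(now // WINDOW)
    key = f"rate:{user_id}:{window_id}"   # e.g. rate:alice:29123
    count = counters.get(key, 0) + 1      # INCR key (atomic on the server)
    counters[key] = count                 # EXPIRE key WINDOW sets the TTL
    return count <= LIMIT

results = [allow_request("alice", now=10.0) for _ in range(7)]
assert results == [True] * 5 + [False] * 2   # 6th and 7th requests rejected
```

Because INCR is atomic, concurrent requests from many application servers cannot race past the limit, which is why this pattern is usually enforced in the data store rather than in application memory.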

Queueing and streaming

Valkey is widely used for queueing and streaming across industries that need high-throughput, low-latency, real-time event processing. Its Redis-compatible lists, streams, pub/sub, and atomic operations make it a strong fit for background job queues, real-time pipelines, and event-driven architectures.

  • Ecommerce and online retail applications: Use queueing and streaming for order-processing pipelines, inventory updates, notification/event queues, and real-time user behavior streams.

  • Transportation, logistics, and travel applications: Use queueing for real-time vehicle tracking events, shipment workflow steps, route optimization pipelines, and booking workflow queues.

  • SaaS and cloud platforms: Use queueing for background job queues (emails, billing, notifications), task scheduling, distributed worker systems, and event-driven microservices.
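The list-based job queue pattern (producers LPUSH jobs onto a list key, workers pop them off the other end with RPOP or the blocking BRPOP) reduces to the sketch below, with a deque standing in for the server-side list.

```python
from collections import deque

queue = deque()   # stand-in for a Valkey list key, e.g. "jobs"

def enqueue(job: dict) -> None:
    queue.appendleft(job)                    # LPUSH jobs <payload>

def dequeue():
    return queue.pop() if queue else None    # RPOP jobs (BRPOP would block)

enqueue({"task": "send_email", "to": "a@example.com"})
enqueue({"task": "bill_customer", "id": 42})

first = dequeue()
assert first == {"task": "send_email", "to": "a@example.com"}  # FIFO order
```

Pushing on one end and popping on the other gives first-in, first-out delivery, and multiple workers can pop from the same list to distribute jobs.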

Valkey with vector extension powers AI workflows

Valkey can function as a vector database, making it suitable for embedding-based search, similarity matching, inference, and retrieval-augmented generation (RAG). Essentially, it stores vectors (numerical embeddings) in memory and allows fast similarity queries using Approximate Nearest Neighbor (ANN) algorithms.

Valkey can be used to: 

  • Store high-dimensional vectors (embeddings) generated from large language models (LLMs), images, audio, or other data 

  • Support fast similarity search using ANN methods like hierarchical navigable small world (HNSW) graphs

  • Allow hybrid search by combining vectors with structured metadata or full-text search 

  • Provide real-time indexing and retrieval, suitable for applications that need low-latency search
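Independent of any particular vector API, the core computation such a store performs is a similarity comparison between a query embedding and stored embeddings. The brute-force sketch below shows that computation; real deployments replace the linear scan with an ANN index such as HNSW, and the tiny three-dimensional embeddings here are made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Stand-in vector index: key -> embedding (in practice, LLM-generated).
index = {
    "doc:cats":   [0.9, 0.1, 0.0],
    "doc:dogs":   [0.8, 0.3, 0.1],
    "doc:stocks": [0.0, 0.2, 0.9],
}

query = [0.85, 0.2, 0.05]   # hypothetical embedding of a "pets" query
best = max(index, key=lambda k: cosine_similarity(query, index[k]))
assert best in ("doc:cats", "doc:dogs")   # the pet documents, not stocks
```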

This makes Valkey an excellent low-latency vector store for LLM retrieval systems. Currently, Akamai Inference Cloud users are limited to the pgvector and pgvectorscale extensions of Akamai's managed PostgreSQL database. When Valkey becomes available, users will have a second choice of vector-capable database.

Have you explored Akamai Inference Cloud yet?

Valkey delivers value

Valkey represents the best of open source innovation — it’s transparent, community-powered, and engineered for speed. Whether you're building real-time applications, optimizing back-end systems, or scaling microservices, Valkey offers the robust and open alternative the ecosystem needs.

Use Valkey if you want Redis-like performance, features, and APIs with no licensing limitations backed by a strong open source community.

It's more than a Redis replacement: It’s the evolution of in-memory data infrastructure.

Learn more

Are you interested in the Valkey Managed Database to power your applications? Register for the waiting list today.

