Memorystore

Fully managed in-memory Valkey, Redis*, and Memcached service that offers sub-millisecond data access, scalability, and high availability for a wide range of applications.

  • 100% compatible with open source Valkey, Redis Cluster, Redis, and Memcached

  • Migrate your caching layer to the cloud with zero code changes

  • High availability, up to a 99.99% SLA

Benefits

Focus on building great apps

Memorystore automates complex operational tasks for open source Valkey, Redis Cluster, Redis, and Memcached, such as high availability, failover, patching, and monitoring, so you can spend more time building your applications.

Simplified scaling

Memorystore for Valkey and Memorystore for Redis Cluster scale without downtime to support up to 250 nodes, terabytes of keyspace, and up to 60x more throughput than Memorystore for Redis, with microsecond latencies.

Highly available

Memorystore for Valkey and Redis Cluster have zero-downtime scaling, automatically distributed replicas across availability zones, and automated failover. Memorystore for Redis Cluster also offers a 99.99% SLA.

Key features

Choice of engines

Choose from the most popular open source caching engines to build your applications. Memorystore supports Valkey, Redis Cluster, Redis, and Memcached and is fully protocol compatible. Choose the right engine that fits your cost and availability requirements.

Connectivity

Memorystore for Valkey and Memorystore for Redis Cluster are available with Private Service Connect (PSC) to simplify management and to offer secure, private, and granular connectivity with minimal IP consumption. All services are integrated with Cloud Monitoring and more; Memorystore is built on the best of Google Cloud.

Vector search

Use Memorystore for Redis as an ultra-low latency data store for your generative AI applications. Approximate nearest neighbor (ANN) vector search (in preview) delivers fast, approximate results—ideal for large datasets where a close match is sufficient. Exact nearest neighbor (KNN) vector search (in preview) returns exact results at the cost of additional processing time.
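Below is a minimal Python sketch of the ANN flow using redis-py raw commands: create an HNSW vector index, store a hash with a binary-packed embedding, and run a KNN query. The command syntax mirrors the RediSearch-style FT.CREATE/FT.SEARCH interface exposed by the preview; the endpoint IP, index name, and field names are placeholders, so verify the exact parameters against the Memorystore documentation.

    import numpy as np
    import redis

    r = redis.Redis(host="10.0.0.3", port=6379)  # placeholder Memorystore endpoint

    # Create an HNSW vector index over hashes with the "doc:" prefix.
    r.execute_command(
        "FT.CREATE", "doc_idx", "ON", "HASH", "PREFIX", "1", "doc:",
        "SCHEMA", "embedding", "VECTOR", "HNSW", "6",
        "TYPE", "FLOAT32", "DIM", "4", "DISTANCE_METRIC", "COSINE",
    )

    # Store a document with a binary-packed float32 embedding.
    vec = np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32).tobytes()
    r.hset("doc:1", mapping={"embedding": vec})

    # ANN query: the 3 nearest neighbors to the query vector.
    results = r.execute_command(
        "FT.SEARCH", "doc_idx", "*=>[KNN 3 @embedding $vec]",
        "PARAMS", "2", "vec", vec, "DIALECT", "2",
    )
    print(results)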

Fully managed

Provisioning, replication, failover, and patching are all automated, which drastically reduces the time you spend on DevOps.

View all features
Had we known the full scope of benefits from switching to Memorystore earlier, we could have saved more engineering time for delivering value to other parts of our e-commerce platform.

Dennis Turko, Staff Software Engineer, Instacart

Read the blog

Documentation

Google Cloud Basics

Memorystore for Valkey Overview

Read about the benefits, use cases, and features of Memorystore for Valkey. The overview also provides key details about the service.

Google Cloud Basics

Memorystore for Redis Cluster overview

Read about the benefits, use cases, and features of Memorystore for Redis Cluster. The overview also provides key details about the service.

Google Cloud Basics

Memorystore for Redis standalone overview

Read about the use cases and features of the Memorystore for Redis standalone service. The overview also provides key details about the service.

Google Cloud Basics

Memorystore for Memcached overview

This page introduces the Memorystore for Memcached service, including use cases, key concepts, and the advantages of using Memcached.

Quickstart

Redis quickstart using the UI

Learn how to create a Memorystore for Redis instance, connect to the instance, set a value, retrieve a value, and delete the instance.
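The quickstart itself walks through the console UI; as a rough equivalent, the following Python sketch performs the same steps with the google-cloud-redis client library and redis-py. Project, region, and instance IDs are placeholders, and the client must run inside the instance's authorized VPC network to connect.

    import redis
    from google.cloud import redis_v1

    client = redis_v1.CloudRedisClient()
    parent = "projects/my-project/locations/us-central1"

    # Create a 1 GB Basic tier instance (a long-running operation).
    operation = client.create_instance(
        parent=parent,
        instance_id="my-instance",
        instance=redis_v1.Instance(
            tier=redis_v1.Instance.Tier.BASIC,
            memory_size_gb=1,
        ),
    )
    instance = operation.result()

    # Connect from inside the authorized VPC, set a value, and read it back.
    r = redis.Redis(host=instance.host, port=instance.port)
    r.set("greeting", "hello")
    print(r.get("greeting"))  # b'hello'

    # Delete the instance when you are done.
    client.delete_instance(name=instance.name).result()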

Tutorial

Connecting to a Redis instance

Learn how to access Redis instances from Compute Engine, GKE clusters, Cloud Functions, the App Engine flexible environment, and the App Engine standard environment.
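For reference, a minimal connection from any of those environments looks the same once the client has network access to the instance's private IP; the environment variable names below are placeholders.

    import os
    import redis

    redis_host = os.environ.get("REDISHOST", "10.0.0.3")   # instance private IP
    redis_port = int(os.environ.get("REDISPORT", "6379"))

    r = redis.Redis(host=redis_host, port=redis_port)
    r.incr("visits")        # increment a simple page counter
    print(r.get("visits"))  # read the counter back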


All features

Choice of engines

Choose from the most popular open source caching engines to build your applications. Memorystore supports Valkey, Redis Cluster, Redis, and Memcached and is fully protocol compatible. Choose the right engine that fits your cost and availability requirements.

Connectivity

Memorystore for Valkey and Memorystore for Redis Cluster are available with Private Service Connect (PSC) to simplify management and to offer secure, private, and granular connectivity with minimal IP consumption. Memorystore for Redis and Memcached support Private Service Access (PSA) and Direct Peering to offer connectivity using private IP.

LangChain integration

Easily build gen AI applications that are more accurate, transparent, and reliable with LangChain integration. Memorystore for Redis has three LangChain integrations—Document loader for loading and storing information from documents, Vector stores for enabling semantic search, and Chat Messages Memory for enabling chains to recall previous conversations. Visit the GitHub repository to learn more.
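As a sketch of the Chat Messages Memory integration, the example below stores conversation history in a Memorystore for Redis instance. The package and class names (langchain_google_memorystore_redis, MemorystoreChatMessageHistory) and the constructor arguments are assumptions based on the integration's GitHub repository, so check the repository's README for the current API; the instance IP is a placeholder.

    import redis
    from langchain_google_memorystore_redis import MemorystoreChatMessageHistory

    # Placeholder Memorystore for Redis endpoint inside your VPC.
    client = redis.Redis(host="10.0.0.3", port=6379)

    # Persist chat history for one session so a chain can recall prior turns.
    history = MemorystoreChatMessageHistory(client=client, session_id="user-42")
    history.add_user_message("What is Memorystore?")
    history.add_ai_message("A fully managed in-memory service on Google Cloud.")

    print([message.content for message in history.messages])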

Vector search

Use Memorystore for Redis as an ultra-low latency data store for your generative AI applications. Approximate nearest neighbor (ANN) vector search (in preview) delivers fast, approximate results—ideal for large datasets where a close match is sufficient. Exact nearest neighbor (KNN) vector search (in preview) returns exact results at the cost of additional processing time.

Fully managed

Provisioning, replication, failover, and patching are all automated, which drastically reduces the time you spend doing DevOps.

Persistence

Achieve near-zero Recovery Point Objectives (RPO) through continuous write logging, or set up periodic RDB snapshots, for heightened resilience against zonal failures.
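For example, RDB snapshots can be configured when creating a Memorystore for Redis instance with the google-cloud-redis client library. This is a sketch under the assumption that the v1 API's PersistenceConfig fields are named as shown; the project, region, and instance IDs are placeholders.

    from google.cloud import redis_v1

    client = redis_v1.CloudRedisClient()

    # Create a Standard tier instance that takes an RDB snapshot every 12 hours.
    operation = client.create_instance(
        parent="projects/my-project/locations/us-central1",
        instance_id="my-cache",
        instance=redis_v1.Instance(
            tier=redis_v1.Instance.Tier.STANDARD_HA,
            memory_size_gb=5,
            persistence_config=redis_v1.PersistenceConfig(
                persistence_mode=redis_v1.PersistenceConfig.PersistenceMode.RDB,
                rdb_snapshot_period=redis_v1.PersistenceConfig.SnapshotPeriod.TWELVE_HOURS,
            ),
        ),
    )
    print(operation.result().persistence_config)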

Security

Memorystore is protected from the internet using VPC networks and private IP and comes with IAM integration—all designed to protect your data. Systems are monitored 24/7/365, ensuring your applications and data are protected. Memorystore for Redis provides in-transit encryption and Redis AUTH to further secure your sensitive data.
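When in-transit encryption and AUTH are enabled on a Memorystore for Redis instance, a client connects over TLS using the instance's Certificate Authority and AUTH string. The sketch below uses redis-py; the host, AUTH string, and CA file path are placeholders (TLS-enabled instances listen on port 6378).

    import redis

    r = redis.Redis(
        host="10.0.0.3",            # placeholder instance IP
        port=6378,                  # TLS-enabled instances use port 6378
        password="AUTH_STRING",     # the instance's AUTH string
        ssl=True,
        ssl_ca_certs="/path/to/server-ca.pem",  # downloaded server CA
    )
    r.ping()  # raises an error if TLS or AUTH is misconfigured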

Highly scalable

Memorystore for Valkey and Memorystore for Redis Cluster provide zero-downtime scaling up to 250 nodes, flexible node sizes from 1.4 GB to 58 GB, more than 10 TB of keyspace per instance, and up to 60x more throughput than Memorystore for Redis, with microsecond latencies.

Monitoring

Monitor your instance and set up custom alerts with Cloud Monitoring. You can also integrate with OpenCensus to get more insight into client-side metrics.
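As an illustration, the Cloud Monitoring API can be queried for Memorystore metrics from Python. The metric type below (redis.googleapis.com/stats/memory/usage_ratio) is a commonly referenced Memorystore for Redis metric and the project ID is a placeholder; confirm the metric name against the Memorystore metrics list.

    import time
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    now = int(time.time())
    interval = monitoring_v3.TimeInterval(
        {"start_time": {"seconds": now - 3600}, "end_time": {"seconds": now}}
    )

    # Read the last hour of memory usage ratio data points.
    series = client.list_time_series(
        request={
            "name": "projects/my-project",
            "filter": 'metric.type = "redis.googleapis.com/stats/memory/usage_ratio"',
            "interval": interval,
            "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
        }
    )
    for ts in series:
        for point in ts.points:
            print(point.interval.end_time, point.value.double_value)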

Highly available

Memorystore for Redis Cluster offers a 99.99% SLA with automatic failover. Shards are automatically distributed across zones for maximum availability. Standard tier Memorystore for Redis instances provide a 99.9% availability SLA with automatic failover to ensure that your instance is highly available. Memcached instances come with the same 99.9% availability SLA.

Migration

Memorystore is compatible with the open source protocols, which makes it easy to switch your applications over with no code changes. You can use the RIOT tool to seamlessly migrate existing Redis deployments to Memorystore for Valkey or Memorystore for Redis Cluster.
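RIOT handles live replication end to end; as a much simpler illustration of the same idea, the Python sketch below copies keys from an existing deployment to a Memorystore endpoint with Redis's DUMP and RESTORE commands. This is an alternative illustration, not the RIOT tool itself, and the host addresses are placeholders.

    import redis

    source = redis.Redis(host="old-redis.internal", port=6379)  # existing deployment
    target = redis.Redis(host="10.0.0.3", port=6379)            # Memorystore endpoint

    # Copy every key, preserving TTLs; DUMP/RESTORE requires compatible
    # Redis versions on both ends.
    for key in source.scan_iter(count=1000):
        payload = source.dump(key)
        if payload is None:
            continue  # key expired between SCAN and DUMP
        ttl_ms = source.pttl(key)
        target.restore(key, ttl_ms if ttl_ms > 0 else 0, payload, replace=True)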

Pricing

Memorystore offers various sizes to fit any budget. Pricing varies with settings, including how much capacity, how many replicas, and which region you provision. Memorystore also offers per-second billing, and instances are easy to start and stop.

View Memorystore for Redis Cluster pricing

View Memorystore for Redis pricing

View Memorystore for Memcached pricing


*Redis is a trademark of Redis Ltd. All rights therein are reserved to Redis Ltd. Any use by Google is for referential purposes only and does not indicate any sponsorship, endorsement or affiliation between Redis and Google. Memorystore is based on and is compatible with open-source Redis versions 7.2 and earlier and supports a subset of the total Redis command library.

Take the next step

Start building on Google Cloud with $300 in free credits and 20+ always free products.
