Redis

Redis is an open source (BSD licensed), in-memory data structure store used as a database, cache, message broker, and streaming engine. Redis provides data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams. Redis has built-in replication, Lua scripting, LRU eviction, transactions, and different levels of on-disk persistence, and provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
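
As a quick illustration of a few of these data structures, here is a minimal sketch using the redis-py client. It assumes a local Redis server on the default port, and the key names are made up for the example.

    import redis

    # Assumes a local Redis server at localhost:6379.
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # Strings: simple key/value, with atomic counters.
    r.set("page:home:hits", 1)
    r.incr("page:home:hits")

    # Hashes: field/value maps stored under a single key.
    r.hset("user:1000", mapping={"name": "Ada", "plan": "pro"})

    # Sorted sets: scored members that support range queries.
    r.zadd("leaderboard", {"ada": 420, "bob": 310})
    print(r.zrange("leaderboard", 0, -1, withscores=True))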

To achieve top performance, Redis works with an in-memory dataset. Depending on your use case, Redis can persist your data either by periodically dumping the dataset to disk or by appending each command to a disk-based log. You can also disable persistence if you just need a feature-rich, networked, in-memory cache.
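
The sketch below toggles these persistence modes at runtime via the standard CONFIG SET command, using redis-py. In production these settings would typically live in redis.conf; the host and the snapshot thresholds here are illustrative.

    import redis

    r = redis.Redis(host="localhost", port=6379)

    # RDB snapshotting: dump the dataset to disk if at least
    # 1000 keys changed within 60 seconds.
    r.config_set("save", "60 1000")

    # AOF: append each write command to a disk-based log,
    # fsynced once per second.
    r.config_set("appendonly", "yes")
    r.config_set("appendfsync", "everysec")

    # Pure in-memory cache: disable persistence entirely.
    # r.config_set("save", "")
    # r.config_set("appendonly", "no")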

If you are interested in a direct Decodable Connector for Redis, please contact support@decodable.co or join our Slack community and let us know!

Getting Started

Sending a Decodable data stream to Redis is accomplished in two stages: first, create a Decodable sink connector to an intermediate system that can feed Redis, and then connect that system to your Redis instance. Decodable and Redis mutually support several technologies, including Confluent Cloud.

Configure As A Sink

This example demonstrates using Confluent Cloud as the sink from Decodable and the source for Redis. Sign in to Decodable Web and follow the configuration steps provided for the Confluent Cloud connector to create a sink connector. For examples of using the command-line tools or scripting, see the How To guides.
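
Once the Decodable sink connector is running, you can verify that records are arriving in the Confluent Cloud topic before wiring up Redis. A minimal sketch with the confluent-kafka Python client follows; the bootstrap server, API key and secret, and topic name are placeholders for your own values.

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<CONFLUENT_API_KEY>",
        "sasl.password": "<CONFLUENT_API_SECRET>",
        "group.id": "decodable-sink-check",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["my_decodable_topic"])

    try:
        # Poll a handful of times and print whatever records arrive.
        for _ in range(10):
            msg = consumer.poll(timeout=5.0)
            if msg is None or msg.error():
                continue
            print(msg.key(), msg.value())
    finally:
        consumer.close()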

Create Confluent Cloud Data Source

The Kafka Connect Redis Sink connector for Confluent Cloud is used to export data from Apache Kafka® topics to Redis. The connector works with Redis Enterprise Cloud, Azure Cache for Redis, and Amazon ElastiCache for Redis. Its primary features are:

  • At least once delivery: The connector guarantees that records are delivered at least once.

  • Supports multiple tasks: The connector supports running one or more tasks.

  • SSL support: Supports one-way SSL.

  • Deletions: The connector supports deletions. If the record stored in Kafka has a null value, the connector sends a delete message with the corresponding key to Redis (see the sketch after this list).

  • Supported input data formats: This connector supports storing raw bytes or strings (as inserts) in Redis.
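
To make the delivery and deletion semantics concrete, the sketch below produces an upsert followed by a tombstone using the confluent-kafka Python client. When the Redis Sink connector reads these records, the first sets the key in Redis and the second deletes it. The connection settings and topic name are placeholders.

    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<CONFLUENT_API_KEY>",
        "sasl.password": "<CONFLUENT_API_SECRET>",
    })

    # Upsert: the connector stores this key/value pair in Redis.
    producer.produce("my_decodable_topic", key=b"user:1000", value=b"active")

    # Delete: a null value is a tombstone; the connector removes the key.
    producer.produce("my_decodable_topic", key=b"user:1000", value=None)

    producer.flush()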

Use the Confluent Cloud console to perform the following steps:

  1. Launch your Confluent Cloud cluster

  2. Add a Redis Sink connector

  3. Enter the connector details

    • Select a topic

    • Provide Kafka credentials

    • Enter your Redis connection details

  4. Check the results in Redis
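
One way to check the results is to read a key back with redis-py, as in the sketch below. The hostname, password, and key are placeholders for your Redis Enterprise Cloud, Azure Cache for Redis, or Amazon ElastiCache instance.

    import redis

    # One-way SSL, matching the connector's SSL support.
    r = redis.Redis(
        host="<your-redis-host>",
        port=6379,
        password="<your-redis-password>",
        ssl=True,
        decode_responses=True,
    )

    # The connector writes each Kafka record as a key/value pair.
    print(r.get("user:1000"))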

For more detailed information, please refer to Confluent Cloud’s Redis Sink connector documentation.

Reference

Connector name       redis
Type                 sink
Delivery guarantee   at least once


Apache Kafka, Kafka®, Apache® and associated open source project names are either registered trademarks or trademarks of The Apache Software Foundation.