News

LLM query caching also lands soon
The return of Redis creator Salvatore Sanfilippo has borne fruit in the form of a new data ...
The company introduces LangCache, a fully managed semantic caching service that integrates LLM response caching into AI apps, and vector sets, a new native data type specialized for vector similarity ...
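Semantic caching of this kind returns a stored LLM response when a new query is similar enough in meaning to a previously seen one, rather than requiring an exact key match. The sketch below illustrates the idea only; the `SemanticCache` class and the toy `embed` function are hypothetical and do not reflect LangCache's actual API, which a real deployment would use with a learned embedding model and a vector index.

```python
import math

def embed(text):
    # Toy bag-of-characters embedding for illustration only;
    # a real system would use a sentence-embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    """Return a cached response when a new query's embedding is
    close enough to a previously cached query's embedding."""
    def __init__(self, threshold=0.95):
        self.threshold = threshold
        self.entries = []  # list of (embedding, response) pairs

    def get(self, query):
        qv = embed(query)
        best, best_sim = None, 0.0
        for ev, response in self.entries:
            sim = cosine(qv, ev)
            if sim > best_sim:
                best, best_sim = response, sim
        return best if best_sim >= self.threshold else None

    def put(self, query, response):
        self.entries.append((embed(query), response))
```

A cache hit on a near-duplicate query avoids a second LLM call entirely, which is the cost and latency win such services aim for.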
"Our collaboration continues to support integrated solutions like Azure Cache for Redis, and will provide Microsoft customers with exclusive access to expanded features within Redis offerings." ...
Redis is the world's fastest data platform. From its open source origins in 2011 to becoming the #1 cited brand for caching solutions, Redis has helped more than 10,000 customers build, scale, and ...