We are using Redis Streams in our application, with two Docker services: Service1 (Java) publishes data to the stream, and Service2 (C++) consumes it and, after consuming, deletes each entry from the stream with XDEL.
The requirement is: if Service2 goes down, data should accumulate in the stream until the configured maxmemory limit is exhausted (we will set maxmemory to around 50-100 MB). Once that limit is reached, Redis should delete the oldest entries one by one.
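For context, this is a minimal redis.conf fragment with the eviction settings we are testing (the 50mb value is one point in our 50-100 MB range):

```
# Cap Redis memory usage; once exceeded, the eviction policy kicks in
maxmemory 50mb
# Evict least-recently-used keys across ALL keys -- note this operates
# on whole keys, not on individual stream entries
maxmemory-policy allkeys-lru
```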
But the current behavior with this config (maxmemory 50mb, maxmemory-policy allkeys-lru) is that Redis does not delete entries one by one; it deletes the whole stream at once. Example:
(integer) 93
127.0.0.1:6379> xlen MyStream
(integer) 94
127.0.0.1:6379> xlen MyStream
(integer) 0
127.0.0.1:6379> xlen MyStream
(integer) 1
127.0.0.1:6379> xlen MyStream
(integer) 2
Is there any solution or configuration that meets this requirement?