
Redis client-side caching connection deadlocks when health_check_interval is non-zero #3470

Open
michael-chaos opened this issue Dec 30, 2024 · 3 comments


michael-chaos commented Dec 30, 2024

Version: 5.2.1

Platform: Windows 11 WSL -- Ubuntu 22.04.5 LTS (kernel 5.15.167.4-microsoft-standard-WSL2 SMP)

Description: Deadlock encountered when performing a set() call on a Redis connection with client-side caching enabled and the health_check_interval initializer arg non-zero.

        client = redis.Redis(  # renamed from `redis = ...` to avoid shadowing the redis module
            host=<host>,
            port=<port>,
            db=<db>,
            encoding='utf-8',
            protocol=3,  # RESP3 protocol is required for client-side caching -- it is also supposed to be more performant
            cache_config=redis.cache.CacheConfig(),  # Use client-side caching for performance
            decode_responses=True,
            retry_on_error=[redis.exceptions.TimeoutError, redis.exceptions.BusyLoadingError, redis.exceptions.ConnectionError],
            retry=redis.retry.Retry(redis.backoff.ExponentialBackoff(), 3),
            health_check_interval=1,
        )

Stack Trace:

_on_invalidation_callback (connection.py:929)	<<< CacheProxyConnection._on_invalidation_callback() performs `with self._cache_lock`
handle_push_response (resp3.py:136)
_read_response (resp3.py:116)
read_response (resp3.py:28)
read_response (connection.py:588)
_send_ping (connection.py:511)
call_with_retry (retry.py:62)
check_health (connection.py:521)
send_packed_command (connection.py:529)
send_command (connection.py:556)
send_command (connection.py:819)		<<< CacheProxyConnection.send_command() performs `with self._cache_lock`
_send_command_parse_response (client.py:541)
<lambda> (client.py:568)
call_with_retry (retry.py:62)
_execute_command (client.py:567)
execute_command (client.py:559)
set (core.py:2335)

With my limited understanding of the Redis server and client implementations, it appears that while reading the "PONG" response to the health-check PING, the connection first encounters a cache invalidation push notification from the server (i.e., some kind of multi-part response).

Within the connection.CacheProxyConnection class, self._cache_lock is a non-reentrant threading.Lock object. As the stack trace shows, send_command() acquires the lock and then, while still holding it, the same thread reaches _on_invalidation_callback(), which tries to acquire it again. I suspect that changing it to a reentrant threading.RLock object might resolve this issue.
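A minimal sketch of the lock behavior being described (standalone, not using redis-py): a plain threading.Lock deadlocks when the thread that holds it tries to acquire it again, whereas threading.RLock tracks its owner and a recursion count, so re-acquisition by the same thread succeeds. The non-blocking acquire below makes the failure visible instead of hanging.

```python
import threading

# A plain Lock cannot be re-acquired by the thread that already holds it;
# a second blocking acquire() would deadlock exactly as in the stack trace
# above (send_command holds the lock, _on_invalidation_callback re-acquires).
plain = threading.Lock()
plain.acquire()
print(plain.acquire(blocking=False))  # False: same thread cannot re-enter
plain.release()

# An RLock records its owning thread and a recursion count, so the same
# thread can acquire it again; it must release once per acquire.
reentrant = threading.RLock()
reentrant.acquire()
print(reentrant.acquire(blocking=False))  # True: re-entrant acquisition succeeds
reentrant.release()
reentrant.release()
```

Note that switching to an RLock only removes the self-deadlock; whether the cache state is safe to mutate re-entrantly at that point is a separate question for the maintainers.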

@michael-chaos michael-chaos changed the title Redis client-side caching connection get() deadlocks when health_check_interval is non-zero Redis client-side caching connection deadlocks when health_check_interval is non-zero Dec 31, 2024
@vladvildanov
Collaborator

Hi! Could you please provide more information on how to reproduce the actual deadlock? I tried a simple get()/set() scenario and could not reproduce it. Also, retry_on_error cannot be a boolean; it is expected to be a list of exception types the client should retry on.

@michael-chaos
Author

michael-chaos commented Jan 10, 2025

Hi @vladvildanov, the retry_on_error value was a typo on my part; it should be...

retry_on_error=[redis.exceptions.TimeoutError, redis.exceptions.BusyLoadingError, redis.exceptions.ConnectionError],

(I am making this correction in the original description)

Regarding how to reproduce: I have two separate applications independently reading and writing to the Redis DB in parallel. In this particular case, one process was periodically calling set() on one key while the other sporadically called set() on another key. I will see what additional details I can gather about this.

@michael-chaos
Author

Another mistake I made was saying that the deadlock occurred during a call to get() -- it was actually a call to set(), and I have corrected the description. Sorry.
