uwsgi_cache_lock

The `uwsgi_cache_lock` directive allows for cache locking to prevent the same content from being generated multiple times when it is not yet available in the cache.

Syntax: uwsgi_cache_lock on | off;
Default: uwsgi_cache_lock off;
Context: http, server, location
Arguments: flag

Description

The uwsgi_cache_lock directive enables or disables cache locking for NGINX's uWSGI caching mechanism. When set to 'on', only one request at a time is allowed to populate a new cache element; concurrent requests for the same element wait until the response appears in the cache or the lock is released, then are served from the cache. This avoids duplicate generation of the same response, improving backend resource utilization under simultaneous requests. By default the directive is off, so several concurrent requests for the same uncached resource are each passed to the backend, and the content may be generated multiple times.

This directive interacts with the uwsgi_cache directive, which selects the cache zone used for caching uWSGI responses. Leaving uwsgi_cache_lock set to 'off' allows multiple simultaneous requests for the same uncached resource, potentially increasing load on the application backend. The directive can be specified in the http, server, or location context, giving flexibility depending on your caching strategy and architecture. Cache behavior can be fine-tuned further with the related lock directives, such as how long waiters block and how long a single request may hold the lock.
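The waiting behavior can be bounded with the related uwsgi_cache_lock_timeout and uwsgi_cache_lock_age directives. A minimal sketch (the zone name my_cache and the 5s values are illustrative; 5s is also the built-in default for both):

uwsgi_cache my_cache;
uwsgi_cache_lock on;
uwsgi_cache_lock_timeout 5s;  # after 5s a waiting request is passed to the backend, response not cached
uwsgi_cache_lock_age 5s;      # if the locking request has not finished in 5s, one more request may be passed on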

Config Example

# The zone must be defined with uwsgi_cache_path in the http context
# (path and zone size here are examples).
uwsgi_cache_path /var/cache/nginx/uwsgi keys_zone=my_cache:10m;

uwsgi_cache my_cache;
uwsgi_cache_lock on;

location /example {
    uwsgi_pass my_app;
    uwsgi_cache my_cache;
}

Ensure that a cache zone is defined with uwsgi_cache_path and selected with uwsgi_cache before enabling the lock; without an active cache, uwsgi_cache_lock has no effect.

Leave uwsgi_cache_lock off if your backend can cheaply tolerate generating the same uncached resource for several simultaneous requests, since locking adds waiting time for all but the first request.
