proxy_cache_background_update

The 'proxy_cache_background_update' directive allows updating the cache in the background while serving stale responses. — NGINX HTTP Proxy module

proxy_cache_background_update
Syntax: proxy_cache_background_update on | off;
Default: off
Context: http, server, location
Module: ngx_http_proxy_module
Arguments: flag

Description

The 'proxy_cache_background_update' directive is a flag in the NGINX HTTP Proxy module that determines whether NGINX starts a background subrequest to update an expired cache item while a stale cached response is returned to the client. Note that NGINX does not serve stale responses by default: for the background update to take effect, serving a stale response while it is being updated must be permitted with 'proxy_cache_use_stale updating;'.

When the directive is set to 'on' and a request arrives for an expired cache item, NGINX serves the stale response immediately and, in parallel, fetches the updated resource from the upstream, so that subsequent requests receive the fresh version with minimal latency. This behavior is useful when you want to maintain high availability and fast response times while keeping cached resources reasonably up to date. For example, in environments where data changes frequently, it reduces perceived latency for users while smoothing load on the upstream. Setting the directive to 'off' (the default) disables background updates: an expired item is refreshed synchronously by the request that finds it stale.

Example configuration

http {
    proxy_cache_path /tmp/cache levels=1:2 keys_zone=my_cache:10m max_size=1g;

    server {
        location / {
            proxy_pass http://backend;
            proxy_cache my_cache;
            # Serving stale responses during an update must be allowed,
            # otherwise the background update never triggers.
            proxy_cache_use_stale updating;
            proxy_cache_background_update on;
        }
    }
}

When combining this directive with 'proxy_cache_use_stale', configure the allowed stale conditions deliberately so that background updates do not cause outdated content to be served unintentionally.
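As a sketch of one such combination, the 'updating' parameter can be paired with error conditions so that stale content also covers upstream failures; the exact set of conditions shown here is an illustrative assumption, not a requirement:

```nginx
location / {
    proxy_pass http://backend;
    proxy_cache my_cache;

    # Serve stale while a background refresh is running, and also
    # when the upstream errors out or times out.
    proxy_cache_use_stale updating error timeout http_500 http_502 http_503 http_504;
    proxy_cache_background_update on;
}
```

The broader the stale conditions, the longer outdated content may be served, so pick only the conditions your application can tolerate.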

Setting this directive to 'on' can increase load on the backend: each stale hit may spawn a background subrequest, so many simultaneous requests for the same expired item can translate into additional concurrent connections to the upstream during cache refreshes.
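A common mitigation, sketched here for a typical deployment rather than as a required setting, is 'proxy_cache_lock', which collapses concurrent requests for the same uncached or expiring item into a single upstream request:

```nginx
location / {
    proxy_pass http://backend;
    proxy_cache my_cache;
    proxy_cache_use_stale updating;
    proxy_cache_background_update on;

    # Only one request at a time is allowed to populate a given
    # cache element; other requests for the same key wait, up to
    # the lock timeout, instead of hitting the upstream.
    proxy_cache_lock on;
    proxy_cache_lock_timeout 5s;
}
```

This keeps a burst of requests for one expired resource from fanning out into many parallel upstream connections.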