Expire a view-cache in Django?


Question

The @cache_page decorator is awesome. But for my blog I would like to keep a page in cache until someone comments on a post. This sounds like a great idea, since people rarely comment, so keeping the pages in memcached until someone does would work well. I'm thinking that someone must have had this problem before? And this is different from caching per URL.

So a solution I'm thinking of is:

@cache_page(60 * 15, "blog")
def blog(request):
    ...

And then I'd keep a list of all cache keys used for the blog view and have a way to expire the whole "blog" cache space. But I'm not super experienced with Django, so I'm wondering if someone knows a better way of doing this?
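Roughly, the key-tracking idea I have in mind would look something like this (untested; the "tag:" key names are just made up to show the idea of keeping a per-view set of keys and flushing them):

# Untested sketch of the "keep a list of cache keys per view" idea,
# using Django's low-level cache API; the "tag:" key names are made up.
from django.core.cache import cache

def remember_key(tag, key):
    keys = cache.get("tag:%s" % tag, set())
    keys.add(key)
    cache.set("tag:%s" % tag, keys, None)  # timeout=None: keep until we expire it

def expire_tag(tag):
    for key in cache.get("tag:%s" % tag, set()):
        cache.delete(key)
    cache.delete("tag:%s" % tag)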


Accepted Answer

This solution works for Django versions before 1.7 (a rough adaptation for newer versions follows the code).

Here's a solution I wrote to do just what you're talking about on some of my own projects:

def expire_view_cache(view_name, args=[], namespace=None, key_prefix=None):
    """
    This function allows you to invalidate any view-level cache.
        view_name: view function you wish to invalidate, or its named URL pattern
        args: any arguments passed to the view function
        namespace: optional, if an application namespace is needed
        key_prefix: for the @cache_page decorator of the function (if any)
    """
    from django.core.urlresolvers import reverse
    from django.http import HttpRequest
    from django.utils.cache import get_cache_key
    from django.core.cache import cache
    # Create a fake request object
    request = HttpRequest()
    # Look up the request path:
    if namespace:
        view_name = namespace + ":" + view_name
    request.path = reverse(view_name, args=args)
    # Get the cache key and expire it if the cached item exists:
    key = get_cache_key(request, key_prefix=key_prefix)
    if key:
        if cache.get(key):
            # Delete the cache entry.
            #
            # Note that there is a possible race condition here, as another
            # process / thread may have refreshed the cache between
            # the call to cache.get() above and the cache.set(key, None)
            # below. This may lead to unexpected performance problems under
            # severe load.
            cache.set(key, None, 0)
        return True
    return False
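For Django 1.7 and later, the view-cache key is (as far as I can tell) derived from the absolute URI rather than just the path, so the fake request also needs host information. Here's a rough, untested sketch of the same function adapted for newer versions; the www.example.com host and port are placeholders you'd swap for whatever host your real requests use (it also has to pass the ALLOWED_HOSTS check), and reverse now lives in django.urls:

# Rough, untested adaptation of the same idea for Django 1.7+; the host and
# port below are placeholders and must match the host your real requests use
# (and be allowed by ALLOWED_HOSTS), since newer Django keys the cache on the
# absolute URI rather than just the path.
def expire_view_cache(view_name, args=None, namespace=None, key_prefix=None):
    from django.core.cache import cache
    from django.http import HttpRequest
    from django.urls import reverse  # was django.core.urlresolvers before 1.10
    from django.utils.cache import get_cache_key

    if namespace:
        view_name = namespace + ":" + view_name
    # Fake request that mirrors what the caching middleware saw:
    request = HttpRequest()
    request.path = reverse(view_name, args=args or [])
    request.META["SERVER_NAME"] = "www.example.com"  # placeholder: your site's host
    request.META["SERVER_PORT"] = "80"
    key = get_cache_key(request, key_prefix=key_prefix)
    if key and cache.get(key) is not None:
        cache.delete(key)
        return True
    return False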

Django keys these caches off the view's request, so what this does is create a fake request object for the cached view, use it to fetch the cache key, and then expire it.

To use it in the way you're talking about, try something like:

from django.db.models.signals import post_save
from blog.models import Entry

def invalidate_blog_index(sender, **kwargs):
    expire_view_cache("blog")

post_save.connect(invalidate_blog_index, sender=Entry)

So basically, whenever a blog Entry object is saved, invalidate_blog_index is called and the cached view is expired. NB: I haven't tested this extensively, but it's worked fine for me so far.
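If you're on a newer Django (1.7+), the same hookup is usually written with the @receiver decorator; a minimal sketch, assuming expire_view_cache from above is importable and this lives in a module imported at startup (e.g. from AppConfig.ready()):

# Equivalent hookup with the @receiver decorator.
# Assumes this module is imported at startup, e.g. from AppConfig.ready().
from django.db.models.signals import post_save
from django.dispatch import receiver

from blog.models import Entry

@receiver(post_save, sender=Entry)
def invalidate_blog_index(sender, instance, **kwargs):
    expire_view_cache("blog")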


I wrote Django-groupcache for this kind of situation (you can download the code here). In your case, you could write:

from groupcache.decorators import cache_tagged_page

@cache_tagged_page("blog", 60 * 15)
def blog(request):
    ...

From there, you could simply do later on:

from groupcache.utils import uncache_from_tag

# Uncache all view responses tagged as "blog"
uncache_from_tag("blog") 

Have a look at cache_page_against_model() as well: it's slightly more involved, but it will allow you to uncache responses automatically based on model entity changes.


Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow