djblets.cache.backend

Utility functions for working with memory caching backends.

These functions are designed to integrate with a cache backend using Django’s cache framework. They handle creating caching keys unique to the install and caching more complex data efficiently (such as the results of iterators and large data normally too big for the cache).

cache_memoize_iter(key, items_or_callable, expiration=2592000, force_overwrite=False, compress_large_data=True)[source]

Memoize an iterable list of items inside the configured cache.

If items_or_callable is a function, it must return an iterable object, such as a list or a generator.

If a generator is provided, directly or through a function, then each item will be immediately yielded to the caller as they’re retrieved, and the cached entries will be built up as the items are processed.

The data is assumed to be big enough that it must be pickled, optionally compressed, and stored as chunks in the cache.

The result from this function is always a generator. Note that it’s important that the generator be allowed to continue until completion, or the data won’t be retrievable from the cache.

Parameters:
  • expiration (int) – The expiration time for the key, in seconds.
  • force_overwrite (bool) – If True, the value will always be computed and stored regardless of whether it exists in the cache already.
  • compress_large_data (bool) – If True, the data will be zlib-compressed.
Yields:

The list of items from the cache or from items_or_callable if uncached.
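A minimal, self-contained sketch of this pattern, using a plain dict in place of Django's cache backend (the names and chunking details here are illustrative, not djblets' actual internals). It shows why the generator must run to completion: the cache entry is only written after the last item has been yielded.

```python
import pickle
import zlib

_cache = {}  # stands in for Django's configured cache backend


def memoize_iter_sketch(key, items_or_callable, compress_large_data=True):
    """Yield items from the cache, or compute and cache them on a miss."""
    if key in _cache:
        data = _cache[key]
        if compress_large_data:
            data = zlib.decompress(data)

        yield from pickle.loads(data)
        return

    if callable(items_or_callable):
        items_or_callable = items_or_callable()

    seen = []

    for item in items_or_callable:
        seen.append(item)
        yield item  # The caller receives each item immediately.

    # The cache is only populated here, after the generator has been
    # fully consumed -- stopping early means nothing gets cached.
    data = pickle.dumps(seen)

    if compress_large_data:
        data = zlib.compress(data)

    _cache[key] = data
```

The first full consumption computes and stores the items; subsequent calls with the same key yield the cached copies without invoking the callable.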

cache_memoize(key, lookup_callable, expiration=2592000, force_overwrite=False, large_data=False, compress_large_data=True, use_generator=False)[source]

Memoize the results of a callable inside the configured cache.

Parameters:
  • expiration (int) – The expiration time for the key, in seconds.
  • force_overwrite (bool) – If True, the value will always be computed and stored regardless of whether it exists in the cache already.
  • large_data (bool) – If True, the resulting data will be pickled, optionally compressed, and (potentially) split up into megabyte-sized chunks. This is useful for very large, computationally expensive data that shouldn't be stored in a database due to the way it's accessed.
  • compress_large_data (bool) – Compresses the data with zlib compression when large_data is True.
Returns:

The cached data, or the result of lookup_callable if uncached.
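The caching logic can be sketched as follows, again substituting a plain dict for Django's cache backend (illustrative only; djblets' real implementation also handles chunking and key generation):

```python
import pickle
import zlib

_cache = {}  # stands in for Django's configured cache backend


def memoize_sketch(key, lookup_callable, force_overwrite=False,
                   large_data=False, compress_large_data=True):
    """Return a cached value, computing it via lookup_callable on a miss."""
    if not force_overwrite and key in _cache:
        stored = _cache[key]

        if large_data:
            if compress_large_data:
                stored = zlib.decompress(stored)

            stored = pickle.loads(stored)

        return stored

    value = lookup_callable()
    stored = value

    if large_data:
        # Large data is serialized (and optionally compressed) before
        # being written to the cache.
        stored = pickle.dumps(stored)

        if compress_large_data:
            stored = zlib.compress(stored)

    _cache[key] = stored

    return value
```

Repeated calls with the same key skip the callable entirely, which is the point of memoizing expensive lookups.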

make_cache_key(key)[source]

Create a cache key guaranteed to avoid conflicts and size limits.

The cache key will be prefixed by the site's domain, and will be replaced with an MD5 hash if it's larger than the backend's maximum key size.

Parameters:
  • key (str) – The base key to generate a cache key from.
Returns:

A cache key suitable for use with the cache backend.
Return type:

str
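The key-building behavior described above can be sketched like this. The domain value and the 250-character limit are assumptions for illustration (250 is memcached's documented key-size limit), not values taken from djblets:

```python
import hashlib

MAX_KEY_LENGTH = 250  # Assumed limit, matching memcached's key-size cap.


def make_cache_key_sketch(key, site_domain='example.com'):
    """Prefix a key with the site's domain, hashing it if it's too long."""
    full_key = '%s:%s' % (site_domain, key)

    if len(full_key) > MAX_KEY_LENGTH:
        # Oversized keys are reduced to a fixed-length MD5 hex digest.
        full_key = hashlib.md5(full_key.encode('utf-8')).hexdigest()

    return full_key
```

Short keys stay human-readable (and namespaced per install), while oversized keys collapse to a fixed-length 32-character digest.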