Cached Documents#

For read-heavy workloads where you wish to reduce database load, Alaric offers alaric.CachedDocument, which automatically implements a Redis-based cache in front of your database.
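
A minimal setup sketch is shown below. The connection details, database and collection names, and the way the underlying alaric.Document is constructed are placeholder assumptions rather than prescriptions from this page; the CachedDocument keyword arguments match the signature documented below.

import datetime

import redis.asyncio as redis
from motor.motor_asyncio import AsyncIOMotorClient

import alaric

# Placeholder connection details; swap in your own.
mongo = AsyncIOMotorClient("mongodb://localhost:27017")
redis_client = redis.Redis(host="localhost", port=6379)

# Assumed Document construction (database + collection name); build
# this however you already build your alaric.Document instances.
users = alaric.Document(mongo["my_database"], "users")

cached_users = alaric.CachedDocument(
    document=users,
    redis_client=redis_client,
    cache_ttl=datetime.timedelta(minutes=30),
)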

class alaric.CachedDocument(*, document: Document, redis_client: Redis, extra_lookups: List[List[str]] = None, cache_ttl: timedelta = datetime.timedelta(seconds=3600))#

This document implements a cache in front of MongoDB for read-heavy workflows.

Read process:

result = attempt to fetch from Redis
if result is None:
    result = fetch from the database
    populate the Redis cache

Write process:

Push changes back to the database
Update the Redis cache
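
The sketch below restates this flow with plain redis.asyncio and Motor calls, assuming documents are keyed by their _id and serialised as JSON. It illustrates the cache-aside pattern described above; it is not CachedDocument's actual implementation.

import json

async def cached_read(redis_client, collection, _id: str):
    # Read process: try Redis first, fall back to the database,
    # then populate the cache on a miss.
    cached = await redis_client.get(f"doc:{_id}")
    if cached is not None:
        return json.loads(cached)

    document = await collection.find_one({"_id": _id})
    if document is not None:
        await redis_client.set(f"doc:{_id}", json.dumps(document), ex=3600)
    return document

async def cached_write(redis_client, collection, _id: str, data: dict):
    # Write process: push changes back to the database, then update Redis.
    await collection.update_one({"_id": _id}, {"$set": data}, upsert=True)
    await redis_client.set(f"doc:{_id}", json.dumps({"_id": _id, **data}), ex=3600)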

Notes

This document works on the assumption that a document's _id remains consistent throughout the lifetime of the entry.

This document will also leave hanging Redis entries when lookups outside the _id are modified during the lifetime of the object. This is mitigated by enforcing a TTL on all Redis entries.
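
For example, assuming a CachedDocument built with extra_lookups=[["data"]] (a hypothetical setup; the field names and values are placeholders):

await cached_doc.set({"_id": "1"}, {"_id": "1", "data": "old"})
await cached_doc.set({"_id": "1"}, {"_id": "1", "data": "new"})

# The Redis entry built for data == "old" is not cleaned up by this
# class; it simply hangs around until cache_ttl expires it.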

__init__(*, document: Document, redis_client: Redis, extra_lookups: List[List[str]] = None, cache_ttl: timedelta = datetime.timedelta(seconds=3600))#
Parameters:
  • document (alaric.Document) – The underlying DB document

  • redis_client (redis.asyncio.client.Redis) – The Redis instance to use

  • extra_lookups (List[List[str]]) –

    Extra lookups to build. For example:

    class Test:
        def __init__(self, _id, data):
            self.data = data
            self._id = _id
    
    ...
    
    extra_lookups = [["data"]]
    

Would let you call get() with either the _id or the data attribute of Test (see the sketch after this parameter list).

  • cache_ttl (timedelta) –

    How long keys should exist in Redis.

This is required because this class will leave hanging keys in Redis when certain values change.
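
Continuing the setup sketch above, a hedged usage sketch for extra_lookups (the underlying collection and the values are placeholders):

cached_tests = alaric.CachedDocument(
    document=tests,  # an existing alaric.Document for the Test collection
    redis_client=redis_client,
    extra_lookups=[["data"]],
)

# Both lookups can now be served from the Redis cache.
by_id = await cached_tests.get({"_id": "some id"})
by_data = await cached_tests.get({"data": "some data"})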

async get(filter_dict: Dict[str, Any] | Buildable | Filterable, *, try_convert: bool = True) → Dict[str, Any] | C | None#

Fetch a document.

Attempts to fetch from Redis first, then falls back to the DB.

Parameters:
  • filter_dict (Dict[str, Any] | Buildable | Filterable) – The filter used to find the document

  • try_convert (bool) – Whether to attempt converting the result to your converter class

Returns:

The data fetched from either Redis or your DB

Return type:

Optional[Union[Dict[str, Any], C]]

Notes

Due to how items are stored internally, filter_dict must evaluate to a literal str: str pairing; otherwise this will fall back to the DB.
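
For instance, continuing the setup sketch above (the values are illustrative, and the first call assumes your _id values are stored as strings):

from bson import ObjectId

# A literal str: str filter, so this lookup can be served from Redis.
await cached_users.get({"_id": "507f1f77bcf86cd799439011"})

# The value is not a str, so this call falls back to the database.
await cached_users.get({"_id": ObjectId("507f1f77bcf86cd799439011")})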

async set(filter_dict: Dict[str, Any] | Buildable | Filterable, update_data: Dict[str, Any] | Saveable) → None#

Write to Redis and the DB.

Notes

This performs an UPSERT operation on the database.

This also requires _id to be set on update_data in order for the result to be cached in Redis.

Parameters:
  • filter_dict (Dict[str, Any] | Buildable | Filterable) – The filter used to locate the document

  • update_data (Dict[str, Any] | Saveable) – The data to write to the document
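
A short usage sketch, continuing the setup above (field names are placeholders). Note that _id is included in update_data so the write can also be cached in Redis:

await cached_users.set(
    {"_id": "507f1f77bcf86cd799439011"},
    {"_id": "507f1f77bcf86cd799439011", "name": "example"},
)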