No effort, no worry, maximum performance.

Overview

Django Cachalot

Caches your Django ORM queries and automatically invalidates them.

Documentation: http://django-cachalot.readthedocs.io



Table of Contents:

  • Quickstart
  • Usage
  • Hacking
  • Benchmark
  • Third-Party Cache Comparison
  • Discussion

Quickstart

Cachalot officially supports Python 3.6-3.9 and Django 2.2 and 3.1-3.2 with the databases PostgreSQL, SQLite, and MySQL.

Note: an upper limit on Django version is set for your safety. Please do not ignore it.

Usage

  1. pip install django-cachalot
  2. Add 'cachalot', to your INSTALLED_APPS
  3. If you use multiple servers with a common cache server, double check their clock synchronisation
  4. If you modify data outside Django (typically after restoring a SQL database), use the ./manage.py invalidate_cachalot command (see the sketch after this list)
  5. Be aware of the few other limits
  6. If you use django-debug-toolbar, you can add 'cachalot.panels.CachalotPanel', to your DEBUG_TOOLBAR_PANELS
  7. Enjoy!
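
A minimal configuration sketch (the cache backend and location below are placeholders; django_redis is just one example of a supported backend):

    # settings.py: a sketch, not a drop-in configuration
    INSTALLED_APPS = [
        # ...
        'cachalot',
    ]

    CACHES = {
        'default': {
            # Any cache backend supported by django-cachalot works here;
            # django_redis.cache.RedisCache is shown purely as an example.
            'BACKEND': 'django_redis.cache.RedisCache',
            'LOCATION': 'redis://127.0.0.1:6379/0',
        },
    }

After modifying data outside Django (step 4 above), running ./manage.py invalidate_cachalot clears the stale cache entries.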

Hacking

To start developing, install the requirements and run the tests via tox.

Make sure you have the following services:

  • Memcached
  • Redis
  • PostgreSQL
  • MySQL

For setup:

  1. Install: pip install -r requirements/hacking.txt
  2. For PostgreSQL: CREATE ROLE cachalot LOGIN SUPERUSER;
  3. Run: tox --current-env to run the test suite on your current Python version.
  4. You can also run specific databases and Django versions: tox -e py38-django3.1-postgresql-redis

Benchmark

Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on both MySQL and PostgreSQL; additionally, on PostgreSQL, you will need to create a role called "cachalot". You can also just run the benchmark, and it will raise errors with specific instructions for how to fix them.

  1. Install: pip install -r requirements/benchmark.txt
  2. Run: python benchmark.py

The output will be in benchmark/TODAY'S_DATE/

TODO: create a Docker Compose file to make running the benchmark easier.

Third-Party Cache Comparison

There are three main third-party ORM caches: cachalot, cache-machine, and cache-ops. Which should you use? We suggest a mix:

TL;DR: use cachalot for tables that are cold or modified fewer than ~50 times per minute (most people should stick with cachalot alone, since you most likely won't need to scale to the point of adding cache-machine into the mix). If you're an enterprise that already has detailed statistics, then using cachalot for your cold caches and cache-machine for your hot caches is the best combination. Note that when performing joins with select_related and prefetch_related, cachalot can give you a nearly 100x speed-up for your initial deployment.

Recall that cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, cachalot constantly invalidates the table and re-caches it. Luckily, caching itself is very efficient; it's the cache invalidation part that kills all our systems. See Note 1 below for how Reddit deals with it.

Cachalot is more or less intended for cold caches or "just right" conditions. If you use a partitioning library for Django (one is being written, still a work in progress, by Andrew Chen Wang), caching works even better, since the cold / least-accessed shards aren't invalidated as often.

Cachalot is a good fit when a hot cached table sees fewer than ~50 modifications per minute. This is mostly down to cache invalidation; the same is true of any cache, which is why we suggest cache-machine for hot caches. Cache-machine caches individual objects, which takes up more space in the memory store, but it invalidates those individual objects instead of the entire table like cachalot does.
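
If only a few apps hold those hot, frequently modified tables, one option is to keep them out of cachalot with the CACHALOT_ONLY_CACHABLE_APPS / CACHALOT_UNCACHABLE_APPS settings shown further down this page and let a per-object cache handle them. A sketch, with the app labels as placeholders:

    # settings.py: 'catalog', 'geo' and 'orders' are placeholder app labels
    CACHALOT_ONLY_CACHABLE_APPS = [
        'catalog',   # cold, rarely modified: a good fit for cachalot
        'geo',
    ]
    CACHALOT_UNCACHABLE_APPS = [
        'orders',    # hot, written to many times per minute: leave it to a
                     # per-object cache such as cache-machine
    ]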

Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot struggle with a huge table that's modified rapidly? Because you've mixed your cold (90% of) records with your hot (10% of) records, so you're caching and invalidating the entire table at once. It's like boiling one ton of noodles in ONE pot instead of across 100 pots: splitting them up is far more efficient.

Note 1: My personal experience with caches stems from Reddit's: https://redditblog.com/2017/01/17/caching-at-reddit/

Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools

Discussion

Help? Technical chat? It's here on Discord.


Comments
  • Django 1.8 support

    Django 1.8 support

    Just wanted to bring some clarity on whether django-cachalot in its current state supports Django 1.8 or not. I tried running the package on Django 1.8, and while it isn't crashing anything, I see that the django1.8 branch hasn't been merged to master and is failing the build. So an update for the community about this would be good. Thanks.

    enhancement 
    opened by maryokhin 38
  • Add final SQL check when looking up involved tables

    Add final SQL check when looking up involved tables

    Description

    • Add a final SQL check to include potentially overlooked tables when looking up involved tables.
    • Add unit tests showing queries which do "order by" using a field of a referenced table. These tests would fail without the final SQL check.

    Rationale

    Changing the referenced object should also invalidate the query as calling the query again might lead to another result.

    "Order by" allows expressions such as Coalesce as well: https://docs.djangoproject.com/en/3.2/ref/models/querysets/#order-by

    Discussion

    Initially I thought of adding the final SQL check as configuration option. After having looked at all the queries, I believe that it should be the default behavior. Thus I did not make it an option for now.

    opened by dbartenstein 23
  • Cachalot error

    Cachalot error

    Getting this error with cachalot enabled when running unit test cases. The backend cache is 'BACKEND': 'django_redis.cache.RedisCache'.

    File "/home/ubuntu/virtualenvs/venv-system/lib/python2.7/site-packages/django/db/migrations/executor.py", line 198, in apply_migration state = migration.apply(state, schema_editor) File "/home/ubuntu/virtualenvs/venv-system/lib/python2.7/site-packages/django/db/backends/base/schema.py", line 92, in exit self.atomic.exit(exc_type, exc_value, traceback) File "/home/ubuntu/virtualenvs/venv-system/lib/python2.7/site-packages/cachalot/monkey_patch.py", line 146, in inner self.using, exc_type is None and not needs_rollback) File "/home/ubuntu/virtualenvs/venv-system/lib/python2.7/site-packages/cachalot/cache.py", line 47, in exit_atomic atomic_caches = self.atomic_caches[db_alias].pop().values() IndexError: pop from empty list

    can you please help with this?

    needs info 
    opened by gokulrajridecell 20
  • Huge SQL query reaches memcached size limit per key

    Huge SQL query reaches memcached size limit per key

    First, I'm using Django 1.8, Python 3.4.3, Postgres 9.3.x, and memcached 1.4.4 with pylibmc.

    I have a form on my site that has a jQuery autocomplete box. This is used for selecting locations (we have roughly ~13k locations in our database: continents, countries, states, and cities). Here's the view:

    def location_query(request):
        # first handle the location autocomplete
        if request.is_ajax():
            term = request.GET['term']
    
            # I want to explicitly order matching countries at the front of the list
            matching_countries = Location.get_countries().filter(full_name__icontains=term)
            matching_states = Location.get_states().filter(full_name__icontains=term)
            matching_cities = Location.get_cities().filter(full_name__icontains=term)
    
            matching_locations = list()
            matching_locations.extend(matching_countries)
            matching_locations.extend(matching_states)
            matching_locations.extend(matching_cities)
    
            locations_json = list()
            for matching_location in matching_locations[:10]:
                location_json = dict()
                location_json['id'] = matching_location.pk
                location_json['label'] = '%s (%s)' % (matching_location.full_name, matching_location.admin_level)
                location_json['value'] = matching_location.pk
                locations_json.append(location_json)
    
            return JsonResponse(locations_json, safe=False)
    

    And here's the Location model:

    class Location(models.Model):
        name = models.CharField(max_length=255)
        full_name = models.CharField(max_length=255, blank=True)  # the name might be "Paris", but full_name would be "Paris, Texas, United States of America"; allowed to be blank only because the script that populates this table will fill it in after all locations are added
        imported_from = models.CharField(max_length=255)
        admin_level = models.CharField(max_length=255, blank=True)
        geometry = models.MultiPolygonField(blank=True, default=None, null=True)
        objects = models.GeoManager()  # override the default manager with a GeoManager instance
        parent = models.ForeignKey('self', blank=True, default=None, null=True)
    
        def __str__(self):
            return self.full_name
    
        def get_full_name(self, include_continent=False):
            """
                Get the full name of a location. This includes the entire hierarchy, optionally including the continent.
                    e.g., Paris, Texas, United States of America
            """
            full_name = self.name
            current_parent = self.parent
            while current_parent is not None and (include_continent or (not include_continent and current_parent.parent is not None)):
                full_name += ', ' + current_parent.name
                current_parent = current_parent.parent
            return full_name
    
        def get_country(self):
            if self.admin_level == 'Country':
                return self.name
            return self.parent.get_country()
    
        @staticmethod
        def get_continents():
            return Location.objects.filter(parent=None).order_by('name')
    
        @staticmethod
        def get_countries(continent=None):
            if continent:
                # return a single continent's countries, sorted
                return Location.objects.filter(parent=continent).order_by('name')
            else:
                # return all countries, sorted
                return Location.objects.filter(admin_level='Country').order_by('name')
    
        @staticmethod
        def get_states(country=None):
            if country:
                # return a single country's states, sorted
                return Location.objects.filter(parent=country).order_by('name')
            else:
                # return all states, sorted
                return Location.objects.filter(admin_level='State').order_by('name')
    
        @staticmethod
        def get_cities(state=None):
            if state:
                # return a single state's cities, sorted
                return Location.objects.filter(parent=state).order_by('name')
            else:
                # return all cities, sorted
                return Location.objects.filter(admin_level='City').order_by('name')
    
        @staticmethod
        def get_non_continents():
            return Location.objects.exclude(parent=None).order_by('name')
    
        class Meta:
            ordering = ['full_name']
    

    When I disable cachalot by commenting out the line in INSTALLED_APPS, the autocomplete works. When I enable it, it doesn't work. Other things on my site do indeed work, and the DDT panel shows that cachalot is doing its job. Can it deal with ajax calls like this?

    documentation 
    opened by gfairchild 20
  • Make it possible to disable cachalot on per-query basis

    Make it possible to disable cachalot on per-query basis

    Description

    This patch makes it possible to disable caching on a per-query basis by setting the attribute cachalot_do_not_cache to True on the query.

    Rationale

    Sometimes the programmer knows that a query will return a large response which will not be reused (in our case it is an export which can have hundreds of megabytes of data, streamed chunk by chunk directly into a compressed file). This can potentially even lead to memory errors when the maximum size of the cache is reached (which is exactly what we see with our Redis cache). In such cases, it would be great to be able to mark a specific query as excluded from caching, which would save the work required to store it and also prevent possible memory errors. The method I implemented here is not super nice, but it is simple and works with minimal changes to the cachalot source code.
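
    For illustration, a sketch of how the proposed flag might be used, based purely on the description above (the Export model, filter, helper, and the exact placement of the attribute on queryset.query are assumptions, not an established API):

    # Hypothetical usage of the proposed per-query opt-out.
    qs = Export.objects.filter(year=2021)        # placeholder model and filter
    qs.query.cachalot_do_not_cache = True        # attribute introduced by this patch
    for chunk in qs.iterator(chunk_size=2000):   # stream rows without caching them
        write_to_compressed_file(chunk)          # placeholder helper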

    opened by beda42 17
  • Exception 'Exists' object has no attribute 'rhs'

    Exception 'Exists' object has no attribute 'rhs'

    What happened?

    We are running django-cachalot in production and keep getting the exception "'Exists' object has no attribute 'rhs'". This is captured in Sentry, so we can see some of the details.

    It's coming from the following line in cachalot/utils.py: rhs = child.rhs

    Here are some of the details set at this point:

    • child - django.db.models.expressions.Exists object
    • child_class - django.db.models.expressions.Exists
    • children - [django.db.models.lookups.Exact,django.db.models.expressions.Exists]
    • rhs - uuid.UUID
    • rhs_class - uuid.UUID

    What should've happened instead?

    No exception

    Steps to reproduce

    Seems to happen on DB queries that make use of django.db.models.expressions.Exists.

    Django==3.0.6, PostgreSQL 12.x

    opened by shield007 16
  • Invalidation of data stored in a primary/replica configured DB invalidates only for primary instance

    Invalidation of data stored in a primary/replica configured DB invalidates only for primary instance

    I have a Django app with two MariaDB databases configured as primary/replica with the following configuration:

    DATABASE_ROUTERS = ['app.db_replica.PrimaryReplicaRouter', ]
    
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'app',
            'USER': 'app',
            'PASSWORD': '*******************',
            'HOST': 'mysql-master',
        },
        'replica1': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'app',
            'USER': 'app',
            'PASSWORD': '*******************',
            'HOST': 'mysql-slave',
        },
    }
    

    The router is configured to write on primary and read on replica:

    class PrimaryReplicaRouter(object):
        def db_for_read(self, model, **hints):
            return 'replica1'
    
        def db_for_write(self, model, **hints):
            return 'default'
    
        def allow_relation(self, obj1, obj2, **hints):
            db_list = ('default', 'replica1')
            if obj1._state.db in db_list and obj2._state.db in db_list:
                return True
            return None
    
        def allow_migrate(self, db, app_label, model=None, **hints):
            return True
    

    When Django writes to a table, the cache gets invalidated only on the primary. You can test this by configuring the router to randomly return the primary or the replica: you will see the old and new values alternate as you refresh the page.

    Calling ./manage.py invalidate_cachalot works, I think because it invalidates the whole cache regardless of the database instance.
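
    For reference, cachalot also exposes this programmatically through cachalot.api; a hedged sketch of invalidating one model for a specific database alias (the model import is a placeholder, and the aliases are the ones from the settings above):

    from cachalot.api import invalidate

    from app.models import SomeModel  # placeholder model

    # Invalidate cached queries involving this model for the replica connection.
    invalidate(SomeModel, db_alias='replica1')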

    documentation 
    opened by micku 14
  • Queryset with annotated Now() is cached

    Queryset with annotated Now() is cached

    What happened?

    An annotated query including Now() is cached.

    What should've happened instead?

    An annotated query containing Now() must not be cached.

    Steps to reproduce

    from django.db.models.functions import Now
    # Replace Vehicle with some sort of cachable model
    Vehicle.objects.all().annotate(now=Now())
    

    Debian Linux, Django 3.2 LTS, PostgreSQL 10

    bug 
    opened by dbartenstein 12
  • Also support django_prometheus wrapped caches

    Also support django_prometheus wrapped caches

    Description

    Add the cache backends wrapped by django_prometheus as supported backends.

    Rationale

    cachalot already explicitly supports django_prometheus' wrappers for database backends. django_prometheus does the same wrapping for cache backends, so they should be supported as well.

    opened by Natureshadow 12
  • Race condition?

    Race condition?

    What happened?

    I have an app called menus which contains two models, Menu(models.Model) and MenuItem(MP_Node). Both tables are enabled for caching by cachalot. Note that MenuItem implements a tree structure from django-treebeard.

    There is an edit form that is responsible for updating a particular Menu instance by creating or moving MenuItem's around. The code responsible for that is a little bit complicated, but I'm nonetheless adding it below:

    root = anytree.AnyNode()
                nodes = {}
                node_ids = [form.cleaned_data.get("node_id") for form in form.menu_items.ordered_forms]
                node_ids = set(instance.root_item.get_descendants().filter(id__in=node_ids).values_list("id", flat=True))
    
                for form in form.menu_items.ordered_forms:
                    node_id = form.cleaned_data.get("node_id")
                    page = form.cleaned_data.get("page")
                    page_id = None
                    if isinstance(page, Page):
                        page_id = page.pk
    
                    item = {
                        "id": node_id if node_id in node_ids else None,
                        "parent_id": form.cleaned_data.get("parent_id"),
                        "data": {
                            "title": form.cleaned_data.get("title"),
                            "icon": form.cleaned_data.get("icon"),
                            "classes": form.cleaned_data.get("classes"),
                            "type": form.cleaned_data.get("type"),
                            "url": form.cleaned_data.get("url"),
                            "named_url": form.cleaned_data.get("named_url"),
                            "page_id": page_id,
                        },
                    }
                    if item["data"]["type"] in [
                        MenuItem.TEXT_LABEL,
                        MenuItem.CUSTOM_URL,
                        MenuItem.NAMED_URL,
                    ]:
                        item["data"]["page_id"] = None
    
                    nodes[node_id] = anytree.AnyNode(**item)
                for node in nodes.values():
                    if node.parent_id == -1:
                        node.parent = root
                        continue
                    node.parent = nodes[node.parent_id]
    
                exporter = DictExporter(
                    dictcls=OrderedDict,
                    attriter=lambda attrs: [(k, v) for k, v in attrs if k != "parent_id"],
                )
                bulk_data = exporter.export(root)["children"]
    
                # tree, iterative preorder
                stack = [(instance.root_item, node) for node in bulk_data[::-1]]
                while stack:
                    parent, node_struct = stack.pop()
                    parent.refresh_from_db()
                    node_data = node_struct["data"].copy()
                    node_data["id"] = node_struct["id"]
                    if node_data["id"] is None:
                        node_obj = parent.add_child(**node_data)
                        node_ids.add(node_obj.pk)
                    else:
                        node_obj = MenuItem.objects.get(pk=node_data["id"])
                        for attr, value in node_data.items():
                            if attr != "id":
                                setattr(node_obj, attr, value)
                        node_obj.save()
                        node_obj.move(parent, pos="last-child")
    
                    if "children" in node_struct:
                        # extending the stack with the current node as the parent of
                        # the new nodes
                        stack.extend([(node_obj, node) for node in node_struct["children"][::-1]])
    
                # Delete outdated items
                instance.root_item.get_descendants().exclude(id__in=node_ids).delete()
    

    What is happening is that while saving a big menu (300+ items) with live traffic at the same time, I'm experiencing outdated data being stored in the cache. I'm seeing changes at the DB level but the cached data remains the same (loading the edit interface shows data that does not match the DB-level data). To confirm that theory, invalidate_cachalot menus solves the issue.

    I guess it may be the result of a race condition (?) because lots of changes are made in a loop:

    while stack:
      ..
      node_obj = MenuItem.objects.get(pk=node_data["id"])
      for attr, value in node_data.items():
          if attr != "id":
              setattr(node_obj, attr, value)
      node_obj.save()
      node_obj.move(parent, pos="last-child")
    

    Am I missing something? Should I simply use the api.invalidate method in such cases?

    What should've happened instead?

    Cached data should be changed.

    needs info 
    opened by pySilver 11
  • Adds django_db_geventpool.backends.postgresql_psycopg2 as supported database

    Adds django_db_geventpool.backends.postgresql_psycopg2 as supported database

    Description

    We've been using django-cachalot in a production environment for a while, and after asking about this on the Slack help page, adding django_db_geventpool.backends.postgresql_psycopg2 as a supported database was recommended.

    This adds django_db_geventpool.backends.postgresql_psycopg2 as a supported database.

    Rationale

    I'm not sure how to add a test for it, but it has been used for at least 5 months in a production environment.

    It would remove the warning from deployments using django_db_geventpool.

    opened by aemitos 11
  • Does django-cachalot lib support the new django.core.cache.backends.redis.RedisCache cache backend in Django version 4.0?

    Does django-cachalot lib support the new django.core.cache.backends.redis.RedisCache cache backend in Django version 4.0?

    Question

    The new django.core.cache.backends.redis.RedisCache cache backend provides built-in support for caching with Redis in Django version 4.0 and I used it in my application. I've used the latest version django-cachalot and got this warning: Cache backend 'django.core.cache.backends.redis.RedisCache' is not supported by django-cachalot. My question is: Does django-cachalot lib support the new django.core.cache.backends.redis.RedisCache cache backend in Django version 4.0? Please help answer my question, thanks!

    opened by vuphan-agilityio 4
  • UncachableQuery raised when exporting models of table even if listed in `CACHALOT_UNCACHABLE_APPS`.

    UncachableQuery raised when exporting models of table even if listed in `CACHALOT_UNCACHABLE_APPS`.

    What happened?

    An UncachableQuery exception is raised when exporting the model accounts.Account, even though the app accounts is listed in CACHALOT_UNCACHABLE_APPS:

    Settings

    CACHALOT_CACHE = "cachalot"
    CACHALOT_ENABLED = True
    CACHALOT_ONLY_CACHABLE_APPS = [
        # ...
        "website",
    ]
    CACHALOT_UNCACHABLE_APPS = [
        "accounts",
        # ...
    ]
    

    Error

    UncachableQuery: 
      File "cachalot/monkey_patch.py", line 92, in inner
        table_cache_keys = _get_table_cache_keys(compiler)
      File "cachalot/utils.py", line 276, in _get_table_cache_keys
        for t in _get_tables(db_alias, compiler.query, compiler)]
      File "cachalot/utils.py", line 212, in _get_tables
        raise UncachableQuery
    

    What should've happened instead?

    No exception should have been raised by django-cachalot.

    Steps to reproduce

    Env

    • Django version: Django==3.2.16
    • Cachalot version: django-cachalot==2.5.2
    • Database: MySQL
    • Cache backend: django.core.cache.backends.memcached.PyMemcacheCache

    Context

    • I have a table accounts with ~20k entries; in the admin I try to export almost all rows as .csv, and then I end up with the reported error.
    • The Account model has no foreign keys to any app.model listed in CACHALOT_ONLY_CACHABLE_APPS.
    • The export works fine if I select just a few rows.
    • The export works fine if I don't use django-cachalot at all.
    opened by fabiocaccamo 3
  • Tables defined in CACHALOT_UNCACHABLE_APPS are still being cached

    Tables defined in CACHALOT_UNCACHABLE_APPS are still being cached

    Question

    I have a structure like this:

    App: AppA
    class ModelA(models.Model):
       some fields ....
    
    App: AppB
    class ModelB(models.Model):
       linked_field = models.Foreignkey(ModelA, ...)
    

    My settings are:

    CACHALOT_UNCACHABLE_APPS = INSTALLED_APPS
    
    CACHALOT_ONLY_CACHABLE_APPS = [
        "AppA"
    ]
    

    Why are my settings like this? I've noticed that some tables are being cached for apps that were not defined to be cached. That's why I am excluding all apps and whitelisting them later on. When I only defined CACHALOT_ONLY_CACHABLE_APPS without setting CACHALOT_UNCACHABLE_APPS, even more apps were cached even though they shouldn't be (e.g. the celery beat table).

    When using my application, I've noticed in the debug toolbar that the tables of AppB are also being cached even though they should not be. I am also not accessing ModelB.

    Is this behaviour as expected?

    What have you tried so far?

    opened by hendrikschneider 0
  • Allow enum-likes in CACHABLE_PARAM_TYPES

    Allow enum-likes in CACHABLE_PARAM_TYPES

    Description

    Currently, a Django TextChoices member's __class__ indicates it's enum-like:

    >>> class MockTextChoices(models.TextChoices):
    ...     foo="foo"
    ...     bar="bar"
    ... 
    >>> MockTextChoices.foo
    MockTextChoices.foo
    >>> MockTextChoices.foo.__class__
    <enum 'MockTextChoices'>
    

    And it's perfectly fine to do something like this when constructing a queryset:

    queryset.filter(
                Q(
                    mock__in=(
                        MockTextChoices.foo,
                        MockTextChoices.bar
                    )
                )
            )
    

    Yet such a query, even though valid, can't be cached. This happens because the value under MockTextChoices.foo.__class__ is not present in CACHABLE_PARAM_TYPES, which is used in the check_parameter_types method. So, long story short, it throws UncachableQuery and makes a DB call anyway.

    If you replace MockTextChoices.foo with "foo" or MockTextChoices.foo.value, it works just fine for obvious reasons. My suggestion would be to modify check_parameter_types to allow the syntax above. I don't know right away how to implement that, but I'll give it some thought and edit this ticket with a proposed answer :^)
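
    For reference, a minimal sketch of that workaround, reusing the names from the snippets above (passing the raw string or .value keeps the parameter within CACHABLE_PARAM_TYPES):

    from django.db.models import Q

    # Works today: plain strings / .value are cachable parameter types.
    queryset.filter(
        Q(mock__in=(MockTextChoices.foo.value, MockTextChoices.bar.value))
    )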

    Rationale

    I think such behavior might lead to unexpected queries that are hard to catch without the debug toolbar and pure luck. I mean, if Django accepts such syntax as a valid query, django-cachalot should be able to cache it, right? 👀

    enhancement 
    opened by rythm-of-the-red-man 1
  • Unexpected behavior when using DjangoDebugToolbar

    Unexpected behavior when using DjangoDebugToolbar

    What happened?

    When setting CACHALOT_ENABLED = False but adding the cachalot.panels.CachalotPanel panel to the DEBUG_TOOLBAR_PANELS variable, it acts as if CACHALOT_ENABLED were True.

    What should've happened instead?

    Adding cachalot.panels.CachalotPanel should not override the CACHALOT_ENABLED setting

    Steps to reproduce

    Django 2.2, cachalot 2.5, django-debug-toolbar 3.0.0

    bug 
    opened by Vyko 0
  • caching before the first request

    caching before the first request

    Description

    I have a use case where I need to cache a particular request before any user requests it. This is a list page paginated with PageNumberPagination, and the table has 30M records. Can you please suggest a way to cache the queryset before a user hits the page?
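
    One possible approach (a sketch only, not an official answer): since cachalot caches a query the first time it runs and keeps it until the table is modified, warming the cache can be as simple as evaluating the same querysets ahead of time, e.g. from a management command run after deployment. The model, page size, and command name below are placeholders:

    # myapp/management/commands/warm_list_cache.py (hypothetical)
    from django.core.management.base import BaseCommand

    from myapp.models import Record  # placeholder for the listed model


    class Command(BaseCommand):
        help = "Evaluate the list-page querysets so cachalot caches them."

        def handle(self, *args, **options):
            page_size = 50                 # assumed pagination page size
            for page in range(3):          # warm only the first few pages
                offset = page * page_size
                # Forcing evaluation stores the SQL result in the cache
                # until the table is next modified.
                list(Record.objects.order_by("pk")[offset:offset + page_size])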

    Rationale

    opened by thesealednectar22 1
Releases(v2.5.2)
  • v2.5.2(Aug 27, 2022)

    What's Changed

    • Add Django 4.1 support by @dmkoch in https://github.com/noripyt/django-cachalot/pull/219

    New Contributors

    • @dmkoch made their first contribution in https://github.com/noripyt/django-cachalot/pull/219

    Full Changelog: https://github.com/noripyt/django-cachalot/compare/v2.5.1...v2.5.2

    Source code(tar.gz)
    Source code(zip)
  • v2.5.1(Feb 24, 2022)

    What's Changed

    • table invalidation condition enhanced by @JanoValaska in https://github.com/noripyt/django-cachalot/pull/213
    • Include docs in sdist. Closes #201 by @debdolph in https://github.com/noripyt/django-cachalot/pull/202
    • Add test settings to sdist. Closes #200. by @debdolph in https://github.com/noripyt/django-cachalot/pull/203

    New Contributors

    • @JanoValaska made their first contribution in https://github.com/noripyt/django-cachalot/pull/213
    • @debdolph made their first contribution in https://github.com/noripyt/django-cachalot/pull/202

    Full Changelog: https://github.com/noripyt/django-cachalot/compare/v2.5.0...v2.5.1

    Source code(tar.gz)
    Source code(zip)
  • v2.5.0(Jan 14, 2022)

    What's Changed

    • Add final SQL check when looking up involved tables by @dbartenstein in https://github.com/noripyt/django-cachalot/pull/199

    Full Changelog: https://github.com/noripyt/django-cachalot/compare/v2.4.5...v2.5.0

    Source code(tar.gz)
    Source code(zip)
  • v2.4.5(Dec 8, 2021)

    What's Changed

    • Add Django 4.0 support, drop Python 3.6 and Django 3.1 by @Andrew-Chen-Wang in https://github.com/noripyt/django-cachalot/pull/208

    Full Changelog: https://github.com/noripyt/django-cachalot/compare/v2.4.4...v2.4.5

    Source code(tar.gz)
    Source code(zip)
  • v2.4.4(Nov 3, 2021)

    What's Changed

    • Handle queryset implementations without lhs/rhs attribute by @sumpfralle in https://github.com/noripyt/django-cachalot/pull/204
    • Add Python 3.10 Support (fixes #205) by @Andrew-Chen-Wang in https://github.com/noripyt/django-cachalot/pull/206

    New Contributors

    • @sumpfralle made their first contribution in https://github.com/noripyt/django-cachalot/pull/204

    Full Changelog: https://github.com/noripyt/django-cachalot/compare/v2.4.3...v2.4.4

    Source code(tar.gz)
    Source code(zip)
  • v2.4.3(Aug 23, 2021)

    • Fix annotated Now being cached (#195)
    • Fix conditional annotated expressions not being cached (#196)
    • Simplify annotation handling by using the flatten method (#197)
    • Fix Django 3.2 default_app_config deprecation (#198)
    • (Internal) Pinned psycopg2 to <2.9 due to Django 2.2 incompatibility
    Source code(tar.gz)
    Source code(zip)
  • v2.3.2(Sep 16, 2020)

  • v2.3.1(Aug 10, 2020)

  • v2.3.0(Jul 29, 2020)

  • v2.2.2(Jun 25, 2020)

  • 2.2.0(Feb 14, 2020)
