huey, a little task queue for python

Overview

http://media.charlesleifer.com/blog/photos/huey2-logo.png

a lightweight alternative.

huey is a little task queue for Python. huey supports:

  • multi-process, multi-thread or greenlet task execution models
  • schedule tasks to execute at a given time, or after a given delay
  • schedule recurring tasks, like a crontab
  • automatically retry tasks that fail
  • task prioritization
  • task result storage
  • task expiration
  • task locking
  • task pipelines and chains

http://i.imgur.com/2EpRs.jpg

https://api.travis-ci.org/coleifer/huey.svg?branch=master

At a glance

from huey import RedisHuey, crontab

huey = RedisHuey('my-app', host='redis.myapp.com')

@huey.task()
def add_numbers(a, b):
    return a + b

@huey.task(retries=2, retry_delay=60)
def flaky_task(url):
    # This task might fail, in which case it will be retried up to 2 times
    # with a delay of 60s between retries.
    return this_might_fail(url)

@huey.periodic_task(crontab(minute='0', hour='3'))
def nightly_backup():
    sync_all_data()

Calling a task-decorated function will enqueue the function call for execution by the consumer. A special result handle is returned immediately, which can be used to fetch the result once the task is finished:

>>> from demo import add_numbers
>>> res = add_numbers(1, 2)
>>> res
<Result: task 6b6f36fc-da0d-4069-b46c-c0d4ccff1df6>

>>> res()
3

Tasks can be scheduled to run in the future:

>>> res = add_numbers.schedule((2, 3), delay=10)  # Will be run in ~10s.
>>> res(blocking=True)  # Will block until task finishes, in ~10s.
5
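
Tasks can also be composed into pipelines, where the result of one task is passed to the next. A minimal sketch based on the pipeline/chaining feature (reusing the add_numbers task from above; the exact output shown is illustrative):

>>> pipeline = add_numbers.s(1, 2).then(add_numbers, 3)  # (1 + 2), then (3 + 3)
>>> result_group = huey.enqueue(pipeline)  # Enqueuing a pipeline returns a result group.
>>> result_group.get(blocking=True)  # Results of each step, in order.
[3, 6]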

For much more, check out the guide or take a look at the example code.

Running the consumer

Run the consumer with four worker processes:

$ huey_consumer.py my_app.huey -k process -w 4

To run the consumer with a single worker thread (default):

$ huey_consumer.py my_app.huey

If your workloads are mostly IO-bound, you can run the consumer with threads or greenlets instead. Because greenlets are so lightweight, you can run quite a few of them efficiently:

$ huey_consumer.py my_app.huey -k greenlet -w 32

Storage

Huey's design and feature-set were informed by the capabilities of the Redis database. Redis is a fantastic fit for a lightweight task queueing library like Huey: it's self-contained, versatile, and can be a multi-purpose solution for other web-application tasks like caching, event publishing, analytics, rate-limiting, and more.

Although Huey was designed with Redis in mind, the storage system implements a simple API and many other tools could be used instead of Redis if that's your preference.

Huey comes with built-in support for Redis, Sqlite, file-system, and in-memory storage.
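
For example, switching storage is mostly a matter of picking a different Huey implementation when setting up your application. A minimal sketch, assuming huey's SqliteHuey and MemoryHuey classes and an illustrative file path:

from huey import MemoryHuey, SqliteHuey

# Sqlite-backed queue; the database file is created if it does not exist.
huey = SqliteHuey('my-app', filename='/tmp/my-app.db')

# In-memory storage running in immediate mode -- handy for unit tests.
test_huey = MemoryHuey('my-app-tests', immediate=True)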

Documentation

See Huey documentation.

Project page

See source code and issue tracker on Github.

Huey is named in honor of my cat:

http://m.charlesleifer.com/t/800x-/blog/photos/p1473037658.76.jpg?key=mD9_qMaKBAuGPi95KzXYqg

Comments
  • Disable root logger in consumer

    I'm converting an existing code base to use Huey. One thing I'm running into is duplicate logs. In some Python modules I create a new logger with the name of the file, and attach a handler to it. However, the Huey consumer already sets up a handler on the root logger (in consumer_options.py, line 166 and before). So when I run any code using my logger, I get both logs.

    I don't see any way to override this behaviour. I do see that the setup_logger function on line 144 in the same file does accept a logger, but it's never passed in. Regardless, I don't see how I would access this anyway from the consumer.

    My preferred behaviour is that there is a way to have no handler added to the root logger, only a handler to the Huey logger which logs Huey stuff. Given that most of the logging appears to be done on specific loggers for Huey, I wouldn't expect this to be too impactful. The resulting behaviour would be like "semi-quiet" or "system-only".

    I'd like to hear your thoughts. In the meantime I solved it by creating a child of the huey.consumer logger, depending on an environment variable I pass to the process running the consumer. This works, but I then lose control over formatting the log statements (which is not a huge deal in this case, luckily).
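
    A stdlib-level workaround sketch (not an official huey option; names are illustrative): attach a handler to the "huey" namespace logger for huey's own output, and stop your module loggers from propagating to the root logger so the consumer's root handler cannot emit duplicates:

    import logging

    # Handler for huey's own loggers ("huey", "huey.consumer", ...).
    huey_handler = logging.StreamHandler()
    huey_handler.setFormatter(logging.Formatter('%(asctime)s %(name)s %(message)s'))
    logging.getLogger('huey').addHandler(huey_handler)

    # Module logger: attach our own handler and disable propagation, so the
    # root-logger handler installed by the consumer never sees these records.
    log = logging.getLogger(__name__)
    log.addHandler(logging.StreamHandler())
    log.propagate = False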

    opened by larsderidder 34
  • How to wait on a group of tasks with Huey

    I'd like to migrate from Celery to Huey but there is one potential sticking point. With Celery, I have a task that spawns a bunch of parallel tasks (a group) and waits on all of those parallel tasks to finish before executing a callback using a chord. For context, the parallel tasks fetch something from an external API, then the callback batch inserts this data into my Postgres database.

    I'm trying to think of how to accomplish something similar with Huey and came up with the following:

    1. Just have each task execute an API call and insert the result into the database individually. This is the simplest but potentially hammers the database if a bunch of API calls return simultaneously.
    2. Have a master task that spawns a bunch of API call tasks as well as another task that checks the result of all those tasks with a backoff delay.

    I'm not really satisfied with either of these approaches and was hoping someone might have a better suggestion.
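
    A rough sketch of option 2 using plain result handles (call_external_api and batch_insert are hypothetical helpers; note that the coordinating task occupies a worker while it waits):

    from huey import RedisHuey

    huey = RedisHuey('my-app')  # or however your huey instance is configured

    @huey.task()
    def fetch_one(url):
        return call_external_api(url)

    @huey.task()
    def fetch_and_store(urls):
        # Fan out one task per URL, block until every result is ready,
        # then write everything to Postgres in a single batch.
        results = [fetch_one(url) for url in urls]
        rows = [r.get(blocking=True) for r in results]
        batch_insert(rows)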

    opened by distefam 25
  • A periodic task which can be scheduled manually

    I have a function that I want to run, say, daily. But I also want to be able to run it on demand.

    How should I organize my code and the task / periodic_task decorators?
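
    One common pattern, sketched with a hypothetical do_work function: keep the body in a regular task so it can be enqueued on demand, and add a thin periodic task that enqueues it on the schedule:

    from huey import RedisHuey, crontab

    huey = RedisHuey('my-app')  # or however your huey instance is configured

    @huey.task()
    def do_work():
        ...  # the actual job; call do_work() anywhere to run it on demand

    @huey.periodic_task(crontab(minute='0', hour='3'))
    def do_work_daily():
        do_work()  # enqueues the same task once a day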

    opened by mindojo-victor 21
  • periodic_task and utc time

    @db_periodic_task(crontab(hour='0', minute='0'))

    With the above decorator and the utc=False option in the consumer settings, the task runs at 2:00 am in my local time. I think that setting utc to False should result in all datetimes being treated as local time (identical to the operating system settings).

    opened by mskrzypek 20
  • Tasks Scheduling Tasks Run Immediately?

    I have a periodic task that runs at 2 minutes past the hour - every hour (see below). It's also triggered by a django management command - which is the issue. The task loader in turn schedules a series of tasks - as below. Unfortunately, when those tasks are scheduled in the future, they execute immediately, rather than being sent off to the task queue. The example below can reproduce this.

    #### foo/tasks.py ####
    import datetime

    from huey import crontab
    from huey.contrib.djhuey import db_periodic_task, db_task

    @db_task()
    def somefunc(val):
        print('hi')

    @db_periodic_task(crontab(minute='2'))
    def load_thing():
        # do things
        future = datetime.datetime.now() + datetime.timedelta(seconds=3600)
        somefunc.schedule(args=(1,), eta=future)

    ### django management command
    from django.core.management.base import BaseCommand
    from foo.tasks import load_thing

    class Command(BaseCommand):
        help = "A script to schedule integration refreshes"

        def handle(self, *args, **kwargs):
            load_thing()
    
    opened by chayim 20
  • Extended djhuey to support multiple huey instances

    Hey guys,

    We use huey with Django in our production system and we need independent huey instances to process different tasks. Unfortunately, the huey Django plugin only supported one huey instance, so I reworked the Django plugin (djhuey).

    What changed

    • HUEY configuration supports multiple huey instances.
    • 'python manage.py run_huey' has an additional parameter to indicate which huey instance it should consume from.
    • This change is backward compatible. Old configurations can still be used. No code has to be changed to use these new features.
    • Updated the docs.

    Configuration example

    HUEY = {
        'my-app': {
            'default': True,  # Indicates the default huey instance
            'backend': 'huey.backends.redis_backend',
            'connection': {'host': 'localhost', 'port': 6379},
            'consumer': {
                'workers': 4,
                'worker_type': 'process',
            },
        },
        'my-app2': {
            'backend': 'huey.backends.sqlite_backend',
            'connection': {'location': 'sqlite filename'},
            'consumer': {
                'workers': 4,
                'worker_type': 'process',
            },
        },
    }
    

    The old configuration style is still valid; it just doesn't support multiple huey instances.

    run_huey example

    python manage.py run_huey --queue my-app2
    python manage.py run_huey  # Runs a consumer on the default huey instance
    

    Define task example

    from huey.contrib.djhuey import task

    @task('my-app2')
    def example_task(param):
        print(param)

    @task()  # Uses the default huey instance.
    def example_task2(param):
        print(param)
    

    We currently use my huey branch in our production system, so it is already quite mature. Don't hesitate to contact me about small changes.

    opened by SeverinAlexB 19
  • Periodic tasks with periods less than a minute

    I'm currently working on a Django project where we need a task queueing system to fetch info from a separate system and store it in our database. We want to fetch this info every five seconds, but it seems one minute is the shortest period Huey supports. Would adding support for shorter periods be possible?
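
    A workaround sketch rather than a built-in feature (fetch_remote_info is hypothetical): keep the crontab at its one-minute floor and fan the five-second work out from inside the periodic task:

    import time

    from huey import RedisHuey, crontab

    huey = RedisHuey('my-app')  # or however your huey instance is configured

    @huey.periodic_task(crontab(minute='*'))
    def poll_every_five_seconds():
        # Runs once per minute and occupies one worker for the whole minute;
        # the timing is approximate.
        for _ in range(12):
            fetch_remote_info()
            time.sleep(5)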

    opened by wldcordeiro 18
  • task not found in TaskRegistry

    I just upgraded to the latest version in master to make use of the fix for closing the db, but it seems huey is no longer finding the tasks after the update for some reason. I was already running a pretty recent version so I'm not sure what could have changed... I upgraded huey and restarted the workers, which should recreate the task registry in Redis again, right?

    QueueException: queuecmd_create_initial_notifications not found in TaskRegistry
      File "huey/bin/huey_consumer.py", line 124, in check_message
        task = self.huey.dequeue()
      File "huey/api.py", line 211, in dequeue
        return registry.get_task_for_message(message)
      File "huey/registry.py", line 70, in get_task_for_message
        klass = self.get_task_class(klass_str)
      File "huey/registry.py", line 60, in get_task_class
        raise QueueException('%s not found in TaskRegistry' % klass_str)
    
    opened by tijs 17
  • Sub 1min intervals

    I may be missing something, but as far as I can see, the cron-style schedule doesn't currently allow anything under one minute. Am I correct, or is there a way to schedule something to run more than once per minute?

    opened by szaydel 17
  • non daemon workers

    What's the main reason the workers are daemonic? Can we have an option to run them non-daemonized? We have some workers that need to do cleanup operations, and not being able to execute atexit handlers, or to guarantee that processing finishes, is causing issues.

    opened by gaborbernat 16
  • Is it difficult to set a timeout for a task?

    Per task and default for all tasks? http://docs.celeryproject.org/en/latest/userguide/workers.html#time-limits https://stackoverflow.com/questions/11672179/setting-time-limit-on-specific-task-with-celery
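
    One Unix-only workaround sketch using signal.alarm, rather than a built-in huey option (it assumes the process worker model, where the task runs in the worker's main thread; fetch is a hypothetical slow call):

    import signal

    class TaskTimeout(Exception):
        pass

    def _raise_timeout(signum, frame):
        raise TaskTimeout('task exceeded its time limit')

    @huey.task(retries=1)
    def slow_task(url):
        signal.signal(signal.SIGALRM, _raise_timeout)
        signal.alarm(30)  # hard limit of 30 seconds
        try:
            return fetch(url)
        finally:
            signal.alarm(0)  # always clear the alarm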

    opened by mindojo-victor 15
Releases (latest: 2.4.2)
  • 2.4.2(Nov 28, 2021)

    • Fix implementation of schedule-pop Lua script so it works with Redis cluster.
    • Ensure Django connections are closed before and after (previously they were only closed after) task execution with db_task() and db_periodic_task().
    • Allow additional lock-names to be specified when flushing locks.

  • 2.4.1(Sep 16, 2021)

    • Attempt to reconnect to database if connection becomes unusable (e.g. due to a server restart). See: huey.contrib.sql_huey.SqlHuey.
    • Do not use a soft file-lock for FileStorage - use fcntl.flock() instead.

  • 2.4.0(Aug 10, 2021)

    • Task expiration: https://huey.readthedocs.io/en/latest/guide.html#task-expiration
    • Add option to make crontab() parsing strict, raising an error if an invalid interval specification is given. You probably want to enable this.
    • Fix bug in the FileStorage dequeue() method, which attempted to unlink an open file.

  • 2.3.2(Apr 20, 2021)

  • 2.3.1(Mar 4, 2021)

    • Add SIGNAL_INTERRUPTED to signal when a task is interrupted when a consumer exits abruptly.
    • Use the Huey.create_consumer() API within the Django management command, to allow Django users to customize the creation of the Consumer instance.

  • 2.3.0(Mar 4, 2021)

    • Use monotonic clock for timing operations within the consumer.
    • Ensure internal state is cleaned up on file-lock when released.
    • Support passing around TaskException as a pickled value.
    • Set the multiprocessing mode to "fork" on MacOS and Python 3.8 or newer.
    • Added option to enforce FIFO behavior when using Sqlite as storage.
    • Added the on_shutdown handler to djhuey namespace.
    • Ensure exception is set on AsyncResult in mini-huey.

  • 2.2.0(Feb 23, 2020)

    • Fix task repr (refs #460).
    • Adds task-id into metadata for task exceptions (refs #461).
    • Ensure database connection is not closed when using the call_local method of Django helper extension db_periodic_task().
    • Allow pickle protocol to be explicitly configured in serializer parameters.
    • Adds FileHuey and full FileStorage implementation.
    • Add shutdown() hook, which will be run in the context of the worker threads/processes during shutdown. This hook can be used to clean-up shared or global resources, for example.
    • Allow pipelines to be chained together. Additionally, support chaining task instances.

  • 2.1.3(Oct 16, 2019)

    • Fix semantics of SIGNAL_COMPLETE so that it is not sent until the result is ready.
    • Use classes for the specific Huey implementations (e.g. RedisHuey) so that it is easier to subclass / extend. Previously we just used a partial application of the constructor, which could be confusing.
    • Fix shutdown logic in consumer when using multiprocess worker model. Previously the consumer would perform a "graceful" shutdown, even when an immediate shutdown was requested (SIGTERM). Also cleans up the signal-handling code and ensures that interrupted tasks log a warning properly to indicate they were interrupted.

  • 2.1.2(Sep 4, 2019)

    • Allow AsyncResult object used in MiniHuey to support the __call__() method to block and resolve the task result.
    • When running the django run_huey management command, the huey loggers will not be configured if another logging handler is already registered to the huey namespace.
    • Added experimental contrib storage engine using Kyoto Tycoon (http://fallabs.com/kyototycoon), which supports task priority and the option to do automatic result expiration. Requires the ukt (https://github.com/coleifer/ukt) python package and a custom Kyoto Tycoon lua script.
    • Allow the Sqlite storage engine busy timeout to be configured when instantiating SqliteHuey.

  • 2.1.1(Aug 7, 2019)

    • Ensure that task()-decorated functions retain their docstrings.
    • Fix logger setup so that the consumer log configuration is only applied to the huey namespace, rather than the root logger.
    • Expose result, signal and disconnect_signal in the Django huey extension.
    • Add SignedSerializer, which signs and validates task messages.
    • Refactor the SqliteStorage so that it can be more easily extended to support other databases.

  • 2.1.0(Jun 6, 2019)

    • Added new contrib module sql_huey, which uses peewee (https://github.com/coleifer/peewee) to provide a storage layer using any of the supported databases (sqlite, mysql or postgresql).
    • Added RedisExpireHuey, which modifies the usual Redis result storage logic to use an expire time for task result values. A consequence of this is that this storage implementation must keep all result keys at the top level of the Redis keyspace. There are some small changes to the storage APIs as well, but they will likely only affect maintainers of alternative storage layers.
    • Also added a PriorityRedisExpireHuey which combines the priority-queue support from PriorityRedisHuey with the result-store expiration mechanism of RedisExpireHuey.
    • Fix gzip compatibility issue when using Python 2.x.
    • Add option to Huey to use zlib as the compression method instead of gzip.
    • Added FileStorageMethods storage mixin, which uses the filesystem for task result-store APIs (put, peek, pop).
    • The storage-specific Huey implementations (e.g. RedisHuey) are no longer subclasses, but instead are partial applications of the Huey constructor.

  • 2.0.1(Apr 3, 2019)

  • 2.0.0(Apr 2, 2019)

    This section describes the changes in the 2.0.0 release. A detailed list of changes can be found here: https://huey.readthedocs.io/en/latest/changes.html

    Overview of changes:

    • always_eager mode has been renamed to immediate mode. Unlike previous versions, immediate mode involves the same code paths used by the consumer process. This makes it easier to test features like task revocation and task scheduling without needing to run a dedicated consumer process. Immediate mode uses an in-memory storage layer by default, but can be configured to use "live" storage like Redis or Sqlite.
    • The events stream API has been removed in favor of simpler callback-driven signals APIs. These callbacks are executed synchronously within the huey consumer process.
    • A new serialization format is used in 2.0.0; however, for backwards compatibility, consumers running 2.0 will continue to be able to read and deserialize messages enqueued by Huey version 1.11.0.
    • Support for task priorities.
    • New Serializer abstraction allows users to customize the serialization format used when reading and writing tasks.
    • Huey consumer and scheduler can be more easily run within the application process, if you prefer not to run a separate consumer process.
    • Tasks can now specify an on_error handler, in addition to the previously-supported on_complete handler.
    • Task pipelines return a special ResultGroup object which simplifies reading the results of a sequence of task executions.
    • SqliteHuey has been promoted out of contrib, onto an equal footing with RedisHuey. To simplify deployment, the dependency on peewee was removed and the Sqlite storage engine uses the Python sqlite3 driver directly.
  • 1.11.0(Feb 16, 2019)

    Backwards-incompatible changes

    Previously, it was possible for certain tasks to be silently ignored if a task with that name already existed in the registry. To fix this, I have made two changes:

    1. The task-name, when serialized, now consists of the task module and the name of the decorated function. So, "queue_task_foo" becomes "myapp.tasks.foo".
    2. An exception will be raised when attempting to register a task function with the same module + name.

    Together, these changes are intended to fix problems described in #386.

    Because these changes will impact the serialization (and deserialization) of messages, it is important that you consume all tasks (including scheduled tasks) before upgrading.

    Always-eager mode changes

    In order to provide a more consistent API, tasks enqueued using always_eager mode will now return a dummy TaskResultWrapper implementation that wraps the return value of the task. This change is designed to provide the same API for reading task result values, regardless of whether you are using always-eager mode or not.

    Previously, tasks executed with always_eager would return the Python value directly from the task. When using Huey with the consumer, though, task results are not available immediately, so a special wrapper TaskResultWrapper is returned, which provides helper methods for retrieving the return value of the task. Going forward, always_eager tasks will return EagerTaskResultWrapper, which implements the same get() API that is typically used to retrieve task return values.

  • 1.10.5(Dec 19, 2018)

  • 1.10.4(Nov 14, 2018)

    • Log time taken to execute tasks at default log level.
    • Fix missing import in SQLite storage backend.
    • Small refactoring in Redis storage backend to make it easier to override the driver / client implementation.
    • Fix failing tests for simpledb storage backend.
  • 1.10.3(Oct 10, 2018)

    • Fixed regression where in always eager mode exceptions within tasks were being swallowed instead of raised.
    • Added an API for registering hooks to run when each worker process starts-up. This simplifies creating global/process-wide shared resources, such as a connection pool or database client. Documentation.
  • 1.10.2(Oct 10, 2018)

  • 1.10.1(Jul 24, 2018)

    • Remove call to SimpleDB Client.connect(), as the simpledb APIs have changed and no longer use this method.
    • Ensure that pre- and post-execute hooks are run when using Huey in "always_eager" mode.
    • Gracefully stop Huey consumer when SIGINT is received.
    • Improved continuous integration, now testing on Python 3.7 as well.
  • 1.10.0(May 30, 2018)

    • Ensure that the default SIGINT handler is registered. This fixes an edge-case that arises when the consumer is run without job control, which causes interrupt signals to be ignored.
    • Restarts (SIGHUP) are now graceful by default.
  • 1.9.1(Apr 4, 2018)

    • Ensure the scheduler loop does not drift (fixes #304).
    • Add TaskResultWrapper.reset() to enable resetting the results of tasks that failed and are subsequently being retried.
    • Allow task-decorated functions to be also decorated as periodic tasks.
  • 1.9.0(Mar 12, 2018)

    ROLLBACK of 1.8.0 Django Changes

    Due to problems with the Django patch that added support for multiple huey instances, I've decided to roll back those changes.

    Django integration in Huey 1.9.0 will work the same as it had previously in 1.7.x and earlier.

    Apologies, I should have reviewed the patch more thoroughly and insisted on better test coverage.

  • 1.8.0(Mar 9, 2018)

    Backwards-incompatible change to Django integration

    In 1.8.0, support for multiple huey instances was added (with thanks to @Sebubu and @MarcoGlauser for the patches). Although existing Django/Huey apps should continue to work, there is a new configuration format available and I'd recommend that you take a look at the docs and switch over to it:

    Django integration documentation

  • 1.7.0(Feb 7, 2018)

    Backwards-incompatible change

    Previous versions of huey would store the traceback and associated metadata for a failed task within the result_store, regardless of whether store_errors was true or not. As of 1.7.0, task exceptions will only be stored in the result store if store_errors is True. See #290 for discussion.

  • 1.6.1(Feb 7, 2018)

  • 1.6.0(Jan 12, 2018)

    • Support for task pipelining and task function partials
    • Support for triggering task retries using RetryTask exception.
    • Support for task locking, restricting concurrency of a given task.
    • Getting result of task that failed with an exception results in a TaskException being raised.
    • Updated health check to ensure the task scheduler is always running.
    • Refactor implementation of task() and periodic_task() decorators, which should have the added benefit of making them easier to extend.
    • Refactored result-store APIs to simplify serialization / deserialization logic.
    • Fixed bug in serialization of task exceptions.
    • Added simple client/server implementation for testing locally. Blog post on the subject.
  • 1.5.6(Jan 12, 2018)

    • Allow arbitrary settings to be specified in task() decorators.
    • New task name format includes function module as part of task name.
    • Fix for operating systems that do not implement SIGHUP.
    • Fix bug in contrib.minimal task scheduler timing.
  • 1.5.5(Nov 2, 2017)

  • 1.5.4(Oct 23, 2017)

  • 1.5.3(Oct 22, 2017)
