Asynchronous tasks in Python with Celery + RabbitMQ + Redis

Overview

python-asynchronous-tasks is a small example project that demonstrates how to run asynchronous tasks in Python with Celery, backed by RabbitMQ and Redis.

Setup & Installation

Create a virtual environment and install the dependencies:

$ python -m venv venv
$ source venv/bin/activate

$ pip install -r requirements.txt
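
The dependencies themselves live in the repository's requirements.txt; as a rough sketch, it presumably pins at least the Celery package and the Redis client used for the result backend:

# requirements.txt (assumed contents, for illustration only)
celery
redis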

Start the Redis and RabbitMQ services with docker-compose:

$ docker-compose up -d

Verify that the services are up and running:

$ docker ps

# Expected output
27d1b8ced89a   rabbitmq:latest   "docker-entrypoint.s…"   8 minutes ago   Up 8 minutes   4369/tcp, 5671/tcp, 15691-15692/tcp, 25672/tcp, 0.0.0.0:5672->5672/tcp, :::5672->5672/tcp   python-async-tasks_rabbitmq_1
17b72947fb8a   redis:latest      "docker-entrypoint.s…"   8 minutes ago   Up 8 minutes   0.0.0.0:6379->6379/tcp, :::6379->6379/tcp                                                   python-async-tasks_redis_1
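
The docker-compose.yml ships with the repository; a minimal sketch consistent with the ports shown above (5672 for RabbitMQ, 6379 for Redis) would look roughly like:

# docker-compose.yml (illustrative sketch, not the repository's exact file)
version: "3"
services:
  rabbitmq:
    image: rabbitmq:latest
    ports:
      - "5672:5672"
  redis:
    image: redis:latest
    ports:
      - "6379:6379"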

Usage

Spin up a new terminal and start the celery worker:

$ celery -A tasks worker -l info --pool=solo
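
The worker loads its tasks from tasks.py. A minimal sketch of what that module might look like (the broker/backend URLs and the sleep duration are assumptions, based on the exposed ports and the ~5 second runtime shown in the worker log below):

# tasks.py (illustrative sketch; the actual module in the repository may differ)
import time

from celery import Celery

# Assumed connection strings: RabbitMQ as the message broker, Redis as the result backend.
app = Celery(
    "tasks",
    broker="amqp://guest:guest@localhost:5672//",
    backend="redis://localhost:6379/0",
)

@app.task
def say_hello(name: str) -> str:
    time.sleep(5)  # simulate a slow job
    return f"Hello {name}"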

Send a task to the Celery worker to verify that it's working as expected: open a new terminal, start the Python interactive shell, import the say_hello function from tasks.py, and call it with delay():

$ python
>>> from tasks import say_hello
>>> say_hello.delay("Valon")
<AsyncResult: 5711efdc-17c7-4f02-bfba-d193c3d28c42>

In the terminal where we started the Celery worker, we should expect output similar to:

[INFO/MainProcess] Received task: tasks.say_hello[5711efdc-17c7-4f02-bfba-d193c3d28c42]
[INFO/MainProcess] Task tasks.say_hello[5711efdc-17c7-4f02-bfba-d193c3d28c42] succeeded in 5.01600000000326s: 'Hello Valon'
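
Assuming Redis is acting as the result backend (its usual role in a Celery + RabbitMQ + Redis setup), the AsyncResult returned by delay() can also be used to collect the return value from the interactive shell:

>>> result = say_hello.delay("Valon")
>>> result.get(timeout=10)  # blocks until the task finishes or the timeout expires
'Hello Valon'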

What to do next?

Just expand your usage of Celery, define your own tasks, and keep doing amazing stuff :)
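
For instance, a hypothetical extra task could reuse the same app object (assuming tasks.py exposes it as in the sketch above):

from tasks import app  # hypothetical import; assumes app is defined in tasks.py

@app.task
def add(x, y):
    # Another trivial task; queue it with add.delay(2, 3)
    return x + y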

Contact

If you have any questions regarding this topic or anything else, feel free to reach out to me:

License

This project is licensed under the terms of the MIT license.
