Asynchronous Python HTTP Requests for Humans using Twisted

Overview

Asynchronous Python HTTP Requests for Humans


A small add-on for the Python requests HTTP library. It makes use of Twisted's ThreadPool, so that the requests API returns Deferreds.

The additional API and changes are minimal and strive to avoid surprises.

The following synchronous code:

from requests import Session

session = Session()
# first requests starts and blocks until finished
response_one = session.get('http://httpbin.org/get')
# second request starts once first is finished
response_two = session.get('http://httpbin.org/get?foo=bar')
# both requests are complete
print('response one status: {0}'.format(response_one.status_code))
print(response_one.content)
print('response two status: {0}'.format(response_two.status_code))
print(response_two.content)

Can be translated to make use of Deferreds, and thus be asynchronous, by creating a txrequests Session and handling the returned Deferred in place of the Response. The Response can be retrieved by adding a callback to the Deferred or, as below, by yielding it inside a function decorated with inlineCallbacks:

from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def main():
    # use the with statement to clean up the session's threadpool and connection pool after use
    # (or call session.close() explicitly if you keep the session around long term)
    with Session() as session:
        # first request is started in background
        d1 = session.get('http://httpbin.org/get')
        # second request is started immediately
        d2 = session.get('http://httpbin.org/get?foo=bar')
        # wait for the first request to complete, if it hasn't already
        response_one = yield d1
        print('response one status: {0}'.format(response_one.status_code))
        print(response_one.content)
        # wait for the second request to complete, if it hasn't already
        response_two = yield d2
        print('response two status: {0}'.format(response_two.status_code))
        print(response_two.content)
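
To actually run the example, the Deferred returned by main() has to be driven by the reactor. One way to do this (a minimal sketch, not part of the example above) is twisted.internet.task.react, which starts the reactor, calls the given function, and stops the reactor once the resulting Deferred fires:

from twisted.internet import task

# react() passes the reactor to the callable and shuts everything down
# once the returned Deferred (produced here by inlineCallbacks) fires
task.react(lambda reactor: main())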

By default a ThreadPool is created with at most 4 workers. If you would like to adjust that value, or share a thread pool across multiple sessions, you can provide one to the Session constructor.

from twisted.python.threadpool import ThreadPool
from txrequests import Session

session = Session(pool=ThreadPool(maxthreads=10))
# ...
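
Because the pool is passed in explicitly, the same ThreadPool instance can also be shared between several sessions. A rough sketch of that usage (the session names are only illustrative):

from twisted.python.threadpool import ThreadPool
from txrequests import Session

# one pool, reused by two independent sessions
shared_pool = ThreadPool(maxthreads=10)
session_a = Session(pool=shared_pool)
session_b = Session(pool=shared_pool)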

As a shortcut, if you only want to change the number of workers, you can pass minthreads and/or maxthreads directly to the Session constructor:

from txrequests import Session
session = Session(maxthreads=10)

That's it. The API of requests.Session is preserved without any modifications beyond returning a Deferred rather than a Response. Exceptions raised while performing a request are delivered to the Deferred's errback.
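
For example, a failed request surfaces as an exception when the Deferred is yielded inside an inlineCallbacks function, so it can be handled with an ordinary try/except (a minimal sketch; the unreachable URL is only illustrative):

from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def fetch():
    with Session() as session:
        try:
            # an exception raised in the worker thread arrives here
            # when the Deferred is yielded
            response = yield session.get('http://localhost:1/unreachable')
        except Exception as e:
            print('request failed: {0}'.format(e))
        else:
            print('status: {0}'.format(response.status_code))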

Working in the Background

There is one additional parameter to the various request functions, background_callback, which lets you work with the Response object in the background thread. This can be useful for shifting work out of the foreground; a simple example is JSON parsing.

from pprint import pprint
from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def main():
    with Session() as session:

        def bg_cb(sess, resp):
            # parse the json storing the result on the response object
            resp.data = resp.json()
            return resp

        d = session.get('http://httpbin.org/get', background_callback=bg_cb)
        # do some other stuff, send some more requests while this one works
        response = yield d
        print('response status {0}'.format(response.status_code))
        # data will have been attached to the response object in the background
        pprint(response.data)

Installation

pip install txrequests

Credits

txrequests is based on requests-futures, by Ross McFarland.
