Records: SQL for Humans™


Records is a very simple, but powerful, library for making raw SQL queries to most relational databases.


Just write SQL. No bells, no whistles. This common task can be surprisingly difficult with the standard tools available. This library strives to make this workflow as simple as possible, while providing an elegant interface to work with your query results.

Database support includes RedShift, Postgres, MySQL, SQLite, Oracle, and MS-SQL (drivers not included).
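
Connection strings are standard SQLAlchemy database URLs. As an illustration (hosts, credentials, and database names here are placeholders, and the matching driver must be installed):

import records

db = records.Database('sqlite:///:memory:')                        # built-in sqlite3
db = records.Database('postgres://user:pass@localhost/mydb')       # e.g. psycopg2
db = records.Database('mysql+pymysql://user:pass@localhost/mydb')  # e.g. PyMySQL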


☤ The Basics

We know how to write SQL, so let's send some to our database:

import records

db = records.Database('postgres://...')
rows = db.query('select * from active_users')    # or db.query_file('sqls/active-users.sql')

Grab one row at a time:

>>> rows[0]
<Record {"username": "model-t", "active": true, "name": "Henry Ford", "user_email": "[email protected]", "timezone": "2016-02-06 22:28:23.894202"}>

Or iterate over them:

for r in rows:
    print(r.name, r.user_email)

Values can be accessed many ways: row.user_email, row['user_email'], or row[3].

Fields with non-alphanumeric characters (like spaces) are also fully supported.
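
Putting those access styles together (continuing with the row above; the spaced column name at the end is hypothetical):

>>> row = rows[0]
>>> row.user_email
'model-t@gmail.com'
>>> row['user_email']
'model-t@gmail.com'
>>> row[3]
'model-t@gmail.com'
>>> # a hypothetical column named "signup date" would be reached by key:
>>> # row['signup date']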

Or store a copy of your record collection for later reference:

>>> rows.all()
[<Record {"username": ...}>, <Record {"username": ...}>, <Record {"username": ...}>, ...]

If you're only expecting one result:

>>> rows.first()
<Record {"username": ...}>

Other options include rows.as_dict() and rows.as_dict(ordered=True).
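
For instance, as_dict() gives you plain dictionaries rather than Record objects (continuing the example above):

>>> rows.as_dict()
[{'username': 'model-t', 'active': True, 'name': 'Henry Ford', 'user_email': 'model-t@gmail.com', 'timezone': '2016-02-06 22:28:23.894202'}, ...]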

☤ Features

  • Iterated rows are cached for future reference.
  • $DATABASE_URL environment variable support.
  • Convenience Database.get_table_names method.
  • Command-line records tool for exporting queries.
  • Safe parameterization: Database.query('life=:everything', everything=42).
  • Queries can be passed as strings or filenames, parameters supported.
  • Transactions: t = Database.transaction(); t.commit().
  • Bulk actions: Database.bulk_query() & Database.bulk_query_file().
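
Several of the features above can be combined in a few lines. A minimal sketch (the users table and its columns are made up for illustration):

import records

# With no URL given, Records falls back to the $DATABASE_URL environment variable.
db = records.Database()

# Safe parameterization: :min_age is bound by the driver, never string-formatted.
adults = db.query('select * from users where age >= :min_age', min_age=18)

# Bulk insert a list of parameter dicts in a single call.
db.bulk_query('insert into users (name, age) values (:name, :age)',
              [{'name': 'Ada', 'age': 36}, {'name': 'Tim', 'age': 61}])

# Convenience introspection and transactions, as listed above.
print(db.get_table_names())
t = db.transaction()
t.commit()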

Records is proudly powered by SQLAlchemy and Tablib.

☤ Data Export Functionality

Records also features full Tablib integration, and allows you to export your results to CSV, XLS, JSON, HTML Tables, YAML, or Pandas DataFrames with a single line of code. Excellent for sharing data with friends, or generating reports.

>>> print(rows.dataset)
username|active|name      |user_email       |timezone
--------|------|----------|-----------------|--------------------------
model-t |True  |Henry Ford|model-t@gmail.com|2016-02-06 22:28:23.894202
...

Comma Separated Values (CSV)

>>> print(rows.export('csv'))
username,active,name,user_email,timezone
model-t,True,Henry Ford,model-t@gmail.com,2016-02-06 22:28:23.894202
...

YAML Ain't Markup Language (YAML)

>>> print(rows.export('yaml'))
- {active: true, name: Henry Ford, timezone: '2016-02-06 22:28:23.894202', user_email: model-t@gmail.com, username: model-t}
...

JavaScript Object Notation (JSON)

>>> print(rows.export('json'))
[{"username": "model-t", "active": true, "name": "Henry Ford", "user_email": "[email protected]", "timezone": "2016-02-06 22:28:23.894202"}, ...]

Microsoft Excel (xls, xlsx)

with open('report.xls', 'wb') as f:
    f.write(rows.export('xls'))

Pandas DataFrame

>>> rows.export('df')
    username  active       name        user_email                   timezone
0    model-t    True Henry Ford model-t@gmail.com 2016-02-06 22:28:23.894202

You get the point. All other features of Tablib are also available, so you can sort results, add/remove columns/rows, remove duplicates, transpose the table, add separators, slice data by column, and more.
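
For example, the Tablib Dataset behind rows.dataset can be reshaped before exporting. A small sketch (exact capabilities depend on the Tablib version installed):

data = rows.dataset
data = data.sort('name')      # sort rows by the 'name' column
del data['timezone']          # drop a column by header
data.remove_duplicates()      # de-duplicate rows in place
print(data.csv)               # then export as usual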

See the Tablib Documentation for more details.

☤ Installation

Of course, the recommended installation method is pipenv:

$ pipenv install records[pandas]
✨🍰✨

☤ Command-Line Tool

As an added bonus, a records command-line tool is automatically included. Here's a screenshot of the usage information:

[Screenshot of the Records command-line interface.]
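
A typical invocation looks something like this (a sketch; run records -h for the exact options your version supports — the query and output format are positional arguments, and the database URL comes from --url or $DATABASE_URL):

$ records 'select * from active_users' csv --url=postgres://... > active_users.csv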

☤ Thank You

Thanks for checking this library out! I hope you find it useful.

Of course, there's always room for improvement. Feel free to open an issue so we can make Records better, stronger, faster.
