Overview

s3-credentials

A tool for creating credentials for accessing S3 buckets

For project background, see s3-credentials: a tool for creating credentials for S3 buckets on my blog.

⚠️ Warning

I am not an AWS security expert. You should review how this tool works carefully before using it against your own AWS account.

If you are an AWS security expert I would love to get your feedback!

Installation

Install this tool using pip:

$ pip install s3-credentials

Configuration

This tool uses boto3 under the hood, which supports a number of different ways of providing your AWS credentials. If you have an existing ~/.aws/config or ~/.aws/credentials file the tool will use that; otherwise you can set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables before calling this tool.
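
For example, using environment variables (the values shown are placeholders):

export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"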

Usage

The s3-credentials create command is the core feature of this tool. Pass it one or more S3 bucket names and it will create a new user with permission to access just those specific buckets, then create access credentials for that user and output them to your console.

Make sure to record the SecretAccessKey because it will only be displayed once and cannot be recreated later on.

In this example I create credentials for reading and writing files in my static.niche-museums.com S3 bucket:

% s3-credentials create static.niche-museums.com

Created user: s3.read-write.static.niche-museums.com with permissions boundary: arn:aws:iam::aws:policy/AmazonS3FullAccess
Attached policy s3.read-write.static.niche-museums.com to user s3.read-write.static.niche-museums.com
Created access key for user: s3.read-write.static.niche-museums.com
{
    "UserName": "s3.read-write.static.niche-museums.com",
    "AccessKeyId": "AKIAWXFXAIOZOYLZAEW5",
    "Status": "Active",
    "SecretAccessKey": "...",
    "CreateDate": "2021-11-03 01:38:24+00:00"
}

The command has several additional options:

  • --username TEXT: The username to use for the user that is created by the command (or the username of an existing user if you do not want to create a new one). If omitted a default such as s3.read-write.static.niche-museums.com will be used.
  • -c, --create-bucket: Create the buckets if they do not exist. Without this any missing buckets will be treated as an error.
  • --read-only: The user should only be allowed to read files from the bucket.
  • --write-only: The user should only be allowed to write files to the bucket, but not read them. This is useful for logging use-cases.
  • --bucket-region: If creating buckets, the region in which they should be created.
  • --silent: Don't output details of what is happening, just output the JSON for the created access credentials at the end.
  • --user-permissions-boundary: Custom permissions boundary to use for users created by this tool. This will default to restricting those users to only interacting with S3, taking the --read-only option into account. Use none to create users without any permissions boundary at all.

Here's the full sequence of events that take place when you run this command:

  1. Confirm that each of the specified buckets exists. If they do not and --create-bucket was passed, create them; otherwise exit with an error.
  2. If a username was not specified, determine a username using the s3.$permission.$buckets format.
  3. If a user with that username does not exist, create one with an S3 permissions boundary that respects the --read-only option - unless --user-permissions-boundary=none was passed (or a custom permissions boundary string).
  4. For each specified bucket, add an inline IAM policy to the user that gives them permission to either read-only, write-only or read-write against that bucket.
  5. Create a new access key for that user and output the key and its secret to the console.
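
As an illustration, steps 3-5 correspond roughly to the boto3 calls sketched below. This is a simplified sketch, not the tool's exact implementation: the bucket and user names are hypothetical, the inline policy mirrors the read-write policy shown under list-user-policies below, and error handling, the --read-only/--write-only variants and bucket creation are omitted.

import json

import boto3

iam = boto3.client("iam")

bucket = "my-bucket"  # hypothetical bucket name
username = "s3.read-write.my-bucket"  # default naming scheme: s3.$permission.$buckets

# Step 3: create the user with an S3 permissions boundary
iam.create_user(
    UserName=username,
    PermissionsBoundary="arn:aws:iam::aws:policy/AmazonS3FullAccess",
)

# Step 4: attach an inline policy scoped to just this bucket
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}"],
        },
        {
            "Effect": "Allow",
            "Action": "s3:*Object",
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        },
    ],
}
iam.put_user_policy(
    UserName=username, PolicyName=username, PolicyDocument=json.dumps(policy)
)

# Step 5: create an access key and output it (CreateDate needs default=str)
print(json.dumps(iam.create_access_key(UserName=username)["AccessKey"], default=str))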

Other commands

whoami

To see which user you are authenticated as:

s3-credentials whoami

This will output JSON representing the currently authenticated user.
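
This command uses sts.GetCallerIdentity (see the 0.6 release notes below), so the output should look something like this (values are illustrative):

{
    "UserId": "AIDAEXAMPLEUSERID",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/your-username"
}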

list-users

To see a list of all users that exist for your AWS account:

s3-credentials list-users

This will return pretty-printed JSON objects by default.

Add --nl to collapse these to single lines as valid newline-delimited JSON.

Add --array to output a valid JSON array of objects instead.
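
For example, --nl output puts one user object on each line (records are illustrative):

{"UserName": "s3.read-write.bucket-one", "UserId": "..."}
{"UserName": "s3.read-write.bucket-two", "UserId": "..."}

With --array the same objects are wrapped in [ and ] with separating commas, so the whole output parses as a single JSON document.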

list-buckets

Shows a list of all buckets in your AWS account.

s3-credentials list-buckets

Accepts the same --nl and --array options as list-users.

list-user-policies

To see a list of inline policies belonging to users:

% s3-credentials list-user-policies s3.read-write.static.niche-museums.com

User: s3.read-write.static.niche-museums.com
PolicyName: s3.read-write.static.niche-museums.com
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListObjectsInBucket",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::static.niche-museums.com"
            ]
        },
        {
            "Sid": "AllObjectActions",
            "Effect": "Allow",
            "Action": "s3:*Object",
            "Resource": [
                "arn:aws:s3:::static.niche-museums.com/*"
            ]
        }
    ]
}

You can pass any number of usernames here. If you don't specify a username the tool will loop through every user belonging to your account:

s3-credentials list-user-policies
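
A rough sketch of what that loop involves in boto3 (not the tool's exact implementation):

import boto3

iam = boto3.client("iam")

# Page through every user in the account
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        username = user["UserName"]
        # List each user's inline policies and fetch their documents
        for name in iam.list_user_policies(UserName=username)["PolicyNames"]:
            policy = iam.get_user_policy(UserName=username, PolicyName=name)
            print(username, name, policy["PolicyDocument"])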

delete-user

In trying out this tool it's possible you will create several different user accounts that you later decide to clean up.

Deleting AWS users is a little fiddly: you first need to delete their access keys, then their inline policies and finally the user themselves.

The s3-credentials delete-user command handles this for you:

% s3-credentials delete-user s3.read-write.simonw-test-bucket-10
User: s3.read-write.simonw-test-bucket-10
  Deleted policy: s3.read-write.simonw-test-bucket-10
  Deleted access key: AKIAWXFXAIOZK3GPEIWR
  Deleted user

You can pass it multiple usernames to delete multiple users at a time.
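
For reference, the underlying cleanup sequence looks roughly like this in boto3 (a sketch, not the exact implementation):

import boto3

iam = boto3.client("iam")
username = "s3.read-write.simonw-test-bucket-10"  # example name from above

# 1. Delete the user's access keys
for key in iam.list_access_keys(UserName=username)["AccessKeyMetadata"]:
    iam.delete_access_key(UserName=username, AccessKeyId=key["AccessKeyId"])

# 2. Delete the user's inline policies
for name in iam.list_user_policies(UserName=username)["PolicyNames"]:
    iam.delete_user_policy(UserName=username, PolicyName=name)

# 3. Finally, delete the user itself
iam.delete_user(UserName=username)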

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:

cd s3-credentials
python -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest

Comments
  • `s3-credentials create` command

    This is the command which creates a user and returns credentials for a specified bucket, optionally also creating the bucket.

    See initial design notes in #1.

    enhancement 
    opened by simonw 22
  • Standard default output should be a valid JSON array

    I just spotted list-buckets has the same not-quite-newline-delimited JSON output format, which is a bad default. I should fix that too.

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/issues/28#issuecomment-1014838721

    enhancement 
    opened by simonw 15
  • Research creating expiring credentials using `sts.assume_role()`

    The initial reason for creating this tool was that I wanted to be able to create long-lived (never expiring) tokens for the kinds of use-cases described in this post: https://simonwillison.net/2021/Nov/3/s3-credentials/

    Expiring credentials are fantastic for all sorts of other use-cases. It would be great if this tool could optionally create those instead of creating long-lived credentials.

    This would mean the tool didn't have to create users at all (when used in that mode) - it could create a role and then create temporary access credentials for that role using sts.assume_role(): https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role
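
    For illustration, creating expiring credentials with sts.assume_role() looks roughly like this (the role ARN and session name are hypothetical):

    import boto3

    sts = boto3.client("sts")
    response = sts.assume_role(
        RoleArn="arn:aws:iam::123456789012:role/s3-read-write-example",  # hypothetical
        RoleSessionName="s3-credentials-session",
        DurationSeconds=3600,  # credentials expire after an hour
    )
    creds = response["Credentials"]
    # creds contains AccessKeyId, SecretAccessKey, SessionToken and an Expiration timestamp
    print(creds["AccessKeyId"], creds["Expiration"])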

    enhancement research 
    opened by simonw 15
  • `s3-credentials put-objects` command

    It's frustrating when using s3-credentials put-object that you have to specify the key name each time, rather than deriving that from the filename:

    s3-credentials put-object simonwillison-cors-allowed-public \
      click_default_group-1.2.2-py3-none-any.whl \
      /tmp/click-default-group/dist/click_default_group-1.2.2-py3-none-any.whl
    

    One way to fix this would be with a s3-credentials put-objects which works like this:

    s3-credentials put-objects simonwillison-cors-allowed-public /tmp/click-default-group/dist/click_default_group-1.2.2-py3-none-any.whl
    

    It could accept multiple files (hence the plural name) and could also accept directories and recursively upload their contents.

    enhancement 
    opened by simonw 13
  • Make it easier to add extra policy statements

    The current --policy option lets you set a custom policy, but leaves it to you to define one.

    I find myself wanting to mix in the following to the policy that I use, for s3-ocr:

    https://docs.aws.amazon.com/textract/latest/dg/security_iam_id-based-policy-examples.html#security_iam_async-actions

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "textract:StartDocumentTextDetection",
                    "textract:StartDocumentAnalysis",
                    "textract:GetDocumentTextDetection",
                    "textract:GetDocumentAnalysis"
                ],
                "Resource": "*"
            }
        ]
    }
    

    Would be nice if there was a neat way to do this.

    enhancement research 
    opened by simonw 10
  • Mechanism for running tests against a real AWS account

    The tests for this project currently run against mocks - which is good, because I don't like the idea of GitHub Action tests hitting real APIs.

    But... this project is about building securely against AWS. As such, automated tests that genuinely exercise a live AWS account (and check that the resulting permissions behave as expected) would be incredibly valuable for growing my confidence that this tool works as advertised.

    These tests would need quite a high level of administrative access, because they need to be able to create users, roles etc.

    I don't like the idea of storing my own AWS administrator account credentials in a GitHub Actions secret though. I think I'll write these tests such that they can be run outside of GitHub Actions, maybe configured via environment variables that allow other project contributors to run tests against their own accounts.

    tests 
    opened by simonw 10
  • Stop using action wildcards and start explicitly listing permissions

    See https://github.com/simonw/s3-credentials/issues/11#issuecomment-959844042 for context.

    The read-write policy currently uses "Action": "s3:*Object" - and the read-only one uses "Action": "s3:GetObject*".

    This is pretty gross - surely explicitly listing the allowed actions is better practice?

    • [x] #23
    • [x] #24
    • [x] #25
    research 
    opened by simonw 10
  • Support configuring the bucket as a website

    It would be useful to have an opt-in option for saying "this bucket should be configured as a website" - because setting that up without a tool is quite fiddly.

    https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteAccessPermissionsReqd.html has the details:

    When you configure a bucket as a static website, if you want your website to be public, you can grant public read access. To make your bucket publicly readable, you must disable block public access settings for the bucket and write a bucket policy that grants public read access.

    See #20 for "block public access" setting, and #19 for bucket policies.

    enhancement 
    opened by simonw 9
  • Work-in-progress create command

    Refs #3. This is implemented... but it doesn't seem to work - when I copy and paste the credentials into Transmit it refuses to connect.

    • [x] Get it working
    • [x] Add tests
    • [x] Add documentation
    opened by simonw 8
  • Manually test --prefix against litestream.io

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/pull/39#issuecomment-1014857276

    Splitting this into a separate issue mainly so I can clearly document how to use Litestream in the comments here.

    Goal is to confirm that S3 credentials created using s3-credentials create ... --prefix litestream-test/ can be used with Litestream to back up a SQLite database to that path within the bucket.

    research tests 
    opened by simonw 7
  • Apply jdub policy suggestions

    https://github.com/simonw/s3-credentials/blob/main/s3_credentials/policies.py

    My suggestions:

    • specify individual actions explicitly (no wildcards)
    • separate permissions by resource (Buckets vs. Objects)
    • Sid is unnecessary

    Your read/write policy is good, but instead of *Object, list GetObject and PutObject.

    Your read-only policy would be better written like your read/write policy, one section for the bucket permission (ListBucket), one for the object permission (which should be GetObject, no wildcard).

    Your write-only policy is great as is.

    You may want to add additional permissions to let clients set ACLs. But if it's all simple object-by-object stuff, these very simple policies are great.

    Originally posted by @jdub in https://github.com/simonw/s3-credentials/issues/7#issuecomment-958651592

    enhancement research 
    opened by simonw 7
  • Add s3:PutObjectAcl to write policies

    This came up here:

    • https://github.com/simonw/public-notes/issues/9#issuecomment-1328567164

    It turned out django-storages needs a write policy that includes s3:PutObjectAcl: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#iam-policy

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObjectAcl",
                    "s3:GetObject",
                    "s3:ListBucket",
                    "s3:DeleteObject",
                    "s3:PutObjectAcl"
                ],
                "Principal": {
                    "AWS": "arn:aws:iam::example-AWS-account-ID:user/example-user-name"
                },
                "Resource": [
                    "arn:aws:s3:::example-bucket-name/*",
                    "arn:aws:s3:::example-bucket-name"
                ]
            }
        ]
    }
    

    Looks like I should add s3:GetObjectAcl to the default read policies too.

    enhancement 
    opened by simonw 3
  • Add the options to add tags to the created resources

    Hello! I am looking into using s3-credentials for my projects. I use tags to identify resources in different ways, like how they were created or the project they belong to. I was wondering if there is planned support for adding tags to the created resources, or if you would be open to a contribution in that area.

    enhancement 
    opened by threkk 3
  • `get-objects/put-objects` `--skip` and `--skip-hash` options

    Idea:

    • --skip to skip downloading a file if it already exists with the same filename
    • --skip-hash to skip downloading a file if it already exists AND the MD5 hash has not changed (more expensive as it needs to calculate the local hash)
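
    A sketch of the comparison --skip-hash implies, assuming a single-part upload so that the S3 ETag is the object's MD5 (the helper name is hypothetical):

    import hashlib

    import boto3

    s3 = boto3.client("s3")

    def should_skip(bucket, key, path):
        # For non-multipart uploads the ETag is the hex MD5 digest of the object
        etag = s3.head_object(Bucket=bucket, Key=key)["ETag"].strip('"')
        with open(path, "rb") as f:
            local_md5 = hashlib.md5(f.read()).hexdigest()
        return etag == local_md5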

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/issues/78#issuecomment-1248398247

    enhancement 
    opened by simonw 1
  • Provide a `--profile` option to allow AWS profile selection

    Users with multiple AWS accounts can declare named profiles to manage the different sets of credentials/regions. It would be ideal if s3-credentials accepted a --profile argument, just like the aws command line tool.

    enhancement 
    opened by nk9 3
  • Bad session token error masked if not creating a new bucket

    If a bad or expired session token is set and --create-bucket isn't specified, the create command fails with a misleading error claiming the bucket doesn't exist. If --create-bucket is specified, a traceback with more information is given instead:

    export AWS_ACCESS_KEY_ID="..."
    export AWS_SECRET_ACCESS_KEY="..."
    export AWS_SESSION_TOKEN="EXPIRED_TOKEN" 
    
    $ s3-credentials create --username USERNAME BUCKET
    Error: Bucket does not exist: BUCKET - try --create-bucket to create it
    
    $ s3-credentials create --create-bucket --username USERNAME BUCKET
    Traceback (most recent call last):
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/bin/s3-credentials", line 8, in <module>
        sys.exit(cli())
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/s3_credentials/cli.py", line 314, in create
        s3.create_bucket(Bucket=bucket, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/botocore/client.py", line 391, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/botocore/client.py", line 719, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the CreateBucket operation: The provided token has expired.
    
    bug 
    opened by kimvanwyk 3
Releases
  • 0.14(Sep 15, 2022)

    • s3-credentials put-objects command (docs) for uploading more than one file or directory to an S3 bucket at a time. #68
    • s3-credentials get-objects command (docs) for downloading multiple files from an S3 bucket. #78
  • 0.13(Aug 12, 2022)

    • Documentation now lives on a dedicated documentation website: https://s3-credentials.readthedocs.io/ #71
    • s3-credentials create ... --website --create-bucket now creates an S3 bucket that is configured to act as a website, with index.html as the index page and error.html as the page used for any errors. #21
    • s3-credentials list-buckets --details now returns the bucket region and the URL to the website, if it is configured to act as a website. #77
    • Fixed a bug where list-bucket would return an error if the bucket (or specified --prefix) was empty. #76
  • 0.12.1(Aug 1, 2022)

    • Using the --policy or --statement options now implies --user-permissions-boundary=none. Previously it was easy to use these options to accidentally create credentials that did not work as expected since they would have a default permissions boundary that locked them down to only being able to access S3. #74
    • The s3-credentials.AmazonS3FullAccess role created by this tool in order to issue temporary credentials previously used the default MaxSessionDuration value of 3600, preventing it from creating credentials that could last more than an hour. This has been increased to 12 hours. See this issue comment for instructions on fixing your existing role if this bug is affecting your account. #75
  • 0.12(Jun 30, 2022)

    • New --statement JSON option for both the s3-credentials create and s3-credentials policy commands, allowing one or more additional policy statements (provided as JSON strings) to be added to the generated IAM policy. #72
  • 0.11(May 1, 2022)

  • 0.10(Jan 25, 2022)

  • 0.9(Jan 18, 2022)

    See Weeknotes: s3-credentials prefix and Datasette 0.60 for extra background on these new features.

    • New --prefix myprefix/ option to s3-credentials create, which configures the credentials to only allow access to keys within the S3 bucket that start with the provided prefix. #12
    • s3-credentials policy --prefix myprefix/ command for generating and outputting a JSON policy that is restricted to the specified prefix. You can see examples in the README.
    • New list-bucket command for listing the contents of a specified bucket. #28
    • The list-users, list-buckets and list-bucket commands all default to outputting an indented JSON array - previously they output indented JSON objects separated by newlines. The --nl option can be used to return newline-delimited single line JSON objects. The new --csv and --tsv options can be used to return CSV or TSV output. #48
  • 0.8(Dec 7, 2021)

    • s3-credentials create my-bucket --public option for creating public buckets, which allow anyone with knowledge of a filename to download that file. This works by attaching this public bucket policy to the bucket after it is created. #42
    • s3-credentials put-object now sets the Content-Type header on the uploaded object. The type is detected based on the filename, or can be specified using the new --content-type option. #43
    • s3-credentials policy my-bucket --public-bucket outputs the public bucket policy that would be attached to a bucket of that name. #44
  • 0.7(Nov 30, 2021)

    • s3-credentials policy command, to output the JSON policy that would be used directly to the terminal. #37
    • README now includes examples of the three different policies. #36
    • s3-credentials put-object and s3-credentials get-object commands for uploading and downloading files from an S3 bucket. #38
  • 0.6(Nov 18, 2021)

    • create --dry-run option outputs a summary of changes that would be made to an AWS account without applying them. #35
    • s3-credentials whoami command now uses sts.GetCallerIdentity, which means it works with any kind of access key. #33
  • 0.5(Nov 11, 2021)

    • New s3-credentials create --duration 20m option. This creates temporary credentials that only last for the specified time, by creating a role and using STS.AssumeRole() to retrieve credentials. #27
    • Redesigned read-only and read-write policies to no longer use wildcards and instead explicitly list allowed actions. #15
    • Commands now accept an optional --auth file/path.json option to specify a JSON or INI file containing the credentials to use. #29
    • New s3-credentials list-buckets --details option to include ACLs, website configuration and bucket policies. #22
    • New s3-credentials create --format ini option for outputting INI format instead of JSON. #17
    • Now uses botocore.stub in some of the tests - thanks, Niko Abeler. #16
    • Added integration tests, run using pytest --integration, which exercise the tool against an AWS account and delete any created resources afterwards. #30
    • Added a tips section to the README, including how to access CloudTrail.
  • 0.4(Nov 4, 2021)

    • New options for authenticating with AWS: --access-key, --secret-key, --session-token, --endpoint-url. #2
    • Various improvements to JSON policies - thanks, @jdub! #11
    • --policy filename.json option for specifying a custom JSON policy. #14
  • 0.3(Nov 3, 2021)

  • 0.2(Nov 3, 2021)

  • 0.1(Nov 3, 2021)

    • Initial release
    • s3-credentials create name-of-bucket creates a new user with read-write access only to the specified S3 bucket, creates an access key for that user and outputs it to the console. #3
    • s3-credentials list-users lists all of the users for the current AWS account. #4
    • s3-credentials list-user-policies lists inline policies for the specified users, or all users. #5
    • s3-credentials whoami shows information about the currently authenticated user.