Publicly Open Amazon AWS S3 Bucket Viewer

Overview

Publicly open storage viewer (Amazon S3 Bucket, Azure Blob, FTP server, HTTP Index Of/)

s3viewer is a free tool for security researchers that lists the contents of publicly open storages and helps identify leaking data. The tool lets you view all the files in a given storage and download selected files and directories. The goal is to identify the owner of the storage as quickly as possible in order to report that data is leaking from it.

Supported open storage:

  • Amazon S3 Buckets
  • Microsoft Azure Blobs
  • FTP servers with Anonymous access allowed
  • HTTP Index Of / Pages (Apache/nginx-style directory listing)

Demo

The tool lists directory contents and displays them in a tree-view GUI, from which you can navigate all directories and files and even download them. You can also use the Load button to load a pre-downloaded dirlist and view the directory hierarchy offline.
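For offline work, a dirlist can also be generated ahead of time and opened later with the Load button. As a minimal sketch for an S3 bucket (the bucket name below is a placeholder), the AWS CLI can produce such a listing:

aws s3 ls s3://example-bucket --recursive --no-sign-request > dirlist.txt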

Simplified

Feature List

  • Supports multiple open storage types, including S3 buckets, Azure blobs, FTP servers, and HTTP Index pages
  • View, download, and interact with the directory hierarchy of an open storage
  • Generate an offline dirlist and load it later to work offline
  • Easily search for specific files
  • Cross-platform (Windows, macOS, Linux) GUI desktop application
  • Free

Setup

Prerequisites

  • python3
    • python3 -m pip install -r packaging/requirements.txt
  • aws cli
    • Make sure aws works, then configure it once (aws configure) with any region (e.g. us-east-1). No keys are needed.
  • azure azcopy
    • Download azcopy and place it somewhere on the PATH
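
A minimal setup sketch (assuming the AWS CLI and azcopy binaries are already downloaded; the region value is arbitrary and no credentials are needed for public storage):

python3 -m pip install -r packaging/requirements.txt
aws configure set region us-east-1    # any region works; access keys can be left empty
cp /path/to/azcopy /usr/local/bin/    # or any other directory on your PATH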

Run

python s3viewer.py

Usage

Fill in the storage URL and press Get Dirlist. Double-click a file to download it, or right-click for more options, such as downloading all files in a directory. You can keep the generated dirlist to load it quickly later.

Supported URL schemes
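
As a rough illustration only (these exact patterns are assumptions and may differ from what the tool actually accepts), URLs for the supported storage types typically look like:

https://BUCKET_NAME.s3.amazonaws.com/           (Amazon S3 bucket)
https://ACCOUNT.blob.core.windows.net/CONTAINER (Azure blob container)
ftp://HOST/PATH/                                (anonymous FTP)
http://HOST/PATH/                               (HTTP Index Of/ page)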

TODO

  • Features
    • Download manager
      • Background downloads
      • Parallel downloads
      • Stop/Pause/Resume a download
    • Explorer mode
    • Mac, Linux builds
  • Bugs
    • Progress bar isn't synced with FTP downloads

Motivation

TL;DR

Publicly open storages have become a serious threat to many companies and people due to massive data leaks that have led to countless breaches, extortions, and overall embarrassment to all involved parties. I have personally discovered and reported dozens of major publicly open storages belonging to companies that were completely unaware of them. This must be stopped, and I hope this tool will help security researchers identify misconfigured cloud instances in order to responsibly disclose them to the affected companies.

Longer Version

A Simple Storage Service (S3) bucket is a public cloud storage resource available in Amazon Web Services (AWS). Buckets are favored by developers and IT teams because they offer a simple web service interface for storing and retrieving any amount of data, at any time, from anywhere. Companies are trying to keep up with the pace and ensure their cloud-stored data is safe, yet many haven't fully incorporated AWS best practices, and we see WAY too many misconfigured publicly open buckets that can be easily accessed by anyone.

As the popularity of S3 buckets increased, I started to discover more and more publicly open buckets and needed a tool to assist me in identifying the companies behind them. Sometimes correlating a bucket name to a company is an easy task, but sometimes the bucket name is too vague and the company behind it is unclear, for example “devbucket” or “prod3bucket”.

The problem with cloud storage technologies, such as S3 buckets, is that they tend to be misconfigured, as proven by recent data breaches, and may leak data to anyone with a browser. That is why it is important to recognize and report any leaked information: today's leak may be a random company's data, but tomorrow's could be your business's data or your own personally identifying information in the hands of criminals.
