A tool that compiles a CSV of all HackerOne (h1) program stats.

Overview

h1stats - h1 Program Stats Scraper

This Python 3 script calls out to HackerOne's GraphQL API and scrapes information and stats for every currently active h1 program. All programs and their stats are tabulated into a generated CSV file, from which you can compare and contrast program stats to pick high-fidelity targets. Furthermore, you can supply your h1 session cookie to the script to also compile your private programs into the CSV.
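Conceptually, the collection loop looks something like the sketch below. This is a simplified illustration, not the script's actual implementation: the GraphQL query text, field names, and pagination variables are assumptions, and the real script collects the full set of fields listed under Data Collected below.

import csv
import requests

GRAPHQL_URL = "https://hackerone.com/graphql"  # endpoint used by the HackerOne web UI

# Hypothetical, heavily simplified query; illustrative field names only.
QUERY = """
query Directory($cursor: String) {
  teams(first: 25, after: $cursor) {
    pageInfo { endCursor hasNextPage }
    edges { node { name handle offers_bounties } }
  }
}
"""

def fetch_programs(session_cookie=None):
    """Yield one program node at a time, paging through the directory."""
    cookies = {}
    if session_cookie:
        # Supplying your __Host-session cookie makes private programs visible.
        cookies["__Host-session"] = session_cookie
    cursor = None
    while True:
        resp = requests.post(
            GRAPHQL_URL,
            json={"query": QUERY, "variables": {"cursor": cursor}},
            cookies=cookies,
        )
        resp.raise_for_status()
        teams = resp.json()["data"]["teams"]
        for edge in teams["edges"]:
            yield edge["node"]
        if not teams["pageInfo"]["hasNextPage"]:
            break
        cursor = teams["pageInfo"]["endCursor"]

def write_csv(path, session_cookie=None):
    """Tabulate every scraped program into a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "handle", "offers_bounties"])
        writer.writeheader()
        for node in fetch_programs(session_cookie):
            writer.writerow(node)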

Data Collected:

  • Program Name
  • Program URL
  • Program Type (Public or Private)
  • Clear Program (Yes/No)
  • Offers Bounties (Yes/No)
  • Max Critical (USD)
  • Max High (USD)
  • Max Medium (USD)
  • Max Low (USD)
  • Average Bounty Max (USD)
  • Average Bounty Min (USD)
  • Top Bounty Max (USD)
  • Top Bounty Min (USD)
  • Resolved Reports
  • Reports Received in 90 Days
  • Total Bounties Paid (USD)
  • Total Bounties Paid in 90 Days (USD)
  • Avg Time to First Response (Hours)
  • Avg Time to Triage (Hours)
  • Avg Time to Bounty (Hours)
  • Avg Time to Resolution (Hours)
  • Program Age (Months)
  • Days Since Last Report
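From there, the generated CSV can be sliced with pandas (or any spreadsheet) to shortlist targets. The snippet below is purely illustrative: the column headers are taken from the field list above and may not match the generated file's headers exactly, and the filter thresholds are arbitrary.

import pandas as pd

df = pd.read_csv("h1stats-2021-4-24.csv")  # whatever filename the script produced

# Example heuristic: bounty-paying programs that respond within two days,
# ranked by recent report volume. Column names assumed from the list above.
shortlist = df[
    (df["Offers Bounties (Yes/No)"] == "Yes")
    & (df["Avg Time to First Response (Hours)"] < 48)
].sort_values("Reports Received in 90 Days", ascending=False)

print(shortlist[["Program Name", "Program URL", "Max Critical (USD)"]].head(10))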

Usage

Normal usage (public programs): python3 h1stats

Authenticated usage (public and private programs): python3 h1stats [<Your HackerOne __Host-session Token>]

WARNING (Authenticated Usage)

THIS SCRIPT HANDLES YOUR H1 SESSION TOKEN, WHICH GRANTS ACCESS TO YOUR HACKERONE PRIVATE DATA AND THE PRIVATE DATA OF YOUR HACKERONE PROGRAMS. BE CAREFUL WHEN HANDLING THIS TOKEN. THE AUTHORS ARE NOT LIABLE FOR ANY MISUSE OF THIS SCRIPT OR YOUR HACKERONE SESSION TOKEN. PLEASE USE AT YOUR OWN RISK. DO NOT PUBLISH ANY CSVs WITH HACKERONE PRIVATE PROGRAM DATA.

For authenticated usage, it is suggested that you assign your token to an environment variable once using export and pass that variable into the script's argument list (as shown in the examples).

Examples

Normal Flow (Public Only):

bash> python3 h1stats
  _     _ ____  _        _
 | |__ / / ___|| |_ __ _| |_ ___
 | '_ \| \___ \| __/ _` | __/ __|
 | | | | |___) | || (_| | |_\__ \
 |_| |_|_|____/ \__\__,_|\__|___/

                      defparam

[+] No session cookie specified
[+] Collecting public data...
[+] Please wait... (this may take several minutes)
[+] Collecting... (350 programs)
[+] Wrote all data to: h1stats-2021-4-24.csv
[+] Done!

Authenticated Flow (Public and Private):

bash> export H1CRED="JGH92kd9...b5e" # HackerOne session cookie
bash> python3 h1stats $H1CRED
  _     _ ____  _        _
 | |__ / / ___|| |_ __ _| |_ ___
 | '_ \| \___ \| __/ _` | __/ __|
 | | | | |___) | || (_| | |_\__ \
 |_| |_|_|____/ \__\__,_|\__|___/

                      defparam

[+] Using specified session cookie
[+] Collecting public and private data...
[+] Please wait... (this may take several minutes)
[+] Collecting... (400 programs)
[+] Wrote all data to: h1stats-PRIVATE-2021-4-24.csv
[+] Warning: this data contains private information under NDA, do not publish!
[+] Done!
Owner

Evan (Architect, Hacker, FPGA Whisperer, Fuzzerer)