An open source algorithm and dataset for finding poop in pictures.

Overview

🗑️ 📱 💩 ShitSpotter 💩 📱 🗑️

The shitspotter module is where I will be working on the "shitspotter" poop-detection algorithm and dataset. The primary goal of this work is to enable a phone app that finds where your dog pooped, because you ran to grab the doggy-bags you forgot, and now you can't find the damn thing. Other applications can be envisioned, such as AR glasses that let you know if you are about to walk into a steamer, or city governments using this to more efficiently clean public areas.

This module will contain an algorithm for training a pytorch network to detect poop in images, and a script for detecting poop in unseen images given a pretrained model.

The dataset currently contains 20GB of outdoor images taken with a phone. The general process of acquiring the dataset has been: (1) my dog poops or I see a rogue poop, (2) I take a "before" picture of the poop, (3) I pick up the poop, (4) I take an "after" picture as a high-correlation negative, and (5) I take a third image of a different nearby area as a lower-correlation negative. The dataset is currently unannotated, but the before/after pairs will help with bootstrapping. Annotations and the data manifest will be managed using kwcoco.

Both the code and the dataset will be open source. The code will be published as it is written to this repo. The data and pretrained models will be made public on IPFS.

Recent Updates

Check back for updates, but because this is a personal project, it might take some time for it to fully drop.

  • 2021-11-23 - Added annotation process overview and dataset sample.
  • 2021-11-11 - Initial upload of data munging scripts.
  • 2020-12-18 - Took the first picture.

Introduction

In Fall 2019, I was at the local dog park, and I found myself in a situation where my dog pooped, but I had forgotten to bring bags with me. I walked to the local bag station (thank you DGS), grabbed one, but then I couldn't find where the poop was. The brown fallen leaves made it very difficult to find the poop.

This happened every so often. Usually I was able to find it, but I'm afraid that sometimes I was unable to relocate the "object of interest". This got me thinking: what if I had a phone app that could scan the area with the camera and try to locate the poop? If I had a dataset, training a poop-detection model with today's deep learning methods should work pretty well.

Thus, on 2020-12-18, I took my first picture. My dog pooped, I took a picture, I picked it up, and then I took an "after" picture. The idea is that I will align the pictures (probably by computing local features like SIFT or some deep variant and then estimating an affine/projective transform) and then take a difference image. That should let me seed some sort of semi-automated annotation process.
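
The following is a minimal sketch of that alignment-and-differencing idea, assuming OpenCV is available and that the "before"/"after" photos overlap enough for feature matching; it is an illustration of the approach, not code from this repo.

```python
import cv2
import numpy as np

def difference_image(before_path, after_path):
    """Warp the "after" photo onto the "before" photo and return their difference."""
    before = cv2.imread(before_path)
    after = cv2.imread(after_path)

    # Detect and match local features (SIFT here; a deep descriptor could be swapped in).
    sift = cv2.SIFT_create()
    kp1, desc1 = sift.detectAndCompute(before, None)
    kp2, desc2 = sift.detectAndCompute(after, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(desc1, desc2)

    # Estimate a projective transform mapping the "after" image onto the "before" image.
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    aligned_after = cv2.warpPerspective(after, H, (before.shape[1], before.shape[0]))

    # Pixels that changed between the aligned images are candidate poop regions.
    return cv2.absdiff(before, aligned_after)
```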

Then, on 2021-05-11, one of my colleagues suggested that I take a third, unrelated picture to use as a negative example, so I took that suggestion and started doing that. This is the process currently being used. The following figure illustrates an example of one of these "triples".

https://i.imgur.com/NnEC8XZ.jpg

Related Work

I was surprised to find that there does not seem to be much work on this problem in the outdoor setting. Because none of the related work exactly meets my needs, I haven't looked too deeply into much of it, so it could be that some of these are more relevant than I've given them credit for. As time moves on I'll continue to refine this section.

Apparently Roomba has an indoor poop dataset: https://www.engadget.com/irobot-roomba-j-7-object-poop-detection-040152887.html It would be interesting to combine the indoor / outdoor datasets, but we are more concerned about outdoor detection. Maybe Boston Dynamics and Roomba can take this dataset and do something interesting.

The MSHIT fake dog poop dataset: https://www.kaggle.com/mikian/dog-poop is similar to this domain, but not the real deal. This may be relevant, but I have not looked too deeply into it yet.

There is Human Poop Classification: https://seed.com/poop/ and https://www.theverge.com/2019/10/29/20937108/poop-database-ai-training-photo-upload-first-mit but this is not our domain.

Detect Images of Dogs Pooping: https://colab.research.google.com/github/matthewchung74/blogs/blob/dev/Dog_Pooping_Dectron.ipynb Unfortunately, this is detecting the action, and not the consequence.

A Dog Poop DNA database could be used in conjunction with this work: https://www.bbc.com/news/uk-england-somerset-56324906

A 2019 project by Neeraj Madan: https://www.youtube.com/watch?v=qGNbHwp0jM8 This is the most similar thing to this project that I've seen so far. I have not watched his entire video yet, but I may contact him to see if he's interested in collaborating.

TACO: http://tacodataset.org/ The TACO dataset is Trash Annotations in Context. It could be the case that this data could be incorporated into the TACO dataset, although it does not currently contain a category for feces.

Other related links I haven't gone through well enough yet:

Dataset Description

The dataset contains a wide variety of image and background conditions that occur in upstate New York, including: seasonal changes, snow, rain, daytime, nighttime (some taken with flash, others with my phone's night mode), grass, concrete, etc.

Known dataset biases are:

  • Geographic region: Most images were taken in Upstate New York climate.
  • Sensor: Most images were taken with my Pixel 5. A few images were from my old Motorola Droid.
  • Coordinate: Humans unconsciously center "objects of interest" in the images they take. In some instances I tried to mitigate this bias, either by deliberately framing the poop off-center or by not looking at the screen when taking the snapshot.
  • Me: I'm the only one taking pictures. I'm also fairly tall, so the images are all from my viewpoint. There are other "me" biases I may not be aware of.
  • My Dogs: My two poop machines are fairly regular, and they have their own methods for times and places to make a dookie.
  • Freshness: The shit I deal with is often fresh out of the oven. However, I have picked up a decent number of abandoned stools left by other dogs in the area, and some of these are quite old. The age of a sample does seem to have an impact on its appearance: new poops have a shine, while old ones are quite dull and start to break down.

The following scatterplot illustrates trends in the space / time distribution of the images.

https://i.imgur.com/LXvcqGW.png

The following figure is a hand-picked sample of 9 images from the dataset. Each of these images has poop in it. In some cases it's easy to spot. In other cases, it can be quite difficult.

https://i.imgur.com/QwFpxD1.jpg

Dataset Statistics:

  • Most images only show a single poop, but some images have multiple.
  • As of 2021-11-11 I've collected 1935 pictures with "798" before/after/(maybe negative) groups of images.
  • There are roughly 457 paired groups and 333 triple groups. (Based only on counts; the grouping itself has not happened yet. One possible timestamp-based grouping heuristic is sketched below.)
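
As a rough illustration of how that grouping might be automated (an assumption on my part, not the project's actual tooling), images taken within a few minutes of each other could be clustered into before/after/negative groups using their EXIF timestamps:

```python
import datetime
from pathlib import Path
from PIL import Image

def capture_time(path):
    """Read the EXIF DateTime tag (306), falling back to the file's mtime."""
    stamp = Image.open(path).getexif().get(306)  # e.g. '2021:11:11 09:30:01'
    if stamp:
        return datetime.datetime.strptime(stamp, '%Y:%m:%d %H:%M:%S')
    return datetime.datetime.fromtimestamp(Path(path).stat().st_mtime)

def group_by_time(paths, max_gap=datetime.timedelta(minutes=5)):
    """Greedily group images whose capture times are close together."""
    paths = sorted(paths, key=capture_time)
    groups, current = [], []
    for path in paths:
        if current and capture_time(path) - capture_time(current[-1]) > max_gap:
            groups.append(current)
            current = []
        current.append(path)
    if current:
        groups.append(current)
    return groups  # length-2 groups ~ pairs, length-3 groups ~ triples
```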

Annotation Process

To make annotation easier, I take a picture before and after I clean up the poop. The idea is that I can align these images and use image differencing to more quickly find the objects of interest in the image. As you can see, it's not so easy to spot the shit, especially when there are leaves in the image.

https://i.imgur.com/lZ8J0vD.png

But with a little patience and image processing, it's not too hard to narrow down the search.

https://i.imgur.com/A6qlcNk.jpg

Scripts to produce these visualizations have been checked into the repo. Annotations and the image manifest will be stored in the kwcoco json format.
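
To make that concrete, here is a hedged sketch of how a kwcoco manifest might be built; the file names and box coordinates are hypothetical placeholders, not actual dataset entries.

```python
import kwcoco

dset = kwcoco.CocoDataset()
cid = dset.add_category(name='poop')

# One "before" image with a box annotation, plus its paired negatives.
gid = dset.add_image(file_name='photos/PXL_20211111_093001.jpg')
dset.add_annotation(image_id=gid, category_id=cid, bbox=[1520, 980, 140, 90])
dset.add_image(file_name='photos/PXL_20211111_093130.jpg')  # "after" negative
dset.add_image(file_name='photos/PXL_20211111_093215.jpg')  # unrelated negative

# Write the manifest; kwcoco json is a superset of the MS-COCO format.
dset.dump('shitspotter.kwcoco.json', newlines=True)
```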

The Algorithm

Currently there is no algorithm checked into the repo. I need to start annotating the dataset first. Eventually there will be shitspotter.fit and shitspotter.predict scripts for training and for performing inference on unseen images. My current plan for a baseline algorithm is a MobileNet backbone pretrained on ImageNet with a single-stage detection / segmentation head on top of it.
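
To make the baseline idea concrete, here is a rough sketch (my assumed design, not a checked-in implementation) of an ImageNet-pretrained MobileNet backbone from torchvision with a tiny 1x1 conv head that produces a per-pixel poop heatmap:

```python
import torch
import torch.nn as nn
import torchvision

class PoopSegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        mobilenet = torchvision.models.mobilenet_v3_small(weights='DEFAULT')
        self.backbone = mobilenet.features            # ImageNet-pretrained features
        self.head = nn.Conv2d(576, 1, kernel_size=1)  # one logit channel: poop vs. background

    def forward(self, images):
        feats = self.backbone(images)                 # (B, 576, H/32, W/32)
        logits = self.head(feats)
        # Upsample back to the input resolution for a dense heatmap.
        return nn.functional.interpolate(
            logits, size=images.shape[-2:], mode='bilinear', align_corners=False)

# Example: one 512x512 RGB image in, one 1-channel heatmap of the same size out.
model = PoopSegmenter()
heatmap = torch.sigmoid(model(torch.rand(1, 3, 512, 512)))
```

A single-stage detection head (e.g. anchor-free box regression) could later replace the segmentation head without changing the backbone.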

Given a kwcoco-formatted detection dataset, we can also use off-the-shelf detection baselines via netharn, mmdet, or some other library that accepts coco/kwcoco input manifests.

Downloading the Data

This dataset will be made public once I figure out a way to host and manage it. Currently the raw images live on my hard drive, and are backed up across 2 machines, each running RAID-10. Lower res copies of the photos live on the cloud, but I'm planning on sharing the originals.

The dataset is currently 20+GB, so I'm planning to use IPFS (or some P2P solution) to handle data distribution. Currently the data does not have any annotations, although I've started to build scripts to make that process easier.

Eventually I would like to host the data via DVC + IPFS, but fsspec needs an IPFS filesystem implementation first.

If you are in urgent need of any of the data, feel free to contact me (make an issue or email me). I can pin what I have on IPFS; I was planning on organizing the data a bit before doing that, but I'm willing to share it sooner if there is interest.

Officially the license will be Apache 2 because that's what I use for everything. It annoys me that I have to put licenses on things. These are the terms I care about:

  • Cite my work if you use it.
  • If you annotate any of the images, contribute the annotations back. Picking up shit is a team effort.
  • When asked to build something, particularly ML systems, think about the ethical implications, and act ethically.
  • Pin the dataset on IPFS if you can.

Otherwise the data is free to use commercially or otherwise.

Acknowledgements

I want to give thanks to the people and animals-that-think-they-are-people who contributed to this project. My colleagues at Kitware have provided valuable help / insight into project direction, dataset collection, problem formulation, related research, discussion, and memes.

I want to give special thanks to my two poop machines, without whom this project would not be possible.

https://i.imgur.com/MWQVs0w.jpg

https://i.imgur.com/YUJjWoh.jpg

Owner
Jon Crall, Staff R&D Engineer; Open Source Enthusiast