Methods to get the probability of a changepoint in a time series.

Overview

Bayesian Changepoint Detection

Methods to get the probability of a changepoint in a time series. Both online and offline methods are available. Read the following papers to really understand the methods:

[1] Paul Fearnhead, Exact and Efficient Bayesian Inference for Multiple
Changepoint Problems, Statistics and Computing 16.2 (2006), pp. 203--213

[2] Ryan P. Adams, David J.C. MacKay, Bayesian Online Changepoint Detection,
arXiv:0710.3742 (2007)

[3] Xiang Xuan, Kevin Murphy, Modeling Changing Dependency Structure in
Multivariate Time Series, ICML (2007), pp. 1055--1062

To see it in action, have a look at the example notebook.
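As a minimal usage sketch of the offline method (the call matches the signature quoted in the traceback further down this page; the synthetic data and the final sum over Pcp follow the example notebook and are illustrative):

from functools import partial

import numpy as np

import bayesian_changepoint_detection.offline_changepoint_detection as offcd

# Synthetic data: two Gaussian segments with different means.
data = np.concatenate([np.random.normal(0, 1, 100),
                       np.random.normal(5, 1, 100)])

# Q, P, Pcp are log-probability tables; Pcp[i, t] is the log-probability
# that the i-th changepoint falls on time step t.
Q, P, Pcp = offcd.offline_changepoint_detection(
    data,
    partial(offcd.const_prior, l=(len(data) + 1)),
    offcd.gaussian_obs_log_likelihood,
    truncate=-40,
)

# Probability of a changepoint at each time step, summed over changepoint index.
changepoint_prob = np.exp(Pcp).sum(0)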

Comments
  • Other observation models besides Gaussian

    Other observation models besides Gaussian

Hi. I was wondering if you had any insight into extending your code to include other emission models besides Gaussian. In particular, how about a GMM with a known number of Gaussians?

    I was going to take a stab at implementing it and submit a PR, but wanted to get your input first.
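To make it concrete, here is a rough sketch of the shape such a likelihood could take, assuming the offline API's convention of an observation log-likelihood called as f(data, t, s) for the segment data[t:s]. It fits a GMM with a fixed number of components to the segment and plugs in its log-likelihood, which is only an approximation of the marginal likelihood the recursion formally expects:

import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_obs_log_likelihood(data, t, s, n_components=2):
    # Plug-in log-likelihood of data[t:s] under a GMM with a known number
    # of components (fit to the segment, not marginalised over parameters).
    segment = np.asarray(data[t:s]).reshape(-1, 1)
    k = n_components if len(segment) > n_components else 1
    gmm = GaussianMixture(n_components=k).fit(segment)
    # score() is the mean log-likelihood per sample; multiply back to a total.
    return gmm.score(segment) * len(segment)

It could then be passed to offline_changepoint_detection in place of gaussian_obs_log_likelihood.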

    Thanks

    Dan

    enhancement 
    opened by mathDR 16
  • CD automation for deployment to PyPI

    CD automation for deployment to PyPI

What is this feature about? CD for deploying the package to PyPI. It makes use of GitHub's workflow feature.

    Closes Issues: https://github.com/hildensia/bayesian_changepoint_detection/issues/32

Prerequisites for owner @hildensia before merging this:

1. Create a new API token inside your PyPI account where this project lives: https://pypi.org/
2. Create two repository secrets inside the GitHub project settings (steps defined here):
  A. A secret named PYPI_USERNAME with the value __token__
  B. A secret named PYPI_PROD_PASSWORD with the token value from step 1

How to release a package? By leveraging GitHub's release feature. This can only be done by a project admin/maintainer (@hildensia). Right now the release is based on a manual action: we have to use the Releases option shown by GitHub and provide a version tag and a description. If we want to change the release strategy, we can update cd.yml accordingly, but usually I have seen projects follow a manual release.

What testing was done? I have tested this pipeline by deploying the package to my Test PyPI account: https://github.com/zillow/bayesian_changepoint_detection/actions/runs/1966108484

    opened by shahsmit14 12
  • Add pyx file again

    Add pyx file again

It was removed during a PR. Is there a good way to keep the Cython and Python versions in sync? I'm not sure if I prefer one over the other (Python is better for debugging, Cython is faster).

    opened by hildensia 5
  • How to utilize R matrix to detect change points?

    How to utilize R matrix to detect change points?

In the current version of the code, Nw=10; ax.plot(R[Nw,Nw:-1]) is used to exhibit the changepoints. Although it works fine, I am really confused about the rationale behind it. I tried to plot the run length with maximum probability at each time step, i.e. the y index of the maximum probability in each x column, but the result showed the run length keeps going up... I also went back to Adams's paper but found nothing about changepoint identification (he just stops at the R matrix)... I also tried to find Adams's MATLAB code, but it seems to have been removed...

I am trying to use this method in my work, and I believe it is best to fully understand it before any deployment. Any help will be appreciated, and thanks a lot!
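For what it is worth, here is a small sketch of the two readouts being discussed, assuming the R[run_length, time] indexing used by the online code and the parameter values from the example notebook (treat them as illustrative):

from functools import partial

import numpy as np

import bayesian_changepoint_detection.online_changepoint_detection as oncd

data = np.concatenate([np.random.normal(0, 1, 200),
                       np.random.normal(4, 1, 200)])

R, maxes = oncd.online_changepoint_detection(
    data, partial(oncd.constant_hazard, 250), oncd.StudentT(0.1, 0.01, 1, 0)
)

Nw = 10  # assessment delay in time steps

# The notebook's readout: the probability, assessed Nw steps later, that the
# run length equals Nw, i.e. that a changepoint happened Nw steps earlier.
delayed_cp_prob = R[Nw, Nw:-1]

# Most probable run length at each step: it grows by one inside a segment and
# collapses right after a changepoint, so look at the drops, not the raw value.
map_run_length = np.argmax(R, axis=0)
cp_candidates = np.where(np.diff(map_run_length) < 0)[0]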

    opened by mike-ocean 4
  • Corrected scale and beta factor calculation

    Corrected scale and beta factor calculation

The scale factor should be the standard deviation. There was a small bug in the betaT0 calculation; this change makes it consistent with the paper and the gaussdemo.m file.

    opened by nariox 3
  • Example notebook does not work

    Example notebook does not work

If I click on the "example notebook" link (an nbviewer link), I get a "too many redirects" error.

It would be nice if the example notebook were easily accessible in the repo (maybe I overlooked it...), because we don't need a live notebook / nbviewer to figure out whether the example fits our use case.

    opened by chryss 2
  • Updating parameters for bayesian online change point

    Updating parameters for bayesian online change point

I think my question is related to this one, which was not answered and is already closed: https://github.com/hildensia/bayesian_changepoint_detection/issues/19

In your example, you applied the Student's t-distribution as the likelihood. I understand the distribution and its parameters, but I have a question about how you set up the prior and update its parameters in the code. The relevant code is:

    df = 2*self.alpha
    scale = np.sqrt(self.beta * (self.kappa+1) / (self.alpha * self.kappa))
    

I don't understand what alpha, beta and kappa correspond to. How did you arrive at this expression? The paper by Adams and MacKay refers to updating sufficient statistics. Is your expression related to that? If so, how can I do the same for another distribution, say a Gaussian? In my comment, I refer to the following formula in the paper:

[equation image from the paper]
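For reference (not an authoritative answer): alpha, beta, kappa and mu are the parameters of a Normal-Gamma prior on the Gaussian's mean and precision, and the Student-t quoted above is its posterior predictive. The sufficient-statistic update in the paper then corresponds to the standard conjugate update, roughly:

def update_normal_gamma(x, mu, kappa, alpha, beta):
    # Textbook Normal-Gamma conjugate update after observing one point x
    # (a sketch of the standard formulas, not the library's exact code).
    mu_new = (kappa * mu + x) / (kappa + 1)
    kappa_new = kappa + 1
    alpha_new = alpha + 0.5
    beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
    return mu_new, kappa_new, alpha_new, beta_new

The posterior predictive of this prior is a Student-t with df = 2*alpha, loc = mu and scale = sqrt(beta*(kappa+1)/(alpha*kappa)), which is exactly the expression in the snippet above. For another likelihood, the same pattern applies with that family's conjugate prior and its own update rules.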

    opened by celdorwow 2
  • Scipy Import Error on newer versions

    Scipy Import Error on newer versions

    Hi guys,

There is an import issue if one uses newer SciPy versions.

It would be a quick fix to adapt the import statement in offline_changepoint_detection.py:

    try:  # SciPy >= 0.19
        from scipy.special import comb, logsumexp
    except ImportError:
        from scipy.misc import comb, logsumexp  # noqa
    
    opened by fhaselbeck 2
  • Multivariate T

    Multivariate T

    • Introduces a pluggable prior/posterior config for multivariate Gaussian data, with sensible defaults. Note that this only works for scipy > 1.6.0, where they introduced the multivariate t PDF. The library will remind you to upgrade if you have an old version.
    • Adds a test for this new configuration, as well as for the univariate one
    • Adds a "dev" and "multivariate" setup extra, meaning that you can pip install bayesian_changepoint_detection[dev] for development work (currently this installs pytest), or pip install bayesian_changepoint_detection[multivariate] (enforces that you have a new enough scipy version for this new feature)
    opened by multimeric 2
  • Why the probability exceeds one?

    Why the probability exceeds one?

I ran the given online detection example in the notebook, and I assumed the y axis indicates the probability of a changepoint (am I right?). But the y values ranged from zero to hundreds. I am not very familiar with the math, so can anyone please explain this outcome?

    Thanks.

    opened by mike-ocean 2
  • Fix full covariance method and add example

    Fix full covariance method and add example

This fixes the full covariance method and adds an example similar to the original IPython notebook. If you prefer, I can merge them separately, but since they are related, I thought it would be fine to merge them together.

    opened by nariox 2
  • About the conditions to use bocpd

    About the conditions to use bocpd

Hi, nice to meet you. I want to ask a basic question: if I don't know the distribution of the data (it is not the normal distribution), can I still use BOCPD? Thank you!

    opened by Codergers 0
  • Scaling of Data

    Scaling of Data

Hi, I've noticed that the scaling of the data can have an effect on the result, but I am not sure why it would, and I can't find any reason for it in the code or references. Below are the CP probabilities for the same data with and without a constant scaling factor; they are somewhat different.

    Are there some assumptions about the input data I am missing? Thanks

[plots: changepoint probabilities for the original and the rescaled data]
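One observation (not an official answer): the fixed prior hyperparameters used in the examples implicitly assume data on a particular scale, so multiplying the series by a constant without also changing the prior shifts the results. A simple workaround is to standardize the data first, e.g.:

import numpy as np

def standardize(x):
    # Z-score the series so results do not depend on a constant scale factor.
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()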

    opened by stefan37 3
  • How to adjust the sensitivity of the BOCD algorithm?

    How to adjust the sensitivity of the BOCD algorithm?

There is always a tradeoff between false alarms and missed alarms: when the algorithm is more sensitive, we get a higher false alarm rate and a lower missed alarm rate. My question is: is it possible to adjust the sensitivity of this algorithm by changing the hyperparameters (e.g., alpha, beta, kappa, mu)? Thank you!
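Not a definitive answer, but a sketch of where such knobs sit, assuming the online API and parameter order used in the example notebook: the hazard rate sets the prior probability of a change at each step, and the prior on the noise level controls how quickly a new segment looks "different enough". The specific values below are illustrative only.

from functools import partial

import bayesian_changepoint_detection.online_changepoint_detection as oncd

# A smaller hazard "lambda" makes a change a priori more likely at each step,
# which raises sensitivity (more detections, but also more false alarms).
hazard_sensitive = partial(oncd.constant_hazard, 100)
hazard_conservative = partial(oncd.constant_hazard, 1000)

# StudentT arguments in notebook order: alpha, beta, kappa, mu. A larger beta
# (a wider prior on the noise variance) tends to absorb small shifts as noise
# and lowers sensitivity; a smaller beta does the opposite.
likelihood_sensitive = oncd.StudentT(0.1, 0.001, 1, 0)
likelihood_conservative = oncd.StudentT(0.1, 0.1, 1, 0)

# R, maxes = oncd.online_changepoint_detection(data, hazard_sensitive,
#                                              likelihood_sensitive)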

    opened by gqffqggqf 4
  • 'FloatingPointError: underflow encountered in logaddexp'  occurs when setting np.seterr(all='raise')

    'FloatingPointError: underflow encountered in logaddexp' occurs when setting np.seterr(all='raise')

    Hi,

    I installed bayesian_changepoint_detection from this github repository.

By (accidentally) setting np.seterr(all='raise'), I was able to trigger the following exception.

I am not sure whether this has any relevance for further processing, but I wanted to bring it to the attention of people working on or with this library.

    /home/user/venv/env01/bin/python3.6 /home/user/PycharmProjects/project01/snippet.py
    Use scipy logsumexp().
    Traceback (most recent call last):
      File "/home/user/PycharmProjects/project01/snippet.py", line 68, in <module>
        Q, P, Pcp = offcd.offline_changepoint_detection(data, partial(offcd.const_prior, l=(len(data) + 1)), offcd.gaussian_obs_log_likelihood, truncate=-40)
      File "/home/user/experiments/original-unforked/bayesian_changepoint_detection/bayesian_changepoint_detection/offline_changepoint_detection.py", line 98, in offline_changepoint_detection
        Q[t] = np.logaddexp(P_next_cp, P[t, n-1] + antiG)
    FloatingPointError: underflow encountered in logaddexp
    
    Process finished with exit code 1
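For anyone hitting this with strict error settings: the underflow comes from np.logaddexp on extremely small probabilities and is generally benign, so one workaround (not a fix inside the library) is to relax only the underflow check around the call:

import numpy as np

# ... build `data` and import offcd / partial as in the snippet above ...

with np.errstate(under='ignore'):  # keep the other floating-point checks strict
    Q, P, Pcp = offcd.offline_changepoint_detection(
        data,
        partial(offcd.const_prior, l=(len(data) + 1)),
        offcd.gaussian_obs_log_likelihood,
        truncate=-40,
    )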
    
    
    opened by alatif-alatif 0
  • Added Normal known precision, Poisson distributions + alternate hazard function

    Added Normal known precision, Poisson distributions + alternate hazard function

For anyone who is interested, I have added Normal-with-known-precision and Poisson distributions in my fork below. I also tried adding another type of hazard function, which is normally distributed over time. Usage is updated in the example code as well. Find my fork here: https://github.com/kmsravindra/bayesian_changepoint_detection
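This is not the fork's code, but as an illustration of the shape such an addition can take (assuming the online API's StudentT-style interface, i.e. a class with pdf() and update_theta() methods whose parameters are arrays with one entry per run-length hypothesis), a Poisson likelihood with a conjugate Gamma prior looks roughly like this:

import numpy as np
from scipy import stats

class PoissonGamma:
    # Poisson counts with a Gamma(alpha, beta) prior on the rate. The
    # posterior predictive is negative binomial, and the conjugate update
    # after observing a count x is alpha -> alpha + x, beta -> beta + 1.
    def __init__(self, alpha=1.0, beta=1.0):
        self.alpha0 = self.alpha = np.array([alpha])
        self.beta0 = self.beta = np.array([beta])

    def pdf(self, data):
        # Predictive probability of the new count under each run-length hypothesis.
        return stats.nbinom.pmf(data, self.alpha, self.beta / (self.beta + 1))

    def update_theta(self, data):
        # Prepend the prior (run length 0) and update the remaining hypotheses.
        self.alpha = np.concatenate([self.alpha0, self.alpha + data])
        self.beta = np.concatenate([self.beta0, self.beta + 1])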

    opened by kmsravindra 2
  • Confused about the R matrix interpretation

    Confused about the R matrix interpretation

    Hi,

I am confused about the interpretation of the returned R matrix in the online detection algorithm. In the notebook example, the third plot is R[Nw,Nw:-1], which is described as "the probability at each time step for a sequence length of 0, i.e. the probability of the current time step to be a changepoint." So why do we choose the indices R[Nw,Nw:-1]? Why not R[Nw,:]?

Also, it was mentioned as an example that R[7,3] means the probability at time step 7 of a sequence of length 3, so does R[Nw,Nw:-1] mean that we are taking all the probabilities at time step Nw?

Any suggestions to help me understand the output R?

    Thanks
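For reference, a small sanity check of the indexing convention, assuming R[r, t] holds the probability that the run length is r at time step t (which is what fixing the first index at Nw in R[Nw, Nw:-1] suggests):

import numpy as np

# R as returned by online_changepoint_detection on some series `data`.
# Every column of R is then a probability distribution over run lengths.
assert np.allclose(R.sum(axis=0), 1.0)

# Fixing the run length at Nw and sweeping over time reads out the probability,
# assessed Nw steps after the fact, that the last changepoint happened exactly
# Nw steps earlier; the Nw offset in the time axis (Nw:-1) shifts that signal
# back so that entry i lines up with data index i. R[Nw, :] would contain the
# same values but be misaligned by Nw samples and include the leading columns
# where a run of length Nw is impossible.
Nw = 10
delayed_cp_prob = R[Nw, Nw:-1]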

    opened by RanaElnaggar 4
Releases (v0.4)
Owner
Johannes Kulick
Machine Learning and Robotics Scientist