A meta plugin for processing timelapse data timepoint by timepoint in napari

Overview

napari-time-slicer


A meta plugin for processing timelapse data timepoint by timepoint. It enables a list of napari plugins to process 2D+t or 3D+t data step by step as the user moves through the timelapse. Currently, these plugins use napari-time-slicer:

napari-time-slicer enables inter-plugin communication, e.g. allowing the plugins listed above to be combined into one image-processing workflow for segmenting a timelapse dataset.

If you want to convert a 3D dataset into a 2D+time dataset, use the menu Tools > Utilities > Convert 3D stack to 2D timelapse (time-slicer). It will turn the 3D dataset into a 4D dataset where the Z-dimension (index 1) has only one element, which napari will display with a time slider. Note: It is recommended to remove the original 3D dataset after this conversion.
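
Conceptually, the conversion just inserts a singleton axis so that the former Z axis is interpreted as time. A minimal numpy sketch of that dimension change (an illustration only, not the plugin's internal code):

import numpy as np

stack_3d = np.random.random((50, 256, 256))   # (Z, Y, X)
timelapse = stack_3d[:, np.newaxis, :, :]     # (T, 1, Y, X); napari shows a time slider for axis 0
print(timelapse.shape)                        # (50, 1, 256, 256)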

Usage for plugin developers

Plugins which implement the napari_experimental_provide_function hook can make use of the @time_slicer decorator. At the moment, only functions which take napari.types.ImageData, napari.types.LabelsData and basic Python types such as int and float are supported. If you annotate such a function with @time_slicer, it will internally convert any 4D dataset to a 3D dataset according to the timepoint currently selected in napari. Furthermore, when the napari user changes the current timepoint or the input data of the function changes, a re-computation is invoked. It is therefore recommended to use the time_slicer only for functions which can provide [almost] real-time performance. Another constraint is that annotated functions must have a viewer parameter, which is needed to read the current timepoint from the viewer when invoking the re-computation.

Example

import napari
from napari_time_slicer import time_slicer

@time_slicer
def threshold_otsu(image: napari.types.ImageData, viewer: napari.Viewer = None) -> napari.types.LabelsData:
    # ...
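
For illustration, a minimal complete body could use scikit-image's Otsu threshold; this is a sketch under that assumption, not the exact code of any particular plugin:

import napari
from skimage.filters import threshold_otsu as sk_threshold_otsu
from napari_time_slicer import time_slicer

@time_slicer
def threshold_otsu(image: napari.types.ImageData, viewer: napari.Viewer = None) -> napari.types.LabelsData:
    # here, `image` is already the 2D/3D slice of the currently selected timepoint,
    # even if the underlying napari layer is 4D
    binary = image > sk_threshold_otsu(image)
    return binary.astype(int)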

You can see full implementations of this concept in the napari plugins listed above.


This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.

Installation

You can install napari-time-slicer via pip:

pip install napari-time-slicer

To install the latest development version:

pip install git+https://github.com/haesleinhuepf/napari-time-slicer.git

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure that the coverage at least stays the same before you submit a pull request.
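
For example, running the test suite locally could look like this (assuming tox is installed in your environment):

pip install tox
tox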

License

Distributed under the terms of the BSD-3 license, "napari-time-slicer" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

Comments
  • pyqt5 dependency

    The dependency on pyqt5, which gets installed via pip, can create trouble if napari has been installed via conda (see https://napari.org/plugins/best_practices.html#don-t-include-pyside2-or-pyqt5-in-your-plugin-s-dependencies). Is there any reason for this dependency? As this plugin is itself a dependency of other plugins like napari-segment-blobs-and-things-with-membranes, this can create trouble down the chain.
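
    A common way to follow that best practice is to leave the Qt backend out of the hard requirements entirely. A hypothetical, illustrative setup.py fragment (not this plugin's actual packaging code):

    from setuptools import setup

    setup(
        name="some-napari-plugin",  # hypothetical plugin name
        install_requires=[
            "napari",  # note: no pyqt5 / pyside2 here;
            "numpy",   # the user's napari installation already provides the Qt backend
        ],
    )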

    opened by guiwitz 7
  • PyQt5 version requirement breaks environment

    Hi @haesleinhuepf ,

    I wanted to ask whether it is really strictly necessary to use the current PyQt5 requirement?

    pyqt5>=5.15.0
    

    It collides with current Spyder versions that only support PyQt up to 5.13:

    spyder 5.1.5 requires pyqtwebengine<5.13, which is not installed.
    spyder 5.1.5 requires pyqt5<5.13, but you have pyqt5 5.15.6 which is incompatible.
    

    Since the time slicer is used downstream in quite a few plugins of yours (e.g., segment-blobs-and-things-with-membranes), this is quite a restriction.

    opened by jo-mueller 5
  • Bug report: `KeyError: 'viewer'`

    Hi @haesleinhuepf ,

    I am getting an error in this notebook in the 5th cell on this command:

    surface = nppas.largest_label_to_surface(labels)
    

    where nppas is napari-process-points-and-surfaces and labels is a regular label image as created with skimage.measure.label().

    Thanks for looking at it!

    opened by jo-mueller 2
  • Make dask arrays instead of computing slice for slice

    Hey @haesleinhuepf! This is the first implementation of the time slicer wrapper using dask instead of computing the time slices based on the current time index. I could re-use a little of the previous code, but the wrappers start to differ from each other pretty soon. At the moment I'm also unsure whether this wrapper can replace the original time slicer function as a drop-in substitute, so I kept both your old version and the dask version. An idea that could be useful for saving the dask images is a function which processes each time slice and saves it as a separate image (if images are saved one by one, it's really easy to load them as dask arrays!).
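
    Roughly, the idea is to build one delayed task per timepoint and stack them into a lazy 4D dask array, so nothing is computed until a timepoint is actually requested. A simplified sketch with a hypothetical process_timepoint function (not the code of this pull request):

    import dask.array as da
    import numpy as np
    from dask import delayed

    def process_timepoint(image_3d):
        # hypothetical per-timepoint operation, e.g. a threshold
        return image_3d > image_3d.mean()

    def lazy_timelapse(timelapse_4d):
        # one delayed task per timepoint, stacked into a 4D dask array
        slices = [
            da.from_delayed(
                delayed(process_timepoint)(timelapse_4d[t]),
                shape=timelapse_4d.shape[1:],
                dtype=bool,
            )
            for t in range(timelapse_4d.shape[0])
        ]
        return da.stack(slices)

    timelapse = np.random.random((10, 5, 64, 64))  # (T, Z, Y, X)
    result = lazy_timelapse(timelapse)             # nothing computed yet
    print(result[3].compute().shape)               # computes only timepoint 3: (5, 64, 64)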

    opened by Cryaaa 1
  • Tests failing

    source:

        if sys.platform.startswith('linux') and running_as_bundled_app():
    .tox/py37-linux/lib/python3.7/site-packages/napari/utils/misc.py:65: in running_as_bundled_app
        metadata = importlib_metadata.metadata(app_module)
    .tox/py37-linux/lib/python3.7/site-packages/importlib_metadata/__init__.py:1005: in metadata
        return Distribution.from_name(distribution_name).metadata
    .tox/py37-linux/lib/python3.7/site-packages/importlib_metadata/__init__.py:562: in from_name
        raise ValueError("A distribution name is required.")
    E   ValueError: A distribution name is required.
    

    See also:

    https://github.com/napari/napari/issues/4797

    opened by haesleinhuepf 0
  • Have 4D dask arrays as result of time-sliced functions

    This turns the result of time-slicer-annotated functions into 4D delayed dask arrays, as proposed by @Cryaaa in #5.

    This PR doesn't fully work yet in the interactive napari user interface. After setting up a workflow and going through time, it sometimes crashes with a KeyError while saving the duration of an operation. This is related to a computation finishing after its result has already been replaced; basically, multiple threads writing to the same result. It's this error: https://github.com/dask/dask/issues/896
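
    For illustration, the kind of guard that could prevent such a race is a lock around the shared bookkeeping (a generic sketch, not a tested fix for this plugin):

    import threading

    _durations = {}  # bookkeeping shared between worker threads
    _durations_lock = threading.Lock()

    def store_duration(operation_name, seconds):
        # only one thread at a time may update the shared dictionary
        with _durations_lock:
            _durations[operation_name] = seconds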

    Reproduce:

    • Start napari
    • Open the Example dataset clEsperanto > CalibZapwfixed
    • Turn it into a 2D+t dataset using Tools > Utilities
    • Open the assistant
    • Setup a workflow, e.g. Denoise, Threshold, Label
    • Move the time-bar a couple of times until it crashes.

    I'm not sure yet how to solve this.

    opened by haesleinhuepf 8
  • Aggregate points and surfaces in 4D

    Hi Robert @haesleinhuepf ,

    I am seeing some issues with using the time slicer on 4D points/surface data in napari. For instance, using the label_to_surface() function from napari-process-points-and-surfaces throws an error:

    ValueError: Input volume should be a 3D numpy array.
    

    which comes from the marching_cubes function under the hood. Here is a small example script to reproduce the error:

    import napari
    import numpy as np
    from skimage import filters
    import napari_process_points_and_surfaces as nppas

    # Make a binary sphere
    s = 100
    image = np.zeros((s, s, s), dtype=float)
    x0 = 50
    radius = 15
    
    for x in range(s):
        for y in range(s):
            for z in range(s):
                if np.sqrt((x-x0)**2 + (y-x0)**2 + (z-x0)**2) < radius:
                    image[x, y, z] = 1.0
    
    viewer = napari.Viewer()
    viewer.add_image(image)
    
    segmentation = image > filters.threshold_otsu(image)
    viewer.add_labels(segmentation)
    
    surf = nppas.label_to_surface(segmentation.astype(int))
    viewer.add_surface(surf)
    

    When introspecting the call to marching_cubes within the time_slicer function, it is also evident that the data passed in is somehow still 4D.

    opened by jo-mueller 4