Raster-based Spatial Analysis for Python

Overview

🌍 xarray-spatial: Raster-Based Spatial Analysis in Python


πŸ“ Fast, Accurate Python library for Raster Operations

⚑ Extensible with Numba

⏩ Scalable with Dask

🎊 Free of GDAL / GEOS Dependencies

🌍 General-Purpose Spatial Processing, Geared Towards GIS Professionals


Xarray-Spatial implements common raster analysis functions using Numba and provides an easy-to-install, easy-to-extend codebase for raster analysis.

Installation

# via pip
pip install xarray-spatial

# via conda
conda install -c conda-forge xarray-spatial

Downloading our starter examples and data

Once you have xarray-spatial installed in your environment, you can use one of the following in your terminal (with the environment active) to download our examples and/or sample data into your local directory.

xrspatial examples : Download the example notebooks and the data they use.

xrspatial copy-examples : Download the example notebooks but not the data. Note: you won't be able to run many of the examples without the data.

xrspatial fetch-data : Download just the data and not the notebooks.

In all of the above, the files are downloaded and stored in a folder named 'xrspatial-examples' inside your current directory.

xarray-spatial grew out of the Datashader project, which provides fast rasterization of vector data (points, lines, polygons, meshes, and rasters) for use with xarray-spatial.

xarray-spatial does not depend on GDAL / GEOS, which makes it fully extensible in Python but does limit the breadth of operations that can be covered. xarray-spatial is meant to include the core raster-analysis functions needed for GIS developers / analysts, implemented independently of the non-Python geo stack.

Our documentation is still under construction, but it can be found here.

Raster-huh?

Rasters are regularly gridded datasets like GeoTIFFs, JPGs, and PNGs.

In the GIS world, rasters are used for representing continuous phenomena (e.g. elevation, rainfall, distance), either directly as numerical values, or as RGB images created for humans to view. Rasters typically have two spatial dimensions, but may have any number of other dimensions (time, type of measurement, etc.).
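
For example, a small raster can be represented directly as a two-dimensional xarray.DataArray with x and y coordinates (a minimal sketch; the values and the 30 m cell size are made up for illustration):

import numpy as np
import xarray as xr

# a tiny "raster": a 2D grid of elevation values on a regular 30 m grid
elevation = xr.DataArray(
    np.random.uniform(0, 100, size=(5, 5)).astype("float32"),
    dims=("y", "x"),
    coords={"y": np.arange(5) * 30.0, "x": np.arange(5) * 30.0},
    name="elevation",
)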

Supported Spatial Functions with Supported Inputs


Classification

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Equal Interval | ✅ | ✅ | ✅ | ✅
Natural Breaks | ✅ | ✅ | ✅ | ✅
Reclassify | ✅ | ✅ | ✅ | ✅
Quantile | ✅ | ✅ | ✅ | ✅
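
For instance, a raster can be binned into classes with the classify module (a minimal sketch, assuming an elevation DataArray such as the one above; the bin edges here are made up):

import numpy as np
from xrspatial.classify import quantile, reclassify

# four quantile classes (0, 1, 2, 3), computed from the data distribution
quartiles = quantile(elevation, k=4)

# explicit bins: values <= 50 become 1, values <= 100 become 2, everything else 3
classes = reclassify(elevation, bins=[50, 100, np.inf], new_values=[1, 2, 3])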

Focal

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Apply | ✅ | | |
Hotspots | ✅ | | |
Mean | ✅ | | |
Focal Statistics | ✅ | ✅ | |
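
As an illustration, focal tools convolve a kernel over the raster (a minimal sketch, assuming the elevation DataArray above and a made-up 90 m cell size):

from xrspatial.convolution import circle_kernel
from xrspatial.focal import mean, hotspots

# smooth each cell with its immediate neighbors
smoothed = mean(elevation)

# z-score based hot/cold spot detection within a circular neighborhood
kernel = circle_kernel(90, 90, 270)   # cellsize_x, cellsize_y, radius
hot = hotspots(smoothed, kernel)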

Multispectral

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Atmospherically Resistant Vegetation Index (ARVI) | ✅ | ✅ | ✅ | ✅
Enhanced Built-Up and Bareness Index (EBBI) | ✅ | ✅ | ✅ | ✅
Enhanced Vegetation Index (EVI) | ✅ | ✅ | ✅ | ✅
Green Chlorophyll Index (GCI) | ✅ | ✅ | ✅ | ✅
Normalized Burn Ratio (NBR) | ✅ | ✅ | ✅ | ✅
Normalized Burn Ratio 2 (NBR2) | ✅ | ✅ | ✅ | ✅
Normalized Difference Moisture Index (NDMI) | ✅ | ✅ | ✅ | ✅
Normalized Difference Vegetation Index (NDVI) | ✅ | ✅ | ✅ | ✅
Soil Adjusted Vegetation Index (SAVI) | ✅ | ✅ | ✅ | ✅
Structure Insensitive Pigment Index (SIPI) | ✅ | ✅ | ✅ | ✅
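
For example, vegetation indices are computed directly from band DataArrays (a minimal sketch; nir_agg and red_agg are hypothetical names for aligned near-infrared and red band DataArrays):

from xrspatial.multispectral import ndvi, savi

ndvi_agg = ndvi(nir_agg, red_agg)                   # (NIR - Red) / (NIR + Red)
savi_agg = savi(nir_agg, red_agg, soil_factor=1.0)  # soil-adjusted variant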

Pathfinding

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
A* Pathfinding | ✅ | | |
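
A minimal sketch of the A* tool (cost_surface is a hypothetical DataArray; per the 0.3.0 release notes, start and goal are given as (y, x) pairs in the raster's coordinate space):

from xrspatial.pathfinding import a_star_search

# shortest path over the cost surface from start to goal;
# an optional barriers list can mark impassable cell values
path_agg = a_star_search(cost_surface, (0.0, 0.0), (120.0, 90.0))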

Proximity

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Allocation | ✅ | | |
Direction | ✅ | | |
Proximity | ✅ | | |
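
A minimal sketch of the proximity tool (raster is a hypothetical DataArray whose target cells carry the value 1; the distance_metric argument is optional):

from xrspatial import proximity

# distance from every pixel to the nearest pixel whose value is in target_values
dist = proximity(raster, target_values=[1], distance_metric='EUCLIDEAN')

The allocation and direction tools follow the same calling pattern, returning the value of and the direction to the nearest target, respectively.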

Surface

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Aspect | ✅ | ✅ | ✅ | ⚠️
Curvature | ✅ | ⚠️ | |
Hillshade | ✅ | ✅ | |
Slope | ✅ | ✅ | ✅ | ⚠️
Terrain Generation | ✅ | | |
Viewshed | ✅ | | |
Perlin Noise | ✅ | | |
Bump Mapping | ✅ | | |

Zonal

Name | NumPy xr.DataArray | Dask xr.DataArray | CuPy GPU xr.DataArray | Dask GPU xr.DataArray
Apply | ✅ | | |
Crop | ✅ | | |
Regions | ✅ | | |
Trim | ✅ | | |
Zonal Statistics | ✅ | | |
Zonal Cross Tabulate | ✅ | | |
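
A minimal sketch of zonal statistics (zones and values are hypothetical, aligned DataArrays; zones holds integer zone IDs):

from xrspatial.zonal import stats as zonal_stats

# one row of summary statistics (e.g. mean, max, min, sum, std, var, count) per zone
df = zonal_stats(zones=zones, values=values)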

Usage

Basic Pattern
import xarray as xr
from xrspatial import hillshade

my_dataarray = xr.DataArray(...)
hillshaded_dataarray = hillshade(my_dataarray)

Check out the user guide here.
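
The same call pattern applies when the DataArray is backed by a Dask array: supported functions return Dask-backed results that are only computed on demand (a minimal sketch with synthetic data; the shape and chunk sizes are arbitrary):

import dask.array as da
import numpy as np
import xarray as xr
from xrspatial import hillshade, slope

elevation = xr.DataArray(
    da.random.random((4000, 4000), chunks=(1000, 1000)),
    dims=("y", "x"),
    coords={"y": np.arange(4000), "x": np.arange(4000)},
    name="elevation",
)

shaded = hillshade(elevation)   # still lazy, dask-backed
steep = slope(elevation)        # evaluated chunk by chunk
result = steep.compute()        # trigger the actual computation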



Dependencies

xarray-spatial currently depends on Datashader, but will soon be updated to depend only on xarray and numba, while still being able to make use of Datashader output when available.


Notes on GDAL

Within the Python ecosystem, many geospatial libraries interface with the GDAL C++ library for raster and vector input, output, and analysis (e.g. rasterio, rasterstats, geopandas). GDAL is robust, performant, and has decades of great work behind it. For years, off-loading expensive computations to the C/C++ level in this way has been a key performance strategy for Python libraries (obviously...Python itself is implemented in C!).

However, wrapping GDAL has a few drawbacks for Python developers and data scientists:

  • GDAL can be a pain to build / install.
  • GDAL is hard for Python developers/analysts to extend, because it requires understanding multiple languages.
  • GDAL's data structures are defined at the C/C++ level, which constrains how they can be accessed from Python.

With the introduction of projects like Numba, Python gained new ways to provide high-performance code directly in Python, without depending on or being constrained by separate C/C++ extensions. xarray-spatial implements algorithms using Numba and Dask, making all of its source code available as pure Python without any "black box" barriers that obscure what is going on and prevent full optimization. Projects can make use of the functionality provided by xarray-spatial where available, while still using GDAL where required for other tasks.
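
To illustrate that extensibility, here is a hypothetical user-defined raster function written directly in Python with Numba (a sketch only; roughness is not part of xarray-spatial and assumes a NumPy-backed DataArray):

import numpy as np
import xarray as xr
from numba import njit

@njit
def _roughness(data):
    # max minus min of each 3x3 window; edge cells have no full window and get NaN
    h, w = data.shape
    out = np.empty((h, w), dtype=np.float32)
    for i in range(h):
        for j in range(w):
            if i == 0 or j == 0 or i == h - 1 or j == w - 1:
                out[i, j] = np.nan
            else:
                window = data[i - 1:i + 2, j - 1:j + 2]
                out[i, j] = window.max() - window.min()
    return out

def roughness(agg: xr.DataArray) -> xr.DataArray:
    # wrap the jitted kernel so the result keeps the input's coords, dims and attrs
    return xr.DataArray(_roughness(agg.data.astype(np.float32)),
                        coords=agg.coords, dims=agg.dims, attrs=agg.attrs)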

Contributors

  • @brendancol
  • @thuydotm
  • @jbednar
  • @pablomakepath
  • @kristinepetrosyan
  • @sjsrey
  • @giancastro
  • @ocefpaf
  • @rsignell-usgs
  • @marcozimmermannpm
  • @jthetzel
  • @chase-dwelle
  • @SAN154
  • @SapirLastimoza-Dooley
  • @lex-c
Comments
  • Add annulus focal kernel


    Addresses #125, #124

Uses the circle kernel as the basis for an annulus kernel. The implementation excludes the cells on the boundary of the annulus's inner radius.

    Also implements the ability to return the raw z-scores from hotspots.

    opened by chase-dwelle 7
  • fix tarball and drop versioneer


    versioneer is deprecated, does not work properly with pep517/518, and the source distribution tarball on PyPI is broken without the versioneer.py file in it. See https://github.com/conda-forge/staged-recipes/pull/12177

This PR implements setuptools_scm instead of versioneer, modernizes the Travis CI configuration a bit, implements a tarball check to avoid regressions, and fixes the tarball.

    Ping @rsignell-usgs who requested the conda-forge package.

    Needed in https://github.com/conda-forge/staged-recipes/pull/12177

    opened by ocefpaf 7
  • Added a pure numba hillshade that is 10x faster compared to numpy


The new code uses less context switching and fewer data transfers, minimizes communication overhead, and removes the cupy dependency. I've made it run by default if CUDA is available; otherwise it falls back to numpy.

    This change addresses issue #541

    opened by a7az0th 6
  • Add distributed proximity distance grids using Dask


Given a 2D raster image and a set of target pixels in that raster, the proximity function computes a proximity raster indicating the distance from each pixel to the nearest target pixel. The distance metric can be one of the following: Euclidean, Great-Circle, or Manhattan. In the most naive approach, we would calculate the distance from each pixel to every target pixel and keep the closest target. This would take m*n*t calculations, where m*n is the size of the raster and t is the number of targets.

The current implementation of the xrspatial.proximity function is ported from GDAL, with some modifications to make it work with xarray.DataArray. This notebook shows how to use the function. To keep things simple, let's consider the problem at the level of a 2D array instead of a 2D xarray DataArray. The algorithm can be described as follows:

    Inputs:

    • Raster image I, a height x width 2D array
    • Set of target pixels T. All target pixels are in I
    • Distance metric d()

    Output: Proximity raster P where P[i, j] is the distance from cell (i, j) to its nearest target pixel

    Idea: Use dynamic programming to identify the nearest target pixel of a pixel based on the nearest target pixels of its 3x3 neighborhood window.

Detailed implementation:

• Let Nx be a 1D array of width elements: Nx[j] is the x-position in pixel space of the nearest target pixel of pixel (i, j). Nx[j] can have a value in [0, width-1].
• Let Ny be a 1D array of width elements: Ny[j] is the y-position in pixel space of the nearest target pixel of pixel (i, j). Ny[j] can have a value in [0, height-1]. Values of Nx and Ny are updated row by row.
    1. Initially,
    • set Nx[j] = -1 for all j.
    • set Ny[j] = -1 for all j.
    • for all (i, j), set P[i, j] = 0 if cell (i, j) is a target pixel, P[i, j] = infinity otherwise.
• note that distance d( (i1, j1), (i2, j2) ) = infinity if any of the inputs i1, j1, i2, j2 equals -1 (i.e., an invalid pixel)
2. Traverse the image row by row from top to bottom:
• traverse each row from left to right
• traverse each row from right to left
3. Reset Nx[j] = -1 and Ny[j] = -1 for all j.
4. Traverse the image in reverse order from bottom to top:
• traverse each row from right to left
• traverse each row from left to right

The formula to update P[i, j], Ny[j], and Nx[j] at each cell (i, j) is:

    P[i, j] = min(
        P[i, j],                   
        d( (Ny[j], Nx[j]), (i, j) ),     # Are we nearer to the closest target to the above (below) pixel?
        d( (Ny[j-1], Nx[j-1]), (i, j) ), # Are we nearer to the closest target to the left (right) pixel?
        d( (Ny[j+1], Nx[j+1]), (i, j) ), # Are we nearer to the closest target to the top right (bottom left) pixel?
    )
    

    Update Ny[j] and Nx[j] accordingly:

    • Ny[j], Nx[j] = i, j if P[i, j] = 0
    • Ny[j], Nx[j] = Ny[j-1], Nx[j-1] if P[i, j] is updated as d( (Ny[j-1], Nx[j-1]), (i, j) )
    • Ny[j], Nx[j] = Ny[j+1], Nx[j+1] if P[i, j] is updated as d( (Ny[j+1], Nx[j+1]), (i, j) )
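
A minimal, illustrative NumPy sketch of the sequential sweeps described above (not the xrspatial implementation; the helper names are made up, and for brevity both horizontal neighbors are checked in every pass):

import numpy as np

def _dist(y1, x1, y2, x2):
    # "infinite" distance while the nearest target for that column is still unknown (-1)
    if y1 < 0 or x1 < 0:
        return np.inf
    return np.hypot(float(y1 - y2), float(x1 - x2))

def proximity_two_pass(img, target_value=1):
    # P starts at 0 on target pixels and infinity everywhere else
    h, w = img.shape
    P = np.where(img == target_value, 0.0, np.inf)

    def sweep(row_order, col_order):
        Nx = np.full(w, -1)   # x of the nearest target seen so far, per column
        Ny = np.full(w, -1)   # y of the nearest target seen so far, per column
        for i in row_order:
            for cols in (col_order, col_order[::-1]):
                for j in cols:
                    if P[i, j] == 0:
                        Ny[j], Nx[j] = i, j
                        continue
                    # candidates: nearest targets remembered for this column and its neighbors
                    for jj in (j, j - 1, j + 1):
                        if 0 <= jj < w:
                            d = _dist(Ny[jj], Nx[jj], i, j)
                            if d < P[i, j]:
                                P[i, j] = d
                                Ny[j], Nx[j] = Ny[jj], Nx[jj]

    sweep(range(h), list(range(w)))              # top to bottom
    sweep(range(h - 1, -1, -1), list(range(w)))  # bottom to top (Nx/Ny reset in between)
    return P

The bottom-up sweep is what lets targets below a pixel win; the open questions below are about performing these sweeps chunk by chunk with Dask.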

We're looking for a distributed implementation of the proximity function that works with Dask. The explicit questions are:

1. Currently, the calculations are performed sequentially; how can they be parallelized?
2. If we divide the 2D input data into smaller chunks, how do we compute the proximity distance grid chunk by chunk? The primary question is how to determine the nearest target pixel for all pixels in a chunk when the target pixels may lie outside that chunk.
    technical-assistance-needed 
    opened by marcozimmermannpm 6
  • nodata_value for zonal_stats doesn't appear to work


Describe the bug: Running zonal_stats and passing in the argument nodata_values = -99999 results in a ValueError: All arrays must be of the same length. Stack trace (method and line number): zonal.py line 519; zonal.py line 335; python3.7/site-packages/pandas/core/frame.py line 614; python3.7/site-packages/pandas/core/internals/construction.py line 465.

When I run zonal_stats without the nodata_values argument, it runs fine and computes the statistics.

The error occurs with either nodata_values = -99999 or nodata_values = -99999.0.

Expected behavior: Would expect zonal_stats to return statistics computed while ignoring pixels equal to the nodata value.

Screenshots: Unable to provide screenshots; running this in a classified environment.

Desktop (please complete the following information): Running this in Databricks 7.3 LTS with Python 3.7.5 and xarray-spatial 0.3.0.

    bug 
    opened by petersedivec 5
  • Compat error with `numpy>1.20`


Describe the bug: Cannot install the latest version of xarray-spatial with an up-to-date stack (numpy>1.20).

Expected behavior: Install the latest version of xarray-spatial through conda.

Screenshots: I'm trying to update the gds_env, and an install that pins xarray-spatial to its latest release returns the following error:

    package xarray-spatial-0.2.9-pyhd8ed1ab_0 requires numpy >=1.7,<=1.20, but none of the providers can be installed
    

The numpy version being installed (for compatibility with other libraries) is 1.21.2. Is there any reason why numpy is pinned to 1.20 as the maximum version?

    bug 
    opened by darribas 5
  • Optl reproj


Added reprojection with one blank function to test optional module installs. This can be extended to other modules as well. Install with pip install -e . --install-option='--reprojection' to get the reprojection module.

    ready to merge 
    opened by calexat-123 5
  • Ready: Fix examples download cli cmd


Changed the versioning scheme. The scm versioning in setup.py was not working correctly, so I switched to param (as in datashader) to generate versions. Conda versions are fixed at 0.1.6 right now.

    ready to merge 
    opened by calexat-123 5
  • Remove numpy version pin


    Now that numba 0.55 has been released with support for numpy 1.21 and python 3.10, we no longer need to pin to a max numpy version ourselves.

    Fixes issue #565.

    opened by ianthomas23 4
  • Fast terrain


    This PR brings all the terrain generation related development and optimization. In brief:

    • adds a fast_perlin.py module that contains a fast_perlin() method that can be used to generate perlin noise. It offers higher performance compared to the previous implementation of perlin noise and has a Numpy, CuPy and Dask implementation. Note that the interface is now slightly different. The user needs to pass an xr.DataArray (can be based on numpy, cupy or dask array) instead of width, height.
• adds a fast_terrain.py module that contains a generate_fast_terrain() method that can be used to generate a randomized terrain. It is faster compared to the existing implementation of terrain generation, uses the fast_perlin() method, and has a NumPy, CuPy and Dask implementation. Note that the interface is now slightly different: the user needs to pass an xr.DataArray (which can be based on a numpy, cupy or dask array) instead of a Canvas.
    • Adds two jupyter notebooks in the examples directory, one for each of the two newly added methods.
    opened by kiliakis 4
  • Update Release note versions


Hey, the last few releases need the CHANGELOG.md information copied over to the GitHub Release for each version (e.g. 0.1.4, 0.1.5, 0.1.6). Check those out; the content should be fine the way it is.

    opened by brendancol 4
  • Discrepancy between xarray-spatial hillshade and GDAL hillshade


I'm testing xarray-spatial against QGIS and seeing that when running hillshade on the same input data, the results from xarray-spatial and GDAL/QGIS are very different. Source code for the QGIS hillshade can be found at: https://github.com/qgis/QGIS/blob/master/python/plugins/processing/algs/gdal/hillshade.py. This needs more research into how the algorithm is implemented in QGIS to clearly understand the difference between the two libraries.

    Input data:

    array([[      nan,       nan,       nan,       nan,       nan,       nan],
           [704.237  , 242.24084, 429.3324 , 779.8816 , 193.29506, 984.6926 ],
           [226.56795, 815.7483 , 290.6041 ,  76.49687, 820.89716,  32.27882],
           [344.8238 , 256.34998, 806.8326 , 602.0442 , 721.1633 , 496.95636],
           [185.43515, 834.10425, 387.0871 , 716.0262 ,  49.61273, 752.95483],
           [302.4271 , 151.49211, 442.32797, 358.4702 , 659.8187 , 447.1241 ],
           [148.04834, 819.2133 , 468.97913, 977.11694, 597.69666, 999.14185],
           [268.1575 , 625.96466, 840.26483, 448.28333, 859.2699 , 528.04095]],
          dtype=float32)
    

    xarray-spatial hillshade:

    array([[       nan,        nan,        nan,        nan,        nan,            nan],
           [       nan,        nan,        nan,        nan,        nan,            nan],
           [       nan, 0.75030494, 0.06941041, 0.90643436, 0.15474272,            nan],
           [       nan, 0.80836594, 0.72366774, 0.14052185, 0.774778  ,            nan],
           [       nan, 0.93396175, 0.7071851 , 0.42872226, 0.9455124 ,            nan],
           [       nan, 0.85551083, 0.6819584 , 0.46013114, 0.23561102,            nan],
           [       nan, 0.41484872, 0.3213355 , 0.5821109 , 0.21879822,            nan],
           [       nan,        nan,        nan,        nan,        nan,            nan]],
     dtype=float32)
    

    QGIS/GDAL hillshade

    array([[  0.      ,   0.      ,   0.      ,   0.      ,   0.      ,          0.      ],
           [  0.      ,   0.      ,   0.      ,   0.      ,   0.      ,          0.      ],
           [107.84468 ,  70.09885 ,   0.      ,  17.661407,   0.      ,          0.      ],
           [ 80.06987 ,  71.644684,   0.      ,   0.      ,   0.      ,          0.      ],
           [ 85.574615, 106.36669 ,  96.23605 ,  28.27108 ,  90.29079 ,         85.07072 ],
           [ 81.44522 ,  77.092354,   8.479876,   0.      ,   0.      ,          0.      ],
           [ 62.541145,   2.647696,   0.      ,   0.      ,   0.      ,          6.515689],
           [ 74.07955 ,  78.71434 ,   0.      ,  84.590744,  34.814816,         44.81609 ]],
     dtype=float32)
    
    opened by thuydotm 0
  • Hotspots time series analysis


Much GIS data is collected frequently; for example, Sentinel-2 imagery is captured every 5 days. Having tools that support studying data over time can help in detecting changes, as well as identifying trends to predict the future. Time series analysis will be our next step. Let's do some research on how to do time series hotspot analysis.

    https://pro.arcgis.com/en/pro-app/latest/tool-reference/space-time-pattern-mining/emerginghotspots.htm

    https://pro.arcgis.com/en/pro-app/latest/tool-reference/spatial-statistics/space-time-analysis.htm

    opened by thuydotm 0
  • Review GPU supports for all tools


One of the 2023 focus areas is to strengthen GPU support across xarray-spatial tools. Let's review again, identify which tools need improvement, and decide what we can do.

    opened by thuydotm 0
  • Test against QGIS, GDAL


    To evaluate the reliability of xarray-spatial, let's test the package and compare the results against QGIS/GDAL.

    • [x] aspect
    • [x] slope
    • [x] multispectral tools
    • [x] proximity
    • [x] zonal stats

    Fixes #22, #104, #182

    ready to merge 
    opened by thuydotm 1
Releases (v0.3.5)
  • v0.3.5(Jun 5, 2022)

The 0.3.5 release mainly addresses the scaling issue in the GPU viewshed to achieve more accurate triangulation. The GPU ray-traced viewshed should now give results comparable to the CPU version. However, since the two versions use different approaches, there can be slight differences at some points, where one version returns visible while the other considers them invisible. Many thanks to @nodell111, @a7az0th, and the maintainer @thuydotm for contributing to this release.

    Enhancements

    • command to get change log (#716)
    • Added Feature Proposal Template (#714)

    Bug Fixes

    • Improved viewshed rtx. Now result should match the CPU version (#715)
  • v0.3.4(Jun 1, 2022)


The 0.3.4 release is primarily a bug fix release, but it also includes a number of enhancements with a focus on GPU support. Many thanks to @mkeenan195, @a7az0th, and the maintainers @ianthomas23 and @thuydotm for contributing to this release.

    Enhancements

    • NumPy zonal stats: return a data array of calculated stats (#685)
    • set unit for hotspots output (#686)
    • More robust cuda and cupy identification (#657)
    • Remove deprecated tiles module (#698)
    • Test on python 3.10, remove 3.6 (#694)
    • moved all tests to github actions (#689)
    • Add isort to pytest (#700)
    • Add flake8 to pytest (#697)
    • Remove unnecessary executable flags (#696)
    • updated test hotspots gpu (#692)
    • 3D numpy zonal_crosstab to support more agg methods (#687)

    Bug Fixes

    • Fix rtx viewshed rendering blank image (#711)
    • Convolve_2d gpu fixes (#702)
    • focal.mean(): only do data type conversion once (#699)
    • Update to remote sensing notebook (#688)
    • focal_stats(): gpu case (#709)
    • focal apply: drop gpu support (#706)
    • drop gpu support (#705)
    • enabled numba.cuda.jit in hotspots cupy (#691)

    Documentation

    • Correct examples in docstrings (#703)
    • Fix doc build dependencies in CI (#683)
    • Fix link to Austin road network notebook (#695)
  • v0.3.3(Mar 21, 2022)

    • fixed ubuntu version (#681)
    • Don't calculate angle when not needed (#677)
    • codecov: ignore all tests at once (#674)
    • add more tests to focal module (#676)
    • classify: more tests (#675)
    • Codecov: disable Numba; ignore tests, experimental, and gpu_rtx (#673)
    • Improve linter: add isort (#672)
    • removed stale test files from project root (#670)
    • User guide fixes (#665)
    • license year in README to include 2022 (#668)
    • install dependencies specified in test config (#666)
    • Pytests for CuPy zonal stats (#658)
    • add Codecov badge to README
    • codecov with github action (#663)
    • Modernise build system (#654)
    • classify tools: classify infinite values as nans, natural_breaks: classify all data points when using sub sample (#653)
    • Add more benchmarks (#648)
    • Stubbed out function for Analytics module (#621)
    • Fix doc build failure due to Jinja2 version (#651)
  • v0.3.2(Feb 4, 2022)

    • Remove numpy version pin (#637)
    • aspect: added benchmarks (#640)
    • Clean gitignore and manifest files (#642)
    • Benchmark results (#643)
    • handle CLI errors #442 (#644)
    • Cupy zonal (#639)
    • Tests improvements (#636)
  • v0.3.1(Jan 10, 2022)

    • Add benchmarking framework using asv (#595)
    • Fix classify bug with dask array (#599)
    • polygonize function on cpu for numpy-backed xarray DataArrays (#585)
    • Test python 3.9 on CI (#602)
    • crosstab: speedup dask case (#596)
    • Add benchmark for CPU polygonize (#605)
    • Change copyright year to include 2021 (#610)
    • Docs enhancement (#604, #628)
    • code refactor: use array function mapper, add messages param to not_implemented_func() (#612)
    • condense tests (#613)
    • Multispectral fixes (#617)
    • Change copyright year to 2022 (#622)
    • Aspect: convert to float if int dtype input raster (#619)
• direction(), allocation(): set all NaNs at initialization (#618)
    • Add rtx gpu hillshade with shadows (#608)
    • Add hillshade benchmarking, for numpy, cupy and rtxpy (#625)
    • Focal mean: handle nans inside kernel (#623)
    • Convert to float32 if input raster is in int dtype (#629)
  • v0.3.0(Dec 1, 2021)

    • Added a pure numba hillshade that is 10x faster compared to numpy (#542)
• dask case proximity: process the whole raster at once if max_distance exceeds the max possible distance (#558)
    • pathfinding: start and goal in (y, x) format (#550)
    • generate_terrain: cupy case, dask numpy case (#555)
    • Optimize natural_break on large inputs (#562)
    • Fixes in CPU version of natural_breaks. (#562) (#563)
    • zonal stats, speed up numpy case (#568)
    • Ensure that cupy is not None (#570)
    • Use explicit cupy to numpy conversion in tests (#573)
    • zonal stats: speed up dask case (#572)
    • zonal_stats: ensure chunksizes of zones and values are matching (#574)
    • validate_arrays: ensure chunksizes of arrays are matching (#577)
    • set default value for num_sample (#580)
    • Add rtx gpu viewshed and improve cpu viewshed (#588)
  • v0.2.9(Sep 1, 2021)

  • v0.2.8(Aug 27, 2021)

    • Added dask support to proximity tools (#540)
• Refactored the resample utils function and changed its name to canvas_like (#539)
    • Added zone_ids and cat_ids param to stats zonal function (#538)
  • v0.2.7(Jul 30, 2021)

    • Added Dask support for stats and crosstab zonal functions (#502)
    • Ignored NaN values on classify functions (#534)
    • Added agg param to crosstab zonal function (#536)
  • v0.2.6(Jun 28, 2021)

    • Updated the classification notebook (#489)
    • Added xrspatial logo to readme (#492)
    • Removed reprojection notebook old version (#494)
    • Added true_color function to documentation (#494)
    • Added th params to true_color function (#494)
    • Added pathfinding nb data load guidance (#491)
  • v0.2.5(Jun 24, 2021)

    • Added reprojection notebook (#474)
    • Reviewed local tools notebook (#466)
    • Removed save_cogs_azure notebook (#478)
    • Removed xrspatial install guidance from makepath channel (#483)
    • Moved local notebook to user guide folder (#486)
    • Fixed pharmacy notebook (#479)
    • Fixed path-finding notebook data load guidance (#480)
    • Fixed focal notebook imports (#481)
    • Fixed remote-sensing notebook data load guidance (#482)
    • Added output name and attrs on true_color function (#484)
    • Added classify notebook (#477)
  • v0.2.4(Jun 10, 2021)

  • v0.2.3(Jun 2, 2021)

  • v0.2.2(May 7, 2021)

  • v0.2.1(May 7, 2021)

    • Added GPU and Dask support for Focal tools: mean, apply, hotspots (#238)
    • Moved kernel creation functions to convolution module (#238)
    • Update Code of Conduct (#391)
    • Fixed manhattan distance to sum of abs (#309)
    • Example notebooks running on PC Jupyter Hub (#370)
    • Fixed examples download cli cmd (#349)
    • Removed conda recipe (#397)
    • Updated functions and classes docstrings (#302)
  • v0.2.0(Apr 29, 2021)

  • v0.1.9(Apr 27, 2021)

    • Deprecated tiles module (#381)
    • Added user guide on the documentation website (#376)
    • Updated docs design version mapping (#378)
    • Added Github Action to publish package to PyPI (#371)
    • Moved Spatialpandas to core install requirements for it to work on JLabs (#372)
    • Added CONTRIBUTING.md (#374)
    • Updated true_color to return a xr.DataArray (#364)
    • Added get_data module and example sentinel-2 data (#358)
    • Added citations guidelines and reformat (#382)
  • v0.1.8(Apr 15, 2021)

  • v0.1.7(Apr 15, 2021)

    • Updated multispectral.true_color: sigmoid contrast enhancement (#339)
    • Added notebook save cogs in examples directory (#307)
    • Updated Focal user guide (#336)
    • Added documentation step on release steps (#346)
    • Updated cloudless mosaic notebook: use Dask-Gateway (#351)
    • Fixed user guide notebook numbering (#333)
    • Correct warnings (#350)
    • Add flake8 Github Action (#331)
  • v0.1.6(Apr 13, 2021)

    • Cleared metadata in all examples ipynb (#327)
    • Moved docs requirements to source folder (#326)
    • Fixed manifest file
    • Fixed travis ci (#323)
    • Included yml files
• Fixed examples path in Pharmacy Deserts Notebook
    • Integrate xarray-spatial website with the documentation (#291)
  • v0.1.5(Apr 13, 2021)

  • v0.1.4(Apr 13, 2021)

  • v0.1.3(Apr 6, 2021)

    • Added band_to_img utils func
    • Added download-examples CLI command for all notebooks (#241)
    • docs enhancements
    • GPU and dask support for multispectral tools
    • GPU and Dask support for classify module (#168)
    • Fixed savi dask cupy test skip
    • Moved validate_arrays to utils
    • Added GPU support for hillshade (#151)
    • Added CLI for examples data
    • Improved Sphinx docs / theme
  • v0.1.2(Dec 1, 2020)

    • Added GPU support for curvature (#150)
    • Added dask.Array support for curvature (#150)
    • Added GPU support for aspect (#156)
    • Added dask.Array support for aspect (#156)
    • Added GPU support for slope (#152)
    • Added dask.Array support for slope (#152)
• Fixed slope cupy: nan edge effect, removed numpy padding that caused TypeError (#160)
• Fixed aspect cupy: nan edge effect, removed numpy padding that caused TypeError (#160)
    • Updated README with Supported Spatial Features Table
    • Added badge for open source gis timeline
    • Added GPU Support for Multispectral tools (#148)
    • Added Python 3.9 to Test Suite
  • v0.1.1(Oct 21, 2020)

  • v0.1.0(Sep 11, 2020)

    • Moved kernel creation to name-specific functions. (#127)
    • Separated the validate and custom kernel functions. (focal)
    • Added annulus focal kernel (#126) (focal)
    • Added outputting of z-scores from hotspots tool (focal)
    • Changed type checking to use np.floating (focal)
    • Added tests for refactored focal statistics (focal)
  • v0.0.9(Aug 28, 2020)

    • Added A* pathfinding
    • Allow all numpy float data types, not just numpy.float64 (#122)
    • Broke out user-guide into individual notebooks
    • Added num_sample param option to natural_breaks (#123)
    • Removed sklearn dependency
  • v0.0.8(Jul 22, 2020)

  • 0.0.7(Jul 22, 2020)

  • v0.0.6(Jul 15, 2020)


    • Added Proximity Direction (proximity)
    • Added Proximity Allocation (proximity)
    • Added Zonal Crop (zonal)
    • Added Trim (zonal)
    • Added ebbi (multispectral)
    • Added more tests for slope (slope)
    • Added image grid (readme)