A Python reference implementation of the CF data model

Overview

cfdm

A Python reference implementation of the CF data model.

Documentation

https://ncas-cms.github.io/cfdm

Tutorial

https://ncas-cms.github.io/cfdm/tutorial

Installation

https://ncas-cms.github.io/cfdm/installation

Functionality

The cfdm package implements the CF data model (https://doi.org/10.5194/gmd-10-4619-2017) for its internal data structures and so is able to process any CF-compliant dataset. It is not strict about CF-compliance, however, so partially conformant datasets may be read from existing files and written to new ones. This means that datasets which are only partially conformant can nonetheless be modified in memory.

The central elements defined by the CF data model are the field construct, which corresponds to a CF-netCDF data variable with all of its metadata, and the domain construct, which either describes the domain of a field construct or corresponds to a CF-netCDF domain variable with all of its metadata.

A simple example of reading a field construct from a file and inspecting it:

>>> import cfdm
>>> f = cfdm.read('file.nc')
>>> f
[<Field: air_temperature(time(12), latitude(64), longitude(128)) K>]
>>> print(f[0])
Field: air_temperature (ncvar%tas)
----------------------------------
Data            : air_temperature(time(12), latitude(64), longitude(128)) K
Cell methods    : time(12): mean (interval: 1.0 month)
Dimension coords: time(12) = [0450-11-16 00:00:00, ..., 0451-10-16 12:00:00] noleap
                : latitude(64) = [-87.8638, ..., 87.8638] degrees_north
                : longitude(128) = [0.0, ..., 357.1875] degrees_east
                : height(1) = [2.0] m

The cfdm package can:

  • read field and domain constructs from netCDF and CDL datasets,
  • create new field and domain constructs in memory,
  • write and append field and domain constructs to netCDF datasets on disk,
  • read, write, and create coordinates defined by geometry cells,
  • read and write netCDF4 string data-type variables,
  • read, write, and create netCDF and CDL datasets containing hierarchical groups,
  • inspect field and domain constructs,
  • test whether two constructs are the same,
  • modify field and domain construct metadata and data,
  • create subspaces of field and domain constructs,
  • incorporate, and create, metadata stored in external files, and
  • read, write, and create data that have been compressed by convention (i.e. ragged or gathered arrays), whilst presenting a view of the data in its uncompressed form.
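
For orientation, here is a minimal sketch of a typical workflow that combines several of the operations listed above (the file names and the property value are illustrative assumptions, not taken from the cfdm documentation):

>>> import cfdm
>>> f = cfdm.read('file.nc')[0]                  # read the first field construct
>>> f.set_property('comment', 'monthly means')   # modify its metadata
>>> g = f[0:6]                                   # subspace the leading (time) axis
>>> cfdm.write(g, 'new_file.nc')                 # write the result to a new dataset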

Command line utility

During installation the cfdump command line tool is also installed. It generates text descriptions of the field constructs contained in a netCDF dataset:

$ cfdump file.nc
Field: air_temperature (ncvar%tas)
----------------------------------
Data            : air_temperature(time(12), latitude(64), longitude(128)) K
Cell methods    : time(12): mean (interval: 1.0 month)
Dimension coords: time(12) = [0450-11-16 00:00:00, ..., 0451-10-16 12:00:00] noleap
                : latitude(64) = [-87.8638, ..., 87.8638] degrees_north
                : longitude(128) = [0.0, ..., 357.1875] degrees_east
                : height(1) = [2.0] m

Tests

Tests are run from within the cfdm/test directory:

$ python run_tests.py

Citation

If you use cfdm, either as a stand-alone application or to provide a CF data model implementation to another software library, please consider including the reference:

Hassell et al., (2020). cfdm: A Python reference implementation of the CF data model. Journal of Open Source Software, 5(54), 2717, https://doi.org/10.21105/joss.02717

@article{Hassell2020,
  doi = {10.21105/joss.02717},
  url = {https://doi.org/10.21105/joss.02717},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {54},
  pages = {2717},
  author = {David Hassell and Sadie L. Bartholomew},
  title = {cfdm: A Python reference implementation of the CF data model},
  journal = {Journal of Open Source Software}
}
Comments
  • Query: setting the variable name for a grid_mapping

    Query: setting the variable name for a grid_mapping

    Hello David,

    The script you gave me works fine now, and I've been able to modify it to adjust the names of variables in the output netCDF file by using either nc_set_variable or nc_set_dimension, with one exception: there is a grid_mapping variable which is generated with the name rotated_latitude_longitude and carries the datum. I can't find a construct which corresponds to this variable. Is there a way of changing the name-in-file of the grid_mapping variable?
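
    One approach that may work (a hedged sketch, not confirmed in this thread): the grid mapping is represented by a coordinate reference construct, which carries its own netCDF variable name. Assuming f is the field construct in question:

    >>> crs = f.constructs.filter_by_type('coordinate_reference')
    >>> for key, ref in crs.items():
    ...     if 'rotated_latitude_longitude' in ref.identity():
    ...         ref.nc_set_variable('rotated_pole')   # hypothetical new name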

    question 
    opened by martinjuckes 16
  • Constants inheritance proposal #1

    Constants inheritance proposal #1

    Hi @sadielbartholomew here are some thoughts for refactoring the constants access situation so that we can most effortlessly use it in cf-python. The user-facing API is unchanged. What do you think?

    The on-line docs are unchanged. There is a small change to the help(cfdm.rtol) docs, but tolerable, I think.

    cf-python would use it as follows:

    # in cf/functions.py
    
    class rtol(cfdm.rtol):
        _CONSTANTS = CONSTANTS
        _Constant = Constant
    
    
    rtol.__doc__ = cfdm.rtol.__doc__.replace('cfdm.', 'cf.')
    

    So cf-python need not know anything about the functionality nor API nor docs from cfdm.

    This is just one idea of many, I'm sure.

    opened by davidhassell 11
  • Container copy method implies deep copy behaviour

    Container copy method implies deep copy behaviour

    Initial work towards doctesting has implied, and after further investigation I can confirm, that the copy method of the abstract base class Container (i.e. cfdm.core.Container.copy), which is documented as being a deep copy operation, in fact displays the behaviour of a shallow copy.

    For example, note how setting an item nested inside the _custom dict of g is also reflected in f, appearing to be a reference rather than a copy of that item, whereas replacing a top-level value in g is not reflected in f:

    >>> # Setup
    >>> import cfdm
    >>> f = cfdm.core.abstract.container.Container()
    >>> f._custom
    {}
    >>> f._custom['feature'] = ['f']
    >>> f._custom
    {'feature': ['f']}
    
    # Apply the copy, expecting it to be deep
    >>> g = f.copy()
    >>> g._custom['feature'][0] = 'g'
    >>> g._custom
    {'feature': ['g']}
    
    # ...but note how the change is also reflected in f:
    >>> f._custom
    {'feature': ['g']}
    
    # ...though changing the top-level value for g does not influence f:
    >>> g._custom['feature'] = 'gee whiz'
    >>> g._custom
    {'feature': 'gee whiz'}
    >>> f._custom
    {'feature': ['g']}
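
    For comparison, a hedged sketch of the expected deep behaviour, using the standard-library copy.deepcopy purely for illustration (nested items are copied too, so f is unaffected):

    >>> import copy
    >>> h = copy.deepcopy(f._custom)
    >>> h['feature'][0] = 'h'
    >>> h
    {'feature': ['h']}
    >>> f._custom
    {'feature': ['g']}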
    

    Environment

    >>> cfdm.environment(paths=False)
    Platform: Linux-4.15.0-54-generic-x86_64-with-glibc2.10
    HDF5 library: 1.10.6
    netcdf library: 4.7.4
    Python: 3.8.5
    netCDF4: 1.5.4
    numpy: 1.19.4
    cfdm.core: 1.8.8.1
    cftime: 1.3.0
    netcdf_flattener: 1.2.0
    cfdm: 1.8.8.1
    
    documentation 
    opened by sadielbartholomew 9
  • Test update to GitHub Actions (Codecov coverage reports)

    Test update to GitHub Actions (Codecov coverage reports)

    Updates to our workflow to run the test suite, aiming to upload a coverage report to Codecov for two jobs, one each for the latest Ubuntu and macOS distributions, using Python 3.7 as a representative Python version (I think it would be superfluous & a waste of Actions jobs to upload further coverage reports, since each job should give a very similar, if not identical, figure).

    Testing in this PR to try to get the two coverage reports to show as a pair of 'checks' tied to the PR. Once I get that working, I can merge the PR.

    opened by sadielbartholomew 9
  • Basic logging w/ interface to methods' verbose kwarg

    Basic logging w/ interface to methods' verbose kwarg

    Resolves #31. This sets up minimal logging, delegating all applicable library messages to the logging module. By default they emerge as before, but there is new functionality in that they can be filtered out by severity via a configurable global log level on the root logger; the verbose keyword argument of a method behaves as before by overriding that log level.

    Implementation

    The main code changes here to achieve the above are:

    • loggers, which are instantiated on a modular basis, as is recommended practice, with all inheriting from the root logger initiated in the __init__.py (if we add further logging configuration, we should move this to a dedicated module e.g. logging-config.py);
    • a new function ~~cfdm.LOG_SEVERITY_LEVEL~~ [now renamed simply cfdm.LOG_LEVEL for clarity to reflect that the levels are a mixture of verbosity- & severity- based] to change the minimal level at which log messages emerge, being WARNING by default:
      • I have written it so it accepts either the named (case insensitive) words for the levels, e.g. WARNING, INFO, or instead an integer from ~~1 through to 5~~ [now -1 to 3 in a revised schema - see later comment] mapped to those, which is easier for users to recall, though less explicit;
    • a new decorator _manage_log_level_via_verbosity which provides the interface with decorated methods' verbose keyword argument, such that, as agreed, if verbose is:
      • ~~True: will display all log messages from that method & any method it calls;~~
      • None, as is now the default for all such methods: will use the global log severity level to determine which messages to display so some at lower levels can be filtered out if configured as such;
      • ~~False: will not display any log messages from that method or any method it calls.~~
      • [Edit:] verbose now takes integer levels consistent with those supported by cfdm.LOG_LEVEL for increased granularity of per-function verbosity control, see https://github.com/NCAS-CMS/cfdm/pull/34#issuecomment-626738123 though note we decided on a different schema, as above
    • all previous if verbose: print(<message>) statements being changed to log.<level>(<message>) messages, where the conditional on verbose is no longer required as the decorator handles the equivalent logic (& more).
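
    To illustrate the general pattern, here is a simplified, generic sketch of such a decorator (not the actual cfdm implementation; the name, signature, and level mapping are assumptions for illustration only):

    import functools
    import logging

    def manage_log_level_via_verbosity(func):
        """Temporarily override the root log level from a 'verbose' keyword."""
        @functools.wraps(func)
        def wrapper(*args, verbose=None, **kwargs):
            root = logging.getLogger()
            previous = root.level
            if verbose is not None:
                # Illustrative mapping only: higher verbose => more messages shown
                root.setLevel(logging.DEBUG if verbose >= 3 else logging.INFO)
            try:
                return func(*args, **kwargs)
            finally:
                root.setLevel(previous)  # restore the configured global level
        return wrapper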

    Work still TODO for this PR

    If I have missed anything, let me know! But at the least I still want to:

    • [x] set the levels of calls appropriately, since I have started by setting them all to info;
    • [x] document the configurability of the logging & the overriding verbose keyword argument in a new sub-section of the docs;
    • [x] amend the docstrings of methods with verbose kwargs to indicate the interface with the log severity level;
    • [x] add some testing of the emerging log levels, & of the new function & decorator.

    Post-PR work

    1. With this PR, log messages go, as before, to STDOUT as pure messages, i.e. no extra metadata such as datetime stamps or module names is included; however, such extensions can now be trivially set up in new handlers, e.g. file handlers writing full detail to dedicated log files for user-support purposes, if we wish.
    2. This PR does not add new logging calls; it simply replaces current print calls. Now that the basis is in place, we should add meaningful messages across the codebase at the applicable levels.
    3. Improved display of objects for readability. I have added a few pretty-print calls already to do this in some cases (e.g. see pprint.pformat in netcdfread.py).
    enhancement 
    opened by sadielbartholomew 9
  • Should the (field) equality operation be noncommutative?

    Should the (field) equality operation be noncommutative?

    There are cases where a.equals(b) evaluates differently to b.equals(a), at least with a and b being fields as I recently noticed during the work towards append mode (#69), namely for cases where the fields in question are the same except for one having a component missing. In other words, our equals method appears to not be commutative/symmetric for certain operands.

    See below for details of the particular case which I observed to give different results (confirmed on master), depending on which field the method is called on and which is passed as the argument.

    This raises some questions for me, because I found the difference in output of a.equals(b) and b.equals(a) confusing, given that equality in a logical sense should, to me (it's certainly the case in a mathematical sense), imply commutative behaviour. The relevant parts of the documentation did not seem to provide any information or clues as to whether the equality method should be symmetric or not, but I thought:

    Equality is strict by default.

    suggests it should, though perhaps (my emphasis):

    Any type of object may be tested but, in general, equality is only possible with another object of the same type, or a subclass of one

    could be relevant?

    My questions are:

    1. Is a difference in result for a.equals(b) and b.equals(a) something that should be possible, particularly in cases such as that outlined below, or is it a bug we should fix?
    2. In either case, I think we should add a few lines to the documentation to explicitly outline whether these cases are possible and for what inputs and what constructs the equals method is bound to, so there is no ambiguity.
    3. If it is a bug and we ensure symmetrical behaviour, could we and should we make it configurable so a field and another that is the same but reduced can be treated as equal if users desire in some context, e.g. with a kwarg called something like accept_subset?

    Example case

    Note this example with one field a and a field b that is the same but missing a time dimension coordinate:

    ...
    >>> a.dump()
    ------------------------------------------------------------------
    Field: air_potential_temperature (ncvar%air_potential_temperature)
    ------------------------------------------------------------------
    Conventions = 'CF-1.8'
    standard_name = 'air_potential_temperature'
    units = 'K'
    
    Data(time(36), latitude(5), longitude(8)) = [[[210.7, ..., 286.6]]] K
    
    Cell Method: area: mean
    
    Domain Axis: air_pressure(1)
    Domain Axis: latitude(5)
    Domain Axis: longitude(8)
    Domain Axis: time(36)
    
    Dimension coordinate: time
        standard_name = 'time'
        units = 'days since 1959-01-01'
        Data(time(36)) = [1959-12-16 12:00:00, ..., 1962-11-16 00:00:00]
        Bounds:Data(time(36), 2) = [[1959-12-01 00:00:00, ..., 1962-12-01 00:00:00]]
    
    Dimension coordinate: latitude
        standard_name = 'latitude'
        units = 'degrees_north'
        Data(latitude(5)) = [-75.0, ..., 75.0] degrees_north
        Bounds:Data(latitude(5), 2) = [[-90.0, ..., 90.0]] degrees_north
    
    Dimension coordinate: longitude
        standard_name = 'longitude'
        units = 'degrees_east'
        Data(longitude(8)) = [22.5, ..., 337.5] degrees_east
        Bounds:Data(longitude(8), 2) = [[0.0, ..., 360.0]] degrees_east
    
    Dimension coordinate: air_pressure
        standard_name = 'air_pressure'
        units = 'hPa'
        Data(air_pressure(1)) = [850.0] hPa
    >>> b.dump()
    --------------------------------------------------------------------
    Field: air_potential_temperature (ncvar%air_potential_temperature_1)
    --------------------------------------------------------------------
    Conventions = 'CF-1.8'
    standard_name = 'air_potential_temperature'
    units = 'K'
    
    Data(ncdim%time_1(36), latitude(5), longitude(8)) = [[[210.7, ..., 286.6]]] K
    
    Cell Method: area: mean
    
    Domain Axis: air_pressure(1)
    Domain Axis: latitude(5)
    Domain Axis: longitude(8)
    Domain Axis: ncdim%time_1(36)
    
    Dimension coordinate: latitude
        standard_name = 'latitude'
        units = 'degrees_north'
        Data(latitude(5)) = [-75.0, ..., 75.0] degrees_north
        Bounds:Data(latitude(5), 2) = [[-90.0, ..., 90.0]] degrees_north
    
    Dimension coordinate: longitude
        standard_name = 'longitude'
        units = 'degrees_east'
        Data(longitude(8)) = [22.5, ..., 337.5] degrees_east
        Bounds:Data(longitude(8), 2) = [[0.0, ..., 360.0]] degrees_east
    
    Dimension coordinate: air_pressure
        standard_name = 'air_pressure'
        units = 'hPa'
        Data(air_pressure(1)) = [850.0] hPa
    >>> a.equals(b)
    False
    >>> b.equals(a)
    True
    >>> a.equals(b, verbose=-1)
    Constructs: Comparing <DimensionCoordinate: time(36) days since 1959-01-01 >, <DimensionCoordinate: latitude(5) degrees_north>: 
    Constructs: Can't match constructs spanning axes ['time']
    Constructs: Can't match <DimensionCoordinate: time(36) days since 1959-01-01 >
    Constructs: Can't match <DimensionCoordinate: time(36) days since 1959-01-01 >
    Constructs: Can't match <DimensionCoordinate: time(36) days since 1959-01-01 >
    Field: Different metadata constructs
    False
    
    bug question 
    opened by sadielbartholomew 7
  • Allow functions to be decorated with via_verbosity

    Allow functions to be decorated with via_verbosity

    The _manage_log_level_via_verbosity decorator was only applicable to, and functioning for, methods, not for functions that are not bound to classes, but there are a number of cases of the latter in both cfdm and cf where we want to apply it. This PR:

    • confirms the above via extending the decorator unit test so there is failure when it is applied to a function; &
    • generalises the decorator so it can be used in any case, fixing the new test failures (this was easy as self can be included in the *args already included in the calls, so I just had to remove the explicit reference to self).
    opened by sadielbartholomew 7
  • Allow n=6 example field to be written to classic netCDF-3

    Allow n=6 example field to be written to classic netCDF-3

    As observed in the unit test for append mode, #69, the seventh (index 6) example field can't be written to 'NETCDF3_CLASSIC' format due to having int64 data, whereas the rest of the examples are writable to any of the supported netCDF formats:

    >>> import cfdm
    >>> f = cfdm.example_field(6)
    >>> # Writes fine to all of the other formats, e.g:
    >>> cfdm.write(f, 'file-to-write-to.nc', fmt='NETCDF3_64BIT_DATA')
    >>> cfdm.write(f, 'file-to-write-to.nc', fmt='NETCDF4')
    >>> # ... just not NETCDF3_CLASSIC:
    >>> cfdm.write(f, 'file-to-write-to.nc', fmt='NETCDF3_CLASSIC')
    Traceback (most recent call last):
      File "/home/sadie/cfdm/cfdm/read_write/netcdf/netcdfwrite.py", line 2728, in _write_netcdf_variable
        self._createVariable(**kwargs)
      File "/home/sadie/cfdm/cfdm/read_write/netcdf/netcdfwrite.py", line 2476, in _createVariable
        g["nc"][ncvar] = g["netcdf"].createVariable(**kwargs)
      File "src/netCDF4/_netCDF4.pyx", line 2771, in netCDF4._netCDF4.Dataset.createVariable
      File "src/netCDF4/_netCDF4.pyx", line 3822, in netCDF4._netCDF4.Variable.__init__
      File "src/netCDF4/_netCDF4.pyx", line 1950, in netCDF4._netCDF4._ensure_nc_success
    RuntimeError: NetCDF: Not a valid data type or _FillValue type mismatch
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ...
        nc_encodings = self._write_node_count(
      File "/home/sadie/cfdm/cfdm/read_write/netcdf/netcdfwrite.py", line 1693, in _write_node_count
        self._write_netcdf_variable(ncvar, (geometry_dimension,), count)
      File "/home/sadie/cfdm/cfdm/read_write/netcdf/netcdfwrite.py", line 2734, in _write_netcdf_variable
        raise ValueError(
    ValueError: Can't write int64 data from <Count: (2) > to a NETCDF3_CLASSIC file. Consider using a netCDF4 format, or use the 'datatype' parameter, or change the datatype before writing.
    

    So we said we would change it so that it can be written to all such formats, as with all the other examples.

    As a workaround for now for testing in #69 I added a skip at the relevant place in 15bd0d12f1347c97bc40d02032ab2b3aea992f7f.
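
    A possible user-side stopgap, hedged and not verified in this thread, is the datatype parameter suggested by the error message itself, which downcasts the data on write:

    >>> import numpy
    >>> cfdm.write(f, 'file-to-write-to.nc', fmt='NETCDF3_CLASSIC',
    ...            datatype={numpy.dtype('int64'): numpy.dtype('int32')})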

    opened by sadielbartholomew 6
  • Support for CDL

    Support for CDL

    It would be really useful to be able to read CDL files directly into cfdm, rather than having to first convert to binary netCDF files. Can this be added?
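
    (For reference, the feature list above includes reading CDL datasets; a minimal usage sketch, assuming a CDL text file 'file.cdl':)

    >>> import cfdm
    >>> f = cfdm.read('file.cdl')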

    bug enhancement 
    opened by martinjuckes 6
  • Setting a vertical coordinate reference system

    Setting a vertical coordinate reference system

    I've just started using cfdm ... still finding my way around the many classes. I'm trying to implement a vertical coordinate reference system for an atmosphere_hybrid_height_coordinate, using the "more complete" example in the tutorial (https://ncas-cms.github.io/cfdm/1.7.1/tutorial.html), which shows all the structures I need. In my code I get an error message (copied below) about an unexpected argument. I can reproduce the error if I take your script from the tutorial (which works as it is) and comment out the line "tas.set_construct(horizontal_crs)", so that, as in the script I want to create, there is only a vertical coordinate reference. The script still executes fine, and tas.dump() also works as expected, but cfdm.write( "tas.nc", tas ) produces the following:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/local/lib/python3.5/dist-packages/cfdm/read_write/write.py", line 357, in write
        verbose=verbose)
      File "/usr/local/lib/python3.5/dist-packages/cfdm/read_write/netcdf/netcdfwrite.py", line 3402, in write
        self._write_field(f)
      File "/usr/local/lib/python3.5/dist-packages/cfdm/read_write/netcdf/netcdfwrite.py", line 2699, in _write_field
        self._create_vertical_datum(ref, owning_coord_key)
      File "/usr/local/lib/python3.5/dist-packages/cfdm/read_write/netcdf/netcdfwrite.py", line 2853, in _create_vertical_datum
        datum=self.implementation.get_datum(ref))
    TypeError: initialise_CoordinateReference() got an unexpected keyword argument 'coordinates'
    
    bug 
    opened by martinjuckes 6
  • Online docs are for the wrong version

    Online docs are for the wrong version

    The on-line docs at https://ncas-cms.github.io/cfdm/ are for v1.9.0.4, but the files in the docs directory have been correctly updated to v1.10.0.0 (i.e. the current version).

    Any ideas, @sadielbartholomew?

    bug 
    opened by davidhassell 5
  • Document accepted argument types for `source` parameters

    Document accepted argument types for `source` parameters

    Equivalent to NCAS-CMS/cf-python#492 but note that here there are many more (in total ~50) classes which accept the source: parameter (since these get inherited downstream for cf-python which doesn't need to redefine them), so once we decide what to document for that parameter, consistently, we probably want to start by updating them here in cfdm:

    $ pwd
    /home/sadie/cfdm/cfdm
    $ git grep "source:"
    auxiliarycoordinate.py:            source: optional
    bounds.py:            source: optional
    cellmeasure.py:            source: optional
    coordinatereference.py:            source: optional
    core/abstract/container.py:            source: optional
    core/abstract/parameters.py:            source: optional
    core/abstract/parameters.py:        if source:
    core/abstract/parametersdomainancillaries.py:            source: optional
    core/abstract/parametersdomainancillaries.py:        if source:
    core/abstract/properties.py:            source: optional
    core/abstract/propertiesdata.py:            source: optional
    core/abstract/propertiesdatabounds.py:            source: optional
    core/bounds.py:            source: optional
    core/cellmeasure.py:            source: optional
    core/cellmethod.py:            source: optional
    core/cellmethod.py:        if source:
    core/constructs.py:            source: optional
    core/coordinatereference.py:            source: optional
    core/coordinatereference.py:        if source:
    core/data/data.py:            source: *optional*
    core/domain.py:            source: optional
    core/domainaxis.py:            source:
    core/field.py:            source: optional
    count.py:            source: optional
    data/abstract/compressedarray.py:            source: optional
    data/abstract/raggedarray.py:            source: optional
    data/data.py:            source: optional
    data/gatheredarray.py:            source: optional
    data/netcdfarray.py:            source: optional
    data/raggedcontiguousarray.py:            source: optional
    data/raggedindexedarray.py:            source: optional
    data/raggedindexedcontiguousarray.py:            source: optional
    data/subarray/abstract/subarray.py:            source: optional
    data/subarray/abstract/subsampledsubarray.py:            source: optional
    data/subarray/gatheredsubarray.py:            source: optional
    data/subsampledarray.py:            source: optional
    datum.py:            source: optional
    dimensioncoordinate.py:            source: optional
    domain.py:            source: optional
    domainancillary.py:            source: optional
    domainaxis.py:            source: optional
    field.py:            source: optional
    fieldancillary.py:            source: optional
    index.py:            source: optional
    interiorring.py:            source: optional
    interpolationparameter.py:            source: optional
    list.py:            source: optional
    mixin/files.py:            source: optional
    mixin/netcdf.py:            source: optional
    mixin/propertiesdatabounds.py:            source: optional
    nodecountproperties.py:            source: optional
    partnodecountproperties.py:            source: optional
    tiepointindex.py:            source: optional
    $ git grep "source:" | wc -l
    53
    
    documentation code tidy/refactor 
    opened by sadielbartholomew 0
  • Slight change in `_FillValue` property value on write-read

    Slight change in `_FillValue` property value on write-read

    Possible bug, but also possibly due to floating point precision subtleties. Identified by David during another PR and its review. See https://github.com/NCAS-CMS/cfdm/pull/222#issuecomment-1295208100 for details.

    bug 
    opened by sadielbartholomew 0
  • Zero values reported by `repr` on fully-masked datetime arrays

    Zero values reported by `repr` on fully-masked datetime arrays

    Identified by David during another PR and its review. See https://github.com/NCAS-CMS/cfdm/pull/222#issuecomment-1295197082 for details. The zeroes should not appear, and it isn't clear without investigation why they aren't being printed as masked values as they should be, given the nature of the underlying (masked) data.

    bug 
    opened by sadielbartholomew 0
  • Duplicate initialisation message codes in `NetCDFRead`

    Duplicate initialisation message codes in `NetCDFRead`

    Whilst improving some docstrings in #183 I noticed that there were two cases where duplicate integer values were provided as codes in the _code0 dictionary, namely duplication on 200 and 201 with:

    https://github.com/NCAS-CMS/cfdm/blob/a9816e02f1640f7c63e8214122b302d1206b68ee/cfdm/read_write/netcdf/netcdfread.py#L35-L36

    and

    https://github.com/NCAS-CMS/cfdm/blob/a9816e02f1640f7c63e8214122b302d1206b68ee/cfdm/read_write/netcdf/netcdfread.py#L53-L54

    where for all other keys in that dictionary, and indeed in the similar dict _code1, the values are unique numbers. Possibly this wasn't noticed before due to the ordering of the key-value pairs, which are ordered with the values strictly increasing except for the former pair of bounds-related cases which were out-of-place.

    @davidhassell is this an issue or is there a reason for the duplication? Thanks.

    question 
    opened by sadielbartholomew 0
  • Consider string-like data equal despite different data type

    Consider string-like data equal despite different data type

    This is to implement the changed behaviour agreed to be appropriate for Data.equals with non-numeric array inputs in the mini-thread https://github.com/NCAS-CMS/cf-python/pull/254#discussion_r786631596, both for cfdm itself and as passed downstream to cf-python. Namely:

    we should allow any two string-like data types to be equal

    noting this is (knowingly) in contrast to the behaviour we have chosen for numeric data, where different (numeric) data types will be considered unequal, even if the data itself is the same.

    opened by sadielbartholomew 0
  • Subsampled coordinates (2: writing to disk)

    Subsampled coordinates (2: writing to disk)

    In CF-1.9, lossy compression by subsampled coordinates was introduced, and needs to be implemented in cfdm.

    #167 deals with most of this, but not the writing of subsampled coordinates to netCDF files. This needs to be dealt with separately, as it may involve a refactor of netcdfwrite.py.

    enhancement 
    opened by davidhassell 0
Releases (v1.10.0.1)
  • v1.10.0.1(Oct 31, 2022)

    2022-10-31

    • New method: cfdm.Data.get_tie_point_indices
    • New method: cfdm.Data.get_interpolation_parameters
    • New method: cfdm.Data.get_dependent_tie_points
    • Record the names of files that contain the original data (https://github.com/NCAS-CMS/cfdm/issues/215)
    • New method: cfdm.Field.get_original_filenames
    • New method: cfdm.Data.get_original_filenames
    • New keyword parameter to cfdm.write: omit_data (https://github.com/NCAS-CMS/cfdm/issues/221)
    • Fixed bug that caused incorrect data assignment with some multiple list indices (https://github.com/NCAS-CMS/cfdm/issues/217)
    • Fixed bug that caused a failure when printing date-time data with the first element masked (https://github.com/NCAS-CMS/cfdm/issues/211)
  • v1.10.0.0(Aug 17, 2022)

    2022-08-17

    • New method: cfdm.Field.auxiliary_coordinate
    • New method: cfdm.Field.cell_measure
    • New method: cfdm.Field.cell_method
    • New method: cfdm.Field.coordinate
    • New method: cfdm.Field.coordinate_reference
    • New method: cfdm.Field.dimension_coordinate
    • New method: cfdm.Field.domain_ancillary
    • New method: cfdm.Field.domain_axis
    • New method: cfdm.Field.field_ancillary
    • New method: cfdm.Field.indices
    • New attribute: cfdm.Field.array
    • New attribute: cfdm.Field.datetime_array
    • New construct retrieval API methods (https://github.com/NCAS-CMS/cfdm/issues/179)
    • Implement (bar writing to netCDF files) lossy compression by coordinate subsampling (https://github.com/NCAS-CMS/cfdm/issues/167)
  • v1.9.0.4(Aug 1, 2022)

    2022-07-18

    • Upgrade to allow cfdm to work with Python 3.10 (https://github.com/NCAS-CMS/cfdm/issues/187)
    • Fix bug that caused a hang when reading zero-length files (https://github.com/NCAS-CMS/cfdm/issues/190)
    • Fix bug to prevent error when writing vlen strings to a netCDF file when compression has been set (for netCDF4>=1.6.0) (https://github.com/NCAS-CMS/cfdm/issues/199)
  • v1.9.0.3(Mar 10, 2022)

    2022-03-10

    • Fixed bug that caused a failure from cfdm.write when writing identical (auxiliary) coordinates to different data variables in different groups (https://github.com/NCAS-CMS/cfdm/issues/177)
    • Fixed bug that caused cf.Domain.__str__ to fail when a dimension coordinate construct does not have data (https://github.com/NCAS-CMS/cfdm/issues/174)
    • New dependency: packaging>=20.0
    • Changed dependency: cftime>=1.6.0
  • v1.9.0.2(Jan 31, 2022)

    2022-01-31

    • Fixed bug that caused a cfdm.write failure when a vertical coordinate reference construct has no coordinates (https://github.com/NCAS-CMS/cfdm/issues/164)
    • Fixed bug that caused a failure when downstream identities methods return an itertools.chain object (https://github.com/NCAS-CMS/cfdm/issues/170)
  • v1.9.0.1(Oct 12, 2021)

    2021-10-12

    • Fixed bug that prevented some geometry coordinates being written to netCDF CLASSIC files (https://github.com/NCAS-CMS/cfdm/issues/140)
    • Fixed bug that caused a segmentation fault when appending a string data type to netCDF files (https://github.com/NCAS-CMS/cfdm/issues/155)
    • Fixed bug in cf.Field.get_domain when there are climatological time axes (https://github.com/NCAS-CMS/cfdm/issues/159)
  • v1.9.0.0(Sep 21, 2021)

    2021-09-21

    • Python 3.6 support removed (https://github.com/NCAS-CMS/cfdm/issues/139)
    • Conversion of cfdm.Domain to a non-abstract class that may be read from and written to a netCDF dataset (https://github.com/NCAS-CMS/cfdm/issues/111)
    • New method: cfdm.Domain.creation_commands
    • New method: cfdm.Domain.climatological_time_axes
    • New method: cfdm.AuxiliaryCoordinate.del_climatology
    • New method: cfdm.AuxiliaryCoordinate.get_climatology
    • New method: cfdm.AuxiliaryCoordinate.is_climatology
    • New method: cfdm.AuxiliaryCoordinate.set_climatology
    • New method: cfdm.DimensionCoordinate.del_climatology
    • New method: cfdm.DimensionCoordinate.get_climatology
    • New method: cfdm.DimensionCoordinate.is_climatology
    • New method: cfdm.DimensionCoordinate.set_climatology
    • New function: cfdm.unique_constructs
    • New function: cfdm.example_fields
    • Construct access API changes from 1.8.9.0 applied to Field.convert
    • Improved error message for invalid inputs to Field.convert
    • Raise exception when attempting to write multiply defined coordinate reference parameters (https://github.com/NCAS-CMS/cfdm/issues/148)
    • Interpret format specifiers for size 1 cfdm.Data arrays (https://github.com/NCAS-CMS/cfdm/issues/152)
    • Fix file name expansions in cfdm.write (https://github.com/NCAS-CMS/cfdm/issues/157)
  • v1.8.9.0(May 25, 2021)

    2021-05-25

    • Construct access API changes (https://github.com/NCAS-CMS/cfdm/issues/124, https://github.com/NCAS-CMS/cfdm/issues/130, https://github.com/NCAS-CMS/cfdm/issues/132, https://github.com/NCAS-CMS/cfdm/issues/137)
    • Performance enhancements (https://github.com/NCAS-CMS/cfdm/issues/124, https://github.com/NCAS-CMS/cfdm/issues/130)
    • New write mode mode='a' for appending to, rather than over-writing, a netCDF file on disk (https://github.com/NCAS-CMS/cfdm/issues/143)
    • Better error message in the case of a numpy.ma.core.MaskError occurring upon reading of CDL files with only header or coordinate information (https://github.com/NCAS-CMS/cfdm/issues/128)
    • Fix for zero-sized unlimited dimensions when read from a grouped netCDF file (https://github.com/NCAS-CMS/cfdm/issues/113)
    • Fix bug causing occasional non-symmetric equals operations (https://github.com/NCAS-CMS/cfdm/issues/133)
    • Changed dependency: cftime>=1.5.0
    • Changed dependency: netCDF4>=1.5.4
  • v1.8.8.0(May 24, 2021)

    2020-12-18

    • The setting of global constants can now be controlled by a context manager (https://github.com/NCAS-CMS/cfdm/issues/100)
    • Fixed bug that caused a failure when writing a dataset that contains a scalar domain ancillary construct (https://github.com/NCAS-CMS/cfdm/issues/98)
    • Changed dependency: cftime>=1.3.0
Owner
NCAS CMS
Useful tools to support NERC weather and climate research