Graviti TensorBay Python SDK

Overview

TensorBay Python SDK

TensorBay Python SDK is a Python library for accessing TensorBay and managing your datasets.
It provides:

  • A Pythonic way to access your TensorBay resources through the TensorBay OpenAPI.
  • An easy-to-use CLI tool, gas (Graviti AI Service), for communicating with TensorBay.
  • A consistent dataset format to read and write your datasets.

Installation

pip3 install tensorbay

Documentation

More information can be found on the documentation site.

Usage

An AccessKey is needed to communicate with TensorBay. Please visit this page to get an AccessKey first.

Authorize a client object

from tensorbay import GAS
gas = GAS("")

Create a Dataset

gas.create_dataset("DatasetName")

List Dataset names

# Method "list_dataset_names()" returns an iterator, use "list()" to transfer it to a "list".
dataset_list = list(gas.list_dataset_names())

Upload images to the Dataset

from tensorbay.dataset import Data, Dataset

# Organize the local dataset with the "Dataset" class before uploading.
dataset = Dataset("DatasetName")

# TensorBay uses "segment" to separate different parts in a dataset.
segment = dataset.create_segment()

segment.append(Data("0000001.jpg"))
segment.append(Data("0000002.jpg"))

dataset_client = gas.upload_dataset(dataset)

# TensorBay provides a dataset version control feature; commit the uploaded data before using it.
dataset_client.commit("Initial commit")
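
The version control client can also be used to double-check the upload; a minimal sketch (the "list_drafts" and "list_commits" methods are referenced in the release notes further down this page, so treat the exact output as unverified):

# Inspect the drafts and commits of the dataset after committing.
for draft in dataset_client.list_drafts():
    print(draft)
for commit in dataset_client.list_commits():
    print(commit)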

Read images from the Dataset

from PIL import Image
from tensorbay.dataset import Segment

dataset_client = gas.get_dataset("DatasetName")

segment = Segment("", dataset_client)  # "" is the default segment name created above

for data in segment:
    with data.open() as fp:
        image = Image.open(fp)
        width, height = image.size
        image.show()
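
Newer SDK versions also support reading a dataset lazily through the "Dataset" class itself (see the v1.4.0 release notes further down this page); a hedged sketch of that alternative read path, reusing the placeholder dataset name from above:

from PIL import Image
from tensorbay import GAS
from tensorbay.dataset import Dataset

gas = GAS("")  # fill in your AccessKey

# "keys()" returns a tuple of segment names; index the dataset by name to get a segment.
dataset = Dataset("DatasetName", gas)
segment = dataset[dataset.keys()[0]]

for data in segment:
    with data.open() as fp:
        image = Image.open(fp)
        print(image.size)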

Delete the Dataset

gas.delete_dataset("DatasetName")
Comments
  • Dataset empty after upload

    I uploaded a dataset with the following code:

    from glob import glob
    from tensorbay import GAS
    from tensorbay.dataset import Dataset, Data
    from tensorbay.label import Classification
    from tensorbay.opendataset._utility import glob
    
    import os
    
    gas = GAS("") # access key
    
    _SEGMENTS = {"train": True, "test": True}
    
    gas.create_dataset("cifar-10")
    
    
    def cifar_10(path: str) -> Dataset:
        root_path = os.path.abspath(os.path.expanduser(path))
        dataset = Dataset("cifar-10")
        dataset.load_catalog(os.path.join(os.path.dirname(__file__), "catalog.json"))
    
        for segment_name, is_labeled in _SEGMENTS.items():
            segment = dataset.create_segment(segment_name)
            image_paths = glob(os.path.join(root_path, segment_name, "*\*.jpg"))
            for image_path in image_paths:
                data = Data(image_path)
                if is_labeled:
                    data.label.classification = Classification(
                        os.path.basename(os.path.dirname(image_path))
                    )
                segment.append(data)
    
        return dataset
    
    ds = cifar_10("..\\CIFAR-10-images")
    dataset_client = gas.upload_dataset(ds, jobs=8)
    
    

    and tried loading the dataset with:

    >>> from tensorbay import GAS
    >>> from tensorbay.dataset import Dataset
    >>> gas = GAS("") # access key
    >>> ds = Dataset("cifar-10", gas)
    >>> ds
    Dataset("cifar-10") []
    >>> ds.keys()
    ()
    

    looks like the dataset is empty.
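
    A likely cause, going by the README note above that uploaded data must be committed before use, is that the upload was left as an uncommitted draft; a hedged sketch of the missing step (an assumption, not a confirmed answer to this issue):

    dataset_client = gas.upload_dataset(ds, jobs=8)
    # Assumption: without a commit, the uploaded data stays in a draft and is
    # not visible when reading the dataset's committed version.
    dataset_client.commit("Upload CIFAR-10")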

    opened by FayazRahman 6
  • fix(cli): fix the wrong graph in "gas log" when merging branches

    When merging the left branch, the right branch also needs to move to the left. The wrong sample:

    * 4c564ea (b3) 15
    * a3f25ce 13
    * ebb775e 8
    | * 2da7d6e (b1) 14
    | * aeac003 6
    | | * 8c9e82e (main) 12
    | | | * 1c22915 (b5) 11
    | | |/
    | |/|
    | * | bec94e0 a
    | | | * ef0e4c2 (b2) 10
    | | | * dee8986 7
    | | |/
    | |/|
    |/| |
    * | | d7677e0 4
    * | | e1d77c4 3
    |/|
    | | * f478341 (b4) 9
    | |/
    | * dc2a3cc 5
    |/
    * d0eeb20 2
    * bce9056 1
    

    Fixed:

    * 4c564ea (b3) 15
    * a3f25ce 13
    * ebb775e 8
    | * 2da7d6e (b1) 14
    | * aeac003 6
    | | * 8c9e82e (main) 12
    | | | * 1c22915 (b5) 11
    | | |/
    | |/|
    | * | bec94e0 a
    | | | * ef0e4c2 (b2) 10
    | | | * dee8986 7
    | | |/
    | |/|
    |/| |
    * | | d7677e0 4
    * | | e1d77c4 3
    |/ /
    | | * f478341 (b4) 9
    | |/
    | * dc2a3cc 5
    |/
    * d0eeb20 2
    * bce9056 1
    
    opened by marshallmallows 2
  • style(geometry): fix the mypy type incompatible error in polyline.py

    tensorbay/geometry/polyline.py|187 col 68 error| [mypy][E][]
    Argument 1 to "_max_distance_in_point_pairs" of "Polyline2D" has incompatible
    type "Sequence[Sequence[float]]"; expected "ndarray"  [arg-type]
    
    tensorbay/geometry/polyline.py|187 col 75 error| [mypy][E][]
    Argument 2 to "_max_distance_in_point_pairs" of "Polyline2D" has incompatible
    type "Sequence[Sequence[float]]"; expected "ndarray"  [arg-type]
    
    tensorbay/geometry/polyline.py|197 col 68 error| [mypy][E][]
    Argument 1 to "_max_distance_in_point_pairs" of "Polyline2D" has incompatible
    type "Sequence[Sequence[float]]"; expected "ndarray"  [arg-type]
    
    tensorbay/geometry/polyline.py|197 col 79 error| [mypy][E][]
    Argument 2 to "_max_distance_in_point_pairs" of "Polyline2D" has incompatible
    type "List[Sequence[float]]"; expected "ndarray"  [arg-type]
    
    opened by QianBao8902 2
  • AttributeError occurred when using `keys()` to list dataset segment names

    AttributeError occurred when listing the segments of the dataset "Flower17" by segment_names = dataset.keys().

    Environment:

    • SDK version 1.5.0
    • MacOS 10.15.7
    • Python 3.8.3

    Error Message:

    Traceback (most recent call last):
      File "issue_for_dataset_keys.py", line 8, in <module>
        segment_names = dataset.keys()
      File "/Users/yanghaote/.pyenv/versions/3.8.3/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorbay/dataset/dataset.py", line 253, in keys
        return tuple(self._segments._data)
    AttributeError: 'Dataset' object has no attribute '_segments'
    
    
    opened by Meeks0125 2
  • ValueError occurred when using `Vector2D.index()` to get the index of the last value.

    ValueError occurred when using Vector2D.index() to get the index of the last value.

    Environment:

    • tensorbay 1.4.1
    • Ubuntu 20.04
    • Python 3.8.5

    Error message:

    In [1]: from tensorbay.geometry import Vector2D

    In [2]: v = Vector2D(1, 2)

    In [3]: v
    Out[3]: Vector2D(1, 2)

    In [4]: v.index(1)
    Out[4]: 0

    In [5]: v.index(2)
    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    <ipython-input-5-fbcb5f600aae> in <module>
    ----> 1 v.index(2)
    
    ~/.local/lib/python3.8/site-packages/tensorbay/utility/user.py in index(self, value, start, stop)
         76 
         77         """
    ---> 78         return self._data.index(value, start, stop)
         79 
         80     def count(self, value: _T) -> int:
    
    ValueError: tuple.index(x): x not in tuple
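
    For context, this mirrors plain Python behavior: tuple.index(value, start, stop) treats stop as exclusive, so a default stop such as -1 (an assumption about the wrapper's signature; the fix is recorded as #627 in the v1.5.0 release notes further down this page) silently excludes the last element. A minimal reproduction independent of the SDK:

    t = (1, 2)
    print(t.index(1, 0, -1))  # 0: index 0 is still inside the half-open range [0, 1)
    try:
        t.index(2, 0, -1)     # the last element falls outside that range
    except ValueError as error:
        print(error)          # tuple.index(x): x not in tuple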
    
    opened by newbie255 2
  • AttributeError occurred when using CADC dataloader

    AttributeError occurred when using CADC dataloader.

    Environment:

    • tensorbay 1.4.0
    • Ubuntu 20.04
    • python 3.6.0

    Error message:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/yexuan/virtualenvs/versions_3.6/lib/python3.6/site-packages/tensorbay/opendataset/CADC/loader.py", line 98, in CADC
        segment.append(_load_frame(sensors, data_path, frame_index, annotation, timestamps))
      File "/home/yexuan/virtualenvs/versions_3.6/lib/python3.6/site-packages/tensorbay/opendataset/CADC/loader.py", line 130, in _load_frame
        timestamp = datetime.fromisoformat(timestamps[sensor_name][frame_index][:23]).timestamp()
    AttributeError: type object 'datetime.datetime' has no attribute 'fromisoformat'
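
    datetime.fromisoformat was added in Python 3.7, so it is missing on the Python 3.6.0 reported above (the v1.4.1 release notes further down this page record the fix, #576). A 3.6-compatible parse of a millisecond-precision timestamp would look roughly like this (the sample value is made up):

    from datetime import datetime

    raw = "2018-03-06T16:02:35.123"  # hypothetical 23-character timestamp slice
    timestamp = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%f").timestamp()
    print(timestamp)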
    
    opened by newbie255 2
  • JSONDecodeError occurred when uploading dataset

    JSONDecodeError occurred when uploading dataset.

    Environment:

    • tensorbay 1.4.0
    • Ubuntu 20.04
    • python 3.8.5

    Error message:

    Traceback (most recent call last):
      File "upload_nuScenes.py", line 15, in <module>
        gas.upload_dataset(dataset, jobs=8)
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/gas.py", line 411, in upload_dataset
        dataset_client._upload_segment(  # pylint: disable=protected-access
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/dataset.py", line 860, in _upload_segment
        multithread_upload(
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/requests.py", line 342, in multithread_upload
        future.result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 432, in result
        return self.__get_result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
        raise self._exception
      File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/dataset.py", line 861, in <lambda>
        lambda args: segment_client.upload_frame(*args),
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/segment.py", line 530, in upload_frame
        version_id, etag = self._post_multipart_formdata(
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/segment.py", line 146, in _post_multipart_formdata
        response_headers = self._client.do(
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/requests.py", line 279, in do
        return self.session.request(method=method, url=url, **kwargs)
      File "/home/newbie/.local/lib/python3.8/site-packages/tensorbay/client/requests.py", line 163, in request
        error_code = response.json()["code"]
      File "/usr/lib/python3/dist-packages/requests/models.py", line 888, in json
        return complexjson.loads(
      File "/usr/lib/python3/dist-packages/simplejson/__init__.py", line 518, in loads
        return _default_decoder.decode(s)
      File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 370, in decode
        obj, end = self.raw_decode(s)
      File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 400, in raw_decode
        return self.scan_once(s, idx=_w(s, idx).end())
    simplejson.errors.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
    

    The last part of the uploading log:

    ERROR:tensorbay.client.requests:requests.exceptions.ConnectionError: HTTPSConnectionPool(host='content-store-prod-version.oss-cn-shanghai.aliyuncs.com', port=443): Max retries exceeded with url: / (Caused by ReadTimeoutError("HTTPSConnectionPool(host='content-store-prod-version.oss-cn-shanghai.aliyuncs.com', port=443): Read timed out. (read timeout=30)"))
    ===================================================================
    ########################## HTTP Request ###########################
    "url": https://content-store-prod-version.oss-cn-shanghai.aliyuncs.com/
    "method": POST
    "headers": {
      "User-Agent": "python-requests/2.22.0",
      "Accept-Encoding": "gzip, deflate",
      "Accept": "*/*",
      "Connection": "keep-alive",
      "Content-Type": "multipart/form-data; boundary=8db056e91aa54e239e3d76a1fe557060",
      "Content-Length": "696198"
    }
    "body": 
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="OSSAccessKeyId"\r\n\r\n'
    b'LTAI4FjgXD3yFJUat4KADigE'
    
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="Signature"\r\n\r\n'
    b'41+Z00FnVUFKWQZdloESGBwxIR8='
    
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="policy"\r\n\r\n'
    b'eyJjb25kaXRpb25zIjpbWyJzdGFydHMtd2l0aCIsIiRrZXkiLCJiYTg4NzljMjUzZmM4Njg1MThkMDhhMjFmM2Y2Y2JmOS8wMTg2ZjUxZS00MDZkLTQzZWUtOWU0MC04MGE5ZDMwOGExNGEvLnNlZ21lbnQvdjEuMC10cmFpbnZhbC9zY2VuZS0wMDY0Ly5zZWdtZW50X2VuZC8iXV0sImV4cGlyYXRpb24iOiIyMDIxLTA1LTE4VDA4OjEwOjI3WiJ9'
    
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="success_action_status"\r\n\r\n'
    b'200'
    
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="key"\r\n\r\n'
    b'ba8879c253fc868518d08a21f3f6cbf9/0186f51e-406d-43ee-9e40-80a9d308a14a/.segment/v1.0-trainval/scene-0064/.segment_end/n008-2018-08-01-15-52-19-0400__LIDAR_TOP__1533153294197161.pcd.bin'
    
    --8db056e91aa54e239e3d76a1fe557060
    b'Content-Disposition: form-data; name="file"; filename="n008-2018-08-01-15-52-19-0400__LIDAR_TOP__1533153294197161.pcd.bin"\r\n\r\n'
    [695040 bytes of object data]
    
    --8db056e91aa54e239e3d76a1fe557060--
    
    opened by newbie255 2
  • raise KeyError: 'translation' when I get a FusionSegment instance

    I get the error below when calling segment = FusionSegment("", dataset_client):

    Environment:

    • SDK version 1.2.0
    • MacOS 10.15.7
    • Python 3.7.9

    Error message:

    ---------------------------------------------------------------------------
    KeyError                                  Traceback (most recent call last)
    <ipython-input-11-2f69882c94c1> in <module>
          1 dataset_client = gas.get_dataset(dataset_name, True)
    ----> 2 segment = FusionSegment("FV0180V9_Label_20200730_131113_018", dataset_client)
          3 # dataset_client.checkout(draft_number=1)
          4 # segment_client = dataset_client.get_segment("FV0180V9_Label_20200730_131113_018")
          5 # segment_client.delete_data("FV0180V9_Label_20200730_131113_018.mf400_remap_4I_screenRGB888_0048179.png")
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/dataset/segment.py in __init__(self, name, client)
        137             self._client = client.get_segment(name)
        138             self._data = list(self._client.list_frames())
    --> 139             self.sensors = self._client.get_sensors()
        140         else:
        141             self._data = []
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/client/segment.py in get_sensors(self)
        449         ).json()
        450 
    --> 451         return Sensors.loads(response["sensors"])
        452 
        453     def upload_sensor(self, sensor: Sensor) -> None:
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/sensor/sensor.py in loads(cls, contents)
        591 
        592         """
    --> 593         return common_loads(cls, contents)
        594 
        595     def dumps(self) -> List[Dict[str, Any]]:
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/utility/common.py in common_loads(object_class, contents)
         35     """
         36     obj: _T = object.__new__(object_class)
    ---> 37     obj._loads(contents)  # type: ignore[attr-defined]  # pylint: disable=protected-access
         38     return obj
         39 
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/sensor/sensor.py in _loads(self, contents)
        539         self._data = SortedDict()
        540         for sensor_info in contents:
    --> 541             self.add(Sensor.loads(sensor_info))
        542 
        543     @classmethod
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/sensor/sensor.py in loads(contents)
        158 
        159         """
    --> 160         sensor: "Sensor._Type" = common_loads(SensorType(contents["type"]).type, contents)
        161         return sensor
        162 
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/utility/common.py in common_loads(object_class, contents)
         35     """
         36     obj: _T = object.__new__(object_class)
    ---> 37     obj._loads(contents)  # type: ignore[attr-defined]  # pylint: disable=protected-access
         38     return obj
         39 
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/sensor/sensor.py in _loads(self, contents)
        127         super()._loads(contents)
        128         if "extrinsics" in contents:
    --> 129             self.extrinsics = Transform3D.loads(contents["extrinsics"])
        130 
        131     @staticmethod
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/geometry/transform.py in loads(cls, contents)
        187 
        188         """
    --> 189         return common_loads(cls, contents)
        190 
        191     @property
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/utility/common.py in common_loads(object_class, contents)
         35     """
         36     obj: _T = object.__new__(object_class)
    ---> 37     obj._loads(contents)  # type: ignore[attr-defined]  # pylint: disable=protected-access
         38     return obj
         39 
    
    ~/.pyenv/versions/miniconda3-4.7.12/lib/python3.7/site-packages/tensorbay/geometry/transform.py in _loads(self, contents)
        156 
        157     def _loads(self, contents: Dict[str, Dict[str, float]]) -> None:
    --> 158         self._translation = Vector3D.loads(contents["translation"])
        159         rotation_contents = contents["rotation"]
        160         self._rotation = quaternion(
    
    KeyError: 'translation'
    
    opened by chasers0-0 2
  • Cannot connect to S3

    I'm using the following code to create an S3 storage config:

    gas.create_s3_storage_config(
            "S3Provider",
            "hub-2.0-tests/cifar-10-tensorbay",
            endpoint="http://hub-2.0-tests.s3-website.us-east-1.amazonaws.com",
            accesskey_id=os.getenv("AWS_ACCESS_KEY_ID"),
            accesskey_secret=os.getenv("AWS_SECRET_ACCESS_KEY"),
            bucket_name="hub-2.0-tests",
        )
    

    But I'm getting

    Unexpected status code(500)!
    ===================================================================
    ########################## HTTP Request ###########################
    "url": https://gas.graviti.com/gatewayv2/tensorbay-open-api/v1/storage-configs/s3
    "method": POST
    "headers": {
      "User-Agent": "python-requests/2.25.1",
      "Accept-Encoding": "gzip, deflate",
      "Accept": "*/*",
      "Connection": "keep-alive",
      "X-Token": "ACCESSKEY-c7dae4ce277842a3b47a3b683159678c",
      "X-Source": "PYTHON-SDK/1.24.0",
      "X-Request-Id": "e92eab123ec5417aad86ebfb57c206d6",
      "Content-Length": "274",
      "Content-Type": "application/json"
    }
    "body": {"name": "S3Provider", "filePath": "hub-2.0-tests/cifar-10-tensorbay", "endpoint": "http://hub-2.0-tests.s3-website.us-east-1.amazonaws.com", "accesskeyId": "ASIAQYP5ISLXYZQKS3X2", "accesskeySecret": "NxrQKlUTuX2ZdapSRe9aTblfmXOsr5vIzLG4erYc", "bucketName": "hub-2.0-tests"}
    
    ########################## HTTP Response ##########################
    "url": https://gas.graviti.com/gatewayv2/tensorbay-open-api/v1/storage-configs/s3
    "status_code": 500
    "reason": Internal Server Error
    "headers": {
      "Date": "Sun, 08 May 2022 14:16:42 GMT",
      "Content-Type": "application/json; charset=utf-8",
      "Content-Length": "178",
      "Connection": "keep-alive",
      "Access-Control-Allow-Origin": "*",
      "X-Kong-Upstream-Latency": "20",
      "X-Kong-Proxy-Latency": "1",
      "Via": "kong/2.0.4"
    }
    "content": {
      "code": "InternalServerError",
      "message": "An exception occurred inside the service, please try again later : ErrorCreateHostingFailed > ErrorCreateHostingConfigGroupIdEmpty"
    }
    "cost_time": 1.892544s
    ===================================================================
    
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\Users\fayaz\miniconda3\lib\site-packages\tensorbay\client\gas.py", line 262, in create_s3_storage_config
        self._client.open_api_do("POST", "storage-configs/s3", json=post_data)
      File "C:\Users\fayaz\miniconda3\lib\site-packages\tensorbay\client\requests.py", line 126, in open_api_do
        raise ResponseErrorDistributor.get(error_code, ResponseError)(
    tensorbay.exception.InternalServerError: Unexpected status code(500)! https://gas.graviti.com/gatewayv2/tensorbay-open-api/v1/storage-configs/s3!
                         {"code":"InternalServerError","message":"An exception occurred inside the service, please try again later : ErrorCreateHostingFailed \u003e ErrorCreateHostingConfigGroupIdEmpty"}
    

    What could be the reason for this? Are there any examples I can look at? I couldn't find any in your docs.

    opened by FayazRahman 2
Releases(v1.24.1)
  • v1.24.1(Sep 23, 2022)

    Bug Fixes:

    • Fix the TypeError when using _repr1 on python builtin type with _repr_type (#1258)

    Install Dependencies:

    • Set the minimum version of urllib3 to v1.15 (#1254)
    • Remove the limitation numpy <= 1.20.3 (#1265)

    Documentation:

    • Add the pdf download option on the Read The Docs website (#1263)
    • Correct the example code in the docs (#1255, #1257)
    • Replace all the .cn url with .com in the docs (#1252)
  • v1.24.0(Apr 12, 2022)

    New Features:

    • Support basicSearch job (#1213)

      • Implement class BasicSearchJob (#1222, #1231, #1245)
      • Implement class BasicSearch (#1224, #1232, #1234, #1240, #1243, #1248)
      • Implement SearchResult related classes (#1225, #1235, #1244)
    • Support uploading multiple labels (#1223)

    • Add open dataset loaders HKD (#1194, #1238)

    Bug Fixes:

    • Fix the incorrect parentCategories of nuImages dataloader (#1192)
    • Fix the incorrect creation form of frames in nuImages dataloader (#1218)
    • Remove the space in the category names of VGGFace2 dataloader (#1201)
    • Fix the NoneType error in Job.update (#1200)
    • Remove the space in the remote path of CarConnection dataloader (#1204)
    • Add the missing job type to job system (#1233, #1242, #1239)

    Improvements:

    • Replace . with - in attribute names of nuImages dataloader (#1226)
    • Move method retry from Job to SquashAndMergeJob (#1247)
    • Refine the error message when calling unimplemented get_callback_body method (#1246)

    Documentation:

    • Add the doc about "BasicSearch" and "SearchResult" (#1229, #1241)
    • Use Sphinx "List Table" directive to generate tables (#1193)
    • Reconstruct "Fusion Dataset" in chapter "Advanced Features" (#1210)
    • Refine the docs (#1228, #1198, #1214, #1221, #1237)
  • v1.20.0(Jan 21, 2022)

    New Features:

    • Add open dataset loaders VGGFace2 (#1146)

    Bug Fixes:

    • Fix the failure to read cached files in a multiprocess environment (#1151, #1188)
    • Fix Job.started_at not being updated (#1189)
    • Fix the ValueError when init PointList2D and RLE with numpy array (#1190)
    • Fix the NoneType error in Job.update (#1200)

    Improvements:

    • Remove the multiprocess lock and add pid to cached file path (#1188)

    Documentation:

    • Change the Sphinx html theme to "furo" (#1061)
    • Refine the docstring (#1163)
  • v1.19.0(Dec 30, 2021)

    New Features:

    • Implement the async job framework (#1155, #1182)

      • Implement JobMixin._create_job (#1162)
      • Implement JobMixin._get_job (#1166)
      • Implement Job.update (#1174, #1184)
      • Implement JobMixin.delete_job (#1171)
      • Implement Job.retry and Job.abort (#1170)
      • Implement JobMixin._list_job (#1173)
    • Implement SquashAndMerge feature by async job framework (#1134, #1179)

      • Implement SquashAndMerge.create_job (#1168)
      • Implement SquashAndMerge.delete_job (#1175)
      • Implement SquashAndMerge.get_job (#1176)
      • Implement SquashAndMerge.list_job (#1177, #1181)
      • Implement SquashAndMergeJob.result (#1178, #1183)
    • Add the following open dataset loader:

      • CityscapesGTFine (#1160)
      • CityscapesGTCoarse (#1167)
      • RarePlanesReal (#1157)
      • RarePlanesSynthetic (#1158)
      • UrbanObjectDetection (#1172)

    Improvements:

    • Getting mask urls from OpenAPI getDataDetails to reduce the number of requests (#1114)

    Documentation:

    • Update the docs about "Squash And Merge" in "Features" (#1169, #1180)
    • Refine the docs (#1152, #1159)
    • Refine the docstring (#1130, #1154)
  • v1.18.1(Dec 17, 2021)

  • v1.18.0(Dec 15, 2021)

    New Features:

    • Add interfaces to communicate with Sextant Apps (#1112, #1132, #1139, #1153)
      • Implement Sextant.list_benchmarks (#1133)
      • Implement Sextant.list_evaluations (#1136)
      • Implement Sextant.create_evaluations (#1135)
      • Implement Evaluation.get_result (#1137)
    • Add SemanticMask for BDD100K_MOTS2020 dataset (#1102)
    • Add author, updated_at and parent_commit_id for Draft (#1108)
    • Add --show-drafts for CLI gas log to support displaying open drafts (#1124, #1161)

    Improvements:

    • Remove the redundant attributes in BDD100K dataset (#1109)
    • Update VersionControlClient to VersionControlMixin (#1131)
    • Use stem to represent the filename without extension in opendataset module (#1145)

    Bug fixes:

    • Add the missing categoryDelimiter to the OxfordIIITPet catalog (#1111)
    • Fix NoneType not subscriptable error in listMaskUrls (#1118)
    • Fix the AttributeError when calling the mocker class methods (#1119)
    • Correct the wrong attribute names in nuImages catalog (#1148)

    Documentation:

    • Add "PaddlePaddle" instructions in "INTEGRATIONS" chapter (#1128)
    • Add documentation for Sextant in "Applications" chapter (#1149)
    • Reconstruct Storage Config in chapter Advanced Features (#1116)
    • Support deleting automatically generated rst files via make clean (#1129)
    • Refine the docstring (#1110, #1115)
  • v1.17.2(Dec 6, 2021)

  • v1.17.1(Dec 6, 2021)

  • v1.17.0(Dec 2, 2021)

    New Features:

    • Add --sort option for CLI gas branch and gas tag to support sorting branches and tags (#1095)
    • Support recording statistical info for Data.open().read() in Profile class (#1096)
    • Add StorageConfig class to store storage config info (#1098)
    • Add SegmentClient.get_data to get data info by its remote_path (#1071)
    • Support showing a piece of data via remote_path in CLI gas ls (#1097)

    Improvements:

    • Add URL class for getting and updating the file url (#1088, #1092, #1142)
    • Add train, val and test in segment name in DAVIS2017 (#1100)

    Bug fixes:

    • Fix the wrong graph in CLI gas log when merging branches (#1101)
    • Add the missing categoryDelimiter in COCO2017 catalog (#1106)
    • Fix the FileExistsError when using cache with multiprocess (#1113)

    Documentation:

    • Use autosummary to generate rst files in opendataset module under "API" chapter (#1086)
    • Refine the docs (#1103, #1105)
    • Refine the docstring (#1089)
  • v1.16.1(Nov 30, 2021)

  • v1.16.0(Nov 16, 2021)

    New Features:

    • Add checking and warning for free storage space when enabling cache (#1081)
    • Add the following open dataset loaders:
      • SCUT_FBP5500 (#1076)
      • DAVIS2017 (#1078)

    Improvements:

    • Add argument encoding='utf-8' to open() to make it compatible for Windows (#1064)
    • Use xmltodict to parse xml files instead of xml.ElementTree (#1070)
    • Add file tensorbay/py.typed to comply with PEP-561 (#1085)

    Bug fixes:

    • Fix Data.open() failing to catch the exception when the url is expired (#1062)

    Documentation:

    • Reconstruct "Version Control" chapter (#1063)
    • Fix the typo in docs (#1066)
    • Refine the docs (#1060, #1065, #1067, #1069, #1072, #1050, #1073, #1080)
    • Refine the docstring (#1074)

    Deprecations:

    • Remove the deprecated method GAS.create_auth_dataset (#1087)
  • v1.15.0(Nov 1, 2021)

    New Features:

    • Support setting is_public in GAS.create_dataset (#1038)
    • Support importing auth cloud storage data to fusion dataset (#1037)
    • Support cache when opening remote data (#1041)
    • Add DatasetClientBase.get_total_size to get the total data size in a dataset commit (#1047)
    • Add VersionControlClient.squash_and_merge to merge two different branches (#1019)

    Improvements:

    • Use os.path.expanduser to handle the CLI config file path in different systems (#1042)
    • Replace relative import with absolute import for all python files in SDK (#1043)
    • Remove the useless checks about commits and drafts in ls.py (#1046)
    • Add X-Request-Id to all the tensorbay OpenAPI request headers (#1049, #1052)
    • Add detailed resuming message when uploading process got interrupted in CLI gas cp (#1053)

    Bug fixes:

    • Add a workaround for the IndexError when loading the data with panoptic mask (#1036)
    • Remove the wrong error message in gas dataset when tbrn is absent (#1039)

    Documentation:

    • Add docs about the data cache (#1056)
    • Add docs for using pharos on remote server (#965)
    • Refine the docs (#1035, #1040, #1048, #1045, #1051, #1057)
  • v1.14.0(Oct 18, 2021)

    New Features:

    • Display elapsed time of the request in the debug log (#1031)
    • Use response.elapsed to get the cost time of a request in profile (#1032)
    • Add DatasetClientBase.get_label_statistics to get the label statistics info (#1018)
    • Support updating file urls when the urls are expired (#1020)
    • Add open dataset loaders SegTrack2 (#1007)

    Improvements:

    • Consolidate all config-related functions in ContextInfo (#1003)

    Performances:

    • Adapt OpenAPI getDataDetails to increase the speed of reading labels and urls (#1017, #1025, #1033)
    • Upload auth cloud storage data labels on the cloud-callback OpenAPI to increase the uploading speed (#1014)

    Documentation:

    • Add "Update Dataset Notes" section in "Update Dataset" chapter (#1008)
    • Add docs about cloud file in-place importing in "Cloud Storage" (#1034)
    • Add docs about getting the label statistics info in "Get Label Statistics" (#1018)
    • Refactor the "Version Control" chapter (#782)
    • Refine the docs (#1009, #1011)
  • v1.13.0(Sep 28, 2021)

    New Features:

    • Support filtering drafts by status and branch_name in VersionControlClient.list_drafts (#988)

    • Add class Profile to record the statistical information about the HTTP requests in SDK (#992)

      • Support using Profile in multi-process environment (#996)
      • Support saving Profile summary to csv, txt or json file (#1004, #1021, #1022, #1024)
    • Add the following BDD100K related open dataset loaders (#846, #860, #868, #872, #914, #959, #967, #973, #995, #1006):

      • BDD100K
      • BDD100K-10K
      • BDD100K-MOT
      • BDD100K-MOTS
    • Add the following open dataset loaders:

      • SegTrack (#983, #998)
      • COCO2017 (#1000)
    • Unify the format of prompt messages in CLI (#980)

    Bug fixes:

    • Fix the dataset loader LIP and CIHP cannot be imported from opendataset module (#1013)
    • Correct the wrong visible status in LIP dataset loader (#1026)

    Improvements:

    • Unify the variable names about directory and tbrn (#997, #1002)
    • Use ValueError, StatusError to replace OperationError (#1028)

    Documentation:

    • Add examples about SemanticMask and InstanceMask (#990)
    • Add catalog structure for different label types (#991)
    • Add docs about how to use Profile (#1010)
    • Fix the typos in docs (#1005)
    • Refine the docs (#982)

    Deprecations:

    • Remove the deprecated class ResponseSystemError(#1029)
  • v1.12.0(Sep 13, 2021)

    New Features:

    • Support creating dataset with different cloud storage configs in GAS.create_dataset (#984)

    • Add the following methods to create customized cloud storage configs for different cloud services (#987, #989):

      • GAS.create_oss_storage_config
      • GAS.create_s3_storage_config
      • GAS.create_azure_storage_config
    • Add GAS.delete_storage_config to delete customized cloud storage config (#966)

    • Enable the method SegmentClient.delete_data to delete data (#946)

    • Add FusionSegmentClient.delete_frame to delete frame (#960)

    • Support getting and updating dataset isPublic flag (#972, #974)

    • Add the following open dataset loaders:

      • CIHP (#948)
      • VOC2012Segmentation (#925)
      • PASCALContext (#942)
      • LIP (#947)
      • OxfordIIITPet (#941)
      • nuImages (#958)
    • Print more friendly error messages in CLI (#939)

    Interface Adjustments:

    • Set current revision as the default in VersionControlClient.list_commits (#949)
    • Disable method FusionSegmentClient.delete_data to stop deleting data in Fusion dataset (#962)

    Improvements:

    • Prohibit closing current draft in VersionControlClient.close_draft (#955)
    • Avoid getting done_frames when skip_upload_files is False in FusionDatasetClient.upload_segment (#993)

    Documentation:

    • Add docs about "Update Dataset Meta" (#999)
    • Add docs about "Delete Frame" (#1001)
    • Fix the typos in docs (#977)
    • Refine the docs (#963, #825, #994)

    Deprecations:

    • Remove the deprecated code before v1.10.0 (#976)
    • Deprecate GAS.create_auth_dataset and use GAS.create_dataset instead (#984)
  • v1.11.0(Aug 30, 2021)

    New Features:

    • Add DatasetClient.get_diff to get the diff of the specified commit (#897, #911, #935, #936, #940)
    • Support Bezier Curve in LabeledPolyline2D (#956)
    • Show colorized commits info in CLI gas log (#892, #894)
    • Add open dataset loader SVHN (#896, #908)

    Bug Fixes:

    • Correct the wrong attribute name in opendataset.VOC2012Detection (#895)

    Improvements:

    • Refactor the exception system

      • Rename ResponseSystemError to InternalServerError (#920, #932)
      • Add ForbiddenError (#921)
      • Unify the signature of the exception classes (#916)
      • Amend the status code of NameConflictError to 409 (#922)
      • Fix the AttributeError when calling ResponseError.response (#928)
    • Add class ContextInfo to avoid reading config file repeatedly in CLI module (#893)

    • Check the config name in GAS.get_cloud_client (#898)

    • Remove the useless code about checking whether the catalog is empty (#913)

    • Transfer the input path to absolute path for VOC2012Detection and VOC2012ActionClassification (#927)

    • Delete trainval segments for VOC2012Detection and VOC2012ActionClassification (#926)

    Documentation:

    • Add docs about diff (#934)
    • Add docs about "Beizer Curve" (#964)
    • Add docs about "merge datasets" (#953, #978)
    • Refine the docs (#923)
  • v1.10.2(Aug 26, 2021)

  • v1.10.1(Aug 24, 2021)

    New Features:

    • Support new label types: SemanticMask, InstanceMask and PanopticMask (#854, #915, #917, #918, #924, #954, #957)

    Interface Adjustments:

    • Remove the loads method for DataBase, Data and AuthData (#900)
    • Add get_callback_body method to replace dumps in Data (#902)
    • Add from_response_body method to replace loads in RemoteData & Frame (#903)

    Improvements:

    • Add FileMixin and RemoteFileMixin for file-related methods (#899)
    • Support setting custom loader and dumper in AttrsMixin (#910)

    Documentation:

    • Add docs about SemanticMask, InstanceMask and PanopticMask (#943, #951, #961)
  • v1.10.0(Aug 16, 2021)

    New Features:

    • Support new label types Polygon and Polyline2D, add five label formats:

      • Polygon + MultiPolygon + RLE (#822, #824, #832, #840)
      • Polyline2D + MultiPolyline2D (#802, #821)
    • Add the following open dataset loaders:

      • VOC2012Detection (#830, #844, #857)
      • VOC2012ActionClassification (#837, #842, #857)
      • CCPD & CCPDGreen (#850)
    • Add name conflict check for SortedNameList (#883)

    • Support setting network request config (is_internal, max_retries, and timeout) in CLI gas config (#836)

    • Support validating the AccessKey and displaying the user info in CLI gas auth (#848)

    • Add --status option for CLI gas auth to display the user and auth info (#855)

    • Support displaying error messages for CLI gas auth (#856)

    • Add -l option for CLI gas ls to show the total number of resources (#864)

    • Add --all option for CLI gas log to show the commits in all branches (#829)

    • Add --graph option for CLI gas log to show the graphical commits (#874)

    • Set -h as an abbreviation of --help to show CLI help messages (#880)

    Interface Adjustments:

    • Remove the path argument in GAS.create_auth_dataset because the path is bound to the cloud storage config (#919)

    Bug Fixes:

    • Add a workaround for Chinese characters displaying garbled on Windows (#906)

    Improvements:

    • Use list comprehension in PointList2D.__init__ instead of for-loop (#841)
    • Set None as the default value of the argument alias in GAS.update_dataset (#870)
    • Correct the illegal remote paths for RP2K dataset loader (#871)
    • Encapsulate the moving segment logic into DatasetClientBase._move_segment (#879)
    • Move the deprecation related classes into deprecated.py (#878)
    • Stop showing the redundant field name in AttributeInfo .__repr__ (#887)
    • Cleanup the following useless codes:
      • Remove the useless class variable _label_attrs from label related classes (#843)
      • Remove the useless class SubcatalogTypeRegister (#847)
      • Remove the useless class LabelType (#849)
      • Remove the useless type Subcatalogs (#853)

    Documentation:

    • Add docs about "Shell Completion" in the "CLI" section (#873, #886)
    • Add docs about new label formats Polygon and Polyline2D (#884)
    • Add docs about setting network request config in CLI gas config (#907)
    • Add docs about --all and --graph options in CLI gas log (#909)
    • Fix the typo in docs (#845)
    • Refine the docs (#866)
  • v1.9.0(Aug 4, 2021)

    New Features:

    • Support override and skip strategies in SegmentClient.move_data (#819)

    • Add GAS.get_user to get the current user info (#808)

    • Support manipulating dataset alias:

      • Support getting the dataset alias in GAS.get_dataset (#810)
      • Support setting the dataset alias in GAS.create_dataset (#812)
      • Add GAS.update_dataset to update the dataset alias (#813)
    • Support setting and showing description in CLI gas draft (#803)

    • Add --edit option in CLI gas draft to edit a draft (#805)

    • Add --close option in CLI gas draft to close a draft (#811)

    • Add draft description to the pop-up editor of gas commit as default message (#815)

    • Display the corresponding message after deleting data or a segment in gas rm (#859)

    Interface Adjustments:

    • Change the signature of Transform3D.set_rotation and Sensor.set_rotation (#881)

    Improvements:

    • Unify the description display logic in gas draft and gas commit (#817)
    • Set the file name as "" when posting files to cloud storage (#858)
    • Add utility function chunked to break an iterable into length n tuples (#876)
    • Set the minimum version of urllib3 back to v1.15 (#867)

    Performance:

    • Speed up file uploads for fusion dataset in GAS.upload_dataset by adapting OpenAPI multiCallback (#804)
    • Speed up file uploads by sending file size to TensorBay (#862)
    • Speed up file uploads by enlarging the batch size of multiCallback (#882)

    Documentation:

    • Add the docs about --edit and --close option in CLI gas draft (#827)
    • Add the docs about CLI "Profile" (#781)
    • Refine the docs (#791, #783, #861, #890)
    • Fix the typo in docstrings (#801, #818, #888)

    Deprecations:

    • Remove the default value of the argument title in VersionControlClient.create_draft (#820)
    • Remove the deprecated message about setting AccessKey in gas config (#839)
  • v1.8.1(Jul 16, 2021)

    Bug fixes:

    • Fix CLI gas log --oneline that displays all commits in oneline (#835)

    Documentation:

    • Replace -t with -m in the docs of CLI gas draft (#828)
    • Add details about strategy in the docs of copy and move operations (#831)
    • Fix the Synopsis section display error in CLI gas commit --help (#833)
    • Use batch move in the example code of moving data instead of one by one (#834)
  • v1.8.0(Jul 14, 2021)

    New Features:

    • Add Data.get_url to get the file:// url for a local file (#789)

    • Support more operations on draft:

      • Add status and description to the Draft class (#796)
      • Add VersionControlClient.update_draft to update title and description of a draft (#793)
      • Add VersionControlClient.close_draft to close a draft (#793)
      • Support setting description in VersionControlClient.create_draft (#799)
    • Add --message option in CLI gas draft to set title and description of a draft (#735)

    • Add SegmentClient.list_urls and FusionSegmentClient.list_urls to list the file urls (#807)

    • Add the following open dataset loaders:

      • UAVDT (#723)
      • CACD (#728)
      • AADB (#712)
      • COVID_CT (#784)

    Bug fixes:

    • Fix the ImportError when importing tensorbay in Python 3.6 (#814)

    Improvements:

    • Add shorten function to get the short commit ID (#755)
    • Merge the CLASSIFICATION label into BOX2D label in the CompCars open dataset (#780)
    • Remove useless DatasetClient.import_all_files interface (#790)
    • Correct the illegal segment names in all the open dataset loaders (#794)
    • Move customized click classes from cli/cli.py to cli/custom.py (#797)
    • Stop showing the description in Commit.__repr__ (#800)
    • Exclude useless docs and tests modules from the tensorbay package (#826)

    Performance:

    • Speed up file uploads in GAS.upload_dataset by adapting OpenAPI multiCallback (#788)
    • Enhance the performance of RemoteData.open by batch requesting file urls (#809)

    Documentation:

    • Add Synopsis section to CLI --help message (#746)
    • Refine the CLI docs (#743, #745)

    Deprecations:

    • Deprecate the --title option in CLI gas draft (#735)
    • Remove the deprecated CLI gas create and gas delete (#786)
    • Remove the deprecated exception CommitStatusError (#787)
  • v1.7.0(Jun 29, 2021)

    New Features:

    • Support copying and moving data and segment (#748, #753, #777)

      • Add DatasetClient.move_segment and FusionDatasetClient.move_segment (#753)
      • Add DatasetClient.copy_segment and FusionDatasetClient.copy_segment (#757)
      • Add SegmentClient.copy_data (#762)
      • Add SegmentClient.move_data (#766)
    • Support importing data from auth cloud storage (#747, #768, #769, #770)

      • Add AuthData.open and CloudClient.list_auth_data (#763, #773)
      • Add SegmentClient.import_auth_data (#760)
      • Support dataset with AuthData in GAS.upload_dataset (#774)
    • Add special error messages to all attrs in Label and Catalog when the attr does not exist (#731)

    • Add gas auth command to authenticate, list and unset TensorBay account (#672, #681, #684, #693)

    • Support listing and unsetting config in gas config command (#695, #697)

    • The CLI config file format is updated to support multiple profiles better (#638)

    • Add Vector.__abs__ to get the 2-norm of a vector (#674)

    • Add the following basic arithmetic methods for Vector: (#673)

      • Vector.__sub__
      • Vector.__rsub__
      • Vector.__mul__
      • Vector.__rmul__
      • Vector.__truediv__
      • Vector.__floordiv__
    • Add __contains__ method for the following Sequence subclasses (#707, #708):

      • UserSequence
      • NameList
      • SortedNameList
      • DatasetBase
    • Add __eq__ method for UserSequence and UserMapping (#719, #726)

    • Add __reversed__ method for UserSequence (#727)

    Bug fixes:

    • Fix the issue that .png files are all missing in RP2K open dataset loader (#714)

    Interface adjustment:

    • Use NameSortedList in Sensors to replace NameSortedDict (#704)
    • Rename NamedList to NameList and NameSortedList to SortedNameList (#706)

    Improvements:

    • Move lazy evaluation related code to client/lazy.py (#688)
    • Use bisect in NameSortedList to replace SortedDict (#703)
    • Remove sortedcontainers from dependencies (#705)
    • Group version control methods from DatasetClientBase into VersionControlClient (#750)

    Performance:

    • Enhance the performance of the following methods: (#675)
      • Polyline2D.uniform_frechet_distance
      • Polyline2D.similarity
    • Override PagingList mixin methods inherited from MutableSequence to enhance its performance (#686)

    Documentation:

    • Add docs about copying and moving data and segment (#761, #765)
    • Add docs about importing data from auth cloud storage (#756)
    • Add docs about gas auth and gas config (#696, #702)
    • Fix typos in docs (#679)
    • Refine the docs (#700, #736, #725, #754)

    Deprecations:

    • The setting access key feature in CLI gas config is deprecated, please use gas auth (#689)
    • Remove the deprecated method get_segment_by_name (#698)
  • v1.6.0(Jun 29, 2021)

    IMPORTANT: The TensorBay system underwent a huge refactoring, which broke the backward compatibility of the OpenAPI and the SDK. As a result, SDK versions below v1.6.0 no longer work.
    Please update the tensorbay SDK to v1.6.0 or a higher version.


    New Features:

    • Add the following methods to support multiple branches (#644, #657):

      • DatasetClientBase.create_branch (#613)
      • DatasetClientBase.delete_branch (#616)
    • Support creating draft on different branches (#652, #649, #659, #733)

    • Support uploading dataset on different branches in GAS.upload_dataset (#694)

    • Add open dataset loader BDD100K for its CLASSIFICATION and BOX2D label (#631)

    • Add the following CLI commands:

      • gas branch to create, list and delete branches (#587, #640, #663)
      • gas tag to create, list and delete tags (#600, #611, #622)
      • gas log to show commit history (#614)
    • Add popup editor for the following CLI commands to edit title and description (#662):

      • gas draft (#602)
      • gas commit (#625)
    • Support setting description in gas commit command (#730)

    Interface adjustment:

    • Change the default segment name from "" to "default" (#620, #686)
    • Set argument title of DatasetClientBase.create_draft required (#668)
    • Use dynamic attr and public attrs in CameraIntrinsics (#606)
    • Implement NamedList to replace NameOrderedDict (#621)
    • Change the type of Notes.bin_point_cloud_fields to list (#660)
    • Disable method SegmentClientBase.delete_data temporarily (#737)

    Improvements:

    • Rebuild the data uploading procedure to adapt to the new TensorBay backend (#671)
    • Use title and description instead of message in commit (#713, #718, #724)
    • Handle the branch without commit history (#618, #701, #709, #734, #596)
    • Fix type hint errors reported by mypy (#590)
    • No longer dumps "skew" of zero value in CameraMatrix (#609)
    • Add error() function for CLI to echo error message then exit (#642, #647, #658)
    • Delete useless need_team_dataset argument in GAS._list_datasets() (#682)
    • Add DefaultValueDeprecated to deprecate arg default value (#639, #654)
    • Refine AttrsMixin framework (#586, #591, #595, #598, #603, #607, #629, #633, #636)
    • Apply AttrsMixin in the following modules and classes:
      • NameMixin class (#601)
      • sensor module (#617)
      • label module (#635)
      • dataset module (#570)
      • client.struct module (#692, #716)

    Documentation:

    • Add docs about multiple branches (#650, #670, #720)
    • Add docs about gas tag, gas branch and gas log (#624, #630, #667)
    • Update docstrings about version control (#721)
    • Refine the docs (#661)
  • v1.5.3(Jun 24, 2021)

  • v1.5.2(Jun 8, 2021)

  • v1.5.1(Jun 3, 2021)

  • v1.5.0(May 31, 2021)

    New Features:

    • Redesign and implement the gas CLI based on TensorBay version control (#575):

      • Support draft number and revision in TBRN (#525)
      • Add gas dataset command to create, list and delete dataset (#523, #526, #528, #530, #534, #571)
      • Support gas ls command to list segments and data in specific draft or revision (#541, #618)
      • Add gas draft command to create and list drafts (#546, #547, #548)
      • Add gas commit command to commit draft (#549)
      • Add gas cp command to upload files to draft (#551)
      • Add gas rm command to remove segment and data in draft (#555, #582)
    • Add detailed resuming message when uploading process got interrupted (#516, #536)

    • Add ModuleImportError to print detailed install instruction when the optional requirement package is not installed (#544)

    • Implement DatasetBase.__delitem__ to support deleting a segment from a dataset with del (#542)

    • Add open dataset loader LISA Traffic Sign (#518)

    Bug fixes:

    • Fix the ValueError when passing the last item to UserSequence.index (#627, Fix #610)

    Improvements:

    • Remove the redundant class variable description in KeypointsInfo (#506)
    • Implement python attr framework AttrsMixin (#447, #559, #560, #565, #569, #566, #580)
    • Move TBRN from utility module to cli module (#540)
    • Inherit EqMixin to replace __eq__ in the following intrinsic classes (#539):
      • CameraMatrix
      • DistortionCoefficients
      • CameraIntrinsics

    Documentation:

    • Refactor CLI related docs (#520, #527, #533, #556, #558, #561, #578, #608, #628, #632)
    • Add Integrations chapter for PyTorch and TensorFlow integration (#550, #553, #588, #589, #594)
    • Refine the docs (#501, #508, #514, #517, #513, #509, #519, #537)
    • Fix the typo in docs (#531)

    Deprecations:

    • Deprecate CLI gas create and gas delete (#568)
    • Deprecate legacy fusion dataset TBRN (#581)
    • Remove the following deprecated interfaces (#511):
      • Following exceptions:
        • GASDatasetError
        • GASDatasetTypeError
        • GASException
        • GASPathError
        • GASResponseError
        • GASSegmentError
      • The start and stop keyword arguments in the following methods:
        • GAS.list_dataset_names
        • DatasetClientBase.list_drafts
        • DatasetClientBase.list_commits
        • DatasetClientBase.list_tags
        • DatasetClientBase.list_branches
        • DatasetClientBase.list_segment_names
        • SegmentClient.list_data_paths
        • SegmentClient.list_data
        • FusionSegmentClient.list_frames
      • Following methods:
        • DatasetClientBase.list_draft_titles_and_numbers
  • v1.4.1(May 19, 2021)

    New Features:

    • The local visualization plugin pharos has been released; it can be installed with pip3 install pharos:
      • Pypi: https://pypi.org/project/pharos/
      • Docs: https://tensorbay-python-sdk.graviti.com/en/v1.4.1/features/visualization.html

    Bug fixes:

    • Fix the JSONDecodeError occurred when uploading dataset (#563, Fix #562)
    • Fix the AttributeError when using CADC dataloader in python3.6 (#576, Fix #574)
    • Fix the AttributeError when resuming upload of fusion dataset (#583, Fix #579)

    Documentation:

    • Add docs for local visualization plugin pharos (#510, #515, #577, #573, #585)
  • v1.4.0(May 17, 2021)

    New Features:

    • Make PagingList mutable, and follow MutableSequence protocol (#462, #472, #475)

    • Support reading remote data lazily in Dataset and Segment:

      • Lazy evaluation in Segment (#476, #484)
      • Lazy evaluation in Dataset (#477, #482, #483, #492, #496, #497, #500, #545)
    • Support getting segment by name in DatasetBase.__getitem__ (#498)

      • Use segment = dataset["test"] to get segment by name instead of segment = dataset.get_segment_by_name("test").
      • Use segment_names = dataset.keys() to get all segment names in a dataset.
    • Add the following methods to convert between category and index for writing training code easier (#468)

      • CategoriesMixin.get_category_to_index
      • CategoriesMixin.get_index_to_category
    • Add the following exceptions as subclasses of ResponseError (#437, #458)

      • AccessDeniedError
      • InvalidParamsError
      • NameConflictError
      • RequestParamsMissingError
      • ResourceNotExistError
      • ResponseSystemError
      • UnauthorizedError
    • Support skip_uploaded_files flag in GAS.upload_dataset for fusion dataset (#494)

    • Add open dataset loader COVID-chestxray and nuScenes (#459, #481)

    Improvements:

    • Refactor CLI related code to a new module (#479)
    • Stop checking the commit_id in DatasetClientBase.__init__ to avoid sending redundant request (#485)
    • Fix the possibly unbound variable warning in CompCars (#490)

    Documentation:

    • Add Update Dataset, Update Label and Update Data chapter (#465, #457, #495)
    • Add docs for specific response exceptions (#478)
    • Add continuity and tracking in glossary (#493)
    • Update docs for reading segments from lazy evaluation Dataset rather than DatasetClient (#486)
    • Refine the docs (#448, #451, #426)
    • Refine the example in docs (#440, #444, #430, #441, #443, #450, #453)

    Deprecations:

    • Deprecate DatasetBase.get_segment_by_name (#498)
Owner
Graviti
AI empowers everyone and anything