A Python library for Deep Graph Networks


PyDGN

Wiki

Description

This is a Python library that makes it easy to experiment with Deep Graph Networks (DGNs). It provides automatic management of data splitting, loading, and the most common experimental settings. It also handles both model selection and risk assessment procedures by trying many different configurations in parallel (on CPU or GPU). This repository is built upon the PyTorch Geometric library, which provides support for data management.

If you happen to use or modify this code, please remember to cite our tutorial paper:

Bacciu Davide, Errica Federico, Micheli Alessio, Podda Marco: A Gentle Introduction to Deep Learning for Graphs, Neural Networks, 2020. DOI: 10.1016/j.neunet.2020.06.006.

If you are interested in a rigorous evaluation of Deep Graph Networks, check this out:

Errica Federico, Podda Marco, Bacciu Davide, Micheli Alessio: A Fair Comparison of Graph Neural Networks for Graph Classification. Proceedings of the 8th International Conference on Learning Representations (ICLR 2020). Code

Installation:

(We assume git and Miniconda/Anaconda are installed)

First, make sure the gcc 5.2.0 runtime libraries are installed: conda install -c anaconda libgcc=5.2.0. Then, check that echo $LD_LIBRARY_PATH always contains :/home/[your user name]/miniconda3/lib. Finally, run the following command from your terminal:

source setup/install.sh [<your_cuda_version>]
pip install pydgn

Here, <your_cuda_version> is an optional argument that can be cpu, cu102, or cu111 for PyTorch >= 1.8.0. If you do not provide a CUDA version, the script defaults to cpu. The script creates a virtual environment named pydgn with all the packages required to run our code. Important: do NOT run this command using bash instead of source!
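
For example, assuming your system has CUDA 10.2 drivers, you would run:

source setup/install.sh cu102

whereas for a CPU-only environment the argument can simply be omitted:

source setup/install.sh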

Remember that PyTorch macOS binaries do not support CUDA; install PyTorch from source if CUDA is needed.

Usage:

Preprocess your dataset (see also Wiki)

python build_dataset.py --config-file [your data config file]

Example

python build_dataset.py --config-file DATA_CONFIGS/config_PROTEINS.yml 

Launch an experiment in debug mode (see also Wiki)

python launch_experiment.py --config-file [your exp. config file] --splits-folder [the splits MAIN folder] --data-splits [the splits file] --data-root [root folder of your data] --dataset-name [name of the dataset] --dataset-class [class that handles the dataset] --max-cpus [max cpu parallelism] --max-gpus [max gpu parallelism] --gpus-per-task [how many gpus to allocate for each job] --final-training-runs [how many final runs when evaluating on test. Results are averaged] --result-folder [folder where to store results]

Example (GPU required)

python launch_experiment.py --config-file MODEL_CONFIGS/config_SupToyDGN_RandomSearch.yml --splits-folder DATA_SPLITS/CHEMICAL/ --data-splits DATA_SPLITS/CHEMICAL/PROTEINS/PROTEINS_outer10_inner1.splits --data-root DATA --dataset-name PROTEINS --dataset-class pydgn.data.dataset.TUDatasetInterface --max-cpus 1 --max-gpus 1 --final-training-runs 1 --result-folder RESULTS/DEBUG

To debug your code it is useful to add --debug to the command above. Notice, however, that the CLI will not work as expected here, as code will be executed sequentially. After debugging, if you need sequential execution, you can use --max-cpus 1 --max-gpus 1 --gpus-per-task [0/1] without the --debug option.
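
For instance, the PROTEINS example above can be run sequentially in debug mode by appending --debug to the same command:

python launch_experiment.py --config-file MODEL_CONFIGS/config_SupToyDGN_RandomSearch.yml --splits-folder DATA_SPLITS/CHEMICAL/ --data-splits DATA_SPLITS/CHEMICAL/PROTEINS/PROTEINS_outer10_inner1.splits --data-root DATA --dataset-name PROTEINS --dataset-class pydgn.data.dataset.TUDatasetInterface --max-cpus 1 --max-gpus 1 --final-training-runs 1 --result-folder RESULTS/DEBUG --debug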

Grid Search 101

Have a look at one of the config files.

Random Search 101

Specify num_samples in the config file with the number of random trials, replace grid with random, and specify a sampling method for each hyper-parameter. We provide the following sampling methods (a generic sketch of how they map to random draws follows the list):

  • choice --> pick at random from a list of arguments
  • uniform --> pick uniformly from min and max arguments
  • normal --> sample from normal distribution with mean and std
  • randint --> pick at random from min and max
  • loguniform --> pick following the reciprocal distribution from log_min, log_max, with a specified base
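
For intuition, here is a minimal, generic sketch of how these sampling methods map onto random draws. This is illustrative only: it is not PyDGN's actual implementation, and the sample function below is hypothetical.

import random

def sample(method, **kwargs):
    # Illustrative mapping from a sampling-method name to a random draw.
    if method == "choice":      # pick at random from a list of arguments
        return random.choice(kwargs["args"])
    if method == "uniform":     # pick uniformly between min and max
        return random.uniform(kwargs["min"], kwargs["max"])
    if method == "normal":      # sample from a normal distribution with mean and std
        return random.gauss(kwargs["mean"], kwargs["std"])
    if method == "randint":     # pick an integer at random between min and max
        return random.randint(kwargs["min"], kwargs["max"])
    if method == "loguniform":  # reciprocal distribution between base**log_min and base**log_max
        exponent = random.uniform(kwargs["log_min"], kwargs["log_max"])
        return kwargs.get("base", 10) ** exponent
    raise ValueError(f"Unknown sampling method: {method}")

# For example, draw a learning rate between 1e-5 and 1e-1 on a log scale:
learning_rate = sample("loguniform", log_min=-5, log_max=-1, base=10)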

There is one config file, namely config_SupToyDGN_RandomSearch.yml, which you can check to see an example.

Data Splits

We provide the data splits taken from

Errica Federico, Podda Marco, Bacciu Davide, Micheli Alessio: A Fair Comparison of Graph Neural Networks for Graph Classification. Proceedings of the 8th International Conference on Learning Representations (ICLR 2020). Code

in the DATA_SPLITS folder.

Credits:

This is a joint project with Marco Podda (Github /Homepage), whom I thank for his relentless dedication.

Many thanks to Antonio Carta (Github/Homepage) for incorporating the Ray library (see v0.4.0) into PyDGN! This will be of tremendous help.

Many thanks to Danilo Numeroso (Github /Homepage) for implementing a very flexible random search! This is a very convenient alternative to grid search.

Contributing

This research software is provided as-is. We are working on this library in our spare time.

If you find a bug, please open an issue to report it, and we will do our best to solve it. For generic/technical questions, please email us rather than opening an issue.

License:

PyDGN is GPL 3.0 licensed, as written in the LICENSE file.

Troubleshooting

As of August 15th, 2021, there is an issue with PyTorch 1.9.0 that impacts the CLI. This is why the setup script installs PyTorch 1.8.1 in the pydgn conda environment; this will remain the case until PyTorch 1.10, which is known to solve the issue, is released.

--

If you get errors like /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found:

  • make sure the gcc 5.2.0 runtime libraries are installed: conda install -c anaconda libgcc=5.2.0
  • echo $LD_LIBRARY_PATH should contain :/home/[your user name]/[your anaconda or miniconda folder name]/lib
  • after checking the above points, you can reinstall everything with pip using the --no-cache-dir option
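
As a concrete illustration of the checks above (paths are examples; adapt them to your machine):

conda install -c anaconda libgcc=5.2.0
echo $LD_LIBRARY_PATH                                    # should contain .../miniconda3/lib
strings ~/miniconda3/lib/libstdc++.so.6 | grep GLIBCXX   # the required GLIBCXX version should appear
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HOME/miniconda3/lib"
pip install --no-cache-dir --force-reinstall pydgn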
Comments
  • Keep getting raylet error


    🔨 Describe the bug

    Hi, I keep getting a raylet error when trying to run the example. Is there a way to stop using Ray, since I am running the experiment on my local computer?

    Thank you!

    (raylet) /home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/autoscaler/_private/cli_logger.py:57: FutureWarning: Not all Ray CLI dependencies were found. In Ray 1.4+, the Ray CLI, autoscaler, and dashboard will only be usable via pip install 'ray[default]'. Please update your install command.
    (raylet) warnings.warn(
    2022-10-14 13:08:12,974 WARNING worker.py:1189 -- The agent on node EW22-05284 failed with the following error:
    Traceback (most recent call last):
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/new_dashboard/agent.py", line 354, in
        loop.run_until_complete(agent.run())
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
        return future.result()
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/new_dashboard/agent.py", line 144, in run
        modules = self._load_modules()
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/new_dashboard/agent.py", line 98, in _load_modules
        c = cls(self)
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/new_dashboard/modules/reporter/reporter_agent.py", line 148, in __init__
        self._metrics_agent = MetricsAgent(dashboard_agent.metrics_export_port)
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/_private/metrics_agent.py", line 75, in __init__
        prometheus_exporter.new_stats_exporter(
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/_private/prometheus_exporter.py", line 333, in new_stats_exporter
        exporter = PrometheusStatsExporter(
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/_private/prometheus_exporter.py", line 266, in __init__
        self.serve_http()
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/ray/_private/prometheus_exporter.py", line 320, in serve_http
        start_http_server(
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/prometheus_client/exposition.py", line 168, in start_wsgi_server
        TmpServer.address_family, addr = _get_best_family(addr, port)
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/site-packages/prometheus_client/exposition.py", line 157, in _get_best_family
        infos = socket.getaddrinfo(address, port)
      File "/home/jwtxwd/anaconda3/envs/pydgn/lib/python3.8/socket.py", line 918, in getaddrinfo
        for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    socket.gaierror: [Errno -2] Name or service not known

    (The raylet process then prints the same FutureWarning and traceback, and the agent warning is repeated at 13:08:14.)

    bug 
    opened by jwtxwd 4
  • Training engine and returned data list


    Feature description

    When shuffling the dataset, the training engine will return the data list shuffled according to the permutation of the data sampler. We should make sure that we reorder the data list.

    Ideas on how to do it

    No response

    Additional info

    No response

    opened by diningphil 2
  • [WIP] feat(plotter): Add W&B Logging


    This PR proposes to add Weights & Biases logging to the library using helpful TensorBoard-based utilities. Instead of a separate logger, I currently propose that we simply monkeypatch and upload the TensorBoard logs to W&B.

    Leaving it as a Draft for now to start a conversation.

    CC: @diningphil @gravins

    opened by SauravMaheshkar 2
  • Fix Ray not deallocating GPU memory


    🔨 Describe the bug

    Some IDLE workers do not release the memory. This is a problem as described (together with potential fix) here: https://docs.ray.io/en/latest/ray-core/tasks/using-ray-with-gpus.html#workers-not-releasing-gpu-resources

    bug 
    opened by diningphil 1
  • Add option to specify subset of GPUs


    Feature description

    In case one wants to force specific GPU IDs to be used, it should be possible to do so when running pydgn-train.

    Ideas on how to do it

    No response

    Additional info

    No response

    opened by diningphil 1
  • Bump aiohttp from 3.7 to 3.7.4


    Bumps aiohttp from 3.7 to 3.7.4.

    Release notes

    Sourced from aiohttp's releases.

    aiohttp 3.7.3 release

    Features

    • Use Brotli instead of brotlipy [#3803](https://github.com/aio-libs/aiohttp/issues/3803)
    • Made exceptions pickleable. Also changed the repr of some exceptions. [#4077](https://github.com/aio-libs/aiohttp/issues/4077)

    Bugfixes

    • Raise a ClientResponseError instead of an AssertionError for a blank HTTP Reason Phrase. [#3532](https://github.com/aio-libs/aiohttp/issues/3532)
    • Fix web_middlewares.normalize_path_middleware behavior for patch without slash. [#3669](https://github.com/aio-libs/aiohttp/issues/3669)
    • Fix overshadowing of overlapped sub-applications prefixes. [#3701](https://github.com/aio-libs/aiohttp/issues/3701)
    • Make BaseConnector.close() a coroutine and wait until the client closes all connections. Drop deprecated "with Connector():" syntax. [#3736](https://github.com/aio-libs/aiohttp/issues/3736)
    • Reset the sock_read timeout each time data is received for a aiohttp.client response. [#3808](https://github.com/aio-libs/aiohttp/issues/3808)
    • Fixed type annotation for add_view method of UrlDispatcher to accept any subclass of View [#3880](https://github.com/aio-libs/aiohttp/issues/3880)
    • Fixed querying the address families from DNS that the current host supports. [#5156](https://github.com/aio-libs/aiohttp/issues/5156)
    • Change return type of MultipartReader.__aiter__() and BodyPartReader.__aiter__() to AsyncIterator. [#5163](https://github.com/aio-libs/aiohttp/issues/5163)
    • Provide x86 Windows wheels. [#5230](https://github.com/aio-libs/aiohttp/issues/5230)

    Improved Documentation

    • Add documentation for aiohttp.web.FileResponse. [#3958](https://github.com/aio-libs/aiohttp/issues/3958)
    • Removed deprecation warning in tracing example docs [#3964](https://github.com/aio-libs/aiohttp/issues/3964)
    • Fixed wrong "Usage" docstring of aiohttp.client.request. [#4603](https://github.com/aio-libs/aiohttp/issues/4603)
    • Add aiohttp-pydantic to third party libraries [#5228](https://github.com/aio-libs/aiohttp/issues/5228)

    Misc

    ... (truncated)

    Changelog

    Sourced from aiohttp's changelog.

    3.7.4 (2021-02-25)

    Bugfixes

    • (SECURITY BUG) Started preventing open redirects in the aiohttp.web.normalize_path_middleware middleware. For more details, see https://github.com/aio-libs/aiohttp/security/advisories/GHSA-v6wp-4m6f-gcjg.

      Thanks to Beast Glatisant (https://github.com/g147) for finding the first instance of this issue and Jelmer Vernooij (https://jelmer.uk/) for reporting and tracking it down in aiohttp. [#5497](https://github.com/aio-libs/aiohttp/issues/5497)

    • Fix interpretation difference of the pure-Python and the Cython-based HTTP parsers construct a yarl.URL object for HTTP request-target.

      Before this fix, the Python parser would turn the URI's absolute-path for //some-path into / while the Cython code preserved it as //some-path. Now, both do the latter. [#5498](https://github.com/aio-libs/aiohttp/issues/5498)


    3.7.3 (2020-11-18)

    Features

    • Use Brotli instead of brotlipy [#3803](https://github.com/aio-libs/aiohttp/issues/3803)
    • Made exceptions pickleable. Also changed the repr of some exceptions. [#4077](https://github.com/aio-libs/aiohttp/issues/4077)

    Bugfixes

    • Raise a ClientResponseError instead of an AssertionError for a blank HTTP Reason Phrase. [#3532](https://github.com/aio-libs/aiohttp/issues/3532)
    • Fix web_middlewares.normalize_path_middleware behavior for patch without slash. [#3669](https://github.com/aio-libs/aiohttp/issues/3669)
    • Fix overshadowing of overlapped sub-applications prefixes. [#3701](https://github.com/aio-libs/aiohttp/issues/3701)

    ... (truncated)

    Commits
    • 0a26acc Bump aiohttp to v3.7.4 for a security release
    • 021c416 Merge branch 'ghsa-v6wp-4m6f-gcjg' into master
    • 4ed7c25 Bump chardet from 3.0.4 to 4.0.0 (#5333)
    • b61f0fd Fix how pure-Python HTTP parser interprets //
    • 5c1efbc Bump pre-commit from 2.9.2 to 2.9.3 (#5322)
    • 0075075 Bump pygments from 2.7.2 to 2.7.3 (#5318)
    • 5085173 Bump multidict from 5.0.2 to 5.1.0 (#5308)
    • 5d1a75e Bump pre-commit from 2.9.0 to 2.9.2 (#5290)
    • 6724d0e Bump pre-commit from 2.8.2 to 2.9.0 (#5273)
    • c688451 Removed duplicate timeout parameter in ClientSession reference docs. (#5262) ...
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 1
  • Bump aiohttp from 3.7 to 3.7.4 in /.github


    Bumps aiohttp from 3.7 to 3.7.4.


    dependencies 
    opened by dependabot[bot] 1
  • Weighted Additive Loss


    Feature description

    Improve AdditiveLoss by adding the possibility of weighting the different losses.

    Ideas on how to do it

    No response

    Additional info

    No response

    opened by diningphil 1
  • Telegram Bot support


    Feature description

    Add Telegram bot support to send messages whenever an experiment breaks unexpectedly or the entire set of experiments has finished (with the chance to have finer granularity).

    Ideas on how to do it

    Once you create a bot, it should be easy to send a message to a particular chat.

    Additional info

    No response

    opened by diningphil 1
  • Accumulate predictions for metrics that require global statistics


    Feature description

    Metrics like AP require that all the samples of the train/val/test dataset are taken into account when computing a score. We should add an option that, at the cost of greater memory usage, accumulates all predictions and target values until the end of an epoch and then computes the epoch score.

    Ideas on how to do it

    No response

    Additional info

    No response

    opened by diningphil 1
  • Metric improvement


    Feature description

    Metric should always use the result from the _handle_reduction function, even when accumulating the number of samples.

    This is because _handle_reduction may do something more complicated in some metrics that override the function.

    Be careful, however, to make sure that this works for both "mean" and "sum" reductions.

    Ideas on how to do it

    No response

    Additional info

    No response

    opened by diningphil 1
  • Support for Single-experiment/Multi-GPU


    Feature description

    Allow an experiment to run on multiple GPUs.

    Ideas on how to do it

    No response

    Additional info

    Remember to set the appropriate seed on all GPUs using torch.cuda.manual_seed_all

    opened by diningphil 1
Releases: v1.3.0.post2