🐦 Opytimizer is a Python library of meta-heuristic optimization techniques.

Overview

Opytimizer: A Nature-Inspired Python Optimizer


Welcome to Opytimizer.

Did you ever reach a bottleneck in your computational experiments? Are you tired of selecting suitable parameters for a chosen technique? If so, Opytimizer is the real deal! This package provides an easy-to-use implementation of meta-heuristic optimization techniques. From agents to search spaces, from internal functions to external communication, we aim to foster all research related to optimizing stuff.

Use Opytimizer if you need a library or wish to:

  • Create your optimization algorithm;
  • Design or use pre-loaded optimization tasks;
  • Mix-and-match different strategies to solve your problem;
  • Because it is fun to optimize things.

Read the docs at opytimizer.readthedocs.io.

Opytimizer is compatible with: Python 3.6+.


Package guidelines

  1. The very first information you need is in the very next section.
  2. Installation is straightforward; if you wish to read the code and dive in, follow along.
  3. Note that there might be some additional steps in order to use our solutions.
  4. If there is a problem, please do not hesitate to contact us.
  5. Finally, we focus on minimization. Keep that in mind when designing your problem.
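Since the library focuses on minimization (guideline 5), a maximization problem can be handled by negating the objective. A minimal, library-agnostic sketch (the function and candidate names below are hypothetical stand-ins, not part of Opytimizer's API):

```python
def maximize_via_minimize(objective, candidates):
    """Turn a maximization task into the minimization form the library expects."""
    # Minimizing the negated objective is equivalent to maximizing the original
    negated = lambda x: -objective(x)
    return min(candidates, key=negated)

# Example: maximize f(x) = -(x - 3)^2, whose maximum sits at x = 3
best = maximize_via_minimize(lambda x: -(x - 3) ** 2, [0, 1, 2, 3, 4, 5])
```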

Citation

If you use Opytimizer to fulfill any of your needs, please cite us:

@misc{rosa2019opytimizer,
    title={Opytimizer: A Nature-Inspired Python Optimizer},
    author={Gustavo H. de Rosa and Douglas Rodrigues and João P. Papa},
    year={2019},
    eprint={1912.13002},
    archivePrefix={arXiv},
    primaryClass={cs.NE}
}

Getting started: 60 seconds with Opytimizer

First of all, we have examples. Yes, they are commented. Just browse to examples/, choose your subpackage, and follow the example. We have high-level examples for most tasks we could think of and amazing integrations (Learnergy, NALP, OPFython, PyTorch, Scikit-Learn, TensorFlow).
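Before diving into examples/, here is a tiny, dependency-free sketch of what a meta-heuristic run boils down to: agents wander a bounded search space while the best fitness is tracked. This is a conceptual toy, not the library's API; the real thing wires the same roles through its space, optimizer, and function classes:

```python
import random

def sphere(x):
    # Classic benchmark function: minimum is 0 at the origin
    return sum(v * v for v in x)

def random_search(fn, lower, upper, n_agents=10, n_iterations=100, seed=0):
    """Toy random-search loop illustrating the agents/space/function roles."""
    rng = random.Random(seed)
    best_pos, best_fit = None, float("inf")
    for _ in range(n_iterations):
        for _ in range(n_agents):
            # Each agent samples a position inside the bounds of every variable
            pos = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
            fit = fn(pos)
            if fit < best_fit:
                best_pos, best_fit = pos, fit
    return best_pos, best_fit

pos, fit = random_search(sphere, lower=[-10, -10], upper=[10, 10])
```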

Alternatively, if you wish to learn even more, please take a minute:

Opytimizer is based on the following structure, and you should pay attention to its tree:

- opytimizer
    - core
        - agent
        - function
        - node
        - optimizer
        - space
    - functions
        - weighted
    - math
        - distribution
        - general
        - hyper
        - random
    - optimizers
        - boolean
        - evolutionary
        - misc
        - population
        - science
        - social
        - swarm
    - spaces
        - boolean
        - grid
        - hyper_complex
        - search
        - tree
    - utils
        - constants
        - decorator
        - exception
        - history
        - logging
    - visualization
        - convergence
        - surface

Core

Core is the core. Essentially, it is the parent of everything: here you will find the parent classes that define the basis of our structure, providing the variables and methods that help construct the other modules.

Functions

Instead of using raw and straightforward functions, why not try this module? Compose high-level abstract functions or even new function-based ideas to solve your problems. Note that, for now, we only support multi-objective function strategies.
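As a rough illustration of the weighted strategy this package exposes (the names below are hypothetical sketches, not the library's classes), several objectives can be collapsed into a single fitness via a weighted sum:

```python
def weighted_function(functions, weights):
    """Combine multiple objectives into one weighted fitness (conceptual sketch)."""
    def combined(x):
        # Each objective contributes proportionally to its weight
        return sum(w * f(x) for f, w in zip(functions, weights))
    return combined

# Two toy objectives blended 70/30 into a single scalar fitness
f = weighted_function([lambda x: x ** 2, lambda x: abs(x - 1)], weights=[0.7, 0.3])
value = f(1.0)  # 0.7 * 1.0 + 0.3 * 0.0 = 0.7
```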

Math

Just because we are computing stuff does not mean we do not need math. Math is the mathematical package, containing low-level math implementations. From random numbers to distribution generation, you can find what you need in this module.
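For instance, low-level helpers of this kind typically draw uniformly distributed samples inside per-variable bounds. A stdlib-only sketch (not the module's actual signatures):

```python
import random

def generate_uniform(lower, upper, size, seed=None):
    """Draw `size` samples per variable, each within [lower[i], upper[i]]."""
    rng = random.Random(seed)
    return [[rng.uniform(lo, hi) for _ in range(size)]
            for lo, hi in zip(lower, upper)]

# Two variables with different bounds, five samples each
samples = generate_uniform([0.0, -1.0], [1.0, 1.0], size=5, seed=42)
```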

Optimizers

This is why we are called Opytimizer. This is the heart of the heuristics, where you can find a large number of meta-heuristics and optimization techniques: anything that can be called an optimizer. Please take a look at the available optimizers.
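To give a flavor of what lives here, the velocity update at the heart of Particle Swarm Optimization (one member of the swarm family) can be sketched as follows; this is the textbook rule, not the library's internal code:

```python
import random

def pso_velocity(velocity, position, p_best, g_best,
                 w=0.7, c1=1.7, c2=1.7, rng=random):
    """One PSO velocity update: inertia + cognitive pull + social pull."""
    r1, r2 = rng.random(), rng.random()
    return [w * vel + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
            for vel, x, pb, gb in zip(velocity, position, p_best, g_best)]

# A particle at (1, 1) pulled toward its personal best and the global best
v = pso_velocity([0.0, 0.0], [1.0, 1.0], p_best=[0.5, 0.5], g_best=[0.0, 0.0])
```

Here both bests lie below the current position, so every velocity component points toward smaller values.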

Spaces

One can see a space as the place where agents update their positions and evaluate a fitness function. However, the newest approaches may consider a different type of space. With that in mind, we are glad to support diverse space implementations.
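For example, a grid-style space enumerates candidate positions with a fixed step per variable, while a search-style space samples freely within bounds. A hypothetical stdlib-only sketch of that distinction (not the library's classes):

```python
def grid_axis(lower, upper, step):
    """All values from lower to upper (inclusive, up to float tolerance) spaced by step."""
    values, v = [], lower
    while v <= upper + 1e-9:
        values.append(round(v, 10))  # round away float drift from repeated addition
        v += step
    return values

# Each variable may use its own step, e.g. an integer axis and a fine-grained one
axes = [grid_axis(0, 3, 1), grid_axis(0.0, 0.3, 0.1)]
```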

Utils

This is a utility package for common things shared across the application. It is better to implement something once and reuse it than to re-implement the same thing over and over again.

Visualization

Everyone needs images and plots to help visualize what is happening, correct? This package provides every visual-related method you may need. Check the convergence of a specific variable or of your fitness function, plot benchmark function surfaces, and much more!
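Conceptually, a convergence plot just tracks the best fitness seen so far at each iteration; the series below is what such a plot would consume. A stdlib-only sketch (the actual plotting lives in visualization.convergence):

```python
def best_so_far(fitness_per_iteration):
    """Running minimum of a fitness history, i.e. the typical convergence curve."""
    curve, best = [], float("inf")
    for fit in fitness_per_iteration:
        best = min(best, fit)
        curve.append(best)
    return curve

curve = best_so_far([5.0, 3.2, 4.1, 1.7, 2.0])  # → [5.0, 3.2, 3.2, 1.7, 1.7]
```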


Installation

We believe that everything has to be easy. Nothing tricky or daunting: Opytimizer aims to be the go-to package you need, from the very first installation to daily development tasks. Just run the following under your preferred Python environment (raw, conda, virtualenv, whatever):

pip install opytimizer

Alternatively, if you prefer to install the bleeding-edge version, please clone this repository and use:

pip install -e .

Environment configuration

Note that sometimes there is a need for additional setup. If so, the notes below cover the details.

Ubuntu

No specific additional commands needed.

Windows

No specific additional commands needed.

MacOS

No specific additional commands needed.


Support

We know that we do our best, but it is inevitable to acknowledge that we make mistakes. If you ever need to report a bug or a problem, or just want to talk to us, please do so! We will do our best to help at this repository or via [email protected].


Comments
  • [BUG] AttributeError: 'History' object has no attribute 'show'


Describe the bug

    It looks like there is no show() method for the returned opytimizer history.

    To Reproduce Steps to reproduce the behavior:

    1. Follow the steps from the wiki Tutorial: Your first optimization
    2. Run the optimizer with o.start(): o = Opytimizer(space=s, optimizer=p, function=f); history = o.start()
    3. Show the history: history.show()

    Expected behavior Not sure what I expected, just curious :)

    Screenshots

    2020-01-02 13:26:28,270 - opytimizer.optimizers.fa — INFO — Iteration 1000/1000
    2020-01-02 13:26:28,278 - opytimizer.optimizers.fa — INFO — Fitness: 4.077875713322641e-14
    2020-01-02 13:26:28,279 - opytimizer.optimizers.fa — INFO — Position: [[-1.42791381e-07]
     [-1.42791381e-07]]
    2020-01-02 13:26:28,279 - opytimizer.opytimizer — INFO — Optimization task ended.
    2020-01-02 13:26:28,279 - opytimizer.opytimizer — INFO — It took 7.672951936721802 seconds.
    >>> history.show()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    AttributeError: 'History' object has no attribute 'show'
    >>> history
    <opytimizer.utils.history.History object at 0x113b75be0>
    >>> history.show()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    AttributeError: 'History' object has no attribute 'show'
    

    Desktop (please complete the following information):

    • OS: macOS Mojave 10.14.6
    • Virtual Environment: conda base
    • Python Version:
    (base) justinmai$ python3
    Python 3.7.3 (default, Mar 27 2019, 16:54:48) 
    [Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
    


    bug 
    opened by justinTM 5
  • [REG] How to suppress DEBUG log message in opytimizer.core.space


    Hello, after initializing a SearchSpace there is a debug message printed to stdout. How can we turn it off/on? The message looks like: opytimizer.core.space — DEBUG — Agents: 25 |....

    I believe it is printed because of line #223 in opytimizer/core/space.py.

    For large dimensions it prints all the lower and upper bounds, which we may not always require.

    Thanks.

    general 
    opened by amir1m 4
  • [NEW] Dump optimization progress


    Is your feature request related to a problem? Please describe. If the server running the task fails, we lose the time already spent.

    Describe the solution you'd like Dump the optimization object from time to time.

    Describe alternatives you've considered Maybe dump the agents?

    enhancement 
    opened by gcoimbra 4
  • [NEW] Constrained optimization


    Hi, thanks for the work.

    Any plans to add functionality to define constraints for the optimization? For instance, inequalities or any arbitrary non-linear constraints on the inputs and/or on the outputs?

    enhancement 
    opened by ogencoglu 4
  • [NEW] Using population data for population-based algorithms?


    Hello, first of all, thank you for sharing such a fantastic repo that can be used as an off-the-shelf collection of meta-heuristic optimization algorithms!

    I have a question regarding how to use my own population data with the optimizers in Opytimizer. Rather than using a SearchSpace with predetermined upper/lower bounds, is there any way I can start the optimization from my own population samples?

    Thank you, hope you have a wonderful day!

    enhancement 
    opened by jimmykimmy68 3
  • [REG] What is the difference between grid space, and discrete space example?


    Greetings,

    I have a mixed search space problem of five dimensions, 4 discrete and one continuous. How do I implement a discrete search space with these dimensions, where the increment is not the same for each? I found that grid space offers me the flexibility I need, yet I noticed you used a different implementation in the discrete space example, so which one should I use?

    My search space:

    step = [1, 1, 1, 1, 0.1]
    lower_bound = [16, 3, 2, 0, 0]
    upper_bound = [400, 20, 20, 1, 0.33]
    

    Thanks in advance.

    general 
    opened by TamerAbdelmigid 3
  • [NEW] Different number of step for each variable


    Is your feature request related to a problem? Please describe. It's not.

    Describe the solution you'd like I'd like a different step size for each variable.

    Additional context Sometimes variables are less sensitive than others. Some are integers. Is there any way to use a different step for each one?

    enhancement 
    opened by gcoimbra 3
  • [REG] How to get a detailed print out during optimization?


    Greetings,

    My function takes time to evaluate, and I want to closely monitor what happens during the optimization process. When using a grid search space and printing the fitness from my function, I could see the movement along the grid. But when using a normal search space, my processor utilization is 100% and it takes hours with no printout. So, how do I get more details? Is there something like a degree of verbosity?

    Thanks in advance.

    general 
    opened by TamerAbdelmigid 2
  • [NEW] Define objective function for regression problem


    Hi there,

    I attempted to define an objective function (using a CatBoost model) to solve a minimization problem in a regression task, but failed to create the new objective function. So, does your package offer a solution for regression, and can we define such an objective function in this case?

    My desired objective function is something like this, to minimize the MSE:

    from catboost import CatBoostRegressor as cbr
    cbr_model = cbr()

    def objective_function(cbr_model, X_train3, y_train3, X_test3, y_test3):
        cbr_model.fit(X_train3, y_train3)
        mse = mean_squared_error(y_test3, cbr_model.predict(X_test3))
        return mse
    

    Many thanks, Thang

    enhancement 
    opened by hanamthang 2
  • [REG] How to plot convergence diagram?


    Hello, I was looking for a convergence diagram. I found an example of using the convergence function in opytimizer/examples/visualization/convergence_plotting.py. However, it uses a few constant values for agent positions. Is there a convergence example that shows how to use this function with an actual optimization problem, such as after carrying out PSO?

    Thanks,

    general 
    opened by amir1m 2
  • [REG] Is there a way to provide initial values before starting optimization?


    Hello, hope you are keeping well.

    I am looking to provide an initial value to the optimization loop. Is there a way, through the SearchSpace or otherwise, to provide an initial/starting solution?

    Thanks.

    general 
    opened by amir1m 2
Releases(v3.1.2)
  • v3.1.2(Sep 9, 2022)

    Changelog

    Description

    Welcome to v3.1.2 release.

    In this release, we added variable name mapping to search spaces.

    Includes (or changes)

    • core.search_space
    Source code(tar.gz)
    Source code(zip)
  • v3.1.1(May 4, 2022)

    Changelog

    Description

    Welcome to v3.1.1 release.

    In this release, we added pre-commit hooks and annotated typing.

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v3.1.0(Jan 7, 2022)

    Changelog

    Description

    Welcome to v3.1.0 release.

    In this release, we implemented the initial parts for holding graph-related searches, which will support Neural Architecture Search (NAS) in the future.

    Additionally, we have added the first algorithm for calculating the Pareto frontier of pre-defined points (Non-Dominated Sorting).

    Includes (or changes)

    • core
    • optimizers
    • spaces
    Source code(tar.gz)
    Source code(zip)
  • v3.0.2(Jun 28, 2021)

    Changelog

    Description

    Welcome to v3.0.2 release.

    In this release, we implemented the remaining meta-heuristics that were on hold. Please note that they are supposed to be 100% working, yet we still need to experimentally evaluate their performance.

    Additionally, an important hot-fix has been applied to correct the calculation of the Euclidean distance.

    Includes (or changes)

    • optimizers
    • math.general
    Source code(tar.gz)
    Source code(zip)
  • v3.0.1(May 31, 2021)

    Changelog

    Description

    Welcome to v3.0.1 release.

    In this release, we have added a bunch of meta-heuristics that were supposed to be implemented earlier.

    Includes (or changes)

    • optimizers
    Source code(tar.gz)
    Source code(zip)
  • v3.0.0(May 12, 2021)

    Changelog

    Description

    Welcome to v3.0.0 release.

    In this release, we have revamped the whole library, rewriting base packages, such as agent, function, node and space, as well as implementing new features, such as callbacks and a better Opytimizer bundler.

    Additionally, we have rewritten every optimizer and their tests, removing more than 2.5k lines that were tagged as "repeatable". Furthermore, we have removed excessive commentaries to provide a cleaner reading and have rewritten every example to include the newest features.

    Please take a while to check our most important advancements and read the docs at: opytimizer.readthedocs.io

    Note that this is a major release and we expect everyone to update their corresponding packages, as this update will not work with v2.x.x versions.

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v2.1.4(Apr 28, 2021)

    Changelog

    Description

    Welcome to v2.1.4 release.

    In this release, we have added the following optimizers: AOA and AO. Additionally, we have implemented a step size for each variable in the grid, as pointed by @gcoimbra.

    Please read the docs at: opytimizer.readthedocs.io

    Note that this is the latest update of v2 branch. The following update will feature a major rework on Opytimizer classes and will not be retroactive with past versions.

    Includes (or changes)

    • optimizers.misc.aoa
    • optimizers.population.ao
    • optimizers.spaces.grid
    Source code(tar.gz)
    Source code(zip)
  • v2.1.3(Mar 10, 2021)

    Changelog

    Description

    Welcome to v2.1.3 release.

    In this release, we have added the following optimizers: COA, JS and NBJS. Additionally, we have fixed the roulette selection for minimization problems, as pointed by @Martsks.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.evolutionary.ga
    • optimizers.population.coa
    • optimizers.swarm.js
    Source code(tar.gz)
    Source code(zip)
  • v2.1.2(Dec 3, 2020)

    Changelog

    Description

    Welcome to v2.1.2 release.

    In this release, we have added a set of new optimizers, as follows: ABO, ASO, BOA, BSA, CSA, DOA, EHO, EPO, GWO, HGSO, HHO, MFO, PIO, QSA, SFO, SOS, SSA, SSO, TWO, WOA and WWO.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers
    Source code(tar.gz)
    Source code(zip)
  • v2.1.1(Nov 25, 2020)

    Changelog

    Description

    Welcome to v2.1.1 release.

    In this release, we have added some HS-based optimizers and fixed an issue regarding the store_best_only flag.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers
    • utils.history
    Source code(tar.gz)
    Source code(zip)
  • v2.1.0(Jul 29, 2020)

    Changelog

    Description

    Welcome to v2.1.0 release.

    In this release, we have fixed the UMDA nomenclature, corrected some optimizer docstrings, and added a soft-penalization to constrained optimization, which we believe will help users design more appropriate constrained objectives. Furthermore, we added a deterministic trait to the optimizers' unit tests so they are able to converge.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • core.function
    • optimizers
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v2.0.2(Jul 10, 2020)

    Changelog

    Description

    Welcome to v2.0.2 release.

    In this release, we have added BMRFO, SAVPSO, UDMA and VPSO optimizers.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.boolean.bmrfo
    • optimizers.boolean.udma
    • optimizers.swarm.pso
    Source code(tar.gz)
    Source code(zip)
  • v2.0.1(Jun 26, 2020)

  • v2.0.0(May 7, 2020)

    Changelog

    Description

    Welcome to v2.0.0 release.

    In this release, we have reworked some inner structures, added the usability of decorators and revamped some optimizers. Additionally, we added a bunch of new optimizers and their tests.

    Note that this is a major release, therefore, we suggest to update the library as soon as possible, as it will not be compatible with older versions.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v1.1.3(Mar 31, 2020)

    Changelog

    Description

    Welcome to v1.1.3 release.

    In this release, we have improved code readability and merged some child optimizers into their parents' modules.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v1.1.2(Feb 20, 2020)

    Changelog

    Description

    Welcome to v1.1.2 release.

    In this release, we have fixed some nasty bugs that may cause some particular issues. Additionally, we added a method for handling constrained optimization.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • core.function
    Source code(tar.gz)
    Source code(zip)
  • v1.1.1(Jan 15, 2020)

    Changelog

    Description

    Welcome to v1.1.1 release.

    In this release, we have fixed some nasty bugs that may cause some particular issues. Additionally, we added several new optimizers to the library and a convergence module (belongs to the visualization package).

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.gsa
    • optimizers.hc
    • optimizers.rpso
    • optimizers.sa
    • optimizers.sca
    • visualization.convergence
    Source code(tar.gz)
    Source code(zip)
  • v1.1.0(Oct 3, 2019)

    Changelog

    Description

    Welcome to v1.1.0 release.

    This is a minor release that will probably break retro-compatibility. Some base structures were changed in order to enhance the library's performance.

    Essentially, new structures were created to provide the building tools for tree-based evolutionary algorithms, such as Genetic Programming.

    Additionally, we added a constants module and an exception module to the utilities package. This will help guide users when they input wrong information.

    MultiFunction was renamed to WeightedFunction, which is more suitable according to its usage.

    History has been reworked as well, making it possible to dynamically create properties through the dump() method.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • examples/applications
    • core.node
    • function.weighted
    • math.general
    • optimizers.gp
    • opytimizer
    • spaces.tree
    • tests
    • utils.constants
    • utils.exceptions
    • utils.history
    Source code(tar.gz)
    Source code(zip)
  • v1.0.7(Sep 12, 2019)

    Changelog

    Description

    Welcome to v1.0.7 release. We added Opytimizer to the pip repository, fixed the optimization task timing, reworked some tests, added new integrations (Recogners library), changed our license for further publication, and fixed an issue regarding agents being out of bounds.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • examples/integrations/recogners
    • opytimizer
    • optimizers
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v1.0.6(Jun 21, 2019)

    Changelog

    Description

    Welcome to v1.0.6 release. We added new interesting things, such as a new optimizer, more benchmarking functions, bug fixes (mostly agents being out of bounds), a reworked history object and some adjusted tests.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes

    • math.benchmark
    • optimizers.wca
    • utils.history
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v1.0.5(Apr 18, 2019)

    Changelog

    Description

    Welcome to v1.0.5 release. We added new interesting things, such as new optimizers, some reworked tests for better use cases, a multi-objective strategy for handling problems with more than one objective function, and much more!

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes

    • examples.integrations.sklearn
    • functions.multi
    • optimizers.abc
    • optimizers.bha
    • optimizers.hs
    • tests
    • optimizers.ihs
    Source code(tar.gz)
    Source code(zip)
  • v1.0.4(Mar 22, 2019)

  • v1.0.3(Mar 22, 2019)

    Changelog

    Description

    Welcome to v1.0.3 release. We have added methods for supporting new optimizers. Additionally, we have fixed some previous implementations and improved their convergence. Everything should be appropriate now.

    Some examples integrating PyTorch with Opytimizer were created as well. Ranging from linear regression to long short-term memory networks, we hope to continue improving our library to serve you well.

    Again, every test is implemented, achieving 100% coverage. Please refer to the wiki in order to run them.

    Please stay tuned for our next updates and our newest integrations (Sklearn and Tensorflow)!

    Includes

    • optimizers.aiwpso
    • optimizers.ba
    • optimizers.cs
    • optimizers.fa
    • examples.integrations.pytorch
    Source code(tar.gz)
    Source code(zip)
  • v1.0.2(Mar 7, 2019)

    Changelog

    Description

    Welcome to v1.0.2 release. We have added methods for supporting hypercomplex representations. From math modules to new spaces, we do support any hypercomplex approach, ranging from complexes to octonions.

    A History class has been added as well. It will serve as the holder of vital information from the optimization task. In the future, we will support visualization and plotting.

    Also, the internal class has been removed. All of its contents were moved to the core.function module. For now, this will be our new structure (there is a slight chance it will be modified in the future to accommodate multi-objective functions).

    Finally, we have tests. Every test is implemented, achieving 100% coverage. Please refer to the wiki in order to run them.

    Please stay tuned for our next updates!

    Includes

    • math.hypercomplex
    • spaces
    • utils.history
    • tests (100% coverage)

    Excludes

    • functions
    Source code(tar.gz)
    Source code(zip)
  • v1.0.1(Feb 26, 2019)

    Changelog

    Description

    Welcome to v1.0.1 release. Essentially, we have reworked some basic structures and added a new math distribution module and a new optimizer (Flower Pollination Algorithm). Please stay tuned for our next updates!

    Includes

    • math.distribution
    • optimizers.fpa
    Source code(tar.gz)
    Source code(zip)
  • v1.0.0(Feb 26, 2019)

    Changelog

    Description

    This is the initial release of Opytimizer. It includes all the basic modules needed to work with it. One can create an internal optimization function and apply a Particle Swarm Optimization optimizer to it. Please check the examples folder or read the docs to learn how to use this library.

    Includes

    • core
    • functions
    • math
    • optimizers
    • utils
    Source code(tar.gz)
    Source code(zip)
Owner
Gustavo Rosa
There are no programming languages that can match up to programming logic. Machine learning researcher on work time and software engineer on free time.