Simplified interface for TensorFlow (mimicking Scikit Learn) for Deep Learning

Related tags

Deep Learning, skflow
Overview

SkFlow has been moved to TensorFlow.

SkFlow has been moved into the TensorFlow repository at http://github.com/tensorflow/tensorflow, specifically into the contrib folder (located here). Development will continue there. Please submit any issues and pull requests to the TensorFlow repository instead.

This repository will ramp down: after the next TensorFlow release, the code here will be wound down. Please see the instructions for the most recent installation here.

Comments
  • How do I do multilabel image classification?

    Do I have to make changes in the multioutput file? Ideally, I want to train any model, such as Inception, on my training data, which has multiple labels per example. How do I do that?

    help wanted examples 
    opened by unography 21
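
    One general approach to this (a sketch only, not an official skflow feature) is to replace the softmax output with independent per-class sigmoids, so a single example can carry several labels at once. The snippet below assumes a TF 1.x-style graph API; n_features and n_labels are illustrative.

    import tensorflow as tf

    # Illustrative sizes: e.g. Inception bottleneck features and 5 possible tags.
    n_features, n_labels = 2048, 5
    x = tf.placeholder(tf.float32, [None, n_features])
    y = tf.placeholder(tf.float32, [None, n_labels])  # multi-hot targets, e.g. [1, 0, 1, 0, 0]

    w = tf.Variable(tf.truncated_normal([n_features, n_labels], stddev=0.01))
    b = tf.Variable(tf.zeros([n_labels]))
    logits = tf.matmul(x, w) + b

    # One independent cross-entropy per label instead of a single softmax.
    loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
    predicted = tf.cast(tf.sigmoid(logits) > 0.5, tf.int32)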
  • Add early stopping and reporting based on validation data

    This PR allows a user to specify a validation dataset that is used for early stopping (and reporting). The PR was created to address issue 85.

    I made changes in 3 places.

    1. The trainer now takes a dictionary containing the validation data (in the same format as the output of the data feeder's get_dict_fn).
    2. The fit method now takes arguments for val_X and val_y. It converts these into the correct format for the trainer.
    3. The example file digits.py now uses early stopping, by supplying val_X and val_y.

    I can add early stopping to other examples if this approach looks good, though their behavior should not otherwise be affected by the current PR.

    cla: yes 
    opened by dansbecker 14
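
    A minimal usage sketch of the interface described in this PR, assuming a skflow.TensorFlowDNNClassifier-style estimator and the val_X/val_y argument names proposed above (the dataset and split are illustrative):

    import skflow
    from sklearn import datasets

    digits = datasets.load_digits()
    # Illustrative split: hold out the last 300 examples for validation.
    X_train, y_train = digits.data[:-300], digits.target[:-300]
    X_val, y_val = digits.data[-300:], digits.target[-300:]

    classifier = skflow.TensorFlowDNNClassifier(hidden_units=[50, 20], n_classes=10, steps=2000)
    # val_X / val_y follow the argument names proposed in this PR; training stops early
    # once the validation metric stops improving.
    classifier.fit(X_train, y_train, val_X=X_val, val_y=y_val)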
  • Class weight support

    Hi,

    I am using skflow.ops.dnn to classify a two-class dataset (True and False). The percentage of True examples is very small, so I have an imbalanced dataset.

    It seems to me that one way to resolve the issue is to use weighted classes. However, when I look at the implementation of skflow.ops.dnn, I do not see how I could use class weights with the DNN.

    Is it possible to do that with skflow, or is there another technique for dealing with the imbalanced-dataset problem in skflow?

    Thanks

    enhancement 
    opened by vinhqdang 13
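
    Until class weights are supported directly, one workaround that requires no changes to skflow.ops.dnn is to rebalance the training set before calling fit, e.g. by oversampling the minority (True) class. A sketch in plain NumPy; the helper and array names are illustrative:

    import numpy as np

    def oversample_minority(X, y, minority_label=True, seed=42):
        """Duplicate minority-class rows until both classes are roughly the same size."""
        rng = np.random.RandomState(seed)
        minority_idx = np.where(y == minority_label)[0]
        majority_idx = np.where(y != minority_label)[0]
        # Sample minority indices with replacement up to the majority count.
        resampled = rng.choice(minority_idx, size=len(majority_idx), replace=True)
        idx = np.concatenate([majority_idx, resampled])
        rng.shuffle(idx)
        return X[idx], y[idx]

    # X_train, y_train are the original imbalanced arrays (illustrative names):
    # X_bal, y_bal = oversample_minority(X_train, y_train)
    # classifier.fit(X_bal, y_bal)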
  • Added verbose option

    I added an option to control the verbosity. For this, I added a "verbose" parameter to the __init__ method in the __init__.py file and to the train function in trainer.py. In addition, I passed this argument through to the "self._trainer.train()" call in the __init__ file and added a condition around the prints in trainer.py.

    cla: no 
    opened by ivallesp 12
  • Predict batch size default

    This changes the default batch size for prediction to be the same as for training, enabling efficient grid search. Previously GridSearchCV would try to make predictions in a single batch, which could take a lot of memory.

    This also adds a simple example of using skflow with GridSearchCV.

    cla: no 
    opened by mheilman 11
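
    A rough sketch of the kind of grid search this change enables, relying on skflow's scikit-learn-compatible estimator interface; the parameter grid, dataset, and hyperparameter values are illustrative, not part of this PR:

    import skflow
    from sklearn import datasets
    from sklearn.model_selection import GridSearchCV  # sklearn.grid_search in 2016-era releases

    iris = datasets.load_iris()
    classifier = skflow.TensorFlowDNNClassifier(hidden_units=[10, 10], n_classes=3, steps=500)

    # With prediction batches now sized like training batches, each fold's predict()
    # no longer tries to score the whole fold in one giant batch.
    grid = GridSearchCV(classifier,
                        param_grid={'learning_rate': [0.01, 0.1], 'batch_size': [32, 64]},
                        cv=3)
    grid.fit(iris.data, iris.target)
    print(grid.best_params_)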
  • Add example accessing of weights

    It wasn't clear how to access weights using the classifier.get_tensor_value('foo') syntax. This adds some examples for the CNN model. The tensor names were figured out by logging the training run as though for TensorBoard and then running strings on the logfile to look for the right namespace.

    Is there a better way to access these weights? Or to learn their names? The logging must walk through the graph and record these names. Maybe if there were a way to quickly list all the names, that'd be enough for advanced users to figure it out.

    cla: yes 
    opened by dvbuntu 10
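
    A sketch of the access pattern this issue is about, using the classifier.get_tensor_value syntax mentioned above; the tensor names below are hypothetical stand-ins for the namespaced names recovered from the log:

    import skflow
    from sklearn import datasets

    iris = datasets.load_iris()
    classifier = skflow.TensorFlowDNNClassifier(hidden_units=[10, 20, 10], n_classes=3, steps=200)
    classifier.fit(iris.data, iris.target)

    # Hypothetical tensor names -- the real, namespaced names depend on how the graph
    # was built, which is exactly what this issue is asking how to discover.
    weights = classifier.get_tensor_value('dnn/layer0/Linear/Matrix:0')
    bias = classifier.get_tensor_value('dnn/layer0/Linear/Bias:0')
    print(weights.shape, bias.shape)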
  • Plotting neural network built by skflow

    Hi,

    Sorry for asking so many questions.

    I think plotting is always a nice feature. Is it possible right now with skflow (or can we do that through TensorFlow directly)?

    opened by vinhqdang 10
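
    skflow itself has no plotting, but the graph and training summaries can be inspected with TensorBoard by passing a log directory to fit; a sketch, assuming the logdir argument behaves as in the skflow examples:

    import skflow
    from sklearn import datasets

    iris = datasets.load_iris()
    classifier = skflow.TensorFlowDNNClassifier(hidden_units=[10, 20, 10], n_classes=3, steps=500)
    # Writing summaries to a log directory lets TensorBoard draw the graph and the loss curve:
    #   tensorboard --logdir=/tmp/skflow_iris
    classifier.fit(iris.data, iris.target, logdir='/tmp/skflow_iris')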
  • move monitor and logdir arguments to init

    opened by mheilman 8
  • Exception when running language model example

    Hi,

    Thanks for making this tool. It will definitely make things easier for NN newcomers.

    I just tried running your language model example and got the following exception:

    Traceback (most recent call last):
      File "test.py", line 84, in <module>
        estimator.fit(X, y)
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/estimators/base.py", line 243, in fit
        feed_params_fn=self._data_feeder.get_feed_params)
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/trainer.py", line 114, in train
        feed_dict = feed_dict_fn()
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/io/data_feeder.py", line 307, in _feed_dict_fn
        inp[i, :] = six.next(self.X)
    StopIteration
    

    I made sure that my Python distribution has the correct version of six. I tried running it both in a virtual environment and in a normal Python 3 distribution. Any ideas what might be causing this?

    opened by savkov 7
  • another ValidationMonitor with validation(+early stopping) per epoch

    From what I understand, the existing ValidationMonitor performs validation every [print_steps] steps and checks the stopping condition every [early_stopping_rounds] steps. I'd like to add another ValidationMonitor that performs validation and checks the stopping condition once per epoch. Is this the recommended practice in machine learning regarding validation and early stopping? I mean I'd like to add a fit process along these lines:

    def fit(self, x_train, y_train, x_validate, y_validate):
        previous_validation_loss = float('inf')
        current_validation_loss = some_error(y_validate, self.predict(x_validate))
        while current_validation_loss < previous_validation_loss:
            self.train_one_more_epoch(x_train, y_train)
            previous_validation_loss = current_validation_loss
            current_validation_loss = some_error(y_validate, self.predict(x_validate))
    
    enhancement help wanted 
    opened by alanyuchenhou 7
  • Example of language model

    Add an example of a language model (RNN), for example a character-level model trained on a Shakespeare book (similar to https://github.com/sherjilozair/char-rnn-tensorflow).

    examples 
    opened by ilblackdragon 7
  • .travis.yml: The 'sudo' tag is now deprecated in Travis CI

    opened by cclauss 1
  • Why hasn't this repo been archived yet?

    New versions of TF have been released since the last commit to this repo. As far as I understand from reading this project's README file, you intended to close this repo, so why hasn't that been done yet?

    opened by nbro 0
Releases (v0.1)
  • v0.1 (Feb 14, 2016)
