A Lucid Framework for Transparent and Interpretable Machine Learning Models.

Overview

https://raw.githubusercontent.com/lucidmode/lucidmode/main/images/lucidmode_logo.png




Currently in beta.


lucidmode is an open-source, low-code, and lightweight Python framework for transparent and interpretable machine learning models. It includes built-in machine learning methods optimized for visual interpretation of some of the most relevant calculations.

Documentation

Installation

  • With package manager (coming soon)

Install using the pip package manager:

pip install lucidmode

  • Cloning the repository

Clone the entire GitHub project:

git clone git@github.com:lucidmode/lucidmode.git

and then install the dependencies:

pip install -r requirements.txt

Models

Artificial Neural Network

Feedforward multilayer perceptron trained with backpropagation; a minimal usage sketch follows the method list below.

  • fit: Fit model to data
  • predict: Prediction according to model
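
A minimal usage sketch is shown below. Class, method, and parameter names (Sequential, formation, hidden_l, hidden_a, output_n, output_a) are taken from the release notes further down; the exact module path, argument values, and signatures are assumptions and may differ in the current beta.

    import numpy as np
    from lucidmode.models import Sequential    # module path is an assumption

    # Toy binary-classification data (in the spirit of the random XOR test in v0.1)
    X = np.random.rand(200, 2)
    y = (X[:, 0] > X[:, 1]).astype(int)

    # Architecture only at instantiation; training options go to formation()
    model = Sequential(hidden_l=[8, 4],              # neurons per hidden layer
                       hidden_a=['tanh', 'sigmoid'], # activation per hidden layer
                       output_n=1,                   # neurons in the output layer
                       output_a='sigmoid')           # output activation

    # Hypothetical formation call: weight init, cost, metrics, optimizer
    model.formation(init='xavier-uniform', cost='binary-logloss',
                    metrics=['acc'], optimizer='SGD')

    model.fit(X, y)            # fit model to data (extra arguments such as epochs may be required)
    y_hat = model.predict(X)   # prediction according to model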

Initialization, Activations, Cost Functions, Regularization, Optimization

  • Weights Initialization: 4 types of criteria (zeros, xavier, common, he); a NumPy sketch of these schemes follows this list
  • Activation Functions: sigmoid, tanh, ReLU
  • Cost Functions: Sum of Squared Error, Binary Cross-Entropy, Multi-Class Cross-Entropy
  • Regularization: L1, L2, ElasticNet for weights in cost function and in gradient updating
  • Optimization: Weights optimization with Gradient Descent variants (GD, SGD, Batch) and a configurable learning rate
  • Execution: Callback (metric threshold), History (Cost and metrics)
  • Hyperparameter Optimization: Random Grid Search with Memory
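
The four initialization criteria correspond to standard schemes; the following NumPy sketch illustrates the idea (it is not lucidmode's internal code, and the function name is hypothetical):

    import numpy as np

    def init_weights(n_in, n_out, criteria='xavier', seed=0):
        """Illustrative dense-layer initialization for an (n_in x n_out) weight matrix."""
        rng = np.random.default_rng(seed)
        if criteria == 'zeros':                    # all weights start at zero
            return np.zeros((n_in, n_out))
        if criteria == 'common':                   # plain small uniform values
            return rng.uniform(-0.05, 0.05, size=(n_in, n_out))
        if criteria == 'xavier':                   # Glorot: variance ~ 1 / fan_avg
            limit = np.sqrt(6.0 / (n_in + n_out))
            return rng.uniform(-limit, limit, size=(n_in, n_out))
        if criteria == 'he':                       # He: variance ~ 2 / fan_in, pairs well with ReLU
            return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        raise ValueError(f'unknown criteria: {criteria}')

    W1 = init_weights(784, 30, criteria='he')      # e.g. first layer for flattened MNIST input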

Complementary

  • Metrics: Accuracy, Confusion Matrix (Binary and Multiclass), Confusion Tensor (Multiclass OvR); a sketch of the confusion tensor follows this list
  • Visualizations: Cost evolution
  • Public Datasets: MNIST, Fashion MNIST
  • Special Datasets: OHLCV + Symbolic Features of Cryptocurrencies (ETH, BTC)
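
The Confusion Tensor stacks one binary one-vs-rest (OvR) confusion matrix per class; a compact NumPy sketch of the idea (not the library's implementation):

    import numpy as np

    def confusion_tensor(y_true, y_pred, n_classes):
        """Stack one 2x2 one-vs-rest confusion matrix per class: shape (n_classes, 2, 2)."""
        tensor = np.zeros((n_classes, 2, 2), dtype=int)
        for c in range(n_classes):
            t = (y_true == c)                      # True = class c, False = rest
            p = (y_pred == c)
            tensor[c, 0, 0] = np.sum(t & p)        # TP
            tensor[c, 0, 1] = np.sum(~t & p)       # FP
            tensor[c, 1, 0] = np.sum(t & ~p)       # FN
            tensor[c, 1, 1] = np.sum(~t & ~p)      # TN
        return tensor

    y_true = np.array([0, 1, 2, 2, 1, 0])
    y_pred = np.array([0, 2, 2, 2, 1, 1])
    print(confusion_tensor(y_true, y_pred, n_classes=3))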

Important Links

Author/Principal Maintainer

Francisco Muñoz (IFFranciscoME) is an associate professor of financial engineering and financial machine learning at ITESO (Western Institute of Technology and Higher Education).

License

GNU General Public License v3.0

Permissions of this strong copyleft license are conditioned on making available complete source code of licensed works and modifications, which include larger works using a licensed work, under the same license. Copyright and license notices must be preserved. Contributors provide an express grant of patent rights.

Contact: For more information regarding this repository, please contact [email protected]

Releases
  • v0.4-beta1.0 (Apr 29, 2021)

    Metrics

    • Calculation of several metrics for classification: sensitivity (TPR), specificity (TNR), accuracy (acc), likelihood ratio (positive), likelihood ratio (negative), confusion matrix (binary and multiclass), and confusion tensor (one binary matrix for every class in multi-class); illustrative formulas follow below
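
    Illustrative formulas for the binary case, derived from confusion-matrix counts (not the library's internal code):

      def binary_metrics(tp, fp, fn, tn):
          """Classification metrics from binary confusion-matrix counts."""
          tpr = tp / (tp + fn)                   # sensitivity / recall
          tnr = tn / (tn + fp)                   # specificity
          acc = (tp + tn) / (tp + fp + fn + tn)  # accuracy
          lr_pos = tpr / (1.0 - tnr)             # positive likelihood ratio = TPR / FPR
          lr_neg = (1.0 - tpr) / tnr             # negative likelihood ratio = FNR / TNR
          return {'tpr': tpr, 'tnr': tnr, 'acc': acc, 'lr+': lr_pos, 'lr-': lr_neg}

      print(binary_metrics(tp=40, fp=10, fn=5, tn=45))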

    Sequential Class

    • Move the cost_f and cost_r parameters so they are specified through the formation method, leaving the class instantiation with just the model architecture

    • Move the init_weights method so it is specified through the formation method

    Execution

    • Create a formation method in the Sequential class, with the following parameters: init, cost, metrics, optimizer

    • Store selected metrics in Train and Validation History

    Visualizations

    • Select metrics for verbose output
  • v0.3-beta1.0 (Apr 27, 2021)

    Regularization:

    • On weights and biases, in the gradients: L1, L2 and ElasticNet
    • On weights and biases, in the cost function: L1, L2 and ElasticNet (a sketch of both terms follows this list)
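
    A minimal sketch of both uses of the penalty, assuming a mixing parameter between L1 and L2 (the names lam and ratio are illustrative):

      import numpy as np

      def elasticnet_terms(W, lam=1e-3, ratio=0.5):
          """ElasticNet penalty and its (sub)gradient; ratio=1 -> pure L1, ratio=0 -> pure L2."""
          cost_term = lam * (ratio * np.sum(np.abs(W)) + (1 - ratio) * 0.5 * np.sum(W ** 2))
          grad_term = lam * (ratio * np.sign(W) + (1 - ratio) * W)
          return cost_term, grad_term      # add cost_term to the cost, grad_term to the gradient update

      W = np.array([[0.5, -1.2], [0.0, 2.0]])
      penalty, grad = elasticnet_terms(W)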

    Numerical Stability:

    • In functions.py, in the cost computation, a 1e-25 value is added to A to avoid divide-by-zero and invalid-multiply cases in np.log(A); a minimal reproduction follows
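
    A minimal reproduction of the issue and the fix:

      import numpy as np

      Y = np.array([1.0, 0.0, 1.0])      # true labels
      A = np.array([1.0, 0.0, 0.7])      # predictions containing exact 0 and 1

      # Without the small constant, np.log(A) and np.log(1 - A) return -inf at 0/1
      # and produce invalid 0 * -inf products inside the cross-entropy sum.
      eps = 1e-25
      cost = -np.mean(Y * np.log(A + eps) + (1 - Y) * np.log(1 - A + eps))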

    Data Handling:

    • train and validation cost

    Visualization:

    • print: verbose output of cost evolution

    Documentation:

    • Improve README
  • v0.2-beta1.0 (Apr 27, 2021)

    Files:

    • Complete dataset: MNIST
    • Complete dataset: 'fashion-MNIST'

    Tests passed:

    • fashion MNIST
    • previous release tests

    Topology

    • single hidden layer (tested)
    • 1 - 2 hidden layers (tested)
    • different activation functions among hidden layers

    Activation functions:

    • For hidden -> Sigmoid, Tanh, ReLU (tested and not working)
    • For output -> Softmax (see the sketch below)
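
    A standard NumPy version of the output Softmax (shifting by the row maximum keeps exp() numerically stable); illustrative only, not the library's implementation:

      import numpy as np

      def softmax(Z):
          """Row-wise softmax: turns raw output scores into class probabilities."""
          Z_shift = Z - Z.max(axis=1, keepdims=True)
          expZ = np.exp(Z_shift)
          return expZ / expZ.sum(axis=1, keepdims=True)

      print(softmax(np.array([[2.0, 1.0, 0.1]])))   # each row sums to 1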

    Cost Functions:

    • 'binary-logloss' (Binary-class Cross-Entropy)
    • 'multi-logloss' (Multi-class Cross-Entropy)

    Metrics:

    • Confusion matrix (Multi-class)
    • Accuracy (Multi-class)
  • v0.1-beta1.0 (Apr 26, 2021)

    First release!

    Tests passed:

    • Random XOR data classification

    Sequential model:

    • hidden_l: Number of neurons per hidden layer (list of int, with a length of l_hidden)
    • hidden_a: Activation of hidden layers (list of str, with length l_hidden)
    • output_n: Number of neurons in the output layer (1)
    • output_a: Activation of output layer (str)

    Layer transformations:

    • linear

    Activation functions:

    • For hidden -> Sigmoid, Tanh
    • For output -> Sigmoid (Binary)

    Weights Initialization:

    • Xavier normal, Xavier uniform, common uniform, according to [1]

    Training Schemes:

    • Gradient Descent

    Cost Functions:

    • Sum of Squared Error (SSE) or Residual Sum of Squares (RSS)

    Metrics:

    • Accuracy (Binary)
    LucidNet_v0.1-beta1.0.zip (111.97 MB)