Simple and flexible ML workflow engine.

Overview

Katana ML Skipper


This is a simple and flexible ML workflow engine. It helps to orchestrate events across a set of microservices and create an executable flow to handle requests. The engine is designed to work with any microservices. Enjoy!


Author

Katana ML, Andrej Baranovskij

Instructions

Start/Stop

Docker Compose

Start:

docker-compose up --build -d

Stop:

docker-compose down

This will start the RabbitMQ container. To run the engine and the services, navigate to the related folders and follow the instructions there.

Web API (FastAPI) endpoint:

http://127.0.0.1:8080/api/v1/skipper/tasks/docs
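
Once the containers are up, a quick way to confirm the Web API is reachable is to request the docs page from Python (a minimal sketch, assuming the requests package is installed):

import requests

# The interactive docs page exposed by FastAPI; a 200 status means the API container is up
response = requests.get("http://127.0.0.1:8080/api/v1/skipper/tasks/docs")
print(response.status_code)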

Kubernetes

NGINX Ingress Controller:

If you are using a local Kubernetes setup, install the NGINX Ingress Controller.

Build Docker images:

docker-compose -f docker-compose-kubernetes.yml build

Set up Kubernetes services:

./kubectl-setup.sh

The Skipper API endpoint is published through NGINX Ingress (you can set up your own host in /etc/hosts):

http://kubernetes.docker.internal/api/v1/skipper/tasks/docs
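
If you prefer a custom hostname instead of kubernetes.docker.internal, you could add an entry like the one below to /etc/hosts and reference that host in the Ingress rule. skipper.local is a hypothetical name, and the entry assumes the Ingress controller is reachable on localhost, as it typically is with Docker Desktop:

127.0.0.1    skipper.local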

Check the NGINX Ingress Controller pod name:

kubectl get pods -n ingress-nginx

Sample response; copy the name of the 'Running' pod:

NAME                                       READY   STATUS      RESTARTS   AGE
ingress-nginx-admission-create-dhtcm       0/1     Completed   0          14m
ingress-nginx-admission-patch-x8zvw        0/1     Completed   0          14m
ingress-nginx-controller-fd7bb8d66-tnb9t   1/1     Running     0          14m

NGINX Ingress Controller logs:

kubectl logs -n ingress-nginx -f <nginx-ingress-controller-pod-name>

Skipper API logs:

kubectl logs -n katana-skipper -f -l app=skipper-api

Remove Kubernetes services:

./kubectl-remove.sh

Components

  • api - Web API implementation
  • workflow - workflow logic
  • services - a set of sample microservices; replace these with your own services and update the references in docker-compose.yml
  • rabbitmq - service for RabbitMQ broker
  • skipper-lib - reusable Python library to streamline event communication through RabbitMQ
  • logger - logger service

URLs

  • Web API
    http://127.0.0.1:8080/api/v1/skipper/tasks/docs

    If running on local Kubernetes with Docker Desktop:

    http://kubernetes.docker.internal/api/v1/skipper/tasks/docs
  • RabbitMQ (see the connection check sketch after this list):
    http://localhost:15672/ (skipper/welcome1)

    If running on local Kubernetes, make sure port forwarding is enabled:

    kubectl -n rabbits port-forward rabbitmq-0 15672:15672
  • PyPI
    https://pypi.org/project/skipper-lib/
  • OCI - deployment guide for Oracle Cloud
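
If services fail to reach the broker (see the authentication issue in the comments below), a short pika connection check against the credentials listed above can help isolate the problem. This is a minimal sketch, assuming RabbitMQ listens on localhost and the default AMQP port 5672:

import pika

# Connect with the default Skipper credentials (skipper/welcome1);
# host and port 5672 are assumptions for a local setup
credentials = pika.PlainCredentials("skipper", "welcome1")
parameters = pika.ConnectionParameters(host="localhost", port=5672, credentials=credentials)

connection = pika.BlockingConnection(parameters)
print("Connected:", connection.is_open)
connection.close()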

Usage

You can use the Skipper engine to run the Web API and workflow, and to communicate with a group of ML microservices implemented under the services package.

Skipper can be deployed to any cloud vendor with Kubernetes or Docker support. You can scale the Skipper runtime in the cloud using Kubernetes commands.
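
For example, a task request can be submitted to the Web API with a few lines of Python. The route, task_type value and payload below are illustrative assumptions only; the actual request schema is published in the Swagger docs linked above:

import requests

# Hypothetical task submission; check the Swagger docs for the real route and fields
url = "http://127.0.0.1:8080/api/v1/skipper/tasks/data"  # assumed endpoint path
body = {
    "task_type": "data",             # assumed task type handled by one of the sample services
    "payload": {"param": "value"}    # assumed payload shape
}

response = requests.post(url, json=body)
print(response.json())  # responses shown in the comments below include task_id, task_status and outcome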

License

Licensed under the Apache License, Version 2.0. Copyright 2020-2021 Katana ML, Andrej Baranovskij. Copy of the license.

Comments
  • Cache EventProducer

    I found that caching the EventProducer can improve performance by 40%. I tried it, but it blocks my requests when I increase the load in the speed test. Do you have a suggestion for how to fix that?

    opened by manhtd98 7
  • Docker-compose up not working


    Hi

    Thank you for the wonderful katana-skipper. I am trying to digest the library and execute the docker-compose.yml. But it seems like it is not working.

    Would appreciate it if you could take a look

    good first issue 
    opened by jamesee 6
  • Doc: How to add a new service with a new queue


    How do we add a new service with a new queue called translator?

    1. I add a new router with a new path for my new service, defining a new prefix and tag named translator.
    2. I create a new request model for my new service in models.py containing task_type (expecting the type translator) and a payload.
    3. I define a new service container with the correct variables and set SERVICE=translator and QUEUE_NAME=skipper_translator.

    I am able to call the new endpoint and it returns:

    task_id: "-", 
    task_status: "Success", 
    outcome: "<starlette.responses.JSONResponse object at 0x7ff2672dbed0>"
    

    However, the container is never triggered.

    What am I missing?

    opened by ladrua 4
  • The difference between event_producer and exchange_producer

    Hello, thanks for sharing your ML workflow. I would appreciate it if you could explain the difference between event_producer and exchange_producer. event_producer is used to produce an event to RabbitMQ, but exchange_producer is not clear to me. Can't we use event_producer in place of exchange_producer?

    good first issue 
    opened by fadishaar84 4
  • Encountering Authentication Issues


    When I run the start command in Docker, I get the following error in the data-service container. Would greatly appreciate guidance on how to fix this issue.

    data-service katanaml/data-service RUNNING

    Traceback (most recent call last):
      File "main.py", line 19, in <module>
        main()
      File "main.py", line 15, in main
        'http://127.0.0.1:5001/api/v1/skipper/logger/log_receiver'))
      File "/usr/local/lib/python3.7/site-packages/skipper_lib/events/event_receiver.py", line 16, in __init__
        credentials=credentials))
      File "/usr/local/lib/python3.7/site-packages/pika/adapters/blocking_connection.py", line 360, in __init__
        self._impl = self._create_connection(parameters, _impl_class)
      File "/usr/local/lib/python3.7/site-packages/pika/adapters/blocking_connection.py", line 451, in _create_connection
        raise self._reap_last_connection_workflow_error(error)
    pika.exceptions.AMQPConnectionError

    Traceback (most recent call last):
      File "main.py", line 19, in <module>
        main()
      File "main.py", line 15, in main
        'http://127.0.0.1:5001/api/v1/skipper/logger/log_receiver'))
      File "/usr/local/lib/python3.7/site-packages/skipper_lib/events/event_receiver.py", line 16, in __init__
        credentials=credentials))
      File "/usr/local/lib/python3.7/site-packages/pika/adapters/blocking_connection.py", line 360, in __init__
        self._impl = self._create_connection(parameters, _impl_class)
      File "/usr/local/lib/python3.7/site-packages/pika/adapters/blocking_connection.py", line 451, in _create_connection
        raise self._reap_last_connection_workflow_error(error)
    pika.exceptions.ProbableAuthenticationError: ConnectionClosedByBroker: (403) 'ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN. For details see the broker logfile.'

    opened by LM-01 3
  • How can we move from docker compose to kubernetes?

    Hello Andrej, I would like to ask how to move from docker-compose to Kubernetes. Do we have to use tools like kompose, or something else? I would appreciate it if you could guide me a little on how to perform this conversion so our services run on Skipper with Kubernetes instead of docker-compose. Thank you.

    opened by fadishaar84 2
Releases (v1.1.0)
  • v1.1.0 (Dec 11, 2021)

    This release of Katana ML Skipper includes:

    • Skipper Lib JS - support for Node.js containers
    • Error handling
    • Configurable FastAPI endpoints
    • Various improvements and bug fixes

    What's Changed

    • (README.md) Adding Andrej's profile url by @xandrade in https://github.com/katanaml/katana-skipper/pull/3

    New Contributors

    • @xandrade made their first contribution in https://github.com/katanaml/katana-skipper/pull/3

    Full Changelog: https://github.com/katanaml/katana-skipper/compare/v1.0.0...v1.1.0

  • v1.0.0 (Oct 9, 2021)

    First production release of Katana ML Skipper.

    Included:

    • Logger
    • Workflow
    • API async and sync
    • Services
    • Docker support
    • Kubernetes support
    • Tested on OCI Cloud

    Full Changelog: https://github.com/katanaml/katana-skipper/commits/v1.0.0
