Multiple-criteria decision-making (MCDM) with Electre, Promethee, Weighted Sum and Pareto

Overview

Badges: PyPI version | GitHub Issues | Contributions welcome | License: MIT | Downloads

EasyMCDM - Quick Installation methods

Install with PyPI

Once you have created your Python environment (Python 3.6+) you can simply type:

pip3 install EasyMCDM

Install with GitHub

Once you have created your Python environment (Python 3.6+) you can simply type:

git clone https://github.com/qanastek/EasyMCDM.git
cd EasyMCDM
pip3 install -r requirements.txt
pip3 install --editable .

Any modification made to the EasyMCDM package will be taken into account automatically, since we installed it with the --editable flag.

Setup with Anaconda

conda create --name EasyMCDM python=3.6 -y
conda activate EasyMCDM

More information on managing environments with Anaconda can be found in the conda cheat sheet.

Try It

Data in tests/data/donnees.csv:

alfa_156,23817,201,8,39.6,6,378,31.2
audi_a4,25771,195,5.7,35.8,7,440,33
cit_xantia,25496,195,7.9,37,2,480,34
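
Each row is an alternative followed by its criterion values (column 0 is the price and column 1 the maximum speed, as hinted by the IRMO example below). A minimal loading sketch, assuming the CSV has no header and the first column holds the alternative names:

import pandas as pd

df = pd.read_csv('tests/data/donnees.csv', header=None)
# Dict format used throughout the examples below: {alternative: [criterion values, ...]}
data = {row[0]: [float(v) for v in row[1:]] for row in df.itertuples(index=False)}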

Promethee

from EasyMCDM.models.Promethee import Promethee
import pandas as pd

data = pd.read_csv('tests/data/donnees.csv', header=None).to_numpy()
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}
weights = [0.14,0.14,0.14,0.14,0.14,0.14,0.14]
prefs = ["min","max","min","min","min","max","min"]

p = Promethee(data=data, verbose=False)
res = p.solve(weights=weights, prefs=prefs)
print(res)

Output:

{
  'phi_negative': [('rnlt_safrane', 2.381), ('vw_passat', 2.9404), ('bmw_320d', 3.3603), ('saab_tid', 3.921), ('audi_a4', 4.34), ('cit_xantia', 4.48), ('rnlt_laguna', 5.04), ('alfa_156', 5.32), ('peugeot_406', 5.461), ('cit_xsara', 5.741)],
  'phi_positive': [('rnlt_safrane', 6.301), ('vw_passat', 5.462), ('bmw_320d', 5.18), ('saab_tid', 4.76), ('audi_a4', 4.0605), ('cit_xantia', 3.921), ('rnlt_laguna', 3.6406), ('alfa_156', 3.501), ('peugeot_406', 3.08), ('cit_xsara', 3.08)],
  'phi': [('rnlt_safrane', 3.92), ('vw_passat', 2.5214), ('bmw_320d', 1.8194), ('saab_tid', 0.839), ('audi_a4', -0.27936), ('cit_xantia', -0.5596), ('rnlt_laguna', -1.3995), ('alfa_156', -1.8194), ('peugeot_406', -2.381), ('cit_xsara', -2.661)],
  'matrix': '...'
}
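
In the output, phi is the net flow (phi_positive minus phi_negative), and the lists appear to be sorted from best to worst, so the top-ranked alternative can be read directly from res['phi']; a small usage sketch assuming that structure:

# res['phi'] is a list of (alternative, net flow) pairs, best first.
best_name, best_phi = res['phi'][0]
print(best_name, best_phi)  # e.g. rnlt_safrane 3.92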

Electre Iv / Is

from EasyMCDM.models.Electre import Electre

data = {
    "A1" : [80, 90,  600, 5.4,  8,  5],
    "A2" : [65, 58,  200, 9.7,  1,  1],
    "A3" : [83, 60,  400, 7.2,  4,  7],
    "A4" : [40, 80, 1000, 7.5,  7, 10],
    "A5" : [52, 72,  600, 2.0,  3,  8],
    "A6" : [94, 96,  700, 3.6,  5,  6],
}
weights = [0.1, 0.2, 0.2, 0.1, 0.2, 0.2]
prefs = ["min", "max", "min", "min", "min", "max"]
vetoes = [45, 29, 550, 6, 4.5, 4.5]
indifference_threshold = 0.6
preference_thresholds = [20, 10, 200, 4, 2, 2] # or None for Electre Iv

e = Electre(data=data, verbose=False)

results = e.solve(weights, prefs, vetoes, indifference_threshold, preference_thresholds)
print(results)

Output:

{'kernels': ['A4', 'A5']}
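
As the comment next to preference_thresholds indicates, passing None switches to Electre Iv (vetoes and indifference threshold only), while providing thresholds gives Electre Is. A minimal sketch, assuming the same solve signature:

# Electre Iv: no preference thresholds, only vetoes and the indifference threshold.
results_iv = e.solve(weights, prefs, vetoes, indifference_threshold, None)
print(results_iv['kernels'])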

Pareto

from EasyMCDM.models.Pareto import Pareto

data = 'tests/data/donnees.csv'
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}

p = Pareto(data=data, verbose=False)
res = p.solve(indexes=[0,1,6], prefs=["min","max","min"])
print(res)

Output:

{
  'alfa_156': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'audi_a4': {'Weakly-dominated-by': ['alfa_156'], 'Dominated-by': ['alfa_156']}, 
  'cit_xantia': {'Weakly-dominated-by': ['alfa_156', 'vw_passat'], 'Dominated-by': ['alfa_156']},
  'peugeot_406': {'Weakly-dominated-by': ['alfa_156', 'cit_xantia', 'rnlt_laguna', 'vw_passat'], 'Dominated-by': ['alfa_156', 'cit_xantia', 'rnlt_laguna', 'vw_passat']},
  'saab_tid': {'Weakly-dominated-by': ['alfa_156'], 'Dominated-by': ['alfa_156']}, 
  'rnlt_laguna': {'Weakly-dominated-by': ['vw_passat'], 'Dominated-by': ['vw_passat']}, 
  'vw_passat': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'bmw_320d': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'cit_xsara': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'rnlt_safrane': {'Weakly-dominated-by': ['bmw_320d'], 'Dominated-by': ['bmw_320d']}
}
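
The Pareto-optimal alternatives are simply those whose 'Dominated-by' list is empty, so the Pareto front can be extracted from the result dictionary; a short sketch assuming the structure shown above:

# Keep only the alternatives that no other alternative dominates.
pareto_front = [name for name, status in res.items() if not status['Dominated-by']]
print(pareto_front)  # e.g. ['alfa_156', 'vw_passat', 'bmw_320d', 'cit_xsara']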

Weighted Sum

from EasyMCDM.models.WeightedSum import WeightedSum

data = 'tests/data/donnees.csv'
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}

p = WeightedSum(data=data, verbose=False)
res = p.solve(pref_indexes=[0,1,6], prefs=["min","max","min"], weights=[0.001,2,3], target='min')
print(res)

Output:

[(1, 'bmw_320d', -299.04), (2, 'alfa_156', -284.58299999999997), (3, 'rnlt_safrane', -280.84), (4, 'saab_tid', -275.817), (5, 'vw_passat', -265.856), (6, 'audi_a4', -265.229), (7, 'rnlt_laguna', -262.93600000000004), (8, 'cit_xantia', -262.504), (9, 'peugeot_406', -252.551), (10, 'cit_xsara', -244.416)]
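
The result is already an ordered ranking of (rank, alternative, weighted score) tuples, so the winner is the first entry; a quick usage sketch:

# Rank 1: the alternative with the best (here, lowest) weighted sum.
rank, name, score = res[0]
print(name, score)  # bmw_320d -299.04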

Instant-Runoff Multicriteria Optimization (IRMO)

Short description: Eliminates the worst candidate on each criterion in turn, until the last criterion is reached, then selects the best remaining candidate.

from EasyMCDM.models.Irmo import Irmo

p = Irmo(data="tests/data/donnees.csv", verbose=False)
res = p.solve(
    indexes=[0,1,4,5], # price -> max_speed -> comfort -> trunk_space
    prefs=["min","max","min","max"]
)
print(res)

Output:

{'best': 'saab_tid'}
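
For clarity, here is a standalone illustration of the instant-runoff idea described above. It is only a sketch of the behaviour suggested by the one-line description, not the library's implementation:

# Hypothetical illustration: eliminate the worst remaining candidate on each
# criterion except the last, then pick the best remaining candidate on the last one.
def irmo_sketch(data, indexes, prefs):
    candidates = dict(data)  # alternative -> list of criterion values
    for idx, pref in zip(indexes[:-1], prefs[:-1]):
        worst = (max if pref == "min" else min)(candidates, key=lambda n: candidates[n][idx])
        del candidates[worst]
    last_idx, last_pref = indexes[-1], prefs[-1]
    best = (min if last_pref == "min" else max)(candidates, key=lambda n: candidates[n][last_idx])
    return {"best": best}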

List of methods available

- Promethee
- Electre Iv / Is
- Pareto
- Weighted Sum
- Instant-Runoff Multicriteria Optimization (IRMO)

Build PyPI package

Build: python setup.py sdist bdist_wheel

Upload: twine upload dist/*

Citation

If you want to cite the tool you can use this:

@misc{EasyMCDM,
  title={EasyMCDM},
  author={Yanis Labrak and Quentin Raymondaud and Philippe Turcotte},
  publisher={GitHub},
  journal={GitHub repository},
  howpublished={\url{https://github.com/qanastek/EasyMCDM}},
  year={2022}
}