- Apps - Bibliometric and Scientometric - Python
- Apps - Machine Learning - Python ( Lesson Included )
- Apps - Multicriteria Decision Aids - Java/Python ( Lesson Included )
- Apps - Operations Research - Java/Python ( Lesson Included )
- Data Science - Association Rules - Python ( Lesson Included )
- Data Science - Classification Algorithms - Python ( Lesson Included )
- Data Science - Regression Algorithms - Python ( Lesson Included )
- Data Science - Clustering Algorithms - Python ( Lesson Included )
- Data Science - Decision Trees - Python ( Lesson Included )
- Data Science - Recommender Systems - Python ( Lesson Included )
- Data Science - Natural Language Processing - Python ( Lesson Included )
- Data Science - Neural Networks - Python ( Lesson Included )
- Data Science - Reinforcement Learning - Python ( Lesson Included )
- Data Science - Deep Reinforcement Learning - Python ( Lesson Included )
- Forecasting - Python
- Multivariate Data Analysis - R and SPSS ( Lesson Included )
- Others - Python
- pyBibX: A Bibliometric and Scientometric Python Library Powered with Artificial Intelligence Tools
- pyAutoSummarizer: An Extractive and Abstractive Summarization Library Powered with Artificial Intelligence
- pyRecommenderSystem: A Recommender System Library.
Recommender Systems Lesson
- pyDecision: A Python library for MCDA methods. Lessons Included: AHP Lesson; ELECTRE Lessons (slides in pt-br) - ELECTRE I Lesson; ELECTRE I_v Lesson; ELECTRE I_s Lesson; ELECTRE II Lesson; ELECTRE III Lesson; ELECTRE IV Lesson and ELECTRE Tri Lesson; PROMETHEE Lessons - PROMETHEE I Lesson; PROMETHEE II Lesson; PROMETHEE III Lesson; PROMETHEE IV Lesson and PROMETHEE V Lesson
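As a quick illustration of the kind of method pyDecision implements, the sketch below derives AHP priority weights from a pairwise comparison matrix with the principal-eigenvector approach. It is plain NumPy rather than pyDecision's API, and the matrix values are invented.

```python
import numpy as np

# Hypothetical 3x3 reciprocal pairwise comparison matrix (Saaty 1-9 scale).
pcm = np.array([[1.0, 3.0, 5.0],
                [1/3, 1.0, 2.0],
                [1/5, 1/2, 1.0]])

# The principal eigenvector of the PCM gives the AHP priority weights.
eigenvalues, eigenvectors = np.linalg.eig(pcm)
principal = np.argmax(eigenvalues.real)
weights = eigenvectors[:, principal].real
weights = weights / weights.sum()

print("priority weights:", np.round(weights, 4))
```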
- pyMissingAHP - A Method to Infer AHP Missing Pairwise Comparisons. Lesson Included: AHP Lesson
- 3MO-AHP - A Multiobjective Inconsistency Reduction Technique for AHP and Fuzzy-AHP Methods. The following objectives can be used: minimum inconsistency (fMI), the total number of adjusted pairwise comparisons (fNC), original rank preservation (fKT), minimum average weights adjustment (fWA) and, finally, minimum L1 matrix norm between the original PCM and the adjusted PCM (fLM). Lesson Included: AHP Lesson
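Saaty's consistency ratio is the usual measure of the inconsistency that 3MO-AHP (and Voracious-AHP below) aims to reduce. A minimal NumPy computation, independent of these tools and using an invented matrix, might look like this:

```python
import numpy as np

def consistency_ratio(pcm):
    """Saaty consistency ratio of a reciprocal pairwise comparison matrix (n >= 3)."""
    n = pcm.shape[0]
    # Saaty's random consistency index values for n = 1..10.
    ri_table = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}
    lambda_max = np.max(np.linalg.eigvals(pcm).real)
    ci = (lambda_max - n) / (n - 1)      # consistency index
    return ci / ri_table[n]              # CR < 0.10 is usually considered acceptable

pcm = np.array([[1.0, 3.0, 5.0],
                [1/3, 1.0, 2.0],
                [1/5, 1/2, 1.0]])
print("CR:", round(consistency_ratio(pcm), 4))
```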
- Ranking-Trees (Inference for ELECTRE II, III and IV Parameters and Inference for PROMETHEE I, II, III and IV Parameters) : A Python implementation of the Ranking-Trees Algorithm. Ranking-Trees can elicit (infer) all or any combination of the following parameters: ELECTRE II ( Colab Demo ) - criteria weights, concordance parameters and discordance parameters; ELECTRE III ( Colab Demo ) - criteria weights, indifference threshold, preference threshold and veto threshold; ELECTRE IV ( Colab Demo ) - indifference threshold, preference threshold and veto threshold; PROMETHEE I ( Colab Demo ) - P, Q, S and F parameters; PROMETHEE II ( Colab Demo ) - P, Q, S and F parameters; PROMETHEE III ( Colab Demo ) - P, Q, S, F and lambda parameters; PROMETHEE IV ( Colab Demo ) - P, Q, S and F parameters. Lessons Included (slides in pt-br): ELECTRE I Lesson; ELECTRE I_v Lesson; ELECTRE I_s Lesson; ELECTRE II Lesson; ELECTRE III Lesson; ELECTRE IV Lesson and ELECTRE Tri Lesson
- ELECTRE-Tree (Inference for ELECTRE Tri-B Parameters) : A Python implementation of the ELECTRE-Tree Algorithm. ELECTRE-Tree can elicit (infer) all or any combination of the following ELECTRE Tri-B parameters: criteria weights, indifference threshold, preference threshold, veto threshold, and the lambda cut level ( Colab Demo ). Lessons Included (slides in pt-br): ELECTRE I Lesson; ELECTRE I_v Lesson; ELECTRE I_s Lesson; ELECTRE II Lesson; ELECTRE III Lesson; ELECTRE IV Lesson and ELECTRE Tri Lesson
- J-ELECTRE (ELimination and Choice Expressing REality) : ELECTRE I, I_s, I_v, II, III, IV, TRI and TRI-ME Software. Lessons Included (slides in pt-br): ELECTRE I Lesson; ELECTRE I_v Lesson; ELECTRE I_s Lesson; ELECTRE II Lesson; ELECTRE III Lesson; ELECTRE IV Lesson and ELECTRE Tri Lesson
- Voracious-AHP (Analytic Hierarchy Process) : A software that can remove or reduce the inconsistency from an AHP Matrix. Lesson Included: AHP Lesson
- J-Horizon : A Vehicle Routing Problem Software. It handles the CVRP (Capacitated VRP), MDVRP (Multiple Depot VRP), VRPTW (VRP with Time Windows), VRPB (VRP with Backhauls), VRPPD (VRP with Pickups and Deliveries), VRP with Homogeneous or Heterogeneous Fleet, TSP, mTSP and various combinations of these types
- J-EOQ-SA : EOQ (Economic Order Quantity) for a single product with No Discounts, All Units or Incremental Discounts, and with or without Backorders
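For the simplest of these settings (single product, no discounts, no backorders) the classical EOQ formula is Q* = sqrt(2DS/H). The sketch below is a generic illustration, not J-EOQ-SA's interface, and the demand and cost figures are made up.

```python
from math import sqrt

def eoq(demand_per_year, order_cost, holding_cost_per_unit_year):
    """Classical Economic Order Quantity: Q* = sqrt(2DS / H)."""
    return sqrt(2 * demand_per_year * order_cost / holding_cost_per_unit_year)

# Hypothetical figures: 12,000 units/year demand, $150 per order, $4/unit/year holding cost.
q_star = eoq(12_000, 150, 4)
print(f"optimal order quantity: {q_star:.1f} units")
```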
- pyMetaheuristic : A library that implements Metaheuristics to solve single-objective problems.
GA Lesson
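To give a flavour of the single-objective metaheuristics involved, here is a toy genetic algorithm minimizing the sphere function. It is a generic sketch and does not use pyMetaheuristic's API; population size, rates and bounds are arbitrary.

```python
import random

def sphere(x):
    # Objective to minimize: f(x) = sum(x_i^2), optimum at the origin.
    return sum(v * v for v in x)

def genetic_algorithm(dim=5, pop_size=30, generations=200,
                      mutation_rate=0.1, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sphere)                      # rank by fitness (lower is better)
        parents = pop[:pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, dim)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:   # random-reset mutation
                child[random.randrange(dim)] = random.uniform(lo, hi)
            children.append(child)
        pop = parents + children
    return min(pop, key=sphere)

best = genetic_algorithm()
print("best solution:", [round(v, 3) for v in best], "fitness:", round(sphere(best), 6))
```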
- pyMultiobjective : A library that implements Multiobjective Optimization Algorithms and Many-Objective Optimization Algorithms
- pyCombinatorial : A library to solve the TSP (Travelling Salesman Problem) using Exact Algorithms, Heuristics and Metaheuristics
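A nearest-neighbour construction is about the simplest heuristic in this family. The sketch below is generic (it does not call pyCombinatorial), and the city coordinates are invented.

```python
import math

cities = [(0, 0), (2, 1), (5, 2), (6, 6), (1, 5)]   # hypothetical coordinates

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(points, start=0):
    """Greedy TSP heuristic: always visit the closest unvisited city."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(points[last], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbour_tour(cities)
length = sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]]) for i in range(len(tour)))
print("tour:", tour, "length:", round(length, 2))
```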
- pyVRP : A Python library that solves, using Genetic Algorithms, many VRP Problems (Capacitated VRP; Multiple Depot VRP; VRP with Time Windows; VRP with Heterogeneous Fleet; VRP with Infinite Fleet; Open VRP; TSP; mTSP and various combinations of these types).
GA for VRP Lesson
- pyID : A Python library that implements many Intermittent Demand Methods (Croston; SBA; SBJ; TSB; HES; LES and SES)
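Croston's method, the first entry in that list, smooths non-zero demand sizes and inter-demand intervals separately and forecasts their ratio. The sketch below is a plain-Python illustration, not pyID's API, and the demand series is fabricated.

```python
def croston(demand, alpha=0.1):
    """Croston's intermittent demand forecast: smoothed size / smoothed interval."""
    first = next(i for i, d in enumerate(demand) if d > 0)   # first non-zero demand
    z = demand[first]        # smoothed demand size
    p = first + 1            # smoothed inter-demand interval
    periods_since = 0
    forecasts = []
    for d in demand[first + 1:]:
        periods_since += 1
        if d > 0:
            z = alpha * d + (1 - alpha) * z
            p = alpha * periods_since + (1 - alpha) * p
            periods_since = 0
        forecasts.append(z / p)   # per-period demand rate forecast
    return forecasts

series = [0, 3, 0, 0, 5, 0, 2, 0, 0, 0, 4]   # hypothetical intermittent demand
print([round(f, 3) for f in croston(series)])
```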
- Apriori Algorithm : An Association Rule Learning Algorithm Over Transaction Databases
Apriori Algorithm Lesson
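A minimal sketch of the Apriori idea: count supports level by level, building candidates only from itemsets that were frequent at the previous level. The transactions are invented and the code does not use the repository's implementation.

```python
from itertools import combinations

transactions = [                      # hypothetical shopping baskets
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 0.6                     # minimum fraction of transactions

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

# Level 1: frequent single items.
items = {i for t in transactions for i in t}
frequent = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]

# Level 2: candidate pairs built only from frequent items (the Apriori pruning step).
pairs = [frozenset(c) for c in combinations(sorted({i for f in frequent for i in f}), 2)]
frequent_pairs = [p for p in pairs if support(p) >= min_support]

print("frequent items:", [set(f) for f in frequent])
print("frequent pairs:", [set(p) for p in frequent_pairs])
```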
- Naive Bayes Classifier : Classifier for Supervised Learning Problems
Naive Bayes Classifier Lesson
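For a quick hands-on Naive Bayes example, the sketch below uses scikit-learn's GaussianNB on the iris dataset; it is illustrative only and is not the repository's own implementation.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Gaussian Naive Bayes assumes each feature is normally distributed within a class.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = GaussianNB().fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 3))
```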
Support Vector Machines Lesson
Gradient Boosting Lesson
Multiple Linear Regression Lesson
Logistic Regression Lesson
Multinomial Regression Lesson
Support Vector Regression Lesson
Gradient Boosting Lesson
- ID3 (Iterative Dichotomiser 3) : A Decision Tree for Categorical Data with Pruning Methods
ID3 (Iterative Dichotomiser 3) Lesson
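ID3 chooses splits by information gain (the reduction in entropy). Here is a small stand-alone computation on an invented "play tennis"-style table, assuming the standard ID3 criterion; it is not the repository's code.

```python
from collections import Counter
from math import log2

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(rows, labels, attribute_index):
    """Gain of splitting on one categorical attribute (ID3's split criterion)."""
    base = entropy(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attribute_index], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub) for sub in by_value.values())
    return base - remainder

# Hypothetical data: (outlook, windy) -> play?
rows = [("sunny", "no"), ("sunny", "yes"), ("overcast", "no"),
        ("rain", "no"), ("rain", "yes"), ("overcast", "yes")]
labels = ["no", "no", "yes", "yes", "no", "yes"]
print("gain(outlook):", round(information_gain(rows, labels, 0), 3))
print("gain(windy):  ", round(information_gain(rows, labels, 1), 3))
```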
- C4.5 : A Decision Tree for Numerical and Categorical Data that can Handle Missing Values, with Pruning Methods
C4.5 Lesson
- CART (Classification And Regression Trees) : A Decision Tree for Numerical and Categorical Data that can Handle Missing Values, with Pruning Methods
CART (Classification And Regression Trees) Lesson
CHAID (Chi-square Automatic Interaction Detection) Lesson
- Random Forest : A Decision Tree Ensemble for Numerical and Categorical Data that can Handle Missing Values
Random Forest Lesson
Isolation Forest & Extended Isolation Forest Lesson
( Colab Demo - Isolation Forest ) ( Colab Demo - Extended Isolation Forest )
- CBF (Content Based Filtering) : Content-Based Filtering using TF-IDF Matrices with Cosine Similarity
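A minimal sketch of that idea, TF-IDF vectors compared with cosine similarity, using scikit-learn rather than the repository's CBF function; the item descriptions are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

items = [
    "action movie with car chases and explosions",
    "romantic comedy set in Paris",
    "action thriller with spies and explosions",
    "documentary about French cooking",
]

tfidf = TfidfVectorizer(stop_words="english")
matrix = tfidf.fit_transform(items)              # items x terms TF-IDF matrix
similarity = cosine_similarity(matrix)           # items x items cosine similarity

# Recommend the item most similar to item 0 (excluding itself).
scores = similarity[0].copy()
scores[0] = -1
print("most similar to item 0:", scores.argmax())
```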
- CF Item (Collaborative Filtering - Item Based) : Collaborative Filtering Function using an Item Based Regression Approach
- CF User (Collaborative Filtering - User Based) : Collaborative Filtering Function using a User-Based Regression Approach
- CF User-Item (Collaborative Filtering - User & Item Based) : Collaborative Filtering Function using a User-Item Based Regression Approach
- CF Latent Factors (Collaborative Filtering - Latent Factors) : Collaborative Filtering Function using Regression with Latent Factors Approach
- CF Nearest Neighbors (Collaborative Filtering - Nearest Neighbors) : Collaborative Filtering Function using a Nearest Neighbors Approach
- CF SVD (Collaborative Filtering - SVD) : Collaborative Filtering Function using an SVD (Singular Value Decomposition) Approach
Recommender Systems Lesson
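To illustrate the SVD approach, the sketch below factorizes a tiny invented rating matrix with NumPy and reads predicted ratings off the low-rank reconstruction; it is not the repository's CF SVD function, and the mean-fill strategy is one simple choice among several.

```python
import numpy as np

# Hypothetical user x item rating matrix (0 = not rated).
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 1, 0],
                    [1, 0, 5, 4],
                    [0, 1, 4, 5]], dtype=float)

# Fill missing entries with each user's mean before factorizing.
filled = ratings.copy()
for i in range(filled.shape[0]):
    rated = ratings[i] > 0
    filled[i, ~rated] = ratings[i, rated].mean()

# Rank-2 truncated SVD gives a smoothed reconstruction of the rating matrix.
U, s, Vt = np.linalg.svd(filled, full_matrices=False)
k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print("predicted rating, user 0 / item 2:", round(approx[0, 2], 2))
```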
- LDA (Latent Dirichlet Allocation) : Latent Dirichlet Allocation (LDA) function. Also computes the dtm, binary dtm, tf dtm and tf-idf dtm
LDA (Latent Dirichlet Allocation) Lesson ( Colab Demo )
Word Embedding Lesson ( Colab Demo - W2V ) ( Colab Demo - D2V )
Extractive and Abstractive Summarization Lesson ( Colab Demo - TextRank ) ( Colab Demo - LexRank ) ( Colab Demo - LSA ) ( Colab Demo - KL-Sum ) ( Colab Demo - BART ) ( Colab Demo - T5 ) ( Colab Demo - chatGPT )
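As a hands-on counterpart to the LDA entry above, the sketch below builds a document-term matrix and fits a two-topic LDA with scikit-learn on an invented corpus; it is illustrative only, not the repository's LDA function.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stocks fell as markets reacted to rates",
    "investors watch interest rates and markets",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)                  # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-3:][::-1]]   # top words per topic
    print(f"topic {k}:", top)
```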
- Neural Network : Pure Python Neural Network Function for Classification or Regression Problems
Neural Network Lesson ( Colab Demo - Regression ) ( Colab Demo - Classification 1 ) ( Colab Demo - Classification 2 )
Convolutional Neural Network Lesson ( Colab Demo - Classification 1 ) ( Colab Demo - Classification 2 VGG16 ) ( Colab Demo - Object Detection )
Recurrent Neural Network Lesson ( Colab Demo - NLP ) ( Colab Demo - Forecasting 1 ) ( Colab Demo - Forecasting 2 )
Transformers Lesson ( Colab Demo - Translation ) ( Colab Demo - Voice Cloning & Text-to-Speech ) ( Colab Demo - Abstractive Summarization )
Vision Transformers Lesson ( Colab Demo - Object Detection )
Autoencoders Lesson ( Colab Demo - Dimension Reduction )
Variational Autoencoders Lesson ( Colab Demo - Shapes )
Generative Adversarial Networks Lesson ( Colab Demo - pixel2style2pixel ) ( Colab Demo - Numpy ) ( Colab Demo - Deepfake ) ( Colab Demo - Image Animation )
Diffusion Models Lesson ( Colab Demo - Stable Diffusion 1 ) ( Colab Demo - Stable Diffusion 2 ) ( Colab Demo - DDPM & DDIM )
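The flavour of the pure-Python neural network entry above can be captured in a few lines of NumPy: a one-hidden-layer network trained on XOR with plain backpropagation. This is a generic sketch, not the repository's code, and the architecture and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10_000):
    h = sigmoid(X @ W1 + b1)                      # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)           # backprop of squared-error loss
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out.ravel(), 3))   # predictions should move toward [0, 1, 1, 0]
```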
Reinforcement Learning Lesson
( Colab Demo - Q Learning - Gridworld ) ( Colab Demo - Q Learning - TicTacToe )
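The core of tabular Q-learning is the one-line value update Q(s,a) <- Q(s,a) + alpha [r + gamma max_a' Q(s',a') - Q(s,a)]. The sketch below applies it to a tiny invented corridor environment and is unrelated to the Gridworld notebook's code.

```python
import random

# States 0..4, goal at state 4; actions move left (-1) or right (+1).
n_states, actions = 5, [-1, +1]
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.3

for _ in range(300):                              # episodes
    s = 0
    while s != n_states - 1:
        if random.random() < epsilon:             # epsilon-greedy exploration
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda a_: Q[(s, a_)])
        s_next = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        best_next = max(Q[(s_next, a_)] for a_ in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s_next

policy = {s: max(actions, key=lambda a_: Q[(s, a_)]) for s in range(n_states - 1)}
print(policy)   # the greedy policy should be to move right (+1) in every non-goal state
```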
Deep Reinforcement Learning Lesson
( Colab Demo - DQN - Gridworld ) ( Colab Demo - A2C - Gridworld )
Forecasting Lessons:
- Lesson 01 - Introduction to Forecasting
- Lesson 02 - Time Series Decomposition
- Lesson 03 - Holt's Method
- Lesson 04 - Holt-Winters' Method
- Lesson 05 - Multiple Linear Regression
- Lesson 06 - Logistic Regression
- Moving Averages : Calculates the Centered Moving Average (Weighted, Simple or Exponential) of a Time Series
- Decomposition : Decomposition of Time Series Using the X-11 Algorithm
- Holt Method : Calculates the Additive or Multiplicative Holt's Method for Time Series with Trend
- Holt-Winters Method : Calculates the Additive or Multiplicative Holt-Winters' Method for Time Series with Trend and Seasonality
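Holt's additive method, for example, keeps separate smoothed estimates of level and trend and extrapolates them linearly. A plain-Python sketch with an invented series, not the repository's function:

```python
def holt_linear(series, alpha=0.3, beta=0.1, horizon=3):
    """Additive Holt's method: separate exponential smoothing of level and trend."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # h-step-ahead forecasts extrapolate the last level plus h times the trend.
    return [level + (h + 1) * trend for h in range(horizon)]

demand = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]   # hypothetical series
print([round(f, 1) for f in holt_linear(demand)])
```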
- MVDA Lessons ( R ): R Codes
Lesson 03 - Exploratory Factor Analysis
Lesson 04 - Multidimensional Scaling
Lesson 05 - Correspondence Analysis
Lesson 06 - Discriminant Analysis
Lesson 07 - Multiple Linear Regression
Lesson 08 - Logistic Regression (Binary)
Lesson 09 - Logistic Regression (Multinomial)
Lesson 10 - Confirmatory Factor Analysis
Lesson 11 - Canonical Correlation
- MVDA Lessons ( SPSS ):
Lesson 01 - Introduction
Lesson 02 - Scales & Descriptive Statistics
Lesson 03 - Exploratory Factor Analysis
Lesson 04 - Multidimensional Scaling
Lesson 05 - Correspondence Analysis
Lesson 06 - Discriminant Analysis
Lesson 07 - Multiple Linear Regression
Lesson 08 - Logistic Regression (Binary)
Lesson 09 - Logistic Regression (Multinomial)
Lesson 10 - Confirmatory Factor Analysis
Lesson 11 - Canonical Correlation
- LUDPP (LU Decomposition with Partial Pivoting) : Lower–Upper Decomposition with Partial Pivoting
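A quick way to see LU decomposition with partial pivoting in action is SciPy's lu routine, shown below on a small invented matrix; the repository's own function is not used here.

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

P, L, U = lu(A)                 # partial pivoting: A = P @ L @ U
print("L =\n", np.round(L, 3))
print("U =\n", np.round(U, 3))
print("reconstruction ok:", np.allclose(P @ L @ U, A))
```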