COURSE OBJECTIVE
An advanced course that applies classical artificial intelligence and quantum computing to increase the effectiveness of health diagnostics, build models for the drug development process, and classify images for early disease detection, among other applications.
Machine Learning (ML) in healthcare can help medical professionals develop better diagnostic tools to analyze medical images. For example, a machine learning algorithm can apply pattern recognition to medical images (such as X-rays or MRIs) to look for patterns that indicate a particular disease. This type of algorithm could help doctors make faster and more accurate diagnoses, resulting in better outcomes for patients.
Healthcare organizations and pharmaceutical companies can also use deep learning models to identify relevant information in data that could lead to drug discovery, the development of new drugs, and new treatments for diseases. For example, machine learning in healthcare could be used to analyze medical data and research from clinical trials to find previously unknown side effects of drugs. Applying machine learning to clinical trials in this way could help improve patient care, drug discovery, and the safety and effectiveness of medical procedures.
ML in healthcare can also help medical professionals improve the quality of patient care. For example, the healthcare industry could use deep learning algorithms to build systems that proactively monitor patients and send alerts to medical devices or electronic health records when a patient's status changes. This type of machine learning on collected data could help ensure that patients receive the right care at the right time.
Regarding ML, a module on advanced data processing is presented, covering, among other topics: sampling, exploratory analysis, outlier detection, advanced segmentation techniques, feature engineering, and classification algorithms.
During the course, ML and Deep Learning predictive models are presented, such as decision trees, neural networks, Bayesian networks, Support Vector Machines, and ensemble models. As for neural networks, the feed-forward, recurrent (RNN), convolutional (CNN), and generative adversarial (GAN) architectures are covered. In addition, probabilistic machine learning models such as Gaussian processes and Bayesian neural networks have been included.
Computer vision is a form of artificial intelligence (AI) and machine learning that allows computers to extract meaningful information from images and automate actions based on that information, quickly and at scale.
Computer vision can recognize patterns and make diagnoses from medical images with greater precision, greater speed, and fewer errors. It has the potential to extract information from medical images that is not visible to the human eye. Therefore, the course presents computer vision models for image classification built on powerful ML models.
During the course, real cases are addressed, including early detection of obesity using classical ML models and Quantum Machine Learning (QML), identification and categorization of diabetic retinopathy using convolutional neural networks, and drug discovery using generative adversarial networks (GANs).
QUANTUM COMPUTING
Quantum Machine Learning is the integration of quantum algorithms within machine learning programs. While classical machine learning algorithms process large amounts of data, quantum machine learning uses qubits and quantum operations, or specialized quantum systems, to speed up the computation and data handling performed by a program's algorithms. For example, some mathematical and numerical techniques from quantum physics are applicable to classical deep learning. A quantum neural network has computational capabilities that can decrease the number of steps, the qubits used, and the computing time.
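The qubit behavior described above can be sketched with plain NumPy: a two-qubit register prepared in a Bell state shows both superposition (from the Hadamard gate) and entanglement (from the CNOT). This is an illustrative simulation with hand-built gate matrices, not one of the course exercises:

```python
import numpy as np

# Single-qubit basis state and gates
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
I2 = np.eye(2, dtype=complex)
# CNOT on two qubits (control = qubit 0, target = qubit 1)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to qubit 0, then CNOT -> Bell state (|00> + |11>)/sqrt(2)
state = np.kron(zero, zero)
state = np.kron(H, I2) @ state
state = CNOT @ state

probs = np.abs(state) ** 2  # measurement probabilities for |00>, |01>, |10>, |11>
print(probs)  # -> [0.5 0.  0.  0.5]
```

Measuring this register yields 00 or 11 with equal probability and never 01 or 10, which is the correlation that entanglement provides.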
A key objective of the course is to show the use of quantum computing and tensor networks to improve the performance of machine learning algorithms.
Additionally, the course explains quantum computing, quantum circuits, important quantum algorithms, quantum mechanics, quantum error correction, and quantum machine learning.
The course explains applications of quantum computing, such as Quantum Machine Learning, in clinical and medical solutions for diabetes, esophageal cancer, and drug discovery. The improvement of quantum models over traditional ML models is presented; for example, for drug discovery a classical GAN neural network is compared with its quantum counterpart, the quantum GAN.
IMPORTANT
The great need to apply traditional and quantum artificial intelligence correctly in healthcare has led us to include a very advanced validation module, powerful model risk techniques, and probabilistic machine learning methodologies in order to understand the uncertainty in the results. We have also included an XAI module so that models are interpretable rather than black boxes.
WHO SHOULD ATTEND?
The course is aimed at healthcare professionals and laboratories interested in developing powerful artificial intelligence and quantum computing models applied to healthcare.
For a better understanding of the topics, participants should have a background in statistics and mathematics.
Schedules:

Europe: Mon-Fri, CEST 16-20 h

America: Mon-Fri, CDT 18-21 h

Asia: Mon-Fri, IST 18-21 h
Price: 7.900 €
Level: Advanced
Duration: 36 h
Material:

Presentations PDF

Exercises in Excel, R, Python, JupyterLab and TensorFlow
AGENDA
From Diagnosis to Treatment:
AI and Quantum Computing in Healthcare
Machine Learning
Module 1: Machine Learning

Definition of Machine Learning

Machine Learning Methodology

Data Storage

Abstraction

Generalization

Assessment


Supervised Learning

Unsupervised Learning

Reinforcement Learning

Deep Learning

Typology of Machine Learning algorithms

Steps to Implement an Algorithm

Information collection

Exploratory Analysis

Model Training

Model Evaluation

Model improvements

Machine Learning in consumer credit risk


Machine Learning in credit scoring models

Quantum Machine Learning
Module 2: EDA Exploratory Analysis

Data typology

Transactional data

Unstructured data embedded in text documents

Social Media Data

Data sources

Data review

Target definition

Time horizon of the target variable

Sampling

Random Sampling

Stratified Sampling

Rebalanced Sampling


Exploratory Analysis:

Histograms

Q-Q Plot

Moment analysis

boxplot


Treatment of Missing values

Multivariate Imputation Model


Advanced Outlier detection and treatment techniques

Univariate techniques: winsorizing and trimming

Multivariate Technique: Mahalanobis Distance


Exercise 1: EDA Exploratory Analysis
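The multivariate outlier technique listed above, the Mahalanobis distance, can be sketched in a few lines of NumPy. This is an illustrative example on synthetic data (not the course exercise): one obvious outlier is injected and then flagged against a chi-square cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
X = np.vstack([X, [[8.0, 8.0]]])  # inject one obvious outlier at row 200

mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mu
# Squared Mahalanobis distance of each row to the sample mean
d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

# Flag points beyond the chi-square 99% cutoff for 2 dimensions (~9.21)
outliers = np.where(d2 > 9.21)[0]
print(outliers)
```

Unlike a per-variable rule, the Mahalanobis distance accounts for the correlation structure, so it can flag points that look ordinary on each axis separately.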
Module 3: Feature Engineering

Data Standardization

Variable categorization

Equal Interval Binning

Equal Frequency Binning

Chi-Square Test


Binary coding

Binning

Kind of transformation

Univariate Analysis with Target variable

Variable Selection

Optimization of continuous variables

Optimization of categorical variables


Exercise 2: Detection and treatment of Advanced Outliers

Exercise 3: Stratified and Random Sampling in R

Exercise 4: Multivariate imputation model

Exercise 5: Univariate analysis in percentiles in R

Exercise 6: Continuous variable optimal univariate analysis in Excel

Exercise 7: Estimation of the KS, Gini and IV of each variable in Excel

Exercise 8: Feature Engineering of variables
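Equal-frequency binning, one of the categorization techniques listed above, can be sketched with NumPy quantiles. An illustrative example on synthetic data (not the course exercise):

```python
import numpy as np

def equal_frequency_bins(x, n_bins):
    """Assign each value to one of n_bins quantile-based bins (0..n_bins-1)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.searchsorted(edges, x, side='right')

rng = np.random.default_rng(42)
x = rng.exponential(size=1000)
bins = equal_frequency_bins(x, 4)
counts = np.bincount(bins, minlength=4)
print(counts)  # roughly 250 observations per bin
```

Equal-interval binning would instead split the range of x into four equal-width segments, which on skewed data like this leaves most observations in the first bin.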
Unsupervised Learning
Module 4: Unsupervised models

Hierarchical Clusters

K-Means

standard algorithm

Euclidean distance

Principal Component Analysis (PCA)

Advanced PCA Visualization

Eigenvectors and Eigenvalues

Exercise 9: Segmentation of the data with K-Means in R
Supervised Learning
Module 5: Logistic Regression and LASSO Regression

Econometric Models

Logit regression

probit regression

Piecewise Regression

survival models


Machine Learning Models

Lasso Regression

Ridge Regression


Exercise 10: Lasso Logistic Regression in R

Exercise 11: Ridge Regression in R
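Ridge regression, covered above, has a closed-form solution that makes the shrinkage effect easy to see. A minimal NumPy sketch on synthetic data (illustrative only; the course exercise uses R):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
y = X @ true_w + 0.1 * rng.normal(size=100)

def ridge(X, y, lam):
    """Closed-form ridge solution: w = (X'X + lam*I)^-1 X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

w_ols = ridge(X, y, 0.0)      # lam = 0 reduces to ordinary least squares
w_ridge = ridge(X, y, 50.0)   # a heavier penalty shrinks the coefficients
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # -> True
```

LASSO differs in that its L1 penalty has no closed form but can drive coefficients exactly to zero, which is why it doubles as a variable-selection tool.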
Module 6: Trees and KNN

Decision Trees

Modeling

Advantages and disadvantages

Recursion and Partitioning Processes

Recursive partitioning tree

Pruning Decision tree

Conditional inference tree

Tree display

Measurement of decision tree prediction

CHAID model

Model C5.0


K-Nearest Neighbors (KNN)

Modeling

Advantages and disadvantages

Euclidean distance

Manhattan distance

K value selection


Exercise 12: KNN and PCA
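The KNN classifier described above reduces to a few lines: compute Euclidean distances, take the k closest training points, and vote. A minimal sketch on toy data (illustrative, not the course exercise):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to x
    nearest = np.argsort(d)[:k]               # indices of the k closest points
    votes = y_train[nearest]
    return int(np.bincount(votes).argmax())   # majority class

# Two well-separated clusters, labeled 0 and 1
X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y_train = np.array([0, 0, 0, 1, 1, 1])
pred_a = knn_predict(X_train, y_train, np.array([0.5, 0.5]))
pred_b = knn_predict(X_train, y_train, np.array([5.5, 5.5]))
print(pred_a, pred_b)  # -> 0 1
```

Choosing k odd avoids ties in the binary case; swapping the norm for a sum of absolute differences gives the Manhattan-distance variant listed above.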
Module 7: Support Vector Machine SVM

Support Vector Classification

Support Vector Regression

optimal hyperplane

Support Vectors

Add costs

Advantages and disadvantages

SVM visualization

Tuning SVM

Kernel trick

Exercise 14: Support Vector Machine in R
Module 8: Ensemble Learning

Classification and regression ensemble models

Bagging

Bagging trees

Random Forest

Boosting

Adaboost

Gradient Boosting Trees

Xgboost

Advantages and disadvantages

Exercise 15: Boosting in R

Exercise 16: Bagging in R

Exercise 17: Random Forest, R and Python

Exercise 18: Gradient Boosting Trees
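Bagging, listed above, trains many weak learners on bootstrap resamples and aggregates their votes. A minimal NumPy sketch using single-feature decision stumps on synthetic data (illustrative only; the course exercises use R and Python libraries):

```python
import numpy as np

def fit_stump(X, y):
    """Best single-feature threshold split, scored by classification accuracy."""
    best = (0, 0.0, 0.0)  # (feature, threshold, accuracy)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            acc = ((X[:, j] > t) == y).mean()
            if acc > best[2]:
                best = (j, t, acc)
    return best[:2]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0.2).astype(int)  # the class depends on feature 0 only

# Bagging: fit stumps on bootstrap resamples, predict by majority vote
stumps = []
for _ in range(25):
    idx = rng.integers(0, len(X), len(X))  # bootstrap resample with replacement
    stumps.append(fit_stump(X[idx], y[idx]))

votes = np.mean([(X[:, j] > t).astype(int) for j, t in stumps], axis=0)
pred = (votes > 0.5).astype(int)
acc = (pred == y).mean()
print(acc)  # high accuracy on the training data
```

Random Forest extends this recipe by also sampling a random subset of features at each split, which decorrelates the trees and usually improves the ensemble further.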
Deep Learning
Module 9: Introduction to Deep Learning

Definition and concept of deep learning

Why use deep learning now?

Neural network architectures

Cost function

Gradient descent optimization

Use of deep learning

How many hidden layers?

How many neurons, 100, 1000?

How many epochs and what batch size?

What is the best activation function?


Hardware, CPU, GPU and cloud environments

Advantages and disadvantages of deep learning
Module 10: Deep Learning Feed Forward Neural Networks

Single Layer Perceptron

Multiple Layer Perceptron

Neural network architectures

Activation function

Sigmoidal

Rectified linear unit (ReLU)

ELU

SELU

Hyperbolic tangent (tanh)

Softmax

Other


Back propagation

Directional derivatives

Gradients

Jacobians

Chain rule

Optimization and local and global minima


Exercise 19: Deep Learning Feed Forward
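The back-propagation items above (chain rule, gradients, sigmoid activations) can be demonstrated end to end on the classic XOR problem with nothing but NumPy. An illustrative sketch, not the course exercise; the expected behavior is that training drives the squared-error loss well below its initial value:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# One hidden layer of 8 sigmoid units
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule on the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
loss = float(np.mean((out - y) ** 2))
print(out.ravel(), loss)
```

XOR is the standard example of a problem a single-layer perceptron cannot solve but a network with one hidden layer can, which motivates the architecture questions listed above.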
Module 11: Deep Learning Convolutional Neural Networks CNN

CNN for pictures

Design and architectures

Convolution operation

Gradient descent

Filters

Stride

Padding

Subsampling

Pooling

Fully connected

Exercise 20: Deep Learning CNN
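The convolution operation, stride, and padding items above can be made concrete with a tiny NumPy implementation. An illustrative sketch (not the course exercise): a horizontal-difference filter slid over a 4x4 image.

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """Cross-correlate a 2-D image with a 2-D kernel (valid region after padding)."""
    if padding:
        image = np.pad(image, padding)
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1   # output height
    ow = (image.shape[1] - kw) // stride + 1   # output width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = (patch * kernel).sum()
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
edge_kernel = np.array([[1.0, -1.0]])   # horizontal difference filter
out_img = conv2d(image, edge_kernel)
print(out_img)  # every horizontal step in this image equals -1
```

Pooling then subsamples such feature maps, and stacking convolution, pooling, and fully connected layers yields the CNN architectures above.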
Module 12: Deep Learning Recurrent Neural Networks RNN

Natural Language Processing

Natural Language Processing (NLP) text classification

Long Short-Term Memory (LSTM)

Hopfield

Bidirectional associative memory

Gradient descent

Global optimization methods

Unidirectional and bidirectional models

Deep Bidirectional Transformers for Language Understanding

Exercise 21: Deep Learning LSTM
Module 14: Generative Adversarial Networks (GANs)

Generative Adversarial Networks (GANs)

Fundamental components of the GANs

GAN architectures

Bidirectional GAN

Training generative models

Exercise 22: Deep Learning GANs
Module 15: Tuning Hyperparameters

Hyperparameterization

Grid search

Random search

Bayesian Optimization

Train test split ratio

Learning rate in optimization algorithms (e.g. gradient descent)

Selection of optimization algorithm (e.g., gradient descent, stochastic gradient descent, or Adam optimizer)

Activation function selection in a neural network layer (e.g. Sigmoid, ReLU, Tanh)

Selection of loss, cost and custom function

Number of hidden layers in an NN

Number of activation units in each layer

The dropout rate in nn (dropout probability)

Number of iterations (epochs) in training a nn

Number of clusters in a clustering task

Kernel or filter size in convolutional layers

Pooling size

Batch size

Exercise 23: Optimization of XGBoost, Random Forest and SVM

Exercise 24: Optimized Deep Learning
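Grid search, the first tuning strategy listed above, is just an exhaustive sweep over the Cartesian product of hyperparameter values. A minimal sketch where the cross-validation score is a stand-in function (a real run would train and validate a model at each point):

```python
import itertools
import numpy as np

def cv_score(lr, batch):
    """Stand-in validation score; a real run would train a model here.
    It is constructed to peak at lr = 0.01 and batch = 64."""
    return -(np.log10(lr) + 2) ** 2 - (batch - 64) ** 2 / 1000.0

grid = {'lr': [1e-3, 1e-2, 1e-1], 'batch': [32, 64, 128]}
best, best_score = None, -np.inf
for lr, batch in itertools.product(grid['lr'], grid['batch']):
    s = cv_score(lr, batch)
    if s > best_score:
        best, best_score = {'lr': lr, 'batch': batch}, s
print(best)  # -> {'lr': 0.01, 'batch': 64}
```

Random search draws points from the same space instead of enumerating it, and Bayesian optimization goes further by modeling the score surface to decide where to evaluate next.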
Probabilistic Machine Learning
Module 16: Probabilistic Machine Learning

Introduction to probabilistic machine learning

Gaussian models

Bayesian Statistics

Bayesian logistic regression

Kernel family

Gaussian processes

Gaussian processes for regression


Hidden Markov Model

Markov chain Monte Carlo (MCMC)

Metropolis Hastings algorithm


Machine Learning Probabilistic Model

Bayesian Boosting

Bayesian Neural Networks

Exercise 25: Gaussian process for regression

Exercise 26: Bayesian Neural Networks
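The Metropolis-Hastings algorithm listed above needs only a log-density and a proposal distribution. An illustrative random-walk sampler targeting a N(3, 1) density (not the course exercise):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of N(3, 1)."""
    return -0.5 * (x - 3.0) ** 2

x, samples = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(scale=1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, target(prop)/target(x))
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)

samples = np.array(samples[5000:])     # discard burn-in
print(samples.mean(), samples.std())   # close to 3 and 1
```

The same accept/reject recipe underlies MCMC inference for the Bayesian models above, where the target is a posterior over model parameters rather than a toy Gaussian.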
Model Validation
Module 17: Validation of traditional and Machine Learning models

Model validation

Validation of machine learning models

Regulatory validation of machine learning models in Europe

Out-of-sample and out-of-time validation

Checking p-values in regressions

R squared, MSE, MAD

Residual diagnostics

Goodness of Fit Test

Multicollinearity

Binary case confusion matrix

K-Fold Cross Validation

Model diagnostics

Exercise 27: Advanced validation of the regression

Exercise 28: Regression diagnostics

Exercise 29: K-Fold Cross Validation in R
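K-fold cross-validation, listed above, amounts to shuffling the row indices, splitting them into k disjoint folds, and using each fold once as the test set. A minimal index-generating sketch (illustrative; the course exercise uses R):

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)                 # shuffle once
    for fold in np.array_split(idx, k):      # k disjoint test folds
        train = np.setdiff1d(idx, fold)      # everything not in the fold
        yield train, fold

folds = list(kfold_indices(10, 5))
for train, test in folds:
    print(len(train), len(test))  # -> 8 2 for each of the 5 folds
```

Averaging the model's score over the k test folds gives a less optimistic estimate of out-of-sample performance than a single train/test split.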
Module 18: Advanced Validation of AI Models

Integration of state-of-the-art methods in interpretable machine learning and model diagnosis.

Data Pipeline

Feature Selection

Black-box Models

Post-hoc Explainability

Global Explainability

Local Explainability

Model Interpretability

Diagnosis: Accuracy, WeakSpot, Overfit, Reliability, Robustness, Resilience, Fairness

Model comparison

Comparative for Regression and Classification

Fairness Comparison


Exercise 30: Validation and diagnosis of advanced credit scoring models
Auto Machine Learning and XAI
Module 19: Automation of ML

What is modeling automation?

What is automated

Automation of machine learning processes

Optimizers and Evaluators

Modeling Automation Workflow Components

Hyperparameter optimization

Global evaluation of modeling automation

Implementation of modeling automation in banking

Technological requirements

Available tools

Benefits and possible ROI estimation

Main Issues

Genetic algorithms

Exercise 31: Automation of the modeling, optimization and validation of pricing models
Explainable Artificial Intelligence
Module 20: Explainable Artificial Intelligence XAI

Interpretability problem

Machine learning models

1. The challenge of interpreting the results,

2. The challenge of ensuring that management functions adequately understand the models, and

3. The challenge of justifying the results to supervisors


Black Box Models vs. Transparent and Interpretable Algorithms

Interpretability tools

SHAP: SHapley Additive exPlanations

Global Explanations

Dependency Plot

Decision Plot

Local Explanations Waterfall Plot


LIME: Local Interpretable Model-agnostic Explanations

Explainer Dashboard

Other advanced tools

Exercise 32: XAI interpretability of pricing
Quantum Computing
Module 21: Quantum computing and algorithms
Objective: Quantum computing applies quantum mechanical phenomena. On a small scale, physical matter exhibits properties of both particles and waves, and quantum computing takes advantage of this behavior using specialized hardware. The basic unit of information in quantum computing is the qubit, similar to the bit in traditional digital electronics. Unlike a classical bit, a qubit can exist in a superposition of its two basis states, meaning that it is in both states simultaneously.

Future of quantum computing in healthcare

Is it necessary to know quantum mechanics?

QIS Hardware and Apps

quantum operations

Qubit representation

Measurement

Superposition

Matrix multiplication

Qubit operations

Multiple Quantum Circuits

Entanglement

Deutsch Algorithm

Quantum Fourier transform and search algorithms

Hybrid quantum-classical algorithms

Quantum annealing, simulation and optimization of algorithms

Quantum machine learning algorithms

Exercise 33: Quantum operations multi-exercises
Module 22: Introduction to quantum mechanics

Quantum mechanical theory

Wave function

Schrödinger's equation

Statistical interpretation

Probability

Normalization

Momentum

The uncertainty principle

Mathematical Tools of Quantum Mechanics

Hilbert space and wave functions

The linear vector space

Hilbert space

Dimension and bases of a Vector Space

Integrable square functions: wave functions

Dirac notation

Operators

General definitions

Hermitian adjoint

Projection operators

Commutator algebra

Uncertainty relationship between two operators

Operator Functions

Inverse and Unitary Operators

Eigenvalues and Eigenvectors of an operator

Infinitesimal and finite unitary transformations

Matrices and Wave Mechanics

Matrix mechanics

Wave Mechanics

Exercise 34: Quantum mechanics multi-exercises
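Several of the operator concepts above (commutator algebra, Hermitian adjoints, unitary operators) can be checked numerically on the Pauli matrices with NumPy. An illustrative sketch, not the course exercise:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    """[A, B] = AB - BA; it is zero exactly when the operators commute."""
    return A @ B - B @ A

# The Pauli algebra: [X, Y] = 2iZ
print(np.allclose(commutator(X, Y), 2j * Z))     # -> True
# X is Hermitian (equals its adjoint) ...
print(np.allclose(X, X.conj().T))                # -> True
# ... and unitary (X times its adjoint is the identity)
print(np.allclose(X @ X.conj().T, np.eye(2)))    # -> True
```

A nonzero commutator between two observables is what the uncertainty relationship listed above quantifies: they cannot be measured simultaneously with arbitrary precision.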
Module 23: Introduction to quantum error correction

Error correction

From reversible classical error correction to simple quantum error correction

The quantum error correction criterion

The distance of a quantum error correction code

Content of the quantum error correction criterion and the quantum Hamming bound criterion

Digitization of quantum noise

Classical linear codes

Calderbank, Shor and Steane codes

Stabilizer Quantum Error Correction Codes

Exercise 35: Noise Model, Repetition Code and quantum circuit
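The intuition behind the repetition code in the exercise above can be shown classically: encode one bit as three, pass the copies through a bit-flip channel, and decode by majority vote. A Monte Carlo sketch of the classical analogue (the quantum version in the exercise must avoid copying states, but the error-rate arithmetic is the same):

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(bit):
    """Repetition code: one logical bit -> three physical bits."""
    return np.array([bit, bit, bit])

def apply_noise(code, p):
    """Flip each bit independently with probability p (bit-flip channel)."""
    flips = rng.random(3) < p
    return code ^ flips

def decode(code):
    """Majority vote corrects any single bit flip."""
    return int(code.sum() >= 2)

trials = 10000
raw_errors = corrected_errors = 0
for _ in range(trials):
    bit = 0
    noisy = apply_noise(encode(bit), p=0.1)
    raw_errors += int(rng.random() < 0.1)        # an unencoded bit flips w.p. p
    corrected_errors += int(decode(noisy) != bit)

print(raw_errors / trials, corrected_errors / trials)
```

With p = 0.1, the encoded error rate is 3p^2(1-p) + p^3, about 0.028 versus 0.10 unencoded: the code fails only when two or more of the three copies flip.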
Module 24: Quantum Computing II

Quantum programming

Solution Providers

IBM Quantum Qiskit

Amazon Braket

PennyLane

cirq

Quantum Development Kit (QDK)

Quantum clouds

Microsoft Quantum

Qiskit


Main Algorithms

Grover's algorithm

Deutsch–Jozsa algorithm

Fourier transform algorithm

Shor's algorithm


Quantum annealers

D-Wave implementation

Qiskit Implementation

Exercise 36: Quantum Circuits, Grover Algorithm Simulation, Fourier Transform and Shor
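Grover's algorithm, simulated in the exercise above, alternates an oracle (sign flip on the marked state) with a diffusion operator (inversion about the mean amplitude). For a two-qubit search space a single iteration already finds the target with certainty, which a small NumPy state-vector simulation shows (illustrative, not the course exercise):

```python
import numpy as np

n = 2                      # two qubits, four basis states
N = 2 ** n
marked = 3                 # search target |11>

# Uniform superposition over all basis states
state = np.ones(N) / np.sqrt(N)

# Oracle: flip the sign of the marked state's amplitude
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: inversion about the mean amplitude
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# For N = 4 a single Grover iteration finds the target with certainty
state = diffusion @ (oracle @ state)
probs = state ** 2
print(probs)  # -> [0. 0. 0. 1.]
```

For larger N the oracle-plus-diffusion step is repeated about (pi/4)*sqrt(N) times, which is the quadratic speedup over classical exhaustive search.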
Module 25: Quantum Machine Learning

Quantum Machine Learning

Hybrid models

Quantum Principal Component Analysis

Q-means vs. K-means

Variational Quantum Classifiers

Quantum Neural Network

Quantum Convolutional Neural Network

Quantum Long Short-Term Memory (QLSTM)


Quantum Support Vector Machine (QSVC)

Exercise 37: Quantum Support Vector Machine
Module 26: Tensor Networks for Machine Learning

What are tensor networks?

Quantum Entanglement

Tensor networks in machine learning

Tensor networks in unsupervised models

Tensor networks in SVM

Tensor networks in NN

Tensorizing neural networks

Application of tensor networks in credit scoring models

Exercise 38: Neural Network using tensor networks
Healthcare Models
Module 27: Early Obesity Detection Using AI and Quantum AI
Obesity is an epidemic disease, as being overweight or obese increases the risk of serious diseases such as diabetes, heart disease, hypertension and certain types of cancer that lead to premature death. However, early identification of the causative factors makes obesity highly preventable. With early detection as the goal, and given the advances in machine learning (ML) algorithms, several models used in obesity detection are evaluated against quantum ML algorithms.

Obesity in the world

What are the causes of obesity and overweight?

What are the health consequences of overweight and obesity?

How to reduce overweight and obesity?

Variables related to eating habits

Variables related to physical condition

Feature Engineering

Importance in variable selection

Treatment of outliers

Models used in recent years to detect obesity

Machine Learning Models

Support Vector Machine Regression

K-Nearest Neighbors

Random Forests

Gradient Boosting

Extreme Gradient Boosting


Quantum Machine Learning Algorithms

Qubit and Quantum States

Quantum circuits

Quantum Support Vector Machine

Variational quantum classifier

Quantum Neural Networks


Exercise 39: Random Forest, Gradient Boosting and Extreme Gradient Boosting to detect obesity

Exercise 40: Quantum Support Vector Machine and classical SVM to detect obesity

Exercise 41: Quantum Neural Networks to detect obesity

Exercise 42: Quantum Convolutional Neural Networks to detect obesity
Module 28: Diabetic Retinopathy Model with Convolutional Neural Networks
Diabetic retinopathy is one of the main causes of blindness in diabetic people between 25 and 65 years old. Injuries to the retina caused by weakened blood vessels can lead to vision loss and even total blindness. Current manual grading methods to detect diabetic retinopathy are time-consuming and error-prone. Convolutional neural networks have shown great promise for automating the identification and categorization of diabetic retinopathy.

Excessive blood sugar levels

Statistics in the world on diabetes

Diabetes-related retinopathy (DR)

Four phases
Mild nonproliferative DR (NPDR)

Moderate nonproliferative DR (NPDR)

Severe NPDR

Proliferative DR (PDR)

Image preprocessing and diagnosis process of diabetic retinopathy using CNN

Image classification case study
 Problem Statement

Deep Learning Problem Formulation

Project and Data Source

Image Dataset

Evaluation Metric

Exploratory Data Analysis

Image Preprocessing

Model Training

Productionizing

Image Classification: Datadriven Approach

Convolutional Neural Networks

Data preprocessing and the loss

Hyperparameters

Train/val/test splits

L1/L2 distances

Hyperparameter search

Optimization: Stochastic Gradient Descent

Optimization landscapes

Black Box Landscapes

Local search

Learning rate


Weight initialization

Batch normalization

Regularization (L2/dropout)

Loss functions

Gradient checks

Sanity checks

Hyperparameter optimization

Architectures

Convolution / Pooling Layers

Spatial arrangement

Layer patterns

Layer sizing patterns

AlexNet / ZFNet / DenseNet / VGGNet


Convolutional Neural Networks: t-SNE embeddings

Deconvnets

Data gradients

Fooling ConvNets

Human comparisons

Transfer Learning and Fine-tuning Convolutional Neural Networks

Performance Metrics

Accuracy

F1Score

AUCROC

Cohen's Kappa Coefficient


Exercise 43: Convolutional Neural Networks for diagnosis of diabetic retinopathy
Module 29: Drug Discovery Using GAN Neural Networks
Drug discovery refers to the process of identifying and developing new chemical compounds to create medications that can treat or cure diseases. One of the biggest obstacles in this process is designing a molecule with the necessary properties, since it requires numerous chemical and structural optimizations. Generative adversarial networks can learn the representation of chemical structures and drug properties from a data set and produce new chemical structures with properties similar to the training set. GANs thus have the potential to accelerate drug discovery by generating new compounds with desirable properties, reducing the time and cost required by traditional drug discovery methods.

Simplified Molecular Input Line Entry System (SMILES)

Drug Discovery Cycle

Generative Adversarial Networks (GANs)

Backpropagation in Discriminator and Generator Training

Encoding

GAN Variants

CGAN

LAPGAN

DCGAN

AAE

INFOGAN


Application of the GAN

Drug discovery

Drug development

Biomolecular

Targeting


Modified and Applied CNN architectures used as the GAN generators and discriminators

DCGAN generator and discriminator used for molecular compound fingerprint generation

Exercise 44: Drug discovery models using GAN neural networks
Quantum Computing for Healthcare
Module 30: Applications of quantum computing in clinical and medical solutions
Medicine, including health and life sciences, has witnessed a flurry of activities and experiments related to quantum computing in recent years. Initially these focused on biochemical and computational biology problems, but recently clinical and medical quantum solutions have attracted increasing interest. The rapid emergence of quantum computing in health and medicine calls for a mapping of this evolving landscape.

Impact of quantum computing in healthcare

Quantum support vector classifier (QSVC)

Virtual screening in drug discovery


QNN, QSVC

Classification of ischemic heart disease


Transfer learningbased QNN
 Classification of breast cancer

VQC

Classification of diabetes


QSVC

Classification of medication persistence for individuals with rheumatoid arthritis


Grover’s

DNA sequence alignment


Medical quantum computing challenges

Data Security

Replicability

Skill development


Exercise 45: Hybrid Quantum Classical Neural Networks for Diabetic Retinopathy

Exercise 46: Diabetes experiments using a classical Neural Network and a Quantum Neural Network

Exercise 47: Esophageal cancer tissue classifier using a classical Neural Network and a Quantum Neural Network

Exercise 48: Quantum GAN for drug discovery
Probabilistic Machine Learning for Healthcare
Module 31: Bayesian Neural Network
DL models have been used intensively in different healthcare tasks, such as disease diagnosis and treatment. DL techniques have outperformed classical machine learning algorithms and have proven to be powerful tools for many next-generation applications. Despite all that success, classical deep learning has limitations: its models tend to be overconfident in their predicted decisions because they do not know when they are wrong. For healthcare, this limitation can have a negative impact on model predictions, since almost all decisions regarding patients and diseases are sensitive. Therefore, Bayesian deep learning (BDL) has been developed to overcome these limitations. Unlike classical DL, BDL uses probability distributions for the model parameters, allowing all uncertainties associated with the predicted results to be estimated. In this sense, BDL offers a rigorous framework to quantify all sources of model uncertainty.

Convolutional Neural Networks (CNN)

Markov chain Monte Carlo (MCMC)

Metropolis Hastings algorithm


Monte Carlo Dropout (MC–DROPOUT)

Variational Inference (VI)

Bayesian Neural Networks

Exercise 49: Bayesian Neural Networks for Diabetic Retinopathy Detection
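The Monte Carlo dropout technique listed above can be sketched in NumPy: dropout is kept active at prediction time, and the spread of repeated stochastic forward passes serves as a crude estimate of predictive uncertainty. The single weight layer and the input here are random placeholders standing in for a trained model (illustrative only, not the course exercise):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "pretrained" layer: fixed weights mapping 4 features to 1 output
W = rng.normal(size=(4, 1))
x = rng.normal(size=(1, 4))

def forward(x, drop_p=0.5):
    """One stochastic forward pass with dropout kept ON at prediction time."""
    mask = (rng.random(x.shape) > drop_p) / (1 - drop_p)  # inverted dropout
    return ((x * mask) @ W).item()

# Monte Carlo dropout: many stochastic passes approximate a predictive
# distribution; its standard deviation is the uncertainty estimate.
preds = np.array([forward(x) for _ in range(1000)])
print(preds.mean(), preds.std())  # mean prediction and its spread
```

A wide spread flags inputs the model is unsure about, which is exactly the signal a clinical workflow can use to route a case to a human reviewer instead of trusting an overconfident point prediction.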