• Stars: 162
• Rank: 232,284 (Top 5%)
• Language: TeX
• Created: over 2 years ago
• Updated: almost 2 years ago

Repository Details

https://slds-lmu.github.io/seminar_multimodal_dl/

Multimodal Deep Learning

In the last few years, there have been several breakthroughs in the methodologies used in Natural Language Processing (NLP) as well as Computer Vision (CV). Beyond these improvements on single-modality models, large-scale multi-modal approaches have become a very active area of research.

In this seminar, we reviewed these approaches and aimed to give a solid overview of the field, starting with the current state-of-the-art approaches in each of the two subfields individually. We then discuss modeling frameworks in which one modality is transformed into the other, as well as models in which one modality is used to enhance representation learning for the other. To conclude the second part, we introduce architectures that focus on handling both modalities simultaneously. Finally, we also cover further modalities as well as general-purpose multi-modal models, which are able to handle different tasks on different modalities within one unified architecture. An interesting application (Generative Art) caps off this booklet.

How this book came about

This book is the result of a student seminar for the Master's programs in Statistics and Data Science at LMU Munich in the summer semester 2022. Each student wrote a specific chapter of the book to pass the seminar.

How to build the book

Step 0: Prerequisites

Make sure you have git and R up and running on your computer.
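A quick optional check from an R console (just a sanity check, not part of the build):

# Confirm that R and git are available
R.version.string   # prints the installed R version
Sys.which("git")   # a non-empty path means git is on the PATH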

Step 1: Clone the repository to your machine

With RStudio: https://support.rstudio.com/hc/en-us/articles/200532077-Version-Control-with-Git-and-SVN

With command-line:

git clone git@github.com:slds-lmu/seminar_multimodal_dl.git
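
If you have not set up SSH keys for GitHub, cloning over HTTPS should work as well:

git clone https://github.com/slds-lmu/seminar_multimodal_dl.git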

Step 2: Install dependencies

Start R in the project folder:

install.packages("devtools")
devtools::install_dev_deps()
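
devtools::install_dev_deps() installs the dependencies declared in the repository's DESCRIPTION file, so it expects the cloned folder to be the working directory. If R was started elsewhere, the path can be passed explicitly; the path below is only an example:

# Install the book's dependencies from its DESCRIPTION file
# (adjust the path to wherever you cloned the repository)
devtools::install_dev_deps("path/to/seminar_multimodal_dl")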

Step 3: Render the book (R commands)

# HTML
bookdown::render_book('./', 'bookdown::gitbook')
# PDF
bookdown::render_book('./', 'bookdown::pdf_book')
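
Unless the repository's _bookdown.yml changes output_dir, bookdown writes the rendered book to the _book/ folder. Assuming that default, the HTML version can be opened directly from R:

# Open the rendered HTML book in the default browser
# (assumes bookdown's default _book/ output directory)
browseURL("_book/index.html")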

More Repositories

1. lecture_i2ml (HTML, 147 stars): I2ML lecture repository
2. iml_methods_limitations (R, 55 stars): Seminar on Limitations of Interpretable Machine Learning Methods
3. yahpo_gym (Jupyter Notebook, 26 stars): Surrogate benchmarks for HPO problems
4. imlplots (R, 15 stars): Create Interpretable Machine Learning plots with an interactive Shiny based dashboard
5. lecture_dl4nlp (TeX, 15 stars): Repo containing all the lecture material for the dl4nlp course
6. paper_2019_iml_measures (TeX, 15 stars): Quantifying Interpretability of Arbitrary Machine Learning Models Through Functional Decomposition
7. lecture_iml (Jupyter Notebook, 12 stars)
8. code_pitfalls_iml (R, 12 stars): Code for all figures in the paper "General Pitfalls of Model-agnostic Interpretation Methods for Machine Learning Models"
9. i2ml (Makefile, 12 stars): https://slds-lmu.github.io/i2ml/
10. seminar_nlp_ss20 (TeX, 11 stars): Link to website
11. lecture_optimization (TeX, 11 stars)
12. tsclassification (R, 9 stars): Wrapping the Time-Series Classification Java Implementations for R
13. mosmafs (R, 8 stars): Multi-Objective Simultaneous Model and Feature Selection
14. latex-math (TeX, 7 stars)
15. qdo_yahpo (Python, 6 stars): A Collection of Quality Diversity Optimization Problems Derived from Hyperparameter Optimization of Machine Learning Models
16. lecture_i2ml_learnr_tutorials (CSS, 5 stars)
17. lrz_configs (R, 5 stars)
18. yahpo_data (R, 4 stars): Data required for slds-lmu/yahpo_gym
19. paper_2021_categorical_feature_encodings (R, 4 stars)
20. hpo_ela (R, 4 stars)
21. paper_2023_survival_benchmark (R, 4 stars): Benchmark for Burk et al. (2024)
22. iml-shiny-summary (R, 3 stars): Shiny Dashboard showing an interpretation summary for any model
23. i2dl (HTML, 3 stars): https://slds-lmu.github.io/i2dl/
24. wildlife-ml (Python, 3 stars)
25. paper_2021_xautoml (Jupyter Notebook, 3 stars)
26. lecture_sl (TeX, 2 stars)
27. vistool (R, 2 stars)
28. wildlife-experiments (Jupyter Notebook, 2 stars)
29. seminar_website_skeleton (TeX, 2 stars)
30. jTSC4R (Java, 2 stars): Java Time Series Classification code to use in R
31. paper_2021_multi_fidelity_surrogates (R, 2 stars): Surrogate benchmarks for HPO problems
32. surrogates (R, 2 stars)
33. grouped_feat_imp_and_effects (R, 2 stars)
34. dl4nlp (Makefile, 2 stars): https://slds-lmu.github.io/dl4nlp/
35. rcourses_notebook_deeplearning (Jupyter Notebook, 2 stars): Sketch
36. paper_2019_multiobjective_rfms (Jupyter Notebook, 2 stars): High Dimensional Restrictive Federated Model Selection with multi-objective Bayesian Optimization over shifted distributions
37. lecture_advml (TeX, 1 star)
38. yahpo_exps (R, 1 star): Experiments for yahpo gym
39. lecture_template (TeX, 1 star)
40. rcourses_notebook_clustering (Jupyter Notebook, 1 star)
41. paper_2019_variationalResampleDistributionShift (Python, 1 star): Variational Resampling Based Assessment of Deep Neural Networks Robustness under Distribution Shift
42. mlw-htr (Jupyter Notebook, 1 star)
43. ame (R, 1 star): Average marginal effects for machine learning
44. qdo_nas (R, 1 star)
45. paper_2024_rpid (R, 1 star)
46. benchmark_2022_counterfactuals (R, 1 star): Benchmark code for the paper on the counterfactuals R package
47. phd_thesis_dummy_template (TeX, 1 star)
48. rcourses_notebook_ml (Jupyter Notebook, 1 star)
49. mbt_comparison (R, 1 star)
50. mobafeas (R, 1 star): Model Based Feature Selection
51. introduction_iml_bliz_summerschool (HTML, 1 star): A short introduction to the most popular model-agnostic IML (interpretable machine learning) methods, including a presentation of the theoretical background with examples and exercises on real-world data
52. lecture_i2dl (Jupyter Notebook, 1 star): Introduction to Deep Learning
53. lecture_service (R, 1 star): Service repo for common infrastructure across all open source lectures
54. paper_2023_eagga (R, 1 star): Multi-Objective Optimization of Performance and Interpretability of Tabular Supervised Machine Learning Models
55. paper_2023_regression_suite (R, 1 star)