Rémi Flamary

Professional website

News

This page contains the archive of news for this website.

NeurIPS 2023

2023-12-01

I will be at NeurIPS 2023 in New Orleans, where I will present two posters with my awesome co-authors. I am also an invited speaker at the Optimal Transport for Machine Learning Workshop (OTML).

Feel free to come and see me and my collaborators at our posters or during the OTML workshop (we also have posters there).

T. Gnassounou, R. Flamary, A. Gramfort, Convolutional Monge Mapping Normalization for learning on biosignals, Neural Information Processing Systems (NeurIPS), 2023.
Abstract: In many machine learning applications on signals and biomedical data, especially electroencephalogram (EEG), one major challenge is the variability of the data across subjects, sessions, and hardware devices. In this work, we propose a new method called Convolutional Monge Mapping Normalization (CMMN), which consists in filtering the signals in order to adapt their power spectrum density (PSD) to a Wasserstein barycenter estimated on training data. CMMN relies on novel closed-form solutions for optimal transport mappings and barycenters and provides individual test time adaptation to new data without needing to retrain a prediction model. Numerical experiments on sleep EEG data show that CMMN leads to significant and consistent performance gains independent from the neural network architecture when adapting between subjects, sessions, and even datasets collected with different hardware. Notably our performance gain is on par with much more numerically intensive Domain Adaptation (DA) methods and can be used in conjunction with those for even better performances.
BibTeX:
@inproceedings{gnassounou2023convolutional,
author = {Gnassounou, Théo and Flamary, Rémi and Gramfort, Alexandre},
title = {Convolutional Monge Mapping Normalization for learning on biosignals},
booktitle = {Neural Information Processing Systems (NeurIPS)},
editor = {},
year = {2023}
} 
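The closed-form 1D Gaussian case gives a quick feel for the Monge mappings CMMN builds on. Below is a toy NumPy sketch of the textbook map between two Gaussians; the means and standard deviations are illustrative values of mine, not the paper's PSD-based mapping:

```python
import numpy as np

# Toy illustration (not the paper's method): the closed-form Monge map
# between 1D Gaussians N(m_s, s_s^2) and N(m_t, s_t^2) is
#   T(x) = m_t + (s_t / s_s) * (x - m_s).
m_s, s_s = 2.0, 1.5    # hypothetical source mean / std
m_t, s_t = -1.0, 0.5   # hypothetical target mean / std

rng = np.random.default_rng(0)
x = rng.normal(m_s, s_s, size=100_000)  # source samples
t_x = m_t + (s_t / s_s) * (x - m_s)     # mapped samples follow the target law

print(round(t_x.mean(), 1), round(t_x.std(), 1))  # close to (-1.0, 0.5)
```

CMMN applies the same idea in the spectral domain, mapping each signal's power spectrum toward a barycenter instead of one Gaussian toward another.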
H. Van Assel, T. Vayer, R. Flamary, N. Courty, SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities, Neural Information Processing Systems (NeurIPS), 2023.
Abstract: Many approaches in machine learning rely on a weighted graph to encode the similarities between samples in a dataset. Entropic affinities (EAs), which are notably used in the popular Dimensionality Reduction (DR) algorithm t-SNE, are particular instances of such graphs. To ensure robustness to heterogeneous sampling densities, EAs assign a kernel bandwidth parameter to every sample in such a way that the entropy of each row in the affinity matrix is kept constant at a specific value, whose exponential is known as perplexity. EAs are inherently asymmetric and row-wise stochastic, but they are used in DR approaches after undergoing heuristic symmetrization methods that violate both the row-wise constant entropy and stochasticity properties. In this work, we uncover a novel characterization of EA as an optimal transport problem, allowing a natural symmetrization that can be computed efficiently using dual ascent. The corresponding novel affinity matrix derives advantages from symmetric doubly stochastic normalization in terms of clustering performance, while also effectively controlling the entropy of each row thus making it particularly robust to varying noise levels. Following, we present a new DR algorithm, SNEkhorn, that leverages this new affinity matrix. We show its clear superiority to state-of-the-art approaches with several indicators on both synthetic and real-world datasets.
BibTeX:
@inproceedings{van2023snekhorn,
author = {Van Assel, Hugues and Vayer, Titouan and Flamary, Rémi and Courty, Nicolas},
title = {SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities},
booktitle = {Neural Information Processing Systems (NeurIPS)},
editor = {},
year = {2023}
} 
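The symmetric doubly stochastic normalization mentioned in the abstract generalizes classic Sinkhorn-Knopp matrix scaling. Here is a minimal NumPy sketch of that textbook scaling on a toy Gaussian affinity matrix; this is not SNEkhorn's dual-ascent algorithm, only the baseline normalization it builds on:

```python
import numpy as np

# Textbook Sinkhorn-Knopp scaling of a positive affinity matrix to a
# doubly stochastic one (NOT the paper's symmetric entropic-affinity solver).
def sinkhorn_knopp(K, n_iter=5000):
    u = np.ones(K.shape[0])
    v = np.ones(K.shape[1])
    for _ in range(n_iter):
        u = 1.0 / (K @ v)      # enforce unit row sums
        v = 1.0 / (K.T @ u)    # enforce unit column sums
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))                         # toy samples
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
K = np.exp(-D)                                      # symmetric Gaussian affinity
P = sinkhorn_knopp(K)

# For symmetric K the scaled matrix is symmetric and doubly stochastic.
print(np.allclose(P.sum(0), 1, atol=1e-6), np.allclose(P, P.T, atol=1e-6))
```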

The least effort theory and its applications to artificial intelligence

2023-04-12

On March 13, 2023, Gabriel Peyré and I gave a talk for a general audience at Sorbonne University (Jussieu campus) in Paris, where we discussed the use of optimal transport and the least effort theory in artificial intelligence applications.

I provide here the slides of the presentation (in French) and the link to the YouTube video.

Oral presentation at NeurIPS 2022

2022-11-20

The thesis work of Cédric Vincent-Cuaz on Optimal Transport for Graph Neural Networks has been accepted for a very selective oral presentation at NeurIPS 2022.

Cédric and I will be in New Orleans for NeurIPS. Feel free to come and see us at our poster.

C. Vincent-Cuaz, R. Flamary, M. Corneli, T. Vayer, N. Courty, Template based Graph Neural Network with Optimal Transport Distances, Neural Information Processing Systems (NeurIPS), 2022.
Abstract: Current Graph Neural Networks (GNN) architectures generally rely on two important components: node features embedding through message passing, and aggregation with a specialized form of pooling. The structural (or topological) information is implicitly taken into account in these two steps. We propose in this work a novel point of view, which places distances to some learnable graph templates at the core of the graph representation. This distance embedding is constructed thanks to an optimal transport distance: the Fused Gromov-Wasserstein (FGW) distance, which encodes simultaneously feature and structure dissimilarities by solving a soft graph-matching problem. We postulate that the vector of FGW distances to a set of template graphs has a strong discriminative power, which is then fed to a non-linear classifier for final predictions. Distance embedding can be seen as a new layer, and can leverage on existing message passing techniques to promote sensible feature representations. Interestingly enough, in our work the optimal set of template graphs is also learnt in an end-to-end fashion by differentiating through this layer. After describing the corresponding learning procedure, we empirically validate our claim on several synthetic and real life graph classification datasets, where our method is competitive or surpasses kernel and GNN state-of-the-art approaches. We complete our experiments by an ablation study and a sensitivity analysis to parameters.
BibTeX:
@inproceedings{vincentcuaz2022template,
author = {Vincent-Cuaz, Cédric and Flamary, Rémi and Corneli, Marco and Vayer, Titouan and Courty, Nicolas},
title = {Template based Graph Neural Network with Optimal Transport Distances},
booktitle = {Neural Information Processing Systems (NeurIPS)},
editor = {},
year = {2022}
} 

Optimal Transport for Machine Learning tutorial at Hi! Paris Summer School 2022

2022-06-15

I will be giving a tutorial about Optimal Transport for Machine Learning at the Hi! Paris Summer School 2022 on July 4, 2022 at École Polytechnique in Paris/Saclay, France.

The presentation slides are available below:

  • Part 1: Intro to Optimal Transport [PDF].
  • Part 2: Optimal Transport for Machine Learning [PDF].

Release of POT Python Optimal Transport 0.8.2

2022-04-21

This release introduces several notable new features. The least important but most exciting one is that we now have a logo for the toolbox:

POT logo

Other new features include:

  • Stochastic loss and OT plan computation for regularized OT and backend examples
  • Implementation of factored OT with emd and sinkhorn
  • Backend implementation for free support barycenters
  • Weak OT solver + example
  • Backend support for Domain Adaptation and Unbalanced solvers
  • (F)GW linear dictionary learning solvers + example
  • New minimization-maximization algorithms for solving exact Unbalanced OT + example

More details are in the release notes.

Release of the version 0.8 of POT Python Optimal Transport

2021-11-05

As the maintainer of the POT Python Optimal Transport toolbox I am very happy to announce the new release 0.8 of the toolbox. It contains several new major features:

  • OpenMP implementation for the exact OT solver.
  • Backend for solving OT problems on NumPy/PyTorch/JAX arrays (CPU or GPU)
  • Differentiable solvers for compatible backends.
  • Several new examples in the documentation
  • Compiled ARM wheels for Mac and Raspberry Pi.

More details are in the release notes.

Elected as an ELLIS Scholar

2021-10-23

I am honored to have been elected as an ELLIS Scholar in the Paris ELLIS Unit. ELLIS, the European Lab for Learning and Intelligent Systems, aims to promote machine learning and modern AI research in Europe.

Optimal Transport for Machine Learning Workshop at NeurIPS 2021

2021-09-05

We are organizing the fourth OTML Workshop at NeurIPS 2021, on December 13, 2021, together with Jason Altschuler, Charlotte Bunne, Laetitia Chapel, Alexandra Suvorikova, Marco Cuturi and Gabriel Peyré.


This workshop is organized with the following partners: ELLIS, 3IA Côte d'Azur, Prairie Institute.

Tutorial at SIAM Annual Meeting 2021

2021-06-10

I have been invited to give a short tutorial about the use of optimal transport in machine learning applications at the SIAM Annual Meetings 2021.

You can find the presentation slides here.

Optimal Transport for Machine Learning Workshop at NeurIPS 2019

2019-09-02

We are organizing the third OTML Workshop at NeurIPS 2019, on December 13-14, 2019, together with Alexandra Suvorikova, Marco Cuturi and Gabriel Peyré.

The list of invited speakers and the call for contribution are both available on the Workshop website.

Optimal Transport for Machine Learning Tutorial at ISBI 2019

2019-04-08

I have given a 3h tutorial about Optimal transport for machine learning at ISBI 2019.

You can find the presentation slides and the practical session Python notebook here.

Optimal Transport at Data Science Summer School 2018

2018-06-19

We will be giving two one day courses with Marco Cuturi and Nicolas Courty about Optimal transport for machine learning for the Data Science Summer School 2018 (DS3) at Ecole Polytechnique in Paris/Saclay, France.

You can find the presentation slides and the practical session Python notebook on Github.

Optimal Transport at Statlearn 2018

2018-04-05

We have given a one day course with Nicolas Courty about Optimal transport for machine learning for the Statlearn 2018 summer school in Nice, France.

You can find the presentation slides and the practical session Python notebook on Github.

Talk at GDR ISIS General Meeting

2017-11-17

I had the honor of being invited to give a talk at the GDR ISIS General Meeting in Sète.

I presented a short introduction to optimal transport and discussed some recent applications of OT in the machine learning community. The slides (in English) are available here.

Domain adaptation paper accepted at NIPS 2017 and OTML 2017 Workshop

2017-09-17

My collaborators and I have been accepted to present the following paper at NIPS 2017

N. Courty, R. Flamary, A. Habrard, A. Rakotomamonjy, Joint Distribution Optimal Transportation for Domain Adaptation, Neural Information Processing Systems (NIPS), 2017.
Abstract: This paper deals with the unsupervised domain adaptation problem, where one wants to estimate a prediction function f in a given target domain without any labeled sample by exploiting the knowledge available from a source domain where labels are known. Our work makes the following assumption: there exists a non-linear transformation between the joint feature/label space distributions of the two domain Ps and Pt. We propose a solution of this problem with optimal transport, that allows to recover an estimated target Pft=(X,f(X)) by optimizing simultaneously the optimal coupling and f. We show that our method corresponds to the minimization of a bound on the target error, and provide an efficient algorithmic solution, for which convergence is proved. The versatility of our approach, both in terms of class of hypothesis or loss functions is demonstrated with real world classification and regression problems, for which we reach or surpass state-of-the-art results.
BibTeX:
@inproceedings{courty2017joint,
author = {Courty, Nicolas and Flamary, Remi and Habrard, Amaury and Rakotomamonjy, Alain},
title = {Joint Distribution Optimal Transportation for Domain Adaptation},
booktitle = {Neural Information Processing Systems (NIPS)},
editor = {},
year = {2017}
} 

I have been invited to present at the OTML 2017 Workshop and we also have two additional posters there.

Feel free to come and see us at our NIPS poster or at the workshop.

POT Python Optimal Transport library

2016-11-07

We recently proposed a general-purpose Python library for Optimal Transport called POT. The library is available on GitHub and can be easily installed from PyPI. The toolbox implements a number of solvers from the image and machine learning literature (see the README and the Documentation for more details).

We also give several examples of potential uses of OT in the form of Python scripts and rendered Python notebooks that show the toolbox in use without requiring a Python installation.

Here is a list of the Python notebooks if you want a quick look:

Feel free to use and contribute to the library.

Two papers in optimal transport accepted at NIPS 2016

2016-08-04

My collaborators and I have been accepted to present the following two papers at NIPS 2016

R. Flamary, C. Févotte, N. Courty, V. Emiya, Optimal spectral transportation with application to music transcription, Neural Information Processing Systems (NIPS), 2016.
Abstract: Many spectral unmixing methods rely on the non-negative decomposition of spectral data onto a dictionary of spectral templates. In particular, state-of-the-art music transcription systems decompose the spectrogram of the input signal onto a dictionary of representative note spectra. The typical measures of fit used to quantify the adequacy of the decomposition compare the data and template entries frequency-wise. As such, small displacements of energy from a frequency bin to another as well as variations of timber can disproportionally harm the fit. We address these issues by means of optimal transportation and propose a new measure of fit that treats the frequency distributions of energy holistically as opposed to frequency-wise. Building on the harmonic nature of sound, the new measure is invariant to shifts of energy to harmonically-related frequencies, as well as to small and local displacements of energy. Equipped with this new measure of fit, the dictionary of note templates can be considerably simplified to a set of Dirac vectors located at the target fundamental frequencies (musical pitch values). This in turns gives ground to a very fast and simple decomposition algorithm that achieves state-of-the-art performance on real musical data.
BibTeX:
@inproceedings{flamary2016ost,
author = {Flamary, Rémi and Févotte, Cédric and Courty, Nicolas and Emiya, Valentin},
title = {Optimal spectral transportation with application to music transcription},
booktitle = {Neural Information Processing Systems (NIPS)},
editor = {},
year = {2016}
} 
M. Perrot, N. Courty, R. Flamary, A. Habrard, Mapping estimation for discrete optimal transport, Neural Information Processing Systems (NIPS), 2016.
Abstract: We are interested in the computation of the transport map of an Optimal Transport problem. Most of the computational approaches of Optimal Transport use the Kantorovich relaxation of the problem to learn a probabilistic coupling but do not address the problem of learning the transport map linked to the original Monge problem. Consequently, it lowers the potential usage of such methods in contexts where out-of-samples computations are mandatory. In this paper we propose a new way to jointly learn the coupling and an approximation of the transport map. We use a jointly convex formulation which can be efficiently optimized. Additionally, jointly learning the coupling and the transport map allows to smooth the result of the Optimal Transport and generalize it on out-of-samples examples. Empirically, we show the interest and the relevance of our method in two tasks: domain adaptation and image editing.
BibTeX:
@inproceedings{perrot2016mapping,
author = {Perrot, M. and Courty, N. and Flamary, R. and Habrard, A.},
title = {Mapping estimation for discrete optimal transport},
booktitle = {Neural Information Processing Systems (NIPS)},
editor = {},
year = {2016}
} 

Feel free to come and see us at our posters, we will have real life demonstrations of audio musical annotation and seamless copy in images.

Helava award of the best paper in ISPRS Journal period 2012-2015

2016-07-06

Our paper has been selected for the Helava Award, i.e. best paper in the ISPRS Journal of Photogrammetry and Remote Sensing for the 2012-2015 period.

D. Tuia, R. Flamary, N. Courty, Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions, ISPRS Journal of Photogrammetry and Remote Sensing, 2015.
Abstract: In this paper, we tackle the question of discovering an effective set of spatial filters to solve hyperspectral classification problems. Instead of fixing a priori the filters and their parameters using expert knowledge, we let the model find them within random draws in the (possibly infinite) space of possible filters. We define an active set feature learner that includes in the model only features that improve the classifier. To this end, we consider a fast and linear classifier, multiclass logistic classification, and show that with a good representation (the filters discovered), such a simple classifier can reach at least state of the art performances. We apply the proposed active set learner in four hyperspectral image classification problems, including agricultural and urban classification at different resolutions, as well as multimodal data. We also propose a hierarchical setting, which allows to generate more complex banks of features that can better describe the nonlinearities present in the data.
BibTeX:
@article{tuia2015multiclass,
author = {Tuia, D. and Flamary, R. and Courty, N.},
title = {Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions},
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
editor = {},
year = {2015}
} 

It is a great honor for us, and I will be at the ISPRS Congress 2016 on July 12 to receive the prize on behalf of all the authors. This is joint work with Devis Tuia and Nicolas Courty.

Statlearn 2016

2016-03-20

I have been invited to the Statlearn 2016 Workshop in Vannes, France.

On April 8, I will present our work on Optimal Transport for Domain Adaptation, in collaboration with Nicolas Courty, Devis Tuia and Alain Rakotomamonjy. The slides are available here.

PhD thesis proposal

2015-04-03

Cédric Richard and I are proposing a PhD thesis, starting in 2015, on Distributed estimation over multitask networks.

If you are interested, contact Cédric or me before June 6.

For more details see the complete proposal.

BasMatI Summer School

2015-02-13

We are organizing a French summer school with Céline Theys, David Mary and Claude Aime about Mathematics for signal and image processing in astronomy. You can find more information on the website.

Best paper at PCV 2014

2014-09-10

Our paper has been chosen for a best paper award at the Photogrammetric Computer Vision symposium (PCV 2014).

D. Tuia, N. Courty, R. Flamary, A group-lasso active set strategy for multiclass hyperspectral image classification, Photogrammetric Computer Vision (PCV), 2014.
Abstract: Hyperspectral images have a strong potential for landcover/landuse classification, since the spectra of the pixels can highlight subtle differences between materials and provide information beyond the visible spectrum. Yet, a limitation of most current approaches is the hypothesis of spatial independence between samples: images are spatially correlated and the classification map should exhibit spatial regularity. One way of integrating spatial smoothness is to augment the input spectral space with filtered versions of the bands. However, open questions remain, such as the selection of the bands to be filtered, or the filterbank to be used. In this paper, we consider the entirety of the possible spatial filters by using an incremental feature learning strategy that assesses whether a candidate feature would improve the model if added to the current input space. Our approach is based on a multiclass logistic classifier with group-lasso regularization. The optimization of this classifier yields an optimality condition, that can easily be used to assess the interest of a candidate feature without retraining the model, thus allowing drastic savings in computational time. We apply the proposed method to three challenging hyperspectral classification scenarios, including agricultural and urban data, and study both the ability of the incremental setting to learn features that always improve the model and the nature of the features selected.
BibTeX:
@inproceedings{tuia2014grouplasso,
author = {Tuia, D. and Courty, N. and Flamary, R.},
title = {A group-lasso active set strategy for multiclass hyperspectral image classification},
booktitle = {Photogrammetric Computer Vision (PCV)},
editor = {},
year = {2014}
} 

This is joint work with Devis Tuia and Nicolas Courty.

ICASSP 2014

2014-04-30

I will be at ICASSP 2014 in Florence. I will present the paper Active set strategy for high-dimensional non-convex sparse optimization problems on Wednesday, May 7, in the special session Optimization algorithms for high dimensional signal processing.

This is joint work with Aurélie Boisbunon and Alain Rakotomamonjy.

AMOR project is online

2013-12-04

The AMOR project is a young researchers' project financed by the GdR ISIS and the GRETSI association.

The web page of the project is now available here.

Talk about learning with infinitely many features

2013-01-02

I have been invited to present our work about learning with infinitely many features at a GDR ISIS meeting.

The slides of the presentation (in English) are now available here.

New SVM Toolbox

2012-09-10

The code for my linear SVM toolbox is now available in the software section of the website. It can learn linear SVM with a wide class of regularization terms such as the l1 norm or the l1-lp mixed norms.

This toolbox is written in Matlab, and the solver is a Forward-Backward Splitting algorithm from the FISTA paper.

Go to the G-SVM page for more information and for downloading the source code.

Website update

2012-07-25

I just updated this website! The software page is now cleaner and the website is now generated by Webgen.py instead of webgen.

Moreover you can now download my PhD thesis (in french) from here or from thèse en ligne.

PhD Defence

2011-12-06

I defended my PhD at Rouen University on December 6. The final manuscript (in French) will be available shortly, and the presentation slides are available here.

The jury was composed of:

Reviewers:

  • Pr. Jalal Fadili, ENSICAEN
  • Dr. Michele Sebag, LRI CNRS

Examiners:

  • Pr. Jocelyn Chanussot, INP Grenoble
  • Pr. Liva Ralaivola, Université Aix-Marseille II
  • Pr. Stéphane Canu, INSA Rouen

PhD advisor:

  • Pr. Alain Rakotomamonjy, Université de Rouen

Source code for SVM with uncertain labels

2011-07-06

This year, I was at the SSP 2011 workshop in Nice, France. I had the pleasure of presenting our work with Emilie Niaf about SVM classification with uncertain labels and probabilistic prediction. The poster is now online. Note that the Matlab source code of our approach can now be downloaded.

BCI poster at CAP 2011

2011-05-10

CAP is a French machine learning conference that will be held May 17-20, 2011 in Chambéry. This year, I will be presenting a poster about BCI for the general public.

MLSP 2010 Presentation

2010-08-31

I provide here the presentation given at MLSP for the paper “Learning spatial filters for multispectral image segmentation”. The draft version of the paper will be available shortly.

MLSP 2010

2010-08-22

I’m going to MLSP 2010 in Finland, where I will present joint work with Devis Tuia and Gustau Camps-Valls from IPL in Valencia. We propose a method to learn large margin spatial filters for image segmentation.

The presentation and the paper will be shortly available on this website.

CAp 2010 / BCI in Paris

2010-06-20

In May, I presented my work on large margin filtering for non-linear problems at the French conference CAp 2010 and at a BCI meeting in Paris organized by Cédric Gouy-Pailler.

The slides are in French but the paper is in English.

MLSP 2010 Competition

2010-04-28

This year, the MLSP Workshop (Machine Learning for Signal Processing) organized a BCI competition: “MIND READING”. The aim was to predict rare events during a rapid image presentation. 35 international teams took part in this competition.

The BCI LITIS team obtained good results (ranks 3, 5, 6 and 8 of 35) in the competition. Results are available here.

Team members:

  • Rémi Flamary (2)
  • Benjamin Labbé (1)
  • Grégoire Mesnil (2)
  • Xilan Tian(1)
  • Florian Yger (2)
  • Alain Rakotomamonjy (2) – study supervision
  • Gilles Gasso (1) – study supervision

Institutions:

  • (1) INSA de Rouen – LITIS
  • (2) Université de Rouen – LITIS

This news item was adapted from the LITIS website.

ICASSP 2010

2010-03-18

I will be at the ICASSP 2010 conference, where I will present my work on Large Margin Filtering.

You can download the poster or the paper.

Presentation at CREATIS

2010-01-27

In January, I gave a presentation at the CREATIS Laboratory about SVM methods, kernel learning and large margin filtering. The slides (in French) can be downloaded.

MLSP 2009 In Grenoble

2009-07-30

I will present my paper Variational Sequence Labeling at the Machine Learning for Signal Processing Workshop 2009 (MLSP 09) on September 2.

This paper was written jointly with Jean Loïc Rose.

The presentation may be downloaded here.

BCI Competition IV

2009-04-28

The results of the fourth BCI Competition have been published. We achieved second place on Dataset 4.

The goal was to determine the position of a subject's fingers using only their brain signals.