Rémi Flamary

Professional website

Home


I am an associate professor at Nice-Sophia Antipolis University, in the Department of Electronics and in the Lagrange Laboratory, which is part of the Observatoire de la Côte d'Azur. I was previously a PhD student and teaching assistant at the LITIS Laboratory, under the supervision of Alain Rakotomamonjy, at Rouen University.

On this website, you can find a list of my publications and download the corresponding software/code. Some of my French teaching material is also available.

Research Interests

  • Machine Learning
  • Statistical signal processing
    • Classification and segmentation of signals and images
    • Filter learning, image reconstruction
    • Sparse and non-convex optimization
  • Applications
    • Biomedical engineering, Brain-Computer Interfaces
    • Remote sensing and hyperspectral imaging
    • Astronomical image processing

Wordcloud of my research interests.

Recent work

R. Flamary, A. Rakotomamonjy, G. Gasso, "Importance Sampling Strategy for Non-Convex Randomized Block-Coordinate Descent", IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2015.
Abstract: As the number of samples and dimensionality of optimization problems related to statistics and machine learning explode, block coordinate descent algorithms have gained popularity since they reduce the original problem to several smaller ones. Coordinates to be optimized are usually selected randomly according to a given probability distribution. We introduce an importance sampling strategy that helps randomized coordinate descent algorithms to focus on blocks that are still far from convergence. The framework applies to problems composed of the sum of two possibly non-convex terms, one being separable and non-smooth. We have compared our algorithm to a full gradient proximal approach as well as to a randomized block coordinate algorithm that considers uniform sampling and cyclic block coordinate descent. Experimental evidence shows the clear benefit of using an importance sampling strategy.
BibTeX:
@inproceedings{flamary2015importance,
author = {Flamary, R. and Rakotomamonjy, A. and  Gasso, G.},
title = {Importance Sampling Strategy for Non-Convex Randomized Block-Coordinate Descent}, 
booktitle = {IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)},
year = {2015}
}
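A minimal Python sketch of the idea is given below: a randomized proximal block coordinate descent on a toy lasso-type problem, where the block sampling probabilities are driven by a per-block progress score. The specific score (the magnitude of the last update of each block) and the function names are illustrative assumptions, not the exact criterion analysed in the paper.

Python sketch:
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def importance_bcd(X, y, lam=0.1, n_blocks=10, n_iter=500, seed=0):
    # Solve min_w 0.5*||y - X w||^2 + lam*||w||_1 by proximal block
    # coordinate descent, sampling blocks with probability proportional
    # to a progress score (hypothetical choice: last update magnitude).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    blocks = np.array_split(np.arange(d), n_blocks)
    # Per-block Lipschitz constants of the smooth (quadratic) part
    L = np.array([np.linalg.norm(X[:, b], 2) ** 2 for b in blocks])
    score = np.ones(n_blocks)          # start from uniform sampling
    residual = y - X @ w
    for _ in range(n_iter):
        p = score / score.sum()        # importance sampling distribution
        k = rng.choice(n_blocks, p=p)
        b = blocks[k]
        grad_b = -X[:, b].T @ residual
        w_new_b = soft_threshold(w[b] - grad_b / L[k], lam / L[k])
        delta = w_new_b - w[b]
        residual -= X[:, b] @ delta
        w[b] = w_new_b
        # Blocks that still move a lot get sampled more often
        score[k] = np.linalg.norm(delta) + 1e-8
    return w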
N. Courty, R. Flamary, D. Tuia, A. Rakotomamonjy, "Optimal transport for domain adaptation", IEEE Transactions on Pattern Analysis and Machine Intelligence (under revision), 2015.
Abstract: Domain adaptation is one of the most challenging tasks of modern data analytics. If the adaptation is done correctly, models built on a specific data representation become more robust when confronted with data depicting the same semantic concepts (the classes), but observed by another observation system with its own specificities. Among the many strategies proposed to adapt one domain to another, finding domain-invariant representations has shown excellent properties, as a single classifier can use labelled samples from the source domain under this representation to predict the unlabelled samples of the target domain. In this paper, we propose a regularized unsupervised optimal transportation model to perform the alignment of the representations in the source and target domains. We learn a transportation plan matching both PDFs, which constrains labelled samples in the source domain to remain close during transport. This way, we exploit at the same time the labelled information in the source domain and the distributions of the input/observation variables observed in both domains. Experiments on toy and challenging real visual adaptation examples show the interest of the method, which consistently outperforms state-of-the-art approaches.
BibTeX:
@article{courty2015optimal,
author = {Courty, N. and Flamary, R. and Tuia, D. and Rakotomamonjy, A.},
title = {Optimal transport for domain adaptation},
journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
note = {under revision},
year = {2015}
}
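The central alignment step can be illustrated with a short sketch using plain entropy-regularized optimal transport (Sinkhorn iterations) followed by a barycentric mapping of the source samples. The class-based regularization that keeps labelled source samples close during transport is omitted here, and all function names are illustrative.

Python sketch:
import numpy as np

def sinkhorn_plan(Xs, Xt, reg=1.0, n_iter=200):
    # Entropy-regularized optimal transport plan between the empirical
    # distributions of source samples Xs and target samples Xt
    # (plain Sinkhorn iterations; the class-based regularizer of the
    # paper is not included in this sketch).
    ns, nt = len(Xs), len(Xt)
    a, b = np.full(ns, 1.0 / ns), np.full(nt, 1.0 / nt)
    C = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)  # squared Euclidean cost
    K = np.exp(-C / reg)
    u = np.ones(ns)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]   # transport plan G (ns x nt)

def transport_sources(Xs, Xt, G):
    # Barycentric mapping of the source samples onto the target domain
    return (G @ Xt) / G.sum(axis=1, keepdims=True)

A classifier trained on the transported source samples (with the original source labels) can then be applied directly to the target samples.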
D. Tuia, R. Flamary, M. Barlaud, "To be or not to be convex? A study on regularization in hyperspectral image classification", International Geoscience and Remote Sensing Symposium (IGARSS), 2015.
Abstract: Hyperspectral image classification has long been dominated by convex models, which provide accurate decision functions exploiting all the features in the input space. However, the need for high geometrical details, which is often satisfied by using spatial filters, and the need for compact models (i.e. relying on models issued from reduced input spaces) have pushed research to study alternatives such as sparsity-inducing regularization, which promotes models using only a subset of the input features. Although successful in reducing the number of active inputs, these models can be biased and sometimes offer sparsity at the cost of reduced accuracy. In this paper, we study the possibility of using non-convex regularization, which limits the bias induced by the regularization. We present and compare four regularizers, and then apply them to hyperspectral classification with different cost functions.
BibTeX:
@inproceedings{tuia2015tobe,
author = {Tuia, D. and Flamary, R. and Barlaud, M.},
title = {To be or not to be convex? A study on regularization in hyperspectral image classification},
booktitle = {International Geoscience and Remote Sensing Symposium (IGARSS)},
year = {2015}
}
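As an illustration of why non-convex penalties reduce the bias of l1, the sketch below compares soft-thresholding (the proximal operator of l1) with a reweighted thresholding step for the log-sum penalty. The reweighting scheme is a standard majorize-minimize construction used here only as an example; it is not necessarily one of the four regularizers compared in the paper.

Python sketch:
import numpy as np

def prox_l1(w, t):
    # Proximal operator of the convex l1 penalty: soft-thresholding
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_log_sum(w, t, eps=1e-2):
    # One majorize-minimize step for the non-convex log-sum penalty
    # sum_i log(eps + |w_i|): reweighted soft-thresholding, which
    # shrinks large coefficients less than l1 and hence reduces bias
    weights = 1.0 / (np.abs(w) + eps)
    return np.sign(w) * np.maximum(np.abs(w) - t * weights, 0.0)

w = np.array([-2.0, -0.5, 0.05, 1.5])
print(prox_l1(w, 0.3))       # every entry is shrunk by the same amount
print(prox_log_sum(w, 0.3))  # large entries barely shrunk, small ones set to zero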
D. Tuia, R. Flamary, N. Courty, "Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions", ISPRS Journal of Photogrammetry and Remote Sensing, 2015.
Abstract: In this paper, we tackle the question of discovering an effective set of spatial filters to solve hyperspectral classification problems. Instead of fixing a priori the filters and their parameters using expert knowledge, we let the model find them within random draws in the (possibly infinite) space of possible filters. We define an active set feature learner that includes in the model only features that improve the classifier. To this end, we consider a fast and linear classifier, multiclass logistic classification, and show that with a good representation (the filters discovered), such a simple classifier can reach at least state-of-the-art performance. We apply the proposed active set learner to four hyperspectral image classification problems, including agricultural and urban classification at different resolutions, as well as multimodal data. We also propose a hierarchical setting, which allows the generation of more complex banks of features that can better describe the nonlinearities present in the data.
BibTeX:
@article{tuia2015multiclass,
author = {Tuia, D. and Flamary, R. and  Courty, N.},
title = {Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions}, 
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
year = {2015}
}
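A much simplified sketch of the active set idea is given below: candidate features (here random nonlinear projections standing in for randomly parameterized spatial filters) are kept only if they improve a multiclass logistic classifier on held-out data. The selection rule in the paper is based on an optimality condition rather than retraining, so this is only an illustration; scikit-learn is assumed for the classifier.

Python sketch:
import numpy as np
from sklearn.linear_model import LogisticRegression

def active_set_features(Xtr, ytr, Xval, yval, n_candidates=50, seed=0):
    # Greedy active set feature learning: a candidate feature is added
    # only if it improves the validation accuracy of a linear multiclass
    # logistic classifier (simplified version of the idea in the paper).
    rng = np.random.default_rng(seed)
    Ftr, Fval = Xtr.copy(), Xval.copy()
    best = LogisticRegression(max_iter=1000).fit(Ftr, ytr).score(Fval, yval)
    for _ in range(n_candidates):
        w = rng.standard_normal(Xtr.shape[1])      # random "filter" parameters
        ftr = np.maximum(Xtr @ w, 0.0)[:, None]    # candidate feature on train
        fval = np.maximum(Xval @ w, 0.0)[:, None]  # same feature on validation
        acc = LogisticRegression(max_iter=1000).fit(
            np.hstack([Ftr, ftr]), ytr).score(np.hstack([Fval, fval]), yval)
        if acc > best:                             # keep only features that help
            Ftr, Fval, best = np.hstack([Ftr, ftr]), np.hstack([Fval, fval]), acc
    return Ftr, Fval, best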
A. Boisbunon, R. Flamary, A. Rakotomamonjy, A. Giros, J. Zerubia, "Large scale sparse optimization for object detection in high resolution images", IEEE Workshop on Machine Learning for Signal Processing (MLSP), 2014.
Abstract: In this work, we address the problem of detecting objects in images by expressing the image as convolutions between activation matrices and dictionary atoms. The activation matrices are estimated through sparse optimization and correspond to the position of the objects. In particular, we propose an efficient algorithm based on an active set strategy that is easily scalable and can be computed in parallel. We apply it to a toy image and a satellite image where the aim is to detect all the boats in a harbor. These results show the benefit of using nonconvex penalties, such as the log-sum penalty, over the convex l1 penalty.
BibTeX:
@inproceedings{boisbunon2014largescale,
author = {Boisbunon, A. and Flamary, R. and Rakotomamonjy, A. and Giros, A. and Zerubia, J.},
title = {Large scale sparse optimization for object detection in high resolution images}, 
booktitle = {IEEE Workshop on Machine Learning for Signal Processing (MLSP)},
year = {2014}
}
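The convolutional model in this paper can be illustrated with a minimal ISTA sketch: the image is approximated by a sum of activation maps convolved with dictionary atoms, and an l1 penalty makes the activations sparse so that their peaks indicate object positions. The active set strategy and the non-convex log-sum penalty of the paper are not reproduced here; the function name and the conservative step size are assumptions.

Python sketch:
import numpy as np
from scipy.signal import fftconvolve

def conv_sparse_coding(image, atoms, lam=0.1, n_iter=100):
    # Minimal ISTA for  image ~ sum_k A_k * d_k  (2D convolution) with an
    # l1 penalty on the activation maps A_k (boundary effects ignored).
    A = [np.zeros_like(image) for _ in atoms]
    flipped = [d[::-1, ::-1] for d in atoms]   # flipped atoms for the adjoint
    step = 1.0 / sum(np.sum(np.abs(d)) ** 2 for d in atoms)  # conservative step
    for _ in range(n_iter):
        recon = sum(fftconvolve(a, d, mode='same') for a, d in zip(A, atoms))
        residual = image - recon
        for k in range(len(atoms)):
            grad = -fftconvolve(residual, flipped[k], mode='same')
            z = A[k] - step * grad
            A[k] = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return A   # peaks in A[k] give the detected positions of atom k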

News

Statlearn 2016

2016-03-20

I have been invited to the Statlearn 2016 Workshop in Vannes, France.

I will present our work on Optimal transport for domain adaptation on April 8th, in collaboration with Nicolas Courty, Devis Tuia and Alain Rakotomamonjy. The slides are available here.

PhD thesis proposal

2015-04-03

Cédric Richard and I are proposing a PhD thesis, starting in 2015, on the topic of Distributed estimation over multitask networks.

If you are interested, contact Cédric or me before June 6.

For more details see the complete proposal.

BasMatI Summer School

2015-02-13

We are organizing a French summer school with Céline Theys, David Mary and Claude Aime about mathematics for signal and image processing in astronomy. You can find more information on the website.