Rémi Flamary

Professional website

Home

photo

I am an associate professor at the University of Nice Sophia Antipolis, in the Department of Electronics and in the Lagrange Laboratory. This laboratory is part of the Observatoire de la Côte d'Azur. I was previously a PhD student and teaching assistant at the LITIS Laboratory; my PhD advisor was Alain Rakotomamonjy at the University of Rouen.

On this website, you can find a list of my publications and download the corresponding software/code. Some of my French teaching material is also available.

Research Interests

  • Machine Learning
    • Kernel methods, Support Vector Machines
    • Sparsity, variable selection, mixed norms
    • Data representation, kernel learning
  • Statistical signal processing
    • Classification and segmentation of signals and images
    • Filter learning
    • Sparse and non-convex optimization
  • Applications
    • Biomedical engineering, Brain-Computer Interfaces
    • Remote sensing and hyperspectral imaging
    • Astronomical image processing

Wordcloud of my research interests.

Recent work

Courty, N., Flamary, R., Tuia, D., "Domain adaptation with regularized optimal transport", European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD), 2014.
Abstract: We present a new and original method to solve the domain adaptation problem using optimal transport. By searching for the best transportation plan between the probability distribution functions of a source and a target domain, a non-linear and invertible transformation of the learning samples can be estimated. Any standard machine learning method can then be applied on the transformed set, which makes our method very generic. We propose a new optimal transport algorithm that incorporates label information in the optimization: this is achieved by combining an efficient matrix scaling technique with a majorization of a non-convex regularization term. By using the proposed optimal transport with label regularization, we obtain a significant increase in performance compared to the original transport solution. The proposed algorithm is computationally efficient and effective, as illustrated by its evaluation on a toy example and a challenging real-life vision dataset, on which it achieves competitive results with respect to state-of-the-art methods.
BibTeX:
@inproceedings{courty2014domain,
author = {Courty, N. and Flamary, R. and Tuia, D.},
title = {Domain adaptation with regularized optimal transport}, 
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)},
year = {2014} 
} 
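The "efficient matrix scaling technique" mentioned in the abstract corresponds to Sinkhorn's algorithm for entropic-regularized optimal transport. Below is a minimal NumPy sketch of that scaling step alone; the label regularization from the paper is omitted, and all names and parameters are illustrative, not taken from the paper's code.

```python
import numpy as np

def sinkhorn(a, b, M, reg, n_iter=1000):
    """Entropic-regularized optimal transport via Sinkhorn matrix scaling.

    a, b : source/target marginal weights (each summing to 1)
    M    : cost matrix between source and target samples
    reg  : entropic regularization strength
    """
    K = np.exp(-M / reg)                     # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                    # scale columns toward marginal b
        u = a / (K @ v)                      # scale rows toward marginal a
    return u[:, None] * K * v[None, :]       # transport plan

# toy example: two small point clouds
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (5, 2))            # source samples
Xt = rng.normal(3.0, 1.0, (6, 2))            # target samples
M = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)   # squared distances
a = np.full(5, 1 / 5)
b = np.full(6, 1 / 6)
G = sinkhorn(a, b, M, reg=1.0)
```

One common way to obtain the sample transformation the abstract refers to is the barycentric mapping `(G / a[:, None]) @ Xt`, which moves each source sample to the weighted average of its target matches.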
Boisbunon, A., Flamary, R., Rakotomamonjy, A., "Active set strategy for high-dimensional non-convex sparse optimization problems", International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014.
Abstract: The use of non-convex sparse regularization has attracted much interest when estimating a very sparse model on high-dimensional data. In this work we express the optimality conditions of the optimization problem for a large class of non-convex regularizers. From those conditions, we derive an efficient active set strategy that avoids computing unnecessary gradients. Numerical experiments on both synthetic and real-life datasets show a clear gain in computational cost w.r.t. the state of the art when using our method to obtain very sparse solutions.
BibTeX:
@inproceedings{boisbunon2014active,
author = {Boisbunon, A. and Flamary, R. and Rakotomamonjy, A.},
title = {Active set strategy for high-dimensional non-convex sparse optimization problems}, 
booktitle = {International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
year = {2014} 
} 
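The active set idea is easiest to see on the convex l1 case: the full gradient is only evaluated occasionally, to look for variables that violate the optimality conditions, while the inner solver works on the small active set. The sketch below (plain NumPy, with ISTA as the inner solver) illustrates the strategy under those simplifying assumptions; the paper's algorithm additionally handles non-convex regularizers.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def active_set_lasso(X, y, lam, n_outer=20, n_inner=500):
    """Active-set sketch for min_w 0.5*||y - X w||^2 + lam*||w||_1.

    The full gradient is evaluated only once per outer pass, to find
    variables violating the optimality conditions; the restricted
    problem on the active set is then solved by ISTA.
    """
    n_feat = X.shape[1]
    w = np.zeros(n_feat)
    active = np.zeros(n_feat, dtype=bool)
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(n_outer):
        grad = X.T @ (X @ w - y)
        violating = (np.abs(grad) > lam * (1 + 1e-6)) & ~active
        if not violating.any():
            break                            # optimality conditions hold everywhere
        active |= violating
        Xa = X[:, active]
        wa = w[active]
        for _ in range(n_inner):             # solve the restricted problem
            wa = soft_threshold(wa - Xa.T @ (Xa @ wa - y) / L, lam / L)
        w[:] = 0.0
        w[active] = wa
        active = w != 0                      # prune variables that went to zero
    return w

# toy problem: 2 relevant features out of 15
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 15))
w_true = np.zeros(15)
w_true[[2, 7]] = [1.5, -2.0]
y = X @ w_true + 0.01 * rng.normal(size=40)
lam = 0.1 * np.abs(X.T @ y).max()
w_hat = active_set_lasso(X, y, lam)
```

The gain comes from only ever multiplying by the columns in the active set during the inner loop, which is what makes the approach attractive when the solution is very sparse.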
Flamary, R., Jrad, N., Phlypo, R., Congedo, M., Rakotomamonjy, A., "Mixed-Norm Regularization for Brain Decoding", Computational and Mathematical Methods in Medicine, Vol. 2014, N. 1, pp 1-13, 2014.
Abstract: This work investigates the use of mixed-norm regularization for sensor selection in event-related potential (ERP) based brain-computer interfaces (BCI). The classification problem is cast as a discriminative optimization framework where sensor selection is induced through the use of mixed norms. This framework is extended to the multitask learning situation where several similar classification tasks related to different subjects are learned simultaneously. In this case, multitask learning helps to mitigate the data scarcity issue, yielding more robust classifiers. For this purpose, we have introduced a regularizer that induces both sensor selection and classifier similarities. The different regularization approaches are compared on three ERP datasets, showing the benefit of mixed-norm regularization in terms of sensor selection. The multitask approaches are evaluated when only a small number of learning examples is available, yielding significant performance improvements, especially for subjects who perform poorly.
BibTeX:
@article{flamary2014mixed,
author = {Flamary, R. and Jrad, N. and Phlypo, R. and Congedo, M. and Rakotomamonjy, A.},
title = {Mixed-Norm Regularization for Brain Decoding}, 
journal = {Computational and Mathematical Methods in Medicine},
volume = {2014},
number = {1},
pages = {1-13},
year = {2014} 
} 
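A mixed l1-l2 norm groups all the weights attached to one sensor, and its proximal operator is a block soft-thresholding that zeroes out whole sensors at once, which is how the regularizer performs sensor selection. A minimal sketch (rows of the weight matrix play the role of sensors; this is an illustration of the general operator, not the paper's code):

```python
import numpy as np

def prox_l21(W, t):
    """Proximal operator of t * sum_g ||W[g,:]||_2 (one row per sensor).

    Rows whose l2 norm falls below t are zeroed out entirely; the
    remaining rows are shrunk toward zero, keeping their direction.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return W * scale

# three "sensors" with two weights each
W = np.array([[3.0, 4.0],    # norm 5.0 -> kept, shrunk by factor 0.8
              [0.3, 0.4],    # norm 0.5 -> zeroed (sensor removed)
              [0.0, 2.0]])   # norm 2.0 -> kept, shrunk by factor 0.5
P = prox_l21(W, t=1.0)
```

Plugging this operator into any proximal gradient scheme turns a standard discriminative loss into a sensor-selecting classifier.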
Niaf, E., Flamary, R., Rakotomamonjy, A., Rouvière, O., Lartizien, C., "SVM with feature selection and smooth prediction in images: application to CAD of prostate cancer", IEEE International Conference on Image Processing (ICIP), 2014.
Abstract: We propose a new computer-aided detection scheme for prostate cancer screening on multiparametric magnetic resonance (mp-MR) images. Based on an annotated training database of mp-MR images from thirty patients, we train a novel support vector machine (SVM)-inspired classifier which simultaneously learns an optimal linear discriminant and a subset of predictor variables (or features) that are most relevant to the classification task, while promoting spatial smoothness of the malignancy prediction maps. The approach uses an $\ell_1$-norm in the regularization term of the optimization problem, which rewards sparsity. Spatial smoothness is promoted via an additional cost term that encodes the spatial neighborhood of the voxels, to avoid noisy prediction maps. Experimental comparisons of the proposed $\ell_1$-Smooth SVM scheme to the regular $\ell_2$-SVM scheme demonstrate a clear visual and numerical gain on our clinical dataset.
BibTeX:
@inproceedings{niaf2014svmsmooth,
author = {Niaf, E. and Flamary, R. and Rakotomamonjy, A. and Rouvière, O. and Lartizien, C.},
title = {SVM with feature selection and smooth prediction in images: application to CAD of prostate cancer}, 
booktitle = {IEEE International Conference on Image Processing (ICIP)},
year = {2014} 
} 
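The kind of objective described above combines three ingredients: a hinge data-fit term, an l1 penalty for feature selection, and a smoothness cost on the predictions over the voxel neighborhood graph. The toy sketch below writes this as hinge + l1 + f'Lf with a 1-D chain Laplacian standing in for the real 3-D neighborhood; the actual model and optimizer in the paper differ, and all names are illustrative.

```python
import numpy as np

def chain_laplacian(n):
    """Graph Laplacian of a 1-D chain of n voxels (toy neighborhood)."""
    Lap = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    Lap[0, 0] = Lap[-1, -1] = 1.0
    return Lap

def objective(w, X, y, Lap, lam1, lam2):
    """Hinge loss + l1 sparsity + spatial smoothness of predictions f."""
    f = X @ w
    hinge = np.maximum(0.0, 1.0 - y * f).sum()
    return hinge + lam1 * np.abs(w).sum() + lam2 * f @ Lap @ f

def subgrad_step(w, X, y, Lap, lam1, lam2, lr):
    """One subgradient descent step on the objective above."""
    f = X @ w
    g_hinge = -X.T @ (y * (y * f < 1))           # subgradient of the hinge term
    g = g_hinge + lam1 * np.sign(w) + 2 * lam2 * X.T @ (Lap @ f)
    return w - lr * g

# toy data: 20 voxels described by 8 features each
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))
y = np.sign(rng.normal(size=20))
Lap = chain_laplacian(20)
w0 = np.zeros(8)
w1 = subgrad_step(w0, X, y, Lap, lam1=0.1, lam2=0.1, lr=1e-3)
```

The quadratic term f'Lf penalizes predictions that differ between neighboring voxels, which is what suppresses isolated noisy detections in the malignancy maps.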
Laporte, L., Flamary, R., Canu, S., Déjean, S., Mothe, J., "Nonconvex Regularizations for Feature Selection in Ranking With Sparse SVM", IEEE Transactions on Neural Networks and Learning Systems, Vol. 25, N. 6, pp 1118-1130, 2014.
Abstract: Feature selection in learning to rank has recently emerged as a crucial issue. Whereas several preprocessing approaches have been proposed, only a few works have focused on integrating feature selection into the learning process. In this work, we propose a general framework for feature selection in learning to rank using SVM with a sparse regularization term. We investigate both classical convex regularizations, such as l1 or weighted l1, and non-convex regularization terms, such as the log penalty, the Minimax Concave Penalty (MCP), or the lp pseudo-norm with p lower than 1. Two algorithms are proposed: first, an accelerated proximal approach for solving the convex problems; second, a reweighted l1 scheme to address the non-convex regularizations. We conduct intensive experiments on nine datasets from the Letor 3.0 and Letor 4.0 corpora. Numerical results show that the non-convex regularizations we propose lead to more sparsity in the resulting models while prediction performance is preserved. The number of features is decreased by up to a factor of six compared to l1 regularization. In addition, the software is publicly available on the web.
BibTeX:
@article{tnnls2014,
author = {Laporte, L. and Flamary, R. and Canu, S. and Déjean, S. and Mothe, J.},
title = {Nonconvex Regularizations for Feature Selection in Ranking With Sparse SVM}, 
journal = {IEEE Transactions on Neural Networks and Learning Systems},
volume = {25},
number = {6},
pages = {1118-1130},
year = {2014} 
} 
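The reweighted l1 scheme mentioned in the abstract replaces a concave penalty such as the log penalty by a sequence of weighted lasso problems, with weights 1/(|w_j| + eps) given by majorizing the penalty at the current iterate. A minimal NumPy sketch under those assumptions (ISTA as the inner lasso solver; not the paper's ranking formulation):

```python
import numpy as np

def reweighted_l1(X, y, lam, eps=1e-3, n_reweight=5, n_ista=300):
    """Reweighted-l1 scheme for the log penalty lam * sum_j log(|w_j| + eps).

    Each outer pass solves a weighted lasso by ISTA; the weights
    1/(|w_j| + eps) come from majorizing the concave log penalty at the
    current iterate (the first pass is a plain lasso, all weights one).
    """
    n_feat = X.shape[1]
    w = np.zeros(n_feat)
    weights = np.ones(n_feat)
    L = np.linalg.norm(X, 2) ** 2                # Lipschitz constant
    for _ in range(n_reweight):
        for _ in range(n_ista):                  # weighted lasso by ISTA
            z = w - X.T @ (X @ w - y) / L
            w = np.sign(z) * np.maximum(np.abs(z) - lam * weights / L, 0.0)
        weights = 1.0 / (np.abs(w) + eps)        # reweight around current iterate
    return w

# orthonormal toy design, so the iterates can be tracked by hand
X = np.eye(5)
y = np.array([3.0, 0.0, 0.0, 0.0, 0.0])
w_hat = reweighted_l1(X, y, lam=0.5)
```

On this toy problem the plain lasso returns 2.5 for the first coefficient, while the reweighted scheme converges to about 2.82: large coefficients see a shrinking threshold, which reduces the l1 shrinkage bias while the zero coefficients stay heavily penalized.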

News

ICASSP 2014

2014-04-30

I will be at ICASSP 2014 in Florence. I will present the paper Active set strategy for high-dimensional non-convex sparse optimization problems on Wednesday, May 7, in the special session Optimization algorithms for high dimensional signal processing.

This is a joint work with Aurélie Boisbunon and Alain Rakotomamonjy.

AMOR project is online

2013-12-04

The AMOR project is a young researcher project funded by the GdR ISIS and the GRETSI association.

The web page of the project is now available here.

Talk about learning with infinitely many features

2013-01-02

I was invited to present our work on learning with infinitely many features at a GdR ISIS meeting.

The slides of the presentation (in English) are now available here.