Feature Learning for Remote Sensing
$$
\def\w{\mathbf{w}}
\def\W{\mathbf{W}}
\def\b{\mathbf{b}}
$$
Description
This package is a Matlab implementation of a sparse optimization scheme for automated feature selection in remote sensing.
It is the code that has been used in the numerical experiments of the papers:
D. Tuia, M. Volpi, M. Dalla Mura, A. Rakotomamonjy, R. Flamary, Automatic Feature Learning for Spatio-Spectral Image Classification With Sparse SVM, IEEE Transactions on Geoscience and Remote Sensing, Vol. 52, No. 10, pp. 6062-6074, 2014.
Abstract: Including spatial information is a key step for successful remote sensing image classification. In particular, when dealing with high spatial resolution, if local variability is strongly reduced by spatial filtering, the classification performance results are boosted. In this paper, we consider the triple objective of designing a spatial/spectral classifier, which is compact (uses as few features as possible), discriminative (enhances class separation), and robust (works well in small sample situations). We achieve this triple objective by discovering the relevant features in the (possibly infinite) space of spatial filters by optimizing a margin-maximization criterion. Instead of imposing a filter bank with predefined filter types and parameters, we let the model figure out which set of filters is optimal for class separation. To do so, we randomly generate spatial filter banks and use an active-set criterion to rank the candidate features according to their benefits to margin maximization (and, thus, to generalization) if added to the model. Experiments on multispectral very high spatial resolution (VHR) and hyperspectral VHR data show that the proposed algorithm, which is sparse and linear, finds discriminative features and achieves at least the same performances as models using a large filter bank defined in advance by prior knowledge.
BibTeX:
@article{tuia2014automatic,
author = {Tuia, D. and Volpi, M. and Dalla Mura, M. and Rakotomamonjy, A. and Flamary, R.},
title = {Automatic Feature Learning for Spatio-Spectral Image Classification With Sparse SVM},
journal = {Geoscience and Remote Sensing, IEEE Transactions on},
volume = {52},
number = {10},
pages = {6062-6074},
year = {2014}
}
D. Tuia, N. Courty, R. Flamary, A group-lasso active set strategy for multiclass hyperspectral image classification, Photogrammetric Computer Vision (PCV), 2014.
Abstract: Hyperspectral images have a strong potential for landcover/landuse classification, since the spectra of the pixels can highlight subtle differences between materials and provide information beyond the visible spectrum. Yet, a limitation of most current approaches is the hypothesis of spatial independence between samples: images are spatially correlated and the classification map should exhibit spatial regularity. One way of integrating spatial smoothness is to augment the input spectral space with filtered versions of the bands. However, open questions remain, such as the selection of the bands to be filtered, or the filterbank to be used. In this paper, we consider the entirety of the possible spatial filters by using an incremental feature learning strategy that assesses whether a candidate feature would improve the model if added to the current input space. Our approach is based on a multiclass logistic classifier with group-lasso regularization. The optimization of this classifier yields an optimality condition that can easily be used to assess the interest of a candidate feature without retraining the model, thus allowing drastic savings in computational time. We apply the proposed method to three challenging hyperspectral classification scenarios, including agricultural and urban data, and study both the ability of the incremental setting to learn features that always improve the model and the nature of the features selected.
BibTeX:
@inproceedings{tuia2014grouplasso,
author = {Tuia, D. and Courty, N. and Flamary, R.},
title = {A group-lasso active set strategy for multiclass hyperspectral image classification},
booktitle = {Photogrammetric Computer Vision (PCV)},
year = {2014}
}
D. Tuia, R. Flamary, N. Courty, Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions, ISPRS Journal of Photogrammetry and Remote Sensing, 2015.
Abstract: In this paper, we tackle the question of discovering an effective set of spatial filters to solve hyperspectral classification problems. Instead of fixing a priori the filters and their parameters using expert knowledge, we let the model find them within random draws in the (possibly infinite) space of possible filters. We define an active set feature learner that includes in the model only features that improve the classifier. To this end, we consider a fast and linear classifier, multiclass logistic classification, and show that with a good representation (the filters discovered), such a simple classifier can reach at least state of the art performances. We apply the proposed active set learner in four hyperspectral image classification problems, including agricultural and urban classification at different resolutions, as well as multimodal data. We also propose a hierarchical setting, which allows to generate more complex banks of features that can better describe the nonlinearities present in the data.
BibTeX:
@article{tuia2015multiclass,
author = {Tuia, D. and Flamary, R. and Courty, N.},
title = {Multiclass feature learning for hyperspectral image classification: sparse and hierarchical solutions},
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
year = {2015}
}
The toolbox was mainly coded by Devis Tuia, Michele Volpi, and Rémi Flamary.
Solver
We provide a general solver for the squared-hinge-loss SVM of the form:
$$
\begin{equation*}
\min_\varphi\quad\min_{\w,b} \quad\sum_{i=1}^{n} \max(0,1-y_i(\varphi(\mathbf{x}_i)^T\w+b))^2 + \lambda\Omega(\w)
\end{equation*}
$$
where $\Omega(\w)=\sum_i |w_i|$ is the $\ell_1$ norm that promotes sparsity and $\varphi(\cdot)$ is a nonlinear feature extraction.
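For a fixed feature map $\varphi$, the inner problem can be solved by proximal gradient descent, alternating a gradient step on the smooth squared-hinge term with soft-thresholding for the $\ell_1$ term. Below is a minimal Python sketch of this iteration (the toolbox itself is in Matlab; the function names here are illustrative, not the toolbox API):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||v||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_sq_hinge_svm(Phi, y, lam, n_iter=2000):
    """Proximal-gradient solver for the inner problem (fixed feature map):
        min_{w,b}  sum_i max(0, 1 - y_i (phi_i^T w + b))^2 + lam * ||w||_1
    Phi : (n, d) matrix whose rows are phi(x_i); y : labels in {-1, +1}.
    """
    n, d = Phi.shape
    w, b = np.zeros(d), 0.0
    # the smooth part has a Lipschitz gradient; 2*(||Phi||_2^2 + n) upper-bounds
    # its constant, so 1/L is a safe step size
    step = 1.0 / (2.0 * (np.linalg.norm(Phi, 2) ** 2 + n))
    for _ in range(n_iter):
        slack = np.maximum(0.0, 1.0 - y * (Phi @ w + b))  # nonzero where margin < 1
        gw = -2.0 * Phi.T @ (y * slack)  # gradient of the squared hinge w.r.t. w
        gb = -2.0 * np.sum(y * slack)    # ... and w.r.t. b
        w = soft_threshold(w - step * gw, step * lam)  # prox step: only w is regularized
        b -= step * gb                                 # plain gradient step on b
    return w, b
```

The actual solver uses the accelerated (FISTA) variant of this same forward-backward iteration; the plain version above is kept short for clarity.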
Since version 2.0 of the toolbox we provide a solver for multiclass multinomial logistic regression to optimize problems such as:
$$
\begin{equation*}
\min_\varphi\quad\min_{\W,\b} \quad\sum_{i=1}^{n} L(y_i,\varphi(\mathbf{x}_i),\W,\b) + \lambda\Omega_{1,2}(\W)
\end{equation*}
$$
where $\Omega_{1,2}(\W)=\sum_i \|\W_{i,\cdot}\|_2$ promotes joint sparsity across classes and $\varphi(\cdot)$ is a nonlinear feature extraction.
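The group-lasso penalty acts on whole rows of $\W$: its proximal operator shrinks each feature's row of class weights jointly, and zeroes the row entirely when its norm falls below the threshold, which is what removes a feature for all classes at once. A small Python sketch of this row-wise proximal operator (an illustration of the math, not the SPAMS code used by the toolbox):

```python
import numpy as np

def prox_group_lasso(W, t):
    """Proximal operator of t * sum_i ||W_{i,.}||_2.
    Each row of W (one feature's weights across all classes) is scaled
    toward zero; rows whose l2 norm is below t are zeroed jointly, so the
    corresponding feature is dropped for every class at once."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))  # avoid 0/0
    return W * scale
```

For example, with threshold t = 1, a row of norm 5 is shrunk by a factor 0.8, while a row of norm 0.5 is set exactly to zero.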
This toolbox is written in Matlab; the inner problem is solved with a Forward-Backward Splitting algorithm from the FISTA paper, as implemented in G-SVM.
Note that since version 2.0, we use the efficient group-lasso solver provided by the SPAMS toolbox.
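The active-set step itself relies on the optimality condition of the group-lasso problem: a candidate feature whose weight row is zero can decrease the objective only if the $\ell_2$ norm of the loss gradient with respect to that row exceeds $\lambda$, so candidates can be ranked without retraining the model. A hedged Python sketch of this test for the multinomial logistic case (names and interface are illustrative, not the toolbox API):

```python
import numpy as np

def candidate_score(phi_c, P, Y):
    """Active-set test for one candidate feature column phi_c (shape (n,)).
    P : (n, K) current class probabilities; Y : (n, K) one-hot labels.
    The gradient of the multinomial logistic loss w.r.t. the candidate's
    (currently zero) weight row is phi_c^T (P - Y); by the group-lasso
    optimality condition the feature can improve the model iff the
    l2 norm of this gradient exceeds lambda."""
    g = phi_c @ (P - Y)  # shape (K,): one gradient entry per class
    return np.linalg.norm(g)

# Typical use: generate random candidate filters, compute their scores with
# the current model's probabilities P, and add the best candidate to the
# active set only if its score exceeds lambda.
```

Because the score only needs the current residual-like term P - Y, thousands of randomly generated filters can be screened per iteration at the cost of one matrix product each.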
Download
Current version : 2.1
Download : fl-rs-svm-2.1.zip
Dataset : AVIRIS_IndianPines.mat
Release Notes
Version 1.0
- Active set approach with l1 regularization.
Version 2.0
- Added multiclass logistic regression with group-lasso regularization.
Version 2.1
- Fixed a small bug in generateFeatures.m (thanks to Sheng Ding).
- Updated DEMO_ActiveSet.m with better comments.
Installation
Quick version:
- Add the toolbox folder and all its subfolders to the Matlab path.
- Depending on your system, it might be necessary to compile the SPAMS toolbox using the script /utils/spams-matlab/compile.m .
Entry points:
- DEMO_ActiveSet.m : illustrates and compares the three active set approaches from the papers.
- generateFeatures.m : the file to customize in order to generate the features of interest. In the example provided, it generates contextual features.
- contextualfeatures.m : to use the attribute filters, we rely on the attribute profile code from Dalla Mura's paper. Be careful: the ATT-i and ATT-s features can be VERY slow to compute.
- get_feature_test_manybands.m : must be able to generate the features selected by the active set (whose specifications are stored in "feat"). In the examples, it uses contextualfeatures.m, which generates spatial filters.