Author detail
Author: Lichao Mou
Documents available written by this author (4)
Nonlocal graph convolutional networks for hyperspectral image classification / Lichao Mou in IEEE Transactions on geoscience and remote sensing, Vol 58 n° 12 (December 2020)
[article]
Title: Nonlocal graph convolutional networks for hyperspectral image classification
Document type: Article/Communication
Authors: Lichao Mou, Author; Xiaoqiang Lu, Author; Xuelong Li, Author; et al.
Publication year: 2020
Pages: pp 8246 - 8257
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] convolutional neural network classification
[IGN terms] support vector machine classification
[IGN terms] semi-supervised classification
[IGN terms] entropy
[IGN terms] graph
[IGN terms] hyperspectral image
[IGN terms] recurrent neural network
Abstract: (author) Over the past few years, classifying hyperspectral images with deep networks, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs), has progressed significantly and gained increasing attention. Despite their success, these networks need an adequate supply of labeled training instances for supervised learning, which is quite costly to collect. Unlabeled data, on the other hand, can be accessed in almost arbitrary amounts. Hence, it is of great conceptual interest to explore networks that can exploit labeled and unlabeled data simultaneously for hyperspectral image classification. In this article, we propose a novel graph-based semisupervised network called the nonlocal graph convolutional network (nonlocal GCN). Unlike existing CNNs and RNNs that receive pixels or patches of a hyperspectral image as inputs, this network takes in the whole image, including both labeled and unlabeled data. More specifically, a nonlocal graph is first calculated. Given this graph representation, a couple of graph convolutional layers are used to extract features. Finally, the semisupervised learning of the network is done by using a cross-entropy error over all labeled instances. Note that the nonlocal GCN is end-to-end trainable. We demonstrate in extensive experiments that, compared with state-of-the-art spectral classifiers and spectral-spatial classification networks, the nonlocal GCN offers competitive results and high-quality classification maps (with fine boundaries and without noisy scattered points of misclassification).
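The pipeline the abstract describes (build a nonlocal graph over all pixels, apply graph convolutional layers, train with a cross-entropy loss restricted to labeled pixels) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the Gaussian affinity, the shapes, and all function names are assumptions.

```python
import numpy as np

def normalized_adjacency(features, sigma=1.0):
    # Nonlocal graph: affinity between every pair of pixel spectra,
    # not just spatial neighbours (Gaussian kernel is an assumption).
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / (2 * sigma ** 2))
    A_hat = A + np.eye(len(features))           # add self-loops
    D = A_hat.sum(1)
    return A_hat / np.sqrt(np.outer(D, D))      # D^-1/2 (A+I) D^-1/2

def gcn_forward(A_norm, X, W1, W2):
    # "a couple of graph convolutional layers": H' = ReLU(A_norm H W)
    H = np.maximum(A_norm @ X @ W1, 0.0)
    logits = A_norm @ H @ W2
    e = np.exp(logits - logits.max(1, keepdims=True))
    return e / e.sum(1, keepdims=True)          # per-pixel class probabilities

def masked_cross_entropy(probs, labels, labeled_mask):
    # Semisupervised: the loss only sees labeled pixels, but the graph
    # propagates information from the unlabeled ones.
    idx = np.where(labeled_mask)[0]
    return -np.log(probs[idx, labels[idx]] + 1e-12).mean()
```

In a trainable version the weights `W1`, `W2` would be optimized end to end by backpropagating this masked loss through both graph convolutions.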
Record number: A2020-739
Author affiliation: non-IGN
Theme: IMAGERIE
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2020.2973363
Online publication date: 12/05/2020
Online: https://doi.org/10.1109/TGRS.2020.2973363
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=96365
in IEEE Transactions on geoscience and remote sensing > Vol 58 n° 12 (December 2020) . - pp 8246 - 8257 [article]

Unsupervised deep joint segmentation of multitemporal high-resolution images / Sudipan Saha in IEEE Transactions on geoscience and remote sensing, Vol 58 n° 12 (December 2020)
[article]
Title: Unsupervised deep joint segmentation of multitemporal high-resolution images
Document type: Article/Communication
Authors: Sudipan Saha, Author; Lichao Mou, Author; Chunping Qiu, Author; et al.
Publication year: 2020
Pages: pp 8780 - 8792
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] object-based image analysis
[IGN terms] deep learning
[IGN terms] unsupervised classification
[IGN terms] convolutional neural network classification
[IGN terms] data extraction
[IGN terms] high-resolution image
[IGN terms] very-high-resolution image
[IGN terms] multitemporal image
[IGN terms] iteration
[IGN terms] semantic segmentation
Abstract: (author) High/very-high-resolution (HR/VHR) multitemporal images are important in remote sensing for monitoring the dynamics of the Earth's surface. Unsupervised object-based image analysis provides an effective solution for analyzing such images. Image semantic segmentation assigns pixel labels from meaningful object groups and has been extensively studied in the context of single-image analysis; however, it has not been explored for the multitemporal case. In this article, we propose to extend supervised semantic segmentation to the unsupervised joint semantic segmentation of multitemporal images. We propose a novel method that processes multitemporal images by separately feeding them to a deep network comprising trainable convolutional layers. The training process does not involve any external labels, and segmentation labels are obtained from the argmax classification of the final layer. A novel loss function is used to detect object segments in individual images as well as to establish a correspondence between distinct multitemporal segments. Multitemporal semantic labels and the weights of the trainable layers are jointly optimized over iterations. We tested the method on three different HR/VHR data sets from Munich, Paris, and Trento, and the results show the method to be effective. We further extended the proposed joint segmentation method to change detection (CD) and tested it on a VHR multisensor data set from Trento.
Record number: A2020-744
Author affiliation: non-IGN
Theme: IMAGERIE
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2020.2990640
Online publication date: 11/05/2020
Online: https://doi.org/10.1109/TGRS.2020.2990640
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=96375
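The training loop the abstract outlines (no external labels, pseudo-labels from the final layer's argmax, segmentation and weights optimized jointly over iterations) can be sketched in NumPy. This is only an illustrative stand-in: the paper's network is convolutional and its correspondence loss is more elaborate, whereas this sketch uses a single linear classifier shared across the two dates as the tie between multitemporal segments.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def joint_segmentation_step(feats_t1, feats_t2, W, lr=0.1):
    """One unsupervised iteration over a bitemporal pair.

    Pseudo-labels come from the network's own argmax (no external
    supervision); sharing W across the two dates is the (assumed)
    stand-in for the paper's multitemporal correspondence term.
    """
    losses = []
    for feats in (feats_t1, feats_t2):
        probs = softmax(feats @ W)           # per-pixel class scores
        pseudo = probs.argmax(-1)            # argmax -> segmentation labels
        onehot = np.eye(W.shape[1])[pseudo]
        losses.append(-(onehot * np.log(probs + 1e-12)).sum(-1).mean())
        grad = feats.T @ (probs - onehot) / len(feats)
        W -= lr * grad                       # joint update of shared weights
    return W, sum(losses) / 2
```

Repeating this step drives the network toward confident, self-consistent segment assignments on both dates, which is the behaviour the abstract's "jointly optimized in iterations" describes.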
in IEEE Transactions on geoscience and remote sensing > Vol 58 n° 12 (December 2020) . - pp 8780 - 8792 [article]

Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network / Chunping Qiu in ISPRS Journal of photogrammetry and remote sensing, vol 154 (August 2019)
[article]
Title: Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network
Document type: Article/Communication
Authors: Chunping Qiu, Author; Lichao Mou, Author; Michael Schmitt, Author; Xiao Xiang Zhu, Author
Publication year: 2019
Pages: pp 151 - 162
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] machine learning
[IGN terms] urban climate
[IGN terms] multitemporal image
[IGN terms] optical image
[IGN terms] Sentinel-MSI image
[IGN terms] land cover
[IGN terms] convolutional neural network
[IGN terms] recurrent neural network
[IGN terms] residual
[IGN terms] city
Abstract: (author) The local climate zone (LCZ) scheme was originally proposed to provide an interdisciplinary taxonomy for urban heat island (UHI) studies. In recent years, the scheme has also become a starting point for the development of higher-level products, as the LCZ classes can help provide a generalized understanding of urban structures and land uses. LCZ mapping can therefore theoretically aid in fostering a better understanding of the spatio-temporal dynamics of cities on a global scale. However, reliable LCZ maps are not yet available globally. As a first step toward automatic LCZ mapping, this work focuses on LCZ-derived land cover classification using multi-seasonal Sentinel-2 images. We propose a recurrent residual network (Re-ResNet) architecture that is capable of learning a joint spectral-spatial-temporal feature representation within a unitized framework. To this end, a residual convolutional neural network (ResNet) and a recurrent neural network (RNN) are combined into one end-to-end architecture. The ResNet is able to learn rich spectral-spatial feature representations from single-seasonal imagery, while the RNN can effectively analyze temporal dependencies of multi-seasonal imagery. Cross-validations were carried out on a diverse dataset covering seven distinct European cities, and a quantitative analysis of the experimental results revealed that the combined use of multi-temporal information and the Re-ResNet yields an improvement of approximately 7 percentage points in overall accuracy. The proposed framework has the potential to produce consistent-quality urban land cover and LCZ maps on a large scale, supporting scientific progress in fields such as urban geography and urban climatology.
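The Re-ResNet combination the abstract describes (a residual network extracting features from each season, feeding a recurrent unit that reads the seasons in order) can be sketched as below. This is a toy NumPy version under stated assumptions: each seasonal patch is reduced to a feature vector, the residual block is two linear transforms with an identity shortcut, and a vanilla RNN cell stands in for the paper's recurrent layers.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    # identity shortcut around two transforms: the ResNet idea in miniature
    return relu(x + relu(x @ W1) @ W2)

def recurrent_over_seasons(seasonal_feats, Wx, Wh):
    # vanilla RNN cell consuming one feature vector per season
    h = np.zeros(Wh.shape[0])
    for x in seasonal_feats:
        h = np.tanh(x @ Wx + h @ Wh)
    return h

def re_resnet_forward(seasonal_patches, W1, W2, Wx, Wh, Wout):
    # residual branch per season, then temporal aggregation, then classification
    feats = [residual_block(p, W1, W2) for p in seasonal_patches]
    h = recurrent_over_seasons(feats, Wx, Wh)
    return h @ Wout                            # LCZ-derived class logits
```

The design choice the abstract motivates is visible here: the residual branch handles spectral-spatial structure within a season, while the recurrent state `h` carries the temporal dependence between seasons that single-date classifiers miss.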
Record number: A2019-268
Author affiliation: non-IGN
Theme: IMAGERIE
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2019.05.004
Online publication date: 14/06/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.05.004
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=93085
in ISPRS Journal of photogrammetry and remote sensing > vol 154 (August 2019) . - pp 151 - 162 [article]

Copies (3)
Barcode: 081-2019081 - Call number: RAB - Medium: Journal - Location: Centre de documentation - Section: En réserve L003 - Availability: Available
Barcode: 081-2019083 - Call number: DEP-RECP - Medium: Journal - Location: LASTIG - Section: Dépôt en unité - Availability: Not for loan
Barcode: 081-2019082 - Call number: DEP-RECF - Medium: Journal - Location: Nancy - Section: Dépôt en unité - Availability: Not for loan

Learning spectral-spatial-temporal features via a recurrent convolutional neural network for change detection in multispectral imagery / Lichao Mou in IEEE Transactions on geoscience and remote sensing, vol 57 n° 2 (February 2019)
[article]
Title: Learning spectral-spatial-temporal features via a recurrent convolutional neural network for change detection in multispectral imagery
Document type: Article/Communication
Authors: Lichao Mou, Author; Lorenzo Bruzzone, Author; Xiao Xiang Zhu, Author
Publication year: 2019
Pages: pp 924 - 935
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] convolutional neural network classification
[IGN terms] change detection
[IGN terms] training data (machine learning)
[IGN terms] multiband image
[IGN terms] recurrent neural network
Abstract: (author) Change detection is one of the central problems in Earth observation and has been extensively investigated over recent decades. In this paper, we propose a novel recurrent convolutional neural network (ReCNN) architecture, which is trained to learn a joint spectral-spatial-temporal feature representation in a unified framework for change detection in multispectral images. To this end, we bring together a convolutional neural network and a recurrent neural network into one end-to-end network. The former generates rich spectral-spatial feature representations, while the latter effectively analyzes the temporal dependence in bitemporal images. In comparison with previous approaches to change detection, the proposed network architecture possesses three distinctive properties: 1) it is end-to-end trainable, in contrast to most existing methods whose components are separately trained or computed; 2) it naturally harnesses spatial information, which has been proven beneficial to the change detection task; and 3) it is capable of adaptively learning the temporal dependence between multitemporal images, unlike most algorithms that use fairly simple operations such as image differencing or stacking. As far as we know, this is the first time a recurrent convolutional network architecture has been proposed for multitemporal remote sensing image analysis. The proposed network is validated on real multispectral data sets. Both visual and quantitative analyses of the experimental results demonstrate the competitive performance of the proposed model.
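Property 3 in the abstract, learning the temporal dependence instead of differencing or stacking the two dates, can be made concrete with a small sketch: a convolutional feature extractor is applied to each date, and a recurrent unit reads the two dates in sequence. This is a hypothetical NumPy toy, not the ReCNN itself; the single 2-D cross-correlation, the vanilla RNN cell, and the sigmoid change score are all simplifying assumptions.

```python
import numpy as np

def conv_feature(patch, kernel):
    # minimal 2-D valid cross-correlation + ReLU, standing in for the CNN branch
    kh, kw = kernel.shape
    H, W = patch.shape
    out = np.array([[(patch[i:i + kh, j:j + kw] * kernel).sum()
                     for j in range(W - kw + 1)]
                    for i in range(H - kh + 1)])
    return np.maximum(out, 0.0).ravel()

def recnn_change_score(patch_t1, patch_t2, kernel, Wx, Wh, w_out):
    # the recurrent unit consumes the two dates in order, so the temporal
    # relation is learned rather than hard-coded as a difference image
    h = np.zeros(Wh.shape[0])
    for patch in (patch_t1, patch_t2):
        x = conv_feature(patch, kernel)
        h = np.tanh(x @ Wx + h @ Wh)
    return 1.0 / (1.0 + np.exp(-(h @ w_out)))  # probability of change
```

Contrast this with image differencing, which fixes the temporal operation to `patch_t2 - patch_t1`; here the weights `Wx` and `Wh` decide how the two dates interact, which is the adaptivity the abstract claims.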
Record number: A2019-110
Author affiliation: non-IGN
Theme: IMAGERIE
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2018.2863224
Online publication date: 20/11/2018
Online: https://doi.org/10.1109/TGRS.2018.2863224
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=92449
in IEEE Transactions on geoscience and remote sensing > vol 57 n° 2 (February 2019) . - pp 924 - 935 [article]