Author detail
Author: Chunping Qiu
Available documents by this author (3)
Unsupervised deep joint segmentation of multitemporal high-resolution images / Sudipan Saha in IEEE Transactions on geoscience and remote sensing, Vol 58 n° 12 (December 2020)
[article]
Title: Unsupervised deep joint segmentation of multitemporal high-resolution images
Document type: Article/Communication
Authors: Sudipan Saha, Author; Lichao Mou, Author; Chunping Qiu, Author; et al., Author
Publication year: 2020
Pages: pp 8780 - 8792
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] object-based image analysis
[IGN terms] deep learning
[IGN terms] unsupervised classification
[IGN terms] convolutional neural network classification
[IGN terms] data extraction
[IGN terms] high-resolution image
[IGN terms] very high resolution image
[IGN terms] multitemporal imagery
[IGN terms] iteration
[IGN terms] semantic segmentation
Abstract: (author) High/very-high-resolution (HR/VHR) multitemporal images are important in remote sensing to monitor the dynamics of the Earth's surface. Unsupervised object-based image analysis provides an effective solution for analyzing such images. Image semantic segmentation assigns pixel labels from meaningful object groups and has been extensively studied in the context of single-image analysis, but it has not been explored for the multitemporal case. In this article, we propose to extend supervised semantic segmentation to the unsupervised joint semantic segmentation of multitemporal images. We propose a novel method that processes multitemporal images by feeding them separately to a deep network comprising trainable convolutional layers. The training process does not involve any external label, and segmentation labels are obtained from the argmax classification of the final layer. A novel loss function is used to detect object segments in individual images as well as to establish a correspondence between distinct multitemporal segments. Multitemporal semantic labels and the weights of the trainable layers are jointly optimized over iterations. We tested the method on three different HR/VHR data sets from Munich, Paris, and Trento, which shows the method to be effective. We further extended the proposed joint segmentation method to change detection (CD) and tested it on a VHR multisensor data set from Trento.
Record number: A2020-744
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2020.2990640
Online publication date: 11/05/2020
Online: https://doi.org/10.1109/TGRS.2020.2990640
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=96375
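The abstract notes that segmentation labels are read off as the argmax classification of the network's final layer. A minimal NumPy sketch of just that labeling step (the random tensor and all shapes are hypothetical stand-ins for real final-layer responses, not the authors' network):

```python
import numpy as np

# Hypothetical final-layer responses: one channel per candidate segment label.
rng = np.random.default_rng(0)
n_labels, height, width = 8, 4, 4
final_layer = rng.standard_normal((n_labels, height, width))

# Per-pixel segmentation label = channel with the strongest response.
labels = np.argmax(final_layer, axis=0)

print(labels.shape)  # (4, 4); every entry is an integer in [0, n_labels)
```

In the paper's setting this step yields labels for each image of the multitemporal pair without any external supervision; here it only illustrates the argmax readout itself.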
in IEEE Transactions on geoscience and remote sensing > Vol 58 n° 12 (December 2020) . - pp 8780 - 8792
[article]
Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network / Chunping Qiu in ISPRS Journal of photogrammetry and remote sensing, vol 154 (August 2019)
[article]
Title: Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network
Document type: Article/Communication
Authors: Chunping Qiu, Author; Lichao Mou, Author; Michael Schmitt, Author; Xiao Xiang Zhu, Author
Publication year: 2019
Pages: pp 151 - 162
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] machine learning
[IGN terms] urban climate
[IGN terms] multitemporal imagery
[IGN terms] optical image
[IGN terms] Sentinel-MSI image
[IGN terms] land cover
[IGN terms] convolutional neural network
[IGN terms] recurrent neural network
[IGN terms] residual
[IGN terms] city
Abstract: (Author) The local climate zone (LCZ) scheme was originally proposed to provide an interdisciplinary taxonomy for urban heat island (UHI) studies. In recent years, the scheme has also become a starting point for the development of higher-level products, as the LCZ classes can help provide a generalized understanding of urban structures and land uses. LCZ mapping can therefore theoretically aid in fostering a better understanding of spatio-temporal dynamics of cities on a global scale. However, reliable LCZ maps are not yet available globally. As a first step toward automatic LCZ mapping, this work focuses on LCZ-derived land cover classification, using multi-seasonal Sentinel-2 images. We propose a recurrent residual network (Re-ResNet) architecture that is capable of learning a joint spectral-spatial-temporal feature representation within a unitized framework. To this end, a residual convolutional neural network (ResNet) and a recurrent neural network (RNN) are combined into one end-to-end architecture. The ResNet is able to learn rich spectral-spatial feature representations from single-seasonal imagery, while the RNN can effectively analyze temporal dependencies of multi-seasonal imagery. Cross validations were carried out on a diverse dataset covering seven distinct European cities, and a quantitative analysis of the experimental results revealed that the combined use of the multi-temporal information and Re-ResNet results in an improvement of approximately 7 percentage points in overall accuracy. The proposed framework has the potential to produce consistent-quality urban land cover and LCZ maps on a large scale, to support scientific progress in fields such as urban geography and urban climatology.
Record number: A2019-268
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2019.05.004
Online publication date: 14/06/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.05.004
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=93085
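The Re-ResNet described in the abstract couples per-season ResNet features with a recurrent unit that accumulates them across seasons. A toy NumPy sketch of only the recurrent fusion step, assuming the per-season feature vectors have already been extracted (the function name, shapes, and plain tanh update are illustrative, not the paper's architecture):

```python
import numpy as np

def fuse_seasons(season_features, w_in, w_rec):
    """Accumulate per-season feature vectors into one hidden state
    with a plain (Elman-style) recurrent update."""
    hidden = np.zeros(w_rec.shape[0])
    for features in season_features:  # one feature vector per season
        hidden = np.tanh(w_in @ features + w_rec @ hidden)
    return hidden

# Four seasons, 16-dimensional (hypothetical) per-season features each.
rng = np.random.default_rng(1)
seasons = [rng.standard_normal(16) for _ in range(4)]
w_in, w_rec = rng.standard_normal((8, 16)), rng.standard_normal((8, 8))
fused = fuse_seasons(seasons, w_in, w_rec)
print(fused.shape)  # (8,): one joint representation for all four seasons
```

The point of the recurrent half is visible even in this toy: the final state depends on every season in order, which is how the paper exploits temporal dependencies that a single-season classifier cannot see.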
in ISPRS Journal of photogrammetry and remote sensing > vol 154 (August 2019) . - pp 151 - 162
[article]
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2019081 | RAB | Journal | Centre de documentation | En réserve L003 | Available
081-2019083 | DEP-RECP | Journal | LASTIG | Dépôt en unité | Not for loan
081-2019082 | DEP-RECF | Journal | Nancy | Dépôt en unité | Not for loan

Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution imagery / Chunping Qiu in ISPRS Journal of photogrammetry and remote sensing, vol 138 (April 2018)
[article]
Title: Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution imagery
Document type: Article/Communication
Authors: Chunping Qiu, Author; Michael Schmitt, Author; Xiao Xiang Zhu, Author
Publication year: 2018
Pages: pp 218 - 231
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Mixed image processing
[IGN terms] stereo pair
[IGN terms] epipolar geometry
[IGN terms] optical image
[IGN terms] moiré radar image
[IGN terms] TerraSAR-X image
[IGN terms] Worldview image
[IGN terms] similarity measure
[IGN terms] tie points
[IGN terms] positioning accuracy
[IGN terms] 3D reconstruction
[IGN terms] urban area
Abstract: (Author) In this paper, we discuss the potential and challenges regarding SAR-optical stereogrammetry for urban areas, using very-high-resolution (VHR) remote sensing imagery. Since we do this mainly from a geometrical point of view, we first analyze the height reconstruction accuracy to be expected for different stereogrammetric configurations. Then, we propose a strategy for simultaneous tie point matching and 3D reconstruction, which exploits an epipolar-like search window constraint. To drive the matching and ensure some robustness, we combine different established hand-crafted similarity measures. For the experiments, we use real test data acquired by the Worldview-2, TerraSAR-X and MEMPHIS sensors. Our results show that SAR-optical stereogrammetry using VHR imagery is generally feasible with 3D positioning accuracies in the meter-domain, although the matching of these strongly heterogeneous multi-sensor data remains very challenging.
Record number: A2018-124
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2017.12.006
Online: https://doi.org/10.1016/j.isprsjprs.2017.12.006
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=89586
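The abstract describes driving tie point matching with established hand-crafted similarity measures inside an epipolar-like search window. A 1-D toy sketch using normalized cross-correlation as the similarity measure (function names and the window setup are illustrative, not the authors' implementation):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_offset(template, search_line):
    """Slide the template along a 1-D search band (standing in for the
    epipolar-like search window) and return the offset with the highest NCC."""
    w = len(template)
    scores = [ncc(template, search_line[i:i + w])
              for i in range(len(search_line) - w + 1)]
    return int(np.argmax(scores))

template = np.array([1.0, 5.0, 2.0])
search_line = np.zeros(10)
search_line[4:7] = template          # plant the true match at offset 4
print(best_offset(template, search_line))  # 4
```

Constraining the search to an epipolar-like band, as the paper does, keeps this exhaustive scoring tractable and reduces false matches between the strongly heterogeneous SAR and optical patches.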
in ISPRS Journal of photogrammetry and remote sensing > vol 138 (April 2018) . - pp 218 - 231
[article]
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2018041 | RAB | Journal | Centre de documentation | En réserve L003 | Available
081-2018043 | DEP-EXM | Journal | LASTIG | Dépôt en unité | Not for loan
081-2018042 | DEP-EAF | Journal | Nancy | Dépôt en unité | Not for loan