Author detail
Author: Michael Schmitt
Documents available written by this author (6)
Multisensor data fusion for cloud removal in global and all-season Sentinel-2 imagery / Patrick Ebel in IEEE Transactions on geoscience and remote sensing, Vol 59 n° 7 (July 2021)
[article]
Title: Multisensor data fusion for cloud removal in global and all-season Sentinel-2 imagery
Document type: Article/Communication
Authors: Patrick Ebel; Andrea Meraner; Michael Schmitt; et al.
Publication year: 2021
Pages: pp 5866 - 5878
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN Terms] cloud detection
[IGN Terms] multisensor data
[IGN Terms] Sentinel-MSI image
[IGN Terms] cloud
[IGN Terms] image reconstruction
[IGN Terms] generative adversarial network
Abstract: (author) The majority of optical observations acquired via spaceborne Earth imagery are affected by clouds. While there is extensive prior work on reconstructing cloud-covered information, previous studies are often confined to narrowly defined regions of interest, raising the question of whether an approach can generalize to a diverse set of observations acquired at variable cloud coverage or in different regions and seasons. We target the challenge of generalization by curating a large novel data set for training new cloud removal approaches and evaluate two recently proposed performance metrics of image quality and diversity. Our data set is the first publicly available to contain a global sample of coregistered radar and optical observations, cloudy and cloud-free. Based on the observation that cloud coverage varies widely between clear skies and absolute coverage, we propose a novel model that can deal with either extreme and evaluate its performance on our proposed data set. Finally, we demonstrate the superiority of training models on real over synthetic data, underlining the need for a carefully curated data set of real observations. To facilitate future research, our data set is made available online.
Record number: A2021-529
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2020.3024744
Online publication date: 02/10/2020
Online: https://doi.org/10.1109/TGRS.2020.3024744
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=97980
in IEEE Transactions on geoscience and remote sensing > Vol 59 n° 7 (July 2021) . - pp 5866 - 5878 [article]
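The abstract above describes fusing coregistered SAR and optical observations so that a generative model can reconstruct cloud-covered Sentinel-2 pixels. As a rough illustration only (not the authors' architecture), the sketch below shows an early-fusion generator in PyTorch; the band counts, layer sizes, and residual formulation are assumptions.

```python
# Minimal sketch of SAR-optical fusion for cloud removal: a cloudy Sentinel-2 image
# is stacked with a coregistered Sentinel-1 SAR image and mapped to a cloud-free
# Sentinel-2 prediction. Band counts and layers are illustrative assumptions only.
import torch
import torch.nn as nn

class FusionCloudRemovalGenerator(nn.Module):
    def __init__(self, s2_bands: int = 13, s1_bands: int = 2, features: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(s2_bands + s1_bands, features, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, features, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, s2_bands, kernel_size=3, padding=1),
        )

    def forward(self, s2_cloudy: torch.Tensor, s1_sar: torch.Tensor) -> torch.Tensor:
        x = torch.cat([s2_cloudy, s1_sar], dim=1)  # early fusion of the two modalities
        # Predict a correction to the cloudy optical input (residual formulation).
        return s2_cloudy + self.net(x)

# Usage on random 256x256 patches (batch of 1).
gen = FusionCloudRemovalGenerator()
cloud_free = gen(torch.rand(1, 13, 256, 256), torch.rand(1, 2, 256, 256))  # (1, 13, 256, 256)
```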
Matching of TerraSAR-X derived ground control points to optical image patches using deep learning / Tatjana Bürgmann in ISPRS Journal of photogrammetry and remote sensing, Vol 158 (December 2019)
[article]
Title: Matching of TerraSAR-X derived ground control points to optical image patches using deep learning
Document type: Article/Communication
Authors: Tatjana Bürgmann; Wolfgang Koppe; Michael Schmitt
Publication year: 2019
Pages: pp 241 - 248
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Mixed image processing
[IGN Terms] image matching
[IGN Terms] deep learning
[IGN Terms] convolutional neural network classification
[IGN Terms] geolocation
[IGN Terms] multisensor image
[IGN Terms] optical image
[IGN Terms] Pléiades image
[IGN Terms] SAR image
[IGN Terms] Sentinel-MSI image
[IGN Terms] Sentinel-SAR image
[IGN Terms] TerraSAR-X image
[IGN Terms] ground control point
Abstract: (author) High resolution synthetic aperture radar (SAR) satellites like TerraSAR-X are capable of acquiring images exhibiting an absolute geolocation accuracy within a few centimeters, mainly because of the availability of precise orbit information and by compensating range delay errors due to atmospheric conditions. In contrast, satellite images from optical missions generally exhibit comparably low geolocation accuracies because of the propagation of errors in angular measurements over large distances. However, a variety of remote sensing applications, such as change detection, surface movement monitoring or ice flow measurements, require precisely geo-referenced and co-registered satellite images. By using Ground Control Points (GCPs) derived from TerraSAR-X, the absolute geolocation accuracy of optical satellite images can be improved. For this purpose, the corresponding matching points in the optical images need to be localized. In this paper, a deep learning based approach is investigated for an automated matching of SAR-derived GCPs to optical image elements. To this end, a convolutional neural network is pretrained with medium resolution Sentinel-1 and Sentinel-2 imagery and fine-tuned on precisely co-registered TerraSAR-X and Pléiades training image pairs to learn a common descriptor representation. By using these descriptors, the similarity of SAR and optical image patches can be calculated. This similarity metric is then used in a sliding window approach to identify the matching points in the optical reference image. Subsequently, the derived points can be utilized for co-registration of the underlying images. The network is evaluated over nine study areas showing airports and their rural surroundings from several different countries around the world. The results show that based on TerraSAR-X-derived GCPs, corresponding points in the optical image can automatically and reliably be identified with pixel-level localization accuracy.
Record number: A2019-548
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2019.09.010
Online publication date: 05/11/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.09.010
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=94194
in ISPRS Journal of photogrammetry and remote sensing > Vol 158 (December 2019) . - pp 241 - 248 [article]
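The matching step summarized above (learned descriptors from a two-branch network, then a sliding-window similarity search in the optical reference) can be illustrated roughly as follows. This is only a sketch under assumed patch sizes, descriptor dimensions, and untrained branches, not the paper's fine-tuned network.

```python
# Minimal sketch of descriptor-based SAR-optical patch matching with a sliding-window
# search. The two-branch ("pseudo-siamese") layout, patch size, and descriptor length
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_branch(in_ch: int, dim: int = 128) -> nn.Module:
    return nn.Sequential(
        nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(64, dim),
    )

sar_branch = make_branch(in_ch=1)   # one encoder per modality
opt_branch = make_branch(in_ch=3)

def match(sar_patch: torch.Tensor, opt_image: torch.Tensor, patch: int = 64, stride: int = 4):
    """Return the (row, col) in opt_image whose patch descriptor is most similar
    to the descriptor of the SAR patch centred on a TerraSAR-X derived GCP."""
    d_sar = F.normalize(sar_branch(sar_patch), dim=1)           # (1, dim)
    windows, coords = [], []
    _, _, H, W = opt_image.shape
    for r in range(0, H - patch + 1, stride):
        for c in range(0, W - patch + 1, stride):
            windows.append(opt_image[:, :, r:r + patch, c:c + patch])
            coords.append((r, c))
    d_opt = F.normalize(opt_branch(torch.cat(windows)), dim=1)  # (n, dim)
    sim = d_opt @ d_sar.squeeze(0)                              # cosine similarities
    return coords[int(sim.argmax())]

# Example on random data: a 64x64 SAR patch matched inside a 256x256 optical image.
best = match(torch.rand(1, 1, 64, 64), torch.rand(1, 3, 256, 256))
```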
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2019121 | RAB | Journal | Documentation center | In reserve L003 | Available
081-2019123 | DEP-RECP | Journal | LASTIG | Deposit in unit | Not for loan
081-2019122 | DEP-RECF | Journal | Nancy | Deposit in unit | Not for loan

Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network / Chunping Qiu in ISPRS Journal of photogrammetry and remote sensing, vol 154 (August 2019)
[article]
Title: Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network
Document type: Article/Communication
Authors: Chunping Qiu; Lichao Mou; Michael Schmitt; Xiao Xiang Zhu
Publication year: 2019
Pages: pp 151 - 162
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN Terms] machine learning
[IGN Terms] urban climate
[IGN Terms] multitemporal image
[IGN Terms] optical image
[IGN Terms] Sentinel-MSI image
[IGN Terms] land cover
[IGN Terms] convolutional neural network
[IGN Terms] recurrent neural network
[IGN Terms] residual
[IGN Terms] city
Abstract: (author) The local climate zone (LCZ) scheme was originally proposed to provide an interdisciplinary taxonomy for urban heat island (UHI) studies. In recent years, the scheme has also become a starting point for the development of higher-level products, as the LCZ classes can help provide a generalized understanding of urban structures and land uses. LCZ mapping can therefore theoretically aid in fostering a better understanding of spatio-temporal dynamics of cities on a global scale. However, reliable LCZ maps are not yet available globally. As a first step toward automatic LCZ mapping, this work focuses on LCZ-derived land cover classification, using multi-seasonal Sentinel-2 images. We propose a recurrent residual network (Re-ResNet) architecture that is capable of learning a joint spectral-spatial-temporal feature representation within a unitized framework. To this end, a residual convolutional neural network (ResNet) and a recurrent neural network (RNN) are combined into one end-to-end architecture. The ResNet is able to learn rich spectral-spatial feature representations from single-seasonal imagery, while the RNN can effectively analyze temporal dependencies of multi-seasonal imagery. Cross validations were carried out on a diverse dataset covering seven distinct European cities, and a quantitative analysis of the experimental results revealed that the combined use of the multi-temporal information and Re-ResNet results in an improvement of approximately 7 percentage points in overall accuracy. The proposed framework has the potential to produce consistent-quality urban land cover and LCZ maps on a large scale, to support scientific progress in fields such as urban geography and urban climatology.
Record number: A2019-268
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2019.05.004
Online publication date: 14/06/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.05.004
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=93085
in ISPRS Journal of photogrammetry and remote sensing > vol 154 (August 2019) . - pp 151 - 162 [article]
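As a rough illustration of the Re-ResNet idea summarized above (a shared residual CNN encoder applied to each seasonal image, followed by a recurrent unit over the seasons), the sketch below uses assumed band counts, layer widths, and a GRU in place of the unspecified RNN cell; it is not the authors' model.

```python
# Minimal sketch of a recurrent residual classifier for multi-seasonal Sentinel-2
# patches: per-season spectral-spatial features are extracted by a small residual
# encoder and aggregated over time by a GRU before LCZ-derived land cover scoring.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, ch: int):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity shortcut around two convolutions.
        return torch.relu(x + self.conv2(torch.relu(self.conv1(x))))

class RecurrentResidualClassifier(nn.Module):
    def __init__(self, bands: int = 10, n_classes: int = 17, feat: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(              # shared per-season encoder
            nn.Conv2d(bands, feat, 3, padding=1), nn.ReLU(inplace=True),
            ResidualBlock(feat),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(feat, feat, batch_first=True)   # aggregates the seasons
        self.head = nn.Linear(feat, n_classes)

    def forward(self, seasons: torch.Tensor) -> torch.Tensor:
        # seasons: (batch, n_seasons, bands, H, W)
        b, t = seasons.shape[:2]
        f = self.encoder(seasons.flatten(0, 1)).view(b, t, -1)
        _, h = self.rnn(f)                    # final hidden state summarizes all seasons
        return self.head(h.squeeze(0))        # class scores, shape (batch, n_classes)

# Example: batch of 2 samples, four seasonal 32x32 patches with 10 bands each.
model = RecurrentResidualClassifier()
scores = model(torch.rand(2, 4, 10, 32, 32))  # shape: (2, 17)
```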
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2019081 | RAB | Journal | Documentation center | In reserve L003 | Available
081-2019083 | DEP-RECP | Journal | LASTIG | Deposit in unit | Not for loan
081-2019082 | DEP-RECF | Journal | Nancy | Deposit in unit | Not for loan

Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution imagery / Chunping Qiu in ISPRS Journal of photogrammetry and remote sensing, vol 138 (April 2018)
[article]
Title: Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution imagery
Document type: Article/Communication
Authors: Chunping Qiu; Michael Schmitt; Xiao Xiang Zhu
Publication year: 2018
Pages: pp 218 - 231
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Mixed image processing
[IGN Terms] stereo pair
[IGN Terms] epipolar geometry
[IGN Terms] optical image
[IGN Terms] SAR image
[IGN Terms] TerraSAR-X image
[IGN Terms] Worldview image
[IGN Terms] similarity measure
[IGN Terms] tie points
[IGN Terms] localization accuracy
[IGN Terms] 3D reconstruction
[IGN Terms] urban area
Abstract: (author) In this paper, we discuss the potential and challenges regarding SAR-optical stereogrammetry for urban areas, using very-high-resolution (VHR) remote sensing imagery. Since we do this mainly from a geometrical point of view, we first analyze the height reconstruction accuracy to be expected for different stereogrammetric configurations. Then, we propose a strategy for simultaneous tie point matching and 3D reconstruction, which exploits an epipolar-like search window constraint. To drive the matching and ensure some robustness, we combine different established hand-crafted similarity measures. For the experiments, we use real test data acquired by the Worldview-2, TerraSAR-X and MEMPHIS sensors. Our results show that SAR-optical stereogrammetry using VHR imagery is generally feasible with 3D positioning accuracies in the meter-domain, although the matching of these strongly heterogeneous multi-sensor data remains very challenging.
Record number: A2018-124
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2017.12.006
Online: https://doi.org/10.1016/j.isprsjprs.2017.12.006
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=89586
in ISPRS Journal of photogrammetry and remote sensing > vol 138 (April 2018) . - pp 218 - 231 [article]
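One ingredient of the matching strategy described above, combining several established hand-crafted similarity measures inside an epipolar-like search window, can be sketched as follows. The specific measures (NCC on intensities and on gradient magnitudes) and their weights are assumptions for illustration, not the paper's exact combination.

```python
# Minimal sketch of combined hand-crafted similarity measures for template matching
# of a SAR patch inside an optical search strip (restricted along the epipolar-like
# direction upstream). Measures and weights are illustrative assumptions.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def grad_mag(img: np.ndarray) -> np.ndarray:
    gy, gx = np.gradient(img.astype(np.float64))
    return np.hypot(gx, gy)

def match_in_window(template: np.ndarray, search: np.ndarray, w_int=0.5, w_grad=0.5):
    """Slide the template over the search window and return the offset (row, col)
    with the best weighted combination of intensity NCC and gradient-magnitude NCC."""
    th, tw = template.shape
    sh, sw = search.shape
    best, best_rc = -np.inf, (0, 0)
    t_grad = grad_mag(template)
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            patch = search[r:r + th, c:c + tw]
            score = w_int * ncc(template, patch) + w_grad * ncc(t_grad, grad_mag(patch))
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best

# Example on random data: 32x32 template inside a 32x96 search strip.
offset, score = match_in_window(np.random.rand(32, 32), np.random.rand(32, 96))
```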
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2018041 | RAB | Journal | Documentation center | In reserve L003 | Available
081-2018043 | DEP-EXM | Journal | LASTIG | Deposit in unit | Not for loan
081-2018042 | DEP-EAF | Journal | Nancy | Deposit in unit | Not for loan

Maximum-likelihood estimation for multi-aspect multi-baseline SAR interferometry of urban areas / Michael Schmitt in ISPRS Journal of photogrammetry and remote sensing, vol 87 (January 2014)
[article]
Title: Maximum-likelihood estimation for multi-aspect multi-baseline SAR interferometry of urban areas
Document type: Article/Communication
Authors: Michael Schmitt; Uwe Stilla
Publication year: 2014
Pages: pp 68 - 77
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Radar image processing and applications
[IGN Terms] maximum likelihood classification
[IGN Terms] aerial image
[IGN Terms] SAR image
[IGN Terms] SAR interferometry
[IGN Terms] covariance matrix
[IGN Terms] urban environment
[IGN Terms] digital surface model
[IGN Terms] Munich
[IGN Terms] 3D reconstruction
Abstract: (author) The reconstruction of digital surface models (DSMs) of urban areas from interferometric synthetic aperture radar (SAR) data is a challenging task. In particular the SAR inherent layover and shadowing effects need to be coped with by sophisticated processing strategies. In this paper, a maximum-likelihood estimation procedure for the reconstruction of DSMs from multi-aspect multi-baseline InSAR imagery is proposed. In this framework, redundant as well as contradicting observations are exploited in a statistically optimal way. The presented method, which is especially suited for single-pass SAR interferometers, is examined using test data consisting of experimental airborne millimeter-wave SAR imagery. The achievable accuracy is evaluated by comparison to LiDAR-derived reference data. It is shown that the proposed estimation procedure performs better than a comparable non-statistical reconstruction method.
Record number: A2014-013
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2013.10.006
Online: https://doi.org/10.1016/j.isprsjprs.2013.10.006
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=32918
in ISPRS Journal of photogrammetry and remote sensing > vol 87 (January 2014) . - pp 68 - 77 [article]
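The abstract above describes a maximum-likelihood estimator that fuses redundant and contradicting multi-aspect multi-baseline InSAR observations into a height estimate. A minimal grid-search sketch of that general idea is given below; the simplified phase model and the von Mises noise assumption are illustrative only and do not reproduce the paper's statistical model.

```python
# Minimal sketch of maximum-likelihood height estimation over a candidate grid:
# for each candidate height, predict the interferometric phase of every
# baseline/aspect, score the observed phases under a simple von Mises likelihood,
# and keep the height maximizing the joint log-likelihood. The phase model and
# the concentration parameter kappa are illustrative assumptions.
import numpy as np

def predicted_phase(height, kz):
    """Flat-earth-removed interferometric phase of a scatterer at `height`,
    given the vertical wavenumber kz of a baseline (simplified model)."""
    return kz * height

def ml_height(observed_phases, kz_values, kappa=5.0, h_grid=np.linspace(0.0, 60.0, 601)):
    """observed_phases, kz_values: one entry per aspect/baseline observation."""
    observed_phases = np.asarray(observed_phases)
    kz_values = np.asarray(kz_values)
    best_h, best_ll = None, -np.inf
    for h in h_grid:
        resid = observed_phases - predicted_phase(h, kz_values)
        # von Mises log-likelihood (constant terms dropped), summed over observations.
        ll = kappa * np.cos(resid).sum()
        if ll > best_ll:
            best_h, best_ll = h, ll
    return best_h

# Example: three baselines observing a scatterer at about 20 m with phase noise.
kz = np.array([0.10, 0.15, 0.22])
obs = predicted_phase(20.0, kz) + 0.1 * np.random.randn(3)
print(ml_height(obs, kz))
```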
Copies (1)
Barcode | Call number | Medium | Location | Section | Availability
081-2014011 | RAB | Journal | Documentation center | In reserve L003 | Available

Radargrammetric registration of airborne multi-aspect SAR data of urban areas / Michael Schmitt in ISPRS Journal of photogrammetry and remote sensing, vol 86 (December 2013) Permalink