Author detail
Author: Zhenfeng Shao
Documents available by this author (4)



Spatiotemporal temperature fusion based on a deep convolutional network / Xuehan Wang in Photogrammetric Engineering & Remote Sensing, PERS, vol 88 n° 2 (February 2022)
[article]
Title: Spatiotemporal temperature fusion based on a deep convolutional network
Document type: Article/Communication
Authors: Xuehan Wang, Author; Zhenfeng Shao, Author; Xiao Huang, Author; Deren Li, Author
Publication year: 2022
Article pages: pp 93 - 101
General note: Bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] comparative analysis
[IGN terms] deep learning
[IGN terms] China
[IGN terms] spatiotemporal data
[IGN terms] multi-source data fusion
[IGN terms] Landsat image
[IGN terms] Terra-MODIS image
[IGN terms] convolutional neural network
[IGN terms] time series
[IGN terms] ground temperature
[IGN terms] surface temperature
Abstract: (Author) High-spatiotemporal-resolution land surface temperature (LST) images are essential in various fields of study. However, due to technical constraints, sensing systems have difficulty providing LSTs with both high spatial and high temporal resolution. In this study, we propose a multi-scale spatiotemporal temperature-image fusion network (MSTTIFN) to generate high-spatial-resolution LST products. The MSTTIFN builds nonlinear mappings between the input Moderate Resolution Imaging Spectroradiometer (MODIS) LSTs and the output Landsat LSTs at the target date with two pairs of references, and therefore enhances the resolution of time-series LSTs. We conduct experiments on actual Landsat and MODIS data in two study areas (Beijing and Shandong) and compare our proposed MSTTIFN with four competing methods: the Spatial and Temporal Adaptive Reflectance Fusion Model, the Flexible Spatiotemporal Data Fusion Model, a two-stream convolutional neural network (StfNet), and a deep learning-based spatiotemporal temperature-fusion network. Results reveal that the MSTTIFN achieves the best and most stable performance.
Record number: A2022-064
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.14358/PERS.21-00023R2
Online publication date: 01/02/2022
Online: https://doi.org/10.14358/PERS.21-00023R2
Electronic resource format: URL article
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99724
in Photogrammetric Engineering & Remote Sensing, PERS > vol 88 n° 2 (February 2022) . - pp 93 - 101
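
The abstract describes learning a nonlinear mapping from coarse MODIS LSTs (plus two Landsat/MODIS reference pairs) to a fine-resolution, Landsat-like LST. As a rough illustration of that image-to-image regression setup only, here is a minimal PyTorch sketch; the class name, layer sizes, and channel counts are invented for this example and do not reproduce the authors' multi-scale MSTTIFN architecture.

```python
# Illustrative sketch only: a tiny CNN that maps a stack of coarse MODIS LST
# (target date) plus two Landsat/MODIS reference pairs to a fine-resolution LST band.
# The real MSTTIFN is multi-scale and more elaborate; everything here is a placeholder.
import torch
import torch.nn as nn

class ToyTemperatureFusionNet(nn.Module):
    def __init__(self, in_channels=5, hidden=64):
        # in_channels = MODIS LST at target date (1) + two reference pairs (2 x 2)
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 1, kernel_size=3, padding=1),  # predicted fine LST
        )

    def forward(self, coarse_stack):
        # coarse_stack: (batch, in_channels, H, W), MODIS inputs resampled to the Landsat grid
        return self.body(coarse_stack)

model = ToyTemperatureFusionNet()
fake_input = torch.randn(1, 5, 128, 128)   # placeholder tensor, not real imagery
predicted_lst = model(fake_input)          # shape (1, 1, 128, 128)
```

In the study itself the network is trained against Landsat LST targets at the reference dates; the sketch only shows the general shape of the fusion problem.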

Improving urban land cover mapping with the fusion of optical and SAR data based on feature selection strategy / Qing Ding in Photogrammetric Engineering & Remote Sensing, PERS, vol 88 n° 1 (January 2022)
[article]
Title: Improving urban land cover mapping with the fusion of optical and SAR data based on feature selection strategy
Document type: Article/Communication
Authors: Qing Ding, Author; Zhenfeng Shao, Author; Xiao Huang, Author; et al., Author
Publication year: 2022
Article pages: pp 17 - 28
General note: Bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Mixed image processing
[IGN terms] comparative analysis
[IGN terms] land cover map
[IGN terms] urban mapping
[IGN terms] China
[IGN terms] multi-source data fusion
[IGN terms] optical image
[IGN terms] radar image
[IGN terms] classification accuracy
Abstract: (Author) Taking the Futian District as the research area, this study proposed an effective urban land cover mapping framework fusing optical and SAR data. To simplify model complexity and improve the mapping results, various feature selection methods were compared and evaluated. The results showed that feature selection can eliminate irrelevant features, slightly increase the mean correlation between features, and significantly improve classification accuracy and computational efficiency. The recursive feature elimination-support vector machine (RFE-SVM) model obtained the best results, with an overall accuracy of 89.17% and a kappa coefficient of 0.8695. In addition, this study proved that the fusion of optical and SAR data can effectively improve mapping and reduce the confusion between different land covers. The novelty of this study lies in its insight into the merits of multi-source data fusion and feature selection for land cover mapping over complex urban environments, and in its evaluation of the performance differences between feature selection methods.
Record number: A2022-061
Authors' affiliation: non-IGN
Theme: URBAN PLANNING
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.14358/PERS.21-00030R2
Online publication date: 01/01/2022
Online: https://doi.org/10.14358/PERS.21-00030R2
Electronic resource format: URL article
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99703
in Photogrammetric Engineering & Remote Sensing, PERS > vol 88 n° 1 (January 2022) . - pp 17 - 28
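
The best-performing model in the abstract is recursive feature elimination wrapped around an SVM (RFE-SVM), scored with overall accuracy and a kappa coefficient. Below is a minimal scikit-learn sketch of that strategy; the feature matrix, class labels, and the choice of 15 retained features are synthetic placeholders, not the study's optical/SAR features or settings.

```python
# Minimal sketch of RFE-SVM feature selection followed by classification and the
# two accuracy metrics named in the abstract. Data here are random stand-ins for
# stacked optical + SAR features; in the study the features come from imagery.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))       # 500 samples x 40 hypothetical optical/SAR features
y = rng.integers(0, 5, size=500)     # 5 hypothetical land cover classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# RFE needs an estimator that exposes feature weights, hence the linear kernel here.
selector = RFE(estimator=SVC(kernel="linear"), n_features_to_select=15, step=1)
selector.fit(X_train, y_train)

clf = SVC(kernel="linear").fit(X_train[:, selector.support_], y_train)
pred = clf.predict(X_test[:, selector.support_])

print("overall accuracy:", accuracy_score(y_test, pred))
print("kappa coefficient:", cohen_kappa_score(y_test, pred))
```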
Copies (1)
Barcode: 105-2022011 | Call number: SL | Medium: Journal | Location: Documentation centre | Section: Journals room | Availability: Available

An internal-external optimized convolutional neural network for arbitrary orientated object detection from optical remote sensing images / Sihang Zhang in Geo-spatial Information Science, vol 24 n° 4 (October 2021)
[article]
Title: An internal-external optimized convolutional neural network for arbitrary orientated object detection from optical remote sensing images
Document type: Article/Communication
Authors: Sihang Zhang, Author; Zhenfeng Shao, Author; Xiao Huang, Author; et al., Author
Publication year: 2021
Article pages: pp 654 - 665
General note: Bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] deep learning
[IGN terms] classification by convolutional neural network
[IGN terms] object detection
[IGN terms] optical image
[IGN terms] optimization (mathematics)
Abstract: (Author) Due to the bird's-eye view of remote sensing sensors, the orientational information of an object is a key factor that has to be considered in object detection. To obtain rotating bounding boxes, existing studies either rely on rotated anchoring schemes or add complex rotating ROI transfer layers, leading to increased computational demand and reduced detection speeds. In this study, we propose a novel internal-external optimized convolutional neural network for arbitrary orientated object detection in optical remote sensing images. For the internal optimization, we designed an anchor-based single-shot head detector that adopts the coarse-to-fine detection concept of two-stage object detection networks. The refined rotating anchors are generated by the coarse detection head module and fed into the refining detection head module through an embedded deformable convolutional layer. For the external optimization, we propose an IoU-balanced loss that addresses the regression challenges related to arbitrary orientated bounding boxes. Experimental results on the DOTA and HRSC2016 benchmark datasets show that our proposed method outperforms selected methods.
Record number: A2021-129
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
DOI: 10.1080/10095020.2021.1972772
Online publication date: 27/09/2021
Online: https://doi.org/10.1080/10095020.2021.1972772
Electronic resource format: URL article
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99355
in Geo-spatial Information Science > vol 24 n° 4 (October 2021) . - pp 654 - 665
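
Rotating anchors and rotating bounding boxes of the kind the abstract relies on are commonly parameterized as (cx, cy, w, h, angle). The small NumPy helper below converts that parameterization to corner points, which is the basic geometry behind rotated-box IoU and visualization; it is a generic sketch, not code from the paper.

```python
# Geometry helper sketch: turn an oriented box (cx, cy, w, h, angle) into its
# four corner points. Rotated-anchor detectors build rotated IoU and box drawing
# on this kind of conversion; the function and its name are illustrative only.
import numpy as np

def obb_to_corners(cx, cy, w, h, angle_rad):
    """Return the 4 corners (x, y) of an oriented bounding box, counter-clockwise."""
    # Axis-aligned half-extents before rotation, centred at the origin
    local = np.array([[-w / 2, -h / 2],
                      [ w / 2, -h / 2],
                      [ w / 2,  h / 2],
                      [-w / 2,  h / 2]])
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, -s],
                    [s,  c]])
    # Rotate each corner, then translate to the box centre
    return local @ rot.T + np.array([cx, cy])

# Example: a 40 x 20 box centred at (100, 50), rotated by 30 degrees
print(obb_to_corners(100.0, 50.0, 40.0, 20.0, np.deg2rad(30.0)))
```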

Spatio-temporal-spectral observation model for urban remote sensing / Zhenfeng Shao in Geo-spatial Information Science, vol 24 n° 3 (July 2021)
[article]
Title: Spatio-temporal-spectral observation model for urban remote sensing
Document type: Article/Communication
Authors: Zhenfeng Shao, Author; Wenfu Wu, Author; Deren Li, Author
Publication year: 2021
Article pages: pp 372 - 386
General note: Bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] above-ground biomass
[IGN terms] risk mapping
[IGN terms] complexity
[IGN terms] image fusion
[IGN terms] satellite image
[IGN terms] flooding
[IGN terms] mathematical model
[IGN terms] urban scene
[IGN terms] impervious surface
[IGN terms] urban area
Free keywords: spatio-temporal-spectral observation model
Abstract: (Author) Taking cities as the objects being observed, urban remote sensing is an important branch of remote sensing. Given the complexity of urban scenes, urban remote sensing requires data with high temporal resolution, high spatial resolution, and high spectral resolution. To the best of our knowledge, however, no single satellite offers all of these characteristics. Thus, it is necessary to coordinate data from existing remote sensing satellites to meet the needs of urban observation. In this study, we abstract the urban remote sensing observation process and propose an urban spatio-temporal-spectral observation model, filling the gap left by the absence of an urban remote sensing framework. We present four applications to elaborate on specific uses of the proposed model: 1) a spatio-temporal fusion model for synthesizing ideal data, 2) a spatio-spectral observation model for urban vegetation biomass estimation, 3) a temporal-spectral observation model for urban flood mapping, and 4) a spatio-temporal-spectral model for impervious surface extraction. We believe that the proposed model, although at a conceptual stage, can largely benefit urban observation by providing a new data fusion paradigm.
Record number: A2021-722
Authors' affiliation: non-IGN
Theme: IMAGERY/URBAN PLANNING
Nature: Article
DOI: 10.1080/10095020.2020.1864232
Online publication date: 08/02/2021
Online: https://doi.org/10.1080/10095020.2020.1864232
Electronic resource format: URL article
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=98642
in Geo-spatial Information Science > vol 24 n° 3 (July 2021) . - pp 372 - 386
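
Application 1) in the abstract is a spatio-temporal fusion model for synthesizing ideal data. As a deliberately naive illustration of that general idea only (in the spirit of STARFM-style fusion, not the authors' observation model), the sketch below adds the temporal change observed by a frequent coarse sensor to a fine-resolution base image.

```python
# Back-of-the-envelope sketch of spatio-temporal fusion: predict a fine-resolution
# image at t2 from a fine image at t1 plus the change seen by a coarse sensor
# between t1 and t2. A gross simplification used only to convey the idea.
import numpy as np

def naive_spatiotemporal_fusion(fine_t1, coarse_t1, coarse_t2):
    """Predict a fine-resolution image at t2 from fine(t1) + coarse temporal change."""
    # All arrays are assumed already co-registered and resampled to the fine grid.
    return fine_t1 + (coarse_t2 - coarse_t1)

fine_t1 = np.random.rand(64, 64)     # placeholder "Landsat-like" image at t1
coarse_t1 = np.random.rand(64, 64)   # placeholder "MODIS-like" image at t1 (upsampled)
coarse_t2 = np.random.rand(64, 64)   # placeholder "MODIS-like" image at t2 (upsampled)
fine_t2_pred = naive_spatiotemporal_fusion(fine_t1, coarse_t1, coarse_t2)
```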