Author details
Author: Fengpeng Li
Documents available written by this author (2)



Unsupervised representation high-resolution remote sensing image scene classification via contrastive learning convolutional neural network / Fengpeng Li in Photogrammetric Engineering & Remote Sensing, PERS, vol 87 n° 8 (August 2021)
[article]
Title: Unsupervised representation high-resolution remote sensing image scene classification via contrastive learning convolutional neural network
Document type: Article/Communication
Authors: Fengpeng Li; Jiabao Li; Wei Han; et al.
Year of publication: 2021
Pages: pp 577-591
General note: Bibliography
Language: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] deep learning
[IGN terms] unsupervised classification
[IGN terms] classification by neural network
[IGN terms] large scale
[IGN terms] high-resolution image
[IGN terms] aerial image
[IGN terms] medium scale
[IGN terms] small scale
[IGN terms] linear regression
[IGN terms] convolutional neural network
Abstract: (Author) Inspired by the outstanding achievements of deep learning, supervised deep-learning representation methods for high-spatial-resolution remote sensing image scene classification have reached state-of-the-art performance. However, supervised deep-learning representation methods need a considerable amount of labeled data to capture class-specific features, which limits their application when only a few labeled training samples are available. To address this issue, an unsupervised deep-learning representation method for high-resolution remote sensing image scene classification is proposed in this work. The proposed method, based on contrastive learning, narrows the distance between positive views (color channels belonging to the same image) and widens the gaps between negative view pairs (color channels from different images) to obtain class-specific representations of the input data without any supervised information. The classifier uses the features extracted by the convolutional neural network (CNN)-based feature extractor, together with the label information of the training data, to define the space of each category, and then makes predictions in the testing procedure using linear regression. Compared with existing unsupervised deep-learning representation methods for high-resolution remote sensing image scene classification, the contrastive learning CNN achieves state-of-the-art performance on three benchmark data sets of different scales: the small-scale RSSCN7 data set, the midscale aerial image data set, and the large-scale NWPU-RESISC45 data set.
Record number: A2021-670
Author affiliation: not IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.14358/PERS.87.8.577
Online publication date: 01/08/2021
Online: https://doi.org/10.14358/PERS.87.8.577
Electronic resource format: URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=98806
in Photogrammetric Engineering & Remote Sensing, PERS > vol 87 n° 8 (August 2021). - pp 577-591
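To make the channel-based contrastive objective described in the abstract above concrete, here is a minimal sketch. It assumes a SimCLR/InfoNCE-style loss, an arbitrary small CNN encoder, and an arbitrary choice of which two color channels form the positive pair; the names `ChannelEncoder` and `channel_contrastive_loss`, the temperature, and the encoder depth are illustrative assumptions, not details taken from the paper.

```python
# Minimal PyTorch sketch of a channel-based contrastive objective: color
# channels of the same image form a positive pair, channels from different
# images act as negatives. Loss form, encoder and temperature are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelEncoder(nn.Module):
    """Small CNN that embeds a single-channel view into a feature vector."""
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=1)      # unit-norm embeddings

def channel_contrastive_loss(encoder: nn.Module, images: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss on two color-channel views of the same images.

    images: (B, 3, H, W) RGB batch. View 1 = red channel, view 2 = green
    channel (an arbitrary choice for this sketch).
    """
    z1 = encoder(images[:, 0:1])                    # (B, D) red-channel view
    z2 = encoder(images[:, 1:2])                    # (B, D) green-channel view
    logits = z1 @ z2.t() / temperature              # (B, B) pairwise similarities
    targets = torch.arange(images.size(0), device=images.device)
    # Diagonal entries are the positive pairs; off-diagonals act as negatives.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Usage: pretrain without labels, then fit a linear classifier on the frozen
# features (the abstract's linear-regression prediction step).
encoder = ChannelEncoder()
batch = torch.rand(8, 3, 64, 64)                    # dummy RGB scene images
loss = channel_contrastive_loss(encoder, batch)
loss.backward()
```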
Copies (1)
Barcode: 105-2021081 - Call number: SL - Medium: Journal - Location: Documentation centre - Section: Journals in reading room - Availability: Available

High-resolution remote sensing image scene classification via key filter bank based on convolutional neural network / Fengpeng Li in IEEE Transactions on geoscience and remote sensing, vol 58 n° 11 (November 2020)
[article]
Title: High-resolution remote sensing image scene classification via key filter bank based on convolutional neural network
Document type: Article/Communication
Authors: Fengpeng Li; Ruyi Feng; Wei Han; et al.
Year of publication: 2020
Pages: pp 8077-8092
General note: Bibliography
Language: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] deep learning
[IGN terms] classification by convolutional neural network
[IGN terms] feature extraction
[IGN terms] digital image filtering
[IGN terms] high-resolution image
[IGN terms] data set
[IGN terms] semantic segmentation
[IGN terms] statistical test
Abstract: (Author) High-resolution remote sensing (HRRS) image scene classification has attracted an enormous amount of attention because of its wide application in a range of tasks. Owing to the rapid development of deep learning (DL) and its excellent representation capacity, models based on convolutional neural networks (CNNs) have achieved competitive results on HRRS image scene classification. The scene labels of HRRS images depend strongly on the combination of global information and information from key regions or locations. However, most existing CNN-based models tend either to represent only the global features of images or to overstate local information captured from key regions or locations, which may confuse different categories. To address this issue, a key region or location capturing method called key filter bank (KFB) is proposed in this article; KFB retains global information at the same time. This method can be combined with different CNN models to improve the performance of HRRS imagery scene classification. Moreover, for the convenience of practical tasks, an end-to-end model called KFBNet, in which KFB is combined with DenseNet-121, is proposed to compare performance with existing models. The model is evaluated on public benchmark data sets, on which it outperforms state-of-the-art methods.
Record number: A2020-683
Author affiliation: not IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2020.2987060
Online publication date: 23/04/2020
Online: https://doi.org/10.1109/TGRS.2020.2987060
Electronic resource format: URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=96208
in IEEE Transactions on geoscience and remote sensing > vol 58 n° 11 (November 2020). - pp 8077-8092
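The record does not describe KFB's internals, so the following is only a hedged sketch of one plausible reading of the abstract: a small bank of 1x1 filters scores spatial locations of a DenseNet-121 feature map, and the top-scoring local features are fused with a global average-pooled feature before classification. The class `KeyFilterBankHead`, the parameters `num_filters` and `top_k`, and the 45-class output are illustrative assumptions; only the DenseNet-121 backbone comes from the abstract.

```python
# Hedged sketch of a "key filter bank"-style head on top of DenseNet-121.
# The scoring/fusion scheme below is an illustrative assumption, not the
# paper's actual KFB module.
import torch
import torch.nn as nn
from torchvision import models

class KeyFilterBankHead(nn.Module):
    def __init__(self, in_channels: int, num_classes: int,
                 num_filters: int = 8, top_k: int = 4):
        super().__init__()
        # Bank of 1x1 filters that score how "key" each spatial location is.
        self.filter_bank = nn.Conv2d(in_channels, num_filters, kernel_size=1)
        self.top_k = top_k
        self.classifier = nn.Linear(2 * in_channels, num_classes)

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        b, c, h, w = feat.shape
        scores = self.filter_bank(feat).max(dim=1).values.flatten(1)  # (B, H*W)
        idx = scores.topk(self.top_k, dim=1).indices                  # key locations
        flat = feat.flatten(2)                                        # (B, C, H*W)
        idx = idx.unsqueeze(1).expand(-1, c, -1)                      # (B, C, k)
        local = flat.gather(2, idx).mean(dim=2)                       # key-region feature
        global_feat = flat.mean(dim=2)                                # global context
        return self.classifier(torch.cat([local, global_feat], dim=1))

# Assembled end to end: the abstract's KFBNet pairs KFB with DenseNet-121;
# the head above is only a stand-in for the real KFB module.
backbone = models.densenet121().features        # outputs (B, 1024, h, w)
head = KeyFilterBankHead(in_channels=1024, num_classes=45)  # e.g. 45 scene classes
logits = head(backbone(torch.rand(2, 3, 224, 224)))
```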