IEEE Transactions on geoscience and remote sensing / IEEE Geoscience and Remote Sensing Society (United States). Vol. 61, no. 3. Published: 01/03/2023
This is an issue of IEEE Transactions on geoscience and remote sensing / IEEE Geoscience and Remote Sensing Society (United States) (1986–)
Contents
Multiresolution analysis pansharpening based on variation factor for multispectral and panchromatic images from different times / Peng Wang in IEEE Transactions on geoscience and remote sensing, vol 61 n° 3 (March 2023)
[article]
Title: Multiresolution analysis pansharpening based on variation factor for multispectral and panchromatic images from different times
Document type: Article/Communication
Authors: Peng Wang; Hongyu Yao; Bo Huang; et al.
Year of publication: 2023
Article page(s): no. 5401217
General note: bibliography
Language: English (eng)
Descriptors: [IGN subject headings] optical image processing
[IGN terms] multiresolution analysis
[IGN terms] multitemporal data
[IGN terms] multiband image
[IGN terms] panchromatic image
[IGN terms] pansharpening (image fusion)
[IGN terms] geometric resolution
Abstract: (author) Most pansharpening methods refer to the fusion of original low-resolution multispectral (MS) and high-resolution panchromatic (PAN) images acquired simultaneously over the same area. Owing to its good robustness, multiresolution analysis (MRA) has become one of the important categories of pansharpening methods. However, when only MS and PAN images acquired at different times are available, the fusion results of current MRA methods are often not ideal, because they fail to effectively analyze the multitemporal misalignments between MS and PAN images from different times. To solve this issue, an MRA pansharpening method based on a variation factor is proposed for MS and PAN images from different times. An MRA pansharpening model based on dual-scale regression is first established; the variation factor is then introduced to effectively analyze the multitemporal misalignments using the alternating direction method of multipliers (ADMM), yielding the final fusion results. Experiments with synthetic and real datasets show that the proposed method exhibits significant performance improvement over traditional pansharpening methods as well as state-of-the-art MRA methods. Visual comparisons demonstrate that the variation factor brings encouraging improvements in compensating multitemporal misalignments of ground objects and advances pansharpening applications for MS and PAN images acquired at different times.
Record number: A2023-184
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2023.3252001
Online: https://doi.org/10.1109/TGRS.2023.3252001
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=102956
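As background to the abstract above: MRA pansharpening generally injects the high-frequency detail of the PAN image into the upsampled MS bands. A minimal sketch of that generic detail-injection baseline follows — it is not the authors' variation-factor method, and the function and parameter names are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def mra_pansharpen(ms, pan, ratio=4, sigma=2.0):
    """Generic MRA detail-injection pansharpening sketch.

    ms  : (H/ratio, W/ratio, B) low-resolution multispectral image
    pan : (H, W) high-resolution panchromatic image
    """
    # Upsample the MS cube to the PAN grid (bilinear).
    ms_up = zoom(ms, (ratio, ratio, 1), order=1)
    # Low-pass PAN approximates its coarse-scale component;
    # the residual is the spatial detail to inject.
    pan_low = gaussian_filter(pan, sigma)
    details = pan - pan_low
    # Band-wise injection gain from covariance with the low-pass PAN.
    out = np.empty_like(ms_up, dtype=float)
    for b in range(ms_up.shape[-1]):
        band = ms_up[..., b].astype(float)
        cov = np.cov(band.ravel(), pan_low.ravel())
        g = cov[0, 1] / (cov[1, 1] + 1e-12)
        out[..., b] = band + g * details
    return out
```

The article's contribution replaces the fixed coarse-scale relation above with a dual-scale regression model and a variation factor estimated via ADMM to absorb multitemporal misalignments; the sketch only shows the shared MRA skeleton.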
in IEEE Transactions on geoscience and remote sensing > vol 61 n° 3 (March 2023). - no. 5401217 [article]

A unified attention paradigm for hyperspectral image classification / Qian Liu in IEEE Transactions on geoscience and remote sensing, vol 61 n° 3 (March 2023)
[article]
Title: A unified attention paradigm for hyperspectral image classification
Document type: Article/Communication
Authors: Qian Liu; Zebin Wu; Yang Xu; et al.
Year of publication: 2023
Article page(s): no. 5506316
General note: bibliography
Language: English (eng)
Descriptors: [IGN subject headings] optical image processing
[IGN terms] attention (machine learning)
[IGN terms] convolutional neural network classification
[IGN terms] feature extraction
[IGN terms] hyperspectral image
[IGN terms] classification accuracy
[IGN terms] support vector machine
Abstract: (author) Attention mechanisms improve classification accuracy by enhancing salient information in hyperspectral images (HSIs). However, existing HSI attention models are driven by advanced achievements of computer vision and are not able to fully exploit the spectral–spatial structure prior of HSIs or effectively refine features from a global perspective. In this article, we propose a unified attention paradigm (UAP) that defines the attention mechanism as a general three-stage process: optimizing feature representations, strengthening information interaction, and emphasizing meaningful information. We also design a novel efficient spectral–spatial attention module (ESSAM) under this paradigm, which adaptively adjusts feature responses along the spectral and spatial dimensions at an extremely low parameter cost. Specifically, we construct a parameter-free spectral attention block that employs multiscale structured encodings and similarity calculations to perform global cross-channel interactions, and a memory-enhanced spatial attention block that captures key semantics of images stored in a learnable memory unit and models global spatial relationships by constructing semantic-to-pixel dependencies. ESSAM takes full account of the spatial distribution and low-dimensional characteristics of HSIs, with better interpretability and lower complexity. We develop a dense convolutional network based on the efficient spectral–spatial attention network (ESSAN) and experiment on three real hyperspectral datasets. The experimental results demonstrate that the proposed ESSAM brings higher accuracy improvements than advanced attention models.
Record number: A2023-185
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2023.3257321
Online publication date: 15/12/2023
Online: https://doi.org/10.1109/TGRS.2023.3257321
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=102957
in IEEE Transactions on geoscience and remote sensing > vol 61 n° 3 (March 2023). - no. 5506316 [article]
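As background to the abstract above: a parameter-free spectral attention step can be built purely from cross-band similarity statistics, with no learned weights. The following is a minimal generic sketch of that idea — it is not the ESSAM module described in the article, and the function name and weighting scheme are illustrative:

```python
import numpy as np

def spectral_attention(x):
    """Parameter-free spectral (channel) attention sketch: each band is
    reweighted by a softmax over its mean cosine similarity to all other
    bands. Generic illustration only, not the article's ESSAM module.

    x : (H, W, B) feature cube
    """
    h, w, b = x.shape
    flat = x.reshape(-1, b).astype(float)
    flat = flat - flat.mean(axis=0)              # center each band
    norms = np.linalg.norm(flat, axis=0) + 1e-12
    unit = flat / norms
    sim = unit.T @ unit                          # (B, B) cross-band cosine similarity
    score = sim.mean(axis=1)                     # per-band salience
    wts = np.exp(score - score.max())
    wts /= wts.sum()                             # softmax attention weights
    # Residual re-weighting; scaling by B keeps the mean gain near 2x.
    return x * (1.0 + b * wts)
```

The point of the sketch is the "similarity calculations for global cross-channel interaction" idea from the abstract: no parameters are introduced, so the attention cost is independent of network size.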