[issue or bulletin]
is an issue of IEEE Transactions on geoscience and remote sensing / IEEE Geoscience and remote sensing society (United States) (1986 -)
Contents


Hourly rainfall forecast model using supervised learning algorithm / Qingzhi Zhao in IEEE Transactions on geoscience and remote sensing, vol 60 n° 1 (January 2022)
[article]
Title: Hourly rainfall forecast model using supervised learning algorithm
Document type: Article/Communication
Authors: Qingzhi Zhao, Author; Yang Liu, Author; Wanqiang Yao, Author; et al., Author
Publication year: 2022
Article (pagination): n° 4100509
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Space geodesy applications
[IGN terms] autocorrelation
[IGN terms] supervised classification
[IGN terms] support vector machine classification
[IGN terms] GNSS data
[IGN terms] hour
[IGN terms] simulation model
[IGN terms] meteorological model
[IGN terms] precipitation
[IGN terms] time series
[IGN terms] GNSS station
[IGN terms] Taiwan
[IGN terms] water vapour
Abstract: (author) Previous studies on short-term rainfall forecast using precipitable water vapor (PWV) and meteorological parameters mainly focus on rain occurrence, while the rainfall forecast itself is rarely investigated. Therefore, an hourly rainfall forecast (HRF) model based on a supervised learning algorithm is proposed in this study to predict rainfall with high accuracy and time resolution. Hourly PWV derived from the Global Navigation Satellite System (GNSS) and temperature data are used as input parameters of the HRF model, and a support vector machine is introduced to train the proposed model. In addition, this model also considers the time autocorrelation of rainfall in the previous epoch. Hourly PWV data of 21 GNSS stations and collocated meteorological parameters (temperature and rainfall) over five years in Taiwan Province are selected to validate the proposed model. Internal and external validation experiments were performed for cases of slight, moderate, and heavy rainfall. The average root-mean-square error (RMSE) and relative RMSE of the proposed HRF model are 1.36/1.39 mm/h and 1.00/0.67, respectively. In addition, the proposed HRF model is compared with similar works in previous studies. The comparison reveals the satisfactory performance and superiority of the proposed HRF model in terms of time resolution and forecast accuracy.
Record number: A2022-024
Author affiliation: non IGN
Theme: POSITIONING
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2021.3054582
Online publication date: 09/02/2021
Online: https://doi.org/10.1109/TGRS.2021.3054582
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99253
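The abstract above pairs GNSS-derived PWV and temperature with the previous hour's rainfall (the time-autocorrelation term) as inputs to a support vector machine. A minimal sketch of that set-up, assuming synthetic data and scikit-learn's SVR in place of the authors' exact configuration; every parameter value here is illustrative:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 500

# Synthetic hourly inputs: PWV (mm), temperature (°C), rain in the previous hour (mm/h)
pwv = rng.uniform(20, 70, n)
temp = rng.uniform(10, 35, n)
prev_rain = rng.exponential(0.5, n)

# Toy target: rainfall loosely driven by PWV and its own previous value
rain = np.maximum(0, 0.05 * (pwv - 40) + 0.6 * prev_rain + rng.normal(0, 0.3, n))

X = np.column_stack([pwv, temp, prev_rain])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:400], rain[:400])            # train on the first 400 hours

pred = np.maximum(0, model.predict(X[400:]))   # rainfall cannot be negative
rmse = float(np.sqrt(np.mean((pred - rain[400:]) ** 2)))
```

The clipping to non-negative values and the 400/100 train/test split are choices made for this sketch, not details from the paper.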
in IEEE Transactions on geoscience and remote sensing > vol 60 n° 1 (January 2022) . - n° 4100509
[article]

Detection and biomass estimation of phaeocystis globosa blooms off Southern China from UAV-based hyperspectral measurements / Xue Li in IEEE Transactions on geoscience and remote sensing, vol 60 n° 1 (January 2022)
[article]
Title: Detection and biomass estimation of phaeocystis globosa blooms off Southern China from UAV-based hyperspectral measurements
Document type: Article/Communication
Authors: Xue Li, Author; Shaoling Shang, Author; Zhongping Lee, Author; et al., Author
Publication year: 2022
Article (pagination): n° 4200513
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Remote sensing applications
[IGN terms] algae
[IGN terms] biomass
[IGN terms] thematic mapping
[IGN terms] China
[IGN terms] chlorophyll
[IGN terms] ocean colour
[IGN terms] invasive alien species
[IGN terms] UAV-acquired image
[IGN terms] hyperspectral image
[IGN terms] plankton
[IGN terms] reflectance
Abstract: (author) Phaeocystis globosa (P. globosa) is a unique causative species of harmful algal blooms, which can form gelatinous colonies. We, for the first time, used unmanned aerial vehicle (UAV) measurements to identify P. globosa blooms and to quantify their biomass. Based on in situ measured remote sensing reflectance (Rrs), it is found that, for P. globosa blooms, the maximum of the second derivative (d²Rrs/dλ²) of Rrs(λ) in the 460–480-nm domain is beyond 466 nm. An analysis of the absorption properties from algal cultures suggested that this feature comes from the absorption of chlorophyll c3 (Chl-c3) around 466 nm, a prominent feature of P. globosa. This position of the d²Rrs/dλ² maximum was, thus, selected as the criterion for P. globosa identification. The spatial extent of P. globosa blooms in two bays off southern China was then mapped by applying the criterion to UAV-measured Rrs. Twelve out of 16 UAV and in situ match-up stations were consistently identified as dominated by P. globosa, indicating an accuracy of 75%. Furthermore, using localized empirical models, chlorophyll a (Chl-a) concentration and colony numbers of P. globosa were estimated from UAV-derived Rrs, where P. globosa colonies were found in a range of ~3–37 gel matrix/L, indicating the occurrence of weak to moderate P. globosa blooms during the surveys. The promising results suggest a high potential for detection and quantification of P. globosa blooms in near-shore bays or harbors using UAV-based hyperspectral remote sensing, where conventional ocean color satellite remote sensing runs into difficulties.
Record number: A2022-025
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2021.3051466
Online publication date: 26/01/2021
Online: https://doi.org/10.1109/TGRS.2021.3051466
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99254
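The identification criterion in the abstract reduces to a simple test: locate the maximum of the second derivative of Rrs(λ) within the 460–480 nm window and check whether it lies beyond 466 nm. A sketch with synthetic spectra; the Gaussian absorption dips below are illustrative stand-ins, not measured data:

```python
import numpy as np

def d2_peak_wavelength(wl, rrs, lo=460.0, hi=480.0):
    """Wavelength (nm) of the maximum second derivative of Rrs within [lo, hi]."""
    d2 = np.gradient(np.gradient(rrs, wl), wl)
    mask = (wl >= lo) & (wl <= hi)
    return float(wl[mask][np.argmax(d2[mask])])

def is_p_globosa(wl, rrs, threshold_nm=466.0):
    """The abstract's criterion: the d2 maximum in 460-480 nm lies beyond 466 nm."""
    return d2_peak_wavelength(wl, rrs) > threshold_nm

wl = np.arange(400.0, 701.0, 1.0)
baseline = np.full_like(wl, 0.01)

# Spectrum with an absorption dip centred at 470 nm (bloom-like: d2 peak > 466 nm)
# versus one centred at 462 nm (non-bloom)
bloom = baseline - 0.004 * np.exp(-(wl - 470.0) ** 2 / (2 * 5.0 ** 2))
other = baseline - 0.004 * np.exp(-(wl - 462.0) ** 2 / (2 * 5.0 ** 2))
```

An absorption dip in reflectance produces a local maximum of the second derivative at the dip centre, which is why the criterion reads off the peak position of d²Rrs/dλ².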
in IEEE Transactions on geoscience and remote sensing > vol 60 n° 1 (January 2022) . - n° 4200513
[article]

A comparison of linear-mode and single-photon airborne LiDAR in species-specific forest inventories / Janne Raty in IEEE Transactions on geoscience and remote sensing, vol 60 n° 1 (January 2022)
[article]
Title: A comparison of linear-mode and single-photon airborne LiDAR in species-specific forest inventories
Document type: Article/Communication
Authors: Janne Raty, Author; Petri Varvia, Author; Lauri Korhonen, Author; et al., Author
Publication year: 2022
Article (pagination): n° 4401514
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Lasergrammetry
[IGN terms] altitude
[IGN terms] comparative analysis
[IGN terms] linear sensor
[IGN terms] vegetation map
[IGN terms] lidar data
[IGN terms] 3D geolocated data
[IGN terms] Finland
[IGN terms] boreal forest
[IGN terms] Leica instrumentation
[IGN terms] Riegl instrumentation
[IGN terms] foreign forest inventory (data)
[IGN terms] photon
[IGN terms] Pinophyta
[IGN terms] point cloud
[IGN terms] laser signal
Abstract: (author) Single-photon airborne light detection and ranging (LiDAR) systems provide high-density data from high flight altitudes. We compared single-photon and linear-mode airborne LiDAR for the prediction of species-specific volumes in boreal coniferous-dominated forests. The LiDAR data sets were acquired at different flight altitudes using Leica SPL100 (single-photon, 17 points·m⁻²), Riegl VQ-1560i (linear-mode, 11 points·m⁻²), and Leica ALS60 (linear-mode, 0.6 points·m⁻²) LiDAR systems. Volumes were predicted at the plot level using Gaussian process regression with predictor variables extracted from the LiDAR data sets and aerial images. Our findings showed that the Leica SPL100 produced a greater mean root-mean-squared error (RMSE) value (41.7 m³·ha⁻¹) than the Leica ALS60 (39.3 m³·ha⁻¹) in the prediction of species-specific volumes. Correspondingly, the Riegl VQ-1560i (mean RMSE = 33.0 m³·ha⁻¹) outperformed both the Leica ALS60 and the Leica SPL100. We found that the cumulative distributions of the first echo heights >1.3 m were rather similar among the data sets, whereas the last echo distributions showed larger differences. We conclude that the Leica SPL100 data set is suitable for area-based LiDAR inventory by tree species, although the prediction errors are greater than with data obtained using a modern linear-mode LiDAR such as the Riegl VQ-1560i.
Record number: A2022-026
Author affiliation: non IGN
Theme: FOREST/IMAGERY
Nature: Article
DOI: 10.1109/TGRS.2021.3060670
Online publication date: 04/03/2021
Online: https://doi.org/10.1109/TGRS.2021.3060670
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99257
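The sensor comparison above rests on plot-level RMSE of predicted versus observed stem volumes (plus a relative variant normalised by the observed mean). A minimal sketch of those two metrics; the plot volumes below are invented for illustration, not the paper's data:

```python
import numpy as np

def rmse(pred, obs):
    """Root-mean-squared error between predicted and observed plot volumes."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

def relative_rmse(pred, obs):
    """RMSE divided by the mean observed volume (dimensionless)."""
    return rmse(pred, obs) / float(np.mean(obs))

# Illustrative plot-level volumes (m³·ha⁻¹): observations and two prediction sets
obs = np.array([120.0, 200.0, 80.0, 150.0, 260.0])
pred_a = np.array([160.0, 170.0, 120.0, 130.0, 300.0])   # noisier prediction set
pred_b = np.array([130.0, 190.0, 95.0, 145.0, 275.0])    # tighter prediction set
```

Comparing `rmse(pred_a, obs)` with `rmse(pred_b, obs)` mirrors how the paper ranks the SPL100 against the linear-mode systems.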
in IEEE Transactions on geoscience and remote sensing > vol 60 n° 1 (January 2022) . - n° 4401514
[article]

Deep image translation with an affinity-based change prior for unsupervised multimodal change detection / Luigi Tommaso Luppino in IEEE Transactions on geoscience and remote sensing, vol 60 n° 1 (January 2022)
[article]
Title: Deep image translation with an affinity-based change prior for unsupervised multimodal change detection
Document type: Article/Communication
Authors: Luigi Tommaso Luppino, Author; Michael Kampffmeyer, Author; Filippo Maria Bianchi, Author; et al., Author
Publication year: 2022
Article (pagination): n° 4700422
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Mixed image processing
[IGN terms] comparative analysis
[IGN terms] network architecture
[IGN terms] unsupervised classification
[IGN terms] convolutional neural network classification
[IGN terms] change detection
[IGN terms] feature extraction
[IGN terms] generative adversarial network
Abstract: (author) Image translation with convolutional neural networks has recently been used as an approach to multimodal change detection. Existing approaches train the networks by exploiting supervised information on the change areas, which, however, is not always available. A main challenge in the unsupervised problem setting is to avoid that change pixels affect the learning of the translation function. We propose two new network architectures trained with loss functions weighted by priors that reduce the impact of change pixels on the learning objective. The change prior is derived in an unsupervised fashion from relational pixel information captured by domain-specific affinity matrices. Specifically, we use the vertex degrees associated with an absolute affinity difference matrix and demonstrate their utility in combination with cycle consistency and adversarial training. The proposed neural networks are compared with state-of-the-art algorithms. Experiments conducted on three real data sets show the effectiveness of our methodology.
Record number: A2022-027
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2021.3056196
Online publication date: 17/02/2021
Online: https://doi.org/10.1109/TGRS.2021.3056196
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99263
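The affinity-based change prior described above can be sketched in a toy 1-D setting: build an affinity matrix per modality, take the absolute element-wise difference, and use the vertex degrees (row sums) as per-pixel change scores. The Gaussian kernel, the flattened 1-D "patch", and the identical-modality simplification are all assumptions of this sketch; the paper works with image patches and domain-specific affinities:

```python
import numpy as np

def affinity(x, sigma=1.0):
    """Gaussian affinity matrix between all pixel pairs of a flattened patch."""
    d = x[:, None] - x[None, :]
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

def change_prior(x, y, sigma=1.0):
    """Vertex degrees of the absolute affinity-difference matrix, scaled to [0, 1].
    High values flag pixels whose relational structure differs across modalities."""
    diff = np.abs(affinity(x, sigma) - affinity(y, sigma))
    deg = diff.sum(axis=1)
    return deg / deg.max()

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 64)   # patch from modality A (flattened)
y = x.copy()                   # same scene seen by modality B (toy: identical response)
y[:8] += 4.0                   # simulate change in the first 8 pixels

prior = change_prior(x, y)
```

In the paper these scores weight the translation losses so that changed pixels contribute less to learning; here they simply rank pixels by how anomalous their affinities are.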
in IEEE Transactions on geoscience and remote sensing > vol 60 n° 1 (January 2022) . - n° 4700422
[article]

A novel unmixing-based hypersharpening method via convolutional neural network / Xiaochen Lu in IEEE Transactions on geoscience and remote sensing, vol 60 n° 1 (January 2022)
[article]
Title: A novel unmixing-based hypersharpening method via convolutional neural network
Document type: Article/Communication
Authors: Xiaochen Lu, Author; Tong Li, Author; Junping Zhang, Author; et al., Author
Publication year: 2022
Article (pagination): n° 5503614
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] spectral mixture analysis
[IGN terms] convolutional neural network classification
[IGN terms] image fusion
[IGN terms] hyperspectral image
[IGN terms] multiband image
[IGN terms] pansharpening (image fusion)
[IGN terms] geometric resolution
[IGN terms] spectral resolution
Abstract: (author) Hypersharpening (namely, hyperspectral (HS) and multispectral (MS) image fusion) aims at enhancing the spatial resolution of an HS image via an auxiliary higher-resolution MS image. Numerous hypersharpening methods have been proposed, among which the unmixing-based approaches have been widely researched and have demonstrated their effectiveness in terms of spectral fidelity. However, existing unmixing-based fusion methods essentially employ mathematical techniques to solve the spectral mixture model, without taking full advantage of the collaborative spatial–spectral information that is usually helpful for improving abundance estimation. To overcome this drawback, in this article, a novel unmixing-based HS and MS image fusion method, via a convolutional neural network (CNN), is proposed to promote spectral fidelity. The main idea of this work is to use a CNN to fully explore the spatial and spectral information of both HS and MS images simultaneously, thereby enhancing the accuracy of the estimated abundance maps. Experiments on four simulated and real remote sensing data sets demonstrate that the proposed method is beneficial to the spectral fidelity of the fused images compared with some state-of-the-art algorithms. Meanwhile, it is also easy to implement and has a certain advantage in running time.
Record number: A2022-028
Author affiliation: non IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2021.3063105
Online publication date: 22/03/2021
Online: https://doi.org/10.1109/TGRS.2021.3063105
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99264
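The paper's contribution is the CNN abundance estimator; as context, the classical unmixing-based fusion pipeline it improves on can be sketched with a plain least-squares unmixing step: estimate per-pixel abundances from the high-resolution MS image, then recombine them with the HS endmember spectra. The endmembers, the band-averaging spectral response, and all dimensions below are synthetic assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n_end, n_hs_bands, n_ms_bands, n_pixels = 3, 50, 4, 100

# Illustrative endmember spectra in the HS domain
E_hs = np.abs(rng.normal(0.5, 0.2, (n_end, n_hs_bands)))

# Band-averaging spectral response mapping HS bands into MS bands
R = np.zeros((n_ms_bands, n_hs_bands))
for b in range(n_ms_bands):
    R[b, b * 12 : (b + 1) * 12 + 2] = 1.0
R /= R.sum(axis=1, keepdims=True)
E_ms = E_hs @ R.T                       # endmembers seen by the MS sensor

# True abundances (each pixel's fractions sum to one) and simulated MS pixels
A = rng.dirichlet(np.ones(n_end), size=n_pixels)
ms = A @ E_ms

# Unmixing step: estimate abundances per MS pixel by least squares
A_hat, *_ = np.linalg.lstsq(E_ms.T, ms.T, rcond=None)
A_hat = A_hat.T

# Fusion step: recombine estimated abundances with the HS endmembers
fused = A_hat @ E_hs                    # high-resolution HS reconstruction
```

In the proposed method the least-squares step is replaced by a CNN that exploits joint spatial–spectral context; the recombination step stays the same linear mixing model.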
in IEEE Transactions on geoscience and remote sensing > vol 60 n° 1 (January 2022) . - n° 5503614
[article]