Author details
Author: Hannes Feilhauer
Available documents by this author (2)
Mapping land-use intensity of grasslands in Germany with machine learning and Sentinel-2 time series / Maximilian Lange in Remote sensing of environment, vol 277 (August 2022)
[article]
Title: Mapping land-use intensity of grasslands in Germany with machine learning and Sentinel-2 time series
Document type: Article/Communication
Authors: Maximilian Lange, Author; Hannes Feilhauer, Author; Ingolf Kühn, Author; et al.
Publication year: 2022
Article page(s): n° 112888
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Remote sensing applications
[IGN terms] Germany
[IGN terms] machine learning
[IGN terms] spectral band
[IGN terms] land-use map
[IGN terms] classification by convolutional neural network
[IGN terms] data sampling
[IGN terms] Sentinel-MSI imagery
[IGN terms] vegetation index
[IGN terms] grassland
[IGN terms] time series

Abstract: (author) Information on grassland land-use intensity (LUI) is crucial for understanding trends and dynamics in biodiversity, ecosystem functioning, earth system science and environmental monitoring. LUI is a major driver of numerous environmental processes and indicators, such as primary production, nitrogen deposition and resilience to climate extremes. However, large-extent, high-resolution data on grassland LUI are rare. New satellite generations, such as Copernicus Sentinel-2, enable spatially comprehensive detection of the mainly subtle changes induced by land-use intensification thanks to their fine spatial and temporal resolution. We developed a methodology quantifying key parameters of grassland LUI, such as grazing intensity, mowing frequency and fertiliser application, across Germany using Convolutional Neural Networks (CNN) on Sentinel-2 satellite data with 20 m × 20 m spatial resolution. These land-use components were then used to calculate a continuous LUI index. Predictions of LUI and its components were validated using comprehensive in situ grassland management data. A feature contribution analysis using Shapley values substantiates the applicability of the methodology by revealing the high relevance of springtime satellite observations and of spectral bands related to vegetation health and structure. We achieved an overall classification accuracy of up to 66% for grazing intensity, 68% for mowing and 85% for fertilisation, and an r² of 0.82 for the derived LUI index. We evaluated the methodology's robustness with a spatial 3-fold cross-validation, training and predicting on geographically separated regions. Spatial transferability was assessed by delineating the models' area of applicability. The presented methodology enables high-resolution, large-extent mapping of the land-use intensity of grasslands.

Record number: A2022-468
Author affiliation: not IGN
Theme: IMAGERY
Nature: Article
DOI: 10.1016/j.rse.2022.112888
Online publication date: 13/05/2022
Online: https://doi.org/10.1016/j.rse.2022.112888
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=100805
in Remote sensing of environment > vol 277 (August 2022) . - n° 112888
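The following is a minimal illustrative sketch, not the authors' published code: a small 1D CNN classifying a single land-use component (here, a mowing-frequency class) from per-pixel Sentinel-2 time series, combined with a region-grouped 3-fold split in the spirit of the spatial cross-validation described above. The band count, number of dates, class count and all identifiers are assumptions for illustration.

    # Hypothetical sketch (PyTorch + scikit-learn), not the authors' code.
    import torch
    import torch.nn as nn
    from sklearn.model_selection import GroupKFold

    N_BANDS, N_STEPS, N_CLASSES = 10, 36, 4   # assumed: bands, dates, classes

    class TemporalCNN(nn.Module):
        """Convolves over the temporal axis; input shape (batch, bands, time)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(N_BANDS, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),          # pool over time
            )
            self.head = nn.Linear(64, N_CLASSES)

        def forward(self, x):
            return self.head(self.features(x).squeeze(-1))

    model = TemporalCNN()
    x = torch.randn(512, N_BANDS, N_STEPS)         # dummy pixel time series
    y = torch.randint(0, N_CLASSES, (512,))        # dummy component labels
    regions = torch.randint(0, 3, (512,)).numpy()  # dummy region id per pixel

    # Spatial 3-fold CV: each fold trains and tests on disjoint regions
    # (training loop omitted; only the split and a forward pass are shown).
    for train_idx, test_idx in GroupKFold(n_splits=3).split(x, y, groups=regions):
        with torch.no_grad():
            preds = model(x[test_idx]).argmax(dim=1)  # (n_test,)

The accuracies reported in the notice above (66% grazing, 68% mowing, 85% fertilisation) refer to the authors' full pipeline; this sketch only mirrors its overall shape.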
Transfer learning from citizen science photographs enables plant species identification in UAV imagery / Salim Soltani in ISPRS Open Journal of Photogrammetry and Remote Sensing, vol 5 (August 2022)
[article]
Title: Transfer learning from citizen science photographs enables plant species identification in UAV imagery
Document type: Article/Communication
Authors: Salim Soltani, Author; Hannes Feilhauer, Author; Robbert Duker, Author; et al.
Publication year: 2022
Article page(s): n° 100016
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] deep learning
[IGN terms] naturalist database
[IGN terms] classification by convolutional neural network
[IGN terms] spatial distribution
[IGN terms] volunteered geographic information
[IGN terms] plant species
[IGN terms] vegetation filtering
[IGN terms] plant identification
[IGN terms] UAV imagery
[IGN terms] colour orthoimage
[IGN terms] citizen science
[IGN terms] semantic segmentation

Abstract: (author) Accurate information on the spatial distribution of plant species and communities is in high demand in various fields of application, such as nature conservation, forestry, and agriculture. A series of studies has shown that Convolutional Neural Networks (CNNs) accurately predict plant species and communities in high-resolution remote sensing data, in particular data at the centimeter scale acquired with Unoccupied Aerial Vehicles (UAV). However, such tasks often require ample training data, which is commonly generated in the field via geocoded in-situ observations or by labeling remote sensing data through visual interpretation. Both approaches are laborious and can present a critical bottleneck for CNN applications. An alternative source of training data is knowledge of the appearance of plants, in the form of plant photographs from citizen science projects such as the iNaturalist database. Such crowd-sourced plant photographs typically exhibit very different perspectives and great heterogeneity in various aspects, yet the sheer volume of data holds great potential for application to bird's-eye views from remote sensing platforms. Here, we explore the potential of transfer learning from such a crowd-sourced data treasure to the remote sensing context. We investigate, first, whether crowd-sourced plant photographs can be used for CNN training and subsequent mapping of plant species in high-resolution remote sensing imagery. Second, we test whether predictive performance can be increased by a priori selecting photographs whose perspective is more similar to that of the remote sensing data. We tested the proposed approach in two case studies with multiple RGB orthoimages acquired from UAVs, targeting the plant species Fallopia japonica and Portulacaria afra respectively. Our results demonstrate that CNN models trained with heterogeneous, crowd-sourced plant photographs can indeed predict the target species in UAV orthoimages with surprising accuracy. Filtering the crowd-sourced photographs used for training by acquisition properties increased the predictive performance. This study demonstrates that citizen science data can effectively alleviate a common bottleneck for vegetation assessments and provides an example of how the ever-increasing availability of crowd-sourced and big data can be harnessed for remote sensing applications.

Record number: A2022-488
Author affiliation: not IGN
Theme: FOREST/IMAGERY
Nature: Article
DOI: 10.1016/j.ophoto.2022.100016
Online publication date: 23/05/2022
Online: https://doi.org/10.1016/j.ophoto.2022.100016
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=100956
in ISPRS Open Journal of Photogrammetry and Remote Sensing > vol 5 (August 2022) . - n° 100016
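The following is a minimal illustrative sketch, not the authors' published code, of the transfer-learning idea in the abstract above: an ImageNet-pretrained CNN receives a new classification head, would be fine-tuned on crowd-sourced plant photographs (e.g. from iNaturalist), and is then applied tile by tile to a UAV orthoimage. The class count, tile size and all identifiers are assumptions for illustration.

    # Hypothetical sketch (PyTorch/torchvision), not the authors' code.
    import torch
    import torch.nn as nn
    from torchvision import models

    N_CLASSES = 2  # assumed: target species vs. background

    # Pretrained backbone; weights are downloaded on first use.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, N_CLASSES)  # new head
    # ... fine-tuning on labelled citizen-science photographs omitted ...

    @torch.no_grad()
    def predict_tiles(orthoimage: torch.Tensor, tile: int = 224) -> torch.Tensor:
        """Classify non-overlapping tiles of a (3, H, W) orthoimage and
        return a coarse (H // tile, W // tile) map of class indices."""
        model.eval()
        _, h, w = orthoimage.shape
        rows, cols = h // tile, w // tile
        out = torch.zeros(rows, cols, dtype=torch.long)
        for i in range(rows):
            for j in range(cols):
                patch = orthoimage[:, i*tile:(i+1)*tile, j*tile:(j+1)*tile]
                out[i, j] = model(patch.unsqueeze(0)).argmax(dim=1).item()
        return out

    species_map = predict_tiles(torch.randn(3, 896, 896))  # dummy orthoimage

The photograph-filtering step the abstract reports (selecting training photos by acquisition properties so their perspective resembles the bird's-eye view) would happen before fine-tuning and is not shown here.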