Author details
Author: V.S. Martins
Available documents by this author (1)



Deep learning high resolution burned area mapping by transfer learning from Landsat-8 to PlanetScope / V.S. Martins in Remote sensing of environment, vol 280 (October 2022)
Title: Deep learning high resolution burned area mapping by transfer learning from Landsat-8 to PlanetScope
Document type: Article/Communication
Authors: V.S. Martins, Author; D.P. Roy, Author; H. Huang, Author; et al.
Publication year: 2022
Pages: article no. 113203
General note: bibliography
Language: English (eng)
Descriptors: [Vedettes matières IGN] Optical image processing
[Termes IGN] Africa (political geography)
[Termes IGN] deep learning
[Termes IGN] thematic map
[Termes IGN] automatic mapping
[Termes IGN] radiometric correction
[Termes IGN] training data (machine learning)
[Termes IGN] tropical forest
[Termes IGN] Landsat-OLI image
[Termes IGN] PlanetScope image
[Termes IGN] fire
[Termes IGN] classification accuracy
[Termes IGN] regression
[Termes IGN] savanna

Abstract: (author) High spatial resolution commercial satellite data provide new opportunities for terrestrial monitoring. The recent availability of near-daily 3 m observations provided by the PlanetScope constellation enables mapping of small and spatially fragmented burns that are not detected at coarser spatial resolution. This study demonstrates, for the first time, the potential for automated PlanetScope 3 m burned area mapping. The PlanetScope sensors have no onboard calibration or short-wave infrared bands, and have variable overpass times, making them challenging to use for large-area, automated burned area mapping. To help overcome these issues, a U-Net deep learning algorithm was developed to classify burned areas from two-date PlanetScope 3 m image pairs acquired at the same location. The deep learning approach, unlike conventional burned area mapping algorithms, is applied to image spatial subsets rather than to single pixels, and so incorporates spatial as well as spectral information. Deep learning requires large amounts of training data. Consequently, transfer learning was undertaken using pre-existing Landsat-8 derived burned area reference data to train the U-Net, which was then refined with a smaller set of PlanetScope training data. Results across Africa considering 659 PlanetScope radiometrically normalized image pairs sensed one day apart in 2019 are presented. The U-Net was first trained with different numbers of randomly selected 256 × 256 30 m pixel patches extracted from 92 pre-existing Landsat-8 burned area reference data sets defined for 2014 and 2015. The U-Net trained with 300,000 Landsat patches provided about 13% 30 m burn omission and commission errors with respect to 65,000 independent 30 m evaluation patches. The U-Net was then refined by training on 5,000 256 × 256 3 m patches extracted from independently interpreted PlanetScope burned area reference data. Qualitatively, the refined U-Net was able to more precisely delineate 3 m burn boundaries, including the interiors of unburned areas, and to better classify "faint" burned areas indicative of low combustion completeness and/or sparse burns. The refined U-Net 3 m classification accuracy was assessed with respect to 20 independently interpreted PlanetScope burned area reference data sets, composed of 339.4 million 3 m pixels, with low 12.29% commission and 12.09% omission errors. The dependency of the U-Net classification accuracy on the burned area proportion within 3 m pixel 256 × 256 patches was also examined, and patches …

Record number: A2022-774
Author affiliation: not IGN
Theme: IMAGERY
Nature: Article
DOI: 10.1016/j.rse.2022.113203
Online publication date: 08/08/2022
Online: https://doi.org/10.1016/j.rse.2022.113203
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=101802
In: Remote sensing of environment > vol 280 (October 2022). - article no. 113203
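
The abstract outlines a two-stage transfer-learning pipeline: pre-train a U-Net on 256 × 256 Landsat-8 reference patches, then refine it on a smaller set of PlanetScope patches. Below is a minimal sketch of that setup, assuming PyTorch; the network depth, the 8-channel input (4 bands × 2 dates), the learning rates, and the synthetic `fake_batches` stand-in loaders are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    # Two 3x3 convolutions with ReLU: the standard U-Net building block.
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    # Small two-level U-Net; the input is a two-date image pair stacked on
    # the channel axis (assumed 4 spectral bands x 2 dates = 8 channels).
    def __init__(self, in_ch=8):
        super().__init__()
        self.enc1, self.enc2 = conv_block(in_ch, 64), conv_block(64, 128)
        self.bottleneck = conv_block(128, 256)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(256, 128, 2, stride=2)
        self.dec2 = conv_block(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec1 = conv_block(128, 64)
        self.head = nn.Conv2d(64, 1, 1)  # per-pixel burned/unburned logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

def train(model, batches, epochs, lr):
    # Per-pixel binary cross-entropy against the burned area mask.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for pair, mask in batches:
            opt.zero_grad()
            loss_fn(model(pair), mask).backward()
            opt.step()

# Synthetic stand-ins for the real patch loaders (illustrative only);
# 64 px patches keep the demo light, the paper uses 256 x 256.
def fake_batches(n, batch=2, size=64):
    return [(torch.randn(batch, 8, size, size),
             torch.rand(batch, 1, size, size).round()) for _ in range(n)]

model = UNet()
# Stage 1: pre-train on (stand-ins for) 30 m Landsat-8 reference patches.
train(model, fake_batches(8), epochs=1, lr=1e-3)
# Stage 2: refine on (stand-ins for) the smaller 3 m PlanetScope patch set,
# at a lower learning rate so pre-trained weights are only gently adjusted.
train(model, fake_batches(2), epochs=1, lr=1e-4)
```

Refining at a reduced learning rate is one common way to adapt pre-trained weights to the higher-resolution imagery without overwriting what was learned from the much larger Landsat training set.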