Descriptor
IGN terms > computer science > artificial intelligence > machine learning > loss function
loss function
Documents available in this category (2)



Deep learning for the detection of early signs for forest damage based on satellite imagery / Dennis Wittich in ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol V-2-2022 (2022 edition)
[article]
Title: Deep learning for the detection of early signs for forest damage based on satellite imagery
Document type: Article/Communication
Authors: Dennis Wittich; Franz Rottensteiner; Mirjana Voelsen; Christian Heipke; Sönke Müller
Year of publication: 2022
Pages: pp. 307-315
General note: bibliography
Language: English (eng)
Descriptors: [IGN subject headings] Remote sensing applications
[IGN terms] deep learning
[IGN terms] classification by convolutional neural network
[IGN terms] flora degradation
[IGN terms] forest damage caused by natural factors
[IGN terms] loss function
[IGN terms] Sentinel-MSI image
[IGN terms] regression
[IGN terms] time series
[IGN terms] forest monitoring
Abstract: (author) We present an approach for detecting early signs for upcoming forest damages by training a Convolutional Neural Network (CNN) for the pixel-wise prediction of the remaining life-time (RLT) of trees in forests based on Sentinel-2 imagery. We focus on a scenario in which reference data are only available for a related task, namely for a bi-temporal pixel-wise classification of forest degradation. This reference is used to train a CNN for the pixel-wise prediction of forest degradation. In this context, we propose a new sub-sampling-based approach for compensating the effects of a heavy class imbalance in the training data. Using the resulting classification model, we predict semi-labels for images of a Sentinel-2 time series, from which training data for a CNN designed to regress the RLT can be derived after some label cleansing. However, due to data gaps in the time series, e.g. caused by clouds, only intervals can be derived for the target variable to be regressed, and for some training pixels one of the interval limits may even be unknown. Consequently, we propose a new loss function for training a CNN for regressing the RLT that only requires the known interval limits. The method is evaluated on a data set in Germany, covering a time-span of 5 years. We show that the proposed sub-sampling strategy for dealing with strong label imbalance when training the classifier significantly reduces the training time compared to other approaches. We further show that our model predicts the RLT with a maximum error of two months for 80% of the forest pixels that die within one year from the acquisition date of the Sentinel-2 image.
(An illustrative code sketch of an interval-limit regression loss of this kind is given after this record.)
Record number: A2022-432
Author affiliation: non IGN
Theme: FORET/IMAGERIE/INFORMATIQUE
Nature: Article
DOI: 10.5194/isprs-annals-V-2-2022-307-2022
Online publication date: 17/05/2022
Online: https://doi.org/10.5194/isprs-annals-V-2-2022-307-2022
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=100738
in ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences > vol V-2-2022 (2022 edition). - pp. 307-315
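The interval-based loss described in the abstract above (regressing the remaining life-time when only interval limits are known, possibly with one limit missing) can be made concrete with a short sketch. This is not the authors' formulation, but a minimal PyTorch example under stated assumptions: missing interval limits are encoded as NaN, a squared-hinge penalty is applied only when the prediction leaves the known part of the interval, and the function name interval_regression_loss is hypothetical.

```python
import torch

def interval_regression_loss(pred, lower, upper):
    """Penalize predictions that fall outside a (partially) known interval.

    pred, lower, upper: tensors of identical shape (e.g. per-pixel RLT in months).
    A NaN in `lower` or `upper` marks an unknown interval limit, which then
    contributes no penalty. Illustrative squared-hinge form, not the paper's loss.
    """
    low_known = ~torch.isnan(lower)
    up_known = ~torch.isnan(upper)

    # Penalty for predicting below a known lower limit.
    low_pen = torch.relu(torch.nan_to_num(lower) - pred) ** 2 * low_known.float()
    # Penalty for predicting above a known upper limit.
    up_pen = torch.relu(pred - torch.nan_to_num(upper)) ** 2 * up_known.float()

    # Average over pixels that carry at least one known limit.
    n_valid = (low_known | up_known).float().sum().clamp(min=1.0)
    return (low_pen + up_pen).sum() / n_valid
```

Applied per pixel to the CNN output, a loss of this shape only needs whichever interval limits the (possibly cloud-gapped) Sentinel-2 time series actually provides.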
Uncertainty estimation for stereo matching based on evidential deep learning / Chen Wang in Pattern recognition, vol 124 (April 2022)
[article]
Title: Uncertainty estimation for stereo matching based on evidential deep learning
Document type: Article/Communication
Authors: Chen Wang; Xiang Wang; Jiawei Zhang; et al.
Year of publication: 2022
Pages: n° 108498
General note: bibliography
Language: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] image matching
[IGN terms] deep learning
[IGN terms] Gaussian distribution
[IGN terms] loss function
[IGN terms] data smoothing
[IGN terms] uncertainty model
[IGN terms] image reconstruction
Abstract: (author) Although deep learning-based stereo matching approaches have achieved excellent performance in recent years, it is still a non-trivial task to estimate the uncertainty of the produced disparity map. In this paper, we propose a novel approach to estimate both aleatoric and epistemic uncertainties for stereo matching in an end-to-end way. We introduce an evidential distribution, named Normal Inverse-Gamma (NIG) distribution, whose parameters can be used to calculate the uncertainty. Instead of directly regressed from aggregated features, the uncertainty parameters are predicted for each potential disparity and then averaged via the guidance of matching probability distribution. Furthermore, considering the sparsity of ground truth in real scene datasets, we design two additional losses. The first one tries to enlarge uncertainty on incorrect predictions, so uncertainty becomes more sensitive to erroneous regions. The second one enforces the smoothness of the uncertainty in the regions with smooth disparity. Most stereo matching models, such as PSM-Net, GA-Net, and AA-Net, can be easily integrated with our approach. Experiments on multiple benchmark datasets show that our method improves stereo matching results. We prove that both aleatoric and epistemic uncertainties are well-calibrated with incorrect predictions. Particularly, our method can capture increased epistemic uncertainty on out-of-distribution data, making it effective to prevent a system from potential fatal consequences. Code is available at https://github.com/Dawnstar8411/StereoMatching-Uncertainty.
(An illustrative sketch of the NIG-based uncertainty computation is given after this record.)
Record number: A2022-198
Author affiliation: non IGN
Theme: IMAGERIE/INFORMATIQUE
Nature: Article
DOI: 10.1016/j.patcog.2021.108498
Online publication date: 23/12/2021
Online: https://doi.org/10.1016/j.patcog.2021.108498
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99992
in Pattern recognition > vol 124 (April 2022). - n° 108498
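The link between the Normal Inverse-Gamma (NIG) parameters and the two kinds of uncertainty mentioned in the abstract above can be illustrated with a short sketch. This is a minimal PyTorch example under stated assumptions: it uses the standard deep-evidential-regression identities for the aleatoric and epistemic variances, assumes tensors shaped batch × height × width, and the smoothness penalty with its hypothetical threshold tau is only an illustration of the idea, not the implementation from the paper or its repository.

```python
import torch

def nig_uncertainties(gamma, nu, alpha, beta):
    """Aleatoric and epistemic uncertainty of a Normal Inverse-Gamma distribution.

    gamma is the predicted disparity (mean) and is not needed for the variances;
    nu > 0, alpha > 1, beta > 0 are the remaining NIG parameters.
    """
    aleatoric = beta / (alpha - 1.0)           # E[sigma^2]: data noise
    epistemic = beta / (nu * (alpha - 1.0))    # Var[mu]: model uncertainty
    return aleatoric, epistemic

def uncertainty_smoothness(uncertainty, disparity, tau=1.0):
    """Illustrative penalty: keep uncertainty smooth where disparity is smooth.

    uncertainty, disparity: tensors of shape (B, H, W); tau is a hypothetical
    threshold below which neighbouring disparities count as 'smooth'.
    """
    du = (uncertainty[:, :, 1:] - uncertainty[:, :, :-1]).abs()
    dd = (disparity[:, :, 1:] - disparity[:, :, :-1]).abs()
    smooth = (dd < tau).float()
    return (du * smooth).mean()
```

Per the abstract, a separate term would additionally enlarge the uncertainty on incorrect predictions; only the smoothness side is sketched here.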