Author details
Author: Eero Liski
Available documents written by this author (1)
Comparison of neural networks and k-nearest neighbors methods in forest stand variable estimation using airborne laser data / Andras Balazs in ISPRS Open Journal of Photogrammetry and Remote Sensing, vol 4 (April 2022)
[article]
Title: Comparison of neural networks and k-nearest neighbors methods in forest stand variable estimation using airborne laser data
Document type: Article/Communication
Authors: Andras Balazs, Author; Eero Liski, Author; Sakari Tuominen, Author
Year of publication: 2022
Article pages: n° 100012
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Lasergrammetry
[IGN terms] genetic algorithm
[IGN terms] standing timber
[IGN terms] barycentric classification
[IGN terms] convolutional neural network classification
[IGN terms] covariance
[IGN terms] diameter at breast height
[IGN terms] lidar data
[IGN terms] 3D geolocated data
[IGN terms] Finland
[IGN terms] tree height
[IGN terms] foreign forest inventory (data)
[IGN terms] forest stand
[IGN terms] artificial neural network
[IGN terms] point cloud
[IGN terms] timber volume
Abstract: (author) In the remote sensing of forests, point cloud data from airborne laser scanning contains high-value information for predicting the volume of growing stock and the size of trees. At the same time, laser scanning data allows a very high number of potential features that can be extracted from the point cloud data for predicting the forest variables. In some methods, the features are first extracted by user-defined algorithms and the best features are selected based on supervised learning, whereas both tasks can be carried out automatically by deep learning methods typically based on deep neural networks. In this study we tested the k-nearest neighbor method combined with a genetic algorithm (k-NN), an artificial neural network (ANN), a 2-dimensional convolutional neural network (2D-CNN) and a 3-dimensional CNN (3D-CNN) for estimating the following forest variables: volume of growing stock, stand mean height and mean diameter. The results indicate that there were no major differences in the accuracy of the tested methods, but the ANN and 3D-CNN generally resulted in the lowest RMSE values for the predicted forest variables and the highest R2 values between the predicted and observed forest variables. The lowest RMSE values were 20.3% (3D-CNN), 6.4% (3D-CNN) and 11.2% (ANN), and the highest R2 values were 0.90 (3D-CNN), 0.95 (3D-CNN) and 0.85 (ANN), for volume of growing stock, stand mean height and mean diameter, respectively. Covariances of all response variable combinations for all prediction methods were lower than the corresponding covariances of the field observations. ANN predictions had the highest covariances for the mean height vs. mean diameter and total growing stock vs. mean diameter combinations, and 3D-CNN for mean height vs. total growing stock. CNNs have a distinct theoretical advantage over the other methods in complex recognition or classification tasks, but utilizing their full potential may require higher-density point clouds than those applied here. Thus, the relatively low density of the point cloud data may have contributed to the somewhat inconclusive ranking of the methods in this study. The input data and computer codes are available at: https://github.com/balazsan/ALS_NNs.
Record number: A2022-265
Authors' affiliation: non IGN
Theme: FORET/IMAGERIE
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.1016/j.ophoto.2022.100012
Online publication date: 12/03/2022
Online: https://doi.org/10.1016/j.ophoto.2022.100012
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=100263
in ISPRS Open Journal of Photogrammetry and Remote Sensing > vol 4 (April 2022) . - n° 100012 [article]
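The abstract's evaluation setup can be illustrated with a minimal, hedged sketch: a plain k-NN regressor predicting a stand variable from ALS-derived features, scored with the relative RMSE (as a percentage of the observed mean) and R2 reported above. The feature matrix, sample sizes and response values below are invented for illustration, and the genetic-algorithm feature selection used in the study is omitted; the authors' actual code is at https://github.com/balazsan/ALS_NNs.

```python
# Illustrative sketch only: synthetic stand-level data, not the study's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Hypothetical ALS height/density metrics as predictors and
# volume of growing stock (m^3/ha) as the response variable.
n_stands, n_features = 500, 20
X = rng.normal(size=(n_stands, n_features))
y = 150 + 40 * X[:, 0] + 10 * X[:, 1] + rng.normal(scale=20, size=n_stands)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Plain k-NN regression; the paper additionally selected features with a
# genetic algorithm, which is left out here for brevity.
knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

# Relative RMSE (% of the observed mean) and R2, the accuracy measures
# quoted in the abstract.
rmse = np.sqrt(np.mean((y_test - y_pred) ** 2))
rel_rmse = 100.0 * rmse / np.mean(y_test)
ss_res = np.sum((y_test - y_pred) ** 2)
ss_tot = np.sum((y_test - np.mean(y_test)) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"relative RMSE: {rel_rmse:.1f} %   R2: {r2:.2f}")
```

The same two scores can be computed for any of the other predictors compared in the paper (ANN, 2D-CNN, 3D-CNN) by substituting their predictions for y_pred, which is how the method comparison in the abstract is expressed.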