Author details
Author: Di Wang
Documents available written by this author (2)



Improving deep learning on point cloud by maximizing mutual information across layers / Di Wang in Pattern recognition, vol 131 (November 2022)
[article]
Title: Improving deep learning on point cloud by maximizing mutual information across layers
Document type: Article/Communication
Authors: Di Wang, Author; Lulu Tang, Author; Xu Wang, Author; et al., Author
Publication year: 2022
Article page(s): no. 108892
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Optical image processing
[IGN terms] deep learning
[IGN terms] convolutional neural network classification
[IGN terms] object detection
[IGN terms] Shannon entropy
[IGN terms] semantic information
[IGN terms] semantic segmentation
[IGN terms] point cloud
[IGN terms] geometric transformation
[IGN terms] computer vision
[IGN terms] 3D visualization
Abstract: (author) It is a fundamental and vital task to enhance the perception capability of the point cloud learning network in 3D machine vision applications. Most existing methods use feature fusion and geometric transformation to improve point cloud learning, without paying enough attention to mining further intrinsic information across multiple network layers. Motivated to improve consistency between hierarchical features and strengthen the perception capability of the point cloud network, we explore whether maximizing the mutual information (MI) across shallow and deep layers is beneficial for improving representation learning on point clouds. A novel Maximizing Mutual Information (MMI) module is proposed, which assists the training process of the main network in capturing discriminative features of the input point clouds. Specifically, the MMI-based loss function is employed to constrain the differences of semantic information between two hierarchical features extracted from the shallow and deep layers of the network. Extensive experiments show that our method is generally applicable to point cloud tasks, including classification, shape retrieval, indoor scene segmentation, 3D object detection, and completion, and illustrate its efficacy and advantages over existing methods. Our source code is available at https://github.com/wendydidi/MMI.git.
Record number: A2022-780
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
DOI: https://doi.org/10.1016/j.patcog.2022.108892
Online publication date: 08/07/2022
Online: https://doi.org/10.1016/j.patcog.2022.108892
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=101859
in Pattern recognition > vol 131 (November 2022) . - no. 108892
[article]
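The record above describes an MMI-based auxiliary loss that constrains the semantic gap between features taken from a shallow and a deep layer of a point cloud network; the authors' implementation is in the linked repository. The sketch below is only a minimal illustration, assuming a contrastive (InfoNCE-style) lower bound on mutual information; every name in it (ShallowDeepMILoss, proj_dim, the pooled-feature inputs) is hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' MMI module): an InfoNCE-style lower bound
# on the mutual information between shallow and deep per-cloud features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShallowDeepMILoss(nn.Module):
    """Auxiliary loss encouraging agreement between shallow and deep features
    of the same point cloud, a common proxy for MI maximization across layers."""
    def __init__(self, shallow_dim, deep_dim, proj_dim=128, temperature=0.07):
        super().__init__()
        self.proj_shallow = nn.Linear(shallow_dim, proj_dim)
        self.proj_deep = nn.Linear(deep_dim, proj_dim)
        self.temperature = temperature

    def forward(self, shallow_feat, deep_feat):
        # shallow_feat: (B, shallow_dim), deep_feat: (B, deep_dim), e.g. obtained
        # by global pooling over points at two different network depths.
        z_s = F.normalize(self.proj_shallow(shallow_feat), dim=-1)
        z_d = F.normalize(self.proj_deep(deep_feat), dim=-1)
        logits = z_s @ z_d.t() / self.temperature  # (B, B) similarity matrix
        targets = torch.arange(z_s.size(0), device=z_s.device)
        # Diagonal pairs (same sample at two depths) are positives; the rest of
        # the batch acts as negatives, as in InfoNCE.
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.t(), targets))

# Hypothetical usage inside a training step:
#   total_loss = task_loss + lambda_mi * mi_loss(pooled_shallow, pooled_deep)
```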
Unsupervised semantic and instance segmentation of forest point clouds / Di Wang in ISPRS Journal of photogrammetry and remote sensing, vol 165 (July 2020)
[article]
Title: Unsupervised semantic and instance segmentation of forest point clouds
Document type: Article/Communication
Authors: Di Wang, Author
Publication year: 2020
Article page(s): pp 86 - 97
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Lasergrammetry
[IGN terms] cluster analysis
[IGN terms] unsupervised classification
[IGN terms] lidar data
[IGN terms] tree height
[IGN terms] tree crown
[IGN terms] leaf area index
[IGN terms] automatic interpretation
[IGN terms] semantic segmentation
[IGN terms] point cloud
[IGN terms] terrestrial laser scanner
Abstract: (author) Terrestrial Laser Scanning (TLS) has been increasingly used in forestry applications, including forest inventory and plant ecology. Tree biophysical properties such as leaf area distributions and wood volumes can be accurately estimated from TLS point clouds. In these applications, a prerequisite is to properly understand the information content of large-scale point clouds (i.e., semantic labelling of point clouds), so that tree-scale attributes can be retrieved. Currently, this requirement is met through laborious and time-consuming manual work. In this work, we jointly address the problems of semantic and instance segmentation of forest point clouds. Specifically, we propose an unsupervised pipeline based on a structure called a superpoint graph to simultaneously perform two tasks: single tree isolation and leaf-wood classification. The proposed method is free from restrictive assumptions about forest types. Validation using simulated data resulted in a mean Intersection over Union (mIoU) of 0.81 for single tree isolation and an overall accuracy of 87.7% for leaf-wood classification. The single tree isolation led to a relative root mean square error (RMSE%) of 2.9% and 19.8% for tree height and crown diameter estimations, respectively. Comparisons with existing methods on other benchmark datasets showed state-of-the-art results of our method on both single tree isolation and leaf-wood classification tasks. We provide the entire framework as an open-source tool with an end-user interface. This study closes the gap for using TLS point clouds to quantify tree-scale properties in large areas, where automatic interpretation of the information content of TLS point clouds remains a crucial challenge.
Record number: A2020-347
Author affiliation: non-IGN
Theme: FOREST/IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2020.04.020
Online publication date: 28/05/2020
Online: https://doi.org/10.1016/j.isprsjprs.2020.04.020
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=95228
in ISPRS Journal of photogrammetry and remote sensing > vol 165 (July 2020) . - pp 86 - 97
[article]
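The record above describes an unsupervised superpoint-graph pipeline for single tree isolation and leaf-wood classification; the actual method is available as the open-source tool mentioned in the abstract. The sketch below is only a rough illustration of the superpoint-plus-graph idea for tree isolation, assuming DBSCAN over-segmentation and a k-nearest-neighbour graph over superpoint centroids; the function names, thresholds, and libraries are assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' pipeline): over-segment a forest point cloud
# into superpoints, link nearby superpoints into a graph, and take connected
# components as candidate tree instances.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

def superpoints_from_points(points, eps=0.3, min_samples=10):
    """Group points into small, locally coherent clusters (superpoints)."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    centroids = np.array([points[labels == l].mean(axis=0)
                          for l in np.unique(labels) if l != -1])
    return labels, centroids

def isolate_trees(centroids, k=5, max_edge_len=1.0):
    """Connect each superpoint to its k nearest neighbours, drop long edges,
    and treat every connected component as one tree instance."""
    n = centroids.shape[0]
    nn = NearestNeighbors(n_neighbors=min(k + 1, n)).fit(centroids)
    dist, idx = nn.kneighbors(centroids)
    rows, cols = [], []
    for i in range(n):
        for d, j in zip(dist[i, 1:], idx[i, 1:]):
            if d <= max_edge_len:
                rows.append(i)
                cols.append(j)
    adj = coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
    _, tree_ids = connected_components(adj, directed=False)
    return tree_ids  # one instance id per superpoint
```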
Copies (3)
Barcode       Call number   Medium    Location                 Section        Availability
081-2020071   RAB           Journal   Documentation centre     Reserve 3L     Available
081-2020073   DEP-RECP      Journal   LaSTIG                   Unit deposit   Not for loan
081-2020072   DEP-RECF      Journal   Nancy                    Unit deposit   Not for loan