Descriptor
IGN descriptor terms > computer science > artificial intelligence > artificial neural network > deep neural network
deep neural network



A deep learning architecture for semantic address matching / Yue Lin in International journal of geographical information science IJGIS, vol 34 n° 3 (March 2020)
[article]
Title: A deep learning architecture for semantic address matching
Document type: Article/Communication
Authors: Yue Lin, Author; Mengjun Kang, Author; Yuyang Wu, Author
Publication year: 2020
Pages: pp 559 - 576
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Geomatics
[IGN descriptor terms] address matching
[IGN descriptor terms] semantic matching
[IGN descriptor terms] machine learning
[IGN descriptor terms] deep learning
[IGN descriptor terms] geocoding by postal address
[IGN descriptor terms] urban management
[IGN descriptor terms] semantic inference
[IGN descriptor terms] vector representation
[IGN descriptor terms] deep neural network
[IGN descriptor terms] Shenzhen
[IGN descriptor terms] semantic similarity
[IGN descriptor terms] natural language processing
Abstract: (author) Address matching is a crucial step in geocoding, which plays an important role in urban planning and management. To date, the unprecedented development of location-based services has generated a large amount of unstructured address data. Traditional address matching methods mainly focus on the literal similarity of address records and are therefore not applicable to the unstructured address data. In this study, we introduce an address matching method based on deep learning to identify the semantic similarity between address records. First, we train the word2vec model to transform the address records into their corresponding vector representations. Next, we apply the enhanced sequential inference model (ESIM), a deep text-matching model, to make local and global inferences to determine if two addresses match. To evaluate the accuracy of the proposed method, we fine-tune the model with real-world address data from the Shenzhen Address Database and compare the outputs with those of several popular address matching methods. The results indicate that the proposed method achieves a higher matching accuracy for unstructured address records, with its precision, recall, and F1 score (i.e., the harmonic mean of precision and recall) reaching 0.97 on the test set.
Record number: A2020-106
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1080/13658816.2019.1681431
Online publication date: 24/10/2019
Online: https://doi.org/10.1080/13658816.2019.1681431
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=94702
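A minimal sketch of the embedding step described in the abstract above, assuming the gensim and numpy packages and a few hand-tokenized, hypothetical address strings; the paper's actual matcher (the ESIM deep text-matching model) is not reproduced here, and the cosine-similarity comparison is only a crude stand-in for it.

```python
# Illustrative sketch only: embeds toy address tokens with word2vec and compares
# two records by cosine similarity of their averaged token vectors. The paper's
# matcher is the ESIM deep model, which this baseline does not implement.
import numpy as np
from gensim.models import Word2Vec  # gensim >= 4.0 (uses vector_size/epochs)

# Hypothetical, already-tokenized address records (token granularity is an assumption).
addresses = [
    ["shenzhen", "nanshan", "keyuan", "road", "15"],
    ["shenzhen", "nanshan", "district", "keyuan", "rd", "no", "15"],
    ["shenzhen", "futian", "shennan", "avenue", "1002"],
]

# Train word2vec on the address corpus to obtain token embeddings.
w2v = Word2Vec(sentences=addresses, vector_size=50, window=3, min_count=1, epochs=50)

def embed(tokens):
    """Average the token vectors of one address record."""
    return np.mean([w2v.wv[t] for t in tokens if t in w2v.wv], axis=0)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Crude similarity score; a trained ESIM would replace this comparison step.
print(cosine(embed(addresses[0]), embed(addresses[1])))  # likely high
print(cosine(embed(addresses[0]), embed(addresses[2])))  # likely lower
```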
in International journal of geographical information science IJGIS > vol 34 n° 3 (March 2020) . - pp 559 - 576 [article]
Copies (1)
Barcode | Call number | Medium | Location | Section | Availability
079-2020031 | SL | Journal | Centre de documentation | Journals in reading room | Available

Volcano-seismic transfer learning and uncertainty quantification with bayesian neural networks / Angel Bueno in IEEE Transactions on geoscience and remote sensing, vol 58 n° 2 (February 2020)
[article]
Title: Volcano-seismic transfer learning and uncertainty quantification with bayesian neural networks
Document type: Article/Communication
Authors: Angel Bueno, Author; Carmen Benitez, Author; Silvio De Angelis, Author; et al., Author
Publication year: 2020
Pages: pp
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Statistics
[IGN descriptor terms] deep learning
[IGN descriptor terms] Bayesian classification
[IGN descriptor terms] neural network classification
[IGN descriptor terms] waveform
[IGN descriptor terms] data uncertainty
[IGN descriptor terms] Bayesian network
[IGN descriptor terms] deep neural network
[IGN descriptor terms] Russia
[IGN descriptor terms] earthquake
[IGN descriptor terms] seismology
[IGN descriptor terms] geological monitoring
[IGN descriptor terms] volcanology
[IGN descriptor terms] Washington (United States; state)
Abstract: (author) Over the past few years, deep learning (DL) has emerged as an important tool in the fields of volcano and earthquake seismology. However, these methods have been applied without performing thorough analyses of the associated uncertainties. Here, we propose a solution to enhance volcano-seismic monitoring systems, through probabilistic Bayesian DL; we implement and demonstrate a workflow for waveform classification, rapid quantification of the associated uncertainty, and link these uncertainties to changes in volcanic unrest. Specifically, we introduce Bayesian neural networks (BNNs) to perform event identification, classification, and their estimated uncertainty on data gathered at two active volcanoes, Mount St. Helens, Washington, USA, and Bezymianny, Kamchatka, Russia. We demonstrate how BNNs achieve excellent performance (92.08%) in discriminating both the type of event and its origin when the two data sets are merged together, and no additional training information is provided. Finally, we demonstrate that the data representations learned by the BNNs are transferable across different eruptive periods. We also find that the estimated uncertainty is related to changes in the state of unrest at the volcanoes and propose that it could be used to gauge whether the learned models may be exported to other eruptive scenarios.
Record number: A2020-094
Author affiliation: non-IGN
Theme: MATHEMATICS/POSITIONING
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2019.2941494
Online publication date: 07/10/2019
Online: https://doi.org/10.1109/TGRS.2019.2941494
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=94657
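A hedged sketch of uncertainty-aware waveform classification, using Monte Carlo dropout as a rough stand-in for the Bayesian neural networks described in the abstract above; the architecture, class count, and input length are illustrative assumptions and do not reproduce the authors' model or data.

```python
# Monte Carlo dropout illustration: keep dropout active at prediction time and
# sample several stochastic forward passes to get class probabilities plus a
# predictive-entropy uncertainty estimate. Not the authors' BNN architecture.
import torch
import torch.nn as nn

class WaveformClassifier(nn.Module):
    """Tiny 1-D CNN over single-channel seismic waveforms (shapes are placeholders)."""
    def __init__(self, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(), nn.Dropout(0.3),
            nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(), nn.Dropout(0.3),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x).squeeze(-1))

def predict_with_uncertainty(model, x, n_samples: int = 50):
    """Leave dropout on ('train' mode) and average several stochastic passes."""
    model.train()
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )
    mean = probs.mean(dim=0)                                  # predictive probabilities
    entropy = -(mean * mean.clamp_min(1e-9).log()).sum(-1)    # predictive uncertainty
    return mean, entropy

# Usage on a dummy batch of 4 waveforms of 2048 samples each.
model = WaveformClassifier()
mean_probs, uncertainty = predict_with_uncertainty(model, torch.randn(4, 1, 2048))
```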
in IEEE Transactions on geoscience and remote sensing > vol 58 n° 2 (February 2020) . - pp [article]

Superpixel-enhanced deep neural forest for remote sensing image semantic segmentation / Li Mi in ISPRS Journal of photogrammetry and remote sensing, vol 159 (January 2020)
[article]
Title: Superpixel-enhanced deep neural forest for remote sensing image semantic segmentation
Document type: Article/Communication
Authors: Li Mi, Author; Zhenzhong Chen, Author
Publication year: 2020
Pages: pp 140 - 152
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Image processing
[IGN descriptor terms] machine learning
[IGN descriptor terms] random forest classification
[IGN descriptor terms] very high resolution image
[IGN descriptor terms] stochastic process
[IGN descriptor terms] deep neural network
[IGN descriptor terms] semantic segmentation
[IGN descriptor terms] SLIC algorithm
[IGN descriptor terms] superpixel
Abstract: (author) Semantic segmentation plays an important role in remote sensing image understanding. Great progress has been made in this area with the development of Deep Convolutional Neural Networks (DCNNs). However, due to the complexity of ground objects’ spectrum, DCNNs with simple classifier have difficulties in distinguishing ground object categories even though they can represent image features effectively. Additionally, DCNN-based semantic segmentation methods learn to accumulate contextual information over large receptive fields that causes blur on object boundaries. In this work, a novel approach named Superpixel-enhanced Deep Neural Forest (SDNF) is proposed to target the aforementioned problems. To improve the classification ability, we introduce Deep Neural Forest (DNF), where the representation learning of deep neural network is conducted by a completely differentiable decision forest. Therefore, better classification accuracy is achieved by combining DCNNs with decision forests in an end-to-end manner. In addition, considering the homogeneity within superpixels and heterogeneity between superpixels, a Superpixel-enhanced Region Module (SRM) is proposed to further alleviate the noises and strengthen edges of ground objects. Experimental results on the ISPRS 2D semantic labeling benchmark demonstrate that our model significantly outperforms state-of-the-art methods thus validate the efficiency of our proposed SDNF.
Record number: A2020-014
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2019.11.006
Online publication date: 29/11/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.11.006
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=94403
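A rough sketch of the superpixel side of the pipeline only, assuming scikit-image, a bundled placeholder image, and random stand-in class scores; the per-superpixel majority vote below is a simplified substitute for the paper's Superpixel-enhanced Region Module, not its implementation.

```python
# SLIC superpixels are computed and then used to majority-vote per-pixel class
# predictions inside each segment, smoothing noise and tightening boundaries.
import numpy as np
from skimage import data
from skimage.segmentation import slic

image = data.astronaut()            # placeholder RGB image, not ISPRS benchmark data
segments = slic(image, n_segments=400, compactness=10, start_label=0)

# Pretend per-pixel class predictions from some DCNN (random here, 6 classes).
n_classes = 6
pixel_pred = np.random.randint(0, n_classes, size=segments.shape)

# Regularize predictions: assign each superpixel its majority class label.
refined = np.empty_like(pixel_pred)
for seg_id in np.unique(segments):
    mask = segments == seg_id
    refined[mask] = np.bincount(pixel_pred[mask], minlength=n_classes).argmax()
```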
in ISPRS Journal of photogrammetry and remote sensing > vol 159 (January 2020) . - pp 140 - 152 [article]
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2020011 | SL | Journal | Centre de documentation | Journals in reading room | Available
081-2020013 | DEP-RECP | Journal | MATIS | Unit deposit | Not for loan
081-2020012 | DEP-RECF | Journal | Nancy | Unit deposit | Not for loan

Torch-Points3D: A modular multi-task framework for reproducible deep learning on 3D point clouds / Thomas Chaton (2020)
Title: Torch-Points3D: A modular multi-task framework for reproducible deep learning on 3D point clouds
Document type: Article/Communication
Authors: Thomas Chaton, Author; Nicolas Chaulet, Author; Sofiane Horache, Author; Loïc Landrieu, Author
Publisher: [s.l.] : [s.n.]
Publication year: 2020
Projects: AI4GEO
Conference: 3DV 2020, International Conference on 3D Vision, 25/11/2020 - 27/11/2020, online, Japan
Extent: 12 p. - n° 282
Format: 21 x 30 cm
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Computer science
[IGN descriptor terms] deep learning
[IGN descriptor terms] conceptual framework
[IGN descriptor terms] open source code
[IGN descriptor terms] 3D geolocated data
[IGN descriptor terms] reproducibility
[IGN descriptor terms] deep neural network
Abstract: (author) We introduce Torch-Points3D, an open-source framework designed to facilitate the use of deep networks on 3D data. Its modular design, efficient implementation, and user-friendly interfaces make it a relevant tool for research and productization alike. Beyond multiple quality-of-life features, our goal is to standardize a higher level of transparency and reproducibility in 3D deep learning research, and to lower its barrier to entry. In this paper, we present the design principles of Torch-Points3D, as well as extensive benchmarks of multiple state-of-the-art algorithms and inference schemes across several datasets and tasks. The modularity of Torch-Points3D allows us to design fair and rigorous experimental protocols in which all methods are evaluated in the same conditions. The Torch-Points3D repository: https://github.com/nicolas-chaulet/torch-points3d.
Record number: C2020-019
Author affiliation: LaSTIG+Ext (2020- )
Theme: IMAGERY/COMPUTER SCIENCE
Nature: Poster
nature-HAL: Poster-avec-CL
DOI: pending
Online: https://hal.archives-ouvertes.fr/hal-03013190
Electronic resource format: HAL link
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=96456
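A hedged illustration of the kind of point-cloud model the framework is built around, written in plain PyTorch; it deliberately does not use the Torch-Points3D API (see the repository linked above for the framework's own loaders, models, and training loops), and all shapes and class counts are placeholders.

```python
# Tiny PointNet-style classifier on a random point cloud: a shared per-point MLP,
# order-invariant max pooling, and a linear head. Illustrative only; not the
# Torch-Points3D API.
import torch
import torch.nn as nn

class TinyPointNet(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
        )
        self.head = nn.Linear(128, n_classes)

    def forward(self, xyz):                     # xyz: (batch, n_points, 3)
        feats = self.point_mlp(xyz)             # (batch, n_points, 128)
        global_feat = feats.max(dim=1).values   # permutation-invariant pooling
        return self.head(global_feat)           # (batch, n_classes)

logits = TinyPointNet()(torch.rand(2, 1024, 3))  # dummy batch of 2 point clouds
```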
Addressing overfitting on point cloud classification using Atrous XCRF / Hasan Asy’ari Arief in ISPRS Journal of photogrammetry and remote sensing, vol 155 (September 2019)
[article]
Title: Addressing overfitting on point cloud classification using Atrous XCRF
Document type: Article/Communication
Authors: Hasan Asy’ari Arief, Author; Ulf Geir Indahl, Author; Geir-Harald Strand, Author; Håvard Tveite, Author
Publication year: 2019
Pages: pp 90 - 101
General note: bibliography
Languages: English (eng)
Descriptors: [IGN subject headings] Laser scanning
[IGN descriptor terms] conditional random field
[IGN descriptor terms] automatic classification
[IGN descriptor terms] deep neural network
[IGN descriptor terms] convolutional neural network
[IGN descriptor terms] point cloud
Abstract: (author) Advances in techniques for automated classification of point cloud data introduce great opportunities for many new and existing applications. However, with a limited number of labelled points, automated classification by a machine learning model is prone to overfitting and poor generalization. The present paper addresses this problem by inducing controlled noise (on a trained model) generated by invoking conditional random field similarity penalties using nearby features. The method is called Atrous XCRF and works by forcing a trained model to respect the similarity penalties provided by unlabeled data. In a benchmark study carried out using the ISPRS 3D labeling dataset, our technique achieves 85.0% in term of overall accuracy, and 71.1% in term of F1 score. The result is on par with the current best model for the benchmark dataset and has the highest value in term of F1 score. Additionally, transfer learning using the Bergen 2018 dataset, without model retraining, was also performed. Even though our proposal provides a consistent 3% improvement in term of accuracy, more work still needs to be done to alleviate the generalization problem on the domain adaptation and the transfer learning field.
Record number: A2019-312
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
DOI: 10.1016/j.isprsjprs.2019.07.002
Online publication date: 11/07/2019
Online: https://doi.org/10.1016/j.isprsjprs.2019.07.002
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=93337
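A heavily simplified sketch of the idea described in the abstract above, assuming scipy and random placeholder data: point-wise predictions are nudged toward agreement with nearby points, a crude stand-in for the authors' Atrous XCRF similarity penalty rather than their actual formulation or training procedure.

```python
# Blend each point's class logits with the mean logits of its nearest neighbours,
# so that feature-similar, nearby (possibly unlabeled) points push predictions
# toward agreement. Simplified illustration only; not the Atrous XCRF itself.
import numpy as np
from scipy.spatial import cKDTree

def smooth_logits(points, logits, k=8, weight=0.5):
    """Neighbourhood smoothing of per-point class logits (k and weight are arbitrary)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)       # first neighbour is the point itself
    neighbour_mean = logits[idx[:, 1:]].mean(axis=1)
    return (1.0 - weight) * logits + weight * neighbour_mean

# Dummy point cloud with random logits over 5 classes.
pts = np.random.rand(1000, 3)
raw = np.random.randn(1000, 5)
labels = smooth_logits(pts, raw).argmax(axis=1)
```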
in ISPRS Journal of photogrammetry and remote sensing > vol 155 (September 2019) . - pp 90 - 101 [article]
Copies (3)
Barcode | Call number | Medium | Location | Section | Availability
081-2019091 | RAB | Journal | Centre de documentation | Reserve stacks 3L | Available
081-2019093 | DEP-RECP | Journal | MATIS | Unit deposit | Not for loan
081-2019092 | DEP-RECF | Journal | Nancy | Unit deposit | Not for loan