Author details
Author: Antonios Morellos
Documents available written by this author (1)
Comparison of deep neural networks in detecting field grapevine diseases using transfer learning / Antonios Morellos in Remote Sensing, vol. 14, no. 18 (September-2 2022)
[article]
Title: Comparison of deep neural networks in detecting field grapevine diseases using transfer learning
Document type: Article/Communication
Authors: Antonios Morellos, Author; Xanthoula Eirini Pantazi, Author; Charalampos Paraskevas, Author; et al., Author
Publication year: 2022
Article on page(s): no. 4648
General note: bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Remote sensing applications
[IGN terms] deep learning
[IGN terms] convolutional neural network classification
[IGN terms] change detection
[IGN terms] feature extraction
[IGN terms] Greece
[IGN terms] dataset
[IGN terms] fungal disease
[IGN terms] plant disease
[IGN terms] viticulture
Abstract: (author) Plant diseases constitute a substantial threat to farmers, given the high economic and environmental impact of their treatment. Detecting possible pathogen threats in plants with non-destructive remote sensing and computer vision methods offers an alternative to existing laboratory methods and leads to improved crop management. Grapevine is an important crop that is mainly affected by fungal diseases. In this study, photos of healthy vine leaves and leaves infected by a fungal disease are used to create disease identification classifiers. Transfer learning was employed to train three different deep convolutional neural network (DCNN) architectures, namely AlexNet, VGG-19, and Inception v3, which were compared according to their classification accuracy. These models were trained on the open-source PlantVillage dataset using two training approaches: feature extraction, where the weights of the base deep neural network model were frozen and only those of the newly added layers were updated, and fine-tuning, where the weights of the base model were also updated during training. The resulting models were then validated on the PlantVillage dataset and retrained using a custom field-grown vine photo dataset. The results showed that the fine-tuning approach achieved better validation and testing accuracy than the feature extraction approach for all DCNNs. Among the DCNNs, Inception v3 outperformed VGG-19 and AlexNet in almost all cases, demonstrating a validation performance of 100% for the fine-tuned strategy on the PlantVillage dataset and an accuracy of 83.3% for the same strategy on a custom vine disease use-case dataset, while AlexNet achieved 87.5% validation and 66.7% accuracy in the respective scenarios. For VGG-19, the validation performance reached 100%, with an accuracy of 76.7%.
Record number: A2022-768
Author affiliation: non-IGN
Theme: IMAGERY
Nature: Article
DOI: 10.3390/rs14184648
Online publication date: 17/09/2022
Online: https://doi.org/10.3390/rs14184648
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=101794
in Remote Sensing > vol. 14, no. 18 (September-2 2022) . - no. 4648 [article]
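
The abstract above contrasts two transfer-learning strategies: feature extraction, where the pre-trained backbone is frozen and only the newly added layers are trained, and fine-tuning, where the backbone weights are also updated. The sketch below is a minimal illustration of that distinction in Keras; it is not the authors' published code, and the class count, image size, and learning rates are assumptions chosen for illustration only.

# Minimal sketch (assumed, not from the paper) of feature extraction vs.
# fine-tuning with a pre-trained Inception v3 backbone.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4          # assumption: number of vine leaf classes
IMG_SIZE = (299, 299)    # Inception v3 default input size

def build_model(fine_tune: bool) -> tf.keras.Model:
    # Pre-trained Inception v3 backbone without its ImageNet classifier head
    base = tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    # Feature extraction: freeze every backbone layer.
    # Fine-tuning: leave the backbone trainable so its weights are also updated.
    base.trainable = fine_tune

    inputs = layers.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.inception_v3.preprocess_input(inputs)
    x = base(x, training=False)             # keep BatchNorm in inference mode
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    # A smaller learning rate is commonly used when the backbone is updated.
    lr = 1e-5 if fine_tune else 1e-3
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

feature_extraction_model = build_model(fine_tune=False)
fine_tuning_model = build_model(fine_tune=True)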