Descripteur
Termes IGN > 1- Outils - instruments et méthodes > instrument > capteur (télédétection) > capteur imageur > caméra numérique
caméra numérique
Synonyme(s) : chambre numérique
Documents disponibles dans cette catégorie (309)
Deblurring low-light images with events / Chu Zhou in International journal of computer vision, vol 131 n° 5 (May 2023)
[article]
Titre : Deblurring low-light images with events Type de document : Article/Communication Auteurs : Chu Zhou, Auteur ; Minggui Teng, Auteur ; Jin Han, Auteur ; et al., Auteur Année de publication : 2023 Article en page(s) : pp 1284 - 1298 Note générale : bibliographie Langues : Anglais (eng) Descripteur : [Vedettes matières IGN] Traitement d'image optique
[Termes IGN] apprentissage profond
[Termes IGN] caméra d'événement
[Termes IGN] correction d'image
[Termes IGN] filtrage du bruit
[Termes IGN] flou
[Termes IGN] image à basse résolution
[Termes IGN] image RVB
Résumé : (auteur) Modern image-based deblurring methods usually show degenerate performance in low-light conditions since the images often contain most of the poorly visible dark regions and a few saturated bright regions, making the amount of effective features that can be extracted for deblurring limited. In contrast, event cameras can trigger events with a very high dynamic range and low latency, which hardly suffer from saturation and naturally encode dense temporal information about motion. However, in low-light conditions existing event-based deblurring methods would become less robust since the events triggered in dark regions are often severely contaminated by noise, leading to inaccurate reconstruction of the corresponding intensity values. Besides, since they directly adopt the event-based double integral model to perform pixel-wise reconstruction, they can only handle low-resolution grayscale active pixel sensor images provided by the DAVIS camera, which cannot meet the requirement of daily photography. In this paper, to apply events to deblurring low-light images robustly, we propose a unified two-stage framework along with a motion-aware neural network tailored to it, reconstructing the sharp image under the guidance of high-fidelity motion clues extracted from events. Besides, we build an RGB-DAVIS hybrid camera system to demonstrate that our method has the ability to deblur high-resolution RGB images due to the natural advantages of our two-stage framework. Experimental results show our method achieves state-of-the-art performance on both synthetic and real-world images. Numéro de notice : A2023-210 Affiliation des auteurs : non IGN Thématique : IMAGERIE Nature : Article DOI : 10.1007/s11263-023-01754-5 Date de publication en ligne : 06/02/2023 En ligne : https://doi.org/10.1007/s11263-023-01754-5 Format de la ressource électronique : URL article Permalink : https://documentation.ensg.eu/index.php?lvl=notice_display&id=103062
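The notice above mentions the event-based double integral (EDI) model, which relates a blurry frame to the events fired during its exposure. A minimal per-pixel sketch, assuming a known contrast threshold c and a simple numerical quadrature (the function name and discretisation are illustrative choices, not the paper's code):

```python
import numpy as np

def edi_deblur_pixel(blurry, event_times, event_polarities, t0, t1, f, c=0.2, n=200):
    """Recover the latent intensity of one pixel at reference time f from its
    blurry value and the events fired during the exposure [t0, t1], using the
    EDI relation  B = L(f) * (1/T) * integral of exp(c * E(t)) dt,
    where E(t) is the signed event count between f and t."""
    ts = np.linspace(t0, t1, n)

    def E(t):
        # Signed count of events between the reference time f and t.
        if t >= f:
            mask = (event_times >= f) & (event_times < t)
            return event_polarities[mask].sum()
        mask = (event_times >= t) & (event_times < f)
        return -event_polarities[mask].sum()

    # Approximate (1/T) * integral of exp(c * E(t)) dt by the sample mean.
    integral = np.mean([np.exp(c * E(t)) for t in ts])
    return blurry / integral

# With no events the pixel was static, so the value is returned unchanged:
static = edi_deblur_pixel(0.5, np.array([]), np.array([]), 0.0, 1.0, 0.5)
```

A pixel that only brightened after the reference time yields an integral greater than one, so its recovered latent value is darker than the blurry average, as the model predicts.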
in International journal of computer vision > vol 131 n° 5 (May 2023) . - pp 1284 - 1298
[article]
In-camera IMU angular data for orthophoto projection in underwater photogrammetry / Erica Nocerino in ISPRS Open Journal of Photogrammetry and Remote Sensing, vol 7 (January 2023)
[article]
Titre : In-camera IMU angular data for orthophoto projection in underwater photogrammetry Type de document : Article/Communication Auteurs : Erica Nocerino, Auteur ; Fabio Menna, Auteur Année de publication : 2023 Article en page(s) : n° 100027 Note générale : bibliographie Langues : Anglais (eng) Descripteur : [Vedettes matières IGN] Photogrammétrie
[Termes IGN] caméra numérique
[Termes IGN] carte bathymétrique
[Termes IGN] centrale inertielle
[Termes IGN] compensation par faisceaux
[Termes IGN] mesure géodésique
[Termes IGN] orthophotographie
[Termes IGN] photogrammétrie sous-marine
[Termes IGN] positionnement par GNSS
[Termes IGN] redressement différentiel
[Termes IGN] roulis
[Termes IGN] structure-from-motion
[Termes IGN] tangage
Résumé : (auteur) Among photogrammetric products, orthophotos are probably the most versatile and widely used in many fields of application. In the last years, coupled with the spread of semi-automated survey and processing approaches based on photogrammetry, orthophotos have become almost a standard for monitoring the underwater environment. If on land the definition of the reference coordinate system and projection plane for the orthophoto generation is trivial, underwater it may represent a challenge. In this paper, we address the issue of defining the vertical direction and resulting horizontal plane (levelling) for the differential ortho rectification. We propose a non-invasive, contactless method based on roll and pitch angular data provided by in-camera IMU sensors and embedded in the Exif metadata of JPEG and raw image files. We show how our approach can be seamlessly integrated into automatic SfM/MVS pipelines, provide the mathematical background, and showcase real-world applications results in an underwater monitoring project. The results illustrate the effectiveness of the proposed method and, for the first time, provide a metric evaluation of the definition of the vertical direction with low-cost sensors enclosed in digital cameras directly underwater. Numéro de notice : A2023-119 Affiliation des auteurs : non IGN Thématique : IMAGERIE Nature : Numéro de périodique DOI : 10.1016/j.ophoto.2022.100027 Date de publication en ligne : 07/12/2022 En ligne : https://doi.org/10.1016/j.ophoto.2022.100027 Format de la ressource électronique : URL article Permalink : https://documentation.ensg.eu/index.php?lvl=notice_display&id=102493
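The levelling idea in the notice above — using in-camera IMU roll and pitch to recover the vertical direction and a horizontal projection plane — can be sketched as a rotation from the camera frame to a gravity-levelled frame. The axis order and sign conventions below are assumptions; the paper defines its own:

```python
import numpy as np

def levelling_rotation(roll_deg, pitch_deg):
    """Rotation that maps camera-frame vectors into a gravity-levelled frame,
    built from roll and pitch angles such as those an in-camera IMU writes to
    Exif metadata (angle conventions here are illustrative; real Exif and
    MakerNote conventions vary by manufacturer)."""
    r, p = np.radians([roll_deg, pitch_deg])
    # Rotation about the camera x-axis (roll)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    # Rotation about the camera y-axis (pitch)
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    return Ry @ Rx

# A perfectly level camera needs no correction:
R0 = levelling_rotation(0.0, 0.0)
```

Applying this rotation to the photogrammetric block (or to each camera pose before orthoprojection) makes the chosen projection plane horizontal without any external levelling target, which is the contactless property the abstract emphasises.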
in ISPRS Open Journal of Photogrammetry and Remote Sensing > vol 7 (January 2023) . - n° 100027
[article]
Benchmarking laser scanning and terrestrial photogrammetry to extract forest inventory parameters in a complex temperate forest / Daniel Kükenbrink in International journal of applied Earth observation and geoinformation, vol 113 (September 2022)
[article]
Titre : Benchmarking laser scanning and terrestrial photogrammetry to extract forest inventory parameters in a complex temperate forest Type de document : Article/Communication Auteurs : Daniel Kükenbrink, Auteur ; Mauro Marty, Auteur ; Ruedi Bösch, Auteur ; et al., Auteur Année de publication : 2022 Article en page(s) : n° 102999 Note générale : bibliographie Langues : Anglais (eng) Descripteur : [Vedettes matières IGN] Applications photogrammétriques
[Termes IGN] caméra à bas coût
[Termes IGN] cartographie et localisation simultanées
[Termes IGN] détection d'arbres
[Termes IGN] diamètre à hauteur de poitrine
[Termes IGN] données lidar
[Termes IGN] données localisées 3D
[Termes IGN] forêt tempérée
[Termes IGN] inventaire forestier (techniques et méthodes)
[Termes IGN] lidar mobile
[Termes IGN] lidar topographique
[Termes IGN] photogrammétrie terrestre
[Termes IGN] semis de points
[Termes IGN] série temporelle
[Termes IGN] structure-from-motion
[Termes IGN] Zurich (Suisse)
Résumé : (auteur) National forest inventories (NFI) are important for the assessment of the state and development of forests. Traditional NFIs often rely on statistical sampling approaches as well as expert assessment which may suffer from observer bias and may lack robustness for time series analysis. Over the course of the last decade, close-range remote sensing techniques such as terrestrial and mobile laser scanning became ever more established for the assessment of three-dimensional (3D) forest structure. With the ongoing trend to make the systems smaller, easier to use and more efficient, the pathway is being opened for an operational inclusion of such devices within the framework of an NFI to support the traditional field assessment. Close-range remote sensing could potentially speed up field inventory work as well as increase the area in which certain parameters are assessed. Benchmarks are needed to evaluate the performance of different close-range remote sensing devices and approaches, both in terms of efficiency as well as accuracy. In this study we evaluate the performance of two terrestrial (TLS), one handheld mobile (PLS) and two drone based (UAVLS) laser scanning systems to detect trees and extract the diameter at breast height (DBH) in three plots with a steep gradient in tree and understorey vegetation density. As a novelty, we also tested the acquisition of 3D point-clouds using a low-cost action camera (GoPro) in conjunction with the Structure from Motion (SfM) technique and compared its performance with those of the more costly LiDAR devices. Among the many parameters evaluated in traditional NFIs, the focus of the performance evaluation of this study is set on the automatic tree detection and DBH extraction.
The results showed that TLS delivers the highest tree detection rate (TDR) of up to 94.6% under leaf-off and up to 82% under leaf-on conditions and a relative RMSE (rRMSE) for the DBH extraction between 2.5 and 9%, depending on the undergrowth complexity. The tested PLS system (leaf-on) achieved a TDR of up to 80% with an rRMSE between 3.7 and 5.8%. The tested UAVLS systems showed lowest TDR of less than 77% under leaf-off and less than 37% under leaf-on conditions. The novel GoPro approach achieved a TDR of up to 53% under leaf-on conditions. The reduced TDR can be explained by the reduced area coverage due to the chosen circular acquisition path taken with the GoPro approach. The DBH extraction performance on the other hand is comparable to those of the LiDAR devices with an rRMSE between 2 and 9%. Further benchmarks are needed in order to fully assess the applicability of these systems in the framework of an NFI. Especially the robustness under varying forest conditions (seasonality) and over a broader range of forest types and canopy structure has to be evaluated. Numéro de notice : A2022-787 Affiliation des auteurs : non IGN Thématique : FORET/IMAGERIE Nature : Article DOI : 10.1016/j.jag.2022.102999 Date de publication en ligne : 05/09/2022 En ligne : https://doi.org/10.1016/j.jag.2022.102999 Format de la ressource électronique : URL article Permalink : https://documentation.ensg.eu/index.php?lvl=notice_display&id=101893
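A common way to extract DBH from a terrestrial point cloud — one plausible reading of the automatic extraction benchmarked above, not necessarily the study's actual pipeline — is to slice the stem at breast height (~1.3 m) and fit a circle by least squares. A minimal sketch using the algebraic (Kåsa) fit:

```python
import numpy as np

def fit_dbh(xy):
    """Estimate diameter at breast height from the (x, y) coordinates of
    point-cloud returns in a thin horizontal slice of a stem, via the
    algebraic (Kasa) least-squares circle fit: solve
    x^2 + y^2 = 2a*x + 2b*y + c  for the centre (a, b), with c = r^2 - a^2 - b^2."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)   # recover the radius from the fit
    return 2 * r                   # DBH = diameter

# Points sampled on a 0.30 m-diameter stem cross-section recover that diameter:
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
dbh = fit_dbh(np.column_stack([0.15 * np.cos(theta) + 3.0,
                               0.15 * np.sin(theta) - 1.0]))
```

On real slices the fit is usually preceded by outlier rejection (branches, understorey returns), which is where the undergrowth complexity reported in the abstract drives the rRMSE differences between devices.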
in International journal of applied Earth observation and geoinformation > vol 113 (September 2022) . - n° 102999
[article]
Calibration radiométrique et géométrique d'une caméra fish-eye pour la mesure de l'hémisphère de luminance incidente / Manchun Lei (2022)
Titre : Calibration radiométrique et géométrique d'une caméra fish-eye pour la mesure de l'hémisphère de luminance incidente Type de document : Rapport Auteurs : Manchun Lei ; Christophe Meynard , Auteur ; Jean-Michaël Muller , Auteur ; Christian Thom , Auteur Editeur : Saint-Mandé : Institut national de l'information géographique et forestière - IGN (2012-) Année de publication : 2022 Importance : 34 p. Note générale : bibliographie Langues : Français (fre) Descripteur : [Vedettes matières IGN] Acquisition d'image(s) et de donnée(s)
[Termes IGN] caméra numérique
[Termes IGN] étalonnage géométrique
[Termes IGN] étalonnage radiométrique
[Termes IGN] image hémisphérique
[Termes IGN] MicMac
Résumé : (auteur) [introduction] [...] We have developed a hemispherical imager of incident radiance based on a lightweight fisheye camera. The distinctive feature of this imager is its light weight and therefore its mobility, which makes it convenient for measuring the hemisphere of incident radiance of various surfaces (ground and façades) in the environment. This report documents the radiometric and geometric calibration procedure that was carried out, from the theoretical description to practical considerations. The validation phase is also presented. Note de contenu : 1 Introduction
2 Description des instruments
3 Calibration de bruit d'obscurité
4 Calibration radiométrique
5 Calibration géométrique
Numéro de notice : 17734 Affiliation des auteurs : UGE-LASTIG (2020- ) Thématique : IMAGERIE Nature : Rapport nature-HAL : Rapport DOI : sans En ligne : https://hal.science/hal-03665394v1 Format de la ressource électronique : URL Permalink : https://documentation.ensg.eu/index.php?lvl=notice_display&id=100625 Documents numériques
peut être téléchargé : Calibration radiométrique et géométrique... - pdf auteur (Adobe Acrobat PDF)
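The geometric calibration described in the report maps each fisheye pixel to an incident direction on the hemisphere. A minimal sketch under the common equidistant model r = f·θ (the report's actual camera model and MicMac-based procedure may differ):

```python
import numpy as np

def equidistant_ray(u, v, cx, cy, f_px):
    """Map a fisheye pixel (u, v) to a unit incident-direction vector under
    the equidistant projection model r = f * theta. (cx, cy) is the principal
    point and f_px the focal length, both in pixels; these are the quantities
    a geometric calibration estimates."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)        # radial distance from the principal point (px)
    theta = r / f_px            # incidence angle from the optical axis (rad)
    phi = np.arctan2(dy, dx)    # azimuth in the image plane (rad)
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# The principal point maps to the optical axis (straight ahead):
ray = equidistant_ray(640.0, 480.0, 640.0, 480.0, 300.0)
```

Combined with the radiometric calibration (dark-noise subtraction and response linearisation, chapters 3-4 of the report's outline), each calibrated pixel then yields a radiance value associated with a known direction on the hemisphere.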
Titre : Event-driven feature detection and tracking for visual SLAM Type de document : Thèse/HDR Auteurs : Ignacio Alzugaray, Auteur Editeur : Zurich : Eidgenössische Technische Hochschule ETH - Ecole Polytechnique Fédérale de Zurich EPFZ Année de publication : 2022 Note générale : bibliographie
thesis submitted to attain the degree of Doctor of Sciences of ETH Zurich
Langues : Anglais (eng) Descripteur : [Vedettes matières IGN] Traitement d'image optique
[Termes IGN] caméra d'événement
[Termes IGN] cartographie et localisation simultanées
[Termes IGN] détection d'objet
[Termes IGN] extraction de traits caractéristiques
[Termes IGN] image floue
[Termes IGN] reconnaissance de formes
[Termes IGN] séquence d'images
[Termes IGN] vision par ordinateur
Index. décimale : THESE Thèses et HDR Résumé : (auteur) Traditional frame-based cameras have become the de facto sensor of choice for a multitude of applications employing Computer Vision due to their compactness, low cost, ubiquity, and ability to provide information-rich exteroceptive measurements. Despite their dominance in the field, these sensors exhibit limitations in common, real-world scenarios where detrimental effects, such as motion blur during high-speed motion or over-/underexposure in scenes with poor illumination, are prevalent. Challenging the dominance of traditional cameras, the recent emergence of bioinspired event cameras has opened up exciting research possibilities for robust perception due to their high-speed sensing, High-Dynamic-Range capabilities, and low power consumption. Despite their promising characteristics, event cameras present numerous challenges due to their unique output: a sparse and asynchronous stream of events, only capturing incremental perceptual changes at individual pixels. This radically different sensing modality renders most of the traditional Computer Vision algorithms incompatible without substantial prior adaptation, as they are initially devised for processing sequences of images captured at fixed frame-rate. Consequently, the bulk of existing event-based algorithms in the literature have opted to discretize the event stream into batches and process them sequentially, effectively reverting to frame-like representations in an attempt to mimic the processing of image sequences from traditional sensors. Such event-batching algorithms have demonstrably outperformed other alternative frame-based algorithms in scenarios where the quality of conventional intensity images is severely compromised, unveiling the inherent potential of these new sensors and popularizing them.
To date, however, many newly designed event-based algorithms still rely on a contrived discretization of the event stream for its processing, suggesting that the full potential of event cameras is yet to be harnessed by processing their output more naturally. This dissertation departs from the mere adaptation of traditional frame-based approaches and advocates instead for the development of new algorithms integrally designed for event cameras to fully exploit their advantageous characteristics. In particular, the focus of this thesis lies on describing a series of novel strategies and algorithms that operate in a purely event-driven fashion, i.e., processing each event as soon as it gets generated without any intermediate buffering of events into arbitrary batches and thus avoiding any additional latency in their processing. Such event-driven processes present additional challenges compared to their simpler event-batching counterparts, which, in turn, can largely be attributed to the requirement to produce reliable results at event-rate, entailing significant practical implications for their deployment in real-world applications. The body of this thesis addresses the design of event-driven algorithms for efficient and asynchronous feature detection and tracking with event cameras, covering alongside crucial elements on pattern recognition and data association for this emerging sensing modality. In particular, a significant portion of this thesis is devoted to the study of visual corners for event cameras, leading to the design of innovative event-driven approaches for their detection and tracking as corner-events. Moreover, the presented research also investigates the use of generic patch-based features and their event-driven tracking for the efficient retrieval of high-quality feature tracks.
All the developed algorithms in this thesis serve as crucial stepping stones towards a completely event-driven, feature-based Simultaneous Localization And Mapping (SLAM) pipeline. This dissertation extends upon established concepts from state-of-the-art, event-driven methods and further explores the limits of the event-driven paradigm in realistic monocular setups. While the presented approaches solely rely on event-data, the gained insights are seminal to future investigations targeting the combination of event-based vision with other, complementary sensing modalities. The research conducted here paves the way towards a new family of event-driven algorithms that operate efficiently, robustly, and in a scalable manner, envisioning a potential paradigm shift in event-based Computer Vision. Note de contenu : 1- Introduction
2- Contribution
3- Conclusion and outlook
Numéro de notice : 28699 Affiliation des auteurs : non IGN Thématique : IMAGERIE Nature : Thèse étrangère Note de thèse : PhD Thesis : Sciences : ETH Zurich : 2022 DOI : sans En ligne : https://www.research-collection.ethz.ch/handle/20.500.11850/541700 Format de la ressource électronique : URL Permalink : https://documentation.ensg.eu/index.php?lvl=notice_display&id=100470
Implementation of close range photogrammetry using modern non-metric digital cameras for architectural documentation / Mariem A. Elhalawani in Geodesy and cartography, vol 47 n° 1 (January 2021)
Using automated vegetation cover estimation from close-range photogrammetric point clouds to compare vegetation location properties in mountain terrain / R. Niederheiser in GIScience and remote sensing, vol 58 n° 1 (February 2021)
Vers un protocole de calibration de caméras statiques à l'aide d'un drone / Jean-François Villeforceix (2021)
Towards online UAS‐based photogrammetric measurements for 3D metrology inspection / Fabio Menna in Photogrammetric record, vol 35 n° 172 (December 2020)
Structure from motion for complex image sets / Mario Michelini in ISPRS Journal of photogrammetry and remote sensing, vol 166 (August 2020)
Improved depth estimation for occlusion scenes using a light-field camera / Changkun Yang in Photogrammetric Engineering & Remote Sensing, PERS, vol 86 n° 7 (July 2020)
Geometric modelling and calibration of a spherical camera imaging system / Derek D. Lichti in Photogrammetric record, vol 35 n° 170 (June 2020)
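The purely event-driven processing style described in the Alzugaray thesis notice above — updating per-pixel state the instant each event arrives, with no batching — can be illustrated with a toy Surface of Active Events, a per-pixel timestamp map commonly used by event-based corner detectors (this sketches the processing style only, not the thesis's actual detector):

```python
import numpy as np

class SurfaceOfActiveEvents:
    """Per-pixel map of the latest event timestamp, updated event-by-event.
    A toy illustration of event-driven (no batching) processing: each event
    triggers an O(1) state update, after which a local detector could run
    immediately on the neighbourhood of the event."""
    def __init__(self, width, height):
        self.t = np.zeros((height, width))

    def update(self, x, y, timestamp):
        # Process the event as soon as it arrives: record its timestamp,
        # then expose the local patch a corner test would inspect.
        self.t[y, x] = timestamp
        return self.t[max(0, y - 1):y + 2, max(0, x - 1):x + 2]

sae = SurfaceOfActiveEvents(4, 4)
patch = sae.update(1, 2, 0.5)
```

Because no events are buffered, results are available at event rate; the cost is that every per-event operation must be cheap, which is the design constraint the thesis abstract highlights.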