Descriptor
IGN Terms > imagery > spaceborne image > satellite image > Worldview image
Documents available in this category (97)
RPC-based coregistration of VHR imagery for urban change detection / Shabnam Jabari in Photogrammetric Engineering & Remote Sensing, PERS, vol 82 n° 7 (July 2016)
[article]
Title: RPC-based coregistration of VHR imagery for urban change detection
Document type: Article/Communication
Authors: Shabnam Jabari, Author; Yun Zhang, Author
Publication year: 2016
Pages: pp 521 - 534
General note: Bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Optical image processing
[IGN Terms] viewing angle
[IGN Terms] correlation coefficient
[IGN Terms] change detection
[IGN Terms] very high resolution image
[IGN Terms] Geoeye image
[IGN Terms] Ikonos image
[IGN Terms] multitemporal image
[IGN Terms] Worldview image
[IGN Terms] urban environment
[IGN Terms] digital surface model
[IGN Terms] rational function model
[IGN Terms] homologous points
Abstract: (Author) In urban change detection, coregistration between bi-temporal Very High Resolution (VHR) images taken from different viewing angles, especially from high off-nadir angles, is very challenging. The relief displacements of elevated objects in such images usually lead to significant misregistration that negatively affects the accuracy of change detection. This paper presents a novel solution, called Patch-Wise CoRegistration (PWCR), that can overcome the misregistration problem caused by viewing angle difference and accordingly improve the accuracy of urban change detection. The PWCR method utilizes a Digital Surface Model (DSM) and the Rational Polynomial Coefficients (RPCs) of the images to find corresponding points in a bi-temporal image set. The corresponding points are then used to generate corresponding patches in the image set. To prove that the PWCR method can overcome the misregistration problem and help achieve accurate change detection, two change detection criteria are tested and incorporated into a change detection framework. Experiments on four bi-temporal image sets acquired by Ikonos, GeoEye-1, and Worldview-2 satellites from different viewing angles show that the PWCR method can achieve highly accurate image patch coregistration (up to 80 percent higher than traditional coregistration for elevated objects), so that the change detection framework can produce accurate urban change detection results (over 90 percent).
Record number: A2016-514
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
DOI: 10.14358/PERS.82.7.521
Online: http://dx.doi.org/10.14358/PERS.82.7.521
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=81585
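The coregistration step described in the abstract — projecting a DSM ground point through each image's Rational Polynomial Coefficients to anchor corresponding patches — can be sketched as follows. This is a minimal illustration, not the authors' implementation: real RPCs are ratios of 20-term cubic polynomials in normalized coordinates, reduced here to hypothetical first-order coefficients.

```python
def rpc_project(lat, lon, h, coeffs):
    """Map a ground point (lat, lon, height) to image (row, col) with a
    simplified rational polynomial camera model. `coeffs` holds
    (numerator, denominator) coefficient tuples for the row and column
    equations; each tuple is (c0, c_lat, c_lon, c_h). Real RPCs use
    20-term cubic polynomials in normalized coordinates."""
    def poly(c):
        return c[0] + c[1] * lat + c[2] * lon + c[3] * h

    (row_num, row_den), (col_num, col_den) = coeffs
    return poly(row_num) / poly(row_den), poly(col_num) / poly(col_den)


def corresponding_pixels(lat, lon, h, rpc_t1, rpc_t2):
    """Project one DSM point through the RPCs of both acquisition dates;
    the two pixel positions anchor a pair of corresponding patches."""
    return (rpc_project(lat, lon, h, rpc_t1),
            rpc_project(lat, lon, h, rpc_t2))
```

Because the height `h` enters the projection, an elevated roof point lands at a different pixel in each view, which is exactly the relief displacement that plain 2-D coregistration cannot absorb.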
in Photogrammetric Engineering & Remote Sensing, PERS > vol 82 n° 7 (July 2016) . - pp 521 - 534 [article]
Exploiting joint sparsity for pansharpening : the J-SparseFI algorithm / Xiao Xiang Zhu in IEEE Transactions on geoscience and remote sensing, vol 54 n° 5 (May 2016)
[article]
Title: Exploiting joint sparsity for pansharpening : the J-SparseFI algorithm
Document type: Article/Communication
Authors: Xiao Xiang Zhu, Author; Claas Grohnfeldt, Author; Richard Bamler, Author
Publication year: 2016
Pages: pp 2664 - 2681
General note: Bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Optical image processing
[IGN Terms] fusion algorithm
[IGN Terms] sparse data
[IGN Terms] image fusion
[IGN Terms] multiband image
[IGN Terms] panchromatic image
[IGN Terms] Worldview image
[IGN Terms] image reconstruction
[IGN Terms] Tikhonov regularization
[IGN Terms] spectral response
Abstract: (Author) Recently, sparse signal representation of image patches has been explored to solve the pansharpening problem. Although these proposed sparse-reconstruction-based methods lead to promising results, three issues remained unsolved: 1) high computational cost; 2) no consideration given to the possibility of mutually correlated information in different multispectral channels; and 3) requirement that the spectral responses of the panchromatic (Pan) image and the multispectral image cover the same wavelength range, which is not necessarily valid for most sensors. In this paper, we propose a sophisticated sparse image fusion algorithm, which is named "jointly sparse fusion of images" (J-SparseFI). It is based on the earlier proposed sparse fusion of images (SparseFI) algorithm and overcomes the aforementioned three drawbacks of the existing sparse image fusion algorithms. The computational problem is handled by reducing the problem size and by proposing a fully parallelizable scheme. Moreover, J-SparseFI exploits the possible signal structure correlations between multispectral channels by introducing the joint sparsity model (JSM) and sharpening the highly correlated adjacent multispectral channels together. This is done by exploiting the distributed compressive sensing theory that restricts the solution of an underdetermined system by considering an ensemble of signals being jointly sparse. J-SparseFI also offers a practical solution to overcome spectral range mismatch between the Pan and multispectral images. By means of sensor spectral response and channel mutual correlation analysis, the multispectral channels are assigned to primary groups of joint channels, secondary groups of joint channels, and individual channels.
Primary groups of joint channels, individual channels, and secondary groups of joint channels are then reconstructed sequentially, by the JSM or by modified SparseFI, using a dictionary trained from the Pan image or previously reconstructed high-resolution multispectral channels. A recipe for choosing appropriate algorithm parameters, including the most crucial regularization parameter, is provided. The algorithm is evaluated and validated using WorldView-2-like images that are simulated using very high resolution airborne HySpex hyperspectral imagery and further practically demonstrated using real WorldView-2 images. The algorithm's performance is compared with other state-of-the-art methods. Visual and quantitative analyses demonstrate the high quality of the proposed method. In particular, the analysis of the difference images suggests that J-SparseFI is superior in image resolution recovery.
Record number: A2016-844
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2015.2504261
Online: http://dx.doi.org/10.1109/TGRS.2015.2504261
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=82890
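The joint sparsity model's core idea — forcing correlated multispectral channels to share the same dictionary support — can be sketched with a single-atom selection step. This is a minimal illustration assuming unit-norm atoms; the full J-SparseFI pipeline (dictionaries trained from the Pan image, iterative joint reconstruction) is far richer.

```python
def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))


def joint_best_atom(dictionary, signals):
    """Pick the one dictionary atom that best explains ALL signals at
    once: each atom is scored by the sum of its squared correlations
    with every channel's patch, so correlated channels are forced to
    share support instead of each picking its own atom independently.
    Atoms are assumed unit-norm."""
    scores = [sum(dot(atom, s) ** 2 for s in signals)
              for atom in dictionary]
    return max(range(len(dictionary)), key=scores.__getitem__)
```

In a full joint pursuit this selection would be repeated on the residuals until a sparsity budget is reached; the point here is only that the atom choice is made jointly across channels.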
in IEEE Transactions on geoscience and remote sensing > vol 54 n° 5 (May 2016) . - pp 2664 - 2681 [article]
Quantitative quality evaluation of pansharpened imagery: consistency versus synthesis / Frosti Palsson in IEEE Transactions on geoscience and remote sensing, vol 54 n° 3 (March 2016)
[article]
Title: Quantitative quality evaluation of pansharpened imagery: consistency versus synthesis
Document type: Article/Communication
Authors: Frosti Palsson, Author; Johannes R. Sveinsson, Author; Magnus Orn Ulfarsson, Author; Jon Atli Benediktsson, Author
Publication year: 2016
Pages: pp 1247 - 1259
General note: Bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Optical image processing
[IGN Terms] data consistency
[IGN Terms] evaluation
[IGN Terms] image fusion
[IGN Terms] synthetic image
[IGN Terms] Quickbird image
[IGN Terms] Worldview image
[IGN Terms] pansharpening (image fusion)
[IGN Terms] inverse problem
[IGN Terms] data quality
Abstract: (Author) Pansharpening is the process of fusing a high-resolution panchromatic image and a low-spatial-resolution multispectral image to yield a high-spatial-resolution multispectral image. This is a typical ill-posed inverse problem, and in the past two decades, many methods have been proposed to solve it. Still, there is no general consensus on the best way to quantitatively evaluate the spectral and spatial quality of the fused image. In this paper, we compare the two most widely used and accepted methods for quality evaluation. The first method is the verification of the synthesis property, which states that the fused image should be as identical as possible to the multispectral image that the sensor would observe at a higher resolution. This is impossible to verify unless the observed images are spatially degraded so that the original observed multispectral image can be used as reference. The second method is to use metrics that do not use a reference, such as the quality no reference (QNR) metrics. However, there is another property, i.e., the consistency property, which states that the fused image reduced to the resolution of the original multispectral image should be as identical to the original image as possible. This has generally been considered a necessary condition that does not have to imply correct fusion. Using real WorldView-2 and QuickBird data and a total of 18 component substitution and multiresolution analysis methods, we demonstrate that the consistency property can indeed be used to give reliable assessment of the relative performance of pansharpening methods and is superior to using the QNR metrics.
Record number: A2016-126
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1109/TGRS.2015.2476513
Online: http://dx.doi.org/10.1109/TGRS.2015.2476513
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=80007
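The consistency property the paper advocates can be sketched directly: degrade the fused band back to the multispectral resolution and measure how far it is from the original band. The block-mean degradation below is a stand-in for whatever MTF-matched filter a real evaluation would use; the RMSE is one of several possible closeness measures.

```python
def block_mean_downsample(img, factor):
    """Reduce a 2-D image (list of rows) by averaging factor x factor
    blocks -- a crude spatial degradation to the MS resolution."""
    h, w = len(img), len(img[0])
    return [[sum(img[i + di][j + dj]
                 for di in range(factor) for dj in range(factor)) / factor ** 2
             for j in range(0, w, factor)]
            for i in range(0, h, factor)]


def consistency_rmse(fused_band, ms_band, ratio):
    """Consistency check: bring the fused band down to the original
    multispectral resolution and compute its RMSE against the original
    MS band. Zero means perfect consistency."""
    low = block_mean_downsample(fused_band, ratio)
    n = len(low) * len(low[0])
    se = sum((a - b) ** 2
             for row_l, row_m in zip(low, ms_band)
             for a, b in zip(row_l, row_m))
    return (se / n) ** 0.5
```

A synthesis check, by contrast, would first degrade *both* inputs, fuse at the reduced scale, and compare against the original MS image as the high-resolution reference.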
in IEEE Transactions on geoscience and remote sensing > vol 54 n° 3 (March 2016) . - pp 1247 - 1259 [article]
Copies (1): Barcode 065-2016031 | Call number SL | Support: Journal | Location: Documentation centre | Section: Journals in reading room | Availability: Available
Optimising the spatial resolution of WorldView-2 pan-sharpened imagery for predicting levels of Gonipterus scutellatus defoliation in KwaZulu-Natal, South Africa / Romano Lottering in ISPRS Journal of photogrammetry and remote sensing, vol 112 (February 2016)
[article]
Title: Optimising the spatial resolution of WorldView-2 pan-sharpened imagery for predicting levels of Gonipterus scutellatus defoliation in KwaZulu-Natal, South Africa
Document type: Article/Communication
Authors: Romano Lottering, Author; Onisimo Mutanga, Author
Publication year: 2016
Pages: pp 13 - 22
General note: Bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Remote sensing applications
[IGN Terms] South Africa (state)
[IGN Terms] Eucalyptus (genus)
[IGN Terms] Worldview image
[IGN Terms] vegetation index
[IGN Terms] leaf-eating insect
[IGN Terms] optimization (mathematics)
[IGN Terms] spectral resolving power
[IGN Terms] risk prevention
Abstract: (Author) Gonipterus scutellatus Gyllenhal is a leaf-feeding weevil that is a major defoliator of the genus Eucalyptus. Understanding the relationship between levels of weevil-induced vegetation defoliation and the optimal spatial resolution of satellite images is essential for effective management of plantation resources. The objective of this study was to identify appropriate spatial resolutions for predicting levels of weevil-induced defoliation. We resampled the Normalized Difference Vegetation Index (NDVI), Simple Ratio (SR) and Enhanced Vegetation Index (EVI) images computed from a WorldView-2 pan-sharpened image, which is characterised by a 0.5 m spatial resolution and 8 spectral bands. Within each plantation compartment, 30 × 30 m plots were established, representing different levels of defoliation. From the centre of each plot, the spatial resolution of the original image was progressively resampled from 1.5 to 8.5 m, in 1 m increments. The minimal variance for each level of defoliation was then established and used as an indicator for quantitatively selecting the optimal spatial resolution. Results indicate that an appropriate spatial resolution was established at 1.25, 1.25, 1.75 and 2.25 m for low, medium, high and severe levels of defoliation, respectively. In addition, an Artificial Neural Network was run to determine the relationship between the appropriate spatial resolution and levels of Gonipterus scutellatus induced defoliation. The model yielded an R2 of 0.80, with an RMSE of 1.28 (2.45% of the mean measured defoliation) based on an independent test dataset. We then compared this model to a model developed using the original 0.5 m image spatial resolution. Our results suggest that optimising the spatial resolution of remotely sensed imagery essentially improves the prediction of vegetation defoliation.
In essence, this study provides the foundation for multi-scale defoliation mapping using high spatial resolution imagery.
Record number: A2016-136
Authors' affiliation: non-IGN
Theme: FOREST/IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2015.11.010
Online: https://doi.org/10.1016/j.isprsjprs.2015.11.010
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=80307
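The study's selection criterion — compute a vegetation index from the pan-sharpened bands, resample it to candidate resolutions, and keep the resolution that minimises within-plot variance — rests on two small computations that can be sketched. The NDVI formula is standard; the variance criterion follows the abstract, and the resampling step itself is omitted.

```python
def ndvi(nir, red):
    """Per-pixel Normalized Difference Vegetation Index,
    (NIR - R) / (NIR + R), for two 2-D bands given as lists of rows."""
    return [[(n - r) / (n + r) if (n + r) else 0.0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]


def plot_variance(values):
    """Population variance of the index values inside one 30 x 30 m
    plot; for each defoliation level, the candidate resolution with
    minimal variance is selected as the appropriate one."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)
```

In use, `plot_variance` would be evaluated on the NDVI pixels of each plot at every resampled resolution, and the per-level minimum picked.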
in ISPRS Journal of photogrammetry and remote sensing > vol 112 (February 2016) . - pp 13 - 22 [article]
Distinctive order based self-similarity descriptor for multi-sensor remote sensing image matching / Amin Sedaghat in ISPRS Journal of photogrammetry and remote sensing, vol 108 (October 2015)
[article]
Title: Distinctive order based self-similarity descriptor for multi-sensor remote sensing image matching
Document type: Article/Communication
Authors: Amin Sedaghat, Author; Hamid Ebadi, Author
Publication year: 2015
Pages: pp 62 - 71
General note: Bibliography
Languages: English (eng)
Descriptor: [IGN subject headings] Optical image processing
[IGN Terms] image matching
[IGN Terms] automatic extraction
[IGN Terms] Geoeye image
[IGN Terms] IRS image
[IGN Terms] Landsat-ETM+ image
[IGN Terms] multisensor image
[IGN Terms] Quickbird image
[IGN Terms] SPOT 4 image
[IGN Terms] SPOT 5 image
[IGN Terms] SPOT 6 image
[IGN Terms] Terra-ASTER image
[IGN Terms] Worldview image
[IGN Terms] invariant
[IGN Terms] SIFT (algorithm)
Abstract: (Author) Robust, well-distributed and accurate feature matching in multi-sensor remote sensing images is a difficult task due to significant geometric and illumination differences. In this paper, a robust and effective image matching approach is presented for multi-sensor remote sensing images. The proposed approach consists of three main steps. In the first step, the UR-SIFT (Uniform Robust Scale Invariant Feature Transform) algorithm is applied for uniform and dense local feature extraction. In the second step, a novel descriptor, the Distinctive Order Based Self-Similarity (DOBSS) descriptor, is computed for each extracted feature. Finally, a cross-matching process followed by a consistency check in the projective transformation model is performed for feature correspondence and mismatch elimination. The proposed method was successfully applied to matching various multi-sensor satellite images, such as ETM+, SPOT 4, SPOT 5, ASTER, IRS, SPOT 6, QuickBird, GeoEye and Worldview images, and the results demonstrate its robustness and capability compared to common image matching techniques such as SIFT, PIIFD, GLOH, LIOP and LSS.
Record number: A2015-852
Authors' affiliation: non-IGN
Theme: IMAGERY
Nature: Article
nature-HAL: ArtAvecCL-RevueIntern
DOI: 10.1016/j.isprsjprs.2015.06.003
Online: https://doi.org/10.1016/j.isprsjprs.2015.06.003
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=79222
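The cross-matching step named in the abstract is commonly read as a mutual nearest-neighbour check between the two descriptor sets; below is a sketch under that assumption (the paper's follow-up consistency check in a projective transformation model is omitted).

```python
def l2(u, v):
    """Euclidean distance between two descriptor vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5


def cross_match(desc_a, desc_b):
    """Cross matching: keep the pair (i, j) only if descriptor j is
    i's nearest neighbour in set B AND descriptor i is j's nearest
    neighbour in set A. The mutual requirement discards one-way hits,
    which removes many mismatches before any geometric check."""
    nn_ab = [min(range(len(desc_b)), key=lambda j: l2(a, desc_b[j]))
             for a in desc_a]
    nn_ba = [min(range(len(desc_a)), key=lambda i: l2(b, desc_a[i]))
             for b in desc_b]
    return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]
```

The surviving pairs would then feed the projective-model consistency check for final mismatch elimination.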
in ISPRS Journal of photogrammetry and remote sensing > vol 108 (October 2015) . - pp 62 - 71 [article]
Mangrove tree crown delineation from high-resolution imagery / Muditha K. Heenkenda in Photogrammetric Engineering & Remote Sensing, PERS, vol 81 n° 6 (June 2015)
Spatial analysis of high-resolution urban thermal patterns in Vojvodina, Serbia / Dusan Jovanovic in Geocarto international, vol 30 n° 5 - 6 (May - July 2015)
Setting new standards in Earth observation / Anonymous in GEO: Geoconnexion international, vol 14 n° 3 (March 2015)
Radiometric and geometric evaluation of GeoEye-1, WorldView-2 and Pléiades-1A stereo images for 3D information extraction / Daniela Poli in ISPRS Journal of photogrammetry and remote sensing, vol 100 (February 2015)
Vegetation Burn Severity Mapping Using Landsat-8 and WorldView-2 / Zhuoting Wu in Photogrammetric Engineering & Remote Sensing, PERS, vol 81 n° 2 (February 2015)
Etude expérimentale en cartographie de la végétation par télédétection / Vanessa Sellin in Cybergeo, European journal of geography, n° 2015 ([01/01/2015])
A rule-based parameter aided with object-based classification approach for extraction of building and roads from WorldView-2 images / Zahra Ziaei in Geocarto international, vol 29 n° 5 - 6 (August - October 2014)
Development of fuzzy rule-based parameters for urban object-oriented classification using very high resolution imagery / Alireza Hamedianfar in Geocarto international, vol 29 n° 3 - 4 (June - July 2014)
Generation and quality assessment of stereo-extracted DSM from Geoeye-1 and Worldview-2 imagery / Manuel Angel Aguilar in IEEE Transactions on geoscience and remote sensing, vol 52 n° 2 (February 2014)
Multi-agent recognition system based on object based image analysis using WorldView-2 / Fatemeh Tabib Mahmoudi in Photogrammetric Engineering & Remote Sensing, PERS, vol 80 n° 2 (February 2014)