Descripteur
Termes IGN > mathématiques > analyse numérique > optimisation (mathématiques) > programmation linéaire
programmation linéaire
Synonym(s): optimisation linéaire
Documents available in this category (52)
Triangular factorization-based simplex algorithms for hyperspectral unmixing / W. Xia in IEEE Transactions on geoscience and remote sensing, vol 50 n° 11 Tome 1 (November 2012)
[article]
Title: Triangular factorization-based simplex algorithms for hyperspectral unmixing. Document type: Article/Communication. Authors: W. Xia; H. Pu; et al. Publication year: 2012. Pagination: pp. 4420-4440. General note: Bibliography. Languages: English (eng). Descriptors: [Vedettes matières IGN] Traitement d'image optique
[Termes IGN] algorithme d'apprentissage
[Termes IGN] algorithme du simplexe
[Termes IGN] analyse des mélanges spectraux
[Termes IGN] factorisation
[Termes IGN] image hyperspectrale
[Termes IGN] programmation linéaire
Abstract: (Author) In the linear unmixing of hyperspectral images, the observation pixels form a simplex whose vertices correspond to the endmembers; hence, finding the endmembers is equivalent to extracting these vertices. A common technique for determining vertices is to analyse the simplex volume, but it usually has a high computational complexity, resulting from the exhaustive search of volumes in the large hyperspectral data. This problem limits practicability and real-time application. In this paper, we utilize triangular factorization (TF) to calculate the volume, deriving a method named simplex volume analysis based on TF (SVATF). It requires just one comparison through the data to succeed in finding the globally optimal solution for all the endmembers, thus improving the search efficiency. A dimensionality-reduction transformation is not necessary, which is another advantage of this method. Moreover, since TF is a broad concept covering different methods, SVATF is a framework with various implementations. Based on TF, we also propose a fast learning algorithm named abundance quantification based on TF to estimate the abundances, which further saves computation by reusing the intermediate values involved in SVATF. The abundance estimation method can rectify possible errors in the given endmembers by exploiting two important constraints (abundance non-negativity and abundance sum-to-one) of the linear mixture model, so it is useful for imagery without pure pixels. Experimental results on synthetic and real hyperspectral data demonstrate that the proposed methods can obtain accurate results with much lower computational complexity than other state-of-the-art methods.
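The volume-via-triangular-factorization idea in this abstract can be sketched numerically: the volume of a p-simplex follows from the diagonal of the triangular factor of its edge-vector matrix, so no explicit determinant of reduced data is needed. A minimal illustration, not the authors' SVATF code (`simplex_volume_tf` is a hypothetical helper name):

```python
import numpy as np
from math import factorial

def simplex_volume_tf(E):
    """Volume of the simplex whose vertices are the columns of E,
    computed via a triangular (QR) factorisation of the edge vectors.
    Illustrative sketch of the idea behind SVATF, not the paper's code."""
    D = E[:, 1:] - E[:, [0]]        # edge vectors relative to vertex 0
    p = D.shape[1]                  # simplex dimension
    R = np.linalg.qr(D, mode='r')   # upper-triangular factor
    # |det(D)| equals the product of |diag(R)|; divide by p! for the volume.
    return float(np.prod(np.abs(np.diag(R)))) / factorial(p)

# Unit right triangle in 2D has area 1/2:
E = np.array([[0., 1., 0.],
              [0., 0., 1.]])
print(simplex_volume_tf(E))        # -> 0.5
```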
Record number: A2012-587. Authors' affiliation: non IGN. Theme: IMAGERIE. Nature: Article. nature-HAL: ArtAvecCL-RevueIntern. DOI: 10.1109/TGRS.2012.2195185. Online publication date: 22/05/2012. Online: https://doi.org/10.1109/TGRS.2012.2195185. Electronic resource format: URL article. Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=32033
in IEEE Transactions on geoscience and remote sensing > vol 50 n° 11 Tome 1 (November 2012), pp. 4420-4440 [article]

Geometric unmixing of large hyperspectral images: A barycentric coordinate approach / Paul Honeine in IEEE Transactions on geoscience and remote sensing, vol 50 n° 6 (June 2012)
[article]
Title: Geometric unmixing of large hyperspectral images: A barycentric coordinate approach. Document type: Article/Communication. Authors: Paul Honeine; C. Richard. Publication year: 2012. Pagination: pp. 2185-2195. General note: Bibliography. Languages: English (eng). Descriptors: [Vedettes matières IGN] Traitement d'image optique
[Termes IGN] algorithme du simplexe
[Termes IGN] analyse des mélanges spectraux
[Termes IGN] classification barycentrique
[Termes IGN] image hyperspectrale
Abstract: (Author) In hyperspectral imaging, spectral unmixing is one of the most challenging and fundamental problems. It consists of breaking down the spectrum of a mixed pixel into a set of pure spectra, called endmembers, and their contributions, called abundances. Many endmember extraction techniques have been proposed in the literature, based on either a statistical or a geometrical formulation. However, most, if not all, of these techniques estimate abundances with a least-squares solution. In this paper, we show that abundances can be estimated using a geometric formulation. To this end, we express abundances as the barycentric coordinates in the simplex defined by the endmembers. We propose to write them as a ratio of volumes or a ratio of distances, quantities that are often computed to identify endmembers. This property allows us to easily incorporate abundance estimation within conventional endmember extraction techniques, without incurring additional computational complexity. We use this key property with various endmember extraction techniques, such as N-Findr, vertex component analysis, the simplex growing algorithm, and iterated constrained endmembers. The relevance of the method is illustrated with experimental results on real hyperspectral images.
Record number: A2012-263. Authors' affiliation: non IGN. Theme: IMAGERIE. Nature: Article. nature-HAL: ArtAvecCL-RevueIntern. DOI: 10.1109/TGRS.2012.2188408. Online publication date: 14/11/2011. Online: https://doi.org/10.1109/TGRS.2012.2188408. Electronic resource format: URL article. Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=31709
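The ratio-of-volumes idea can be sketched with Cramer's rule: once the sum-to-one constraint is appended, each barycentric coordinate of a pixel is a ratio of determinants, i.e. of signed simplex volumes. A small illustration assuming the data have already been reduced to the (p-1)-dimensional endmember subspace (`abundances_barycentric` is a hypothetical name, not from the paper):

```python
import numpy as np

def abundances_barycentric(E, x):
    """Barycentric coordinates of pixel x in the simplex spanned by the
    columns of E (p endmembers in p-1 dimensions). Each abundance is the
    ratio of the (signed) volume of the simplex with one vertex replaced
    by x to the volume of the full simplex -- Cramer's rule."""
    p = E.shape[1]
    A = np.vstack([E, np.ones(p)])      # sum-to-one constraint appended
    b = np.append(x, 1.0)
    detA = np.linalg.det(A)             # proportional to the simplex volume
    a = np.empty(p)
    for j in range(p):
        Aj = A.copy()
        Aj[:, j] = b                    # replace vertex j by the pixel
        a[j] = np.linalg.det(Aj) / detA
    return a

E = np.array([[0., 1., 0.],             # three endmembers in 2D
              [0., 0., 1.]])
a = abundances_barycentric(E, np.array([0.25, 0.25]))  # close to [0.5, 0.25, 0.25]
```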
in IEEE Transactions on geoscience and remote sensing > vol 50 n° 6 (June 2012), pp. 2185-2195 [article]
Copies (1)
Barcode: 065-2012061. Shelfmark: RAB. Medium: Journal. Location: Centre de documentation. Section: En réserve L003. Availability: Available

Network adjustment in surveying engineering: linear goal programming versus least squares / S. Alp in SaLIS Surveying and land information science, vol 70 n° 1 (Spring 2010)
[article]
Title: Network adjustment in surveying engineering: linear goal programming versus least squares. Document type: Article/Communication. Authors: S. Alp; E. Yavuz; N. Ersoy. Publication year: 2010. Pagination: pp. 29-37. General note: Bibliography. Languages: English (eng). Descriptors: [Vedettes matières IGN] Systèmes de référence et réseaux
[Termes IGN] compensation de coordonnées
[Termes IGN] méthode des moindres carrés
[Termes IGN] programmation linéaire
Abstract: (Author) In engineering, and especially in surveying engineering, network adjustment is performed to determine definitive values for the unknowns and the measurements. Generally, the least squares method is used for vertical network adjustment. In this study, the linear goal programming method is proposed as an alternative to least squares for network adjustment. The linear goal programming method is explained with an example, and the results are compared. The results obtained with the two methods are similar. This study confirms that linear goal programming can be used in vertical network adjustment. Copyright SaLIS
Record number: A2010-159. Authors' affiliation: non IGN. Theme: POSITIONNEMENT. Nature: Article. DOI: none. Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=83712
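The comparison the authors describe can be reproduced on a toy problem: a goal-programming (L1) adjustment splits each residual into non-negative deviation variables and minimises their sum with a linear program, while least squares solves the normal equations. A hedged sketch with invented observations (a three-line levelling loop with a deliberate misclosure; not the paper's data):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical levelling loop from a fixed benchmark A (h_A = 0):
# observed height differences h_B - h_A, h_C - h_B, h_A - h_C (metres),
# carrying a -0.01 m misclosure.
A = np.array([[ 1.,  0.],
              [-1.,  1.],
              [ 0., -1.]])
obs = np.array([1.02, 0.98, -2.01])
m, n = A.shape

# Goal programming: A h + d_minus - d_plus = obs; minimise sum of deviations.
c = np.concatenate([np.zeros(n), np.ones(2 * m)])
A_eq = np.hstack([A, np.eye(m), -np.eye(m)])
res = linprog(c, A_eq=A_eq, b_eq=obs,
              bounds=[(None, None)] * n + [(0, None)] * (2 * m))
h_gp = res.x[:n]            # L1 solution; res.fun is the total |residual|

# Least squares for comparison: spreads the misclosure over all three lines.
h_ls, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(h_gp, h_ls, res.fun)
```

The L1 solution concentrates the whole misclosure on a single observation, whereas least squares distributes it evenly, which is exactly the behavioural difference such comparative studies examine.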
in SaLIS Surveying and land information science > vol 70 n° 1 (Spring 2010), pp. 29-37 [article]
Copies (1)
Barcode: 121-2010011. Shelfmark: RAB. Medium: Journal. Location: Centre de documentation. Section: En réserve L003. Availability: Available
Title: 3D topography: a simplicial complex-based solution in a spatial DBMS. Document type: Thesis/HDR. Authors: F. Penninga. Publisher: Delft: Netherlands Geodetic Commission NGC. Publication year: 2008. Series: Netherlands Geodetic Commission Publications on Geodesy, ISSN 0165-1706, no. 66. Extent: 192 p. Format: 17 x 24 cm. ISBN/ISSN/EAN: 978-90-6132-304-4. General note: Bibliography. Languages: English (eng). Descriptors: [Vedettes matières IGN] Géomatique
[Termes IGN] algorithme du simplexe
[Termes IGN] base de données localisées 3D
[Termes IGN] données localisées 3D
[Termes IGN] milieu urbain
[Termes IGN] modèle conceptuel de données localisées
[Termes IGN] objet géographique 3D
[Termes IGN] système de gestion de base de données
[Termes IGN] tétraèdre
[Termes IGN] Triangulated Irregular Network
[Termes IGN] visualisation 3D
Decimal index: 32.00 Topographie - généralités
Abstract: (Author) Current topographic products are limited to a representation of the real world in only two dimensions, with at best some additional point heights and contour lines. Modelling the real world in two dimensions implies a rather drastic simplification of three-dimensional real-world elements. By representing these elements in two dimensions, loss of information is inevitable. Due to this simplification, the accuracy of analysis results is limited and a meaningful, insightful representation of complex situations is hard to obtain. Environmental issues like high concentrations of particulate matter along highways in urban areas, the effects of noise and odour propagation, and risk analysis of liquefied petroleum gas storage tanks are random examples of current issues in 3D urban planning in which more precision is required than 2D analyses can offer. In a time of increasing attention for these kinds of environmental and sustainability issues, the limitations of 2D models become genuinely problematic and trigger the demand for 3D topography.
The development of 3D topography is also supply-driven, especially by the increasing availability of high-density laser scan data. Height data become available with point densities (multiple height points per square metre) that were previously unthinkable with traditional photogrammetric stereo techniques. Direct 3D data acquisition by terrestrial laser scanning is emerging, providing detailed measurements of facades, tunnels and even indoor topography. The fast developments in this field are partly triggered by the emerging popularity of personal navigation devices, which will use 3D models in the future to simplify user interpretation of the (map) display.
Objective and research question
The objective of this research is to develop a data structure that is capable of handling large data volumes and offers support for loading, updating, querying, analysis and especially validation. To achieve this, a triangular approach will be used, due to its advantages in maintaining consistency, its robustness and its editability. This triangular approach creates a network of triangles (in 2D) or tetrahedrons (in 3D), in which topographic features are represented by sets of triangles or tetrahedrons. Such a network is an example of an irregular tessellation, in which the real world is decomposed into smaller (triangle- or tetrahedron-shaped) building blocks. The resulting networks are called TINs (Triangular Irregular Networks) or TENs (TEtrahedronised irregular Networks). The presence of boundaries of topographic features is ensured by the use of constraints, preventing the deletion of crucial boundary edges and triangles. Algorithms exist to calculate these constrained triangulations and constrained tetrahedronisations of topographic data.
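For the unconstrained case, off-the-shelf tools already produce such tessellations; the constrained variants the text refers to need the dedicated algorithms it mentions. The basic TIN/TEN construction can nevertheless be illustrated, assuming SciPy is available:

```python
import numpy as np
from scipy.spatial import Delaunay

# 2D points -> a TIN of triangles (each simplex lists 3 vertex indices).
pts2d = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.4, 0.6]])
tin = Delaunay(pts2d)
print(tin.simplices.shape)   # (n_triangles, 3)

# 3D points -> a TEN of tetrahedrons (each simplex lists 4 vertex indices).
pts3d = np.random.default_rng(0).random((10, 3))
ten = Delaunay(pts3d)
print(ten.simplices.shape)   # (n_tetrahedrons, 4)

# Note: these are unconstrained Delaunay tessellations; enforcing feature
# boundaries (constrained edges/triangles) requires the specialised
# algorithms referred to in the text.
```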
In this research a two-step approach will be adopted. First, one has to decide how real-world objects should be modelled into features; second, one needs to store these features in such a way that the requirements in terms of querying, analysis and validation are met. An obvious step in dealing with large volumes of geographically referenced data is to use a spatial database.
This objective is expressed in the main research question:
How can a 3D topographic representation be realised in a feature-based triangular data model?
Note that the term 'triangular' is used here in a dimension-independent sense, so both triangle- and tetrahedron-based models will be considered. As mentioned before, a two-step approach will be adopted to achieve a solution to the main research question. In accordance with the two steps, two key questions can be distinguished:
How to develop a conceptual model that describes the real-world phenomena (the topographic features), given the general-purpose character of topographic data sets?
How to implement this conceptual model, i.e. how to develop a suitable DBMS data structure?
The results of this research will be summarised according to this two-step approach.
A conceptual data model for 3D topography
One of the basic assumptions within this research is the use of triangular data models. As a result, topographic features will be described as sets of triangles, and these features will be connected by triangles as well, thus creating one triangular network. This research explored two different approaches to triangular modelling of 3D topography.
The first is a very pragmatic hybrid approach that combines a 2.5D* surface with 3D objects for those cases where 2.5D modelling is not sufficient. In terms of triangular data structures, this approach combines a TIN with several TENs. These irregular data structures not only allow varying point density (depending on local model complexity), but even extend this irregularity to varying model dimensionality, thus offering the ultimate fit-for-purpose approach. Unfortunately, connecting TIN and TEN networks proved to be very difficult at design level and during prototype implementation.
The second approach avoids these problems, since it is a full 3D approach using only a TEN. Two fundamental observations are of great importance:
Physical objects have by definition a volume. In reality, there are no point, line or polygon objects; only point, line or polygon representations exist (at a certain level of abstraction/generalisation).
The real world can be considered a volume partition: a set of non-overlapping volumes that form a closed (i.e. no gaps within the domain) modelled space. Objects like 'earth' or 'air' are thus explicitly included in the model.
In topographic data models, planar features like walls or roofs are obviously very useful. They can be part of the volumetric data model as 'derived features', i.e. features that depend on the relationship between volume features. For example, the earth surface is the boundary between air and earth features, while a wall or a roof is the result of adjacent building and air features. In terms of UML, these planar features are modelled as association classes. As a result, planar features are lifetime-dependent on the association between two volume features.
Among the advantages of the full volumetric approach are its explicit inclusion of air and earth (often the subject of analysis), its extensibility (geology, air traffic/telecommunication corridors, etc.) and its strong mathematical definition (full connectivity enables the use of topology for query, analysis and validation). As a result, topographic features will be modelled in a TEN. Each feature will be represented by a set of tetrahedrons.
A data structure for 3D topography
The newly developed data structure has three important characteristics:
It has a solid mathematical foundation. Operators and definitions from the mathematical field of Poincaré simplicial homology (part of algebraic topology) are used to handle simplexes†, the basic elements in a triangular data structure. Simplexes are well defined, ordered and constructed of simplexes of lower dimension. The boundary operator can be used to derive these lower-dimensional simplexes. Based on the ordering of simplexes, one can determine orientation, a useful concept in GIS. Another important concept from simplicial homology is the simplicial complex, since such a set of connected simplexes will be used to model 3D topographic features.
*See section 2.2 for an overview of often-used dimension indicators.
†A simplex can loosely be defined as the simplest shape in a dimension, where 'simplest' refers to minimising the number of points required to define such a shape: for instance a point, a line, a triangle and a tetrahedron. See section 4.1 for a proper mathematical definition.
It is developed as a spatial database data structure. Applying definitions and operators from simplicial homology enables one to store a TEN in a relatively compact way. The new simplicial complex-based method requires only explicit storage of tetrahedrons, while simplexes of lower dimensions (triangles, edges, nodes), constraints (which guarantee feature boundary presence) and topological relationships can be derived in views. Using functions to derive views from a table is typical database functionality. In this implementation, simplexes are encoded by their vertices, similar to the notation in simplicial homology. These simplex encodings are extended with a feature identifier, indicating which topographic feature is (partly) represented by this simplex. So, a tetrahedron is encoded as S3 = <v0, v1, v2, v3, fid>. Two variants of simplex encoding have been developed: coordinate concatenation and identifier concatenation. The concept of coordinate concatenation is to concatenate x, y and z coordinates as node identifiers and to concatenate the resulting unique node codes to describe simplexes of higher dimension. The alternative approach, identifier concatenation, uses separate (meaningless) node identifiers to encode simplexes to reduce the number of coordinate repetitions, since a specific node will be part of multiple tetrahedrons. This approach requires an additional node table to store node geometries.
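The derive-rather-than-store idea rests on the boundary operator: lower-dimensional simplexes follow mechanically from the stored tetrahedrons. A minimal sketch of the encoding and the boundary operator (identifier-concatenation style; the names are illustrative, not the thesis implementation):

```python
def boundary(simplex):
    """Boundary operator of simplicial homology: the oriented (n-1)-faces
    of an ordered simplex <v0, ..., vn>, each obtained by dropping one
    vertex; the alternating sign (-1)**i records orientation."""
    return [((-1) ** i, simplex[:i] + simplex[i + 1:])
            for i in range(len(simplex))]

# A tetrahedron stored as S3 = <v0, v1, v2, v3, fid>: four node
# identifiers plus a feature identifier (identifier-concatenation variant).
tet, fid = (10, 11, 12, 13), 7

triangles = boundary(tet)                   # the four oriented triangles
edges = [e for _, t in triangles for _, e in boundary(t)]
print(triangles[0])                         # -> (1, (11, 12, 13))
print(len({frozenset(e) for e in edges}))   # -> 6 distinct edges
```

Applied iteratively, the operator yields every triangle, edge and node of the TEN from the tetrahedron table alone, which is exactly what the view-based storage scheme exploits.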
It is an editable data structure, which is a crucial prerequisite for a feasible approach to topographic data storage. Incremental updates are required, since complete rebuilds of the TEN structure will be time-consuming due to the expected data volumes. Whereas most existing update algorithms for constrained tetrahedronisations use node insertions, followed by edge reconstruction, this research presents edge insertion operators. Nine exhaustive and mutually exclusive cases are distinguished, based on the location in the TEN of the inserted edge's nodes. These operators guarantee the constrained edge's presence in the structure. Existing operators might fail to recover these edges, due to the presence of nearby constrained edges, which would typically happen in topographic data sets.
Conclusions
This dissertation presents a new topological approach to data modelling, based on a tetrahedral network. Operators and definitions from the field of simplicial homology are used to define and handle this structure of tetrahedrons. Simplicial homology provides a solid mathematical foundation for the data structure, offers full control over the orientation of simplexes, and enables one to derive substantial parts of the TEN structure efficiently, instead of explicitly storing all primitives. DBMS characteristics such as the use of views, functions and function-based indexes are extensively used to realise this potential data reduction. A proof-of-concept implementation was created, and tests with several data sets show that the prevailing view that tetrahedrons are more expensive in terms of storage than polyhedrons is not correct when using the proposed approach. Storage requirements for 3D objects in tetrahedronised form (excluding the space in between these objects) and for 3D objects stored as polyhedrons are of the same order of magnitude.
A TEN has favourable characteristics from a computational point of view. All elements of the tetrahedral network consist by definition of flat faces, all elements are convex, and they are well defined. Validation of 3D objects is also simplified by tetrahedronisation. Furthermore, a full volumetric approach enables future integration of topography with other 3D data like geological layers, polluted regions, or air traffic and telecommunication corridors. The price of this full volumetric approach in terms of storage space is high (about 75% of the tetrahedrons model air or earth); nevertheless, this approach is likely to become justifiable due to current developments towards sustainable urban development and an increased focus on environmental issues.
Finally, the innovative aspects of the proposed method have to be identified. Neither the idea of using a TEN data structure for 3D data nor the idea of using simplexes (in terms of simplicial homology) in a DBMS implementation is new. However, the proposed approach reduces data storage and eliminates the need for explicit updates of both topology and simplexes of lower dimension. By doing so, the approach tackles common drawbacks such as TEN extensiveness and the laboriousness of maintaining topology. Furthermore, applying simplicial homology offers full control over the orientation of simplexes, which is a significant advantage, especially in 3D. In addition to this aspect, the mathematical theory of simplicial homology offers a solid theoretical foundation for both the data structure and data operations. Integrating these concepts with database functionality results in a new, innovative approach to 3D data modelling.
An often-raised objection to a TEN approach is its presumed complexity. Obviously, a 1:n relation exists between features and their tetrahedron representations. However, as long as a user handles only features (as polyhedrons) and the implemented algorithms translate these polyhedrons into operations on the TEN, one can overcome the perceived complexity. Furthermore, the prevailing view that tetrahedrons are more expensive in terms of storage than polyhedrons has been falsified in this research.
Overall, the simplicial complex-based modelling approach provides a provably correct modelling method. The use of tetrahedrons is justified by the mathematical benefits and the acceptable storage requirements. Obviously, including air and earth within the model comes at a price, but, as stated earlier, this approach is likely to become justifiable due to current sustainability- and environmentally-driven developments. The decision to develop the data structure as a database structure contributes to the overall feasibility, since a database will become indispensable due to the expected data volumes.
Contents:
Acknowledgements
1 Introduction
1.1 Motivation
1.2 Objective and main research question
1.3 Research scope and limitations
1.4 Contribution of the work
1.5 Outline
2 Research background
2.1 Problem domain: Towards 3D topography
2.2 Defining dimensions in the range 2D-3D
2.3 Deriving requirements for the conceptual data model and structure from the problem
2.4 Managing 3D data: related research on 3D data structures
2.5 Triangular data structures and algorithms
2.6 Relevant database concepts
I Conceptual modelling of 3D Topography
3 Two triangular data models for 3D topography
3.1 Approach 1: an integrated 2.5D/3D model
3.2 Approach 2: a full 3D data model
3.3 The choice for the full 3D approach
II A Data structure for 3D Topography
4 Theoretical foundations: Poincare simplicial homology
4.1 Mathematical description of simplexes
4.2 Orientation of simplexes
4.3 Combining simplexes: simplicial complexes
4.4 Operations on simplexes and simplicial complexes
5 A simplicial complex-based solution for 3D topography
5.1 Representing topographic features in a TEN
5.2 Early ideas: three TEN-based data structures for the full 3D approach
5.3 Preferred solution: applying simplicial homology to the TEN
5.4 Implementing the data structure in a DBMS environment
5.5 Summary
6 Updating features in the Data Structure
6.1 Incremental update: feature insertion
6.2 Incremental update: feature deletion
6.3 Quality improvement of TEN structure
6.4 Initial bulk loading and bulk rebuild
III Evaluation and conclusions
7 Evaluation and discussion
7.1 Evaluation material: three different data sets
7.2 Evaluating bulk tetrahedronisation process
7.3 Evaluating storage requirements
7.4 Evaluating initial visualisation tools
7.5 Discussing requirements for 3D data sets with correct topology
7.6 Identifying future developments
8 Conclusions
8.1 Results
8.2 Main conclusions
8.3 Discussion
8.4 Future research
Record number: 15361. Authors' affiliation: non IGN. Theme: GEOMATIQUE. Nature: Foreign thesis. DOI: none. Non-digital accessibility: Not accessible via SUDOC. Online: https://www.ncgeo.nl/index.php/en/publicatiesgb/publications-on-geodesy/item/250 [...] Electronic resource format: URL. Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=62703
Copies (2)
Barcode: 15361-01. Shelfmark: 32.00. Medium: Book. Location: Centre de documentation. Section: Topographie. Availability: Available
Barcode: 15361-02. Shelfmark: 32.00. Medium: Book. Location: Centre de documentation. Section: Topographie. Availability: Available

Étude comparative de différentes méthodes d'estimation [Comparative study of different estimation methods] / Samuel Nahmani (2008)
Title: Étude comparative de différentes méthodes d'estimation [Comparative study of different estimation methods]. Document type: Report. Authors: Samuel Nahmani; Arnaud Pollet. Publisher: Paris: Institut Géographique National - IGN (1940-2007). Publication year: 2008. Series: Publications du LAREG. Sub-series: Memorandum. Extent: 52 p. Format: 21 x 30 cm. Languages: French (fre). Descriptors: [Vedettes matières IGN] Statistiques
[Termes IGN] algorithme du simplexe
[Termes IGN] algorithme génétique
[Termes IGN] analyse comparative
[Termes IGN] espace vectoriel
[Termes IGN] estimation statistique
[Termes IGN] méthode des moindres carrés
[Termes IGN] qualité des données
[Termes IGN] statistique mathématique
Record number: 15351. Authors' affiliation: IGN (1940-2011). Theme: MATHEMATIQUE. Nature: Technical study report. Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=40668
Copies (1)
Barcode: 15351-01. Shelfmark: 23.60. Medium: Book. Location: Centre de documentation. Section: Mathématiques. Availability: Available

Further documents in this category:
- Evolution of clusters in dynamic point patterns: with a case study of ants' simulation / Maxim Shoshany in International journal of geographical information science IJGIS, vol 21 n° 6-7 (July 2007)
- Evaluation of the Newton-Raphson method for three-point resection in photogrammetry / S.M. Easa in SaLIS Surveying and land information science, vol 67 n° 1 (March 2007)
- Cumul de mesures de télémétrie laser sur satellites / Arnaud Pollet (2006)
- Super-resolution land cover mapping using a Markov random field based approach / T. Kasetkasem in Remote sensing of environment, vol 96 n° 3 (30/06/2005)
- Deriving new minimum cost pathways from existing paths / Denis J. Dean in Cartography and Geographic Information Science, vol 32 n° 1 (January 2005)
- Estimation des paramètres de transformation entre différentes versions de l'ITRF / P. Nouaille-Degorce (2005)
- The 3-point resection problem in photogrammetry / W. Tan in Surveying and land information science, vol 64 n° 3 (01/09/2004)
- Integration of linear programming and a watershed-scale hydrologic model for proposing an optimized land-use and assessing its impact on soil conservation: a case study of the Nagwan watershed in the Hazaribagh district of Jharkhand, India / R. Kaur in International journal of geographical information science IJGIS, vol 18 n° 1 (January - February 2004)
- Approximation de surfaces moléculaires / B. Cotasson (2004)
- Schätzung von Vegetationsparametern aus multispektralen Fernerkundungsdaten / F. Kurz (2003)