Author details
Author: Hua Liao
Available documents by this author (7)
Detecting individuals' spatial familiarity with urban environments using eye movement data / Hua Liao in Computers, Environment and Urban Systems, vol 93 (April 2022)
[article]
Title: Detecting individuals' spatial familiarity with urban environments using eye movement data
Document type: Article/Communication
Authors: Hua Liao, Author; Wendi Zhao, Author; Changbo Zhang, Author; et al., Author
Publication year: 2022
Article on page(s): n° 101758
General note: bibliography
Language: English (eng)
Descriptors: [IGN terms] visual analysis
[IGN terms] machine learning
[IGN terms] random forest classification
[IGN terms] pedestrian navigation
[IGN terms] eye tracking
[IGN terms] location-based service
[IGN terms] urban area
[IGN subject headings] Geovisualization
Abstract: (author) The spatial familiarity of environments is an important high-level user context for location-based services (LBS). Knowing users' familiarity level of environments is helpful for enabling context-aware LBS that can automatically adapt information services according to users' familiarity with the environment. Unlike state-of-the-art studies that used questionnaires, sketch maps, mobile phone positioning (GPS) data, and social media data to measure spatial familiarity, this study explored the potential of a new type of sensory data - eye movement data - to infer users' spatial familiarity of environments using a machine learning approach. We collected 38 participants' eye movement data when they were performing map-based navigation tasks in familiar and unfamiliar urban environments. We trained and cross-validated a random forest classifier to infer whether the users were familiar or unfamiliar with the environments (i.e., binary classification). By combining basic statistical features and fixation semantic features, we achieved a best accuracy of 81% in a 10-fold classification and 70% in the leave-one-task-out (LOTO) classification. We found that the pupil diameter, fixation dispersion, saccade duration, fixation count and duration on the map were the most important features for detecting users' spatial familiarity. Our results indicate that detecting users' spatial familiarity from eye tracking data is feasible in map-based navigation and only a few seconds (e.g., 5 s) of eye movement data is sufficient for such detection. These results could be used to develop context-aware LBS that adapt their services to users' familiarity with the environments.
Record number: A2022-121
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
DOI: 10.1016/j.compenvurbsys.2022.101758
Online publication date: 21/01/2022
Online: https://doi.org/10.1016/j.compenvurbsys.2022.101758
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99663
in Computers, Environment and Urban Systems > vol 93 (April 2022) . - n° 101758 [article]
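The abstract above describes training a random forest classifier and evaluating it with both 10-fold and leave-one-task-out (LOTO) cross-validation. A minimal sketch of that evaluation protocol follows, assuming a hypothetical feature matrix (one row per eye-movement window), a binary familiarity label and a task identifier per window; the variable names and data layout are illustrative, not taken from the paper.

```python
# Sketch of the evaluation protocol described in the abstract (synthetic stand-in data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(380, 20))        # eye-movement features, one row per time window
y = rng.integers(0, 2, size=380)      # 1 = familiar environment, 0 = unfamiliar
tasks = rng.integers(0, 5, size=380)  # task id of each window, used for LOTO splits

clf = RandomForestClassifier(n_estimators=300, random_state=0)

# 10-fold cross-validation.
acc_10fold = cross_val_score(
    clf, X, y, cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
)

# Leave-one-task-out: each task serves once as the held-out test set.
acc_loto = cross_val_score(clf, X, y, groups=tasks, cv=LeaveOneGroupOut())

print(f"10-fold accuracy: {acc_10fold.mean():.2f}")
print(f"LOTO accuracy:    {acc_loto.mean():.2f}")
```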
Identifying map users with eye movement data from map-based spatial tasks: user privacy concerns / Hua Liao in Cartography and Geographic Information Science, vol 49 n° 1 (January 2022)
[article]
Title: Identifying map users with eye movement data from map-based spatial tasks: user privacy concerns
Document type: Article/Communication
Authors: Hua Liao, Author; Weihua Dong, Author; Zhicheng Zhan, Author
Publication year: 2022
Article on page(s): pp 50 - 69
General note: bibliography
Language: English (eng)
Descriptors: [IGN terms] random forest classification
[IGN terms] behavior
[IGN terms] confidentiality
[IGN terms] identity
[IGN terms] map reading
[IGN terms] eye tracking
[IGN terms] orientation
[IGN terms] location data sharing
[IGN terms] privacy protection
[IGN terms] user
[IGN terms] cartographic visualization
[IGN subject headings] Cartology
Abstract: (author) Individuals with different characteristics exhibit different eye movement patterns in map reading and wayfinding tasks. In this study, we aim to explore whether and to what extent map users' eye movements can be used to detect who created them. Specifically, we focus on the use of gaze data for inferring users' identities when users are performing map-based spatial tasks. We collected 32 participants' eye movement data as they utilized maps to complete a series of self-localization and spatial orientation tasks. We extracted five sets of eye movement features and trained a random forest classifier. We used a leave-one-task-out approach to cross-validate the classifier and achieved the best identification rate of 89%, with a 2.7% equal error rate. This result is among the best performances reported in eye movement user identification studies. We evaluated the feature importance and found that basic statistical features (e.g. pupil size, saccade latency and fixation dispersion) yielded better performance than other feature sets (e.g. spatial fixation densities, saccade directions and saccade encodings). The results open the potential to develop personalized and adaptive gaze-based map interactions but also raise concerns about user privacy protection in data sharing and gaze-based geoapplications.
Record number: A2022-018
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
DOI: 10.1080/15230406.2021.1980435
Online publication date: 06/10/2021
Online: https://doi.org/10.1080/15230406.2021.1980435
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=99161
in Cartography and Geographic Information Science > vol 49 n° 1 (January 2022) . - pp 50 - 69 [article]
Copies (1)
Barcode | Call number | Medium | Location | Section | Availability
032-2022011 | RAB | Journal | Documentation centre | In storage L003 | Available
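The 2.7% equal error rate (EER) quoted above is the operating point at which the false acceptance and false rejection rates are equal. A small illustrative computation of an EER from verification scores follows; the scores are synthetic and the score model is an assumption, not the paper's method.

```python
# Illustrative equal error rate (EER) computation from verification scores (synthetic data).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
genuine = rng.normal(loc=2.0, scale=1.0, size=200)   # scores for same-user attempts
impostor = rng.normal(loc=0.0, scale=1.0, size=200)  # scores for different-user attempts

y_true = np.concatenate([np.ones_like(genuine), np.zeros_like(impostor)])
scores = np.concatenate([genuine, impostor])

fpr, tpr, _ = roc_curve(y_true, scores)
fnr = 1 - tpr
idx = np.argmin(np.abs(fpr - fnr))      # point where false accepts and false rejects cross
eer = (fpr[idx] + fnr[idx]) / 2
print(f"Equal error rate: {eer:.3f}")
```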
Comparing pedestrians' gaze behavior in desktop and in real environments / Weihua Dong in Cartography and Geographic Information Science, Vol 47 n° 5 (September 2020)
[article]
Title: Comparing pedestrians' gaze behavior in desktop and in real environments
Document type: Article/Communication
Authors: Weihua Dong, Author; Hua Liao, Author; Bing Liu, Author; et al., Author
Publication year: 2020
Article on page(s): pp 432 - 451
General note: bibliography
Language: English (eng)
Descriptors: [IGN terms] comparative analysis
[IGN terms] visual analysis
[IGN terms] behavior
[IGN terms] urban space
[IGN terms] map reading
[IGN terms] virtual world
[IGN terms] pedestrian navigation
[IGN terms] eye tracking
[IGN terms] pedestrian
[IGN terms] statistical test
[IGN terms] work
[IGN terms] computer vision
[IGN subject headings] Geovisualization
Abstract: (author) This research is motivated by the widespread use of desktop environments in the lab and by the recent trend of conducting real-world eye-tracking experiments to investigate pedestrian navigation. Despite the existing significant differences between the real world and the desktop environments, how pedestrians' visual behavior in real environments differs from that in desktop environments is still not well understood. Here, we report a study that recorded eye movements for a total of 82 participants while they were performing five common navigation tasks in an unfamiliar urban environment (N = 39) and in a desktop environment (N = 43). By analyzing where the participants allocated their visual attention, what objects they fixated on, and how they transferred their visual attention among objects during navigation, we found similarities and significant differences in the general fixation indicators, spatial fixation distributions and attention to the objects of interest. The results contribute to the ongoing debate over the validity of using desktop environments to investigate pedestrian navigation by providing insights into how pedestrians allocate their attention to visual stimuli to accomplish navigation tasks in the two environments.
Record number: A2020-488
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.1080/15230406.2020.1762513
Online publication date: 29/05/2020
Online: https://doi.org/10.1080/15230406.2020.1762513
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=95658
in Cartography and Geographic Information Science > Vol 47 n° 5 (September 2020) . - pp 432 - 451 [article]
Copies (1)
Barcode | Call number | Medium | Location | Section | Availability
032-2020051 | RAB | Journal | Documentation centre | In storage L003 | Available
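The comparison described above amounts to contrasting fixation indicators between two independent participant groups (real world, N = 39; desktop, N = 43). The sketch below shows one plausible form of such a group comparison, a Mann-Whitney U test on synthetic fixation durations; the specific test and the data are assumptions, not reproduced from the paper.

```python
# Sketch: compare one fixation indicator between two independent groups (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
real_world = rng.normal(loc=320, scale=60, size=39)  # mean fixation duration (ms) per participant
desktop = rng.normal(loc=280, scale=55, size=43)

# Non-parametric two-sided test; no normality assumption on the indicator.
u_stat, p_value = stats.mannwhitneyu(real_world, desktop, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```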
Comparing the roles of landmark visual salience and semantic salience in visual guidance during indoor wayfinding / Weihua Dong in Cartography and Geographic Information Science, vol 47 n° 3 (May 2020)
[article]
Title: Comparing the roles of landmark visual salience and semantic salience in visual guidance during indoor wayfinding
Document type: Article/Communication
Authors: Weihua Dong, Author; Tong Qin, Author; Hua Liao, Author
Publication year: 2020
Article on page(s): pp 229 - 243
General note: bibliography
Language: English (eng)
Descriptors: [IGN terms] visual analysis
[IGN terms] interpretation (psychology)
[IGN terms] eye tracking
[IGN terms] landmark
[IGN terms] questionnaire
[IGN terms] salience
[IGN terms] indoor scene
[IGN terms] semantic segmentation
[IGN terms] statistical test
[IGN terms] vision
[IGN terms] computer vision
[IGN subject headings] Geovisualization
Abstract: (author) Landmark visual salience (characterized by features that contrast with their surroundings and visual peculiarities) and semantic salience (characterized by features with unusual or important meaning and content in the environment) are two important factors that affect an individual's visual attention during wayfinding. However, empirical evidence regarding which factor dominates visual guidance during indoor wayfinding is rare, especially in real-world environments. In this study, we assumed that semantic salience dominates the guidance of visual attention, which means that semantic salience will correlate with participants' fixations more significantly than visual salience. Notably, in previous studies, semantic salience was shown to guide visual attention in static images or familiar scenes in a laboratory environment. To validate this assumption, first, we collected the eye movement data of 22 participants as they found their way through a building. We then computed the landmark visual and semantic salience using computer vision models and questionnaires, respectively. Finally, we conducted correlation tests to verify our assumption. The results failed to validate our assumption and show that the role of salience in visual guidance in a real-world wayfinding process is different from the role of salience in perceiving static images or scenes in a laboratory. Visual salience dominates visual attention during indoor wayfinding, but the roles of salience in visual guidance are mixed across different landmark classes and tasks. The results provide new evidence for understanding how pedestrians visually interpret landmark information during real-world indoor wayfinding.
Record number: A2020-169
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.1080/15230406.2019.1697965
Online publication date: 18/12/2019
Online: https://doi.org/10.1080/15230406.2019.1697965
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=94841
in Cartography and Geographic Information Science > vol 47 n° 3 (May 2020) . - pp 229 - 243 [article]
Copies (1)
Barcode | Call number | Medium | Location | Section | Availability
032-2020031 | RAB | Journal | Documentation centre | In storage L003 | Available
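The correlation tests mentioned above can be pictured as correlating per-landmark salience scores with how often participants fixated each landmark. A minimal sketch with synthetic values follows; the use of Spearman correlation and the variable names are assumptions, not taken from the paper.

```python
# Sketch: correlate landmark salience with fixation counts (synthetic values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_landmarks = 30
visual_salience = rng.random(n_landmarks)     # e.g. scores from a computer-vision saliency model
semantic_salience = rng.random(n_landmarks)   # e.g. scores from questionnaire ratings
fixation_counts = rng.poisson(lam=5, size=n_landmarks)

for name, salience in (("visual", visual_salience), ("semantic", semantic_salience)):
    rho, p = stats.spearmanr(salience, fixation_counts)
    print(f"{name} salience vs fixations: rho = {rho:.2f}, p = {p:.3f}")
```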
Inferring user tasks in pedestrian navigation from eye movement data in real-world environments / Hua Liao in International journal of geographical information science IJGIS, Vol 33 n° 3-4 (March - April 2019)
[article]
Title: Inferring user tasks in pedestrian navigation from eye movement data in real-world environments
Document type: Article/Communication
Authors: Hua Liao, Author; Weihua Dong, Author; Haosheng Huang, Author; et al., Author
Publication year: 2019
Article on page(s): pp 739 - 763
General note: bibliography
Language: English (eng)
Descriptors: [IGN terms] machine learning
[IGN terms] route calculation
[IGN terms] random forest classification
[IGN terms] inference
[IGN terms] pedestrian navigation
[IGN terms] eye tracking
[IGN terms] cognitive representation
[IGN subject headings] Geovisualization
Abstract: (author) Eye movement data convey a wealth of information that can be used to probe human behaviour and cognitive processes. To date, eye tracking studies have mainly focused on laboratory-based evaluations of cartographic interfaces; in contrast, little attention has been paid to eye movement data mining for real-world applications. In this study, we propose using machine-learning methods to infer user tasks from eye movement data in real-world pedestrian navigation scenarios. We conducted a real-world pedestrian navigation experiment in which we recorded eye movement data from 38 participants. We trained and cross-validated a random forest classifier for classifying five common navigation tasks using five types of eye movement features. The results show that the classifier can achieve an overall accuracy of 67%. We found that statistical eye movement features and saccade encoding features are more useful than the other investigated types of features for distinguishing user tasks. We also identified that the choice of classifier, the time window size and the eye movement features considered are all important factors that influence task inference performance. Results of the research open doors to some potential real-world innovative applications, such as navigation systems that can provide task-related information depending on the task a user is performing.
Record number: A2019-214
Author affiliation: non-IGN
Theme: GEOMATICS
Nature: Article
HAL nature: ArtAvecCL-RevueIntern
DOI: 10.1080/13658816.2018.1482554
Online publication date: 26/06/2018
Online: https://doi.org/10.1080/13658816.2018.1482554
Electronic resource format: article URL
Permalink: https://documentation.ensg.eu/index.php?lvl=notice_display&id=92686
in International journal of geographical information science IJGIS > Vol 33 n° 3-4 (March - April 2019) . - pp 739 - 763 [article]
Copies (2)
Barcode | Call number | Medium | Location | Section | Availability
079-2019031 | RAB | Journal | Documentation centre | In storage L003 | Available
079-2019032 | RAB | Journal | Documentation centre | In storage L003 | Available
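The abstract above notes that the time window size and the extracted eye movement features strongly influence task inference. The sketch below cuts a gaze stream into fixed time windows and computes a few simple statistics per window; the sample layout (timestamp, x, y, pupil size) and the feature choices are invented for illustration and are much simpler than the paper's five feature types.

```python
# Sketch: cut a gaze stream into fixed time windows and compute simple per-window statistics.
import numpy as np

def window_features(samples: np.ndarray, window_s: float = 5.0, rate_hz: float = 60.0) -> np.ndarray:
    """samples has shape (n, 4) with columns (t, x, y, pupil); this layout is invented."""
    win = int(window_s * rate_hz)
    feats = []
    for start in range(0, len(samples) - win + 1, win):
        w = samples[start:start + win]
        feats.append([
            w[:, 3].mean(),                                    # mean pupil size
            w[:, 3].std(),                                     # pupil size variability
            np.ptp(w[:, 1]) * np.ptp(w[:, 2]),                 # rough gaze dispersion (bounding box)
            np.linalg.norm(np.diff(w[:, 1:3], axis=0), axis=1).sum(),  # total gaze path length
        ])
    return np.asarray(feats)

# Example: 60 s of synthetic 60 Hz gaze data -> 12 five-second feature vectors.
rng = np.random.default_rng(4)
stream = np.column_stack([
    np.arange(3600) / 60.0,          # timestamps (s)
    rng.random(3600),                # normalized x gaze coordinate
    rng.random(3600),                # normalized y gaze coordinate
    rng.normal(3.0, 0.2, 3600),      # pupil size (mm)
])
print(window_features(stream).shape)  # (12, 4)
```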
Assessing the effectiveness and efficiency of map colour for colour impairments using an eye-tracking approach / Weihua Dong in Cartographic journal (the), Vol 53 n° 2 (May 2016)
Eye tracking to explore the potential of enhanced imagery basemaps in web mapping / Weihua Dong in Cartographic journal (the), vol 51 n° 4 (November 2014)