[Purpose/significance] Eye tracking is widely used in human-computer interaction (HCI) research. Analyzing this body of work provides a deeper understanding of the current state of eye-tracking research in HCI, the roles eye tracking plays in the interaction process, and the likely directions of future research. [Method/process] Using Web of Science and ACM as data sources, this paper applied Python and VOSviewer to cluster the topics of related research and also summarized the content of representative articles, analyzing eye-tracking research in HCI from the perspectives of both "quantity" and "quality". [Result/conclusion] Interactive input, interactive output, and application directions were the main topics of the related research. The research trends identified through the "quantity" and "quality" analyses can serve as a reference for subsequent research on eye tracking and human-computer interaction.
Lu Liuxing, Shi Yu, Li Jiyuan, Wu Dan. Eye-tracking in Human-computer Interaction: Status Quo, Roles, and Trends[J]. Library and Information Service, 2020, 64(1): 113-119.
DOI: 10.13266/j.issn.0252-3116.2020.01.014
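The abstract states that Python and VOSviewer were used to cluster the topics of the related research, but the paper's actual pipeline is not published. The following is only a minimal sketch of the general technique under stated assumptions: a hypothetical "records.csv" keyword export from Web of Science or the ACM Digital Library, an assumed minimum-occurrence threshold of 5, and TF-IDF plus k-means standing in for VOSviewer's co-occurrence-based clustering.

# Minimal sketch, not the authors' published pipeline.
from collections import Counter

import pandas as pd
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

records = pd.read_csv("records.csv")  # hypothetical bibliographic export

# Normalize each record's author-keyword list.
keyword_lists = [
    [kw.strip().lower() for kw in str(cell).split(";") if kw.strip()]
    for cell in records["Keywords"].dropna()
]

# Drop rare keywords before clustering; VOSviewer applies a similar
# minimum-occurrence threshold (the value 5 here is an assumption).
freq = Counter(kw for kws in keyword_lists for kw in kws)
docs = [
    " ".join(kw.replace(" ", "_") for kw in kws if freq[kw] >= 5)
    for kws in keyword_lists
]
docs = [d for d in docs if d]

# k=3 mirrors the three topic groups the abstract reports
# (interactive input, output, and application directions).
X = TfidfVectorizer().fit_transform(docs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for cluster in range(3):
    print(f"cluster {cluster}: {(labels == cluster).sum()} records")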
[1] WU D, LU L, HE D, et al. Tracking the frontiers of overseas human-computer interaction theory and practice[C]//Research Center for the Development and Evaluation of Philosophy and Social Sciences in Chinese Universities, Wuhan University. Annual report on the development of overseas humanities and social sciences. Wuhan: Wuhan University Press, 2017: 351-379.
[2] FENG C, SHEN M. Eye-tracking technology and its application in human-computer interaction[J]. Journal of Zhejiang University (Science Edition), 2002, 29(2): 225-232.
[3] DUCHOWSKI A T. Eye tracking methodology: theory and practice[M]. 3rd ed. Cham: Springer International Publishing AG, 2017: 11-12.
[4] FUHL W, TONSEN M, BULLING A, et al. Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art[J]. Machine vision and applications, 2016, 27(8): 1275-1288.
[5] BULLING A, ROGGEN D, TRÖSTER G. Wearable EOG goggles: seamless sensing and context-awareness in everyday environments[J]. Journal of ambient intelligence and smart environments, 2009, 1(2): 157-171.
[6] LI D, PARKHURST D J. OpenEyes: an open-hardware open-source system for low-cost eye tracking[J]. Journal of modern optics, 2006, 53(9): 1295-1311.
[7] HORNOF A, CAVENDER A, HOSELTON R. EyeDraw: a system for drawing pictures with eye movements[C]//Proceedings of the 6th international ACM SIGACCESS conference on computers and accessibility. New York: ACM, 2004: 86-93.
[8] OVIATT S. Multimodal interfaces for dynamic interactive maps[C]//Proceedings of the SIGCHI conference on human factors in computing systems. New York: ACM, 1996: 95-102.
[9] ZHOU X. Research on the establishment and application of ergonomic models in human-computer interaction[M]. Beijing: Capital University of Economics and Business Press, 2014: 7.
[10] MOSTAFA J, GWIZDKA J. Deepening the role of the user: neuro-physiological evidence as a basis for studying and improving search[C]//Proceedings of the 2016 ACM on conference on human information interaction and retrieval. New York: ACM, 2016: 63-70.
[11] WU D, LIU C. Eye-tracking analysis in interactive information retrieval research[J]. Journal of Library Science in China, 2019, 45(2): 111-130.
[12] SENDURUR E, YILDIRIM Z. Students' web search strategies with different task types: an eye-tracking study[J]. International journal of human-computer interaction, 2015, 31(2): 101-111.
[13] WU D, LIANG S, DONG J. Eye-movement changes in cross-device search from the perspective of query sequences: a comparison of the information preparation and information re-use stages[J]. Journal of the China Society for Scientific and Technical Information, 2019(2): 220-230.
[14] LIANG S, WU D, DONG J, et al. Attention distribution on cross-device search engine results pages: an empirical analysis based on eye-movement visual data[J]. Documentation, Information & Knowledge, 2018(1): 27-35.
[15] WU D, XU S, XU X, et al. Users' visual attention flow on the search result page of digital cultural heritage collection[C]//Proceedings of the association for information science and technology, 2019, 56(1): 816-818.
[16] WU K, HUANG Y. Emotions and eye-tracking of differing age groups searching one-book wall[J]. Aslib journal of information management, 2018, 70(4): 434-454.
[17] WEILL-TESSIER P, TURNER J, GELLERSEN H. How do you look at what you touch?: a study of touch interaction and gaze correlation on tablets[C]//Proceedings of the ninth biennial ACM symposium on eye tracking research & applications. New York: ACM, 2016: 329-330.
[18] PFEUFFER K, ALEXANDER J, CHONG M K, et al. Gaze-touch: combining gaze with multi-touch for interaction on the same surface[C]//Proceedings of the 27th annual ACM symposium on user interface software and technology. New York: ACM, 2014: 509-518.
[19] CHATTERJEE I, XIAO R, HARRISON C. Gaze+gesture: expressive, precise and targeted free-space interactions[C]//Proceedings of the 2015 ACM on international conference on multimodal interaction. New York: ACM, 2015: 131-138.
[20] VIEIRA D, FREITAS J D, ACARTÜRK C, et al. "Read that article": exploring synergies between gaze and speech interaction[C]//Proceedings of the 17th international ACM SIGACCESS conference on computers & accessibility. New York: ACM, 2015: 341-342.
[21] KONTOGIORGOS D, SIBIRTSEVA E, PEREIRA A. Multimodal reference resolution in collaborative assembly tasks[C]//Proceedings of the 4th international workshop on multimodal analyses enabling artificial agents in human-machine interaction. New York: ACM, 2018: 38-42.
[22] ROIDER F, RÜMELIN S, PFLEGING B, et al. The effects of situational demands on gaze, speech and gesture input in the vehicle[C]//Proceedings of the 9th ACM international conference on automotive user interfaces and interactive vehicular applications. New York: ACM, 2017: 94-102.
[23] ŠPAKOV O, MAJARANTA P. Scrollable keyboards for casual eye typing[J]. PsychNology journal, 2009(7): 159-173.
[24] BOZOMITU R G, PĂSĂRICĂ A, CEHAN V, et al. Implementation of eye-tracking system based on circular Hough transform algorithm[C]//Proceedings of the 2015 e-health and bioengineering conference. IEEE, 2015: 1-4.
[25] BOZOMITU R G, PĂSĂRICĂ A, TĂRNICERIU D, et al. Development of an eye tracking-based human-computer interface for real-time applications[J]. Sensors, 2019, 19(16): 3630.
[26] JIANG J, GUO F, CHEN J, et al. Applying eye-tracking technology to measure interactive experience toward the navigation interface of mobile games considering different visual attention mechanisms[J]. Applied sciences-Basel, 2019, 9(16): 3242.
[27] ALT F, SCHNEEGASS S, AUDA J, et al. Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays[C]//Proceedings of the 19th international conference on intelligent user interfaces. New York: ACM, 2014: 267-272.
[28] ALGHOFAILI R, SAWAHATA Y, HUANG H, et al. Lost in style: gaze-driven adaptive aid for VR navigation[C]//Proceedings of the 2019 CHI conference on human factors in computing systems. New York: ACM, 2019: paper No. 348.
[29] YOSHIMURA A, KHOKHAR A, BORST C W. Visual cues to restore student attention based on eye gaze drift, and application to an offshore training system[C]//Symposium on spatial user interaction. New York: ACM, 2019.
[30] HEWETT T T, BAECKER R, CARD S, et al. ACM SIGCHI curricula for human-computer interaction[R]. New York: ACM, 1992: 13-28.