ETRA 2020: Stuttgart, Germany - Short Papers
- Andreas Bulling, Anke Huckauf, Eakta Jain, Ralph Radach, Daniel Weiskopf:
ETRA '20: 2020 Symposium on Eye Tracking Research and Applications, Short Papers, Stuttgart, Germany, June 2-5, 2020. ACM 2020, ISBN 978-1-4503-7134-6
SESSION 1: Methods
- Alexander Schäfer, Tomoko Isomura, Gerd Reis, Katsumi Watanabe, Didier Stricker:
MutualEyeContact: A conversation analysis tool with focus on eye contact. 1:1-1:5
- Koki Koshikawa, Masato Sasaki, Takamasa Utsu, Kentaro Takemura:
Polarized Near-Infrared Light Emission for Eye Gaze Estimation. 2:1-2:4
- Yuto Tamura, Kentaro Takemura:
Estimating Point-of-Gaze using Smooth Pursuit Eye Movements without Implicit and Explicit User-Calibration. 3:1-3:4
- Wolfgang Fuhl, Hong Gao, Enkelejda Kasneci:
Neural networks for optical vector and eye ball parameter estimation. 4:1-4:5
- Wolfgang Fuhl, Hong Gao, Enkelejda Kasneci:
Tiny convolution, decision tree, and binary neuronal networks for robust and real time pupil outline estimation. 5:1-5:5
- Pawel Kasprowski, Katarzyna Harezlak:
Protecting from Lunchtime Attack Using an Uncalibrated Eye Tracker Signal. 6:1-6:5
- Chia-Kai Yang, Tanja Blascheck, Chat Wacharamanotham:
A Comparison of a Transition-based and a Sequence-based Analysis of AOI Transition Sequences. 7:1-7:5
- Sai Akanksha Punuganti, Jorge Otero-Millan:
Detection of Saccades and Quick-Phases in Eye Movement Recordings with Nystagmus. 8:1-8:5
- Moayad Mokatren, Tsvi Kuflik, Ilan Shimshoni:
EyeLinks: Methods to compute reliable stereo mappings used for eye gaze tracking. 9:1-9:5
- Nathaniel Barbara, Tracey A. Camilleri, Kenneth P. Camilleri:
EOG-Based Ocular and Gaze Angle Estimation Using an Extended Kalman Filter. 10:1-10:5
- Jan Ehlers, Annika Meinecke:
Voluntary Pupil Control in Noisy Environments. 11:1-11:5
- Ayush Kumar, Prantik Howlader, Rafael Garcia, Daniel Weiskopf, Klaus Mueller:
Challenges in Interpretability of Neural Networks for Eye Movement Data. 12:1-12:5
- Olivier Le Meur, Pierre-Adrien Fons:
Predicting image influence on visual saliency distribution: the focal and ambient dichotomy. 13:1-13:5
- Peter Hausamann, Christian Sinnott, Paul R. MacNeilage:
Positional head-eye tracking outside the lab: an open-source solution. 14:1-14:5
- Thomas C. Kübler:
The Perception Engineer's Toolkit for Eye-Tracking data analysis. 15:1-15:4
- Gonzalo Garde, Andoni Larumbe-Bergera, Benoît Bossavit, Rafael Cabeza, Sonia Porta, Arantxa Villanueva:
Gaze estimation problem tackled through synthetic images. 16:1-16:5
- Dmytro Katrychuk, Henry K. Griffith, Oleg Komogortsev:
A Calibration Framework for Photosensor-based Eye-Tracking System. 17:1-17:5
- Artem V. Belopolsky:
Getting more out of Area of Interest (AOI) analysis with SPLOT. 18:1-18:4
- Isayas B. Adhanom, Samantha C. Lee, Eelke Folmer, Paul R. MacNeilage:
GazeMetrics: An Open-Source Tool for Measuring the Data Quality of HMD-based Eye Trackers. 19:1-19:5
- Cristina Palmero Cantariño, Oleg V. Komogortsev, Sachin S. Talathi:
Benefits of temporal information for appearance-based gaze estimation. 20:1-20:5
SESSION 2: Privacy and Security
- Efe Bozkir, Ali Burak Ünal, Mete Akgün, Enkelejda Kasneci, Nico Pfeifer:
Privacy Preserving Gaze Estimation using Synthetic Images via a Randomized Encoding Based Framework. 21:1-21:5
- Aayush Kumar Chaudhary, Jeff B. Pelz:
Privacy-Preserving Eye Videos using Rubber Sheet Model. 22:1-22:5
SESSION 3: Saliency
- Hamed Rezazadegan Tavakoli, Ali Borji, Juho Kannala, Esa Rahtu:
Deep Audio-Visual Saliency: Baseline Model and Data. 23:1-23:5
- David Geisler, Daniel Weber, Nora Castner, Enkelejda Kasneci:
Exploiting the GBVS for Saliency aware Gaze Heatmaps. 24:1-24:5
SESSION 4: User Interfaces and Interaction
- Immo Schuetz, T. Scott Murdison, Marina Zannoli:
A Psychophysics-inspired Model of Gaze Selection Performance. 25:1-25:5
- Daniel Walper, Julia Kassau, Philipp Methfessel, Timo Pronold, Wolfgang Einhäuser:
Optimizing user interfaces in food production: gaze tracking is more sensitive for A-B-testing than behavioral data alone. 26:1-26:4
- Sven Bertel, Stefanie Wetzel:
Comparing Eye Movements Between Physical Rotation Interaction Techniques. 27:1-27:5
- Sunu Wibirama, Suatmi Murnani, Noor Akhmad Setiawan:
Spontaneous Gaze Gesture Interaction in the Presence of Noises and Various Types of Eye Movements. 28:1-28:5
- Yasmeen Abdrabou, Ken Pfeuffer, Mohamed Khamis, Florian Alt:
GazeLockPatterns: Comparing Authentication Using Gaze and Touch for Entering Lock Patterns. 29:1-29:6
SESSION 5: Virtual Reality
- Ashima Keshava, Anete Aumeistere, Krzysztof Izdebski, Peter König:
Decoding Task From Oculomotor Behavior In Virtual Reality. 30:1-30:5
- Johannes Meyer, Thomas Schlebusch, Hans Spruit, Jochen Hellmig, Enkelejda Kasneci:
A Novel Eye-Tracking Sensor for AR Glasses Based on Laser Self-Mixing Showing Exceptional Robustness Against Illumination. 31:1-31:5
SESSION 6: Applications
- Seoyoung Ahn, Conor Kelton, Aruna Balasubramanian, Gregory J. Zelinsky:
Towards Predicting Reading Comprehension From Gaze Behavior. 32:1-32:5
- Alma-Sophia Merscher, Matthias Gamer:
Freezing of Gaze in Anticipation of Avoidable Threat: A threat-specific proxy for freezing-like behavior in humans. 33:1-33:3
- Minoru Nakayama, Kohei Shoda:
Impact of evoked reward expectation on ocular information during a controlled card game. 34:1-34:5
- Sarah-Christin Freytag:
Sweet Pursuit: User Acceptance and Performance of a Smooth Pursuit controlled Candy Dispensing Machine in a Public Setting. 35:1-35:5
- Darlene E. Edewaard, Richard A. Tyrrell, Andrew T. Duchowski, Ellen C. Szubski, Savana S. King:
Using Eye Tracking to Assess the Temporal Dynamics By Which Drivers Notice Cyclists in Daylight: Drivers Becoming Aware of Cyclists. 36:1-36:5
- Rémy Siegfried, Bozorgmehr Aminian, Jean-Marc Odobez:
ManiGaze: a Dataset for Evaluating Remote Gaze Estimator in Object Manipulation Situations. 37:1-37:5
- Gavindya Jayawardena, Anne Michalek, Andrew T. Duchowski, Sampath Jayarathna:
Pilot Study of Audiovisual Speech-In-Noise (SIN) Performance of Young Adults with ADHD. 38:1-38:5
- Anna Lisa Gert, Benedikt V. Ehinger, Tim C. Kietzmann, Peter König:
Faces strongly attract early fixations in naturally sampled real-world stimulus materials. 39:1-39:5
- János Szalma, Béla Weiss:
Data-Driven Classification of Dyslexia Using Eye-Movement Correlates of Natural Reading. 40:1-40:4
SESSION 7: Cognition
- Shannon Patricia Devlin, Jake Ryan Flynn, Sara Lu Riggs:
How Shared Visual Attention Patterns of Pairs Unfold Over Time when Workload Changes. 41:1-41:5
- Benedict C. O. F. Fehringer:
One threshold to rule them all? Modification of the Index of Pupillary Activity to optimize the indication of cognitive load. 42:1-42:5
- Charlotte Schwedes, Oliver C. Raufeisen, Dirk Wentura:
Insights into the processes underlying the early fixation-based memory effect. 43:1-43:4
- Matteo Valsecchi, Arash Akbarinia, Raquel Gil Rodríguez, Karl R. Gegenfurtner:
Pedestrians' Egocentric Vision: Individual and Collective Analysis. 44:1-44:5
- Yuki Kubota, Tomohiko Hayakawa, Masatoshi Ishikawa:
Quantitative Perception Measurement of the Rotating Snakes Illusion Considering Temporal Dependence and Gaze Information. 45:1-45:4
SESSION 1: Analysis Techniques
- Raphael Menges, Sophia Kramer, Stefan Hill, Marius Nisslmueller, Chandan Kumar, Steffen Staab:
A Visualization Tool for Eye Tracking Data Analysis in the Web. 46:1-46:5
- Feiyang Wang, Adam James Bradley, Christopher Collins:
Eye Tracking for Target Acquisition in Sparse Visualizations. 47:1-47:5
- Michael Burch, Neil Timmermans:
Sankeye: A Visualization Technique for AOI Transitions. 48:1-48:5
SESSION 2: Evaluation
- Seyda Öney, Nils Rodrigues, Michael Becher, Thomas Ertl, Guido Reina:
Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality. 49:1-49:5
- Nelusa Pathmanathan, Michael Becher, Nils Rodrigues, Guido Reina, Thomas Ertl, Daniel Weiskopf, Michael Sedlmair:
Eye vs. Head: Comparing Gaze Methods for Interaction in Augmented Reality. 50:1-50:5
- Annalena Streichert, Katrin Angerbauer, Magdalena Schwarzl, Michael Sedlmair:
Comparing Input Modalities for Shape Drawing Tasks. 51:1-51:5
SESSION 1
- Selina Emhardt, Halszka Jarodzka, Saskia Brand-Gruwel, Christian Drumm, Tamara van Gog:
Introducing Eye Movement Modeling Examples for Programming Education and the Role of Teacher's Didactic Guidance. 52:1-52:4
- Natalia Chitalkina, Roman Bednarik, Marjaana Puurtinen, Hans Gruber:
When you ignore what you see: How to study proof-readers' error in pseudocode reading. 53:1-53:5
- Florian Hauser, Stefan Schreistetter, Rebecca Reuter, Jürgen Horst Mottok, Hans Gruber, Kenneth Holmqvist, Nick Schorr:
Code Reviews in C++: Preliminary Results from an Eye Tracking Study. 54:1-54:5
- Minoru Nakayama, Hiroto Harada:
Eye Movement Features in response to Comprehension Performance during the Reading of Programs. 55:1-55:5
- Unaizah Obaidellah, Tanja Blascheck, Drew T. Guarnera, Jonathan I. Maletic:
A Fine-grained Assessment on Novice Programmers' Gaze Patterns on Pseudocode Problems. 56:1-56:5
- Naser Al Madi, Cole S. Peterson, Bonita Sharif, Jonathan I. Maletic:
Can the E-Z Reader Model Predict Eye Movements Over Code? Towards a Model of Eye Movements Over Source Code. 57:1-57:4
SESSION 1: Considerations for Application of Eye-Tracking in Games
- Birte Heinemann, Matthias Ehlenz, Ulrik Schroeder:
Eye-Tracking in Educational Multi-Touch Games: Design-Based (interaction) research and great visions. 58:1-58:5
- Michael Burch, Kuno Kurzhals:
Visual Analysis of Eye Movements During Game Play. 59:1-59:5
- Alessandro Cierro, Thibault Philippette, Thomas François, Sebastien Nahon, Patrick Watrin:
Eye-tracking for Sense of Immersion and Linguistic Complexity in the Skyrim Game: Issues and Perspectives. 60:1-60:5
SESSION 2: Gaze-Based Studies
- Pawel Kasprowski, Katarzyna Harezlak:
Using the Uncalibrated Eye Tracker Signal in the Gaming Environment. 61:1-61:5
- Michael Lankes, Andreas Stöckl:
Gazing at Pac-Man: Lessons learned from an Eye-Tracking Study Focusing on Game Difficulty. 62:1-62:5
- Martin Kocur, Martin Johannes Dechant, Michael Lankes, Christian Wolff, Regan L. Mandryk:
Eye Caramba: Gaze-based Assistance for Virtual Reality Aiming and Throwing Tasks in Games. 63:1-63:6