Stoll, Josef; Sarey Khanie, Mandana; Mende, Sandra; Wienold, Jan; Andersen, Marilyne; Einhäuser, Wolfgang (2013-10-24). Real-world tasks with full control over the visual scene: combining mobile gaze tracking and 4π light-field measurements. Conference paper.
https://infoscience.epfl.ch/handle/20.500.14299/96402

Abstract: Measuring gaze allocation during scene perception typically faces a dilemma: full control over the stimulus requires comparably constrained scenarios, while realistic tasks leave the visual input hard to control. We propose to capture the full (4π) light field of an office space while participants perform typical office tasks. Using a wearable eye-tracking device ("EyeSeeCam"), gaze, head, and body orientation are measured along with subjective well-being and performance. In the present study, 52 participants performed four office tasks ("input", "reflection", "output", "interaction"), each with three different tools (phone, computer, paper), under varying lighting conditions and outside views. We found that eye and head movements were affected by the view in fundamentally different ways, and that this dependence was modulated by task and tool, unless the participants' task was related to reading. Importantly, for some tasks head movements rather than eye movements dominated gaze allocation. Since head and body movements frequently remain unaddressed in eye-tracking studies, our data highlight the importance of unconstrained settings. Beyond assessing the interaction between top-down (task-related) and bottom-up (stimulus-related) factors in deploying gaze and attention under real-world conditions, such data are indispensable for realistic models of optimal workplace lighting and thus for occupants' well-being at the workplace.

Keywords: Eye movement; Eye-tracking methods; Office lighting; Discomfort glare
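
Note: the abstract's point that head movements can dominate gaze allocation follows from how gaze is composed: the gaze direction in world coordinates is the head (and body) orientation applied to the eye-in-head direction, so a head-fixed analysis misattributes gaze shifts carried by the head. The sketch below is a minimal illustration of this decomposition, not code from the study; the yaw/pitch parameterization and axis conventions are assumptions made for the example.

    import numpy as np

    def rotation_yaw_pitch(yaw, pitch):
        """Rotation for a yaw (left/right) then pitch (up/down), in radians."""
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        # Yaw about the vertical (z) axis, pitch about the lateral (y) axis.
        Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
        return Rz @ Ry

    def gaze_in_world(head_yaw, head_pitch, eye_yaw, eye_pitch):
        """Gaze direction in world coordinates: head orientation composed
        with the eye-in-head direction. Forward is the x axis."""
        forward = np.array([1.0, 0.0, 0.0])
        R_head = rotation_yaw_pitch(head_yaw, head_pitch)
        R_eye = rotation_yaw_pitch(eye_yaw, eye_pitch)
        return R_head @ (R_eye @ forward)

    # A 30-degree head turn with the eyes straight ahead reaches the same
    # world target as a 30-degree eye movement with the head still; an
    # eye-only analysis would record the two very differently.
    g_head = gaze_in_world(np.radians(30), 0.0, 0.0, 0.0)
    g_eye = gaze_in_world(0.0, 0.0, np.radians(30), 0.0)
    print(np.allclose(g_head, g_eye))  # True

The example makes the measurement requirement concrete: recovering where gaze lands in the scene needs head (and body) orientation alongside the eye-in-head signal, which is what the combined mobile eye tracking and 4π light-field setup provides.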