%0 Conference Proceedings
%T Eye Pull, Eye Push: Moving Objects between Large Screens and Personal Devices with Gaze and Touch
%+ School of Computing and Communications [Lancaster] (SCC)
%+ Perceptual User Interfaces Group [Saarbrücken]
%+ The Human Computer Interaction Lab
%A Turner, Jayson
%A Alexander, Jason
%A Bulling, Andreas
%A Schmidt, Dominik
%A Gellersen, Hans
%Z Part 4: Gaze-Enabled Interaction Design
%< with peer review
%( Lecture Notes in Computer Science
%B 14th International Conference on Human-Computer Interaction (INTERACT)
%C Cape Town, South Africa
%Y Paula Kotzé
%Y Gary Marsden
%Y Gitte Lindgaard
%Y Janet Wesson
%Y Marco Winckler
%I Springer
%3 Human-Computer Interaction – INTERACT 2013
%V LNCS-8118
%N Part II
%P 170-186
%8 2013-09-02
%D 2013
%R 10.1007/978-3-642-40480-1_11
%K Content Transfer
%K Interaction Techniques
%K Eye-Based Interaction
%K Mobile
%K Cross-Device
%Z Computer Science [cs]
%Z Conference papers
%X Previous work has validated the eyes and mobile input as a viable approach for pointing at and selecting out-of-reach objects. This work presents Eye Pull, Eye Push, a novel interaction concept for content transfer between public and personal devices using gaze and touch. We present three techniques that enable this interaction: Eye Cut & Paste, Eye Drag & Drop, and Eye Summon & Cast. We outline and discuss several scenarios in which these techniques can be used. In a user study, we found that participants responded well to the visual feedback provided by Eye Drag & Drop during object movement. In contrast, we found that although Eye Summon & Cast significantly improved performance, participants had difficulty coordinating their hands and eyes during interaction.
%G English
%Z TC 13
%2 https://inria.hal.science/hal-01501741/document
%2 https://inria.hal.science/hal-01501741/file/978-3-642-40480-1_11_Chapter.pdf
%L hal-01501741
%U https://inria.hal.science/hal-01501741
%~ IFIP-LNCS
%~ IFIP
%~ IFIP-AICT
%~ IFIP-TC13
%~ IFIP-INTERACT
%~ IFIP-LNCS-8118