000232061 001__ 232061
000232061 005__ 20180913064545.0
000232061 0247_ $$2doi$$a10.1145/2559184.2559203
000232061 037__ $$aCONF
000232061 245__ $$aA facial affect mapping engine
000232061 269__ $$a2014
000232061 260__ $$aNew York, New York, USA$$bACM Press$$c2014
000232061 336__ $$aConference Papers
000232061 520__ $$aFacial expressions play a crucial role in human interaction. Interactive digital games can help teach people to both express and recognise them. Such interactive games can benefit from the ability to alter user expressions dynamically and in real-time. In this demonstration, we present the Facial Affect Mapping Engine (FAME), a framework for mapping and manipulating facial expressions across images and video streams. Our system is fully automatic, runs in real-time, and does not require any specialist hardware. FAME presents new possibilities for the designers of intelligent interactive digital games.
000232061 700__ $$aImpett, Leonardo
000232061 700__ $$aRobinson, Peter
000232061 700__ $$aBaltrusaitis, Tadas
000232061 7112_ $$athe companion publication of the 19th international conference$$cHaifa, Israel$$d24-27 02 2014
000232061 773__ $$q33-36$$tProceedings of the companion publication of the 19th international conference on Intelligent User Interfaces - IUI Companion '14
000232061 8564_ $$s2870296$$uhttps://infoscience.epfl.ch/record/232061/files/iui2014demo.pdf$$yn/a$$zn/a
000232061 909C0 $$0252579$$pIINFCOM$$xU13217
000232061 909CO $$ooai:infoscience.tind.io:232061$$pconf$$pIC
000232061 917Z8 $$x254907
000232061 917Z8 $$x148230
000232061 937__ $$aEPFL-CONF-232061
000232061 973__ $$aOTHER$$rREVIEWED$$sPUBLISHED
000232061 980__ $$aCONF