We present current research developments in the Virtual Life of autonomous synthetic actors. After a brief description of the perception-action principles with a few simple examples, we emphasize the concept of virtual sensors for virtual humans. In particular, we describe in detail our experience in implementing virtual vision, touch, and audition. We then describe perception-based locomotion, a multisensor-based method for automatic grasping, and vision-based ball games.