We present a versatile, rule-based behavioral animation system featuring autonomous humanoid actors whose behavior is driven by synthetic sensors for perceiving the virtual environment. We combine the following in a consistent approach: L-systems serving as a behavioral production-rule system; a particle system; an acoustic environment model, including a speech-recognition module; a virtual life network; and a humanoid library. Together, these components create a real-time structured virtual environment that both high-level autonomous humanoids and interactive users can easily share.
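At its core, an L-system is a parallel string-rewriting system: every symbol in the current string is replaced simultaneously according to production rules. A minimal sketch of this mechanism (in Python, using the classic toy "algae" alphabet for illustration, not the paper's actual behavior rules):

```python
def rewrite(axiom: str, rules: dict, steps: int) -> str:
    """Apply the production rules to every symbol in parallel, `steps` times.

    Symbols without a matching rule are copied unchanged (identity rule).
    """
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's classic algae system: A -> AB, B -> A
rules = {"A": "AB", "B": "A"}
print(rewrite("A", rules, 3))  # -> "ABAAB"
```

In a behavioral setting, the rewritten symbols would denote actions or environment states rather than growth elements, but the parallel production-rule mechanism is the same.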