As a result of an improved understanding of brain mechanisms and unprecedented technical advances in neural recording methods and computer technology, it is now possible to translate large-scale brain signals into movement intentions in real time. Such decoding of a subject's actual and imagined movements enables new treatment paradigms for severely impaired patients, such as neural control of a prosthesis. The field of brain-machine interfaces (BMIs) explores the tremendous potential of hybrid systems linking neural tissue to artificial devices. BMI operation involves a bidirectional learning process: the BMI system learns to decode brain signals by uncovering their relationship to voluntary movements, while the brain itself plastically adapts to the task. Proper BMI training is critical for successful adoption by the patient. We believe that training a subject in a realistic virtual environment prior to the use of a physical prosthetic device is an efficient and safe method that can significantly facilitate the design of practical neural prostheses for patients in need.

In this dissertation we describe the control of a 3D virtual monkey (the avatar) as visual feedback for BMIs in rhesus monkeys, and we address a number of key questions:

• The monkeys' interaction with the avatar.
• Modulation of neurons in primary somatosensory (S1) and motor (M1) cortical areas during passive observation of the avatar being touched.
• Modulation of neural responses by the observation of the avatar's movements.
• Changes in neural responses during long-term brain control of the avatar.

We describe the plasticity of the brain's representation of the body resulting from visual stimuli delivered via the avatar combined with tactile stimuli applied to the subject's physical arm. We show how the avatar can be used to train rhesus monkeys to perform complex tasks.
We present behavioral evidence that rhesus monkeys respond to the avatar's shape and motions and can even relate it to the representation of another monkey. Finally, two novel instances of a brain-controlled avatar are shown: a complete closed-loop brain-machine-brain interface with sensory feedback delivered through direct cortical stimulation, and the first successful multi-limb BMI. We also study a simplified BMI learning process based on the passive observation of the avatar's arm movements.