An important goal of collective robotics is the design of control systems that allow groups of robots to accomplish common tasks by coordinating without centralized control. In this paper, we study how a group of physically assembled robots can display coherent behavior on the basis of a simple neural controller that has access only to local sensory information. The controller is synthesized through artificial evolution in a simulated environment so that the robots display coordinated motion. The evolved controller proves robust enough to allow a smooth transfer from simulation to reality. Moreover, it generalizes to new experimental conditions, such as different group sizes and shapes and different connection mechanisms. The performance of the neural controller downloaded to and tested on real robots is comparable to that obtained in simulation.
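The abstract does not specify the evolutionary setup. As an illustrative sketch only, the synthesis of a neural controller through artificial evolution can be outlined as below: a population of weight vectors for a small perceptron controller is improved by truncation selection and Gaussian mutation. The fitness function here (agreement between two motor outputs, a crude stand-in for "move in a common direction") and all parameter values are assumptions for illustration, not the authors' actual simulation.

```python
import math
import random

def make_controller(weights, n_in, n_out):
    # Single-layer perceptron: each output is tanh of a weighted sum of inputs.
    def control(inputs):
        return [math.tanh(sum(weights[j * n_in + i] * inputs[i]
                              for i in range(n_in)))
                for j in range(n_out)]
    return control

def fitness(weights, trials, n_in=4, n_out=2):
    # Toy surrogate fitness: reward controllers whose two motor outputs
    # agree across sensor inputs (illustrative, not the paper's measure).
    ctrl = make_controller(weights, n_in, n_out)
    score = 0.0
    for inputs in trials:
        left, right = ctrl(inputs)
        score += 1.0 - abs(left - right) / 2.0
    return score / len(trials)

def evolve(n_in=4, n_out=2, pop_size=20, gens=30, sigma=0.2, seed=0):
    # Generational evolutionary algorithm with truncation selection.
    rng = random.Random(seed)
    genome_len = n_in * n_out
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(genome_len)]
           for _ in range(pop_size)]
    trials = [[rng.uniform(0.0, 1.0) for _ in range(n_in)] for _ in range(10)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda g: fitness(g, trials, n_in, n_out),
                        reverse=True)
        elite = ranked[: pop_size // 4]       # keep the best quarter
        pop = list(elite)
        while len(pop) < pop_size:            # refill with mutated copies
            parent = rng.choice(elite)
            pop.append([w + rng.gauss(0.0, sigma) for w in parent])
    best = max(pop, key=lambda g: fitness(g, trials, n_in, n_out))
    return best, fitness(best, trials, n_in, n_out)
```

In this sketch, the evolved weight vector would then be downloaded unchanged onto the real robots, relying on the robustness of the controller to bridge the simulation-to-reality gap described in the abstract.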