Abstract

Vision-based drone swarms have recently emerged as a promising alternative to address the fault-tolerance and flexibility limitations of centralized and communication-based aerial collective systems. Although most vision-based control algorithms rely on the detection of neighbors, they usually neglect critical perceptual factors such as visual occlusions and their effect on the scalability of the swarm. To estimate the impact of occlusions on the detection of neighbors, we propose a simple but perceptually realistic visual neighbor selection model that discards obstructed agents. We evaluate the visibility model using a potential-field-based flocking algorithm with up to one thousand agents, showing that occlusions have adverse effects on the inter-agent distances and velocity alignment as the swarm scales up, both in terms of group size and density. In particular, we find that small agent displacements have considerable effects on neighbor visibility and lead to control discontinuities. We show that the destabilizing effects of visibility switches, i.e., agents continuously becoming visible or invisible, can be mitigated if agents select their neighbors from adjacent Voronoi regions. We validate the resulting flocking algorithm using up to one hundred agents with quadcopter dynamics and subject to sensor noise in a high-fidelity physics simulator. The results show that Voronoi-based interactions enable vision-based swarms to remain collision-free, ordered, and cohesive in the presence of occlusions. These results are consistent across group sizes, agent number densities, and relative localization noise. The source code and experimental data are available at https://github.com/lis-epfl/vmodel.
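For illustration only, below is a minimal sketch of the two neighbor-selection steps described in the abstract: discarding agents that are visually occluded by closer agents, and restricting interactions to agents in adjacent Voronoi regions (obtained here via the dual Delaunay triangulation). The function names, the disc-based occlusion criterion, and the use of NumPy/SciPy are assumptions for this sketch; see the linked repository for the actual vmodel implementation.

```python
# Sketch of occlusion-aware and Voronoi-based neighbor selection.
# Illustrative assumptions only, not the vmodel implementation.
import numpy as np
from scipy.spatial import Delaunay  # Delaunay edges connect agents with adjacent Voronoi regions


def visible_neighbors(positions, radius, focal=0):
    """Return indices of agents not fully occluded from the focal agent.

    Each agent is modeled as a disc of the given radius; agent j is discarded
    if some closer agent k's angular extent fully covers that of j
    (assumed occlusion criterion).
    """
    rel = positions - positions[focal]            # relative positions, shape (N, 2)
    dist = np.linalg.norm(rel, axis=1)
    angle = np.arctan2(rel[:, 1], rel[:, 0])      # bearing of each agent from the focal agent
    half = np.arcsin(np.clip(radius / np.maximum(dist, 1e-9), 0.0, 1.0))  # half-angle subtended

    others = [i for i in range(len(positions)) if i != focal]
    visible = []
    for j in others:
        occluded = any(
            dist[k] < dist[j]
            # wrap the bearing difference to [-pi, pi] before comparing angular extents
            and abs(np.angle(np.exp(1j * (angle[j] - angle[k])))) + half[j] <= half[k]
            for k in others if k != j
        )
        if not occluded:
            visible.append(j)
    return visible


def voronoi_neighbors(positions, focal=0):
    """Return indices of agents whose Voronoi regions are adjacent to the focal one."""
    tri = Delaunay(positions)
    indptr, indices = tri.vertex_neighbor_vertices
    return indices[indptr[focal]:indptr[focal + 1]].tolist()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 10, size=(20, 2))        # 20 agents in a 10 m x 10 m area
    print("visible:", visible_neighbors(pos, radius=0.25))
    print("voronoi:", voronoi_neighbors(pos))
```

Because the Delaunay triangulation is the dual of the Voronoi tessellation, its edges directly yield the agents in adjacent Voronoi regions; restricting interactions to this set is what mitigates the control discontinuities caused by visibility switches.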
