Abstract

The use of Unmanned Aerial Systems (UAS) for traffic monitoring offers advantages such as a broader field of view, flexibility, privacy, and cost-efficiency compared to other traffic monitoring sensors such as loop detectors or fixed surveillance cameras. UAS were instrumental in the recent pNEUMA experiment, in which a swarm of drones collected a large-scale urban traffic dataset of vehicle trajectories. These trajectory data are subject to challenges such as noise caused by visual restrictions, perspective distortions, and human-induced errors. Since the pNEUMA dataset lacks its imagery component, we present an extended version named pNEUMA Vision, which incorporates the imagery data and vehicle annotations in the form of image coordinates, along with newly added vehicle trajectory features such as azimuth. Moreover, using a novel anomaly detection method proposed in this work, we demonstrate that visually restricted trajectories are highly prone to becoming anomalous, and vice versa. Specifically, we distinguish between stationary and non-stationary errors and argue that the latter account for the largest part of the noise. Furthermore, we analyze the new visual dataset with two different computer vision methods for estimating the number of vehicles on the roads from the input images. In particular, we show that counting vehicles in drone images via a density map achieves results comparable to conventional detection-based counting. Results show that time-space diagrams plotted from density-map predictions identify queues on congested urban roads better than a widely used vehicle detection method.
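As a minimal illustrative sketch (not the authors' implementation), the two counting strategies contrasted above differ in how a count is obtained: detection-based counting tallies confident bounding boxes, whereas density-map counting sums a predicted per-pixel density that integrates to the vehicle count. All inputs, names, and thresholds below are hypothetical.

```python
import numpy as np

def count_from_detections(scores, score_threshold=0.5):
    """Detection-based counting: one vehicle per bounding box whose
    confidence exceeds the threshold (hypothetical detector outputs)."""
    return int(np.sum(np.asarray(scores) >= score_threshold))

def count_from_density_map(density_map):
    """Density-map counting: the predicted map integrates to the vehicle
    count, so summing all pixel values yields the estimate."""
    return float(np.sum(density_map))

# Toy usage with made-up numbers (illustration only).
scores = np.array([0.92, 0.47, 0.81])             # three detections, one weak
density = np.zeros((128, 128), dtype=np.float32)
density[20:25, 30:35] = 3.0 / 25.0                # a blob integrating to ~3 vehicles

print(count_from_detections(scores))      # 2 boxes above threshold
print(count_from_density_map(density))    # ~3.0
```

Because the density map is summed rather than thresholded per object, partially occluded or very small vehicles in drone imagery can still contribute fractional counts, which is one reason density-based counts may track queues in congested scenes differently than detection-based counts.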
