Cascade of Descriptors to Detect and Track Objects Across Any Network of Cameras
Most multi-camera systems assume a well-structured environment to detect and match objects across cameras: the cameras must be fixed and calibrated. In this work, a novel system is presented to detect and match arbitrary objects in a network of uncalibrated fixed and mobile cameras. Objects are detected with the mobile cameras given only their observations from the fixed cameras; no training stage or training data is used. Detected objects are correctly matched across cameras, leading to a better understanding of the scene. A cascade of dense region descriptors is proposed to describe any object of interest. Various region descriptors are studied, such as color histograms, histograms of oriented gradients, Haar-wavelet responses, and covariance matrices of various features. The proposed descriptor outperforms existing approaches such as the scale-invariant feature transform (SIFT) and the speeded-up robust features (SURF). Moreover, a sparse scan of the image plane is proposed to reduce the search space of the detection and matching process, approaching near real-time performance. The approach is robust to changes in illumination, viewpoint, color distribution, and image quality, and it handles partial occlusions.
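To make the covariance-matrix region descriptor mentioned above concrete, the following sketch (Python with NumPy, not taken from the paper) builds a covariance descriptor over a simple per-pixel feature set and compares two regions with a log-Euclidean distance; the choice of features and of the distance metric are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def region_features(patch):
    """Stack per-pixel features for a grayscale patch:
    (x, y, intensity, |Ix|, |Iy|) -- one possible feature set."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(patch.astype(float))
    return np.stack([xs.ravel(), ys.ravel(),
                     patch.ravel().astype(float),
                     np.abs(ix).ravel(), np.abs(iy).ravel()], axis=1)

def covariance_descriptor(patch):
    """Covariance matrix (d x d) of the per-pixel feature vectors."""
    return np.cov(region_features(patch), rowvar=False)

def spd_log(C, eps=1e-6):
    """Matrix logarithm of a symmetric positive-definite matrix."""
    w, v = np.linalg.eigh(C)
    w = np.maximum(w, eps)
    return (v * np.log(w)) @ v.T

def covariance_distance(C1, C2):
    """Log-Euclidean distance between two covariance descriptors
    (a stand-in for the Riemannian metrics often used with them)."""
    return np.linalg.norm(spd_log(C1) - spd_log(C2), ord='fro')

# Example: compare a template region against a candidate region.
rng = np.random.default_rng(0)
template = rng.random((32, 16))
candidate = rng.random((32, 16))
print(covariance_distance(covariance_descriptor(template),
                          covariance_descriptor(candidate)))
```

In a cascade, such a descriptor distance would be one of several stages (alongside, e.g., color histograms or HOG comparisons) applied at the sparsely scanned candidate locations, with candidates rejected early when a cheap stage already scores poorly.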