This paper presents a distributed algorithm for detecting patterns, or their geometrically transformed versions, in noisy images. The proposed method projects the observed signal onto a redundant, structured dictionary of functions that is distributed among general-purpose vision sensors. Each sensor approximates the projections onto its own part of the dictionary and transmits this compact information to a central fusion center. The pattern detection problem is then cast as a parameter estimation problem, where the parameters of the geometric transformation of the pattern of interest are sought, rather than the pattern itself. These transformation parameters are estimated by introducing a score function over the parameter space. This approach allows the fusion center to work directly in the space of features computed by the sensors, without any signal reconstruction. It also provides a generic framework in which the processing of the image is driven directly by the detection task. Experimental results demonstrate the effectiveness of the proposed method and its resilience to observation noise.
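The pipeline described above (distributed projections on a split dictionary, then a score maximized over transformation parameters at the fusion center) can be illustrated with a minimal 1-D sketch. This is not the paper's actual algorithm: it assumes shift-only transformations, a Gaussian pattern, a dictionary of its circular shifts split between two hypothetical sensors, and a simple inner-product score; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D setup: the "image" is a noisy, shifted copy of a pattern.
N = 128
pattern = np.exp(-0.5 * ((np.arange(N) - N // 2) / 4.0) ** 2)  # Gaussian bump
true_shift = 17
signal = np.roll(pattern, true_shift) + 0.3 * rng.standard_normal(N)

# Redundant dictionary of shifted atoms, split between two sensors.
atoms = np.stack([np.roll(pattern, s) for s in range(N)])   # (N, N)
sensor_parts = np.array_split(np.arange(N), 2)              # atom indices per sensor

# Each sensor computes projections of the observation on its own atoms
# only, and transmits these few coefficients to the fusion center.
received = {i: atoms[idx] @ signal for i, idx in enumerate(sensor_parts)}

# Fusion center: score each candidate shift by comparing the received
# projections with the projections a shifted pattern would produce,
# without ever reconstructing the signal itself.
def score(shift):
    template = np.roll(pattern, shift)
    s = 0.0
    for i, idx in enumerate(sensor_parts):
        s += received[i] @ (atoms[idx] @ template)
    return s

estimated = max(range(N), key=score)
print(estimated)  # close to the true shift despite the noise
```

The key point the sketch mirrors is that the fusion center only ever touches the transmitted projection coefficients; the score over the parameter space replaces any attempt to recover the pattern or the image.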