Distributed stream processing represents a novel computing paradigm in which data, sensed externally and possibly preprocessed, is pushed asynchronously to various connected computing devices with heterogeneous capabilities for processing. It enables novel applications typically characterized by the need to process high-volume data streams in a timely and responsive fashion; example applications include sensor networks, location-tracking services, distributed speech recognition, and network management. Recent work in large-scale distributed stream processing tackles various research challenges in both the application domain and the underlying system. The main focus of this paper is to highlight some of the signal processing challenges that such a novel computing framework brings. We first briefly introduce the main concepts behind distributed stream processing. We then define the notion of relevant information from two related information-theoretic approaches. Finally, we survey existing techniques for sensing and quantizing information given a set of classification, detection, and estimation tasks, which we refer to as task-driven signal processing, and we address some of the related unexplored research challenges.