Infoscience

Conference paper

How to Make Stream Processing More Mainstream

Stream processing has a long history as a way of describing and implementing specific kinds of computational processes. So far, however, it has largely remained an exotic field of endeavor, with relatively little momentum compared to traditional von Neumann computing, and a large variety of programming models, languages, tools, and hardware realizations. However, as sequential machines cease to become faster over time, and future growth in computational speed will clearly derive from an increase in parallelism, the time has come for a general parallel programming model to supplant or complement the von Neumann abstraction. Many modern forms of computation are very well suited to a stream-based description and implementation, such as complex media coding [1], network processing [2], imaging and digital signal processing (e.g., see [3], [4]), as well as embedded control [5]. Together with the move toward parallelism, this represents a huge opportunity for stream processing. This paper briefly introduces a simple stream-based model and discusses some of its properties in light of the requirements for a general parallel programming model.
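To make the notion of a stream-based description concrete, the following is a minimal illustrative sketch (not the paper's actual model) of a dataflow-style network: actors connected by FIFO channels, each actor firing when enough input tokens are available. All names here (`Channel`, `Scale`, `fire`) are hypothetical, chosen only for this example.

```python
from collections import deque

class Channel:
    """Unbounded FIFO channel connecting two actors."""
    def __init__(self):
        self.q = deque()
    def put(self, token):
        self.q.append(token)
    def get(self):
        return self.q.popleft()
    def ready(self):
        return len(self.q) > 0

class Scale:
    """Actor that multiplies each input token by a constant."""
    def __init__(self, inp, out, k):
        self.inp, self.out, self.k = inp, out, k
    def fire(self):
        # Firing rule: consume one token, produce one token.
        if self.inp.ready():
            self.out.put(self.inp.get() * self.k)
            return True
        return False

# Wire up a tiny network: input channel -> Scale(3) -> output channel.
a, b = Channel(), Channel()
actor = Scale(a, b, 3)
for x in [1, 2, 3]:
    a.put(x)
while actor.fire():   # run until no actor can fire
    pass
print(list(b.q))      # -> [3, 6, 9]
```

The point of such a model is that each actor's behavior depends only on the tokens it consumes, so independent actors can execute in parallel without shared mutable state, which is what makes stream programs a candidate for a general parallel programming abstraction.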

    Reference

    • GR-LSM-CONF-2008-031

    Record created on 2010-01-20, modified on 2016-08-08
