Abstract

The natural representation of data streams, parallelism, and composition has made dataflow an attractive programming model for expressing a wide range of stream and media processing applications, and has led the ISO/IEC MPEG committee to base its latest video coding standards on this model. This paper describes and compares methodologies and metrics for the optimization of signal processing algorithms represented as dataflow programs. Our approach is based on the analysis of execution traces and addresses some of the complexity challenges arising from the very large data sets required to evaluate real-world applications. The methodology and experimental results are demonstrated and evaluated in two full-size case studies: an MPEG-4 SP and an AVC/H.264 video decoder.
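To make the notion of trace-based analysis concrete, the following is a minimal sketch, not the authors' tool: it models an execution trace as a directed acyclic graph of actor firings and computes a critical-path metric, a quantity commonly used when analyzing dataflow traces to bound the attainable parallelism. The names (`Step`, `critical_path`) and the per-firing cost weighting are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    sid: int            # unique id of this firing within the trace
    actor: str          # name of the actor that fired
    cost: float = 1.0   # assumed per-firing cost (e.g. cycles); hypothetical weighting
    deps: list = field(default_factory=list)  # ids of firings this one depends on

def critical_path(trace):
    """Longest weighted path through the trace DAG: a lower bound on execution
    time regardless of how many processing elements are available."""
    finish = {}
    # assumes the trace is listed in a valid causal (topological) order
    for s in trace:
        start = max((finish[d] for d in s.deps), default=0.0)
        finish[s.sid] = start + s.cost
    return max(finish.values(), default=0.0)

if __name__ == "__main__":
    # toy trace: two independent firings feeding a third
    trace = [
        Step(0, "parser", 2.0),
        Step(1, "idct", 3.0),
        Step(2, "merge", 1.0, deps=[0, 1]),
    ]
    print(critical_path(trace))  # 4.0: the idct (3.0) + merge (1.0) chain dominates
```

In this toy trace the critical path is shorter than the total work (6.0), and the ratio of the two gives a rough upper bound on the speedup available from parallel scheduling of the dataflow program.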
