Abstract

Caching is an approach to smoothing out the variability of network traffic over time. It has recently been shown that the local cache memories at the users can be exploited to reduce the peak traffic far more efficiently than previously believed. In this work we improve upon the existing results and introduce a novel caching strategy that exploits simultaneous coded placement and coded delivery to decrease the worst-case achievable rate for $2$ files and $K$ users. We show that for any cache size $\frac{1}{K} < M < 1$ our scheme outperforms the state of the art.
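
For context, a minimal sketch (not from the paper) of a commonly used baseline in this regime: the worst-case rate of the Maddah-Ali--Niesen coded caching scheme, evaluated for $N = 2$ files and $K$ users. Whether this is exactly the "state of the art" scheme the abstract compares against is an assumption; the snippet only illustrates the memory-rate trade-off in the cache-size range $\frac{1}{K} < M < 1$.

def man_rate(M, K, N=2):
    """Worst-case delivery rate (in files) of the Maddah-Ali--Niesen scheme
    at cache size M, using memory sharing between its corner points.
    NOTE: baseline for comparison only, not the scheme proposed in this work."""
    # Corner points: M_t = t*N/K with rate R_t = K*(1 - M_t/N)*min(1/(1+t), N/K)
    corners = []
    for t in range(K + 1):
        M_t = t * N / K
        R_t = K * (1 - M_t / N) * min(1 / (1 + t), N / K)
        corners.append((M_t, R_t))
    # Memory sharing: linear interpolation between adjacent corner points
    for (M0, R0), (M1, R1) in zip(corners, corners[1:]):
        if M0 <= M <= M1:
            lam = (M - M0) / (M1 - M0)
            return (1 - lam) * R0 + lam * R1
    raise ValueError("M must lie in [0, N]")

if __name__ == "__main__":
    K = 10
    for M in (0.2, 0.5, 0.9):  # cache sizes inside (1/K, 1) for K = 10
        print(f"M = {M}: baseline worst-case rate = {man_rate(M, K):.3f}")

The values printed by the hypothetical script above are the reference curve that any improved scheme for $2$ files and small caches would need to fall below.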
