A novel framework for image inpainting is proposed, relying on graph-based diffusion processes. Depending on how the graph is constructed, both flow-based and exemplar-based inpainting methods can be implemented with the same equations, thus providing a unified framework for geometry- and texture-based approaches to inpainting. Furthermore, the variational formulation overcomes the usual sensitivity of exemplar-based methods to heuristic choices by providing an explicit evolution criterion. The use of graphs also makes our framework more flexible than previous non-local variational formulations, allowing, for example, spatial and non-local constraints to be mixed, and a data term to be used for smoother blending between the initial image and the result.
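As a minimal illustration of the general idea (not the method of this paper), the following sketch performs inpainting by heat diffusion on the simplest possible graph, a 4-neighbour pixel grid: known pixels act as boundary data, and their values diffuse into the masked region via the graph Laplacian. The function name, step size `tau`, and iteration count are illustrative choices.

```python
import numpy as np

def graph_inpaint(img, mask, n_iter=500, tau=0.2):
    """Toy graph-diffusion inpainting on a 4-neighbour grid graph.

    img  : 2-D float array of pixel values
    mask : boolean array, True where the pixel value is unknown
    Known pixels are held fixed; unknown pixels evolve by an
    explicit Euler step of the graph heat equation u_t = L u.
    """
    u = img.astype(float).copy()
    u[mask] = 0.0  # arbitrary initialisation of the hole
    for _ in range(n_iter):
        # neighbour values via edge-replicating shifts (Neumann boundary)
        up    = np.pad(u, ((1, 0), (0, 0)), mode='edge')[:-1, :]
        down  = np.pad(u, ((0, 1), (0, 0)), mode='edge')[1:, :]
        left  = np.pad(u, ((0, 0), (1, 0)), mode='edge')[:, :-1]
        right = np.pad(u, ((0, 0), (0, 1)), mode='edge')[:, 1:]
        lap = up + down + left + right - 4.0 * u  # grid-graph Laplacian
        u[mask] += tau * lap[mask]  # update only the unknown pixels
    return u
```

With a non-local graph whose edges connect similar patches instead of adjacent pixels, the same Laplacian update transports texture rather than smooth geometry, which is the unification the abstract alludes to.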