We propose a semiparametric model for regression problems involving multiple response variables. Conditional dependencies between the responses are captured by a linear mixture of Gaussian processes. We derive an efficient approximate inference scheme for this model whose complexity is linear in the number of training data points, and show how the mixing matrix and kernel parameters can be learned with empirical Bayesian techniques. The inference scheme exploits conditional independencies between the latent variables and can be viewed as a variant of belief propagation with nonparametric messages. We present experimental results on a meteorological task.
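As a minimal illustrative sketch (not the paper's implementation), the generative side of a linear mixture of Gaussian processes can be simulated as follows. The kernel choice (squared exponential), its lengthscale, the number of latent processes, and the random mixing matrix `Phi` are all assumptions made here for illustration: each response is a fixed linear combination of shared latent GP draws plus noise, which is what induces the conditional dependencies between responses.

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0):
    # Squared-exponential kernel matrix for 1-D inputs x (assumed choice).
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n, n_latent, n_outputs = 50, 2, 3
x = np.linspace(0.0, 5.0, n)

# Draw independent latent GP functions u_j at the inputs x.
K = rbf_kernel(x) + 1e-8 * np.eye(n)  # jitter for numerical stability
L = np.linalg.cholesky(K)
U = L @ rng.standard_normal((n, n_latent))  # shape (n, n_latent)

# Mix them linearly: y_p(x) = sum_j Phi[p, j] * u_j(x) + noise.
Phi = rng.standard_normal((n_outputs, n_latent))  # hypothetical mixing matrix
Y = U @ Phi.T + 0.1 * rng.standard_normal((n, n_outputs))
print(Y.shape)  # (50, 3): n inputs, n_outputs correlated responses
```

Because all responses share the same latent functions, observing one response is informative about the others; the paper's contribution is inferring the latent processes and learning `Phi` and the kernel parameters efficiently, which this sampling sketch does not cover.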