Mixtures of Experts Estimate A Posteriori Probabilities
The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation as a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that, for classification problems, minimizing this ME error function leads to ME outputs that estimate the a posteriori probabilities of class membership of the input vector.
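As a minimal sketch of the model the abstract describes (not the paper's implementation), the following shows an ME forward pass for classification: a softmax gating network produces mixing weights over the experts, and the ME output is the gate-weighted combination of the experts' class distributions. The weight matrices here are random placeholders standing in for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

n_experts, n_features, n_classes = 3, 4, 2

# Placeholder parameters (would be learned by minimizing the ME error function).
W_gate = rng.normal(size=(n_features, n_experts))
W_experts = rng.normal(size=(n_experts, n_features, n_classes))

def me_output(x):
    # Gating weights g_i(x): a probability distribution over experts.
    g = softmax(x @ W_gate)
    # Each expert emits a softmax distribution over the classes.
    expert_out = softmax(np.einsum('f,efc->ec', x, W_experts), axis=-1)
    # ME output: convex combination of expert distributions.
    return g @ expert_out

x = rng.normal(size=n_features)
y = me_output(x)
print(y, y.sum())
```

Because the output is a convex combination of per-expert class distributions, it is itself a valid probability distribution over the classes, which is consistent with interpreting the ME outputs as posterior estimates.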
Published in "Proceedings of the International Conference on Artificial Neural Networks (ICANN'97)"