Mixtures of Experts Estimate A Posteriori Probabilities

The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that, for classification problems, minimizing this ME error function leads the ME outputs to estimate the a posteriori probabilities of class membership of the input vector.
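For reference, the error function arising from the ME mixture-model interpretation is, in the standard formulation of Jacobs et al. (1991), the negative log-likelihood of the training set; the notation below (gating outputs $g_i$, expert models $P_i$) is illustrative and may not match the paper's exact symbols:

$$
E \;=\; -\sum_{t=1}^{N} \ln \sum_{i} g_i\!\left(\mathbf{x}^{(t)}\right) \, P_i\!\left(\mathbf{d}^{(t)} \mid \mathbf{x}^{(t)}\right),
$$

where the gating network outputs satisfy $g_i(\mathbf{x}) \ge 0$ and $\sum_i g_i(\mathbf{x}) = 1$, and $P_i(\mathbf{d} \mid \mathbf{x})$ is the probability that expert $i$ assigns to target $\mathbf{d}$ given input $\mathbf{x}$. Assuming the usual 1-of-c target coding and the usual combined output $\mathbf{y}(\mathbf{x}) = \sum_i g_i(\mathbf{x})\,\mathbf{y}_i(\mathbf{x})$, this combined output is the quantity that, per the abstract, estimates the a posteriori class probabilities when $E$ is minimized.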


Year:
1997
Publisher:
IDIAP
Note:
Published in "Proceedings of the International Conference on Artificial Neural Networks (ICANN'97)"