Mixtures of Experts Estimate A Posteriori Probabilities

The mixtures of experts (ME) model offers a modular structure suited to a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation as a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that, for classification problems, minimizing this ME error function leads to ME outputs that estimate the a posteriori probabilities of class membership of the input vector.
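For reference, a minimal sketch of the mixture-model view that underlies the ME error function, written in the standard notation of the mixtures-of-experts literature (the symbols g_j and P_j and the exact form below are assumptions for illustration, not taken from this record): the ME output is the mixture

\[ P(\mathbf{t} \mid \mathbf{x}) \;=\; \sum_{j} g_j(\mathbf{x}) \, P_j(\mathbf{t} \mid \mathbf{x}), \]

where the gating outputs g_j(x) are non-negative and sum to one, and P_j is the conditional density modelled by expert j. The associated error function is the negative log-likelihood of the training set,

\[ E \;=\; -\sum_{n} \ln \sum_{j} g_j(\mathbf{x}^{(n)}) \, P_j(\mathbf{t}^{(n)} \mid \mathbf{x}^{(n)}), \]

and the result reported in the abstract is that, for 1-of-c classification targets, minimizing such an error function leads to ME outputs that approximate the posterior class probabilities P(C_k | x).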


Editor(s):
Gerstner, W.
Germond, A.
Hasler, M.
Nicoud, J.-D.
Published in:
Proceedings of the International Conference on Artificial Neural Networks (ICANN'97), vol. 1327, pp. 499-504
Presented at:
Proceedings of the International Conference on Artificial Neural Networks (ICANN'97)
Year:
1997
Publisher:
Springer-Verlag, Berlin
Note:
(IDIAP-RR 97-07)