Classification using localized mixtures of experts

A mixture of experts consists of a gating network that learns to partition the input space and of expert networks assigned to these different regions. This paper focuses on the choice of the gating network. First, a localized gating network based on a mixture of linear latent variable models is proposed, extending the gating network introduced by Xu et al., which is based on Gaussian mixture models. It is shown that this localized mixture of experts model can be trained with the Expectation-Maximization (EM) algorithm. The localized model is then compared, on a set of classification problems, with mixtures of experts whose gating network is a single- or multi-layer perceptron. It is found that the standard mixture of experts with a feed-forward network as gate often outperforms the other models.
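For reference, a minimal sketch of the localized gating described above, written in the Gaussian-mixture form of Xu et al.; the symbols $\alpha_j$, $\boldsymbol{\mu}_j$, $\boldsymbol{\Sigma}_j$ (mixing weight, mean, and covariance of component $j$) are assumptions of this sketch, and the paper's extension replaces each Gaussian component with a linear latent variable model:

\[
g_j(\mathbf{x}) \;=\; \frac{\alpha_j \,\mathcal{N}(\mathbf{x};\,\boldsymbol{\mu}_j,\boldsymbol{\Sigma}_j)}{\sum_k \alpha_k \,\mathcal{N}(\mathbf{x};\,\boldsymbol{\mu}_k,\boldsymbol{\Sigma}_k)},
\qquad
P(y \mid \mathbf{x}) \;=\; \sum_j g_j(\mathbf{x})\, P_j(y \mid \mathbf{x}),
\]

where $P_j(y \mid \mathbf{x})$ is the output of expert $j$. Because the gate values $g_j(\mathbf{x})$ act as posterior responsibilities, the gate and expert parameters can be fitted with the EM algorithm mentioned in the abstract.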


Published in:
Proceedings of the International Conference on Artificial Neural Networks (ICANN'99), 2, 838-843
Presented at:
Proceedings of the International Conference on Artificial Neural Networks (ICANN'99)
Year:
1999
Publisher:
London: IEE
Note:
(IDIAP-RR 98-14)