000175482 001__ 175482
000175482 005__ 20180913061200.0
000175482 037__ $$aREP_WORK
000175482 245__ $$aInput-dependent Regularization of Conditional Density Models
000175482 269__ $$a2000
000175482 260__ $$c2000
000175482 336__ $$aReports
000175482 520__ $$aWe emphasize the need for input-dependent regularization in the context of conditional density models (also: discriminative models) like Gaussian process predictors. This can be achieved by a simple modification of the standard Bayesian data generation model underlying these techniques. Specifically, we allow the latent target function to be a priori dependent on the distribution of the input points. While the standard generation model results in robust predictors, data with missing labels is ignored, which can be wasteful if relevant prior knowledge is available. We show that discriminative models like Fisher kernel discriminants and Co-Training classifiers can be regarded as (approximate) Bayesian inference techniques under the modified generation model, and that the template Co-Training algorithm is related to a variant of the well-known Expectation-Maximization (EM) technique. We propose a template EM algorithm for the modified generation model which can be regarded as a generalization of Co-Training.
000175482 700__ $$0244691$$aSeeger, Matthias$$g208475
000175482 8564_ $$s206312$$uhttps://infoscience.epfl.ch/record/175482/files/icml-paper.pdf$$yn/a$$zn/a
000175482 909C0 $$0252343$$pLAPMAL$$xU12368
000175482 909CO $$ooai:infoscience.tind.io:175482$$preport
000175482 917Z8 $$x208475
000175482 937__ $$aEPFL-REPORT-175482
000175482 973__ $$aOTHER$$sPUBLISHED
000175482 980__ $$aREPORT