Established in 2005 with the support of MŠMT ČR (project 1M0572)

Lectures and Presentations

On Bayesian Mixture Models.

From:
Dec. 20 2005 3:00PM
To:
Dec. 20 2005 3:20PM
Place:
ÚTIA AV ČR
Description:
This work deals with systems modelled as a finite probabilistic mixture. The mixture model is a convex combination of simpler models called components; the coefficients of the convex combination are called component weights.
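The convex combination above can be sketched in a few lines. This is a minimal illustration, not the speaker's model: it assumes Gaussian components with hypothetical parameter values, and only shows how component weights combine component densities into one mixture density.

```python
import math

def normal_pdf(x, mean, std):
    """Density of one Gaussian component (hypothetical component family)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def mixture_pdf(x, weights, params):
    """Finite mixture: a convex combination of component densities.

    The component weights must be non-negative and sum to one,
    so the mixture is itself a valid probability density.
    """
    assert all(w >= 0 for w in weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * normal_pdf(x, m, s) for w, (m, s) in zip(weights, params))

# Two static components with static weights (illustrative values only)
weights = [0.3, 0.7]
params = [(0.0, 1.0), (4.0, 2.0)]  # (mean, std) of each component
print(mixture_pdf(1.0, weights, params))
```

Dynamic components or dynamic weights, as discussed below, would make `params` or `weights` functions of past data rather than constants.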
If the components model the temporal dependency of data samples, we speak about dynamic components; otherwise, we speak about static components. Similarly, if the component weights depend on historical data, they are called dynamic; otherwise, they are called static. Currently, mixtures with dynamic components and static weights are used. Exact Bayesian inference of their parameters is not tractable, so approximations of Bayesian learning have to be used; the quasi-Bayes approximation and a projection-based approximation have been exploited to solve this task. Mixtures with static weights have been successfully used in many application domains; however, for some data sets this approach does not provide an acceptable solution, which indicates that the descriptive power of the model is not sufficient.

The aim of this contribution is to introduce mixtures with dynamic weights and to present a way of using projection-based (PB) estimation for their approximate inference. PB estimation is based on minimization of the Kullback-Leibler divergence between the "correct" posterior density and a posterior density from a given predefined class with good properties; the recommended order of arguments is used. With a suitable choice of the class of posterior densities, the minimization problem can be converted to the simpler problem of evaluating moments of multivariate probability density functions, to which Monte-Carlo techniques or other established methods can be applied. The presentation will focus on the derived inference algorithm, and its behavior will be illustrated on simple examples.
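The reduction of the KL minimization to moment evaluation can be illustrated on a toy case. The sketch below is not the derived algorithm from the talk: it assumes a hypothetical "correct" posterior (a two-component Gaussian mixture) and projects it onto the class of single Gaussians. For this class, minimizing KL(correct || approximation) in the recommended argument order amounts to matching the first two moments, which are here evaluated by Monte-Carlo sampling.

```python
import math
import random

random.seed(0)

# Hypothetical "correct" posterior: a two-component Gaussian mixture.
weights = [0.4, 0.6]
means = [-1.0, 2.0]
stds = [0.5, 1.0]

def sample_mixture(n):
    """Monte-Carlo samples from the mixture posterior."""
    out = []
    for _ in range(n):
        k = 0 if random.random() < weights[0] else 1
        out.append(random.gauss(means[k], stds[k]))
    return out

# Projection step: argmin over single Gaussians q of KL(p || q)
# reduces to evaluating the first two moments of p.
samples = sample_mixture(100_000)
m1 = sum(samples) / len(samples)
m2 = sum(x * x for x in samples) / len(samples)
proj_mean = m1
proj_std = math.sqrt(m2 - m1 ** 2)

# Exact moments of the mixture, for comparison with the Monte-Carlo estimate.
exact_mean = sum(w * m for w, m in zip(weights, means))
exact_var = (sum(w * (s ** 2 + m ** 2) for w, m, s in zip(weights, means, stds))
             - exact_mean ** 2)
print(proj_mean, proj_std, exact_mean, math.sqrt(exact_var))
```

In the talk's setting the posterior is multivariate and the class of approximating densities is richer, but the structure is the same: the projection is determined entirely by moments of the "correct" posterior, so any method that evaluates those moments yields the approximation.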
Copyright 2005 DAR