Conference Paper (international conference)

**Published in: **NCTA2014 - International Conference on Neural Computation Theory and Applications, pp. 65-75

**Conference: **6th International Conference on Neural Computation Theory and Applications
(Rome, IT, 22.10.2014-24.10.2014)

**Grants: **GA14-02652S, GA ČR;
GAP403/12/1557, GA ČR

**Keywords: **Probabilistic Neural Networks,
Product Mixtures,
Mixtures of Dependence Trees,
EM Algorithm

**Fulltext: **http://library.utia.cas.cz/separaty/2014/RO/grim-0434119.pdf

**Abstract (eng): **We compare two probabilistic approaches to neural networks: the first based on mixtures of product components, the second on mixtures of dependence-tree distributions. Product mixture models can be estimated efficiently from data by means of the EM algorithm and have some practically important properties. However, in some cases the simplicity of product components may prove too restrictive, and a natural idea is to use a more complex mixture of dependence-tree distributions. The concept of a dependence tree makes it possible to describe explicitly the statistical relationships between pairs of variables at the level of individual components, so the approximation power of the resulting mixture may increase essentially. Nonetheless, in an application to the classification of numerals we found that both models perform comparably and that the contribution of the dependence-tree structures decreases in the course of the EM iterations. Thus the optimal estimate of the dependence-tree mixture tends to converge to a simple product mixture model.
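To illustrate the first of the two models discussed in the abstract, the sketch below implements EM estimation of a mixture of Bernoulli product components for binary data. This is an illustrative reconstruction, not the authors' code: the function name, the smoothing constants in the M-step, and the random initialization are all assumptions made for the example.

```python
import numpy as np

def em_product_mixture(X, n_components, n_iter=50, seed=0):
    """EM for a mixture of Bernoulli product components (illustrative sketch).

    X: (N, D) binary data matrix.
    Returns mixture weights w (K,) and parameters p (K, D), where component k
    has the product distribution prod_d p[k,d]^x_d * (1 - p[k,d])^(1 - x_d).
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    K = n_components
    w = np.full(K, 1.0 / K)                    # uniform initial mixture weights
    p = rng.uniform(0.25, 0.75, size=(K, D))   # random initial Bernoulli parameters

    for _ in range(n_iter):
        # E-step: component responsibilities, computed in the log domain
        log_lik = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T   # (N, K)
        log_post = np.log(w) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)         # stabilize exp
        q = np.exp(log_post)
        q /= q.sum(axis=1, keepdims=True)                       # (N, K)

        # M-step: closed-form updates of weights and component parameters
        Nk = q.sum(axis=0)                                      # effective counts
        w = Nk / N
        # small smoothing terms (assumed here) keep p strictly inside (0, 1)
        p = (q.T @ X + 1e-3) / (Nk[:, None] + 2e-3)

    return w, p
```

The dependence-tree variant discussed in the abstract would replace the product component in the E-step with a tree-structured distribution and re-optimize each tree (e.g. by a maximum-weight spanning tree over pairwise mutual information) in the M-step.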

**: **IN