The Journal of China Universities of Posts and Telecommunications ›› 2014, Vol. 21 ›› Issue (2): 98-103. doi: 10.1016/S1005-8885(14)60292-2

• Others •

Variational learning for finite Beta-Liouville mixture models

LAI Yu-ping1, ZHOU Ya-jian2, PING Yuan3, GUO Yu-cui, YANG Yi-xian1

  • Corresponding author: LAI Yu-ping, E-mail: Laiyp2009@126.com


  1. Information Security Center, Beijing University of Posts and Telecommunications, Beijing 100876, China; 2. Department of Computer Science and Technology, Xuchang University, Xuchang 461000, China; 3. School of Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received:2013-12-11 Revised:2014-03-17 Online:2014-04-30 Published:2014-04-30
  • Supported by:

    the National Natural Science Foundation of China (61303232, 61363085, 61121061, 60972077), and the Hi-Tech Research and Development Program of China (2009AA01Z430).


Abstract:

In this article, an improved variational inference (VI) framework for learning finite Beta-Liouville mixture models (BLM) is proposed for proportional data classification and clustering. Within the VI framework, non-linear approximation techniques are adopted to obtain approximated variational objective functions, and analytical solutions are derived for the variational posterior distributions. Compared with the expectation maximization (EM) algorithm commonly used for learning mixture models, the proposed method prevents both underfitting and overfitting. Furthermore, the parameters and the complexity of the mixture model (the model order) are estimated simultaneously. Experiments on both synthetic and real-world data sets demonstrate the feasibility and advantages of the proposed method.
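
As background for the abstract above, the LaTeX block below sketches the standard mean-field decomposition on which a variational framework of this kind rests. The notation is generic (X for the observed data, Θ for all latent variables and parameters, q for the factorized variational posterior) and is not the paper's exact derivation for the Beta-Liouville mixture, whose bound additionally involves the non-linear approximations mentioned above.

\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Generic mean-field variational inference; a sketch, not the paper's
% exact bound for the Beta-Liouville mixture. The log evidence splits
% into a tractable lower bound and a KL divergence:
\begin{align}
  \ln p(\mathbf{X}) &= \mathcal{L}(q)
    + \operatorname{KL}\!\big(q(\Theta)\,\big\|\,p(\Theta \mid \mathbf{X})\big),\\
  \mathcal{L}(q) &= \mathbb{E}_{q}\!\big[\ln p(\mathbf{X},\Theta)\big]
    - \mathbb{E}_{q}\!\big[\ln q(\Theta)\big].
\end{align}
% Under the factorized approximation $q(\Theta)=\prod_i q_i(\theta_i)$,
% maximizing $\mathcal{L}(q)$ one factor at a time yields the closed-form
% coordinate update (expectation over all factors except $q_i$):
\begin{equation}
  \ln q_i^{*}(\theta_i)
    = \mathbb{E}_{j\neq i}\big[\ln p(\mathbf{X},\Theta)\big] + \mathrm{const}.
\end{equation}
\end{document}

In the paper's setting, the analytical solutions reported for the variational posteriors plausibly arise from coordinate updates of this form, with the non-linear approximation techniques making the required expectations tractable for the Beta-Liouville likelihood.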

Key words:

variational inference, model selection, factorized approximation, Beta-Liouville distribution, mixture modeling