Probabilistic Classification Vector Machines


The Problem

In Bayesian machine learning, the usual strategy is to place a Gaussian prior on the model parameters. However, this does not mean that a Gaussian prior is appropriate for every problem. We investigate this issue theoretically and empirically and confirm that the Gaussian prior is not appropriate for classification problems. We therefore propose Probabilistic Classification Vector Machines (PCVMs), which address this shortcoming of the Gaussian prior by using a truncated Gaussian prior for each class of points.
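To illustrate the prior (a minimal sketch in NumPy rather than the toolbox's Matlab, with the scale parameter chosen arbitrarily): a zero-mean Gaussian truncated to the non-negative half-line is a half-normal, so sampling a PCVM-style prior weight per basis function amounts to taking the absolute value of a Gaussian draw and giving it the sign of the corresponding class label. This constrains each basis function to contribute toward its own class.

```python
import numpy as np

def sample_pcvm_prior(labels, scale=1.0, rng=None):
    """Draw one prior weight per basis function under a
    class-wise truncated Gaussian prior (illustrative sketch).

    A N(0, scale^2) prior truncated to [0, inf) is a half-normal,
    i.e. |N(0, scale^2)|; the sign is flipped for the -1 class so
    weights of class +1 are non-negative and those of class -1
    are non-positive.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    labels = np.asarray(labels, dtype=float)
    return np.sign(labels) * np.abs(rng.normal(0.0, scale, size=labels.shape))
```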

 

Matlab toolbox

This Matlab toolbox implements the expectation-maximization (EM) based PCVM algorithm. The algorithm can also optimize the kernel parameters jointly with model training.
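The flavor of the EM iteration can be sketched as follows. This is a loose, simplified illustration in NumPy, not the toolbox's Matlab code: it uses a standard probit-EM expectation for the latent variable, a ridge-style M-step with per-weight precisions, and a simple pruning rule to mimic the class-wise sign constraint; all parameter choices (RBF kernel, precision floor, iteration count) are assumptions made for the sketch.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(X, Z, theta=1.0):
    # Gaussian RBF basis: K[i, j] = exp(-theta * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-theta * d2)

def pcvm_em_sketch(X, y, theta=1.0, n_iter=30):
    """Simplified EM loop for a probit kernel classifier with a
    sign-constrained, sparsity-inducing prior (illustrative only)."""
    n = X.shape[0]
    Phi = rbf_kernel(X, X, theta)
    w, b = np.zeros(n), 0.0
    alpha = np.ones(n)                 # per-weight prior precisions
    for _ in range(n_iter):
        # E-step: posterior mean of the truncated latent variable h
        # under the probit model (inverse-Mills-ratio correction).
        z = Phi @ w + b
        h = z + y * norm.pdf(z) / np.clip(norm.cdf(y * z), 1e-12, None)
        # M-step: ridge-like weight and bias update.
        w = np.linalg.solve(Phi.T @ Phi + np.diag(alpha), Phi.T @ (h - b))
        b = float(np.mean(h - Phi @ w))
        # Class-wise sign constraint: prune weights whose sign
        # disagrees with their label (stands in for the truncation).
        w = np.where(w * y >= 0, w, 0.0)
        # ARD-style precision update drives unused weights to zero.
        alpha = 1.0 / np.maximum(w ** 2, 1e-4)
    return w, b

def pcvm_predict(Xtr, w, b, Xte, theta=1.0):
    return np.sign(rbf_kernel(Xte, Xtr, theta) @ w + b)
```

In the toolbox itself the kernel parameter (here the fixed `theta`) can additionally be optimized during training.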


Code available for download

The Matlab toolbox is released under an open-source license and is available at the following link:

Download the Matlab Package

 

References


- Huanhuan Chen, Peter Tino and Xin Yao. Probabilistic Classification Vector Machines. IEEE Transactions on Neural Networks, vol. 20, no. 6, pp. 901-914, June 2009.


- Huanhuan Chen, Peter Tino and Xin Yao. Efficient Probabilistic Classification Vector Machine with Incremental Basis Function Selection. IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 2, pp. 356-369, February 2014.


All Matlab codes on this page are published under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
