rvm<P,K,D>

Description

rvm<P,K,D> implements the Relevance Vector Machine [1, 2] using a fast training algorithm [3] based on an analysis of the marginal likelihood [4].
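
In outline, the analysis in [4] shows that the log marginal likelihood splits into a part that depends on a single hyperparameter alpha_i and a part that does not. Writing t for the target vector, phi_i for the i-th column of the kernel (design) matrix, and C_{-i} for the covariance of the marginal likelihood with basis function i removed, the alpha_i-dependent term and its ingredients are

    \ell(\alpha_i) = \tfrac{1}{2} \left[ \log \alpha_i - \log( \alpha_i + s_i )
                       + \frac{q_i^2}{\alpha_i + s_i} \right],
    \qquad
    s_i = \phi_i^T C_{-i}^{-1} \phi_i, \qquad
    q_i = \phi_i^T C_{-i}^{-1} t,

and this term attains its unique maximum at

    \alpha_i = \begin{cases}
        \dfrac{s_i^2}{q_i^2 - s_i} & \text{if } q_i^2 > s_i, \\
        \infty & \text{otherwise, i.e. basis function } i \text{ is pruned}.
    \end{cases}

The fast algorithm of [3] repeatedly applies this result to add, re-estimate, or delete a single basis function per iteration, so the model stays sparse throughout training.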

Example
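
The sketch below does not exercise the rvm<P,K,D> interface itself; it is a self-contained illustration of the per-basis-function decision at the heart of the fast algorithm [3]: given the sparsity factor s_i and quality factor q_i of a basis function (see Description), the hyperparameter alpha_i is either re-estimated to a finite optimum or driven to infinity, which prunes the basis function from the model.

    // Illustration of the alpha update from [3]; not part of the KML interface.
    #include <iostream>
    #include <limits>

    // Return the alpha_i that maximises the marginal likelihood, given the
    // sparsity factor s and quality factor q of basis function i (both
    // computed with that basis function excluded from the model). An
    // infinite result means the basis function is pruned.
    double optimal_alpha( double s, double q ) {
        if ( q * q > s )
            return ( s * s ) / ( q * q - s );            // relevant: finite optimum
        return std::numeric_limits<double>::infinity();  // irrelevant: prune
    }

    int main() {
        std::cout << optimal_alpha( 2.0, 3.0 ) << std::endl;   // prints 0.571429 (= 4/7)
        std::cout << optimal_alpha( 2.0, 1.0 ) << std::endl;   // prints inf
        return 0;
    }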

Definition

Defined in the KML header <kml/rvm.hpp>.

Template Parameters

Model of

Type requirements

Public Base Classes

Members

Notes

See also

References

[1]

Michael Tipping. The relevance vector machine. In Sara Solla, Todd Leen, and Klaus-Robert Müller, editors, Advances in Neural Information Processing Systems (NIPS’99), volume 12, pages 652–658, Cambridge, Massachusetts, USA, 2000. The MIT Press. ISBN 0-262-19450-3.
ftp://ftp.research.microsoft.com/users/tipping/rvm_nips.ps.gz

[2]

Michael Tipping. Sparse Bayesian learning and the relevance vector machine. Journal of Machine Learning Research, 1:211–244, 2001. ISSN 1533-7928.
http://www.ai.mit.edu/projects/jmlr/papers/volume1/tipping01a/tipping01a.pdf

[3]

Michael Tipping and Anita Faul. Fast marginal likelihood maximisation for sparse Bayesian models. In Christopher Bishop and Brendan Frey, editors, Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, Key West, Florida, USA, January 3–6, 2003. ISBN 0-9727358-0-1.
ftp://ftp.research.microsoft.com/users/mtipping/fastsbl.ps.gz

[4]

Anita Faul and Michael Tipping. Analysis of sparse Bayesian learning. In Thomas Dietterich, Suzanna Becker, and Zoubin Ghahramani, editors, Advances in Neural Information Processing Systems (NIPS’01), volume 14, pages 383–389, Cambridge, Massachusetts, USA, 2002. The MIT Press. ISBN 0-262-04208-8.
ftp://ftp.research.microsoft.com/users/mtipping/sbtheory.ps.gz