Co-EM support vector learning


Association for Computing Machinery — Jul 4, 2004



Datasource
Association for Computing Machinery
Copyright
Copyright © 2004 by ACM Inc.
ISBN
1-58113-838-5
DOI
10.1145/1015330.1015350

Abstract

Co-EM Support Vector Learning. Ulf Brefeld ([email protected]), Tobias Scheffer ([email protected]). Humboldt-Universität zu Berlin, Department of Computer Science, Unter den Linden 6, 10099 Berlin, Germany.

Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multi-view learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.

[…]ers based on independent attribute subsets. These classifiers then provide each other with labels for the unlabeled data. The co-EM algorithm (Nigam & Ghani, 2000) combines multi-view learning with the probabilistic EM approach. This, however, requires the learning algorithm to process probabilistically labeled training data and […]
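The co-EM scheme the abstract describes can be sketched as follows. This is a minimal illustration only: the toy weighted nearest-centroid learner below is an assumed stand-in for the paper's probabilistic SVM (the paper's actual contribution), but the loop structure is the co-EM idea — in each iteration, the learner in one view is trained on the labeled data plus the soft labels produced by its peer, then emits probabilistic labels for the unlabeled data that the peer trains on next.

```python
import numpy as np

def fit_centroids(X, p):
    """Weighted class centroids; p[:, c] are soft class-c weights."""
    return np.stack([(p[:, c:c + 1] * X).sum(0) / p[:, c].sum() for c in range(2)])

def predict_proba(X, centroids):
    """Softmax over negative squared distances -> class probabilities."""
    d = np.stack([((X - c) ** 2).sum(1) for c in centroids], axis=1)
    e = np.exp(-(d - d.min(axis=1, keepdims=True)))  # stabilized exponentials
    return e / e.sum(axis=1, keepdims=True)

def co_em(views_l, y, views_u, iters=10):
    """Two-view co-EM sketch (toy learner standing in for the SVM).
    views_l / views_u: [X1, X2] labeled / unlabeled feature matrices per view."""
    p_l = np.eye(2)[y]          # hard labels as one-hot probabilities
    p_u = None                  # peer-provided soft labels, none yet
    cents = [None, None]
    for _ in range(iters):
        for v in range(2):
            if p_u is None:     # very first step: labeled data only
                cents[v] = fit_centroids(views_l[v], p_l)
            else:               # train on labeled + peer-labeled unlabeled data
                X = np.vstack([views_l[v], views_u[v]])
                cents[v] = fit_centroids(X, np.vstack([p_l, p_u]))
            p_u = predict_proba(views_u[v], cents[v])  # soft labels for the peer
    # combine the two views by averaging their class probabilities
    return 0.5 * (predict_proba(views_u[0], cents[0])
                  + predict_proba(views_u[1], cents[1]))
```

On well-separated synthetic data with only a handful of labeled examples, the averaged two-view probabilities recover the cluster labels of the unlabeled points; in the paper, the learner in each view is instead an SVM whose outputs are mapped to class probabilities.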
