Co-training

A semi-supervised learning method for learning from labeled and unlabeled examples drawn from multiple sources. Co-training may be effective because two or more sources (views) of the same examples can complement one another, particularly when a small set of labeled examples is combined with a large pool of unlabeled ones.
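
As an illustration only, the sketch below shows one plausible form of the co-training loop in Python. The synthetic two-view data, the choice of Naive Bayes learners (scikit-learn's GaussianNB), and the one-example-per-round confidence rule are assumptions made for this example, not details taken from the Blum and Mitchell procedure.

<pre>
# Minimal co-training sketch (illustrative only, not the authors' code).
# Two classifiers are trained on two views (feature subsets) of the same
# examples; in each round, each classifier pseudo-labels the unlabeled
# example it is most confident about, and that example is added to the
# shared labeled pool used by both learners.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Synthetic two-class data with 4 features, split into two views.
X = np.vstack([rng.normal(0.0, 1.0, (100, 4)), rng.normal(2.0, 1.0, (100, 4))])
y = np.array([0] * 100 + [1] * 100)
view_a, view_b = X[:, :2], X[:, 2:]

# A few labeled examples from each class; the rest are treated as unlabeled.
labeled = np.concatenate([rng.choice(100, size=5, replace=False),
                          100 + rng.choice(100, size=5, replace=False)])
unlabeled = np.setdiff1d(np.arange(len(y)), labeled)
pool_idx = labeled.copy()
pool_y = y[labeled]

clf_a, clf_b = GaussianNB(), GaussianNB()
for _ in range(20):                                     # co-training rounds
    clf_a.fit(view_a[pool_idx], pool_y)
    clf_b.fit(view_b[pool_idx], pool_y)
    if len(unlabeled) == 0:
        break
    for clf, view in ((clf_a, view_a), (clf_b, view_b)):
        if len(unlabeled) == 0:
            break
        probs = clf.predict_proba(view[unlabeled])
        best = int(np.argmax(probs.max(axis=1)))        # most confident example
        pool_idx = np.append(pool_idx, unlabeled[best])
        pool_y = np.append(pool_y, clf.classes_[probs[best].argmax()])
        unlabeled = np.delete(unlabeled, best)

print("view-A accuracy:", (clf_a.predict(view_a) == y).mean())
print("view-B accuracy:", (clf_b.predict(view_b) == y).mean())
</pre>

The key design point illustrated here is that each learner's confident predictions expand the training pool used by the other, which is how the unlabeled examples contribute to learning.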
This may be related to the testing effect, since the multiple sources of unlabeled information may improve the generation of labels that occurs during learning. Multiple sources may provide an additional learning benefit because they offer a broader basis from which to generate a label (meaning).
A. Blum and T. Mitchell. Combining labeled and unlabeled data with co-training. In Proceedings of the 1998 Conference on Computational Learning Theory (COLT '98), July 1998.
 
[[Category:Glossary]]
[[Category:PSLC General]]
[[Category:Coordinative Learning]]
