Co-training

From LearnLab
[[Category:Glossary]]

[[Category:PSLC General]]

[[Category:Learning Processes]]

[[Category:Coordinative Learning]]

Latest revision as of 16:40, 20 August 2007

A semi-supervised learning method (sometimes described as self-supervised) for learning from multiple sources. Co-training may be effective because two or more sources of information about the same examples can improve learning, particularly when a small set of labeled examples is combined with many unlabeled examples.

The learning benefits of co-training may be related to the testing effect: the multiple sources of unlabeled information may improve the generation of labels that occurs during learning. Multiple sources may provide additional learning because they offer a broader basis from which to generate a label (a meaning).
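The procedure described above can be made concrete with a minimal sketch of the Blum and Mitchell co-training loop. The names here (`co_train`, `centroid_fit`) and the toy nearest-centroid "classifier" are illustrative assumptions, not the paper's implementation: each view trains a weak classifier on the current labeled pool, then confidently pseudo-labels a few unlabeled examples, growing the pool the other view trains on next.

```python
def centroid_fit(xs, ys):
    # Per-class mean of a one-dimensional feature: a deliberately tiny
    # stand-in for the per-view classifier (the original paper used
    # naive Bayes over two textual views of web pages).
    means = {}
    for c in set(ys):
        vals = [x for x, y in zip(xs, ys) if y == c]
        means[c] = sum(vals) / len(vals)
    return means

def centroid_predict(means, x):
    # Nearest centroid wins; the margin between the two nearest
    # centroids serves as a confidence score.
    ranked = sorted((abs(x - m), c) for c, m in means.items())
    confidence = ranked[1][0] - ranked[0][0] if len(ranked) > 1 else 0.0
    return ranked[0][1], confidence

def co_train(view1, view2, seed_labels, rounds=3, per_round=2):
    # seed_labels: {index: label} for the small labeled seed set;
    # every other index is treated as unlabeled.
    labeled = dict(seed_labels)
    unlabeled = set(range(len(view1))) - set(labeled)
    for _ in range(rounds):
        for view in (view1, view2):
            idx = sorted(labeled)
            model = centroid_fit([view[i] for i in idx],
                                 [labeled[i] for i in idx])
            # This view pseudo-labels its most confident unlabeled
            # examples, adding them to the shared labeled pool.
            scored = sorted(
                ((centroid_predict(model, view[i]), i) for i in unlabeled),
                key=lambda t: -t[0][1],
            )
            for (pred, _conf), i in scored[:per_round]:
                labeled[i] = pred
                unlabeled.discard(i)
    return labeled
```

With two synthetic views in which the classes are well separated (e.g. class 0 near 0.0 and class 1 near 10.0 in view 1) and only one labeled seed example per class, the loop recovers labels for the remaining examples.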

A. Blum and T. Mitchell. Combining labeled and unlabeled data with co-training. In Proceedings of the 1998 Conference on Computational Learning Theory, July 1998.