<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://learnlab.org/mediawiki-1.44.2/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=130.49.138.225</id>
	<title>Theory Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://learnlab.org/mediawiki-1.44.2/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=130.49.138.225"/>
	<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Special:Contributions/130.49.138.225"/>
	<updated>2026-04-29T20:12:18Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.44.2</generator>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8802</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8802"/>
		<updated>2009-01-22T21:19:41Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Dependent Variables */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of adding prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying conditions that underlie successful collaboration. For example, we know that factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: the structured collaborative environments give the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students&#039; learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on rotational kinematics, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== Annotated Bibliography ====&lt;br /&gt;
* Presentation to the PSLC Advisory Board, January, 2009&lt;br /&gt;
* Poster to be presented at the Second Annual Inter-Science of Learning Center Student and Post-Doc Conference (iSLC &#039;09) in Seattle, WA, February 2009&lt;br /&gt;
* To be submitted as a paper to CogSci 2009&lt;br /&gt;
&lt;br /&gt;
==== References ====&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
====  Future plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8801</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8801"/>
		<updated>2009-01-22T21:19:16Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Annotated Bibliography */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of adding prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying conditions that underlie successful collaboration. For example, we know that factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: the structured collaborative environments give the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students&#039; learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on rotational kinematics, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== Annotated Bibliography ====&lt;br /&gt;
* Presentation to the PSLC Advisory Board, January, 2009&lt;br /&gt;
* Poster to be presented at the Second Annual Inter-Science of Learning Center Student and Post-Doc Conference (iSLC &#039;09) in Seattle, WA, February 2009&lt;br /&gt;
* To be submitted as a paper to CogSci 2009&lt;br /&gt;
&lt;br /&gt;
==== References ====&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J., &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum.&lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
====  Future plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8800</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8800"/>
		<updated>2009-01-22T21:18:54Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Further Information */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor while the other plays the tutee, and then they switch) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was experimental condition, with three levels: Compare, Non-Compare, and Problem-Solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control in which students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This exploratory hypothesis will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== Annotated Bibliography ====&lt;br /&gt;
* Presentation to the PSLC Advisory Board, January 2008&lt;br /&gt;
* Poster to be presented at the Second Annual Inter-Science of Learning Center Student and Post-Doc Conference (iSLC &#039;09) in Seattle, WA.&lt;br /&gt;
* To be submitted as a paper to CogSci 2009&lt;br /&gt;
&lt;br /&gt;
==== References ====&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The scientist InVivo: How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, &amp;amp; P. Thagard (Eds.), Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov (Eds.), Analogy: Perspectives from cognitive science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J., &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum.&lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
====  Future plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8799</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8799"/>
		<updated>2009-01-22T21:13:47Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor while the other plays the tutee, and then they switch) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was experimental condition, with three levels: Compare, Non-Compare, and Problem-Solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control in which students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This exploratory hypothesis will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
* Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. In Proceedings of the Conference on Computer Supported Collaborative Learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
====  Future plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8798</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8798"/>
		<updated>2009-01-22T21:12:41Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Chi, Hausmann, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;:&lt;br /&gt;
Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor, the other of tutee, and then they switch) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most prior work has focused on the outcome or product of the group rather than on the underlying processes that give rise to it. Uncovering the cognitive processes underlying collaborative learning would further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab by Nokes and VanLehn (2008), students learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or analogically comparing worked examples. Preliminary results showed that the self-explanation and analogical comparison groups outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare the two solution procedures part by part, noting similarities and differences (e.g., in goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts but focused on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions about the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: This condition served as a control: students collaborated to solve problems without any scaffolding. They received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Because this exam occurred a week after the training, which lasted just under two hours, performance on this problem is treated as a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students completed their regular homework problems in Andes at their own pace; most finished them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. In Proceedings of the Conference on Computer Supported Collaborative Learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
====  Future plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8797</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8797"/>
		<updated>2009-01-22T21:12:09Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Future Plans */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Chi, Hausmann, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;:&lt;br /&gt;
Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor, the other of tutee, and then they switch) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most prior work has focused on the outcome or product of the group rather than on the underlying processes that give rise to it. Uncovering the cognitive processes underlying collaborative learning would further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab by Nokes and VanLehn (2008), students learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or analogically comparing worked examples. Preliminary results showed that the self-explanation and analogical comparison groups outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and the transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare the two solution procedures part by part, noting similarities and differences (e.g., in goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts but focused on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions about the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: This condition served as a control: students collaborated to solve problems without any scaffolding. They received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Because this exam occurred a week after the training, which lasted just under two hours, performance on this problem is treated as a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students completed their regular homework problems in Andes at their own pace; most finished them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences, 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov (Eds.), Analogy: Perspectives from Cognitive Science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
==== Future Plans ====&lt;br /&gt;
Our future plans for January 2009 - August 2009:&lt;br /&gt;
* Code collaborative transcripts for different learning processes&lt;br /&gt;
* Conduct laboratory study&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8796</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8796"/>
		<updated>2009-01-22T21:09:39Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Connections */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition. Students’ learning processes will be analyzed by examining their verbal protocols, and learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction all affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments give the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the underlying concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or analogically comparing worked examples. Preliminary results showed that the self-explanation and analogical-comparison groups outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems, but it has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. Participants then performed the analogical comparison task: they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions guided them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]] (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures; specifically, the post-test included five kinds of questions. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: One problem on the students’ regular mid-term exam was similar to the training. Because this exam occurred a week after the training session, which lasted just under two hours, performance on this problem is treated as a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students completed their regular homework problems using Andes. They could do these whenever they wished, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences, 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov (Eds.), Analogy: Perspectives from Cognitive Science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn)]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8795</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8795"/>
		<updated>2009-01-22T21:09:18Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Connections */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition. Students’ learning processes will be analyzed by examining their verbal protocols, and learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction all affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments give the learner more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the underlying concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or analogically comparing worked examples. Preliminary results showed that the self-explanation and analogical-comparison groups outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems, but it has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. Participants then performed the analogical comparison task: they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions guided them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: This condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students&#039; learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: One problem on the students&#039; regular mid-term exam was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
This project shares features with the following research projects:&lt;br /&gt;
&lt;br /&gt;
* [[Bridging_Principles_and_Examples_through_Analogy_and_Explanation|Bridging Principles and Examples through Analogy and Explanation]]&lt;br /&gt;
* [[Craig observing | Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi)]]&lt;br /&gt;
&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8794</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8794"/>
		<updated>2009-01-22T19:53:35Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Chi, Hausmann, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments give learners more opportunities to construct the relevant knowledge components.&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is an organization of the knowledge associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab, Nokes and VanLehn (2008) had students learn to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained or engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two examples (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example, isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: This condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students&#039; learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: One problem on the students&#039; regular mid-term exam was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8793</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8793"/>
		<updated>2009-01-22T19:53:22Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Chi, Hausmann, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;: Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction are known to affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: structured collaborative environments give learners more opportunities to construct the relevant knowledge components.&lt;br /&gt;
&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;: A problem schema is an organization of the knowledge associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab, Nokes and VanLehn (2008) had students learn to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained or engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not have explanations for the solution steps and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told that their task was to explicitly compare each part of the solution procedure to one another noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions to guide them through this process were provided. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to act as prompts for explaining the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: Participants in the problem-solving condition served as a control group and collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8792</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8792"/>
		<updated>2009-01-22T19:52:42Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Past research on collaborative learning provides compelling evidence that when students learn in groups of two or more, they show better learning gains at the group level than when working alone. Much of this research has focused on identifying conditions that underlie successful collaboration. For example, we know that factors such as the presence of cognitive conflict (Schwarz, Neuman, &amp;amp; Biezuner, 2000), the establishment of common ground (Clark, 2000), and the scaffolding (or structuring) of the interaction affect collaborative learning. Providing scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then they switch roles) has also been shown to facilitate collaborative learning compared to unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). These results are typically explained in terms of sense-making processes: the structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components.&lt;br /&gt;
&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less has been concerned with the underlying processes that give rise to the product. If we can uncover the cognitive processes underlying collaborative learning, it can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately. &lt;br /&gt;
In an ongoing project in the Physics LearnLab (Nokes &amp;amp; VanLehn, 2008), students learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not have explanations for the solution steps and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told that their task was to explicitly compare each part of the solution procedure to one another noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions to guide them through this process were provided. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to act as prompts for explaining the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: Participants in the problem-solving condition served as a control group and collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8791</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8791"/>
		<updated>2009-01-22T19:37:04Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Abstract */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage [[analogical comparison]], prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]].&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on [[collaboration| collaborative learning]] has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those who shared the same misconception or held no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that providing scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then they switch roles) facilitates collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the [[sense making]] processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior research has focused on the outcome or product of the group, with less attention to the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: read [[worked examples]], [[self-explanation|self-explain]] [[worked examples]], or engage in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare the two solution procedures part by part, noting the similarities and differences between them (e.g., in goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The scientist in vivo: How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, &amp;amp; P. Thagard (Eds.), Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov (Eds.), Analogy: Perspectives from cognitive science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8790</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8790"/>
		<updated>2009-01-22T19:29:11Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
 &#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison to prompts that focus on single examples (non-comparison) and to a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on [[collaboration| collaborative learning]] has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding distinct misconceptions collaborated, they were more likely to learn than students who shared the same misconception or held no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other plays tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the [[sense making]] processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior research has focused on the outcome or product of the group, with less attention to the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: read [[worked examples]], [[self-explanation|self-explain]] [[worked examples]], or engage in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare the two solution procedures part by part, noting the similarities and differences between them (e.g., in goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. In Proceedings of the Conference on Computer Supported Collaborative Learning.&lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum.&lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8789</id>
		<title>Physics</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8789"/>
		<updated>2009-01-22T19:27:55Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Physics LearnLab Course =&lt;br /&gt;
&lt;br /&gt;
The Physics LearnLab Course (PLLC) is a research facility for studying how students learn introductory physics.  It provides baseline data on student activities throughout the physics course, and it hosts specific research studies that measure the improvement in students’ learning caused by changes in the instruction.  At this time, it is sited in the two-semester Introductory Physics courses at the US Naval Academy in Annapolis, MD and three courses at Watchung Hills Regional High School in Warren, NJ.&lt;br /&gt;
&lt;br /&gt;
In order to increase the number of LearnLab sites, it is &lt;br /&gt;
essential that we increase the number of students using Andes.&lt;br /&gt;
During 2009, we plan to create a completely new web-based user interface.  This will allow us to integrate Andes into [http://www.webassign.net WebAssign], the leading commercial provider of physics online homework, making Andes easily accessible to over a hundred thousand students.&lt;br /&gt;
&lt;br /&gt;
Students in PLLC classes use the [http://www.andestutor.org Andes] intelligent tutoring system to do their homework.  [http://www.andestutor.org Andes] allows the PLLC to collect fine-grained data on student activity through the entire semester.  The remainder of the course is taught the usual way, with lectures, labs, and a commercial paper-based textbook.  &#039;&#039;In vivo&#039;&#039; experiments take place either by modifying Andes or by running studies during lab sessions that instructors have “donated” to the PLLC.  &lt;br /&gt;
&lt;br /&gt;
== Studies Conducted ==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: center;&amp;quot;&lt;br /&gt;
|+ &#039;&#039;&#039;Summary of Studies&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
!&lt;br /&gt;
! colspan=2 | &#039;&#039;In Vivo&#039;&#039; &lt;br /&gt;
! colspan=2 | Pull Out &lt;br /&gt;
! colspan=2 | Lab &lt;br /&gt;
! colspan=4 | Capacity&lt;br /&gt;
|-&lt;br /&gt;
! Course || Run || Planned || Run || Planned || Run || Planned&lt;br /&gt;
! Total # Sections&lt;br /&gt;
! Total # Students&lt;br /&gt;
! Max # Studies / Year&lt;br /&gt;
! Max # Students / Study&lt;br /&gt;
|-&lt;br /&gt;
| Physics || 10 || 2 || 0 || 0 || 3 || 1 || 5 || 130 || 4 || 65&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Capacity was determined by counting the number of students who solved more than 40 [[Andes]] problems in Fall 2007.  There are about 25 students in a section and each LearnLab site has about 65 students.&lt;br /&gt;
&lt;br /&gt;
Completed studies:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg_Examples-as-Help | Scaffolding Problem Solving with Embedded Example to Promote Deep Learning (Ringenberg &amp;amp; VanLehn, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Diss|The effects of elaborative dialog on problem solving and learning (Hausmann &amp;amp; Chi, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Post-practice reflection (Katz)|Post-practice reflection (Katz &amp;amp; Connelly, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_questions|Deep-level questions during example studying (Craig &amp;amp; Chi, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study|Does it matter who generates the explanations? (Hausmann &amp;amp; VanLehn, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_observing|Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Reflective Dialogues (Katz)|Reflective Dialogues (Katz, Connelly &amp;amp; Treacy, 2006-2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study2|The effects of interaction on robust learning (Hausmann &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Bridging_Principles_and_Examples_through_Analogy_and_Explanation | Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Extending Reflective Dialogue Support (Katz &amp;amp; Connelly)|Extending Automated Dialogue Support for Robust Learning of Physics (Katz &amp;amp; Connelly, 2007-2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Plateau_study|The Interaction Plateau: A comparison between human tutoring, Andes, and computer-aided instruction (Hausmann, van de Sande, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Self-explanation: Meta-cognitive vs. justification prompts|Self-explanation: Meta-cognitive vs. justification prompts (Hausmann, van de Sande, Gershman, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Analogical Scaffolding in Collaborative Learning|Analogical Scaffolding in Collaborative Learning (Gadgil &amp;amp; Nokes, 2008-2009)]]&lt;br /&gt;
&lt;br /&gt;
In progress or planned:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg Ill-Defined Physics|Does Solving Ill-Defined Physics Problems Elicit More Learning than Conventional Problem Solving? (Ringenberg &amp;amp; VanLehn)]]&lt;br /&gt;
&lt;br /&gt;
*Comparing two homework systems, Sophie Gershman, 2008-2009.&lt;br /&gt;
&lt;br /&gt;
*Analogical Scaffolding in Collaborative Learning, lab study (Gadgil &amp;amp; Nokes, 2008-2009)&lt;br /&gt;
&lt;br /&gt;
==Achievements==&lt;br /&gt;
&lt;br /&gt;
From its inception in January 2005 to the present, we have achieved the following: &lt;br /&gt;
&lt;br /&gt;
===Content development milestones===&lt;br /&gt;
* The proportion of homework problems that instructors at the Naval Academy assign in Andes has increased from 58% to 100% in the Fall semester, and from 42% to 75% in the Spring semester.&lt;br /&gt;
* We have increased the total number of  working Andes problems from 350 to 556.&lt;br /&gt;
* The number of physics principles has increased from 126 to 219.  The number of  rules in the physics “Knowledge Base” (the AI system) has increased from 619 to 915.  The number of scalar quantities defined in Andes has increased from 85 to 126.  &lt;br /&gt;
* We shot videos of problems being solved—at least one per problem set—and revised many of the older videos.  These act as worked examples.  Students who view the videos in a problem set before solving any problems have a much easier time of it. &lt;br /&gt;
&lt;br /&gt;
===Enabling Technologies===&lt;br /&gt;
* We developed a way to run Andes under [http://www.cmu.edu/oli/ OLI].  In particular, we found ways to get Andes and OLI to communicate through the USNA firewall, to upload log data and solution files, and to recover gracefully from most crashes.&lt;br /&gt;
* We developed a method to control the data that the OLI gradebook exports to spreadsheets so that only the data that instructors wanted was exported in a format they specified.&lt;br /&gt;
* We implemented “gating,” a method that forces students to solve Andes problems in a pre-determined order.  This was needed for the Sandy Katz experiment in Fall 2006.&lt;br /&gt;
* Andes raw logs can now be converted to the [http://learnlab.web.cmu.edu/ DataShop] format at the [[knowledge component]] level (June 2007).  The knowledge components associated with each correct student action (corresponding to a [[step]]) and most incorrect actions (see [[transaction]]) are determined by [[Andes]].&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The Andes log files represent a rich source of information about student problem solving but have not been studied in depth, outside the needs of specific experiments.  We have begun to study the log files and begun to promote such work in the Physics Education Research (PER) community.&lt;br /&gt;
* Studied time usage (how long does it take to apply a KC?) and time-on-task (are they really working?).  Investigated whether time-on-task could be used as a metric for student learning of KCs.&lt;br /&gt;
* Begun comparing the Log data to end-of-semester surveys administered at the USNA.  The surveys were not anonymous, so individual survey results can be matched with the associated log files.&lt;br /&gt;
* Conducted a [http://www.andestutor.org/AAPT-2007/ workshop on log file analysis] at [http://web.phys.ksu.edu/perc2007/ PERC 2007].  Two senior members of the PER community, Joe Redish and Gerd Kortemeyer, attended, expressed initial interest and corresponded with us after the conference, but no firm plans have been made.&lt;br /&gt;
&lt;br /&gt;
===Adoption of Andes=== &lt;br /&gt;
As of Fall 2008, [[Andes]] is being used at the following institutions:&lt;br /&gt;
* St. Anselm College, Manchester NH (1 instructor).&lt;br /&gt;
* US Naval Academy (1 instructor, several sections).&lt;br /&gt;
* SUNY Fredonia (1 instructor).&lt;br /&gt;
* Gannon University, Erie PA (1 instructor, several sections).&lt;br /&gt;
* Conant High School, Hoffman Estates, IL (1 instructor).&lt;br /&gt;
* Watchung Hills Regional High School, Warren NJ (2 instructors, several sections).&lt;br /&gt;
We see a shift in usage relative to previous years.  Currently, the Naval Academy &lt;br /&gt;
accounts for only 25% of our users; 32% of our users are now from high schools.&lt;br /&gt;
&lt;br /&gt;
We observe steadily growing use of Andes by individuals not enrolled in any [http://www.cmu.edu/oli/courses/physics/ OLI course].  From January to April 2008, between 90 and 278 different users (some use is anonymous, precluding an exact count) solved a total of 1647 Andes problems.  The previous semester, a total of 1260 problems were solved.&lt;br /&gt;
&lt;br /&gt;
===Advertising Andes in the physics community=== &lt;br /&gt;
We have focused our efforts on meetings of the [http://www.aapt.org American Association of Physics Teachers (AAPT)] and the [http://www.aps.org/meetings/ American Physical Society (APS)] where we have presented numerous talks, posters, and a workshop.  &lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor for Introductory Physics Homework. Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor Homework System for Introductory Physics. Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Hausmann, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Andes: An Intelligent Homework System for Introductory Physics. Contributed talk at the [http://www.aapt.org/Events/wm2007 &#039;&#039;2007 AAPT Winter Meeting&#039;&#039;], Seattle WA, January 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Changing Student Attitudes using Andes, An Intelligent Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2007 2007 AAPT Winter Meeting],&#039;&#039; Seattle WA, January 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.A21.10 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An intelligent homework helper].  Contributed talk at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.K1.199 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Changing Student Attitudes using Andes, An Intelligent Homework System.]  Poster at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  An Analysis of Student Learning Using the Andes Homework System.  Contributed talk at the [http://www.aapt.org/Events/sm2007 &#039;&#039;2007 AAPT Summer Meeting&#039;&#039;], Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An Intelligent Tutor Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2007 2007 AAPT Summer Meeting],&#039;&#039; Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande &amp;amp; K. VanLehn. Cognitive Analysis of Student Learning Using LearnLab.  Workshop presented at the [http://web.phys.ksu.edu/perc2007/ &#039;&#039;Physics Education Research Conference&#039;&#039;], Greensboro NC, August 2007.  [http://www.andestutor.org/AAPT-2007/ Workshop website].&lt;br /&gt;
* S. Katz &amp;amp; J. Connelly.  Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes.  Poster presented at the &#039;&#039;[http://web.phys.ksu.edu/perc2007/ Physics Education Research Conference],&#039;&#039; Greensboro NC, August 2007.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039;  Baltimore MD, January 2008.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; M. Wintersgill.  Student attitudes towards Andes, an intelligent tutor homework system.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039; Baltimore MD, January 2008.&lt;br /&gt;
* [http://meetings.aps.org/Meeting/OSS08/Event/86059 B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?]  Contributed talk at &#039;&#039;[http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],&#039;&#039;  Youngstown OH, March 2008.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT),&#039;&#039; Lock Haven PA, April 2008.&lt;br /&gt;
These meetings generally do not publish proceedings.&lt;br /&gt;
 &lt;br /&gt;
More recently, we have begun promoting the Physics LearnLab at regional [http://www.aapt.org AAPT] meetings:&lt;br /&gt;
* [http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],  Youngstown OH, March 2008.  Vendor exhibit.&lt;br /&gt;
* Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT), Lock Haven PA, April 2008.  Vendor exhibit.&lt;br /&gt;
*  Fall meeting of the Arizona section of the AAPT, October 2008.  Workshop for instructors.&lt;br /&gt;
In addition, we have presented Andes at other universities:  Southern Methodist University (2006), the Ohio State University (2007), Rutgers University (2007), US Air Force Academy (2007), and the US Naval Academy (2007).&lt;br /&gt;
&lt;br /&gt;
===Publications on Andes===&lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., and Wintersgill, M. (2005).  The Andes Physics Tutoring System: Lessons Learned.  &#039;&#039;International Journal of Artificial Intelligence in Education,&#039;&#039; 15 (3), 1-47. &lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R. H., Taylor, L., Treacy, D. J., Weinstein, A., and Wintersgill, M. C.  The Andes physics tutoring system: Five years of evaluations.  In G. McCalla, C. K. Looi, B. Bredeweg &amp;amp; J. Breuker (Eds.), &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  (pp. 678-685) Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* VanLehn, K., &amp;amp; van de Sande, B.  (in press) Expertise in elementary physics, and how to acquire it. In K. A. Ericsson (Ed.), &#039;&#039;Development of professional expertise:  Toward measurement of expert performance and design of optimal learning environments.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===Publications on PLLC experiments===&lt;br /&gt;
* Connelly, J. &amp;amp; Katz, S. (2006).  Intelligent dialogue support for physics problem solving:  Some preliminary mixed results.  &#039;&#039;Technology, Instruction, Cognition, and Learning,&#039;&#039; 4, 1-29.&lt;br /&gt;
* Ringenberg, M. &amp;amp; VanLehn, K. (2006). Scaffolding problem solving with annotated, worked-out examples to promote deep learning. In K. Ashley &amp;amp; M. Ikeda (Eds.), &#039;&#039;Intelligent Tutoring Systems: 8th International Conference, ITS2006.&#039;&#039; pp. 625-634. Amsterdam: IOS Press.  &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K. (2007) The impact of explicit strategy instruction on problem-solving behaviors across intelligent tutoring systems. In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp. 167-172 Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Domain-specific and domain-independent interactive behaviors in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 548-550. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Porting an intelligent tutoring system across domains. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 551-553.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Accelerated future learning via explicit instruction of a problem solving strategy. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 409-416.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Craig, S. D., VanLehn, K., Gadgil, S., &amp;amp; Chi, M. T. H. (2007). Learning from collaboratively observing videos during problem solving with Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 554-556. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Explaining self-explaining:  A contrast between content and generation.  In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 417-424. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Self-explaining in the classroom:  Learning curve evidence.  In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp. 1067-1072. Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Katz, S., Connelly, J., &amp;amp; Wilson, C. (2007). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.), &#039;&#039;Artificial Intelligence in Education 2007&#039;&#039;.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, May). Trialog: How Peer Collaboration Helps Remediate Errors in an ITS. Paper presented at the 21st meeting of the International FLAIRS Conference, Coconut Grove, FL.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, June). Shall we explain? Augmenting Learning from Intelligent Tutoring Systems and Peer Collaboration. Paper presented at the 9th meeting of the International Conference on Intelligent Tutoring Systems, Montréal, Canada.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., van de Sande, C., &amp;amp; VanLehn, K. (2008, June). Productive Dialog During Collaborative Problem Solving. Paper presented at the 2008 International Conference for the Learning Sciences, Utrecht, Netherlands.&lt;br /&gt;
&lt;br /&gt;
==Current Status==&lt;br /&gt;
&lt;br /&gt;
The PLLC at the US Naval Academy currently comprises 3-5 sections (depending on the semester) of 25 students each.  The sections are taught by Professors Mary Wintersgill and Ted McClanahan.  At Watchung Hills Regional High School, the instructors are Sophie Gershman and Brian Brown, who teach three different levels of physics courses, mostly for juniors and seniors.&lt;br /&gt;
The students use [http://www.cmu.edu/oli/ Open Learning Initiative (OLI)] to access [[Andes]], and the instructors use OLI to view gradebooks.  Both high school and college students use Andes at home to do their regular homework assignments.  Occasionally, Andes is used in class, but such “seat work” is not common.&lt;br /&gt;
&lt;br /&gt;
Raw log data from Andes is stored on OLI servers.  The raw data is periodically converted to [http://learnlab.web.cmu.edu/ DataShop] format, but the conversion process is still not completely satisfactory, as some information is still available only from the raw log data.  Researchers thus refer to both types of data.&lt;br /&gt;
&lt;br /&gt;
All user identification is encrypted.  The mapping between encrypted identities and student names is held by the Andes development programmer, Anders Weinstein.  Instructors see only the students’ user identification before encryption; researchers see only the encrypted identities.  Non-log data, such as hard-copies of midterm exams or audio files from verbal protocols, are collected as needed for specific experiments.  They are anonymized by Anders Weinstein and stored in locked file cabinets or secure servers.   &lt;br /&gt;
&lt;br /&gt;
Although most experiments are in vivo experiments conducted in the PLLC courses, some studies are conventional lab studies.   For instance, an experimenter might first run a study in the lab with paid volunteers and later do an improved version of the study in one or more PLLC classes.&lt;br /&gt;
&lt;br /&gt;
==Plans==&lt;br /&gt;
&lt;br /&gt;
Our major goal continues to be to expand the number of sites and instructors involved in the PLLC.  There are simply not enough lab slots and students to meet the existing demand from PLLC experimenters.  In order to increase involvement in the PLLC, we first need to increase the number of instructors using Andes in their courses, and make their experience a positive one.&lt;br /&gt;
&lt;br /&gt;
===Increase awareness of Andes===  &lt;br /&gt;
We need to increase awareness of Andes in the physics community.&lt;br /&gt;
To date, we have focused our efforts on national meetings of&lt;br /&gt;
the AAPT and APS.  However, we plan to broaden our efforts:&lt;br /&gt;
* We have begun to promote Andes at regional AAPT meetings and hope to expand this effort in the future.  &lt;br /&gt;
* We plan to arrange a summer school targeted mainly at regional high-school teachers of physics.  Our long-term desire is for the summer school activity to eventually grow into a community of users consisting of both high school and college level instructors.&lt;br /&gt;
* We will continue visiting physics departments at other universities.&lt;br /&gt;
* We will publish PLLC-related research in physics education journals.&lt;br /&gt;
&lt;br /&gt;
===Web-based delivery===&lt;br /&gt;
Andes currently runs on Microsoft Windows machines as a Windows executable, requiring a software download and installation before it can be run.  We lost at least two potential sites (Paul Perkins’ High School class in Bellevue WA and the US Air Force Academy) due to issues associated with this.  In both cases, instructors were enthusiastic about Andes and assigned Andes to their students, but a significant number of students had trouble installing the software or getting it to run reliably, or did not have Microsoft Windows available to them.  We believe we are losing many other potential clients due to this architecture.  Thus, we have begun the development of a new web-based user interface to allow delivery of Andes as a true web application.&lt;br /&gt;
&lt;br /&gt;
===Improvements to Andes itself===&lt;br /&gt;
Based on conversations with potential instructors as they view demonstrations of Andes and on instructors who have dropped Andes after using it, we have identified several aspects of Andes itself that we need to improve:&lt;br /&gt;
* Instructors want a user interface that appears to be simple to learn.  The new version of the user interface that we are developing will have a very simple design, making it similar to a generic drawing program (like PowerPoint).&lt;br /&gt;
* Instructors want Andes to be a commercial product.  In particular, they are worried about the long-term stability of the software product and that user support may be sporadic or unprofessional.  The new web-based user interface will allow us to deliver Andes via our partners, [http://www.webassign.net WebAssign] and [http://www.lon-capa.org LON-CAPA].  Furthermore, we plan to offer Andes under an Open Source License, to ensure long-term availability and allow others to contribute to the future development of Andes.&lt;br /&gt;
* Instructors want all reasonable student actions to be accepted.   The new user interface will feature free text input, allowing greater flexibility.&lt;br /&gt;
* Instructors want good, effective hints.  We plan to make instructor evaluations of hint sequences an integral part of future workshops and summer schools.  However, to really improve the hint quality would require that Andes maintain a model of the student across problems.  This is one aspect of expert human tutoring that we can&#039;t capture with the existing system.&lt;br /&gt;
* Other improvement requests that we hear regularly:&lt;br /&gt;
** Allow sensitivity to lengths of vectors.&lt;br /&gt;
** Allow vector equations (currently, Andes equations are all scalar).&lt;br /&gt;
** Instructor control over policy for student actions that are correct but don&#039;t contribute to a solution.&lt;br /&gt;
&lt;br /&gt;
===Grading policy===&lt;br /&gt;
Unfortunately, the current grading rubric is opaque and complicated, and we are not always confident in the validity of the scores.&lt;br /&gt;
There are two problems:&lt;br /&gt;
* We don’t have any mechanism for an instructor to understand or modify the scoring rubric.&lt;br /&gt;
* Some students become focused on raising their scores and, exploiting various weaknesses of (or incorrect inferences about) the scoring rubric, engage in behaviors that raise their scores but do not constitute good problem-solving practice.  For instance, a student will enter the final answer to a problem and then go back and add problem-solving steps until the score is acceptably high.&lt;br /&gt;
Since one of the main goals of a grading policy is to encourage students to engage in productive problem solving behavior, any changes to the grading policy must be accompanied by log file analysis.&lt;br /&gt;
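The answer-first gaming pattern described above can be detected directly from timestamped logs.  Below is a minimal sketch in Python; the two-field transaction format and the function name are illustrative simplifications, not the actual Andes log schema:

```python
# Detect the "answer first, then backfill steps" pattern in a problem's
# transaction log.  Each transaction is a (kind, seconds) pair, where kind
# is "step" or "final-answer".  This two-field format and the function
# name are illustrative simplifications, not the real Andes log schema.

def backfilled_steps(transactions):
    """Count problem-solving steps entered after the final answer was
    submitted.  A large count suggests score-chasing rather than
    genuine problem solving."""
    answer_time = None
    late_steps = 0
    for kind, seconds in transactions:
        if kind == "final-answer" and answer_time is None:
            answer_time = seconds
        elif kind == "step" and answer_time is not None:
            late_steps += 1
    return late_steps

# Example: the student answers 30 seconds in, then adds four steps.
log = [("step", 10), ("final-answer", 30),
       ("step", 40), ("step", 55), ("step", 70), ("step", 85)]
print(backfilled_steps(log))  # prints 4
```

Running such a detector over each student-problem pair would flag candidates for the log file analysis that any grading-policy change must include.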
&lt;br /&gt;
===Supporting existing Andes users===   &lt;br /&gt;
There are a number of non-PLLC instructors using Andes in their classrooms as well as a number of users not affiliated with any OLI course.&lt;br /&gt;
* Provide instructor support for setting up and running classes and user support for difficulties installing and running Andes.&lt;br /&gt;
* Add instructor-requested homework problems, continuing our policy of adding new content based on instructor requests.&lt;br /&gt;
* Add instructor-requested problem types (such as graph drawing).&lt;br /&gt;
* Fix instructor-reported bugs and complaints promptly.  In particular, Andes sometimes gives hint sequences that are not helpful, and it sometimes won&#039;t accept solution steps that instructors would allow.&lt;br /&gt;
* Develop log file analysis to detect ineffective hint sequences, common student difficulties, and plain old bugs.&lt;br /&gt;
* Eventually, hold some instructor workshops for existing instructors, so that they feel part of the Andes development process and connect with other Andes users.&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The [[knowledge component]]s (KCs) used by Andes generally do not produce the smooth learning curves that one would expect, which makes them problematic for experimenters to use as dependent measures.  We suspect that the present physics KCs implicitly combine each principle with the knowledge needed to apply it in a particular problem context.  Thus, when a KC that has been practiced several times in simple problems is used for the first time in a complex problem, the associated assistance score may be higher than expected.  Indeed, physics homework assignments commonly exercise students in applying principles across widely varying problem contexts; as the problem context varies, the difficulty of applying our present KCs varies widely, resulting in widely varying assistance scores.  We have been doing data mining to test this hypothesis, but this has been a back-burner activity and is moving slowly.&lt;br /&gt;
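One standard way to test whether a KC produces a reasonable learning curve is to fit a power law to its per-opportunity assistance scores and inspect the fit.  Here is a minimal sketch in Python, using made-up numbers; the actual Andes mining pipeline is not shown here:

```python
# Fit the standard power-law learning curve, score = a * n**(-b), to
# per-opportunity assistance scores by least squares in log-log space.
# The scores below are made-up numbers for illustration only.

import math

def fit_power_law(scores):
    """scores[i] is the mean assistance score at opportunity i+1.
    Returns (a, b) such that score is approximately a * n**(-b)."""
    xs = [math.log(i + 1) for i in range(len(scores))]
    ys = [math.log(s) for s in scores]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den              # slope in log-log space is -b
    return math.exp(my - slope * mx), -slope

# A smoothly decaying curve fits well; the hypothesis above predicts that
# mixing simple and complex problem contexts will disrupt this decay.
a, b = fit_power_law([2.0, 1.4, 1.1, 0.95, 0.83])
print(round(a, 2), round(b, 2))
```

If the hypothesis above is right, KCs exercised across widely varying problem contexts should fit such a curve poorly, with assistance scores spiking at each new context.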
&lt;br /&gt;
Here are some continuing activities associated with log files: &lt;br /&gt;
* Download log files from OLI, anonymize them, and load them into the DataShop.&lt;br /&gt;
* Andes raw logs can be converted to the DataShop format, but the converted logs often lack the information experimenters need for their analyses, so the converter scripts must be revised.&lt;br /&gt;
* Finish investigating whether time spent can be a useful metric of student learning.&lt;br /&gt;
* Continue investigating why the present KCs produce learning curves that do not match current theoretical predictions.&lt;br /&gt;
* Continue promoting log file analysis as an interesting area of research, especially for those interested in developing cognitive models of student learning.&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8788</id>
		<title>Physics</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8788"/>
		<updated>2009-01-22T18:15:23Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Studies Conducted */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Physics LearnLab Course =&lt;br /&gt;
&lt;br /&gt;
The Physics LearnLab Course (PLLC) is a research facility for studying how students learn introductory physics.  It provides baseline data on student activities throughout the physics course, and it hosts specific research studies that measure the improvement in students’ learning caused by changes in the instruction.  At this time, it is sited in the two-semester Introductory Physics courses at the US Naval Academy in Annapolis, MD and three courses at Watchung Hills Regional High School in Warren, NJ.&lt;br /&gt;
&lt;br /&gt;
In order to increase the number of LearnLab sites, it is &lt;br /&gt;
essential that we increase the number of students using Andes.&lt;br /&gt;
During 2009, we plan to create a completely new web-based user interface.  This will allow us to integrate Andes into [http://www.webassign.net WebAssign], the leading commercial provider of online physics homework, making Andes easily accessible to over a hundred thousand students.&lt;br /&gt;
&lt;br /&gt;
Students in PLLC classes use the [http://www.andestutor.org Andes] intelligent tutoring system to do their homework.  [http://www.andestutor.org Andes] allows the PLLC to collect fine-grained data on student activity through the entire semester.  The remainder of the course is taught the usual way, with lectures, labs, and a commercial paper-based textbook.  &#039;&#039;In vivo&#039;&#039; experiments take place either by modifying Andes or by running studies during lab sessions that instructors have “donated” to the PLLC.  &lt;br /&gt;
&lt;br /&gt;
== Studies Conducted ==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: center;&amp;quot;&lt;br /&gt;
|+ &#039;&#039;&#039;Summary of Studies&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
!&lt;br /&gt;
! colspan=2 | &#039;&#039;In Vivo&#039;&#039; &lt;br /&gt;
! colspan=2 | Pull Out &lt;br /&gt;
! colspan=2 | Lab &lt;br /&gt;
! colspan=4 | Capacity&lt;br /&gt;
|-&lt;br /&gt;
! Course || Run || Planned || Run || Planned || Run || Planned&lt;br /&gt;
! Total # Sections&lt;br /&gt;
! Total # Students&lt;br /&gt;
! Max # Studies / Year&lt;br /&gt;
! Max # Students / Study&lt;br /&gt;
|-&lt;br /&gt;
| Physics || 10 || 2 || 0 || 0 || 3 || 1 || 5 || 130 || 4 || 65&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Capacity was determined by counting the number of students who solved more than 40 [[Andes]] problems in Fall 2007.  There are about 25 students in a section and each LearnLab site has about 65 students.&lt;br /&gt;
&lt;br /&gt;
Completed studies:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg_Examples-as-Help | Scaffolding Problem Solving with Embedded Example to Promote Deep Learning (Ringenberg &amp;amp; VanLehn, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Diss|The effects of elaborative dialog on problem solving and learning (Hausmann &amp;amp; Chi, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Post-practice reflection (Katz)|Post-practice reflection (Katz &amp;amp; Connelly, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_questions|Deep-level questions during example studying (Craig &amp;amp; Chi, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study|Does it matter who generates the explanations? (Hausmann &amp;amp; VanLehn, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_observing|Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Reflective Dialogues (Katz)|Reflective Dialogues (Katz, Connelly &amp;amp; Treacy, 2006-2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study2|The effects of interaction on robust learning (Hausmann &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Bridging_Principles_and_Examples_through_Analogy_and_Explanation | Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Extending Reflective Dialogue Support (Katz &amp;amp; Connelly)|Extending Automated Dialogue Support for Robust Learning of Physics (Katz &amp;amp; Connelly, 2007-2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Plateau_study|The Interaction Plateau: A comparison between human tutoring, Andes, and computer-aided instruction (Hausmann, van de Sande, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Self-explanation: Meta-cognitive vs. justification prompts|Self-explanation: Meta-cognitive vs. justification prompts (Hausmann, van de Sande, Gershman, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Analogical Scaffolding in Collaborative Learning|Analogical Scaffolding in Collaborative Learning (Gadgil &amp;amp; Nokes, 2008-2009)]]&lt;br /&gt;
&lt;br /&gt;
In progress or planned:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg Ill-Defined Physics|Does Solving Ill-Defined Physics Problems Elicit More Learning than Conventional Problem Solving? (Ringenberg &amp;amp; VanLehn)]]&lt;br /&gt;
&lt;br /&gt;
*Comparing two homework systems, Sophie Gershman, 2008-2009.&lt;br /&gt;
&lt;br /&gt;
*[[Analogical Scaffolding in Collaborative Learning|Analogical Scaffolding in Collaborative Learning]], lab study (Gadgil &amp;amp; Nokes, 2008-2009)&lt;br /&gt;
&lt;br /&gt;
==Achievements==&lt;br /&gt;
&lt;br /&gt;
From its inception in January 2005 to the present, we have achieved the following: &lt;br /&gt;
&lt;br /&gt;
===Content development milestones===&lt;br /&gt;
* The fraction of Andes problems assigned by instructors at the Naval Academy has increased from 58% to 100% in the Fall semester, and from 42% to 75% in the Spring semester.&lt;br /&gt;
* We have increased the total number of  working Andes problems from 350 to 556.&lt;br /&gt;
* The number of physics principles has increased from 126 to 219.  The number of  rules in the physics “Knowledge Base” (the AI system) has increased from 619 to 915.  The number of scalar quantities defined in Andes has increased from 85 to 126.  &lt;br /&gt;
* We shot videos of problems being solved—at least one per problem set—and revised many of the older videos.  These act as worked examples.  Students who view the videos in a problem set before solving any problems have a much easier time of it. &lt;br /&gt;
&lt;br /&gt;
===Enabling Technologies===&lt;br /&gt;
* We developed a way to run Andes under [http://www.cmu.edu/oli/ OLI].  In particular, we found ways to get them to communicate through the USNA firewall, to upload log data and solution files, and to recover gracefully from most crashes. &lt;br /&gt;
* We developed a method to control the data that the OLI gradebook exports to spreadsheets so that only the data that instructors wanted was exported in a format they specified.&lt;br /&gt;
* We implemented “gating,” a method to force students to solve Andes problems in a pre-determined order.  This was needed for the Sandy Katz experiment in fall 2006.&lt;br /&gt;
* Andes raw logs can now be converted to the [http://learnlab.web.cmu.edu/ DataShop] format at the [[knowledge component]] level (June 2007).  The knowledge components associated with each correct student action (corresponding to a [[step]]) and most incorrect actions (see [[transaction]]) are determined by [[Andes]].&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The Andes log files represent a rich source of information about student problem solving but have not been studied in depth, outside the needs of specific experiments.  We have begun to study the log files and begun to promote such work in the Physics Education Research (PER) community.&lt;br /&gt;
* Studied time usage (how long does it take to apply a KC?) and time-on-task (are students really working?).  Investigated whether time-on-task could be used as a metric for student learning of KCs.&lt;br /&gt;
* Begun comparing the log data to end-of-semester surveys administered at the USNA.  The surveys were not anonymous, so individual survey results can be matched with the associated log files.&lt;br /&gt;
* Conducted a [http://www.andestutor.org/AAPT-2007/ workshop on log file analysis] at [http://web.phys.ksu.edu/perc2007/ PERC 2007].  Two senior members of the PER community, Joe Redish and Gerd Kortemeyer, attended, expressed initial interest and corresponded with us after the conference, but no firm plans have been made.&lt;br /&gt;
&lt;br /&gt;
===Adoption of Andes=== &lt;br /&gt;
As of Fall 2008, [[Andes]] is being used at the following institutions:&lt;br /&gt;
* St. Anselm College, Manchester NH (1 instructor).&lt;br /&gt;
* US Naval Academy (1 instructor, several sections).&lt;br /&gt;
* SUNY Fredonia (1 instructor).&lt;br /&gt;
* Gannon University, Erie PA (1 instructor, several sections).&lt;br /&gt;
* Conant High School, Hoffman Estates, IL (1 instructor).&lt;br /&gt;
* Watchung Hills Regional High School, Warren NJ (2 instructors, several sections).&lt;br /&gt;
We see a shift in usage relative to previous years: the Naval Academy currently accounts for only 25% of our users, while 32% are now from high schools.&lt;br /&gt;
&lt;br /&gt;
We observe steadily growing use of Andes by individuals not enrolled in any [http://www.cmu.edu/oli/courses/physics/ OLI course].  From January to April 2008, between 90 and 278 different users (some use is anonymous, precluding an exact count) solved a total of 1647 Andes problems.  The previous semester, a total of 1260 problems were solved.&lt;br /&gt;
&lt;br /&gt;
===Advertising Andes in the physics community=== &lt;br /&gt;
We have focused our efforts on meetings of the [http://www.aapt.org American Association of Physics Teachers (AAPT)] and the [http://www.aps.org/meetings/ American Physical Society (APS)] where we have presented numerous talks, posters, and a workshop.  &lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor for Introductory Physics Homework. Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor Homework System for Introductory Physics. Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Hausmann, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Andes: An Intelligent Homework System for Introductory Physics. Contributed talk at the [http://www.aapt.org/Events/wm2007 &#039;&#039;2007 AAPT Winter Meeting&#039;&#039;], Seattle WA, January 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Changing Student Attitudes using Andes, An Intelligent Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2007 2007 AAPT Winter Meeting],&#039;&#039; Seattle WA, January 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.A21.10 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An intelligent homework helper].  Contributed talk at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.K1.199 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Changing Student Attitudes using Andes, An Intelligent Homework System.]  Poster at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  An Analysis of Student Learning Using the Andes Homework System.  Contributed talk at the [http://www.aapt.org/Events/sm2007 &#039;&#039;2007 AAPT Summer Meeting&#039;&#039;], Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An Intelligent Tutor Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2007 2007 AAPT Summer Meeting],&#039;&#039; Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande &amp;amp; K. VanLehn. Cognitive Analysis of Student Learning Using LearnLab.  Workshop presented at the [http://web.phys.ksu.edu/perc2007/ &#039;&#039;Physics Education Research Conference&#039;&#039;], Greensboro NC, August 2007.  [http://www.andestutor.org/AAPT-2007/ Workshop website].&lt;br /&gt;
* S. Katz &amp;amp; J. Connelly.  Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes.  Poster presented at the &#039;&#039;[http://web.phys.ksu.edu/perc2007/ Physics Education Research Conference],&#039;&#039; Greensboro NC, August 2007.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039;  Baltimore MD, January 2008.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; M. Wintersgill.  Student attitudes towards Andes, an intelligent tutor homework system.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039; Baltimore MD, January 2008.&lt;br /&gt;
* [http://meetings.aps.org/Meeting/OSS08/Event/86059 B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?]  Contributed talk at &#039;&#039;[http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],&#039;&#039;  Youngstown OH, March 2008.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT),&#039;&#039; Lock Haven PA, April 2008.&lt;br /&gt;
These meetings generally do not publish proceedings.&lt;br /&gt;
 &lt;br /&gt;
More recently, we have begun promoting the Physics LearnLab at regional [http://www.aapt.org AAPT] meetings:&lt;br /&gt;
* [http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],  Youngstown OH, March 2008.  Vendor exhibit.&lt;br /&gt;
* Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT), Lock Haven PA, April 2008.  Vendor exhibit.&lt;br /&gt;
* Fall meeting of the Arizona Section of the AAPT, October 2008.  Workshop for instructors.&lt;br /&gt;
In addition, we have presented Andes at other universities:  Southern Methodist University (2006), the Ohio State University (2007), Rutgers University (2007), US Air Force Academy (2007), and the US Naval Academy (2007).&lt;br /&gt;
&lt;br /&gt;
===Publications on Andes===&lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., and Wintersgill, M.  (2005). The Andes Physics Tutoring System: Lessons Learned.  &#039;&#039;International Journal of Artificial Intelligence in Education,&#039;&#039; 15 (3), 147-204. &lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R. H., Taylor, L., Treacy, D. J., Weinstein, A., and Wintersgill, M. C.  (2005). The Andes physics tutoring system: Five years of evaluations.  In G. McCalla, C. K. Looi, B. Bredeweg &amp;amp; J. Breuker (Eds.), &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  (pp. 678-685) Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* VanLehn, K., &amp;amp; van de Sande, B.  (in press) Expertise in elementary physics, and how to acquire it. In K. A. Ericsson (Ed.), &#039;&#039;Development of professional expertise:  Toward measurement of expert performance and design of optimal learning environments.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===Publications on PLLC experiments===&lt;br /&gt;
* Connelly, J. &amp;amp; Katz, S. (2006).  Intelligent dialogue support for physics problem solving:  Some preliminary mixed results.  &#039;&#039;Technology, Instruction, Cognition, and Learning,&#039;&#039; 4, 1-29.&lt;br /&gt;
* Ringenberg, M. &amp;amp; VanLehn, K. (2006). Scaffolding problem solving with annotated, worked-out examples to promote deep learning. In K. Ashley &amp;amp; M. Ikeda (Eds.), &#039;&#039;Intelligent Tutoring Systems: 8th International Conference, ITS2006.&#039;&#039; pp. 625-634. Amsterdam: IOS Press.  &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K. (2007) The impact of explicit strategy instruction on problem-solving behaviors across intelligent tutoring systems. In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp. 167-172 Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Domain-specific and domain-independent interactive behaviors in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 548-550. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Porting an intelligent tutoring system across domains. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 551-553.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Accelerated future learning via explicit instruction of a problem solving strategy. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 409-416.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Craig, S. D., VanLehn, K., Gadgil, S., &amp;amp; Chi, M. T. H. (2007). Learning from collaboratively observing videos during problem solving with Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 554-556. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Explaining self-explaining:  A contrast between content and generation.  In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 417-424. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Self-explaining in the classroom:  Learning curve evidence   In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp 1067-1072 Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Katz, S., Connelly, J., &amp;amp; Wilson, C. (2007). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.), &#039;&#039;Artificial Intelligence in Education 2007&#039;&#039;.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, May). Trialog: How Peer Collaboration Helps Remediate Errors in an ITS. Paper presented at the 21st meeting of the International FLAIRS Conference, Coconut Grove, FL.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, June). Shall we explain? Augmenting Learning from Intelligent Tutoring Systems and Peer Collaboration. Paper presented at the 9th meeting of the International Conference on Intelligent Tutoring Systems, Montréal, Canada.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., van de Sande, C., &amp;amp; VanLehn, K. (2008, June). Productive Dialog During Collaborative Problem Solving. Paper presented at the 2008 International Conference for the Learning Sciences, Utrecht, Netherlands.&lt;br /&gt;
&lt;br /&gt;
==Current Status==&lt;br /&gt;
&lt;br /&gt;
The PLLC at the US Naval Academy currently comprises 3-5 sections (depending on the semester) of 25 students each.  The sections are taught by Professors Mary Wintersgill and Ted McClanahan.  At Watchung Hills Regional High School, the instructors are Sophie Gershman and Brian Brown, who teach three different levels of physics courses, mostly for juniors and seniors.&lt;br /&gt;
The students use [http://www.cmu.edu/oli/ Open Learning Initiative (OLI)] to access [[Andes]], and the instructors use OLI to view gradebooks.  Both high school and college students use Andes at home to do their regular homework assignments.  Occasionally, Andes is used in class, but such “seat work” is not common.&lt;br /&gt;
&lt;br /&gt;
Raw log data from Andes is stored on OLI servers.  The raw data is periodically converted to [http://learnlab.web.cmu.edu/ DataShop] format, but the conversion process is still not completely satisfactory, as some information is still available only from the raw log data.  Researchers thus refer to both types of data.&lt;br /&gt;
&lt;br /&gt;
All user identification is encrypted.  The mapping between encrypted identities and student names is held by the Andes development programmer, Anders Weinstein.  Instructors see only the students’ user identification before encryption; researchers see only the encrypted identities.  Non-log data, such as hard copies of midterm exams or audio files from verbal protocols, are collected as needed for specific experiments.  They are anonymized by Anders Weinstein and stored in locked file cabinets or on secure servers.&lt;br /&gt;
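The encryption step described above can be thought of as deterministic pseudonymization: the same raw ID always maps to the same opaque identifier, so log records remain linkable across sessions, while the raw ID is unrecoverable without a secret held by one trusted party.  A minimal sketch in Python, assuming HMAC-SHA256 (the scheme actually used by Andes and OLI is not specified here, and the key below is hypothetical):

```python
# Deterministic pseudonymization of user IDs with a keyed one-way hash.
# The same raw ID always yields the same pseudonym, so records for a
# student can still be linked across sessions, but the raw ID cannot be
# recovered without the secret key.  HMAC-SHA256 is an illustrative
# choice; the scheme actually used by Andes/OLI is not documented here.

import hashlib
import hmac

SECRET_KEY = b"held-only-by-the-trusted-anonymizer"  # hypothetical key

def pseudonym(user_id):
    digest = hmac.new(SECRET_KEY, user_id.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest[:12]  # short, stable, opaque identifier

# Same input, same pseudonym; distinct inputs collide only negligibly.
print(pseudonym("m123456") == pseudonym("m123456"))  # prints True
print(pseudonym("m123456") == pseudonym("m654321"))  # prints False
```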
&lt;br /&gt;
Although most experiments are in vivo experiments conducted in the PLLC courses, some studies are conventional lab studies.   For instance, an experimenter might first run a study in the lab with paid volunteers and later do an improved version of the study in one or more PLLC classes.&lt;br /&gt;
&lt;br /&gt;
==Plans==&lt;br /&gt;
&lt;br /&gt;
Our major goal continues to be to expand the number of sites and instructors involved in the PLLC.  There are simply not enough lab slots and students to meet the existing demand from PLLC experimenters.  In order to increase involvement in the PLLC, we first need to increase the number of instructors using Andes in their courses, and make their experience a positive one.&lt;br /&gt;
&lt;br /&gt;
===Increase awareness of Andes===  &lt;br /&gt;
We need to increase awareness of Andes in the physics community.&lt;br /&gt;
To date, we have focused our efforts on national meetings of&lt;br /&gt;
the AAPT and APS.  However, we plan to broaden our efforts:&lt;br /&gt;
* We have begun to promote Andes at regional AAPT meetings and hope to expand this effort in the future.  &lt;br /&gt;
* We plan to arrange a summer school targeted mainly at regional high-school physics teachers.  Our long-term goal is for the summer school to grow into a community of users that includes both high school and college instructors.&lt;br /&gt;
* We will continue visiting physics departments at other universities.&lt;br /&gt;
* We will publish PLLC-related research in physics education journals.&lt;br /&gt;
&lt;br /&gt;
===Web-based delivery===&lt;br /&gt;
Andes currently runs on Microsoft Windows machines as a Windows executable, requiring a software download and installation before it can be run.  We lost at least two potential sites (Paul Perkins’ High School class in Bellevue WA and the US Air Force Academy) to installation issues: in both cases, instructors were enthusiastic about Andes and assigned it to their students, but a significant number of students had trouble installing the software, getting it to run reliably, or did not have Microsoft Windows available to them.  We believe this architecture is costing us many other potential adopters.  Thus, we have begun developing a new web-based user interface that will deliver Andes as a true web application.&lt;br /&gt;
&lt;br /&gt;
===Improvements to Andes itself===&lt;br /&gt;
Based on conversations with potential instructors as they view demonstrations of Andes, and on feedback from instructors who have dropped Andes after using it, we have identified several aspects of Andes that need improvement:&lt;br /&gt;
* Instructors want a user interface that looks simple to learn.  The new user interface that we are developing will have a very simple design, similar to a generic drawing program (like PowerPoint).&lt;br /&gt;
* Instructors want Andes to behave like a commercial product.  In particular, they worry that the software will not be maintained over the long term and that user support may be sporadic or unprofessional.  The new web-based interface will allow us to deliver Andes via our partners, [http://www.webassign.net WebAssign] and [http://www.lon-capa.org LON-CAPA].  Furthermore, we plan to offer Andes under an open-source license, to ensure long-term availability and allow others to contribute to its future development.&lt;br /&gt;
* Instructors want all reasonable student actions to be accepted.   The new user interface will feature free text input, allowing greater flexibility.&lt;br /&gt;
* Instructors want good, effective hints.  We plan to make instructor evaluations of hint sequences an integral part of future workshops and summer schools.  However, to really improve the hint quality would require that Andes maintain a model of the student across problems.  This is one aspect of expert human tutoring that we can&#039;t capture with the existing system.&lt;br /&gt;
* Other improvement requests that we hear regularly:&lt;br /&gt;
** Allow sensitivity to lengths of vectors.&lt;br /&gt;
** Allow vector equations (currently, Andes equations are all scalar).&lt;br /&gt;
** Instructor control over policy for student actions that are correct but don&#039;t contribute to a solution.&lt;br /&gt;
&lt;br /&gt;
===Grading policy===&lt;br /&gt;
Unfortunately, the current grading rubric is opaque and complicated, and we are not always confident in the validity of the scores.&lt;br /&gt;
There are two problems:&lt;br /&gt;
* We don’t have any mechanism for an instructor to understand or modify the scoring rubric.&lt;br /&gt;
* Some students become focused on raising their scores and, due to various weaknesses of (or incorrect inferences about) the scoring rubric, engage in behaviors that may raise their scores but do not constitute good problem-solving practice.  For instance, a student will put in the final answer to a problem, and then go back and add problem-solving steps until their score is acceptably high.&lt;br /&gt;
Since one of the main goals of a grading policy is to encourage students to engage in productive problem solving behavior, any changes to the grading policy must be accompanied by log file analysis.&lt;br /&gt;
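The gaming pattern described above can be flagged mechanically in the logs by checking, for each problem attempt, whether solution steps are entered after the final answer. A minimal sketch in Python (the event encoding and function name are illustrative assumptions, not part of the actual Andes log format):&lt;br /&gt;

```python
def gamed_score(events):
    """Flag attempts where a student enters the final answer first and then
    backfills solution steps to raise the score.

    events: time-ordered list of event kinds for one problem attempt,
    e.g. ["step", "answer", "step"].  (Illustrative encoding, not the
    real Andes log schema.)
    Returns True if any step is entered after the final answer.
    """
    try:
        first_answer = events.index("answer")
    except ValueError:
        return False  # no final answer entered, nothing to flag
    return "step" in events[first_answer + 1:]
```

A per-student rate of flagged attempts could then be compared before and after any grading-policy change.&lt;br /&gt;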
&lt;br /&gt;
===Supporting existing Andes users===   &lt;br /&gt;
There are a number of non-PLLC instructors using Andes in their classrooms as well as a number of users not affiliated with any OLI course.&lt;br /&gt;
* Provide instructor support for setting up and running classes and user support for difficulties installing and running Andes.&lt;br /&gt;
* Add instructor-requested homework problems.  We will continue our policy of adding new content based on instructor requests.&lt;br /&gt;
* Add instructor-requested problem types (such as graph drawing).&lt;br /&gt;
* Fix instructor-reported bugs and complaints promptly.  In particular, Andes sometimes gives hint sequences that are not helpful.  Also, it sometimes won&#039;t accept solution steps that instructors would allow.&lt;br /&gt;
* Develop log file analysis to detect ineffective hint sequences, common student difficulties, and plain old bugs.&lt;br /&gt;
* Eventually, hold some instructor workshops for existing instructors, so that they feel part of the Andes development process and connect with other Andes users.&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The [[knowledge component]]s (KCs) used by Andes generally do not produce the nice learning curves that one would expect, which makes it problematic for experimenters to use them as dependent measures.  We suspect that the present physics KCs implicitly contain the knowledge needed for applying a principle within a problem context along with the principle itself.  Thus, when a KC that has been practiced several times in simple problems is used for the first time in a complex problem, the associated assistance score may be higher than expected.  In fact, it is common practice in physics homework assignments to exercise students in applying physics principles in widely varying problem contexts.  Thus, as the problem context varies, the difficulty of applying our present KCs varies widely, resulting in widely varying assistance scores.  We have been doing data mining to test this hypothesis, but this has been a back-burner activity and is moving slowly.&lt;br /&gt;
&lt;br /&gt;
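One way to pursue this hypothesis in the log data is to build a learning curve for each KC: group students&#039; first attempts by the number of prior opportunities they have had with that KC, and check whether the error rate declines smoothly as opportunities accumulate. A minimal sketch in Python (the row format and function name are illustrative assumptions, not the DataShop schema):&lt;br /&gt;

```python
from collections import defaultdict

def learning_curve(rows):
    """Mean first-attempt error rate per opportunity, for each KC.

    rows: time-ordered (student, kc, correct_first_try) tuples.
    Returns {kc: [error rate at opportunity 1, opportunity 2, ...]}.
    """
    seen = defaultdict(int)  # (student, kc) -> opportunities so far
    outcomes = defaultdict(lambda: defaultdict(list))  # kc -> opportunity -> errors
    for student, kc, correct in rows:
        opp = seen[(student, kc)]
        outcomes[kc][opp].append(0 if correct else 1)
        seen[(student, kc)] += 1
    return {kc: [sum(v) / len(v) for _, v in sorted(by_opp.items())]
            for kc, by_opp in outcomes.items()}
```

A KC whose curve fails to decline, or spikes when complex problems first appear, would support the suspicion that the KC conflates the principle with context-specific application knowledge.&lt;br /&gt;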
Here are some continuing activities associated with log files: &lt;br /&gt;
* Download log files from OLI, anonymize them, and load them into the DataShop.&lt;br /&gt;
* Andes raw logs can be converted to the DataShop format, but the converted logs often do not have the right information in them for the kinds of analysis experimenters want to do, so the converter scripts must be changed.&lt;br /&gt;
* Finish investigating whether time spent can be a useful metric of student learning.&lt;br /&gt;
* Continue investigating why the present KCs produce learning curves that do not match current theoretical predictions.&lt;br /&gt;
* Continue promoting log file analysis as an interesting area of research, especially for those interested in developing cognitive models of student learning.&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8787</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8787"/>
		<updated>2009-01-22T18:11:27Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on [[collaboration| collaborative learning]] has been conducted over the past few decades. The idea that two heads may be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of [[sense making]] processes: structured collaborative environments provide the learner with more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare the two solution procedures part by part, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). They were given prompts in the form of questions to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control, in which students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: One problem on the students’ regular mid-term exam was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, the students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8786</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8786"/>
		<updated>2009-01-22T18:09:23Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students&#039; learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads may be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of [[sense making]] processes: structured collaborative environments provide the learner with more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: read [[worked examples]], [[self-explanation|self-explain]] [[worked examples]], or engage in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. Similar to the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on several measures; specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Because this exam occurred a week after the training, and the training itself took just under 2 hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N. J., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8785</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8785"/>
		<updated>2009-01-22T18:09:13Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the [[sense-making]] processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less on the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) identified three mechanisms by which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and work jointly toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from undergraduates’ physics problem solving showed that all three mechanisms are at play in collaborative problem solving. However, the first two benefit both partners, whereas the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: read [[worked examples]], [[self-explanation|self-explain]] [[worked examples]], or engage in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. Similar to the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on several measures; specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Because this exam occurred a week after the training, and the training itself took just under 2 hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N. J., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.&lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition and Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8784</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8784"/>
		<updated>2009-01-22T18:08:10Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Hypothesis */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison with prompts that focus on single examples (non-comparison) and with a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than those with the same misconception, or those without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then the roles switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). In addition, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts, except that they focused on a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov, Analogy: Perspectives from Cognitive Science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition and Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8783</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8783"/>
		<updated>2009-01-22T18:07:47Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison with prompts that focus on single examples (non-comparison) and with a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than those with the same misconception, or those without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then the roles switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). In addition, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* [[Normal post-test]]: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* [[Robust learning]]&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training itself lasted just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8782</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8782"/>
		<updated>2009-01-22T18:07:15Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via [[robust learning|robust]] measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads can be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other of tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training itself lasted just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8781</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8781"/>
		<updated>2009-01-22T18:06:00Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that students who held misconceptions distinct from each other’s were more likely to learn from collaborating than students who shared the same misconception or held no misconception at all. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other of tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant [[knowledge components]]. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and instead generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, and then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1. Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2. Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: One problem on the students’ regular mid-term exam was similar to the training. Because this exam occurred a week after the training, which took place in just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students completed their regular homework problems using Andes. They could do the problems whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8780</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8780"/>
		<updated>2009-01-22T18:05:18Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that students who held misconceptions distinct from each other’s were more likely to learn from collaborating than students who shared the same misconception or held no misconception at all. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other of tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explanation|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society. &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8779</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8779"/>
		<updated>2009-01-22T18:04:28Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, not much work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than students who shared the same misconception or held none. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other of tutee, and then the roles switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are more beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the knowledge organization underlying expertise (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: read [[worked examples]], [[self-explanations|self-explain]] [[worked examples]], or engage in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J.A., Stout, R. (2000). Measuring team knowledge. Human Factors 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society. &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8778</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8778"/>
		<updated>2009-01-22T18:03:50Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and [[transfer]]. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than pairs who shared the same misconception or who had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays tutor and the other tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the organization underlying expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], [[self-explain|self-explaining]] [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing Tutorial Dialogues Collaboratively: Insights About Human Tutoring Effectiveness From Vicarious Learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society. &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J., &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8777</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8777"/>
		<updated>2009-01-22T18:00:48Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Background and Significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than pairs who shared the same misconception or who had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays tutor and the other tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the organization underlying expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading [[worked examples]], self-explaining [[worked examples]], or engaging in [[analogical comparison]] of [[worked examples]]. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. Similar to the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to act as prompts for explaining the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8776</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8776"/>
		<updated>2009-01-22T17:59:37Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributers&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from [[worked examples]] across different collaborative contexts. We compare prompts that encourage analogical comparison with prompts that focus on single examples (non-comparison) and with a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative Learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that providing scripted problem-solving activities (e.g., one participant plays the role of tutor, the other of tutee, and then they switch) facilitates collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] facilitate not only analogical learning but also other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. Similar to the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to act as prompts for explaining the worked example. These prompts were equivalent to the comparison prompts; however, they focused only on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H. &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-sixth Cognitive Science Proceedings.  &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8775</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8775"/>
		<updated>2009-01-22T17:58:23Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributers&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition. Students’ learning processes will be analyzed by examining their verbal protocols, and learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than pairs sharing the same misconception or holding none. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays tutor and the other tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments give the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they explicitly compared each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test: Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the student’s regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, the student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer Interaction and Problem Solving: When Are Two Heads Better Than One? Child Development, 59(1), 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving Coordination in Collaborative Problem-Solving Groups. Journal of the Learning Sciences 9(4), 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The Scientist InVivo: How scientists think and reason in the laboratory. In Magnani, L., Nersessian, N., &amp;amp; Thagard, P. Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, Holyoak, K.J., &amp;amp; Kokinov, B. Analogy: Perspectives from Cognitive Science. MIT press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society. &lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada. &lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629. &lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on computer supported collaborative learning. &lt;br /&gt;
*Nokes, T. J. &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In the Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, Erlbaum. &lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175. &lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two Wrongs May Make a Right ... If They Argue Together! Cognition &amp;amp; Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8774</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8774"/>
		<updated>2009-01-22T17:57:03Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: /* Further Information */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition. Students’ learning processes will be analyzed by examining their verbal protocols, and learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than pairs sharing the same misconception or holding none. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed both the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays tutor and the other tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments give the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first, “other-directed explaining,” occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they explicitly compared each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Question prompts were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, each student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
*Azmitia, M. (1988). Peer interaction and problem solving: When are two heads better than one? Child Development, 59, 87-96.&lt;br /&gt;
*Barron, B. (2000). Achieving coordination in collaborative problem-solving groups. Journal of the Learning Sciences, 9, 403-436.&lt;br /&gt;
*Catrambone, R. (1996). Generalizing solution procedures learned from examples. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 1020-1031.&lt;br /&gt;
*Catrambone, R. (1998). The subgoal learning model: Creating better examples so that students can solve novel problems. Journal of Experimental Psychology: General, 127, 355-376.&lt;br /&gt;
*Catrambone, R., &amp;amp; Holyoak, K. J. (1989). Overcoming contextual limitations on problem solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.&lt;br /&gt;
*Chase, W. G., &amp;amp; Simon, H. A. (1973).  Perception in chess. Cognitive Psychology, 4, 55-81.&lt;br /&gt;
*Chen, Z. (1999). Schema induction in children’s analogical problem solving. Journal of Educational Psychology, 91, 703-715.&lt;br /&gt;
*Chi, M. T. H., Feltovich, P. J., &amp;amp; Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.&lt;br /&gt;
*Chi, M. T. H., Roy, M., &amp;amp; Hausmann, R. G. M. (2008). Observing tutorial dialogues collaboratively: Insights about human tutoring effectiveness from vicarious learning. Cognitive Science, 32, 301-341.&lt;br /&gt;
*Cooke, N., Salas, E., Cannon-Bowers, J. A., &amp;amp; Stout, R. (2000). Measuring team knowledge. Human Factors, 42, 151-173.&lt;br /&gt;
*Cummins, D. D. (1992). Role of analogical reasoning in the induction of problem categories. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 1103-1124.&lt;br /&gt;
*Dunbar, K. (1999). The scientist in vivo: How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, &amp;amp; P. Thagard (Eds.), Model-based reasoning in scientific discovery. Plenum Press.&lt;br /&gt;
*Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, &amp;amp; B. Kokinov (Eds.), Analogy: Perspectives from cognitive science. MIT Press.&lt;br /&gt;
*Gentner, D., Loewenstein, J., &amp;amp; Thompson, L. (2003). Learning and transfer: A general role for analogical encoding. Journal of Educational Psychology, 95, 393-408.&lt;br /&gt;
*Gick, M. L., &amp;amp; Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15, 1-38.&lt;br /&gt;
*Hausmann, R. G. M., Chi, M. T. H., &amp;amp; Roy, M. (2004). Learning from collaborative problem solving: An analysis of three dialogue patterns. In the Twenty-Sixth Cognitive Science Proceedings.&lt;br /&gt;
*Hausmann, R. G. M. (2006, July). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Conference of the Cognitive Science Society, Vancouver, Canada.&lt;br /&gt;
*Hummel, J. E., &amp;amp; Holyoak, K. J. (2003). A symbolic-connectionist theory of relational inference and generalization. Psychological Review, 110, 220-264.&lt;br /&gt;
*Kurtz, K. J., Miao, C. H., &amp;amp; Gentner, D. (2001). Learning by analogical bootstrapping. Journal of the Learning Sciences, 10, 417-446.&lt;br /&gt;
*Larkin, J., McDermott, J., Simon, D. P., &amp;amp; Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335-1342.&lt;br /&gt;
*Lin, X. (2001). Designing metacognitive activities. Educational Technology Research &amp;amp; Development, 49, 1042-1629.&lt;br /&gt;
*McLaren, B., Walker, E., Koedinger, K., Rummel, N., &amp;amp; Spada, H. (2005). Improving algebra learning and collaboration through collaborative extensions to the algebra tutor. Conference on Computer Supported Collaborative Learning.&lt;br /&gt;
*Nokes, T. J., &amp;amp; VanLehn, K. (2008). Bridging principles and examples through analogy and explanation. In Proceedings of the 8th International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum.&lt;br /&gt;
*Novick, L. R., &amp;amp; Holyoak, K. J. (1991). Mathematical problem solving by analogy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 3, 398-415.&lt;br /&gt;
*Ohlsson, S. (1996). Learning from performance errors. Psychological Review, 103, 241-262.&lt;br /&gt;
*Paas, F. G. W. C., &amp;amp; Van Merrienboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem solving skills: A cognitive-load approach. Journal of Educational Psychology, 86, 122-133.&lt;br /&gt;
*Palincsar, A. S., &amp;amp; Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1, 117-175.&lt;br /&gt;
*Schwarz, B. B., Neuman, Y., &amp;amp; Biezuner, S. (2000). Two wrongs may make a right ... if they argue together! Cognition and Instruction, 18, 461-494.&lt;br /&gt;
*Ward, M., &amp;amp; Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1-39.&lt;br /&gt;
&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8773</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8773"/>
		<updated>2009-01-22T17:54:27Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison and prompts that focus on single examples (non-comparison) against a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001, for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: the structured collaborative environments provide learners with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). In addition, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the organization underlying expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and focusing on their commonalities, students can attend to the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was experimental condition, which had three levels: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts, in the form of questions, were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test (near transfer, immediate): After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under two hours, each student’s performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. &lt;br /&gt;
**Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on rotational dynamics. Log data from the rotational dynamics homework will be analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8772</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8772"/>
		<updated>2009-01-22T17:52:59Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison and prompts that focus on single examples (non-comparison) against a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwarz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001, for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: the structured collaborative environments provide learners with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). In addition, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the organization underlying expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on the common causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merriënboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in that research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the corresponding part of the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). They were given prompts in the form of questions to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: One problem on the students’ regular mid-term exam was similar to the training problems. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
**Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework were analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8771</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8771"/>
		<updated>2009-01-22T17:52:04Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than students who shared the same misconception or had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001, for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other of tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less on the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and work jointly toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two benefit both partners, whereas the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on the common causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merriënboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in that research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds on these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
*Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the corresponding part of the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). They were given prompts in the form of questions to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
*Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations, then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
*Problem-Solving Condition: The problem-solving condition served as a control: students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
**Long-term retention: One problem on the students’ regular mid-term exam was similar to the training problems. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
**Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
**Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework were analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8770</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8770"/>
		<updated>2009-01-22T17:50:08Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than students who shared the same misconception or had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001, for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other of tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and less on the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and work jointly toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two benefit both partners, whereas the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), construction of a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparison improves collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on the common causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merriënboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990).&lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting.&lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can [[analogical comparison]] help students collaborate effectively?&lt;br /&gt;
* Can [[analogical comparison]] also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test.&lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students completed their regular homework problems using Andes. Students could do them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electrical fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8769</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8769"/>
		<updated>2009-01-22T17:44:45Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison to prompts that focus on single examples (non-comparison) and to a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000).&lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007).&lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments.&lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration.&lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is an organized knowledge structure representing the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the knowledge organization underlying expertise (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems.&lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merriënboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990).&lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting.&lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can analogical comparison help students collaborate effectively?&lt;br /&gt;
* Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions, and students in both the compare and non-compare conditions will outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test.&lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students completed their regular homework problems using Andes. Students could do them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electrical fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8768</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8768"/>
		<updated>2009-01-22T17:44:27Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison to prompts that focus on single examples (non-comparison) and to a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000).&lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007).&lt;br /&gt;
These results are typically explained in terms of the sense-making processes in which the structured collaborative environments provide the learner more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments.&lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration.&lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is an organized knowledge structure representing the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the knowledge organization underlying expertise (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems.&lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Van Merriënboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990).&lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can analogical comparison help students collaborate effectively?&lt;br /&gt;
* Can analogical comparison also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent Variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8767</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8767"/>
		<updated>2009-01-22T17:43:51Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception or with no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally. Collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons promote successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* How can analogical comparison help students collaborate effectively?&lt;br /&gt;
* Can analogical comparison also facilitate other learning mechanisms such as explanation, co-construction, and error-correction during collaboration?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8766</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8766"/>
		<updated>2009-01-22T17:42:59Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception or with no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally. Collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the former two benefit both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build on this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons promote successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized to be the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features than the examples but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
* [[How can analogical comparison help students collaborate effectively?]]&lt;br /&gt;
* [[Can analogical comparison facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told that their task was to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1. Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2. Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students completed their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
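The page does not say how the assistance scores were computed. A common DataShop-style definition counts hints requested plus incorrect attempts per problem; the sketch below assumes that definition, and the log format and function name are hypothetical.&lt;br /&gt;

```python
from collections import defaultdict

def assistance_scores(log_rows):
    """Per-problem assistance score from tutor log rows.

    Each row is (problem_id, outcome), with outcome one of
    "correct", "incorrect", or "hint".  Score = hints requested
    plus incorrect attempts -- a common DataShop-style definition;
    the study's exact formula may differ.
    """
    scores = defaultdict(int)
    for problem_id, outcome in log_rows:
        if outcome in ("incorrect", "hint"):
            scores[problem_id] += 1
    return dict(scores)

log = [
    ("p1", "hint"), ("p1", "incorrect"), ("p1", "correct"),
    ("p2", "correct"),
]
print(assistance_scores(log))  # {'p1': 2}
```

Problems could then be grouped into near- and far-transfer sets and mean scores compared across conditions.&lt;br /&gt;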
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8765</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8765"/>
		<updated>2009-01-22T17:40:48Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that putting two heads together could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding distinct misconceptions collaborated, they were more likely to learn than students sharing the same misconception or students without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and the roles then switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: the structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are more beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Lowenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told that their task was to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given the explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1. Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2. Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students completed their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8764</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8764"/>
		<updated>2009-01-22T17:40:26Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated: -----&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&#039;&#039;&#039;Collaborative learning&#039;&#039;&#039;&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that putting two heads together could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding distinct misconceptions collaborated, they were more likely to learn than students sharing the same misconception or students without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and the roles then switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: the structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Learning Mechanisms Underlying Collaboration&#039;&#039;&#039;&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are more beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Schema Acquisition and Analogical Comparison&#039;&#039;&#039;&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Lowenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far-transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* &#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test.&lt;br /&gt;
* &#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic field homework was analyzed as a measure of acceleration of future learning.&lt;br /&gt;
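The assistance-score measure mentioned above can be sketched in a few lines. This is a minimal illustration only: the log format, field names, and action labels below are hypothetical and do not reflect the actual Andes or DataShop log schema.

```python
# Hypothetical sketch of an assistance-score calculation: per problem,
# count hint requests plus incorrect entries in the homework log.
# The row format here is an assumption, not the real Andes log schema.
from collections import defaultdict

def assistance_scores(log_rows):
    """Return a dict mapping each problem name to its assistance score."""
    scores = defaultdict(int)
    for row in log_rows:
        if row["action"] in ("HINT_REQUEST", "INCORRECT"):
            scores[row["problem"]] += 1
    return dict(scores)

rows = [
    {"problem": "efield1", "action": "CORRECT"},
    {"problem": "efield1", "action": "HINT_REQUEST"},
    {"problem": "efield1", "action": "INCORRECT"},
    {"problem": "efield2", "action": "CORRECT"},
]
print(assistance_scores(rows))  # prints {'efield1': 2}
```

Problems on which students needed no assistance (e.g., efield2 above) simply receive no entry; they could equivalently be initialized to zero.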
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8763</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8763"/>
		<updated>2009-01-22T17:39:42Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Analogical Scaffolding in Collaborative Learning ==&lt;br /&gt;
&#039;&#039;Soniya Gadgil &amp;amp; Timothy Nokes&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated: -----&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception, or with no misconception at all. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer but with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner is engaged in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are more beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students could focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations for the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of one solution procedure to the other, noting the similarities and differences between the two (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps, and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; students collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ according to the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* &#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, 5 kinds of questions were included in the post-test.&lt;br /&gt;
* &#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training. Since this exam occurred a week after the training, and the training took place in just under 2 hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electrical fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic field homework was analyzed as a measure of acceleration of future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8762</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8762"/>
		<updated>2009-01-22T17:38:02Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated: -----&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. As students learn to solve physics problems in the domain of rotational kinematics, we compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with distinct misconceptions collaborated, they were more likely to learn than those with the same misconception, or with no misconception at all. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer but with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms by which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner is engaged in a knowledge-building activity for his or her own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are more beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students could focus on the causal structure and improve their learning about the concept. Kurtz et al. (2001) showed that students who were learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures, noting similarities and differences between them (e.g., in goals, concepts, and solution steps). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1. Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2. Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
&#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
* Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
&#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
* Long-term retention: One problem on the students’ regular mid-term exam was similar to the training problems. Because this exam occurred a week after the training, which took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
* Near and far transfer: After training, students completed their regular homework problems using Andes. Students did the problems whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
* Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework were analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8761</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8761"/>
		<updated>2009-01-22T17:37:45Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated: -----&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, the presence of cognitive conflict is an important variable in collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than students who shared the same misconception or held none. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide learners with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, whereas the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is an organization of the knowledge associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the underlying concept. Kurtz et al. (2001) showed that students learning about heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the self-explanation and analogical comparison groups outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures, noting similarities and differences between them (e.g., in goals, concepts, and solution steps). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given the model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first one. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1. Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2. Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested through a fine-grained analysis of the verbal protocols students generate as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
&#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
* Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
&#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
* Long-term retention: One problem on the students’ regular mid-term exam was similar to the training problems. Because this exam occurred a week after the training, which took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
* Near and far transfer: After training, students completed their regular homework problems using Andes. Students did the problems whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
* Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic-field homework were analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8760</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8760"/>
		<updated>2009-01-22T17:37:09Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Soniya Gadgil (Pitt), Timothy Nokes (Pitt) &amp;lt;Br&amp;gt; &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || Robert Shelby&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || Sept. 1, 2008&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || Aug. 31, 2009&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || United States Naval Academy (USNA)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Physics&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || &#039;&#039;N&#039;&#039; = 72&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 144 hrs.&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || Anticipated: -----&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms as participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and traditional instruction as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
==== Collaborative Learning ====&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of this research has focused on identifying the conditions that underlie successful collaboration. For example, the presence of cognitive conflict is an important variable in collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than students who shared the same misconception or held none. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the tutor and the other the tutee, and then they switch roles) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide learners with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
==== Learning Mechanisms Underlying Collaboration ====&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group, with less concern for the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners share the responsibility of sense-making equally: collaborators extend each other’s ideas and jointly work toward a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for his or her own learning. Data from physics problem solving by undergraduates showed that all three mechanisms are at play in collaborative problem solving. However, the first two are beneficial to both partners, whereas the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work, we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve collaboration. &lt;br /&gt;
&lt;br /&gt;
==== Schema Acquisition and Analogical Comparison ====&lt;br /&gt;
A problem schema is an organization of the knowledge associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as procedural knowledge of how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates by aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between the two examples but preserves the common elements. The resulting knowledge organization typically contains fewer superficial features (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when neither example is initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples, students can focus on their common causal structure and improve their learning of the underlying concept. Kurtz et al. (2001) showed that students learning about heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including: increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; van Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve rotational kinematics problems in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the self-explanation and analogical comparison groups outperformed the read-only control on the far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., in goals, concepts, and solution procedures). Prompts, in the form of questions, were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps, and they generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem only (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* &#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* &#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since the exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students could do them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic fields homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
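The assistance scores mentioned above are not defined in this write-up; in DataShop-style analyses of tutor logs, an assistance score is commonly the number of hint requests plus the number of incorrect attempts on a problem. A minimal sketch of that computation (the transaction format here is hypothetical, not the actual Andes log schema):&lt;br /&gt;

```python
# Hypothetical sketch: compute an assistance score from the list of
# transaction outcomes logged for one homework problem. Assumes the
# common definition: hints requested plus incorrect attempts.
def assistance_score(outcomes):
    """outcomes: e.g. ["HINT", "INCORRECT", "CORRECT"]."""
    return sum(1 for o in outcomes if o in ("HINT", "INCORRECT"))

print(assistance_score(["HINT", "INCORRECT", "CORRECT"]))  # prints 2
```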
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8759</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8759"/>
		<updated>2009-01-22T17:34:46Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, not much work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, and of prompts that focus on single examples (non-comparison), to a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than students who shared the same misconception or had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer but with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and they then switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students could focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Does analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., in goals, concepts, and solution procedures). Prompts, in the form of questions, were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps, and they generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem only (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engaged in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively. &lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* &#039;&#039;&#039;Normal post-test&#039;&#039;&#039;&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test. &lt;br /&gt;
* &#039;&#039;&#039;Robust learning&#039;&#039;&#039;&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since the exam occurred a week after the training, and the training took place in just under two hours, students’ performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students could do them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, which was followed in the course by a unit on magnetic fields. Log data from the magnetic fields homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
===Further Information===&lt;br /&gt;
==== References ====&lt;br /&gt;
==== Connections ====&lt;br /&gt;
==== Future Plans ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8758</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8758"/>
		<updated>2009-01-22T17:33:07Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, not much work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of adding prompts that encourage analogical comparison, and of prompts that focus on single examples (non-comparison), to a traditional instruction condition as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer. &lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads could be better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students holding different misconceptions collaborated, they were more likely to learn than students who shared the same misconception or had no misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000). &lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer but with specific instructions on conducting elaborative dialogues. Students in the elaborative dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor, the other tutee, and they then switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007). &lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components. &lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior work has focused on the outcome or product of the group and has been less concerned with the underlying processes that give rise to that product. Uncovering the cognitive processes underlying collaborative learning can further our understanding of how to improve collaborative learning environments. &lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making: collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the former two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. In addition, we examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration. &lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing two examples and noting their commonalities, students could focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990). &lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting. &lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Does analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent Variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solution procedures to one another, noting the similarities and differences between them (e.g., in goals, concepts, and solution procedures). Prompts, in the form of questions, were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked example. As in the compare condition, they were not given explanations of the steps, and they generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on a single problem only (e.g., “What is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided answers to those questions. &lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: The problem-solving condition served as a control: participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional problems for practice to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script that enhances learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test.&lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
==== References ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8757</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8757"/>
		<updated>2009-01-22T17:32:28Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000).&lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007).&lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components.&lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior research has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments.&lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making; collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration.&lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990).&lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting.&lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
===  Independent variables ===&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solutions to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: This condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
===Dependent variables===&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test.&lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
==== References ====&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8756</id>
		<title>Analogical Scaffolding in Collaborative Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Analogical_Scaffolding_in_Collaborative_Learning&amp;diff=8756"/>
		<updated>2009-01-22T17:29:03Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: New page: Analogical Scaffolding in Collaborative Learning  Abstract Past research has shown that collaboration can enhance learning in certain conditions. However, not much work has explored the co...&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Analogical Scaffolding in Collaborative Learning&lt;br /&gt;
&lt;br /&gt;
Abstract&lt;br /&gt;
Past research has shown that collaboration can enhance learning under certain conditions. However, little work has explored the cognitive mechanisms that underlie such learning. Hausmann, Chi, and Roy (2004) propose three such mechanisms: self-explaining, other-directed explaining, and co-construction. In the current study, we will examine the use of these mechanisms when participants learn from worked examples across different collaborative contexts. We compare the effects of prompts that encourage analogical comparison, prompts that focus on single examples (non-comparison), and a traditional instruction condition, as students learn to solve physics problems in the domain of rotational kinematics. Students’ learning processes will be analyzed by examining their verbal protocols. Learning will be assessed via robust measures such as long-term retention and transfer.&lt;br /&gt;
&lt;br /&gt;
Background and Significance&lt;br /&gt;
Collaborative learning&lt;br /&gt;
Much research on collaborative learning has been conducted over the past few decades. The idea that two heads are better than one seems intuitive, and research has shown that when students learn in groups of two or more, they show better learning gains (at the group level) than when working alone. Much of the past research has focused on identifying the conditions that underlie successful collaboration. For example, we know that the presence of cognitive conflict is an important variable underlying collaboration. Schwartz, Neuman, and Biezuner (2000) showed that when students with misconceptions distinct from each other’s collaborated, they were more likely to learn than those with the same misconception, or without a misconception. Studies have also found that establishing common ground is an important factor in learning from collaboration (Clark, 2000).&lt;br /&gt;
We also know that scaffolding (or structuring) collaborative interaction is often critical for achieving effective learning gains (Palincsar &amp;amp; Brown, 1984; Hausmann, 2006; see Lin, 2001 for a review). For example, Hausmann (2006) conducted an experiment in which students solved a design problem in one of three conditions: individually, in collaboration with a peer, or in collaboration with a peer with specific instructions on conducting elaborative dialogues. Students in the elaborative-dialogues condition outperformed the individuals and the dyads who received no scaffolding. This is consistent with other results showing that scripted problem-solving activities (e.g., one participant plays the role of tutor and the other tutee, and then they switch) facilitate collaborative learning compared to individual or unscripted conditions (McLaren, Walker, Koedinger, Rummel, Spada, &amp;amp; Kalchman, 2007).&lt;br /&gt;
These results are typically explained in terms of sense-making processes: structured collaborative environments provide the learner with more opportunities to construct the relevant knowledge components.&lt;br /&gt;
&lt;br /&gt;
Learning Mechanisms Underlying Collaboration&lt;br /&gt;
Although much work has focused on improving learning through collaboration, little research has examined the cognitive processes underlying successful collaboration. Most of the prior research has focused on the outcome or product of the group, and less has been concerned with the underlying processes that give rise to that product. If we can uncover the cognitive processes underlying collaborative learning, we can further our understanding of how to improve collaborative learning environments.&lt;br /&gt;
Hausmann, Chi, and Roy (2004) have identified three mechanisms through which collaboration can work. The first is “other-directed explaining,” which occurs when one partner explains to the other how to solve a problem. The second is explanation through “co-construction,” in which both partners equally share the responsibility of sense-making; collaborators extend each other’s ideas and jointly work towards a common goal. The third mechanism is “self-explanation,” in which one partner engages in a knowledge-building activity for their own learning. Data from physics problem-solving by undergraduates showed that all three mechanisms are at play in collaborative problem-solving. However, the first two are beneficial to both partners, while the third benefits only the partner doing the self-explaining.&lt;br /&gt;
In the current work we aim to build upon this research by examining dyads’ verbal protocols for how they engage in collaboration and the degree to which they use each of these mechanisms. We also examine other cognitive factors that impact learning, including error-correction (Ohlsson, 1996), constructing a joint mental model (Clark, 2000), and schema acquisition (Gick &amp;amp; Holyoak, 1983). Finally, the current work extends previous research by systematically investigating the degree to which analogical comparisons improve successful collaboration.&lt;br /&gt;
&lt;br /&gt;
Schema Acquisition and Analogical Comparison&lt;br /&gt;
A problem schema is a knowledge organization of the information associated with a particular problem category. Problem schemas typically include declarative knowledge of principles, concepts, and formulae, as well as the procedural knowledge for how to apply that knowledge to solve a problem. Schemas have been hypothesized as the underlying knowledge organization of expert knowledge (Chase &amp;amp; Simon, 1973; Chi et al., 1981; Larkin et al., 1980). One way in which schemas can be acquired is through analogical comparison (Gick &amp;amp; Holyoak, 1983). Analogical comparison operates through aligning and mapping two example problem representations to one another and then extracting their commonalities (Gentner, 1983; Gick &amp;amp; Holyoak, 1983; Hummel &amp;amp; Holyoak, 2003). This process discards the elements of the knowledge representation that do not overlap between two examples but preserves the common elements. The resulting knowledge organization typically consists of fewer superficial similarities (than the examples) but retains the deep causal structure of the problems. &lt;br /&gt;
Research on analogy and schema learning has shown that the acquisition of schematic knowledge promotes flexible transfer to novel problems. Many researchers have found a positive relationship between the quality of the abstracted schema and transfer to a novel problem that is an instance of that schema (Catrambone &amp;amp; Holyoak, 1989; Gick &amp;amp; Holyoak, 1983; Novick &amp;amp; Holyoak, 1991). For example, Gick and Holyoak (1983) found that transfer of a solution procedure was greater when participants’ schemas contained more relevant structural features. Analogical comparison has also been shown to improve learning even when both examples are not initially well understood (Kurtz, Miao, &amp;amp; Gentner, 2001; Gentner, Loewenstein, &amp;amp; Thompson, 2003). By comparing the commonalities between two examples, students can focus on the causal structure and improve their learning of the concept. Kurtz et al. (2001) showed that students learning about the concept of heat transfer learned more when comparing examples than when studying each example separately.&lt;br /&gt;
Several factors have been shown to improve schema acquisition, including increasing the number of examples (Gick &amp;amp; Holyoak, 1983), increasing the variability of the examples (Chen, 1999; Paas &amp;amp; Merrienboer, 1994), using instructions that focus the learner on structural commonalities (Cummins, 1992; Gentner et al., 2003), focusing the learner on the subgoals of the problems (Catrambone, 1996, 1998), and using examples that minimize students’ cognitive load (Ward &amp;amp; Sweller, 1990).&lt;br /&gt;
An ongoing project by Nokes and VanLehn in the Physics LearnLab explores how students’ learning and understanding of the conceptual relations between principles and examples can be facilitated (Nokes &amp;amp; VanLehn, 2008). Students in this research learned to solve problems on rotational kinematics in one of three conditions: reading worked examples, self-explaining worked examples, or engaging in analogical comparison of worked examples. Preliminary results showed that the groups that self-explained and engaged in analogical comparison outperformed the read-only control on far transfer tests. Our current project builds upon these results by applying them in a collaborative setting.&lt;br /&gt;
In summary, prior work has shown that analogical comparison can facilitate schema abstraction and transfer of that knowledge to new problems. However, this work has not examined whether analogical scaffolding can lead to effective collaboration. The current work examines how analogical comparison may help students collaborate effectively. We hypothesize that analogical prompts will facilitate not only analogical learning, but also other learning mechanisms such as explanation, co-construction, and error-correction.&lt;br /&gt;
&lt;br /&gt;
Research questions&lt;br /&gt;
1. How can analogical comparison help students collaborate effectively?&lt;br /&gt;
2. Can analogical comparison also facilitate other learning mechanisms, such as explanation, co-construction, and error-correction, during collaboration?&lt;br /&gt;
&lt;br /&gt;
Independent variables&lt;br /&gt;
The only independent variable was Experimental Condition. There were three conditions: Compare, Non-compare, and Problem-solving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Compare Condition: Participants in this condition first read through and explained two worked examples. The worked examples did not include explanations of the solution steps, and students were encouraged to generate the explanations and justifications for each step of the problem. They then performed the analogical comparison task, in which they were told to explicitly compare each part of the two solutions to one another, noting the similarities and differences between them (e.g., goals, concepts, and solution procedures). Prompts in the form of questions were provided to guide them through this process. After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers.&lt;br /&gt;
&lt;br /&gt;
Non-Compare Condition: Participants in this condition first read through a worked-out example. As in the compare condition, they were not given explanations of the steps and generated the explanations while working collaboratively. After reading through and explaining the first example, they answered questions designed to prompt them to explain the worked example. These prompts were equivalent to the comparison prompts; however, they focused on only a single problem (e.g., “what is the goal of this problem?”). After a fixed amount of time, they were given model answers to the questions and asked to check them against their own answers. They were then given a second worked example isomorphic to the first. Again, students studied the example and generated explanations. They then answered questions based on the second worked example. After a fixed amount of time, they were provided the answers to those questions.&lt;br /&gt;
&lt;br /&gt;
Problem-Solving Condition: This condition served as a control; participants collaborated to solve problems without any scaffolding. Students in this condition received the same worked examples as the two experimental groups, but without any prompts to guide them through the problem-solving process. They were given additional practice problems to equate time on task with the other two conditions.&lt;br /&gt;
&lt;br /&gt;
Hypotheses&lt;br /&gt;
The following hypotheses are tested in the experiment:&lt;br /&gt;
&lt;br /&gt;
1.	Analogical scaffolding will serve as a script to enhance learning via collaboration; therefore, students in the compare condition will outperform students in the other two conditions. Students in the compare and non-compare conditions will both outperform students in the control condition.&lt;br /&gt;
&lt;br /&gt;
2.	Students’ learning gains will differ by the kinds of learning processes they engage in. Specifically, students engaging in self-explaining, other-directed explaining, and co-construction will show differential learning gains. This is an exploratory hypothesis and will be tested by undertaking a fine-grained analysis of the verbal protocols generated by students as they solve problems collaboratively.&lt;br /&gt;
&lt;br /&gt;
Dependent variables&lt;br /&gt;
* Normal post-test&lt;br /&gt;
** Near transfer, immediate: After training, students were given a post-test that assessed their learning on various measures. Specifically, five kinds of questions were included in the post-test.&lt;br /&gt;
* Robust learning&lt;br /&gt;
** Long-term retention: On the students’ regular mid-term exam, one problem was similar to the training problems. Since this exam occurred a week after the training, and the training took just under two hours, performance on this problem is considered a test of long-term retention.&lt;br /&gt;
** Near and far transfer: After training, students did their regular homework problems using Andes. Students did them whenever they wanted, but most completed them just before the exam. The homework problems were divided based on their similarity to the training problems, and assistance scores were calculated.&lt;br /&gt;
** Accelerated future learning: The training was on electric fields, and it was followed in the course by a unit on magnetic fields. Log data from the magnetic field homework was analyzed as a measure of accelerated future learning.&lt;br /&gt;
&lt;br /&gt;
References&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8755</id>
		<title>Physics</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Physics&amp;diff=8755"/>
		<updated>2009-01-22T17:28:24Z</updated>

		<summary type="html">&lt;p&gt;130.49.138.225: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Physics LearnLab Course =&lt;br /&gt;
&lt;br /&gt;
The Physics LearnLab Course (PLLC) is a research facility for studying how students learn introductory physics.  It provides baseline data on student activities throughout the physics course, and it hosts specific research studies that measure the improvement in students’ learning caused by changes in the instruction.  At this time, it is sited in the two-semester Introductory Physics courses at the US Naval Academy in Annapolis, MD and three courses at Watchung Hills Regional High School in Warren, NJ.&lt;br /&gt;
&lt;br /&gt;
In order to increase the number of LearnLab sites, it is &lt;br /&gt;
essential that we increase the number of students using Andes.&lt;br /&gt;
During 2009, we plan to create a completely new web-based user interface.  This will allow us to integrate Andes into [http://www.webassign.net WebAssign], the leading commercial provider of online physics homework, making Andes easily accessible to over a hundred thousand students.&lt;br /&gt;
&lt;br /&gt;
Students in PLLC classes use the [http://www.andestutor.org Andes] intelligent tutoring system to do their homework.  [http://www.andestutor.org Andes] allows the PLLC to collect fine-grained data on student activity through the entire semester.  The remainder of the course is taught the usual way, with lectures, labs, and a commercial paper-based textbook.  &#039;&#039;In vivo&#039;&#039; experiments take place either by modifying Andes or by running studies during lab sessions that instructors have “donated” to the PLLC.  &lt;br /&gt;
&lt;br /&gt;
== Studies Conducted ==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: center;&amp;quot;&lt;br /&gt;
|+ &#039;&#039;&#039;Summary of Studies&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
!&lt;br /&gt;
! colspan=2 | &#039;&#039;In Vivo&#039;&#039; &lt;br /&gt;
! colspan=2 | Pull Out &lt;br /&gt;
! colspan=2 | Lab &lt;br /&gt;
! colspan=4 | Capacity&lt;br /&gt;
|-&lt;br /&gt;
! Course || Run || Planned || Run || Planned || Run || Planned&lt;br /&gt;
! Total # Sections&lt;br /&gt;
! Total # Students&lt;br /&gt;
! Max # Studies / Year&lt;br /&gt;
! Max # Students / Study&lt;br /&gt;
|-&lt;br /&gt;
| Physics || 10 || 2 || 0 || 0 || 3 || 1 || 5 || 130 || 4 || 65&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Capacity was determined by counting the number of students who solved more than 40 [[Andes]] problems in Fall 2007.  There are about 25 students in a section and each LearnLab site has about 65 students.&lt;br /&gt;
&lt;br /&gt;
Completed studies:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg_Examples-as-Help | Scaffolding Problem Solving with Embedded Examples to Promote Deep Learning (Ringenberg &amp;amp; VanLehn, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Diss|The effects of elaborative dialog on problem solving and learning (Hausmann &amp;amp; Chi, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Post-practice reflection (Katz)|Post-practice reflection (Katz &amp;amp; Connelly, 2005)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_questions|Deep-level questions during example studying (Craig &amp;amp; Chi, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study|Does it matter who generates the explanations? (Hausmann &amp;amp; VanLehn, 2006)]]&lt;br /&gt;
&lt;br /&gt;
*[[Craig_observing|Learning from Problem Solving while Observing Worked Examples (Craig, Gadgil, &amp;amp; Chi, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Reflective Dialogues (Katz)|Reflective Dialogues (Katz, Connelly &amp;amp; Treacy, 2006-2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Hausmann_Study2|The effects of interaction on robust learning (Hausmann &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Bridging_Principles_and_Examples_through_Analogy_and_Explanation | Bridging Principles and Examples through Analogy and Explanation (Nokes &amp;amp; VanLehn, 2007)]]&lt;br /&gt;
&lt;br /&gt;
*[[Extending Reflective Dialogue Support (Katz &amp;amp; Connelly)|Extending Automated Dialogue Support for Robust Learning of Physics (Katz &amp;amp; Connelly, 2007-2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Plateau_study|The Interaction Plateau: A comparison between human tutoring, Andes, and computer-aided instruction (Hausmann, van de Sande, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Self-explanation: Meta-cognitive vs. justification prompts|Self-explanation: Meta-cognitive vs. justification prompts (Hausmann, van de Sande, Gershman, &amp;amp; VanLehn, 2008)]]&lt;br /&gt;
&lt;br /&gt;
*[[Analogical Scaffolding in Collaborative Learning|Analogical Scaffolding in Collaborative Learning (Gadgil &amp;amp; Nokes, 2008-2009)]]&lt;br /&gt;
&lt;br /&gt;
In progress or planned:&lt;br /&gt;
&lt;br /&gt;
*[[Ringenberg Ill-Defined Physics|Does Solving Ill-Defined Physics Problems Elicit More Learning than Conventional Problem Solving? (Ringenberg &amp;amp; VanLehn)]]&lt;br /&gt;
&lt;br /&gt;
*Comparing two homework systems (Sophie Gershman, 2008-2009).&lt;br /&gt;
&lt;br /&gt;
*Nokes and Gadgil lab study.&lt;br /&gt;
&lt;br /&gt;
==Achievements==&lt;br /&gt;
&lt;br /&gt;
From its inception in January 2005 to the present, we have achieved the following: &lt;br /&gt;
&lt;br /&gt;
===Content development milestones===&lt;br /&gt;
* The percentage of Andes problems assigned by instructors at the Naval Academy has increased from 58% to 100% in the Fall semester, and from 42% to 75% in the Spring semester. &lt;br /&gt;
* We have increased the total number of  working Andes problems from 350 to 556.&lt;br /&gt;
* The number of physics principles has increased from 126 to 219.  The number of  rules in the physics “Knowledge Base” (the AI system) has increased from 619 to 915.  The number of scalar quantities defined in Andes has increased from 85 to 126.  &lt;br /&gt;
* We shot videos of problems being solved—at least one per problem set—and revised many of the older videos.  These act as worked examples.  Students who view the videos in a problem set before solving any problems have a much easier time of it. &lt;br /&gt;
&lt;br /&gt;
===Enabling Technologies===&lt;br /&gt;
* We developed a way to run Andes under [http://www.cmu.edu/oli/ OLI].  In particular, we found ways to get Andes and OLI to communicate through the USNA firewall, to upload log data and solution files, and to recover gracefully from most crashes. &lt;br /&gt;
* We developed a method to control the data that the OLI gradebook exports to spreadsheets so that only the data that instructors wanted was exported in a format they specified.&lt;br /&gt;
* We implemented “gating,” a method to force students to solve Andes problems in a pre-determined order.  This was needed for the Sandy Katz experiment in fall 2006.&lt;br /&gt;
* Andes raw logs can now be converted to the [http://learnlab.web.cmu.edu/ DataShop] format at the [[knowledge component]] level (June 2007).  The knowledge components associated with each correct student action (corresponding to a [[step]]) and most incorrect actions (see [[transaction]]) are determined by [[Andes]].&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The Andes log files represent a rich source of information about student problem solving, but they have not been studied in depth outside the needs of specific experiments.  We have begun to study the log files and to promote such work in the Physics Education Research (PER) community.&lt;br /&gt;
* Studied time usage (how long does it take to apply a KC?) and time-on-task (are students really working?).  Investigated whether time-on-task could be used as a metric for student learning of KCs. &lt;br /&gt;
* Begun comparing log data to end-of-semester surveys administered at the USNA.  The surveys were not anonymous, so individual survey results can be matched with the associated log files.&lt;br /&gt;
* Conducted a [http://www.andestutor.org/AAPT-2007/ workshop on log file analysis] at [http://web.phys.ksu.edu/perc2007/ PERC 2007].  Two senior members of the PER community, Joe Redish and Gerd Kortemeyer, attended, expressed initial interest and corresponded with us after the conference, but no firm plans have been made.&lt;br /&gt;
&lt;br /&gt;
===Adoption of Andes=== &lt;br /&gt;
As of Fall 2008, [[Andes]] is being used at the following institutions:&lt;br /&gt;
* St. Anselm college, Manchester NH (1 instructor).&lt;br /&gt;
* US Naval Academy (1 instructor, several sections).&lt;br /&gt;
* SUNY Fredonia (1 instructor).&lt;br /&gt;
* Gannon University, Erie PA (1 instructor, several sections).&lt;br /&gt;
* Conant High School, Hoffman Estates, IL (1 instructor).&lt;br /&gt;
* Watchung Hills Regional High School, Warren NJ (2 instructors, several sections).&lt;br /&gt;
We see a shift in usage relative to previous years.  Currently, the Naval Academy &lt;br /&gt;
accounts for only 25% of our users; 32% of our users now come from high schools.&lt;br /&gt;
&lt;br /&gt;
We observe steadily growing use of Andes by individuals not enrolled in any [http://www.cmu.edu/oli/courses/physics/ OLI course].  From January to April 2008, between 90 and 278 different users (some use is anonymous, precluding an exact count) solved a total of 1647 Andes problems.  The previous semester, a total of 1260 problems were solved.&lt;br /&gt;
&lt;br /&gt;
===Advertising Andes in the physics community=== &lt;br /&gt;
We have focused our efforts on meetings of the [http://www.aapt.org American Association of Physics Teachers (AAPT)] and the [http://www.aps.org/meetings/ American Physical Society (APS)] where we have presented numerous talks, posters, and a workshop.  &lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor for Introductory Physics Homework. Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.    Andes: An Intelligent Tutor Homework System for Introductory Physics. Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2006 2006 AAPT Summer Meeting],&#039;&#039; Syracuse NY, July 2006.&lt;br /&gt;
* B. van de Sande, R. Hausmann, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Andes: An Intelligent Homework System for Introductory Physics. Contributed talk at the [http://www.aapt.org/Events/wm2007 &#039;&#039;2007 AAPT Winter Meeting&#039;&#039;], Seattle WA, January 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; K. VanLehn.  Changing Student Attitudes using Andes, An Intelligent Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2007 2007 AAPT Winter Meeting],&#039;&#039; Seattle WA, January 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.A21.10 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An intelligent homework helper].  Contributed talk at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* [http://meetings.aps.org/link/BAPS.2007.MAR.K1.199 B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Changing Student Attitudes using Andes, An Intelligent Homework System.]  Poster at the [http://meetings.aps.org/Meeting/MAR07 &#039;&#039;2007 APS March Meeting&#039;&#039;], Denver CO, March 2007.&lt;br /&gt;
* B. van de Sande &amp;amp; R. Hausmann.  An Analysis of Student Learning Using the Andes Homework System.  Contributed talk at the [http://www.aapt.org/Events/sm2007 &#039;&#039;2007 AAPT Summer Meeting&#039;&#039;], Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, K. VanLehn, &amp;amp; M. Wintersgill.  Andes: An Intelligent Tutor Homework System.  Poster at the &#039;&#039;[http://www.aapt.org/Events/sm2007 2007 AAPT Summer Meeting],&#039;&#039; Greensboro NC, July 2007.&lt;br /&gt;
* B. van de Sande &amp;amp; K. VanLehn. Cognitive Analysis of Student Learning Using LearnLab.  Workshop presented at the [http://web.phys.ksu.edu/perc2007/ &#039;&#039;Physics Education Research Conference&#039;&#039;], Greensboro NC, August 2007.  [http://www.andestutor.org/AAPT-2007/ Workshop website].&lt;br /&gt;
* S. Katz &amp;amp; J. Connelly.  Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes.  Poster presented at the &#039;&#039;[http://web.phys.ksu.edu/perc2007/ Physics Education Research Conference],&#039;&#039; Greensboro NC, August 2007.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039;  Baltimore MD, January 2008.&lt;br /&gt;
* B. van de Sande, R. Shelby, D. Treacy, &amp;amp; M. Wintersgill.  Student attitudes towards Andes, an intelligent tutor homework system.  Poster at the &#039;&#039;[http://www.aapt.org/Events/wm2008 2008 AAPT Winter Meeting],&#039;&#039; Baltimore MD, January 2008.&lt;br /&gt;
* [http://meetings.aps.org/Meeting/OSS08/Event/86059 B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?]  Contributed talk at &#039;&#039;[http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],&#039;&#039;  Youngstown OH, March 2008.&lt;br /&gt;
* B. van de Sande, &amp;amp; R. Hausmann.  Does an intelligent tutor homework system encourage beneficial collaboration?  Contributed talk at the &#039;&#039;Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT),&#039;&#039; Lock Haven PA, April 2008.&lt;br /&gt;
These meetings generally do not publish proceedings.&lt;br /&gt;
 &lt;br /&gt;
More recently, we have begun promoting the Physics LearnLab at regional [http://www.aapt.org AAPT] meetings:&lt;br /&gt;
* [http://www.ysu.edu/osaps2008/ Touring the Electromagnetic Spectrum (OSAPS 2008)],  Youngstown OH, March 2008.  Vendor exhibit.&lt;br /&gt;
* Central Pennsylvania Section of the American Association of Physics Teachers (CPS/AAPT), Lock Haven PA, April 2008.  Vendor exhibit.&lt;br /&gt;
* Fall meeting of the Arizona section of the AAPT, October 2008.  Workshop for instructors.&lt;br /&gt;
In addition, we have presented Andes at other universities:  Southern Methodist University (2006), the Ohio State University (2007), Rutgers University (2007), US Air Force Academy (2007), and the US Naval Academy (2007).&lt;br /&gt;
&lt;br /&gt;
===Publications on Andes===&lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., and Wintersgill, M.  (2005). The Andes Physics Tutoring System: Lessons Learned.  &#039;&#039;International Journal of Artificial Intelligence and Education,&#039;&#039; 15 (3), 1-47. &lt;br /&gt;
* VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R. H., Taylor, L., Treacy, D. J., Weinstein, A., and Wintersgill, M. C.  (2005). The Andes physics tutoring system: Five years of evaluations.  In G. McCalla, C. K. Looi, B. Bredeweg &amp;amp; J. Breuker (Eds.), &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  (pp. 678-685). Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* VanLehn, K., &amp;amp; van de Sande, B.  (in press) Expertise in elementary physics, and how to acquire it. In K. A. Ericsson (Ed.), &#039;&#039;Development of professional expertise:  Toward measurement of expert performance and design of optimal learning environments.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===Publications on PLLC experiments===&lt;br /&gt;
* Connelly, J. &amp;amp; Katz, S. (2006).  Intelligent dialogue support for physics problem solving:  Some preliminary mixed results.  &#039;&#039;Technology, Instruction, Cognition, and Learning,&#039;&#039; 4, 1-29.&lt;br /&gt;
* Ringenberg, M. &amp;amp; VanLehn, K. (2006). Scaffolding problem solving with annotated, worked-out examples to promote deep learning. In K. Ashley &amp;amp; M. Ikeda (Eds.), &#039;&#039;Intelligent Tutoring Systems: 8th International Conference, ITS2006.&#039;&#039; pp. 625-634. Amsterdam: IOS Press.  &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K. (2007) The impact of explicit strategy instruction on problem-solving behaviors across intelligent tutoring systems. In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp. 167-172 Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Domain-specific and domain-independent interactive behaviors in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 548-550. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Porting an intelligent tutoring system across domains. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 551-553.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Chi, Min &amp;amp; VanLehn, K.  (2007) Accelerated future learning via explicit instruction of a problem solving strategy. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 409-416.  Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* Craig, S. D., VanLehn, K., Gadgil, S., &amp;amp; Chi, M. T. H. (2007). Learning from collaboratively observing videos during problem solving with Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039;  pp. 554-556. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Explaining self-explaining:  A contrast between content and generation.  In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp. 417-424. Amsterdam, Netherlands: IOS Press. &lt;br /&gt;
* Hausmann, R. G. M. &amp;amp; VanLehn, K. (2007).  Self-explaining in the classroom:  Learning curve evidence.  In D. McNamara &amp;amp; G. Trafton (Eds.) &#039;&#039;Proceedings of the 29th Annual Conference of the Cognitive Science Society.&#039;&#039; pp. 1067-1072. Mahwah, NJ: Erlbaum.&lt;br /&gt;
* Katz, S., Connelly, J., &amp;amp; Wilson, C. (2007). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.), &#039;&#039;Artificial Intelligence in Education 2007&#039;&#039;.&lt;br /&gt;
* Nwaigwe, A., Koedinger, K.,VanLehn, K., Hausmann, R. G. M. &amp;amp; Weinstein, A.  (2007) Exploring alternative methods for error attribution in learning curves analyses in intelligent tutoring systems. In R. Luckin, K. R. Koedinger &amp;amp; J. Greer (Eds.)  &#039;&#039;Artificial Intelligence in Education.&#039;&#039; pp 246-253. Amsterdam, Netherlands: IOS Press.&lt;br /&gt;
* VanLehn, K., Koedinger, K., Skogsholm, A., Nwaigwe, A., Hausmann, R.G.M., Weinstein, A. &amp;amp; Billings, B. (2007). What’s in a step?  Toward general, abstract representations of tutoring system log data.  In C. Conati &amp;amp; K. McCoy (eds).  &#039;&#039;Proceedings of User Modelling 2007.&#039;&#039; &lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, May). Trialog: How Peer Collaboration Helps Remediate Errors in an ITS. Paper presented at the 21st meeting of the International FLAIRS Conference, Coconut Grove, FL.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., &amp;amp; VanLehn, K. (2008, June). Shall we explain? Augmenting Learning from Intelligent Tutoring Systems and Peer Collaboration. Paper presented at the 9th meeting of the International Conference on Intelligent Tutoring Systems, Montréal, Canada.&lt;br /&gt;
* Hausmann, R. G. M., van de Sande, B., van de Sande, C., &amp;amp; VanLehn, K. (2008, June). Productive Dialog During Collaborative Problem Solving. Paper presented at the 2008 International Conference for the Learning Sciences, Utrecht, Netherlands.&lt;br /&gt;
&lt;br /&gt;
==Current Status==&lt;br /&gt;
&lt;br /&gt;
The PLLC at the US Naval Academy currently comprises 3-5 sections (depending on the semester) of 25 students each.  The sections are taught by Professors Mary Wintersgill and Ted McClanahan.  At Watchung Hills Regional High School, the instructors are Sophie Gershman and Brian Brown, who teach three different levels of physics courses, mostly for juniors and seniors.&lt;br /&gt;
The students use [http://www.cmu.edu/oli/ Open Learning Initiative (OLI)] to access [[Andes]], and the instructors use OLI to view gradebooks.  Both high school and college students use Andes at home to do their regular homework assignments.  Occasionally, Andes is used in class, but such “seat work” is not common.&lt;br /&gt;
&lt;br /&gt;
Raw log data from Andes is stored on OLI servers.  The raw data is periodically converted to [http://learnlab.web.cmu.edu/ DataShop] format, but the conversion process is still not completely satisfactory, as some information is still available only from the raw log data.  Researchers thus refer to both types of data.&lt;br /&gt;
&lt;br /&gt;
All user identification is encrypted.  The mapping between encrypted identities and student names is held by the Andes development programmer, Anders Weinstein.  Instructors see only the students’ user identification before encryption; researchers see only the encrypted identities.  Non-log data, such as hard-copies of midterm exams or audio files from verbal protocols, are collected as needed for specific experiments.  They are anonymized by Anders Weinstein and stored in locked file cabinets or secure servers.   &lt;br /&gt;
&lt;br /&gt;
Although most experiments are in vivo experiments conducted in the PLLC courses, some studies are conventional lab studies.   For instance, an experimenter might first run a study in the lab with paid volunteers and later do an improved version of the study in one or more PLLC classes.&lt;br /&gt;
&lt;br /&gt;
==Plans==&lt;br /&gt;
&lt;br /&gt;
Our major goal continues to be to expand the number of sites and instructors involved in the PLLC.  There are simply not enough lab slots and students to meet the existing demand from PLLC experimenters.  In order to increase involvement in the PLLC, we first need to increase the number of instructors using Andes in their courses, and make their experience a positive one.&lt;br /&gt;
&lt;br /&gt;
===Increase awareness of Andes===  &lt;br /&gt;
We need to increase awareness of Andes in the physics community.&lt;br /&gt;
To date, we have focused our efforts on national meetings of&lt;br /&gt;
the AAPT and APS.  However, we plan to broaden our efforts:&lt;br /&gt;
* We have begun to promote Andes at regional AAPT meetings and hope to expand this effort in the future.  &lt;br /&gt;
* We plan to arrange a summer school targeted mainly at regional high-school teachers of physics.  Our long-term desire is for the summer school activity to eventually grow into a community of users consisting of both high school and college level instructors.&lt;br /&gt;
* Continue visiting physics departments at other universities.&lt;br /&gt;
* Publish PLLC-related research in the physics education journals.&lt;br /&gt;
&lt;br /&gt;
===Web-based delivery===&lt;br /&gt;
Andes currently runs on Microsoft Windows machines as a Windows executable, requiring a software download and installation before it can be run.  We lost at least two potential sites (Paul Perkins’ high school class in Bellevue WA and the US Air Force Academy) due to issues associated with this.  In both cases, instructors were enthusiastic about Andes and assigned it to their students, but a significant number of students had trouble installing the software or getting it to run reliably, or did not have Microsoft Windows available to them.  We believe we are losing many other potential clients due to this architecture.  Thus, we have begun developing a new web-based user interface to allow delivery of Andes as a true web application.&lt;br /&gt;
&lt;br /&gt;
===Improvements to Andes itself===&lt;br /&gt;
Based on conversations with potential instructors as they view demonstrations of Andes and on instructors who have dropped Andes after using it, we have identified several aspects of Andes itself that we need to improve:&lt;br /&gt;
* Instructors want a user interface that appears simple to learn.  The new version of the user interface that we are developing will have a very simple design, similar to a generic drawing program (like PowerPoint).&lt;br /&gt;
* Instructors want Andes to be a commercial product.  In particular, they worry about the long-term stability of the software and that user support may be sporadic or unprofessional.  The new web-based user interface will allow us to deliver Andes via our partners, [http://www.webassign.net WebAssign] and [http://www.lon-capa.org LON-CAPA].  Furthermore, we plan to offer Andes under an open-source license, to ensure long-term availability and allow others to contribute to its future development.&lt;br /&gt;
* Instructors want all reasonable student actions to be accepted.   The new user interface will feature free text input, allowing greater flexibility.&lt;br /&gt;
* Instructors want good, effective hints.  We plan to make instructor evaluations of hint sequences an integral part of future workshops and summer schools.  However, to really improve the hint quality would require that Andes maintain a model of the student across problems.  This is one aspect of expert human tutoring that we can&#039;t capture with the existing system.&lt;br /&gt;
* Other improvement requests that we hear regularly:&lt;br /&gt;
** Allow sensitivity to lengths of vectors.&lt;br /&gt;
** Allow vector equations (currently, Andes equations are all scalar).&lt;br /&gt;
** Instructor control over policy for student actions that are correct but don&#039;t contribute to a solution.&lt;br /&gt;
&lt;br /&gt;
===Grading policy===&lt;br /&gt;
Unfortunately, the current grading rubric is opaque and complicated, and we are not always happy with the validity of the scores.&lt;br /&gt;
There are two problems:&lt;br /&gt;
* We don’t have any mechanism for an instructor to understand or modify the scoring rubric.&lt;br /&gt;
* Some students become focused on raising their scores and, due to various weaknesses of (or incorrect inferences about) the scoring rubric, engage in behaviors that may raise their scores but do not constitute good problem-solving practice.  For instance, a student will put in the final answer to a problem, and then go back and add problem-solving steps until their score is acceptably high.&lt;br /&gt;
Since one of the main goals of a grading policy is to encourage students to engage in productive problem solving behavior, any changes to the grading policy must be accompanied by log file analysis.&lt;br /&gt;
&lt;br /&gt;
===Supporting existing Andes users===   &lt;br /&gt;
There are a number of non-PLLC instructors using Andes in their classrooms as well as a number of users not affiliated with any OLI course.&lt;br /&gt;
* Provide instructor support for setting up and running classes, and user support for difficulties installing and running Andes.&lt;br /&gt;
* Add instructor-requested homework problems.  We will continue our policy of adding new content based on instructor requests.&lt;br /&gt;
* Add instructor-requested problem types (such as graph drawing).&lt;br /&gt;
* Fix instructor-reported bugs and complaints promptly.  In particular, Andes sometimes gives hint sequences that are not helpful.  Also, it sometimes won&#039;t accept solution steps that instructors would allow.&lt;br /&gt;
* Develop log file analysis to detect ineffective hint sequences, common student difficulties, and plain old bugs.&lt;br /&gt;
* Eventually, hold some instructor workshops for existing instructors, so that they feel part of the Andes development process and connect with other Andes users.&lt;br /&gt;
&lt;br /&gt;
===Log file analysis===&lt;br /&gt;
The [[knowledge component]]s (KCs) used by Andes generally do not produce the nice learning curves that one would expect, which makes it problematic for experimenters to use them as dependent measures.  We suspect that the present physics KCs implicitly combine a principle itself with the knowledge needed to apply that principle within a particular problem context.  Thus, when a KC that has been practiced several times in simple problems is used for the first time in a complex problem, the associated assistance score may be higher than expected.  In fact, it is common practice in physics homework assignments to exercise students in applying physics principles in widely varying problem contexts.  As the problem context varies, then, the difficulty of applying our present KCs varies widely, resulting in widely varying assistance scores.  We have been doing data mining to test this hypothesis, but this has been a back-burner activity and is moving slowly.&lt;br /&gt;
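The expectation of a “nice” learning curve can be made concrete.  The following is a minimal sketch, not part of the PLLC toolchain, that fits the standard power law E(n) = a * n^-b to invented per-opportunity error rates for a single KC; real values would come from DataShop-exported logs.&lt;br /&gt;

```python
# Hypothetical sketch: fit a power-law learning curve to per-opportunity
# error rates for one knowledge component (KC).  The data are invented
# for illustration, not taken from PLLC logs.
import math

# error_rate[i]: fraction of students whose first attempt at opportunity
# i+1 on this KC was incorrect (invented illustration data)
error_rate = [0.52, 0.38, 0.30, 0.26, 0.23, 0.21]

# A power-law learning curve has the form E(n) = a * n**(-b).  Taking logs
# gives log E = log a - b * log n, so ordinary least squares on
# (log n, log E) recovers both parameters.
xs = [math.log(n + 1) for n in range(len(error_rate))]
ys = [math.log(e) for e in error_rate]
k = len(xs)
mx, my = sum(xs) / k, sum(ys) / k
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
b = -slope                      # learning rate (positive if errors decline)
a = math.exp(my - slope * mx)   # estimated initial error rate

print(f"E(n) ~= {a:.2f} * n^-{b:.2f}")
```

Under the hypothesis above, a KC that implicitly bundles problem-context knowledge would show large deviations from such a fit, with error rates spiking whenever the problem context changes rather than declining smoothly.&lt;br /&gt;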
&lt;br /&gt;
Here are some continuing activities associated with log files: &lt;br /&gt;
* Download log files from OLI, anonymize them, and load them into the DataShop.&lt;br /&gt;
* Andes raw logs can be converted to the DataShop format, but the converted logs often do not have the right information in them for the kinds of analysis experimenters want to do, so the converter scripts must be changed.&lt;br /&gt;
* Finish investigating whether time spent can be a useful metric of student learning.&lt;br /&gt;
* Continue investigating why the present KCs produce learning curves that do not match current theoretical predictions.&lt;br /&gt;
* Continue promoting log file analysis as an interesting area of research, especially for those interested in developing cognitive models of student learning.&lt;/div&gt;</summary>
		<author><name>130.49.138.225</name></author>
	</entry>
</feed>