LearnLab wiki - User contributions feed for Katz (retrieved 2024-03-28). Post-practice reflection, revision 6782, 2008-01-13.
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow the successful completion of a quantitative problem and are aimed at helping students understand the concepts associated with that problem and develop abstract problem-solving schemas. Such schemas are a kind of [[knowledge component]] that, if acquired with high [[feature validity]], will help students solve similar (near-transfer) problems, and perhaps far-transfer problems as well.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue as incorporated in the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
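The answers the dialogue builds toward can be checked numerically. The following sketch is not part of Andes; it simply applies Newton's second law at the lowest point of the swing (T − mg = mv²/r) and solves for the speed at the breaking tension:

```python
import math

def max_speed(T_max, m, r, g=9.8):
    """Maximum speed at the lowest point before the rope breaks.

    At the lowest point the net upward (centripetal) force is
    T - m*g = m * v**2 / r, so v = sqrt((T - m*g) * r / m).
    """
    return math.sqrt((T_max - m * g) * r / m)

v = max_speed(100.0, 5.0, 2.0)        # original problem, about 4.5 m/s
v_long = max_speed(100.0, 5.0, 4.0)   # rope doubled: v grows by sqrt(2), not 2
v_heavy = max_speed(100.0, 10.0, 2.0) # mass doubled: v decreases
```

Doubling the rope length doubles v², so the speed rises only by a factor of √2, as the tutor explains; doubling the mass shrinks the force budget (T − mg) while increasing the required mv²/r, so the maximum speed drops, as the student answers.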
== References ==<br />
<br />
Katz, S., Allbritton, D., & Connelly, J. (2003). Going beyond the problem given: How human tutors use post-solution discussions to support transfer. International Journal of Artificial Intelligence in Education, 13 (1), 79-116.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes. In K. Koedinger & R. Luckin (Eds.), Proceedings of AI in Education 2007.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). An Evaluation of Reflective Dialogue in Andes. Poster presented at the Physics Education Research Conference (PERC 2007), Greensboro, NC.<br><br />
<br />
Lee, A. Y., & Hutchison, L. (1998). Improving learning from examples through reflection. Journal of Experimental Psychology: Applied, 4 (3), 187-210.<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Lessons learned. International Journal of Artificial Intelligence in Education, 15 (3).<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Five years of evaluations. In G. McCalla, C. K. Looi, B. Bredeweg, & J. Breuker (Eds.), Artificial Intelligence in Education (pp. 678-685). Amsterdam, Netherlands: IOS Press.<br><br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]<br />
[[Category:Instructional Principles]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6781Post-practice reflection2008-01-13T18:38:32Z<p>Katz: /* References */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
== References ==<br />
<br />
Katz, S., & Allbritton, D., & Connelly, J. (2003). Going beyond the problem given: How human tutors use post-solution discussions to support transfer. International Journal of Artificial Intelligence and Education, 13 (1), 79-116.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes. In K. Koedinger and R. Luckin (Eds.), Proceedings of AI in Education 2007.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). An Evaluation of Reflective Dialogue in Andes. Poster presented at the Physics Education Research Conference (PERC 2007), Greensboro, NC.<br><br />
<br />
Lee, A. Y., & Hutchison, L. (1998). Improving learning from examples through reflection. Journal of Experimental Psychology: Applied, 4 (3), 187-210.<br><br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Lessons learned. International Journal of Artificial Intelligence and Education, 15 (3).<br><br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K. Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Five years of evaluations. In G. McCalla, C. K. Looi, B. Bredeweg & J. Breuker (Eds.), Artificial Intelligence in Education (pp. 678-685). Amsterdam, Netherlands: IOS Press.<br><br><br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]<br />
[[Category:Instructional Principles]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6780Post-practice reflection2008-01-13T18:38:15Z<p>Katz: /* References */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
== References ==<br />
<br />
Katz, S., & Allbritton, D., & Connelly, J. (2003). Going beyond the problem given: How human tutors use post-solution discussions to support transfer. International Journal of Artificial Intelligence and Education, 13 (1), 79-116.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes. In K. Koedinger and R. Luckin (Eds.), Proceedings of AI in Education 2007.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). An Evaluation of Reflective Dialogue in Andes. Poster presented at the Physics Education Research Conference (PERC 2007), Greensboro, NC.<br><br />
<br />
Lee, A. Y., & Hutchison, L. (1998). Improving learning from examples through reflection. Journal of Experimental Psychology: Applied, 4 (3), 187-210.<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Lessons learned. International Journal of Artificial Intelligence and Education, 15 (3).<br><br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K. Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Five years of evaluations. In G. McCalla, C. K. Looi, B. Bredeweg & J. Breuker (Eds.), Artificial Intelligence in Education (pp. 678-685). Amsterdam, Netherlands: IOS Press.<br><br><br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]<br />
[[Category:Instructional Principles]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6779Post-practice reflection2008-01-13T18:37:27Z<p>Katz: /* References */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
== References ==<br />
<br />
Katz, S., & Allbritton, D., & Connelly, J. (2003). Going beyond the problem given: How human tutors use post-solution discussions to support transfer. International Journal of Artificial Intelligence and Education, 13 (1), 79-116.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes. In K. Koedinger and R. Luckin (Eds.), Proceedings of AI in Education 2007.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). An Evaluation of Reflective Dialogue in Andes. Poster presented at the Physics Education Research Conference (PERC 2007), Greensboro, NC.<br><br />
<br />
<br />
Lee, A. Y., & Hutchison, L. (1998). Improving learning from examples through reflection. Journal of Experimental Psychology: Applied, 4 (3), 187-210.<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Lessons learned. International Journal of Artificial Intelligence and Education, 15 (3).<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K. Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Five years of evaluations. In G. McCalla, C. K. Looi, B. Bredeweg & J. Breuker (Eds.), Artificial Intelligence in Education (pp. 678-685). Amsterdam, Netherlands: IOS Press.<br><br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]<br />
[[Category:Instructional Principles]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6778Post-practice reflection2008-01-13T18:36:50Z<p>Katz: </p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
== References ==<br />
<br />
Katz, S., & Allbritton, D., & Connelly, J. (2003). Going beyond the problem given: How human tutors use post-solution discussions to support transfer. International Journal of Artificial Intelligence and Education, 13 (1), 79-116.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). Out of the Lab and into the Classroom: An Evaluation of Reflective Dialogue in Andes. In K. Koedinger and R. Luckin (Eds.), Proceedings of AI in Education 2007.<br><br />
<br />
Katz, S., Connelly, J., & Wilson, C. (2007). An Evaluation of Reflective Dialogue in Andes. Poster presented at the Physics Education Research Conference (PERC 2007), Greensboro, NC.<br><br />
<br />
<br />
Lee, A. Y., & Hutchison, L. (1998). Improving learning from examples through reflection. Journal of Experimental Psychology: Applied, 4 (3), 187-210.<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K., Shapiro, J.A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Lessons learned. International Journal of Artificial Intelligence and Education, 15 (3).<br><br />
<br />
VanLehn, K., Lynch, C., Schulze, K. Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2005). The Andes physics tutoring system: Five years of evaluations. In G. McCalla, C. K. Looi, B. Bredeweg & J. Breuker (Eds.), Artificial Intelligence in Education (pp. 678-685). Amsterdam, Netherlands: IOS Press.<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]<br />
[[Category:Instructional Principles]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6777Post-practice reflection2008-01-13T18:24:11Z<p>Katz: /* Examples */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease...<br><br />
<br />
|}<br />
<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6776Post-practice reflection2008-01-13T18:23:40Z<p>Katz: /* Examples */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br><br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Correct. All else being the same, how would the velocity at the lowest point of the swing change if the length of the rope were doubled to 4.0 m? <br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would also double <br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Well, it would increase, but it would not double; recall that the formula for centripetal acceleration involves the square of the velocity. If the radius doubles, the square of the velocity would double also, to maintain the same centripetal acceleration.<br><br><br />
<br />
&nbsp; &nbsp; All else being the same, how would the velocity at the lowest point of the swing change if the mass of the ball were doubled to 10 kg?<br><br><br />
<br />
&nbsp; &nbsp; STUDENT: it would decrease<br><br />
<br />
|}<br />
<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6775Post-practice reflection2008-01-13T18:18:09Z<p>Katz: /* Examples */</p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
=== Examples ===<br />
<br />
Here is a sample Reflective Dialogue, incorporated within the Andes physics tutoring system (VanLehn et al., 2005):<br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of a Post-Practice Reflective Dialogue'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; <br>ANDES PROBLEM ROTS4A: A 5.00 kg ball is attached to a 2.00 m rope which will break if the tension exceeds 100 N. If the ball is made to swing in a vertical circle, what is the maximum velocity with which the ball can pass through the lowest point?<br><br><br />
<br />
&nbsp; &nbsp; TUTOR: Which major principle did you need to solve the problem? [Just the name, please]<br><br />
<br />
&nbsp; &nbsp; STUDENT: Newton's 2nd law<br><br />
<br />
&nbsp; &nbsp; TUTOR: Yes. The main difference between this problem and most of the previous N2L problems is the fact that the direction of acceleration is __________. (fill in the blank)<br> <br />
<br />
&nbsp; &nbsp; STUDENT: centripetal<br />
<br />
&nbsp; &nbsp; <br />
<br />
|}<br />
<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6774Post-practice reflection2008-01-13T18:14:45Z<p>Katz: </p>
<hr />
<div>== Brief statement of principle ==<br />
|}<br />
<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=6771Post-practice reflection2008-01-13T17:59:30Z<p>Katz: </p>
<hr />
<div>== Brief statement of principle ==<br />
Post-practice reflection involves activities that follow successful completion of a quantitative problem aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema. Such schema are a kind of [[knowledge component]] that if acquired with high [[feature validity]] will help students with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
Post-practice reflection activities often involve some kind of dialogue between the student and another agent (teacher, peer, or computer tutor).<br />
<br />
Here are the instructions to self-explain, taken from Chi et al. (1994):<br />
<br />
"We would like you to read each sentence out loud and then explain what it means to you. That is, what<br><br />
new information does each line provide for you, how does it relate to what you've already read, does it give<br><br />
you a new insight into your understanding of how the circulatory system works, or does it raise a question<br><br />
in your mind. Tell us whatever is going through your mind–even if it seems unimportant."<br><br />
<br />
These prompts were reworded to be used in Hausmann & VanLehn (2007):<br />
<br />
* What new information does each step provide for you?<br />
* How does it relate to what you've already seen?<br />
* Does it give you a new insight into your understanding of how to solve the problems?<br />
* Does it raise a question in your mind?<br />
<br />
These prompts were then included as text, just below a worked-out example. The example was presented as a video capture of the Andes interface, with a voice-over narration describing the user-interface actions (see Table below). In this example, the student is learning how to solve the following problem:<br />
<br />
<Blockquote>A charged particle is in a region where there is an electric field E of magnitude<br><br />
14.3 V/m at an angle of 22 degrees above the positive x-axis. If the charge on the particle<br><br />
is -7.9 C, find the magnitude of the force on the particle P due to the electric field E.</Blockquote><br />
<br />
<br><br />
<br />
{| cellspacing="0" cellpadding="5" border="1"<br />
|+ '''An example of prompting for self-explaining'''<br />
|-<br />
| style="border-bottom: 3px solid grey;" | <br />
&nbsp; &nbsp; Now that all the given information has been entered, we need to apply<br> our knowledge of physics to solve the problem.<br><br />
<br />
&nbsp; &nbsp; One way to start is to ask ourselves, “What quantity is the problem seeking?” <br> In this case, the answer is the magnitude of the force on the particle due to <br> the electric field.<br><br />
<br />
&nbsp; &nbsp; We know that there is an electric field. If there is an electric field, <br> and there is a charged particle located in that region, then we can infer <br> that there is an electric force on the particle. The direction of the <br> electric force is in the opposite direction as the electric field because <br> the charge on the particle is negative.<br />
<br />
&nbsp; &nbsp; We use the Force tool from the vector tool bar to draw the electric force. <br> This brings up a dialog box. The force is on the particle and it is due to some <br> unspecified source. We do know, however, that the type of force is electric, so <br> we choose “electric” from the pull-down menu. For the orientation, we need to <br> add 180 degrees to 22 degrees to get a force that is in a direction that is <br> opposite of the direction of the electric field. Therefore we put 202 degrees. <br> Finally, we use “Fe” to designate this as an electric force.<br />
<br />
<center>[ PROMPT ]</center><br />
<br />
&nbsp; &nbsp; Now that the direction of the electric force has been indicated, we can work on <br>finding the magnitude. We must choose a principle that relates the magnitude <br> of the electric force to the strength of the electric field, and the charge on the <br> particle. The definition of an electric field is the only equation that relates these <br> three variables. We write this equation in the equation window.<br />
<br />
<center>[ PROMPT ]</center><br />
<br />
|}<br />
Note. PROMPT = "Please begin your self-explanation."<br />
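The principle the narration points to is the definition of the electric field. Written out with the problem's givens (a worked check added here for the reader, not part of the original study materials):

```latex
\vec{F} = q\vec{E}
\qquad\Rightarrow\qquad
F = |q|\,E = (7.9\ \mathrm{C})(14.3\ \mathrm{V/m}) \approx 113\ \mathrm{N},
\qquad
\theta = 22^\circ + 180^\circ = 202^\circ .
```

The force points opposite to the field because the charge is negative, which is why the narration adds 180 degrees to the field's 22-degree orientation.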
<br />
<br />
[[Category:Glossary]]<br />
[[Category:Independent Variables]]<br />
[[Category:Interactive Communication]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6692Reflective Dialogues (Katz)2008-01-10T04:03:30Z<p>Katz: /* Future plans */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose two possible reasons for this difference: (1) in traditional physics courses, students are neither explicitly asked to plan nor given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we tested the effectiveness of engaging students in reflective dialogues after they solve problems in [[Andes]] that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we conducted a Physics LearnLab study in which we compared the effectiveness of an experimental version of Andes that engaged students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gave students additional problem-solving practice after they solved an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predicted that students in the explicit condition would outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that targeted far transfer. Consistent with prior research, the dialogues promoted conceptual understanding of physics. However, measures of retention and transfer to problem-solving ability showed only a marginal effect of completing the reflective dialogues.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through a good deal of practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-end analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success when tested mainly using high-achieving students (B or above in an introductory college physics course). We suggest that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tested the effectiveness of engaging students in reflective dialogues after they solve problems in Andes (e.g., Gertner & VanLehn, 2000) which explicitly targeted the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). Explicit instruction on these knowledge components proved to be effective and more efficient than giving students repeated practice in solving different types of problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test scores and pre-test to post-test gain scores on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum)<br />
* ''[[Long-term retention]]''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or control with no KCDs. Students in the control condition were assigned an additional five Andes problems to attempt to better equate time on task.<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
<br />
=== Hypothesis ===<br />
The main hypothesis tested was that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, would be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
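As a quick arithmetic check on the means reported above (added here for the reader; the mean raw gain equals the difference of the group means because averaging is linear):

```latex
\overline{\text{gain}} = \overline{\text{post}} - \overline{\text{pre}}:
\qquad 17.97 - 12.10 = 5.87 \ (\text{treatment}),
\qquad 15.57 - 11.77 = 3.80 \ (\text{control}).
```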
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed (''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
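The regression just described has the general shape sketched below. This is an illustrative sketch only: the data are synthetic and the coefficients are placeholders, so only the model form (post-test score regressed on pre-test score, QPA, KCDs completed, and target problems completed) matches the study.

```python
# Illustrative ordinary-least-squares sketch of the reported analysis.
# Synthetic data; variable names and effect sizes are assumptions, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n = 66  # approximate effective sample size in the study
pretest = rng.uniform(5, 20, n)
qpa = rng.uniform(2.0, 4.0, n)
kcds = rng.integers(0, 27, n)        # 26 KCDs were assigned
problems = rng.integers(0, 32, n)    # up to 31 target problems

# Synthetic outcome with noise, for demonstration only
posttest = (2.0 + 0.8 * pretest + 1.5 * qpa + 0.1 * kcds
            + 0.02 * problems + rng.normal(0, 1.0, n))

# Design matrix [intercept, predictors]; solve OLS via least squares
X = np.column_stack([np.ones(n), pretest, qpa, kcds, problems])
coef, _, _, _ = np.linalg.lstsq(X, posttest, rcond=None)

# R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((posttest - X @ coef) ** 2)
ss_tot = np.sum((posttest - posttest.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(coef, r_squared)
```

In the study itself the significance of each coefficient was assessed (only pre-test score, QPA, and KCD completion reached significance); a full analysis would add standard errors and ''p''-values to this sketch.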
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n'' = 47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However, ANOVAs showed no differences between groups (''F''s < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58, ''p'' < .00001) showed positive contributions of all factors but a significant effect of only QPA (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on problem-solving ability.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more that students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by-doing path than in our prior study, perhaps because this year instructors ''required'' them to do so (i.e., this time there were negative consequences for avoiding the dialogues).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions, and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, P., Hardiman, T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, M., and Chamot, A. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & Vanlehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for January 2008 - December 2008:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6691Reflective Dialogues (Katz)2008-01-10T04:01:15Z<p>Katz: /* Findings */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose two possible reasons for this difference: (1) in traditional physics courses, students are neither explicitly asked to plan nor given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we tested the effectiveness of engaging students in reflective dialogues after they solve problems in [[Andes]] that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we conducted a Physics LearnLab study in which we compared the effectiveness of an experimental version of Andes that engaged students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gave students additional problem-solving practice after they solved an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predicted that students in the explicit condition would outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that targeted far transfer. Consistent with prior research, the dialogues promoted conceptual understanding of physics. However, measures of retention and transfer to problem-solving ability showed only a marginal effect of completing the reflective dialogues.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through extensive practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-ends analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success when tested mainly using high-achieving students (B or above in an introductory college physics course). We suggest that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tested the effectiveness of engaging students in reflective dialogues after they solve problems in Andes (e.g., Gertner & VanLehn, 2000) which explicitly targeted the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). Explicit instruction on these knowledge components proved to be effective and more efficient than giving students repeated practice in solving different types of problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test scores, and pre-test to post-test gain scores, on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum).<br />
* ''[[Long-term retention]]''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or a control condition with no KCDs. Students in the control condition were assigned five additional Andes problems in order to better equate time on task.<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
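For concreteness, the regression just described can be sketched as an ordinary least-squares fit. The sketch below is illustrative only: the data are synthetic placeholders (the study's actual data reside in DataShop), and the categorical college-major variable is omitted for brevity.<br />

```python
import numpy as np

rng = np.random.default_rng(0)
n = 66  # subjects retained in the study's final analysis

# Synthetic placeholder predictors (NOT the study's data):
problems = rng.integers(0, 32, n)      # target problems completed
kcds     = rng.integers(0, 27, n)      # dialogues (KCDs) completed
qpa      = rng.uniform(2.0, 4.0, n)    # grade point average (CQPR)
pretest  = rng.integers(5, 20, n)      # pre-test score

# Simulated post-test scores with some dependence on the predictors.
posttest = 0.5 * pretest + 0.15 * kcds + 2.0 * qpa + rng.normal(0, 2.0, n)

# Design matrix with an intercept column; OLS via least squares.
X = np.column_stack([np.ones(n), problems, kcds, qpa, pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

# R^2 = 1 - SS_residual / SS_total
resid = posttest - X @ beta
ss_tot = (posttest - posttest.mean()) @ (posttest - posttest.mean())
r2 = 1 - (resid @ resid) / ss_tot
```

The real analysis would of course use the observed scores in place of the simulated ones; the structure of the model (post-test score regressed on the predictors listed above) is the same.<br />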
<br />
=== Hypothesis ===<br />
The main hypothesis tested was that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, would be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
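As a quick consistency check using only the group means reported above, the mean raw gain scores follow directly from the pre- and post-test means:<br />

```python
# Group means as reported above.
pre  = {"treatment": 12.10, "control": 11.77}
post = {"treatment": 17.97, "control": 15.57}

# Raw gain = post-test mean - pre-test mean.
gain = {g: round(post[g] - pre[g], 2) for g in pre}
# → treatment: 5.87, control: 3.80, matching the reported mean raw gains.
```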
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was<br />
still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed<br />
(''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
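The reported ''F'' statistic can be recovered, to rounding, from ''R''<SUP>2</SUP> and the degrees of freedom; the error df of 61 with 4 predictors also confirms the sample size (''n'' = 66):<br />

```python
# F = (R^2 / k) / ((1 - R^2) / (n - k - 1)), with the reported
# R^2 = .52, k = 4 predictors, and error df = 61 (so n = 66).
r2, k, df_err = 0.52, 4, 61
f = (r2 / k) / ((1 - r2) / df_err)
# ≈ 16.5, consistent with the reported F(4, 61) = 16.70 once the
# rounding of R^2 to two decimals is taken into account.
```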
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n'' = 47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with<br />
pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However,<br />
ANOVAs showed no differences between groups (''F''s < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58,<br />
''p'' < .00001) showed positive contributions of all factors but a significant effect of only QPA (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on<br />
problem-solving ability.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by-doing path than in our prior study, perhaps because this year instructors ''required'' them to do so (i.e., this time there were negative consequences for avoiding the dialogues).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions,<br />
and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also<br />
suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, W. J., Hardiman, P. T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, J. M., & Chamot, A. U. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & VanLehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for June 2007 - December 2007:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6690Reflective Dialogues (Katz)2008-01-10T03:55:56Z<p>Katz: /* Hypothesis */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose that this difference may be due to two reasons: (1) in traditional physics courses, students are not explicitly asked to plan or are given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we tested the effectiveness of engaging students in reflective dialogues after they solve problems in [[Andes]] that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we conducted a Physics LearnLab study in which we compared the effectiveness of an experimental version of Andes that engaged students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gave students additional problem-solving practice after they solved an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predicted that students in the explicit condition would outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that targeted far transfer. Consistent with prior research, the dialogues promoted conceptual understanding of physics. However, measures of retention and transfer to problem-solving ability showed only a marginal effect of completing the reflective dialogues.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through a good deal of practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-end analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success when tested mainly using high-achieving students (B or above in an introductory college physics course). We suggest that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tested the effectiveness of engaging students in reflective dialogues after they solve problems in Andes (e.g., Gertner & VanLehn, 2000) which explicitly targeted the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). Explicit instruction on these knowledge components proved to be effective and more efficient than giving students repeated practice in solving different types of problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test score,and pre-test to post-test gain scores, on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum)<br />
* ''[[Long-term retention]]''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or control with no KCDs. Students in the control condition were assigned an additional five Andes problems to attempt to better equate time on task.<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
<br />
=== Hypothesis ===<br />
The main hypothesis tested was that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, would be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was<br />
still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed<br />
(''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n''=47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with<br />
pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However,<br />
ANOVAs showed no differences between groups (Fs < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58,<br />
''p'' < .00001) showed positive contributions of all factors but a significant effect of only QPA (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on<br />
problem-solving ability. Analyses of the final exam data are in progress.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more that students engage in one form of learning-by-doing—mainly, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by doing path than in our prior study, perhaps because this year instructors ''required'' them to do so (i.e., this time there were negative consequences for avoiding them).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions,<br />
and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also<br />
suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, P., Hardiman, T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, J. M., & Chamot, A. U. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self-explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & VanLehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for June 2007 - December 2007:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6689Reflective Dialogues (Katz)2008-01-10T03:51:08Z<p>Katz: /* Background and Significance */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose two reasons for this difference: (1) in traditional physics courses, students are neither explicitly asked to plan nor given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we tested the effectiveness of engaging students in reflective dialogues after they solve problems in [[Andes]] that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we conducted a Physics LearnLab study in which we compared the effectiveness of an experimental version of Andes that engaged students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gave students additional problem-solving practice after they solved an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predicted that students in the explicit condition would outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that targeted far transfer. Consistent with prior research, the dialogues promoted conceptual understanding of physics. However, measures of retention and transfer to problem-solving ability showed only a marginal effect of completing the reflective dialogues.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through repeated practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-end analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success, though they have been tested mainly with high-achieving students (B or above in an introductory college physics course). We suggest that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tested the effectiveness of engaging students in reflective dialogues after they solved problems in Andes (e.g., Gertner & VanLehn, 2000), dialogues that explicitly targeted the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). Explicit instruction on these knowledge components proved to be effective and more efficient than giving students repeated practice in solving different types of problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test score, and pre-test to post-test gain scores, on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum).<br />
* ''[[Long-term retention]]''. Performance on the final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or control with no KCDs. Students in the control condition were assigned an additional five Andes problems to attempt, in order to better equate time on task.<br />
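As an illustration, block-random assignment within course sections can be sketched as follows. This is an illustrative Python sketch, not the study's actual procedure; the roster, section names, and condition labels are hypothetical:<br />

```python
import random

def block_randomize(roster, conditions=("KCD", "control"), seed=0):
    """Assign students to conditions in balanced blocks within each course
    section: every block of len(conditions) shuffled students contributes
    exactly one student per condition."""
    rng = random.Random(seed)
    assignment = {}
    for section, students in roster.items():
        shuffled = list(students)
        rng.shuffle(shuffled)
        for i, student in enumerate(shuffled):
            assignment[student] = conditions[i % len(conditions)]
    return assignment

# Hypothetical roster of two course sections (names are made up):
roster = {"section_1": ["s1", "s2", "s3", "s4"],
          "section_2": ["s5", "s6", "s7", "s8"]}
groups = block_randomize(roster)
```

With two conditions, each section's shuffled roster is walked in pairs, so condition sizes within every section differ by at most one student.<br />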
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
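A regression of this kind can be sketched with ordinary least squares. The data below are fabricated purely for illustration (they are not the study's dataset), and a categorical predictor such as college major would additionally need dummy coding, which is omitted here:<br />

```python
import numpy as np

# Fabricated data for 12 students (NOT the study's dataset): predictor columns
# are problems completed, dialogues (KCDs) completed, GPA, and pre-test score.
rng = np.random.default_rng(0)
X = rng.uniform(size=(12, 4))
post_test = 10 + X @ np.array([1.0, 2.0, 3.0, 4.0]) + rng.normal(scale=0.1, size=12)

# Ordinary least squares with an intercept: coefficients b minimize
# ||post_test - A @ b||^2.
A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, post_test, rcond=None)

# R^2: proportion of post-test variance explained by the predictors,
# computed as one minus the ratio of residual variance to total variance.
residuals = post_test - A @ b
r_squared = 1.0 - residuals.var() / post_test.var()
```

Each entry of ''b'' (after the intercept) estimates how much the post-test score changes per unit change in that predictor, holding the others fixed.<br />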
<br />
=== Hypothesis ===<br />
The main hypothesis to be tested is that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems. <br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
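For readers unfamiliar with these measures: raw gain is post-test minus pre-test, and a normalized gain expresses that difference as a fraction of the maximum possible gain. The sketch below is illustrative only; the 30-point test maximum and the equivalence of this normalization to the Estes gain score are assumptions (and a group mean of per-student ratios need not equal the ratio of group means):<br />

```python
def raw_gain(pre, post):
    """Raw gain: post-test score minus pre-test score."""
    return post - pre

def normalized_gain(pre, post, max_score=30.0):
    """Gain as a fraction of the maximum possible gain,
    (post - pre) / (max_score - pre). The 30-point maximum and the
    equivalence to the Estes gain score are assumptions for
    illustration, not facts taken from the study."""
    return (post - pre) / (max_score - pre)

# Group means reported above:
treatment_raw = raw_gain(12.10, 17.97)   # approximately 5.87
control_raw = raw_gain(11.77, 15.57)     # approximately 3.80
```

Applied to the reported group means, this recovers the raw gains of 5.87 and 3.80 cited in the findings.<br />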
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test, and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed (''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n'' = 47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However, ANOVAs showed no differences between groups (''F''s < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58, ''p'' < .00001) showed positive contributions of all factors but a significant effect of only QPA (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on problem-solving ability. Analyses of the final exam data are in progress.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more that students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by-doing path than in our prior study, perhaps because this year instructors ''required'' them to complete the dialogues (i.e., this time there were negative consequences for skipping them).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions, and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, W. J., Hardiman, P. T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, J. M., & Chamot, A. U. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self-explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & VanLehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for June 2007 - December 2007:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6688Reflective Dialogues (Katz)2008-01-10T03:43:49Z<p>Katz: /* Abstract */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose that this difference may be due to two reasons: (1) in traditional physics courses, students are not explicitly asked to plan or are given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we tested the effectiveness of engaging students in reflective dialogues after they solve problems in [[Andes]] that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we conducted a Physics LearnLab study in which we compared the effectiveness of an experimental version of Andes that engaged students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gave students additional problem-solving practice after they solved an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predicted that students in the explicit condition would outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that targeted far transfer. Consistent with prior research, the dialogues promoted conceptual understanding of physics. However, measures of retention and transfer to problem-solving ability showed only a marginal effect of completing the reflective dialogues.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through a good deal of practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-end analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success when tested mainly using high-achieving students (B or above in an introductory college physics course). We propose that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tests the effectiveness of engaging students in reflective dialogues after they solve problems in Andes (e.g., Gertner & VanLehn, 2000) which explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). If explicit instruction on these knowledge components proves to be effective, we will have identified an instructional strategy that facilitates the acquisition of problem-solving schema—one that is more efficient than repeated practice in solving many problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test score,and pre-test to post-test gain scores, on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum)<br />
* ''[[Long-term retention]]''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or control with no KCDs. Students in the control condition were assigned an additional five Andes problems to attempt to better equate time on task.<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
<br />
=== Hypothesis ===<br />
The main hypothesis to be tested is that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems. <br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was<br />
still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed<br />
(''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n''=47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with<br />
pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However,<br />
ANOVAs showed no differences between groups (Fs < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58,<br />
''p'' < .00001) showed positive contributions of all factors but a significant effect of only QPA (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on<br />
problem-solving ability. Analyses of the final exam data are in progress.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more that students engage in one form of learning-by-doing—mainly, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by-doing path than in our prior study, perhaps because this year instructors ''required'' them to do so (i.e., this time there were negative consequences for avoiding them).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions,<br />
and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also<br />
suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, W. J., Hardiman, P. T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, M., & Chamot, A. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & VanLehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for June 2007 - December 2007:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflective_Dialogues_(Katz)&diff=6687Reflective Dialogues (Katz)2008-01-10T03:38:35Z<p>Katz: /* Summary Table */</p>
<hr />
<div>== Do Reflective Dialogues that Explicitly Target the “What? How? and Why (not)?” Knowledge of Physics Problem Solving Promote Expert-like Planning Ability? ==<br />
Sandra Katz<br />
<br />
=== Summary Table ===<br />
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"<br />
| '''PIs''' || Sandra Katz, John Connelly, Donald Treacy<br />
|-<br />
| '''Study Start Date''' || 3/1/06<br />
|-<br />
| '''Study End Date''' || 6/30/07<br />
|-<br />
| '''LearnLab Site''' || USNA<br />
|-<br />
| '''LearnLab Course''' || General Physics I<br />
|-<br />
| '''Number of Students''' || ''N'' = 67<br />
|-<br />
| '''Total Participant Hours''' || approx. 750 hrs<br />
|-<br />
| '''DataShop''' || Yes<br />
|}<br />
<br />
=== Abstract ===<br />
One of the main differences between experts and novices in physics is that experts are more adept at identifying relevant principles and generating solution plans before starting to solve a problem (Chi, Glaser & Rees, 1982; Dufresne, Gerace, Hardiman, & Mestre, 1992; Priest & Lindsay, 1992). We propose that this difference may arise for two reasons: (1) in traditional physics courses, students are neither explicitly asked to plan nor given scaffolding to support planning during problem solving, and (2) many students lack the basic knowledge of physics concepts, principles, and procedures that is prerequisite for effective planning. <br />
<br />
In this project, we are testing the effectiveness of engaging students in reflective dialogues, after they solve problems in [[Andes]], that explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why (or why not) to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). The main hypothesis to be tested is that explicit training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems.<br />
<br />
To test this hypothesis, we will conduct a Physics LearnLab study in which we compare the effectiveness of an experimental version of Andes that engages students in reflective “what? how? and why?” dialogues after physics problem solving ([[explicit instruction]] condition) with a control version that gives students additional problem-solving practice after they solve an Andes problem—e.g., by having them identify bugs in a sample student solution to a problem ([[implicit instruction]] condition). We predict that students in the explicit condition will outperform students in the implicit condition with respect to gain score from pre-test to post-test and scores on course exams that target far transfer. The main contribution of this study will be to determine if explicit instruction in planning and in the component knowledge and skills needed for planning via reflective dialogues can promote students’ acquisition of solution schemata.<br />
<br />
=== Glossary ===<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research question ===<br />
Does explicit training in the three main components of problem-solving knowledge—i.e., knowledge about what principles apply to a problem, how to apply these principles, and why to apply them—enhance students’ problem-solving ability?<br />
<br />
=== Background and Significance ===<br />
Because experts develop schemata largely through a good deal of practice with solving various types of problems, traditional introductory college physics courses take a similar approach. They are structured so that students are introduced to concepts, principles, and procedures in their text and through lectures, which they are then asked to apply to many problem-solving exercises. The hope is that, with extensive practice, students will integrate conceptual and [[procedural]] knowledge and develop expert-like schemata and planning skills. Unfortunately, many students don’t. In addition to exiting these courses with lingering naïve misconceptions (e.g., Halloun & Hestenes, 1985; McDermott, 1984), they continue to solve problems largely by manipulating equations until the desired quantity is isolated—that is, by means-end analysis—instead of by identifying relevant principles and generating solution plans before starting to solve a problem, as experts do (Dufresne et al., 1992; Larkin, McDermott, Simon, & Simon, 1980; Priest & Lindsay, 1992). <br />
<br />
We refer to the traditional approach to physics instruction described in the preceding paragraph as implicit instruction, because it encourages the inductive development of abstractions (concepts, principles, and schemata) through repeated exposure to instances, instead of by explicitly reifying these abstractions (O'Malley & Chamot, 1994). In response to the limitations of implicit approaches to physics instruction that neither ask students to plan nor scaffold them in doing it, several instructional scientists have proposed methods that explicitly engage students in planning exercises (Dufresne et al., 1992; Leonard, Dufresne, & Mestre, 1996; Mestre, Dufresne, Gerace, & Hardiman, 1993). These methods have met with modest success when tested mainly using high-achieving students (B or above in an introductory college physics course). We propose that the main reason that many students, even high achievers, are unable to use explicit planning methods effectively is that they lack the basic concepts, principles, and procedures that are prerequisites for effective planning. <br />
<br />
This project tests the effectiveness of engaging students in reflective dialogues after they solve problems in Andes (e.g., Gertner & VanLehn, 2000) which explicitly target the three main types of knowledge that experts employ during planning: knowledge about what principle(s) to apply to a given problem, how to apply these principles (e.g., what equations to use), and why to apply them—that is, what the applicability conditions are (Leonard, Dufresne, & Mestre, 1996). If explicit instruction on these knowledge components proves to be effective, we will have identified an instructional strategy that facilitates the acquisition of problem-solving schema—one that is more efficient than repeated practice in solving many problems.<br />
<br />
=== Dependent Variables ===<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test score, and pre-test to post-test gain scores, on near and far [[transfer]] items.<br />
* ''Short-term retention''. Performance on course exams that cover target topics (e.g., work and energy, translational dynamics, rotation, momentum)<br />
* ''[[Long-term retention]]''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
Students within each course section were block-randomly assigned to one of two dialogue conditions: standard [[Knowledge Construction Dialogues]] (KCDs) or control with no KCDs. Students in the control condition were assigned an additional five Andes problems to attempt to better equate time on task.<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
* Number of problems completed before the post-test was administered<br />
* Number of dialogues that the student completed<br />
* Grade point average (CQPR)<br />
* College major<br />
* Pre-test score<br />
<br />
=== Hypothesis ===<br />
The main hypothesis to be tested is that explicit, part-task training on the “what? how? and why?” components of planning, via strategically staged reflective dialogues, will be more effective and efficient than the traditional approach of letting students acquire these components implicitly, through lots of practice with solving problems. <br />
<br />
=== Findings ===<br />
Data analyses are still in progress, but below are our findings to date:<br />
<br />
Analyses of pre- and post-test scores were more encouraging than for last year's study ([[Post-practice reflection (Katz)]]). After omitting scores from one student with a post-test duration of less than two minutes, we were left with treatment and control groups of equal size (''n'' = 33). We also re-classified two treatment subjects who did no dialogues as control subjects (effective treatment and control ''n''s = 31 and 35, respectively), although these subjects were not able to access the five extra control-group Andes problems. As we had hoped, ANOVA showed no significant differences between the effective treatment and control groups on pre-test score (respective ''M''s = 12.10 and 11.77; ''F'' < 1), but treatment subjects had higher mean post-test scores (17.97 vs. 15.57; ''F''(1, 64) = 4.89, ''p'' = .031), mean raw gain scores (5.87 vs. 3.80; ''F'' = 5.62, ''p'' = .021), and mean Estes gain scores (0.330 vs. 0.208; ''F'' = 6.74, ''p'' = .012) than did control subjects. In short, without regard to problem-solving practice, subjects who did KCDs did significantly better on the post-test than those who did not.<br />
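The gain measures reported above reduce to simple arithmetic: raw gain is post-test score minus pre-test score, and the Estes gain appears to be a normalized gain (raw gain divided by the room left for improvement). A minimal sketch in Python; the normalized-gain formula and the 30-point test maximum are our assumptions, inferred from the reported means, not taken from the study materials:

```python
# Sketch of the gain-score computations reported above. The Estes/normalized-gain
# formula and the 30-point maximum score are assumptions, not from the study.
def raw_gain(pre, post):
    return post - pre

def estes_gain(pre, post, max_score=30):
    # Normalized gain: fraction of the available room for improvement realized.
    return (post - pre) / (max_score - pre)

# Treatment-group means reported in the text:
print(round(raw_gain(12.10, 17.97), 2))    # 5.87, the reported raw gain
print(round(estes_gain(12.10, 17.97), 3))  # 0.328, close to the reported 0.330
```

That the assumed 30-point maximum reproduces the reported Estes gain of 0.330 almost exactly is consistent with, but does not confirm, this reading of the measure.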
<br />
Student participation in both homework problem solving and dialogues was much improved over last year, with 21 of 34 control subjects (62%) and 23 of 33 treatment subjects (70%) finishing at least 80% of the assigned target problems (of 31 for control, 26 for treatment) prior to the post-test and with 25 treatment subjects (76%) finishing at least 80% of the 26 associated KCDs. However, participation was<br />
still far from perfect; 7 treatment subjects (21%) completed half or fewer of the assigned KCDs on time, and 2 of them completed no target problems or KCDs on time. We again treated KCD completion and target problem completion as continuous independent variables in regression analyses. Regressing post-test score on pre-test score, QPA, number of KCDs completed, and number of target problems completed<br />
(''R''<SUP>2</SUP> = .52, ''F''(4, 61) = 16.70, ''p'' < .00001) showed positive contributions of all factors, but only pre-test score, QPA, and KCD completion were statistically significant (''p''s < .001, .05, & .05, respectively); problem completion was ''ns'' (''p'' = .54). Therefore, across all subjects it was KCD completion, as opposed to target homework problem completion, that significantly predicted post-test performance.<br />
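The regression analyses above follow the standard ordinary-least-squares recipe: regress post-test score on the predictors, then report the overall ''R''<SUP>2</SUP> and ''F'' statistic. A minimal sketch; the scores and variable names below are illustrative placeholders, not the study's data:

```python
import numpy as np

def ols(y, X):
    """Least-squares fit of y on X (with intercept); returns (coeffs, R^2, F)."""
    X1 = np.column_stack([np.ones(len(y)), X])       # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot
    k = X.shape[1]                                   # number of predictors
    f = (r2 / k) / ((1.0 - r2) / (len(y) - k - 1))   # overall F(k, n - k - 1)
    return beta, r2, f

# Hypothetical scores for six students (illustrative only):
posttest = np.array([14., 18., 16., 20., 13., 19.])
pretest  = np.array([10., 12., 11., 14.,  9., 13.])
kcds     = np.array([ 5., 20., 10., 24.,  2., 22.])
beta, r2, f = ols(posttest, np.column_stack([pretest, kcds]))
```

Applied to the actual data set (66 subjects, four predictors), the same computation would give the degrees of freedom reported above, ''F''(4, 61).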
<br />
To measure retention and transfer, we also analyzed scores from an hourly exam administered by one instructor to his two sections (''n''=47). All problems on this exam were quantitative and covered a subset of the units targeted during the intervention. Scores on this exam ranged from 176 to 382 (of 400), and were highly correlated with<br />
pre- and post-test scores; ''r''s(45) = .54 and .71, ''p''s < .0001 and .00001. However,<br />
ANOVAs showed no differences between groups (''F''s < 1), and regression of subscores on QPA, KCDs completed, and target problems completed (''R''<SUP>2</SUP> = .54, ''F''(3, 43) = 16.58,<br />
''p'' < .00001) showed positive contributions of all factors, but only QPA was statistically significant (''p'' < .00001). Therefore, student performance on the hourly exam was not significantly affected by either KCD completion or target problem completion.<br />
<br />
In summary, student performance on the post-test relative to the pre-test was significantly influenced by the number of dialogues they completed, as opposed to the number of target problems they completed prior to the post-test. However, neither measure had a significant effect on student performance on an exam that focused on<br />
problem-solving ability. Analyses of the final exam data are in progress.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. More students took the learning-by-doing path than in our prior study, perhaps because this year instructors ''required'' them to do so (i.e., this time there were negative consequences for avoiding them).<br />
<br />
Analyses of retention and transfer to problem-solving ability did not reveal a significant effect of the reflective dialogues. We had hoped that the capstone dialogues that treatment students completed before selected problems, at the end of each unit, would support problem-solving ability. It is possible that the intervention consisted of too few capstone dialogues (only five) to observe any effect.<br />
<br />
What seems to be the critical feature of post-practice reflection (PPR) in supporting conceptual understanding is the explicit instruction that it provides in domain knowledge. PPR may help students to fill in knowledge gaps, resolve misconceptions,<br />
and abstract from the case at hand so that they are better prepared to engage in constructive activity (e.g., self-explanation) in future problems. Preliminary research comparing Andes (which encourages implicit learning of problem-solving strategies) with Pyrenees, a system that teaches problem-solving strategies explicitly, also<br />
suggests an advantage for explicit instruction (VanLehn et al., 2004). Further research is needed to identify the mechanisms that drive learning from reflective dialogue, and to increase its potential to enhance problem-solving ability in addition to conceptual knowledge.<br />
<br />
=== Further Information ===<br />
==== Annotated bibliography ====<br />
* Presentation to site visitors, 2005<br />
* Full paper accepted at AIED 2007:<br />
**Katz, S., Connelly, J., & Wilson, C. (in press). Out of the lab and into the classroom: An evaluation of reflective dialogue in Andes. ''Proceedings of AIED07''.<br />
<br />
==== References ====<br />
*Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), ''Advances in the Psychology of Human Intelligence, Vol. 1'' (pp. 7-75). Hillsdale, NJ: Erlbaum.<br />
*Dufresne, R. J., Gerace, W. J., Hardiman, P. T., & Mestre, J. P. (1992). Constraining novices to perform expertlike analyses: Effects on schema acquisition. ''The Journal of the Learning Sciences, 2'' (3), 307-331.<br />
*Gertner, A. S., & VanLehn, K. (2000). Andes: A coached problem solving environment for physics. In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), ''ITS 2000: Proceedings of the 5th International Conference on Intelligent Tutoring Systems'' (pp. 133-142). Berlin: Springer-Verlag.<br />
*Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. ''Science, 208'', 1335-1342.<br />
*Leonard, W. J., Dufresne, R. J., & Mestre, J. P. (1996). Using qualitative problem-solving strategies to highlight the role of conceptual knowledge in solving problems. ''American Journal of Physics, 64'' (12), 1495-1503.<br />
*Mestre, J. P., Dufresne, R. J., Gerace, W. J., & Hardiman, P. T. (1993). Promoting skilled problem-solving behavior among beginning physics students. ''Journal of Research in Science Teaching, 30'', 303-317.<br />
*O’Malley, M., & Chamot, A. (1994). ''The CALLA Handbook''. Reading, MA: Addison-Wesley.<br />
*Priest, A. G., & Lindsay, R. O. (1992). New light on novice-expert differences in physics problem solving. ''British Journal of Psychology, 83'', 389-405.<br />
*VanLehn, K., Bhembe, D., Chi, M., Lynch, C., Schulze, K., Shelby, R., Taylor, L., Treacy, D., Weinstein, A., & Wintersgill, M. (2004). Implicit versus explicit learning of strategies in a nonprocedural cognitive skill. In J. C. Lester, R. M. Vicari, & F. Paraguacu, (Eds.), ''Intelligent Tutoring Systems: 7th International Conference'' (pp. 521-530). Berlin: Springer-Verlag.<br />
<br />
==== Connections ====<br />
This project shares features with the following research projects:<br />
<br />
Use of Questions during learning<br />
* [[Post-practice reflection (Katz)]] <br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[FrenchCulture | FrenchCulture (Amy Ogan, Christopher Jones, Vincent Aleven)]]<br />
* [[Rummel_Scripted_Collaborative_Problem_Solving|Collaborative Extensions to the Cognitive Tutor Algebra: Scripted Collaborative Problem Solving (Rummel, Diziol, McLaren, & Spada)]]<br />
<br />
Self explanations during learning<br />
* [[Craig_questions|Deep-level questions during example studying (Craig & Chi)]]<br />
* [[Hausmann Study2 | The Effects of Interaction on Robust Learning (Hausmann & Chi)]]<br />
* [[Hausmann Study | A comparison of self-explanation to instructional explanation (Hausmann & VanLehn)]]<br />
<br />
==== Future plans ====<br />
Our future plans for June 2007 - December 2007:<br />
* write journal article expanding AIED conference paper<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection_(Katz)&diff=2239Post-practice reflection (Katz)2006-11-30T19:39:32Z<p>Katz: /* Dependent Variables */</p>
<hr />
<div>== Post-practice reflection in a first-year physics course: Does mixed-initiative interaction support robust learning better than tutor-led interaction or canned text? ==<br />
Sandra Katz<br />
<br />
=== Abstract ===<br />
We conducted an in vivo study within the PSLC Physics LearnLab to address two questions about reflection on recent problem-solving activity within the Andes physics tutoring system: (1) Does post-practice reflection support robust learning of physics—that is, students’ ability to transfer what they learned during instruction to novel situations? and (2) What is the preferred delivery mode for post-practice reflection in intelligent tutoring systems? We compared pre-test to post-test learning gains of students who were randomly assigned to one of four conditions that represent points on a scale of increasingly generative student activity: (1) a control group, who solved a set of problems in Andes without any reflective activity; (2) a self-explanation group, who solved Andes problems, responded to several “reflection questions” (Lee & Hutchison, 1998) after each problem, and received a canned, expert-generated explanation that they were prompted to compare with their response; (3) a “tutor-led interaction” group, who solved the same Andes problems and reflection questions (RQs) as the self-explanation group, but with the questions embedded in Knowledge Construction Dialogues (KCDs) that guided students in generating a correct response; and (4) a “scaffolded mixed-initiative KCD” group, who solved the same problems and responded to the same reflection questions as the other reflection groups, embedded in KCDs similar to those presented to the “tutor-led interaction” group, with the added support of follow-up questioning menus that allowed students to take initiative. <br />
Unfortunately, participation in the three experimental conditions was too low to compare the effectiveness of these treatments. However, a yoked pairs analysis revealed that the more reflection questions that students did, of any type, the better they did, with respect to both post-test scores and pre-test to post-test gain scores. Similarly, regression analysis revealed that the number of dialogues a student completed had a significant positive effect on post-test score, independent of the number of problems he or she completed. These findings provide empirical support for post-practice reflection in an ITS that is a central component of a course.<br />
<br />
=== Glossary ===<br />
<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research questions ===<br />
<br />
Do reflection questions after physics problem solving support robust learning? Is more interactive reflection, between the automated tutor and student, more effective in supporting robust learning than less interactive reflection?<br />
<br />
=== Background and Significance ===<br />
<br />
Despite the well-established role that post-practice reflective dialogue plays in apprentice-style learning (Collins, Brown, & Newman, 1989), few studies have been conducted to test its effectiveness. Hence, there is little empirical support for incorporating post-practice reflection in courses led by a human teacher or within intelligent tutoring systems (ITSs). Nor is there much guidance on how to implement reflection effectively, especially within the constraints of natural-language understanding and generation capabilities.<br />
<br />
Research by Lee and Hutchison (1998) focused on the effectiveness of reflection questions posed after students studied worked examples on balancing chemistry equations in a computer-based learning environment. This study found that reflection questions enhanced students’ problem-solving ability. However, the role of reflection in promoting conceptual understanding was not addressed; nor did the study investigate whether reflection questions after problem-solving exercises (as opposed to example studying) support learning. <br />
<br />
Two laboratory studies conducted by Katz and her colleagues addressed these questions. In a longitudinal analysis of student performance on avionics tasks in Sherlock 2 (e.g., Katz et al., 1998), Katz, O’Donnell and Kay (2000) found that discussions that took place between an avionics expert and two students were more effective in resolving misconceptions when they were distributed across problem solving and debrief than were discussions that took place during problem solving alone. However, due to constraints inherent in the research setting, there was no control group in this study—that is, avionics trainees who did not experience debrief led by a domain expert—and no instrument to measure performance gains. A follow-up study by Katz, Allbritton, & Connelly (2003) addressed these limitations in a different domain, first-year college physics. Forty-six students taking college physics solved problems in Andes in one of three conditions: with no reflection questions after problem solving (control group), with reflection questions discussed with human tutors, or with the same reflection questions followed by canned feedback (without a human tutor). A comparison of pre-test and post-test scores was conducted to measure learning gains. The main result was that students learned more with reflection questions than without, but the two conditions with reflection questions (canned feedback and human tutoring) did not differ significantly.<br />
<br />
The current study is significant because it validated the results of these laboratory studies in an actual classroom setting. Specifically, it showed that post-practice reflection supports learning, as measured by pre-test to post-test learning gain scores. Due to low participation, however, this experiment did not shed light on the question of which modality of reflection is most effective.<br />
<br />
=== Dependent Variables ===<br />
<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test score, and pre-test to post-test gain scores, on near transfer (normal learning) and far transfer (robust learning) items.<br />
<br />
* ''Short-term retention''. Performance on course exam that covered target topic (work and energy), 1-2 weeks after the intervention.<br />
<br />
* ''Longer-term retention''. Performance on final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
*Number of problems completed before the post-test was administered.<br />
*Number of reflection questions that the student completed.<br />
*Grade point average (CQPR)<br />
*College major grouping.<br />
*Pre-test score<br />
<br />
=== Hypotheses ===<br />
<br />
*Because responding to reflection questions is a form of “active learning,” students in any of the three experimental conditions (self-explanation of canned text, standard tutor-led KCDs, mixed-initiative KCDs) should outperform students in the control (no-reflection) condition. This hypothesis was supported.<br />
*The more interactive the reflection modality the better, so mixed-initiative > standard KCD > self-explanation of canned text. We were unable to test this hypothesis, due to low participation.<br />
<br />
=== Findings ===<br />
<br />
*''Gain scores summary'':<br />
**There were no significant differences in mean gain score (or in pre-test and post-test scores) by condition (4 conditions; 3 treatment, 1 control).<br />
**Yoked pairs analysis, comparing students who completed no reflection questions with students who completed five or more questions, showed that “treated” subjects tended to out-gain “untreated” subjects.<br />
*''Regression analysis summary'':<br />
**The number of reflection questions completed had a significant positive effect on post-test scores.<br />
*''Exam score summary'':<br />
**For the final exam, there was no significant impact of the number of reflection questions completed.<br />
**For one course section’s hourly exam on work and energy, there was a significant, positive impact of the number of reflection questions completed, but only when CQPR was dropped from the model. For the other section’s hourly exam, “treated” subjects significantly outperformed “untreated” subjects, despite being outnumbered 3 to 1.<br />
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. We are unable to determine why students chose not to take the learning-by doing path. Perhaps doing the reflection questions was too difficult for some students, but we suspect that this was not the case. A more likely explanation is that students did not have the time to complete these questions, and there were no negative consequences for avoiding them. In the follow-up study that we are currently running, students are required to complete the reflection questions in order to get credit for completing a problem.<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection_(Katz)&diff=2238Post-practice reflection (Katz)2006-11-30T19:37:34Z<p>Katz: /* Research questions */</p>
<hr />
<div>== Post-practice reflection in a first-year physics course: Does mixed-initiative interaction support robust learning better than tutor-led interaction or canned text? ==<br />
Sandra Katz<br />
<br />
=== Abstract ===<br />
We conducted an in vivo study within the PSLC Physics LearnLab to address two questions about reflection on recent problem-solving activity within the Andes physics tutoring system: (1) Does post-practice reflection support robust learning of physics—that is, students’ ability to transfer what they learned during instruction to novel situations? and (2) What is the preferred delivery mode for post-practice reflection in intelligent tutoring systems? We compared pre-test to post-test learning gains of students who were randomly assigned to one of four conditions that represent points on a scale of increasingly generative student activity: (1) a control group, who solved a set of problems in Andes without any reflective activity; (2) a self-explanation group, who solved Andes problems, responded to several “reflection questions” (Lee & Hutchison, 1998) after each problem, and received a canned, expert-generated explanation that they were prompted to compare with their response; (3) a “tutor-led interaction” group, who solved the same Andes problems and reflection questions (RQs) as the self-explanation group, but with the questions embedded in Knowledge Construction Dialogues (KCDs) that guided students in generating a correct response; and (4) a “scaffolded mixed-initiative KCD” group, who solved the same problems and responded to the same reflection questions as the other reflection groups, embedded in KCDs similar to those presented to the “tutor-led interaction” group, with the added support of follow-up questioning menus that allowed students to take initiative.<br />
Unfortunately, participation in the three experimental conditions was too low to compare the effectiveness of these treatments. However, a yoked-pairs analysis revealed that the more reflection questions students completed, regardless of type, the better they performed, with respect to both post-test scores and pre-test to post-test gain scores. Similarly, regression analysis revealed that the number of dialogues a student completed had a significant positive effect on post-test score, independent of the number of problems he or she completed. These findings provide empirical support for incorporating post-practice reflection in an ITS that is a central component of a course.<br />
<br />
=== Glossary ===<br />
<br />
* [[Post-practice reflection]]<br />
* [[Reflection questions]]<br />
* [[Knowledge Construction Dialogues]] (KCDs)<br />
<br />
=== Research questions ===<br />
<br />
Do reflection questions after physics problem solving support robust learning? Is more interactive reflection between the automated tutor and the student more effective in supporting robust learning than less interactive reflection?<br />
<br />
=== Background and Significance ===<br />
<br />
Despite the well-established role that post-practice reflective dialogue plays in apprentice-style learning (Collins, Brown, & Newman, 1989), few studies have tested its effectiveness. Hence, there is little empirical support for incorporating post-practice reflection in courses led by a human teacher or within intelligent tutoring systems (ITSs). Nor is there much guidance on how to implement reflection effectively, especially within the constraints of current natural-language understanding and generation capabilities.<br />
<br />
Research by Lee and Hutchison (1998) focused on the effectiveness of reflection questions posed after students studied worked examples on balancing chemistry equations in a computer-based learning environment. This study found that reflection questions enhanced students’ problem-solving ability. However, the role of reflection in promoting conceptual understanding was not addressed; nor did the study investigate whether reflection questions after problem-solving exercises (as opposed to example studying) support learning. <br />
<br />
Two laboratory studies conducted by Katz and her colleagues addressed these questions. In a longitudinal analysis of student performance on avionics tasks in Sherlock 2 (e.g., Katz et al., 1998), Katz, O’Donnell and Kay (2000) found that discussions that took place between an avionics expert and two students were more effective in resolving misconceptions when they were distributed across problem solving and debrief than were discussions that took place during problem solving alone. However, due to constraints inherent in the research setting, there was no control group in this study—that is, avionics trainees who did not experience debrief led by a domain expert—and no instrument to measure performance gains. A follow-up study by Katz, Allbritton, & Connelly (2003) addressed these limitations in a different domain, first-year college physics. Forty-six students taking college physics solved problems in Andes in one of three conditions: with no reflection questions after problem solving (control group), with reflection questions discussed with human tutors, or with the same reflection questions followed by canned feedback (without a human tutor). A comparison of pre-test and post-test scores was conducted to measure learning gains. The main result was that students learned more with reflection questions than without, but the two conditions with reflection questions (canned feedback and human tutoring) did not differ significantly.<br />
<br />
The current study is significant because it validated the results of these laboratory studies in an actual classroom setting. Specifically, it showed that post-practice reflection supports learning, as measured by pre-test to post-test learning gain scores. Due to low participation, however, this experiment did not shed light on the question of which modality of reflection is most effective.<br />
<br />
=== Dependent Variables ===<br />
<br />
* ''Gains in qualitative and quantitative knowledge''. Post-test scores, and pre-test to post-test gain scores, on near- and far-transfer items.<br />
<br />
* ''Short-term retention''. Performance on a course exam covering the target topic (work and energy), administered 1-2 weeks after the intervention.<br />
<br />
* ''Longer-term retention''. Performance on the final exam, taken several weeks after the intervention.<br />
<br />
=== Independent Variables ===<br />
<br />
The following variables were entered into a regression analysis, with post-test score as the dependent variable:<br />
<br />
*Number of problems completed before the post-test was administered.<br />
*Number of reflection questions the student completed.<br />
*CQPR (grade point average).<br />
*College major grouping.<br />
*Pre-test score.<br />
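The regression model implied by this list can be sketched as an ordinary least-squares fit. Everything below is a hypothetical illustration: the variable names, value ranges, and simulated effect sizes are assumptions, not the study's actual data.<br />

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # hypothetical number of students

# Synthetic predictors (illustrative only -- not the study's data)
pre_test = rng.uniform(20, 80, n)       # pre-test score
n_problems = rng.integers(0, 20, n)     # problems completed before post-test
n_reflection = rng.integers(0, 12, n)   # reflection questions completed
cqpr = rng.uniform(2.0, 4.0, n)         # grade point average
major = rng.integers(0, 2, n)           # major grouping, dummy-coded

# Simulate post-test scores with an assumed positive reflection effect
post_test = (10 + 0.6 * pre_test + 0.4 * n_problems
             + 1.5 * n_reflection + 2.0 * cqpr + 1.0 * major
             + rng.normal(0, 5, n))

# Design matrix with an intercept column, then OLS via least squares
X = np.column_stack([np.ones(n), pre_test, n_problems,
                     n_reflection, cqpr, major])
coef, *_ = np.linalg.lstsq(X, post_test, rcond=None)

names = ["intercept", "pre_test", "n_problems",
         "n_reflection", "cqpr", "major"]
for name, b in zip(names, coef):
    print(f"{name:>12}: {b:+.2f}")
```

With real data, the fitted coefficient on the reflection-question count would correspond to the effect reported under Findings.<br />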
<br />
=== Hypotheses ===<br />
<br />
*Because responding to reflection questions is a form of “active learning,” students in any of the three experimental conditions (self-explanation of canned text, tutor-led KCDs, mixed-initiative KCDs) should outperform students in the control (no-reflection) condition. This hypothesis was supported.<br />
*The more interactive the reflection modality, the better: mixed-initiative KCDs > tutor-led KCDs > self-explanation of canned text. We were unable to test this hypothesis due to low participation.<br />
<br />
=== Findings ===<br />
<br />
*''Gain scores summary'':<br />
**There were no significant differences in mean gain score (or in pre-test and post-test scores) by condition (4 conditions; 3 treatment, 1 control).<br />
**A yoked-pairs analysis, comparing students who completed no reflection questions with students who completed five or more, showed that “treated” subjects tended to out-gain “untreated” subjects.<br />
*''Regression analysis summary'':<br />
**The number of reflection questions completed had a significant positive effect on post-test scores.<br />
*''Exam score summary'':<br />
**For the final exam, there was no significant impact of the number of reflection questions completed.<br />
**For one course section’s hourly exam on work and energy, there was a significant, positive impact of the number of reflection questions completed, but only when CQPR was dropped from the model. For the other section’s hourly exam, “treated” subjects significantly outperformed “untreated” subjects, despite being outnumbered 3 to 1.<br />
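The gain-score comparison underlying the yoked-pairs analysis can be sketched as follows. The data are simulated under an assumed positive reflection effect; only the grouping rule (zero questions versus five or more) comes from the description above.<br />

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical records; all values below are simulated, not the study's data.
n = 60
n_reflection = np.tile(np.arange(10), 6)   # 0-9 questions, 6 students each
pre = rng.uniform(30, 70, n)               # pre-test scores
# Simulate an assumed positive effect of reflection questions on post-test
post = pre + 5 + 1.2 * n_reflection + rng.normal(0, 4, n)

gain = post - pre  # pre-test to post-test gain score

# Grouping used in the yoked-pairs comparison: no questions vs. five or more
untreated = gain[n_reflection == 0]
treated = gain[n_reflection >= 5]

print(f"mean gain, untreated (0 questions):  {untreated.mean():.1f}")
print(f"mean gain, treated (>=5 questions): {treated.mean():.1f}")
```

The study paired ("yoked") comparable students across groups rather than comparing raw group means; this sketch shows only the gain-score arithmetic and the grouping.<br />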
<br />
=== Explanation ===<br />
<br />
This study is part of the Interactive Communication cluster, and its hypothesis is a specialization of the IC cluster’s central hypothesis. The IC cluster’s central hypothesis is that robust learning occurs when two conditions are met:<br />
<br />
* ''The learning event space should have paths that are mostly learning-by-doing along with alternative paths where a second agent does most of the work''. Since this experiment did not deal with collaboration between agents, it did not test this condition. It did, however, show that the more that students engage in one form of learning-by-doing—namely, post-practice reflection—the more they learn.<br />
* ''The student takes the learning-by-doing path unless it becomes too difficult''. We are unable to determine why students chose not to take the learning-by-doing path. Perhaps doing the reflection questions was too difficult for some students, but we suspect that this was not the case. A more likely explanation is that students did not have the time to complete these questions, and there were no negative consequences for avoiding them. In the follow-up study that we are currently running, students are required to complete the reflection questions in order to get credit for completing a problem.<br />
<br />
[[Category:Study]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Knowledge_Construction_Dialogues&diff=2237Knowledge Construction Dialogues2006-11-30T19:27:55Z<p>Katz: </p>
<hr />
<div>Automated, Socratic-style dialogues, in which the automated tutor presents a series of questions that guide students in answering a more complex qualitative or quantitative question.<br />
<br />
[[Category:Glossary]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Reflection_questions&diff=2236Reflection questions2006-11-30T19:26:59Z<p>Katz: </p>
<hr />
<div>A post-practice reflection activity in which the student is presented with a question that focuses on the concepts and principles associated with a quantitative problem. A human or automated tutor provides feedback on the student's response, or guides the student toward a correct response in Socratic dialogue fashion.<br />
<br />
[[Category:Glossary]]</div>Katzhttps://learnlab.org/wiki/index.php?title=Post-practice_reflection&diff=2235Post-practice reflection2006-11-30T19:22:01Z<p>Katz: </p>
<hr />
<div>Activities that follow successful completion of a quantitative problem, aimed at helping students to understand the concepts associated with that problem and to develop abstract problem-solving schema that will help with solving similar (near transfer) problems, and perhaps also far-transfer problems.<br />
<br />
[[Category:Glossary]]</div>Katz