Difference between revisions of "DiBiano Personally Relevant Algebra Problems"

From LearnLab
== The Effect of Context Personalization on Problem Solving in Algebra ==
 
  ''Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman''
 
  
'''Summary Tables'''

{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
 
| '''PIs''' || Candace Walkington & Anthony Petrosino
 
 
|-
 
| '''Other Contributors''' ||  
 
* Graduate Student: Milan Sherman
 
 
* Staff: Jim Greeno
 
 
|}
 
 
<br>
 
'' Pilot Study ''
 
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
 
 
| '''Study Start Date''' || September 2008
 
|-
| '''Study End Date''' || May 2009
|-
 
| '''Study Site''' || Austin, TX
 
 
|-
 
| '''Number of Students''' || ''N'' = 24
 
|-
 
| '''Average # of hours per participant''' || 2 hrs.
 
|}
 
 
<br>
 
  
'' In Vivo Study ''
 
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
 
 
| '''Study Start Date''' || October 2009
 
|-
| '''Study End Date''' || April 2010
|-
| '''LearnLab Site''' || Hopewell High
|-
| '''LearnLab Course''' || Algebra
|-
| '''Number of Students''' || ''N'' = 111
|-
| '''Average # of hours per participant''' || 3 hrs.
|-
| '''Data in DataShop''' || Yes - Personalization Hopewell 2010
|}
  
 
=== Abstract ===
 
In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers have suggested that some of the Cognitive Tutor problem scenarios may be disconnected from the lives and experiences of many students. This study investigated whether students’ personal interest in story contexts affects performance and [[robust learning]].

The first stage of this research was a pilot study of the personal interests of students at an urban Texas high school. Freshman algebra students were surveyed and interviewed about their out-of-school interests, and were also asked to describe how they use mathematics in their everyday lives. Twenty-four of these students solved a number of Cognitive Tutor Algebra-style problems while thinking aloud. Results of this pilot study were used to critically examine, using qualitative data analysis methods, the idea that personalization of story problems has the potential to support student learning.

The second stage of this research was an “in vivo” study that took place in Fall of 2009 at a Pennsylvania LearnLab site. Based on the results of the pilot study and additional student surveys from Pennsylvania, the 27 problems in Section 5 ("Linear Models and Independent Variables") of the [[cognitive tutor|Cognitive Tutor]] software were rewritten to each have 4 “personalized” versions corresponding to different student interests. The [[cognitive tutor|Cognitive Tutor]] software was programmed to give participating students an initial interests survey, and then select problem scenarios that match their interests. The resulting [[robust learning]], measured by a delayed post-test (measuring long-term retention) and mastery of knowledge components in a future section (measuring transfer), was analyzed with a 2-group design (experimental vs. control) to measure the effect of [[personalization]] on learning. Measures from within Section 5 were also analyzed to measure the effect of personalization on performance.
  
 
=== Background and Significance ===
 
  
This research direction was initiated by observations of classrooms in Texas using the [[cognitive tutor|Cognitive Tutor]] Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching practice. Teachers explained that their urban students found problems about harvesting wheat “silly,” “dry,” and irrelevant. Teachers also complained that some of the vocabulary words in the [[cognitive tutor|Cognitive Tutor]] problem scenarios (one example was the word "greenhouse") confused their students, because urban freshmen do not typically use these terms in their everyday speech. A review of the literature showed limited evidence for the potential of relevant story contexts to increase learning, and little research had been done at the secondary school level. This study is designed to empirically test the claim that the personal relevance of story problems affects [[robust learning]] and performance.
 +
=== Theoretical Framework ===

This study is situated in the new “Motivation and Metacognition” thrust. The foundation of this study is that the relevance of problem scenarios affects robust learning through increased intrinsic motivation (Cordova & Lepper, 1996). For learners who have the cognitive capacity to solve algebra story problems, enhancing motivation may increase the likelihood that they exert effort to make sense of the scenarios by forming more elaborated and better-connected situation and problem models (Nathan, Kintsch, & Young, 1992), thus encouraging generative processing (Mayer, 2011). Mayer (2011) states the personalization principle as “People learn better when the instructor uses conversational style rather than formal style” (p. 70). Here, we use the PSLC’s modified version of this principle, which states “Matching up the features of an instructional component with students' personal interests, experiences, or typical patterns of language use, will lead to more robust learning through increased motivation, compared to when instruction is not personalized.” This is related to what Mayer (2011) refers to as the “Anchoring” principle.

The construct through which personalization is thought to enhance intrinsic motivation is increased personal interest (also called individual interest). Personal interests are stable, enduring preferences that individual learners bring with them to different situations (Anderman & Anderman, 2010). Interest promotes more effective processing of information and greater cognitive engagement. Students who have high interest may be more likely to relate new knowledge to prior knowledge and to form more connections between ideas. They also may be more likely to generate inferences, examples, and applications relating to the subject area they are trying to learn (Ormrod, 2008).
=== Pilot Study ===
 
The first stage of this research began in Fall of 2008 with a pilot study of personalization at an "Academically Unacceptable" school in Texas (75% free/reduced lunch). Twenty-four freshman algebra students were interviewed about their out-of-school interests, such as sports, music, and movies, and were also asked to describe how they use mathematics in their everyday lives. These interviews were audio recorded and were used to write “personalized” algebra story problems for each student. The research questions being investigated were:
 
1) What is the impact of personalizing algebra story problems to individual student experiences, in terms of strategy use, language comprehension, and students’ epistemological frames about mathematical activity? (qualitative)

2) How does personalizing algebra story problems to individual experiences impact student performance, compared to performance on normal story problems from the Cognitive Tutor curriculum with the same underlying structure? (quantitative)
 
A problem set containing five algebra problems on linear functions was written for each student; two of these were story problems that were personalized to the ways in which the individual student described using mathematics in their everyday life during their initial interview. The problem set also contained normal story problems from the Cognitive Tutor curriculum, completely abstract symbolic equations, story problems that contained symbolic equations, and story problems with simplified language and general referents (“generic” story problems). Each problem had four parts – the first two parts were “Result Unknowns” or “concrete cases” (i.e., solve for y given a value of x), and the fourth and final part was a “Start Unknown” (i.e., solve for x given a value of y). For normal, personalized, and generic problems, the third part of each problem asked students to write a general symbolic equation or “algebra rule” representing the story. For normal story problems that already contained equations, students were asked to interpret the parameters in terms of the story. For completely abstract symbolic problems, students were asked to write a story that could go with the equation.
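As an illustration of the problem structure described above (a sketch with hypothetical helper names, not actual study materials), the two unknown types for a linear item such as y = 4x + 11 (an item form mentioned in the quantitative analysis below) amount to evaluating versus inverting the same linear function:

```python
# Illustrative sketch only: the two problem-part types for a linear item y = m*x + b.

def result_unknown(m, b, x):
    """'Result unknown' (concrete case): given x, compute y = m*x + b."""
    return m * x + b

def start_unknown(m, b, y):
    """'Start unknown': given y, solve y = m*x + b for x."""
    return (y - b) / m

print(result_unknown(4, 11, 3))   # y when x = 3 -> 23
print(start_unknown(4, 11, 43))   # x when y = 43 -> 8.0
```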
 
Each of the 24 students was given their problem set of 5 problems, and asked to solve each problem while “thinking aloud” and being audio recorded. Transcripts and student work were blocked such that one block was one student working one part of one problem. Blocks were coded for strategies, mistakes, and other issues the students had solving story problems (such as reading issues); inter-rater kappa values of 0.79 or higher were obtained with two coders.
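For reference, the inter-rater agreement statistic reported above (Cohen's kappa) can be computed as follows; this is a generic sketch with made-up labels, not the study's coding data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    assert len(a) == len(b) and a
    n = len(a)
    # Observed proportion of blocks on which the coders agree
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each coder's marginal label frequencies
    labels = set(a) | set(b)
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical codes for six blocks from two coders
coder1 = ["informal", "formal", "informal", "none", "informal", "formal"]
coder2 = ["informal", "formal", "informal", "informal", "informal", "formal"]
print(round(cohens_kappa(coder1, coder2), 2))  # -> 0.7
```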
 
Results showed that students regularly used informal, arithmetic approaches to solve result and start unknown story problems, especially when the problem had been personalized.  Personalized problems had the lowest “No Response” rate (1% No Response), the highest use of informal strategies (80% of time), and students overwhelmingly perceived personalized problems as being “easiest” when asked (82% of time). Personalized problems also had higher success rates and lower student use of “non-coordinative” strategies where situational reasoning was not well-connected to formal problem-solving computations. When asked why they were given story problems in algebra class, students described how these problems would help them in the real world and in the workplace.
 
However, personalized problems still had a relatively high overall use of non-coordinative approaches (16% of time), and students also struggled with reading on personalized problems at similar rates to other problems (also 16% of time; some overlap). Students’ overwhelming use of informal strategies when solving personalized problems could be framed as problematic in a course where the overall goal is to have students use symbolic equations as representational tools. Finally, there was evidence that students still sometimes epistemologically framed personalized problems as “school mathematics” tasks, disconnected from their lived experiences.
 
Quantitative analyses comparing performance on personalized story problems versus normal story problems were carried out, replicating the methodology of Koedinger & Nathan (2004). Students solved personalized problems correctly 61% of the time overall, and solved normal story problems correctly 45% of the time overall. However, using two 2-factor mixed model ANOVAs that treated students (ANOVA 1) and items (ANOVA 2) as random effects, no statistically reliable overall differences in performance were found between normal and personalized problems. “Items” in this case described the underlying mathematical structure of the story problem – e.g., the story described the equation “y=4x+11.” The two ANOVAs were repeated using only the hardest items, and using only the weakest students, and statistically reliable (p<.05), positive effects were found for personalization. The effect size (Cohen’s d) for the hardest problems was 0.9, and for the weakest students was 1.5.
 
These results need to be interpreted with caution: the sample was small (24 students), the personalization was done at a level of correspondence to real experiences that a computer could not replicate, and this was a population of students who overall were especially weak in mathematics.
 
   
 
   
=== Research Questions for In Vivo Study ===
  
 
* How will performance and time on task be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?
 
 
* How will [[robust learning]] be affected when [[personalization]] through relevant problem scenarios is implemented instead of the current problem scenarios in the [[cognitive tutor|Cognitive Tutor]] Algebra I software?
 
  
=== Independent Variables for In Vivo Study ===
  
This experiment manipulated the level of [[personalization]] through a two-group design:
*Control: Students who receive current Cognitive Tutor Algebra story problems for Unit 5
*Experimental: Students who receive problems that have the same mathematical structure, but whose cover stories are personalized to individual students based on an interests survey
 
<BR>
 
 
{| border="1" cellspacing="0" cellpadding="5" style="text-align: left;"
 
 
| Normal Cognitive Tutor Algebra problem scenarios || A skier noticed that she can complete a run in about 30 minutes.  A run consists of riding the ski lift up the hill, and skiing back down.  If she skis for 3 hours, how many runs will she have completed? || 54 randomly-assigned Algebra I students at LearnLab site
 
|-
 
| [[personalization|Personalized]] problem scenarios || (student selects personal interest in T.V. shows, cultural survey/interview shows strong interest among urban youth in reality shows)
 
You noticed that the reality shows you watch on T.V. are all 30 minutes long.  If you’ve been watching reality shows for 3 hours, how many have you watched?
 
 
|| 57 randomly-assigned Algebra I students at LearnLab site
 
|}
 
 
<BR>
 
=== Dependent Variables for In Vivo Study ===
  
[[Robust learning]] was measured through:  
 
 
 
 
*'''''Delayed Post-test''''' measuring [[long-term retention]]
 
** A pre-test was administered before Unit 5, and a delayed post-test was administered at the end of Unit 6.
* '''''Mastery of knowledge components''''' in the [[cognitive tutor|Cognitive Tutor]] software, including in subsequent units:  
**The students’ performance in Unit 7 was also examined, to see if there were performance differences between the experimental and control group even after the treatment was no longer in effect.
  
 
'''Intrinsic Motivation''' was measured through:
 
*Time on task in Cognitive Tutor software
 
  
=== Hypotheses for In Vivo Study ===
Students in the treatment with [[personalization|personalized]] problem scenarios will:

H1) Demonstrate higher levels of correct performance in Section 5

H2) Show improved “time on task” and fewer instances of “gaming the system” in Section 5

H3) Show improvement on some measures of [[robust learning]], as measured by pre/delayed post differences and by performance in subsequent sections.
=== Method for In Vivo Study ===
Interest surveys were administered to algebra students in Pennsylvania (N=47) and algebra students in Texas (N=29). The surveys contained sections where students ranked their interest in 9 different topics and answered 20 open-response questions about specific topics they were interested in. The algebra students in Texas also participated in one-on-one interviews about their out-of-school interests (part of the pilot study). Based on the results of the surveys and interviews, personally relevant problem scenarios corresponding to current problem scenarios in [[cognitive tutor|Cognitive Tutor]] Algebra I were formulated for Section 5, Linear Models and Independent Variables. The 27 problem scenarios from the selected section were rewritten to have 4 different variations each, corresponding to 9 different topics students were interested in (sports, music, movies, computers, stores, food, art, TV, games). The personally relevant problems had the same underlying mathematical structure as the original problems, with changes made to the objects or nouns (what the problem is about) in the story and the pronouns (who the problem is about). See the table above for an example of how these changes occurred. The personally relevant problem scenarios were reviewed by two master Algebra I teachers for language and clarity, and were modified based on teacher feedback.
 
The new problem scenarios were integrated into Unit 5 of the [[cognitive tutor|Cognitive Tutor]] Algebra software at the high school site with the cooperation of Carnegie Learning. 111 students at the school site were randomly assigned to either the experimental group (personalized problems) or the control group (normal problems). The experiment was in-sequence, meaning that all students encountered Section 5 at their own pace (i.e., at the time they naturally reached that point in the software). Immediately before students entered Unit 5, they were prompted to answer an interest survey where they ranked their level of interest in the 9 different topics, and took a pre-test where they solved two multi-part normal story problems. After the students completed Unit 6, they were given a delayed post-test.
 
=== Results ===
H1) Students receiving personalized problems will demonstrate higher levels of performance in Unit 5 than students receiving normal problems.
  
In order to test this hypothesis, a logistic regression model was formulated with the following properties. The unit of analysis was one student solving one part of one problem.
  
1) Dependent Variable – whether the student got the problem part correct on their first attempt, without asking for a hint.

2) Random Effects – the student ID, the item (the linear function underlying the problem), and the problem name (which personalized version the student was given, or which set of numbers the student was given for result and start unknowns)

3) Fixed Effects – Condition (whether the student was in the experimental or control group) and which knowledge component was covered by the problem part
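In symbols, the specification above can be summarized as a mixed-effects logistic regression (this is our notation for the model described on this page, not the original analysis code; the normality assumption on the random effects is the conventional one):

```latex
\Pr(\mathrm{correct}_{sip} = 1)
  = \sigma\!\left( \beta_0
      + \beta_{\mathrm{cond}}\,\mathrm{Cond}_s
      + \beta_{\mathrm{KC}(p)}
      + u_s + v_i + w_p \right),
\qquad
u_s \sim N(0,\sigma_u^2),\quad v_i \sim N(0,\sigma_v^2),\quad w_p \sim N(0,\sigma_w^2)
```

Here σ is the logistic function, ''s'' indexes students, ''i'' indexes items (underlying linear functions), and ''p'' indexes problem versions; the condition coefficient is the fixed effect of interest.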
  
Each of these effects significantly improved the model. Interactions did not significantly improve the model. The main effect for the treatment (personalization) was statistically significant at the 5% level. Personalization had a positive overall effect on student performance. The size of the overall impact of personalization on performance was around 5.3%. If a student had a 50% base chance of getting a problem correct on the first attempt, personalization would increase that chance to 55.3%.
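Because the model is logistic, a fixed treatment coefficient shifts the log-odds of success rather than the probability directly; the arithmetic behind the 50% → 55.3% illustration above can be checked as follows (illustrative only):

```python
import math

def logit(p):
    """Probability -> log-odds."""
    return math.log(p / (1 - p))

def sigmoid(x):
    """Log-odds -> probability."""
    return 1 / (1 + math.exp(-x))

# Treatment coefficient implied by a 50% -> 55.3% first-attempt success shift.
# Since logit(0.5) = 0, beta is simply logit(0.553) (about 0.21 on the log-odds scale).
beta = logit(0.553) - logit(0.5)

print(round(sigmoid(logit(0.5) + beta), 3))  # -> 0.553
```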
  
Although interaction terms were not significant in this model, this seemed to reflect a combination of limited statistical power and the large number of parameters added when interactions were modeled. Thus a second model was specified in which the knowledge components were classified as easy, medium, and hard, and here there was a significant condition by knowledge component interaction. Personalization had a significantly larger, positive impact on the two most difficult knowledge components, relating to writing symbolic expressions, compared to the medium difficulty knowledge components. For the most difficult knowledge components, personalization increased success rates from 50% to 58%.
 
  
More results coming soon.
  
 
  
 
=== References ===
 
Anderman, E., & Anderman, L. (2010). Classroom Motivation. Pearson: Columbus, OH.
  
 
Clark, R. C. & Mayer, R. E. (2003). E-Learning and the Science of Instruction. Jossey-Bass/Pfeiffer.
 
  
 
Nathan, M., Kintsch, W., & Young, E. (1992).  A theory of algebra-word-problem comprehension and its implications for the design of learning environments.  Cognition and Instruction, 9(4), 329-389.
 
Ormrod, J. (2008). Human Learning. Pearson/Merrill/Prentice Hall: Columbus, OH.
Mayer, R. (2011). Applying the Science of Learning. Pearson.
  
 
[[Stoichiometry_Study|McLaren, B., Koedinger, K., & Yaron, D. (2006). Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems. The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org]]
 

Revision as of 21:47, 16 November 2010

The Effect of Context Personalization on Problem Solving in Algebra

Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman

Summary Tables

PIs Candace Walkington & Anthony Petrosino
Other Contributors
  • Graduate Student: Milan Sherman
  • Staff: Jim Greeno


Pilot Study

Study Start Date September 2008
Study End Date May 2009
Study Site Austin, TX
Number of Students N = 24
Average # of hours per participant 2 hrs.


In Vivo Study

Study Start Date October 2009
Study End Date April 2010
LearnLab Site Hopewell High
LearnLab Course Algebra
Number of Students N = 111
Average # of hours per participant 3 hr
Data in DataShop Yes - Personalization Hopewell 2010


Abstract

In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers have suggested that some of the Cognitive Tutor problem scenarios may be disconnected from the lives and experiences of many students. This study investigated whether students’ personal interest in story contexts affects performance and robust learning.

The first stage of this research was a pilot study of the personal interests of students at an urban Texas high school. Freshman algebra students were surveyed and interviewed about their out-of-school interests, and were also asked to describe how they use mathematics in their everyday lives. Twenty-four of these students solved a number of Cognitive Tutor Algebra-style problems while thinking aloud. The results of this pilot study were analyzed with qualitative methods to critically examine the idea that personalization of story problems has the potential to support student learning.

The second stage of this research was an “in vivo” study that took place in Fall of 2009 at a Pennsylvania LearnLab site. Based on the results of the pilot study and additional student surveys from Pennsylvania, the 27 problems in Section 5 (“Linear Models and Independent Variables”) of the Cognitive Tutor software were rewritten to each have 4 “personalized” versions corresponding to different student interests. The Cognitive Tutor software was programmed to give participating students an initial interests survey, and then select problem scenarios that matched their interests. The resulting robust learning, measured by a delayed post-test (long-term retention) and by mastery of knowledge components in a later section (transfer), was analyzed with a 2-group design (experimental vs. control) to measure the effect of personalization on learning. Measures from within Section 5 were also analyzed to measure the effect of personalization on performance.

Background and Significance

This research direction was initiated by the observation of classrooms in Texas using the Cognitive Tutor Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching practice. Teachers explained that their urban students found problems about harvesting wheat “silly,” “dry,” and irrelevant. Teachers also reported that some of the vocabulary words in the Cognitive Tutor problem scenarios (one example was the word “greenhouse”) confused their students because urban freshmen do not typically discuss these topics in their everyday speech. A review of the literature showed limited evidence for the potential of relevant story contexts to increase learning, and little research had been done at the secondary school level. This study was designed to empirically test the claim that the personal relevance of story problems affects robust learning and performance.

Theoretical Framework

This study is situated in the new “Motivation and Metacognition” thrust. The foundation of this study is that the relevance of problem scenarios affects robust learning through increased intrinsic motivation (Cordova & Lepper, 1996). For learners who have the cognitive capacity to solve algebra story problems, enhancing motivation may increase the effort they exert to make sense of the scenarios by forming more elaborated and better-connected situation and problem models (Nathan, Kintsch, & Young, 1992), thus encouraging generative processing (Mayer, 2011). Mayer (2011) states the personalization principle as “People learn better when the instructor uses conversational style rather than formal style” (p. 70). Here, we use the PSLC’s modified version of this principle, which states that matching the features of an instructional component to students’ personal interests, experiences, or typical patterns of language use will lead to more robust learning through increased motivation, compared to when instruction is not personalized. This is related to what Mayer (2011) refers to as the “Anchoring” principle.

Personalization is hypothesized to enhance intrinsic motivation through increased personal interest (also called individual interest). Personal interests are stable, enduring preferences that individual learners bring with them to different situations (Anderman & Anderman, 2010). Interest promotes more effective processing of information and greater cognitive engagement. Students who have high interest may be more likely to relate new knowledge to prior knowledge and to form more connections between ideas. They may also be more likely to generate inferences, examples, and applications relating to the subject area they are trying to learn (Ormrod, 2008).

Pilot Study

The first stage of this research began in Fall of 2008 with a pilot study of personalization at an "Academically Unacceptable" school in Texas (75% free/reduced lunch). Twenty-four freshman algebra students were interviewed about their out-of-school interests, such as sports, music, and movies, and were also asked to describe how they use mathematics in their everyday lives. These interviews were audio recorded and were used to write “personalized” algebra story problems for each student. The research questions being investigated were:

1) What is the impact of personalizing algebra story problems to individual student experiences, in terms of strategy use, language comprehension, and students’ epistemological frames about mathematical activity? (qualitative)

2) How does personalizing algebra story problems to individual experiences impact student performance, when compared to their performance on normal story problems from the Cognitive Tutor curriculum with the same underlying structure? (quantitative)

A problem set containing five algebra problems on linear functions was written for each student; two of these were story problems that were personalized to the ways in which the individual student described using mathematics in their everyday life during their initial interview. The problem set also contained normal story problems from the Cognitive Tutor curriculum, completely abstract symbolic equations, story problems that contained symbolic equations, and story problems with simplified language and general referents (“generic” story problems). Each problem had four parts – the first two parts were “Result Unknowns” or “concrete cases” (i.e. solve for y given this x), and the fourth and final part was a “Start Unknown” (i.e. solve for x given this y). For normal, personalized, and generic problems, the third part of each problem asked students to write a general symbolic equation or “algebra rule” representing the story. For normal story problems that already contained equations, students were asked to interpret the parameters in terms of the story. For completely abstract symbolic problems, students were asked to write a story that could go with the equation.

Each of the 24 students was given their problem set of 5 problems and asked to solve each problem while “thinking aloud” and being audio recorded. Transcripts and student work were blocked such that one block was one student working one part of one problem. Blocks were coded with strategies, mistakes, and other issues the students had solving story problems (such as reading issues); kappa values of 0.79 or higher were obtained using two coders.
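Agreement figures of this kind are typically Cohen's kappa, which corrects raw agreement for the agreement two coders would reach by chance. A minimal sketch of the computation (the coder labels below are hypothetical, not the study's actual coding scheme):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' categorical codes on the same blocks."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of blocks both coders labeled identically.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement: expected from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for 10 transcript blocks from two coders.
coder1 = ["informal", "informal", "symbolic", "no-response", "informal",
          "symbolic", "informal", "symbolic", "informal", "no-response"]
coder2 = ["informal", "informal", "symbolic", "no-response", "informal",
          "symbolic", "symbolic", "symbolic", "informal", "no-response"]
print(round(cohens_kappa(coder1, coder2), 2))  # 9/10 raw agreement -> kappa ~0.84
```

Note that kappa is lower than raw agreement (0.84 vs. 0.90 here) precisely because some agreement is expected by chance.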

Results showed that students regularly used informal, arithmetic approaches to solve result and start unknown story problems, especially when the problem had been personalized. Personalized problems had the lowest “No Response” rate (1% No Response), the highest use of informal strategies (80% of time), and students overwhelmingly perceived personalized problems as being “easiest” when asked (82% of time). Personalized problems also had higher success rates and lower student use of “non-coordinative” strategies where situational reasoning was not well-connected to formal problem-solving computations. When asked why they were given story problems in algebra class, students described how these problems would help them in the real world and in the workplace.

However, personalized problems still had a relatively high overall use of non-coordinative approaches (16% of time), and students also struggled with reading on personalized problems at similar rates to other problems (also 16% of time; some overlap). Students’ overwhelming use of informal strategies when solving personalized problems could be framed as problematic in a course where the overall goal is to have students use symbolic equations as representational tools. Finally, there was evidence that students still sometimes epistemologically framed personalized problems as “school mathematics” tasks, disconnected from their lived experiences.

Quantitative analyses aimed specifically at comparing performance on personalized story problems versus normal story problems were carried out, replicating the methodology of Koedinger & Nathan (2004). Students solved personalized problems correctly 61% of the time overall, and solved normal story problems correctly 45% of the time overall. However, using two 2-factor mixed-model ANOVAs that treated students (ANOVA 1) and items (ANOVA 2) as random effects, no statistically reliable overall differences in performance were found between normal and personalized problems. “Items” in this case described the underlying mathematical structure of the story problem – e.g., the story described the equation “y=4x+11.” The two ANOVAs were repeated using only the hardest items, and using only the weakest students, and statistically reliable (p<.05), positive effects were found for personalization. The effect size (Cohen’s d) was 0.9 for the hardest problems and 1.5 for the weakest students.
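For readers unfamiliar with the effect-size metric, Cohen's d is the difference between two group means divided by their pooled standard deviation. A small illustrative sketch (the per-student success rates below are made up for illustration, not the study's data):

```python
import math

def cohens_d(x, y):
    """Cohen's d: standardized mean difference using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    # Sample variances (n - 1 denominator).
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

# Hypothetical per-student success rates on personalized vs. normal problems.
personalized = [0.8, 0.6, 0.7, 0.9, 0.5]
normal = [0.5, 0.4, 0.6, 0.5, 0.3]
print(round(cohens_d(personalized, normal), 2))
```

By the usual rule of thumb, d around 0.8 or above is a "large" effect, so the reported values of 0.9 and 1.5 are substantial, subject to the caveats below.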

These results need to be interpreted with caution, as this was a small sample size (24 students), the personalization was done at a level of correspondence to real experiences that a computer could not replicate, and this was a population of students who overall were especially weak in mathematics.

Research Questions for In Vivo Study

  • How will performance and time on task be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?
  • How will robust learning be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?

Independent Variables for In Vivo Study

The experiment manipulated the level of personalization through a two-group design:

  • Control: Students who receive current Cognitive Tutor Algebra story problems for Unit 5
  • Experimental: Students who receive problems that have the same mathematical structure, but whose cover stories are personalized to individual students based on an interests survey


Treatment: Normal Cognitive Tutor Algebra problem scenarios
Example Problem: A skier noticed that she can complete a run in about 30 minutes. A run consists of riding the ski lift up the hill, and skiing back down. If she skis for 3 hours, how many runs will she have completed?
Received By: 54 randomly-assigned Algebra I students at the LearnLab site

Treatment: Personalized problem scenarios (student selects personal interest in T.V. shows; cultural survey/interview shows strong interest among urban youth in reality shows)
Example Problem: You noticed that the reality shows you watch on T.V. are all 30 minutes long. If you’ve been watching reality shows for 3 hours, how many have you watched?
Received By: 57 randomly-assigned Algebra I students at the LearnLab site


Dependent variables for In Vivo Study

Robust learning was measured through:

  • Delayed Post-test measuring long-term retention
    • A pre-test was administered before Unit 5, and a delayed post-test was administered at the end of Unit 6.
  • Mastery of knowledge components in the Cognitive Tutor software, including in subsequent units:
    • The students’ performance in Unit 7 was also examined, to see if there were performance differences between the experimental and control group even after the treatment was no longer in effect.

Intrinsic motivation was measured through:

  • Hint-seeking and reading behavior in Cognitive Tutor software
  • Time on task in Cognitive Tutor software

Hypotheses for In Vivo Study

Students in the treatment with personalized problem scenarios will:

H1) Demonstrate higher levels of correct performance in Section 5

H2) Show improved “time on task” and fewer instances of “gaming the system” in Section 5

H3) Show improvement on some measures of robust learning, as measured by pre/delayed post differences and by performance in subsequent sections.

Method for In Vivo Study

Interest surveys were administered to algebra students in Pennsylvania (N=47) and algebra students in Texas (N=29). The surveys contained sections where students ranked their interest in 9 different topics and answered 20 open-response questions about specific topics they were interested in. The algebra students in Texas also participated in one-on-one interviews about their out-of-school interests (part of the pilot study). Based on the results of the surveys and interviews, personally relevant problem scenarios corresponding to current problem scenarios in Cognitive Tutor Algebra I were formulated for Section 5, Linear Models and Independent Variables. Twenty-seven problem scenarios from the selected section were rewritten to have 4 different variations for each problem scenario, corresponding to the 9 different topics students were interested in (sports, music, movies, computers, stores, food, art, TV, games). The personally relevant problems had the same underlying mathematical structure as the original problems, with changes made to the objects or nouns (what the problem is about) in the story and the pronouns (who the problem is about). See the table above for an example of how these changes occurred. The personally relevant problem scenarios were reviewed by two master Algebra I teachers for language and clarity and were modified based on teacher feedback.

The new problem scenarios were integrated into Unit 5 of the Cognitive Tutor Algebra software at the high school site with the cooperation of Carnegie Learning. A total of 111 students at the school site were randomly assigned to either the experimental group (personalized problems) or the control group (normal problems). The experiment was in-sequence, meaning that all students encountered Section 5 at their own pace (i.e., at the time they naturally reached that point in the software). Immediately before students entered Unit 5, they were prompted to answer an interest survey where they ranked their level of interest in the 9 different topics, and took a pre-test where they solved two multi-part normal story problems. After the students completed Unit 6, they were given a delayed post-test.

Results

H1) Students receiving personalized problems will demonstrate higher levels of performance in Unit 5 than students receiving normal problems.

In order to test this hypothesis, a mixed-effects logistic regression model was formulated with the following properties. The unit of analysis was one student solving one part of one problem.

1) Dependent Variable – whether the student got the problem part correct on their first attempt, without asking for a hint.

2) Random Effects – the student ID, the item (linear function underlying the problem), and the problem name (which personalized version the student was given, or which set of numbers the student was given for result and start unknowns).

3) Fixed Effects – Condition (whether the student was in the experimental or control group) and which knowledge component was covered by the problem part.

Each of these effects significantly improved the model. Interactions did not significantly improve the model. The main effect for the treatment (personalization) was statistically significant at the 5% level: personalization had a positive overall effect on student performance. The size of the overall impact of personalization on performance was around 5.3 percentage points: if a student had a 50% base chance of getting a problem correct on the first attempt, personalization would increase that chance to 55.3%.
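In a logistic regression, the treatment effect is estimated on the log-odds scale, so the size of the probability shift depends on the base rate. A back-of-the-envelope check of the numbers above (the log-odds coefficient here is back-calculated from the reported 50% and 55.3% figures for illustration; it is not a coefficient reported by the study):

```python
import math

def sigmoid(z):
    """Inverse of the logit: map log-odds back to a probability."""
    return 1 / (1 + math.exp(-z))

def logit(p):
    """Log-odds of a probability p."""
    return math.log(p / (1 - p))

base = 0.5  # 50% base chance of first-attempt success; logit(0.5) = 0
# Treatment log-odds implied by the reported shift from 50% to 55.3%.
beta = logit(0.553) - logit(base)
print(round(beta, 3))                         # implied treatment coefficient
print(round(sigmoid(logit(base) + beta), 3))  # recovers the 55.3% success rate
```

Because the mapping is nonlinear, the same coefficient produces a smaller percentage-point shift for students whose base rate is far from 50%, which is one reason interactions with item difficulty (discussed next) are worth examining.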

Although interaction terms were not significant in this model, this seemed to reflect a combination of limited statistical power and the large number of parameters added when interactions were modeled. Thus a second model was specified in which the knowledge components were classified as easy, medium, or hard, and here there was a significant condition-by-knowledge-component interaction. Personalization had a significantly larger, positive impact on the two most difficult knowledge components, which related to writing symbolic expressions, compared to the medium-difficulty knowledge components. For the most difficult knowledge components, personalization increased success rates from 50% to 58%.

More results coming soon.


References

Anderman, E., & Anderman, L. (2010). Classroom Motivation. Columbus, OH: Pearson.

Clark, R. C. & Mayer, R. E. (2003). E-Learning and the Science of Instruction. Jossey-Bass/Pfeiffer.

Cordova, D. I. & Lepper, M. R. (1996). Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice. Journal of Educational Psychology, 88(4), 715-730.

Eskenazi, M.; Juffs, A., Heilman, M., Collins-Thompson, K., Wilson, L., & Callen, J. (2006). REAP Study on Personalization of Readings by Topic (Fall 2006). The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org

Koedinger, K. R. (2001). Cognitive tutors as modeling tool and instructional model. In Forbus, K. D. & Feltovich, P. J. (Eds.) Smart Machines in Education: The Coming Revolution in Educational Technology. Menlo Park, CA: AAAI/MIT Press.

Mayer, R. (2011). Applying the Science of Learning. Pearson.

McLaren, B., Koedinger, K., & Yaron, D. (2006). Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems. The PSLC Wiki. Retrieved June 21, 2007, from http://www.learnlab.org

Nathan, M., Kintsch, W., & Young, E. (1992). A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition and Instruction, 9(4), 329-389.

Ormrod, J. (2008). Human Learning. Columbus, OH: Pearson/Merrill/Prentice Hall.