DiBiano Personally Relevant Algebra Problems
Robust Learning in Culturally and Personally Relevant Algebra Problem Scenarios
Candace Walkington (DiBiano), Anthony Petrosino, Jim Greeno, and Milan Sherman
Summary Tables
PIs | Candace Walkington & Anthony Petrosino
Other Contributors | Jim Greeno & Milan Sherman
Pre Study (Think-Alouds)
Study Start Date | September 2008
Study End Date | May 2009
Study Site | Austin, TX
Number of Students | N = 29
Average # of hours per participant | 3 hrs.
Full Study
Study Start Date | October 2009
Study End Date | April 2010
LearnLab Site | Hopewell High
LearnLab Course | Algebra
Number of Students | N = 111
Average # of hours per participant | 3 hrs.
Data in DataShop | Yes - Personalization Hopewell 2010
Abstract
In the original development of the PUMP Algebra Tutor (PAT), teachers had designed the algebra problem scenarios to be "culturally and personally relevant to students" (Koedinger, 2001). However, observations and discussions with teachers in Austin ISD suggest that the problem scenarios are disconnected from the lives of typical urban students. This study examines whether, and through what mechanisms, familiarity with the context of a problem scenario affects comprehension and robust learning. We will use the medium of Cognitive Tutor Algebra for the in-vivo portion of this study, but our aim is not to improve the quality of the software's problem scenarios. It is instead to study how student diversity affects cognition, motivation, and learning, by using a computer system to do what classroom teachers cannot: personalize each problem to the background and interests of each individual student.
The research began in Fall of 2008 with a study of the personal interests of urban students at an "Academically Unacceptable" school in Austin, TX (75% free/reduced lunch). Freshman algebra students were surveyed and interviewed about their interests, such as sports, music, and movies, as well as how they use mathematics in their everyday lives. Students were also asked to solve a number of Cognitive Tutor problems, rewritten to have varying levels of "relevancy," while thinking aloud. Results of this study were used to rewrite the algebra problem scenarios in one section of the Cognitive Tutor software, Section 5, "Linear Models and Independent Variables." In Fall of 2009, at the Pittsburgh Learnlab site, the Cognitive Tutor software was programmed to give students an initial interests survey and then select problem scenarios that match their reported interests. The resulting robust learning, measured by delayed post-test, curriculum progress, and mastery of knowledge components, will be analyzed with a 2-group design (experimental vs. control) to measure the effects of the personalization.
Background and Significance
This research direction was initiated by observations of classrooms in Austin, Texas using the Cognitive Tutor Algebra I software, as well as discussions with teachers who had implemented this software at some point in their teaching careers. Teacher complaints were consistently centered not on the interface, the feedback, or the cognitive model of the software, but on the problem scenarios. Teachers explained that their urban students found problems about harvesting wheat "silly," "dry," and irrelevant. Teachers also complained that some of the vocabulary words in the Cognitive Tutor problem scenarios (one example was the word "greenhouse") confused their students, because urban freshmen do not typically discuss these topics in their everyday speech. It is important to note that, as part of the development of the PUMP Algebra Tutor (PAT), teachers had designed problems to be "culturally and personally relevant to students" (Koedinger, 2001). This research is designed to empirically test the claim that the cultural and personal relevance of problem scenarios affects robust learning.
Research Questions
- How will performance and time on task be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?
- How will robust learning be affected when personalization through relevant problem scenarios is implemented instead of the current problem scenarios in the Cognitive Tutor Algebra I software?
Independent variables
This experiment will manipulate level of personalization through two treatment groups:
- Students receive current Cognitive Tutor Algebra problems
- Students receive matched culturally relevant Cognitive Tutor Algebra problems personalized according to student interest survey
Treatment | Example Problem | Received By
Normal Cognitive Tutor Algebra problem scenarios | A skier noticed that she can complete a run in about 30 minutes. A run consists of riding the ski lift up the hill and skiing back down. If she skis for 3 hours, how many runs will she have completed? | 54 randomly-assigned Algebra I students at the Learnlab site
Relevant personalized problem scenarios | (Student selects a personal interest in T.V. shows; the cultural survey/interviews showed strong interest among urban youth in reality shows.) You noticed that the reality shows you watch on T.V. are all 30 minutes long. If you've been watching reality shows for 3 hours, how many have you watched? | 57 randomly-assigned Algebra I students at the Learnlab site
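For illustration, both versions in the example above instantiate the same underlying linear model; only the cover story (the nouns and pronouns) changes. Writing t for the elapsed time in minutes and n for the number of runs (or shows), the model is t = 30n, so for t = 180 minutes the answer is n = 180/30 = 6.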
Hypothesis
Students in the treatment with personally relevant problem scenarios will show improved performance on measures of robust learning as a result of two factors:
- Increased intrinsic motivation (such as with the REAP Tutor study)
- Formation of a more detailed and meaningful situation model (Nathan, Kintsch, & Young, 1992).
Dependent variables
Robust learning will be measured through:
- Normal Post-test measuring transfer of learning to different problem contexts (including abstract problems).
- Delayed Post-test measuring long-term retention
- Curriculum progress and Mastery of knowledge components in the Cognitive Tutor software, including in subsequent units:
- The students' progress through the knowledge components in the curriculum will measure accelerated future learning by reflecting the latency with which students master knowledge components and curriculum sections that build on those affected by the culturally relevant problem scenarios.
Intrinsic Motivation will be measured through two log-based indicators (a computational sketch follows the list):
- Hint-seeking and reading behavior in Cognitive Tutor software
- Time on task in Cognitive Tutor software
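A minimal sketch of how these two indicators could be computed from tutor transaction logs is shown below. The record format and field names ('student', 'duration_sec', 'is_hint_request') are assumptions made for illustration; they do not reflect the actual Cognitive Tutor or DataShop export schema.

 # Minimal sketch, assuming each log row is a dict with the fields named below.
 from collections import defaultdict

 def summarize_motivation_measures(transactions):
     """Aggregate time on task (minutes) and hint-request rate per student."""
     time_on_task = defaultdict(float)   # total seconds per student
     hint_requests = defaultdict(int)    # number of hint requests per student
     actions = defaultdict(int)          # total logged actions per student

     for row in transactions:
         sid = row['student']
         time_on_task[sid] += row['duration_sec']
         actions[sid] += 1
         if row['is_hint_request']:
             hint_requests[sid] += 1

     return {sid: {'time_on_task_min': time_on_task[sid] / 60.0,
                   'hint_rate': hint_requests[sid] / actions[sid]}
             for sid in actions}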
Method
This experiment began in the Fall of 2008 with a study of student interests. An interests survey was administered to high school classes in Austin ISD that contain a high proportion of diverse students, as well as at a Pittsburgh Learnlab site. Structured in-depth interviews relating to student interests were conducted with around 29 of the surveyed students. Based on the results of the survey and interviews, culturally relevant problem scenarios corresponding to the current problem scenarios in Cognitive Tutor Algebra I were written for Section 5, Linear Models and Independent Variables. Approximately 27 problem scenarios from the selected section were replaced, with 4 variations on each problem scenario corresponding to different student interests, in order to achieve personalization. I wrote these problem scenarios while consulting with Jim Greeno and Milan Sherman; they had the same underlying mathematics as the original Cognitive Tutor problems, with changes to the objects or nouns (what the problem is about) and the pronouns (who the problem is about). See the table above for an example of how these two changes occurred.
The culturally relevant problem scenarios were reviewed by two master Algebra I teachers. In a pilot study, 24 Algebra I students participated in think-aloud protocols in which they solved five story problems, based on Cognitive Tutor problems, with varying degrees of relevancy.
The new problem scenarios were integrated into the Cognitive Tutor Algebra software in Summer 2009 with the cooperation of Carnegie Learning. Once the new problem scenarios were placed into the software, they were used in an in vivo experiment at a Learnlab school site in Pittsburgh by 57 randomly-assigned students during the 2009-10 school year. An additional 54 randomly-assigned students received the regular problem scenarios. See the table above for a description of the two treatment groups in this study.
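The matching step in the personalized condition can be pictured roughly as follows. This is only a sketch under assumptions: the actual Carnegie Learning implementation is not described on this page, and the function, data structures, and the 'sports' variant text below are illustrative inventions.

 # Illustrative sketch only; not the actual Cognitive Tutor selection code.
 # Each base scenario is assumed to have up to 4 interest-tagged rewrites.
 def select_scenario(base_scenario, variants_by_interest, ranked_interests):
     """Return the variant matching the student's highest-ranked survey interest,
     falling back to the original scenario if no variant matches."""
     for interest in ranked_interests:
         if interest in variants_by_interest:
             return variants_by_interest[interest]
     return base_scenario

 # Hypothetical usage (the 'sports' scenario text is invented for illustration):
 variants = {
     'reality TV': "You noticed that the reality shows you watch on T.V. are all "
                   "30 minutes long. If you've been watching reality shows for "
                   "3 hours, how many have you watched?",
     'sports': "Your pickup basketball games each last about 30 minutes. If you "
               "play for 3 hours, how many games will you have played?",
 }
 print(select_scenario("A skier noticed that she can complete a run in about 30 minutes...",
                       variants,
                       ranked_interests=['reality TV', 'music']))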
To summarize, the experiment had the following progression:
(1) Survey (paper & online) of student interests administered in Austin ISD and at the Learnlab site
(2) Based on survey data, structured interviews on students' out-of-school interests were conducted
(3) Based on the interest interviews, 24 students participated in think-alouds where they each solved 5 problems with different degrees of relevancy
(4) Relevant problem scenarios for Section 5 were written by Candace Walkington & Milan Sherman and reviewed by 2 master algebra teachers
(5) One Cognitive Tutor Algebra unit was replaced at a Learnlab site with a randomized control (in-sequence) setup, N = 111
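One simple way the resulting 2-group comparison (personalized vs. control, N = 111) could be carried out is sketched below. This is an assumption about the analysis rather than a description of the study's actual statistical procedure, and the score lists would come from the delayed post-test data (not shown here).

 # Sketch only: two-group comparison of delayed post-test scores.
 from scipy import stats

 def compare_groups(personalized_scores, control_scores):
     """Welch's t-test comparing the personalized and control groups."""
     t_stat, p_value = stats.ttest_ind(personalized_scores, control_scores,
                                       equal_var=False)  # unequal group sizes
     return t_stat, p_value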
Explanation
This study is situated in the new "Motivation and Metacognition" thrust. The foundation of this study is that the relevance of problem scenarios affects robust learning through the formation of situation models, defined as mental representations of the relationships, actions, and events in a problem (Nathan, Kintsch, & Young, 1992), as well as through intrinsic motivation (Cordova & Lepper, 1996). Our hypothesis is that personalized problems would cause students to create more detailed and meaningful situation models through enhanced problem comprehension and implicit problem knowledge. This would in turn affect the topology of the learning event space and/or path choices, causing students to use different strategies or paths ("blue-line" vs. "red-line"), as relevant problems are more likely to help students encode deep, relevant features and/or avoid encoding shallow, irrelevant features. Another facet of this hypothesis is that personalized problems would enhance intrinsic motivation, which would increase focus of attention on the problem, contributing both to the formation of detailed situation models and to a more general enhancement of engagement and time on task (relating to "path effects").
References
Clark, R. C. & Mayer, R. E. (2003). E-Learning and the Science of Instruction. Jossey-Bass/Pfeiffer.
Cordova, D. I. & Lepper, M. R. (1996). Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice. Journal of Educational Psychology, 88(4), 715-730.
Koedinger, K. R. (2001). Cognitive tutors as modeling tool and instructional model. In Forbus, K. D. & Feltovich, P. J. (Eds.) Smart Machines in Education: The Coming Revolution in Educational Technology. Menlo Park, CA: AAAI/MIT Press.
Nathan, M., Kintsch, W., & Young, E. (1992). A theory of algebra-word-problem comprehension and its implications for the design of learning environments. Cognition and Instruction, 9(4), 329-389.