Geometry Greatest Hits

From LearnLab
Revision as of 16:44, 18 May 2009 by Kirsten-Butcher (talk | contribs) (Dependent Variables)


Summary Table

Study 1

PIs Vincent Aleven, Ryan Baker, Kirsten Butcher, & Ron Salden
Other Contributors Octav Popescu (Research Programmer, CMU HCII), Jessica Kalka (Research Associate, CMU HCII)
Study Start Date January, 2009
Study End Date March, 2009
LearnLab Site Greenville, Riverview, Steel Valley
LearnLab Course Geometry
Number of Students 98
Total Participant Hours
DataShop Log data soon to be uploaded and available in the DataShop


The main idea of the current project is to combine instructional interventions derived from four instructional principles. Each of these interventions has been shown to be effective in separate (PSLC) studies, and on theoretical grounds they can be expected to be synergistic (or complementary). We hypothesize that instruction that simultaneously implements several principles will be dramatically more effective than instruction that implements none of the targeted principles (e.g., current common practice), especially if the principles are tied to different learning mechanisms. This project will test this hypothesis, focusing on the following four principles:

Building on our prior work that tested these principles individually, we have created a new version of the Geometry Cognitive Tutor that implements these four principles. We have conducted an in-vivo experiment, and will conduct a lab experiment, to test the hypothesis that the combination of these principles produces a large effect size compared to the standard Cognitive Tutor, which does not support any of these principles, or supports them less strongly.

Background & Significance

The PSLC’s in-vivo methodology, as well as standard practice in learning science, generally focuses on testing one principle at a time. This approach is useful for understanding which principles work, how they work, and what their boundary conditions are. However, it is also useful to test combinations of principles, because it elucidates boundary conditions and explores the degree to which principles are complementary or synergistic.

Knowing which instructional interventions and principles are synergistic (as well as when interventions and principles do not have any additive effects) is also an important practical goal within the learning sciences. Instructional designers often use principles in combination (e.g., Anderson et al., 1995; Quintana et al., 2004); knowing which combinations are effective in concert is therefore pragmatically useful.

Intelligent Tutoring Systems have been proven to be more effective than typical classroom instruction. Can principle-oriented research make them even more effective? Can demonstrable impact in the classroom be strengthened by combining principles from successful in vivo studies? And will such a combination lead to a large effect size?



A tutor that uses multiple PSLC learning principles in combination, each of which has been validated to improve robust learning when applied individually, will achieve a significantly larger effect size over an unmodified tutor than any single principle achieves on its own.

Completed experiments

  • In vivo study: A two-condition study comparing the baseline tutor to a modified tutor with all four improvements. Measures of learning gains (including robust learning measures) and learning efficiency (time taken to complete the tutor) were used.

Planned experiments

  • Lab study (2 phases):
    • (1) A two-condition study (comparing the baseline tutor to the modified tutor with all four improvements) testing overall student learning (including measures of robust learning) and efficiency in one tutor unit (Angles).
    • (2) Think-aloud (lab) research to determine if worked-examples and visual interaction have the hypothesized, complementary process effects.

Independent variables

The Greatest Hits version of the tutor had the following features, which are supported by prior PSLC research:

  • integrated problem format (symbolic information integrated into the diagram; all interaction happens in the diagram)
  • non-interactive conceptual example sets at the beginning of each curricular unit
  • interactive worked examples at the beginning of each curricular unit, faded in an individualized manner
  • diagrammatic self-explanations of incorrect steps
  • tuned knowledge-tracing parameters that yield better individualized problem sequences (avoiding both over-practice and under-practice)
  • a new knowledge-tracing algorithm that estimates the probability of guesses and slips in a contextual manner (improving the accuracy of the student model, which in turn better individualizes problem sequences)
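For readers unfamiliar with knowledge tracing, the last two features operate on the standard Bayesian Knowledge Tracing update, in which the tutor maintains a probability that the student knows each skill and revises it after every step attempt using guess, slip, and learning-transition parameters. The sketch below shows that standard update only; the parameter values are illustrative, not the tuned values from this study, and the contextual variant mentioned above would replace the fixed guess and slip constants with per-observation estimates while leaving the update equations unchanged.

```python
# Minimal sketch of the standard Bayesian Knowledge Tracing (BKT) update.
# Parameter values below are illustrative, not the study's tuned values.

def bkt_update(p_know: float, correct: bool,
               guess: float, slip: float, transit: float) -> float:
    """Return the updated P(skill known) after observing one step attempt."""
    if correct:
        # A correct step: either the student knows the skill (and did not
        # slip) or does not know it and guessed.
        cond = (p_know * (1 - slip)) / (
            p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        # An incorrect step: either a slip despite knowing the skill, or a
        # genuine error from not knowing it.
        cond = (p_know * slip) / (
            p_know * slip + (1 - p_know) * (1 - guess))
    # Learning transition: the skill may be acquired at this opportunity.
    return cond + (1 - cond) * transit

# Example: starting from P(known) = 0.5, a correct answer raises the
# estimate and an incorrect answer lowers it.
after_correct = bkt_update(0.5, True, guess=0.2, slip=0.1, transit=0.3)
after_error = bkt_update(0.5, False, guess=0.2, slip=0.1, transit=0.3)
```

Avoiding over- and under-practice then falls out of this estimate: the tutor stops presenting problems for a skill once P(known) crosses a mastery threshold, so a more accurate estimate yields better individualized problem sequences.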

Dependent Variables

  • Problem-Solving Items.

Problem-solving items have a format similar to the tutor's: students must use known information to calculate the measure of an angle and must justify each problem-solving step with a relevant geometry rule. These items also contain several types of new tasks. First, students must make a solvability judgment to determine whether enough information is known to solve the step. Second, for unsolvable ("false") items, students must explain how the problem could be made solvable. Third, for solvable items, students must explain which diagram elements apply to the geometry rule used in the problem-solving step.

  • Reasoning Items.

Students also complete reasoning items, which assess how well they understand the conceptual geometry relationships by which one diagram feature is used to solve for others. For these items, students indicate whether the angles associated with a given geometry rule can be found.


Not yet available.


Further Information


Annotated Bibliography


Anderson, J. R., Corbett, A. T., Koedinger, K. R., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. The Journal of the Learning Sciences, 4 (2) 167-207.

Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., Kyza, E., Edelson, D. C., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The Journal of the Learning Sciences, 13(3), 337-386.

Future Plans