Help Lite (Aleven, Roll)
Latest revision as of 00:11, 16 December 2010
Hints during tutored problem solving – the effect of fewer hint levels with greater conceptual content
Vincent Aleven, Ido Roll, Kenneth Koedinger
Meta-data
PI's: Vincent Aleven, Ido Roll
Other Contributors: Ron Salden (post-doc)
Study Start Date | 5/2006
Study End Date | 5/2006
LearnLab Site | Wilkinsburg HS
LearnLab Course | Geometry
Number of Students | 40
Total Participant Hours | 90
Data available in DataShop | Dataset: Short Hints Wilkinsburg Spring 2006 (meta); Dataset: Short Hints Wilkinsburg Spring 2006 (cognitive)
* Pre/Post Test Score Data: not available
* Paper or Online Tests: unknown
* Scanned Paper Tests: unknown
* Blank Tests: No
* Answer Keys: No
Abstract
This in vivo experiment compared the effectiveness of two styles of hint sequences during tutored problem solving. The study was carried out in the Geometry LearnLab. Two conditions were compared, each working with its own tutor version. The tutor versions differed only with respect to the content of the hint sequences. A key difference between the hint sequences was that the number of hint levels was reduced from about 7 in a typical hint sequence to 2 or 3. This was achieved by removing hints that merely reminded students of their current goal within the problem, by removing hints that encouraged students to try to address their question by using the Glossary, and by being more concise in explaining how a theorem or definition could be applied. At the same time, conceptual content was added, in the form of explanations of geometry terms.
Glossary
- Metacognition
- Gaming the system
- Cognitive tutor
Research question
How is robust learning affected by shorter hint sequences with richer conceptual content?
This was not an independent study, but part of the main Help Seeking study.
Background and Significance
Independent Variables
Number and type of hint levels within the hint sequence:
- Control: Standard Cognitive Tutor hints, including about 7 hint levels of different types, containing either domain knowledge or metacognitive advice (such as 'search the Glossary for ...')
- Experimental: Only 2-3 hint levels, each of which contains only domain knowledge.
Dependent variables
The study uses two levels of dependent measures:
Assessing help-seeking behavior:
- Analyzing log files against a model of ideal help-seeking behavior
Assessing domain learning:
- Learning curves while using the tutor
- Percentage correct on attempts following hint requests
Due to technical and administrative errors, some of the tests were lost, and others cannot be attributed to conditions. As a result, no pre- and post-test measures can be used.
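The second domain-learning measure, percentage correct on attempts following hint requests, can be computed directly from ordered tutor transaction logs. The sketch below is illustrative only: the tuple-based log schema and the function name are assumptions, not the study's actual DataShop export format or analysis code.

```python
# Hedged sketch: compute % correct on attempts that immediately follow a
# hint request, per student, from a hypothetical ordered transaction log.
from collections import defaultdict

def pct_correct_after_hint(transactions):
    """transactions: ordered (student, action, outcome) tuples, where action
    is 'hint' or 'attempt' and outcome is 'correct'/'incorrect' (None for hints).
    Returns {student: fraction of post-hint attempts that were correct}."""
    tallies = defaultdict(lambda: [0, 0])  # student -> [correct, total]
    prev_action = {}                       # student -> last action seen
    for student, action, outcome in transactions:
        if prev_action.get(student) == "hint" and action == "attempt":
            tallies[student][1] += 1
            if outcome == "correct":
                tallies[student][0] += 1
        prev_action[student] = action
    return {s: c / t for s, (c, t) in tallies.items() if t}

log = [
    ("s1", "hint", None),
    ("s1", "attempt", "correct"),    # follows a hint: counted, correct
    ("s1", "attempt", "incorrect"),  # follows an attempt: not counted
    ("s1", "hint", None),
    ("s1", "attempt", "incorrect"),  # follows a hint: counted, incorrect
]
print(pct_correct_after_hint(log))  # {'s1': 0.5}
```

A real analysis over DataShop exports would group by problem step and hint level as well, but the core per-student tally is the same.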
Hypothesis
Students pay more attention to short hint sequences because they find them more helpful and easier to understand. Thus, shorter hint sequences reduce hint abuse, such as students' clicking through hints until they get the answer without attending to why the answer is what it is. The richer conceptual content helps students make sense of the tutor's hints, reducing implicit learning and also making students more likely to attend to the hints. Thus, there are two reasons why the new hints should result in better sense making and less implicit learning.
Findings
None. Errors in data logging and data collection did not allow for an extensive analysis.
Explanation
Informative, relevant, and timely hints provide the student with an effective learning trajectory when learning-by-doing becomes too difficult. The original hint sequences placed more responsibility on the learner - identifying relevant hints, searching the Glossary, etc. - activities that impose extraneous cognitive load. The updated hint sequences, in contrast, offer relevant instruction when it is required, with little extraneous cognitive load.
Descendents
Annotated bibliography
Aleven, V., & Koedinger, K. R. (2000). Limitations of student control: Do students know when they need help? In Proceedings of the 5th International Conference on Intelligent Tutoring Systems (pp. 292-303). Berlin: Springer Verlag.
Aleven, V., McLaren, B. M., Roll, I., & Koedinger, K. R. (2004). Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In Proceedings of the 7th International Conference on Intelligent Tutoring Systems (pp. 227-239). Berlin: Springer-Verlag.
Aleven, V., Roll, I., McLaren, B. M., Ryu, E. J., & Koedinger, K. R. (2005). An architecture to combine meta-cognitive and cognitive tutoring: Pilot testing the Help Tutor. In Proceedings of the 12th International Conference on Artificial Intelligence in Education. Amsterdam, The Netherlands: IOS Press.
Aleven, V., McLaren, B. M., Roll, I., & Koedinger, K. R. (2006). Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. International Journal of Artificial Intelligence in Education, 16, 101-130.
Roll, I., Aleven, V., & Koedinger, K. R. (2004). Promoting effective help-seeking behavior through declarative instruction. In Proceedings of the 7th International Conference on Intelligent Tutoring Systems (pp. 857-859). Berlin: Springer-Verlag.
Roll, I., Baker, R. S., Aleven, V., McLaren, B. M., & Koedinger, K. R. (2005). Modeling students' metacognitive errors in two intelligent tutoring systems. In L. Ardissono (Ed.), Proceedings of User Modeling 2005 (pp. 379-388). Berlin: Springer-Verlag.
Roll, I., Ryu, E., Sewall, J., Leber, B., McLaren, B. M., Aleven, V., & Koedinger, K. R. (2006). Towards teaching metacognition: Supporting spontaneous self-assessment. In Proceedings of the 8th International Conference on Intelligent Tutoring Systems (pp. 738-740). Berlin: Springer Verlag.
Roll, I., Aleven, V., McLaren, B. M., Ryu, E., Baker, R. S., & Koedinger, K. R. (2006). The Help Tutor: Does metacognitive feedback improve students' help-seeking actions, skills, and learning? In Proceedings of the 8th International Conference on Intelligent Tutoring Systems (pp. 360-369). Berlin: Springer Verlag.