<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://learnlab.org/mediawiki-1.44.2/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gurpreet</id>
	<title>Theory Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://learnlab.org/mediawiki-1.44.2/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gurpreet"/>
	<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Special:Contributions/Gurpreet"/>
	<updated>2026-04-29T11:18:40Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.44.2</generator>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Tutor_w_hint.jpg&amp;diff=7142</id>
		<title>File:Tutor w hint.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Tutor_w_hint.jpg&amp;diff=7142"/>
		<updated>2008-02-13T19:50:05Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Help_facilities&amp;diff=7141</id>
		<title>Help facilities</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Help_facilities&amp;diff=7141"/>
		<updated>2008-02-13T19:49:35Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Help Facilities: The means available to students that support them in the learning process. These include: &lt;br /&gt;
* Human help resources: Usually, students can consult with their teacher and peers.&lt;br /&gt;
* Online resources: Online help resources are offered either on demand or proactively, and are either contextualized or non-contextualized. The table below demonstrates these categories using the help-seeking facilities of the Geometry Cognitive Tutor, as used in the Help Tutor project:&lt;br /&gt;
{| cellpadding=&amp;quot;20&amp;quot; cellspacing=&amp;quot;0&amp;quot; border=&amp;quot;1&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
! Contextual&lt;br /&gt;
! Non contextual&lt;br /&gt;
|-&lt;br /&gt;
! On demand&lt;br /&gt;
| Student can ask for contextual hints, relevant to the specific step she attempts to solve. Every hint has several levels the student can browse, with the most elaborate one conveying the answer. &lt;br /&gt;
| The Glossary offers students a de-contextualized help resource, similar to an online dictionary. Students can find relevant information by searching it.&lt;br /&gt;
|-&lt;br /&gt;
! Proactive&lt;br /&gt;
| When students repeat an error more than a predefined number of times, the system presents them automatically with the hint they would have gotten were they to ask for one.&lt;br /&gt;
| The Cognitive Tutor does not have a proactive non-contextual hint. It is unclear what such a hint would look like.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
[[Image:tutor_w_hint.jpg]]&lt;br /&gt;
&lt;br /&gt;
For a comprehensive review of help in online systems, see&lt;br /&gt;
Aleven, V., Stahl, E., Schworm, S., Fischer, F., &amp;amp; Wallace, R.M. (2003). Help Seeking and Help Design in Interactive Learning Environments. Review of Educational Research, 73(2), 277-320.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Glossary]]&lt;br /&gt;
[[Category:Interactive Communication]]&lt;br /&gt;
[[Category:Help Tutor]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Roll_Pyramid.jpg&amp;diff=7140</id>
		<title>File:Roll Pyramid.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Roll_Pyramid.jpg&amp;diff=7140"/>
		<updated>2008-02-13T19:44:27Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Help-seeking_and_learning.jpg&amp;diff=7139</id>
		<title>File:Help-seeking and learning.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Help-seeking_and_learning.jpg&amp;diff=7139"/>
		<updated>2008-02-13T19:40:26Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Help-seeking_behavior.jpg&amp;diff=7138</id>
		<title>File:Help-seeking behavior.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Help-seeking_behavior.jpg&amp;diff=7138"/>
		<updated>2008-02-13T19:38:56Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7137</id>
		<title>The Help Tutor Roll Aleven McLaren</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7137"/>
		<updated>2008-02-13T19:38:04Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Findings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Towards Tutoring [[Metacognition]] - The Case of Help Seeking ==&lt;br /&gt;
Ido Roll, Vincent Aleven, Bruce M. McLaren, Kenneth Koedinger&lt;br /&gt;
&lt;br /&gt;
=== Meta-data ===&lt;br /&gt;
&lt;br /&gt;
PI&#039;s: Vincent Aleven, Ido Roll, Bruce M. McLaren&lt;br /&gt;
&lt;br /&gt;
Other Contributors: EJ Ryu (programmer)&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Study # !! Start Date !! End Date !! LearnLab Site !!  # of Students !! Total Participant Hours !! DataShop? &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;1&#039;&#039;&#039; || 2004       || 2004     || Analysis of existing data || 40 || 280 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;2&#039;&#039;&#039; || 2005       || 2005     || Analysis of existing data || 70 || 105 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;3&#039;&#039;&#039; || 5/2005     || 5/2005   || Hampton &amp;amp; Wilkinsburg (Geometry) || 60 || 270 || No, incompatible format&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;4&#039;&#039;&#039; || 2/2006     || 4/2006   || CWCTC (Geometry)          || 84 || 1,008 || Yes&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
While working with a tutoring system, students are expected to regulate their own learning process. However, they often demonstrate inadequate [[metacognition|metacognitive processes]] in doing so. For example, students often ask for help too frequently or not frequently enough. &lt;br /&gt;
In this project we built an Intelligent Tutoring System to teach [[metacognition]], and in particular, to improve students&#039; [[help-seeking behavior]].  Our Help Seeking Support Environment includes three components:&lt;br /&gt;
# Direct [[help-seeking behavior|help seeking]] [[explicit instruction|instruction]], given by the teacher&lt;br /&gt;
# A [[Self-Assessment]] Tutor, to help students evaluate their own need for help&lt;br /&gt;
# The Help Tutor - a domain-independent agent that can be added as an adjunct to a [[cognitive tutor]]. Rather than making help-seeking decisions for the students, the Help Tutor teaches better help-seeking skills by tracing students&#039; actions on a (meta)cognitive [[help-seeking model]] and giving students appropriate feedback. &lt;br /&gt;
&lt;br /&gt;
In a series of [[in vivo experiment]]s, the Help Tutor accurately detected help-seeking errors that were associated with poorer learning and with poorer [[declarative]] and [[procedural]] [[knowledge component]]s of help seeking.  The main findings were that students made fewer help-seeking errors while working with the Help Tutor and acquired better help seeking [[declarative]] [[knowledge component]]s. &lt;br /&gt;
However, we did not find evidence that this led to an improvement in learning at the domain level or to better [[help-seeking behavior]] in a paper-and-pencil environment. &lt;br /&gt;
We pose a number of hypotheses in an attempt to explain these results. We question the current focus of metacognitive tutoring, and suggest ways to reexamine the role of [[help facilities]] and of metacognitive tutoring within Intelligent Tutoring Systems.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&lt;br /&gt;
Teaching [[metacognition]] not only holds the promise of improving current learning of the domain of interest; it can also, perhaps mainly, accelerate future learning and support successful regulation of independent learning. One example of metacognitive knowledge is help-seeking [[knowledge component]]s: the ability to identify the need for help and to elicit appropriate assistance from the [[relevant resources|help facilities]].  &lt;br /&gt;
However, considerable evidence shows that metacognitive [[knowledge component]]s are in need of better support. For example, while working with Intelligent Tutoring Systems, students try to &amp;quot;[[game the system]]&amp;quot; or do not [[self-explanation|self-explain]] enough. Similarly, research shows that students&#039; [[help-seeking behavior]] leaves much room for improvement. &lt;br /&gt;
&lt;br /&gt;
==== Shallow help seeking [[knowledge component]]s ====&lt;br /&gt;
Research shows that students do not use their help-seeking knowledge components appropriately. For example, Aleven et al. (2006) show that 30% of students&#039; actions were consecutive fast help requests (a common form of [[help abuse]], termed &#039;[[clicking through hints]]&#039;), made without taking enough time to read the requested hints.  &lt;br /&gt;
Extensive log-file analysis suggests that students apply faulty [[knowledge component]]s such as the following:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[procedural]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
Cognitive aspects:&lt;br /&gt;
  If I don’t know the answer =&amp;gt; &lt;br /&gt;
  I should guess&lt;br /&gt;
&lt;br /&gt;
Motivational aspects:&lt;br /&gt;
  If I get the answer correct =&amp;gt;&lt;br /&gt;
  I achieved the goal&lt;br /&gt;
&lt;br /&gt;
Social aspects:&lt;br /&gt;
  If I ask for help =&amp;gt;&lt;br /&gt;
  I am weak&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[declarative]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
  Asking for hints will always reduce my skill level&lt;br /&gt;
&lt;br /&gt;
  Making an error is better than asking for a hint&lt;br /&gt;
&lt;br /&gt;
  Only weak people ask for help&lt;br /&gt;
&lt;br /&gt;
==== Teaching vs. supporting [[metacognition]] ====&lt;br /&gt;
&lt;br /&gt;
Several systems support students&#039; metacognitive actions in a way that encourages, or even forces, students to learn productively and efficiently. For example, a tutoring system can require the student to self-explain. While this approach is likely to improve domain learning in the supported environment, the effect is not likely to persist beyond the scope of the tutoring system, and therefore is not likely to help students become better future learners. &lt;br /&gt;
&lt;br /&gt;
Towards that end, we chose not to &#039;&#039;&#039;support&#039;&#039;&#039; students&#039; help seeking actions, but to &#039;&#039;&#039;teach&#039;&#039;&#039; them better help-seeking skills. Rather than making the metacognitive decisions for the students (for example, by preventing help-seeking errors or gaming opportunities), this study focuses on helping students refine their Help Seeking [[knowledge component]]s and acquire better [[feature validity]] of their [[help-seeking behavior|help-seeking]] [[metacognition|metacognitive skills]].&lt;br /&gt;
&lt;br /&gt;
By doing so, we examine whether metacognitive knowledge can be taught using familiar conventional domain-level pedagogies.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
See [[:Category:Help Tutor|Help Tutor Glossary]]&lt;br /&gt;
&lt;br /&gt;
=== Research questions ===&lt;br /&gt;
&lt;br /&gt;
# Can conventional and well-established instructional principles in the domain level be used to tutor [[metacognition|metacognitive]] [[knowledge component]]s such as [[help-seeking behavior|Help Seeking]] [[knowledge component]]s?&lt;br /&gt;
# Does the practice of better metacognitive behavior translate, in turn, into better domain learning?&lt;br /&gt;
&lt;br /&gt;
In addition, the project makes the following contributions:&lt;br /&gt;
# An improved understanding of the nature of help-seeking knowledge and its acquisition.&lt;br /&gt;
# A novel framework for the design of  goals, interaction and assessment for metacognitive tutoring.&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
&lt;br /&gt;
Two studies were performed with the Help Tutor. In both studies the independent variable was the presence of help-seeking support.&lt;br /&gt;
The control condition used the conventional Geometry Cognitive Tutor:&lt;br /&gt;
&lt;br /&gt;
[[Image:Geometry Cognitive Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
The treatment condition varied between studies:&lt;br /&gt;
* Study one: The Geometry Cognitive Tutor + the Help Tutor&lt;br /&gt;
* Study two: The Geometry Cognitive Tutor + the Help Seeking Support Environment (help seeking explicit instruction, self-assessment tutor, and Help Tutor)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Help Tutor:&#039;&#039;&#039;&lt;br /&gt;
The Help Tutor is a Cognitive Tutor in its own right that identifies recommended types of actions by tracing students’ interaction with the Geometry Cognitive Tutor relative to a metacognitive help-seeking model. When students perform actions that deviate from the recommended ones, the Help Tutor presents a message that stresses the recommended action to be taken. Messages from the metacognitive Help Tutor and the domain-level Cognitive Tutor are coordinated, so that the student receives only the most helpful message at each point [2].   &lt;br /&gt;
&lt;br /&gt;
[[Image:The Help-tutor.jpg]]&lt;br /&gt;
[[image:The Help Seeking Model.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Self-Assessment Tutor:&#039;&#039;&#039;&lt;br /&gt;
The ability to correctly self-assess one’s own knowledge level is correlated with strategic use of help (Tobias and Everson, 2002). The Self-Assessment Tutor is designed to tutor students on &lt;br /&gt;
their self-assessment skills; to help students make appropriate learning decisions based on their self-assessment; and mainly, to give students a tutoring environment, low in cognitive load, in which they can practice their help-seeking skills.  &lt;br /&gt;
The curriculum used by the Treatment group in study two consists of interleaving Self Assessment and Cognitive Tutor + Help Tutor sessions, with the Self Assessment sessions taking about 10% of the students’ time. During each self-assessment session the student assesses the skills to be practiced in the subsequent Cognitive Tutor section.&lt;br /&gt;
  &lt;br /&gt;
[[Image:The Self-Assessment Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Explicit help-seeking instruction:&#039;&#039;&#039;&lt;br /&gt;
As White and Frederiksen demonstrated (1998), reflecting in the classroom environment on the desired metacognitive process helps students internalize it. With that goal in mind, we created a short classroom lesson about help seeking with the following objectives: to give students a better declarative understanding of desired and effective help-seeking behavior; to improve their dispositions and attitudes towards seeking help; and to frame the help-seeking knowledge as an important learning goal, alongside Geometry knowledge, for the coming few weeks. The instruction includes a video presentation with examples of productive and faulty help-seeking behavior and the appropriate help-seeking principles. &lt;br /&gt;
&lt;br /&gt;
[[Image:Explicit help-seeking instruction.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
&lt;br /&gt;
The study uses two levels of dependent measures:&lt;br /&gt;
# Directly assessing Help Seeking skills&lt;br /&gt;
# Assessing domain-level learning, thereby evaluating the contribution of the help-seeking skills.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Assessments of help-seeking knowledge:&lt;br /&gt;
* [[Normal post-test]]: &lt;br /&gt;
** Declarative: hypothetical help-seeking dilemmas&lt;br /&gt;
** Procedural: Help seeking error rate while working with the tutor&lt;br /&gt;
* [[Transfer]]: Ability to use optional hints embedded within certain test items in the paper test.&lt;br /&gt;
&lt;br /&gt;
[[Image:embedded hints.jpg]]&lt;br /&gt;
&lt;br /&gt;
2. Assessments of domain knowledge:&lt;br /&gt;
* [[Normal post-test]]: Problem solving and explanation items like those in the tutor&#039;s instruction.&lt;br /&gt;
* [[Transfer]]: &lt;br /&gt;
** Data insufficiency (or &amp;quot;not enough information&amp;quot;) items.&lt;br /&gt;
** Conceptual understanding items (study two only)&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
&lt;br /&gt;
The combination of explicit help-seeking instruction, on-time feedback on help-seeking errors, and raising awareness of knowledge deficits will&lt;br /&gt;
* Improve [[feature validity]] of students&#039; help seeking skills&lt;br /&gt;
and thus, in turn, will&lt;br /&gt;
* Improve learning of domain knowledge by using those skills effectively.&lt;br /&gt;
&lt;br /&gt;
==== Instructional Principles ====&lt;br /&gt;
&lt;br /&gt;
The main principle being evaluated here is whether [[Roll_help seeking principle | instruction should support meta-cognition in the context of problem solving]] by using principles of cognitive tutoring such as:&lt;br /&gt;
* Giving direct instruction&lt;br /&gt;
* Giving immediate feedback on errors&lt;br /&gt;
* Prompting for self-assessment&lt;br /&gt;
&lt;br /&gt;
This utilizes the following instructional principles:&lt;br /&gt;
&lt;br /&gt;
* The Self-Assessment Tutor utilizes the [[Reflection questions]] principle&lt;br /&gt;
* The Help Tutor itself utilizes the [[Tutoring feedback]] principle&lt;br /&gt;
* The Help Seeking Instruction utilizes the [[Explicit instruction]] principle.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
As seen below (adapted from Roll et al. 2006), metacognitive tutoring has the following goals:&lt;br /&gt;
# First, the tutoring system should capture metacognitive errors (in our case, help-seeking errors).&lt;br /&gt;
# Then, it should lead to an improved metacognitive behavior within the tutoring system.&lt;br /&gt;
# This, in turn, should lead to an improvement in the domain learning.&lt;br /&gt;
# The effect should persist beyond the scope of the tutoring system.&lt;br /&gt;
# As a result, students are expected to become better future learners.&lt;br /&gt;
&lt;br /&gt;
[[Image:Roll_Pyramid.jpg]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 1: Capture metacognitive errors ====&lt;br /&gt;
&lt;br /&gt;
In study 1, 17% of students&#039; actions were classified as errors. These errors were significantly negatively correlated with learning (r=-0.42) - the more help-seeking errors captured by the system, the smaller the improvement from pre- to post-test.&lt;br /&gt;
This data suggests that the help-seeking model captures appropriate actions, and that the goal was achieved - the Help Tutor captures help-seeking errors.&lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking_and_learning.jpg]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 2: Improve metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
Log files from study 1 were analyzed to evaluate whether students improved their help-seeking behavior while working with the Help Tutor. While overall there was only a minor decrease in the error rate (from 19% in the control condition to 16% in the Help Tutor condition), there was a significant decrease in the error rate on first hints and on subsequent hints, and in the proportion of bottom-out hints out of all hint sequences. &lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking_behavior.jpg]]&lt;br /&gt;
&lt;br /&gt;
This data suggests that while the system was not able to improve all aspects of the desired metacognitive behavior, it did improve students&#039; behavior on the common types of errors.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 3: Improve domain learning ====&lt;br /&gt;
&lt;br /&gt;
While students&#039; help-seeking behavior improved while working with the Help Tutor (in study 1) or the full Help Seeking Support Environment (in study 2), we did not observe differences in learning between the two conditions in either study.&lt;br /&gt;
&lt;br /&gt;
Study 1 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study_1_results.jpg]]&lt;br /&gt;
&lt;br /&gt;
Study 2 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study_2_results.jpg]]&lt;br /&gt;
&lt;br /&gt;
* Note: since the tests at times 1 and 2 evaluated different instructional units (Angles vs. Quadrilaterals), lower scores at time 2 do not indicate a decrease in knowledge.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 4: Improve future metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
To evaluate whether the effect of the help-seeking curriculum persists beyond the tutored environment, students&#039; help seeking behavior was evaluated in a transfer environment - the paper and pencil tests.&lt;br /&gt;
&lt;br /&gt;
Hypothetical help seeking dilemmas, such as the one described below, were used to evaluate declarative help-seeking knowledge.&lt;br /&gt;
&lt;br /&gt;
  1. You tried to answer a question that you know, but for some reason the tutor says that your answer is wrong. What should you do? &lt;br /&gt;
  [ ] First I would review my calculations. Perhaps I can find the mistake myself? &lt;br /&gt;
  [ ] The Tutor must have made a mistake. I will retype the same answer again. &lt;br /&gt;
  [ ] I would ask for a hint, to understand my mistake.&lt;br /&gt;
&lt;br /&gt;
Procedural help-seeking skills were evaluated using hints embedded in the tests (see the figure in the Dependent variables section above).&lt;br /&gt;
&lt;br /&gt;
In study 1 (which included only the Help Tutor component), students in the Treatment condition demonstrated neither better declarative nor better procedural help-seeking knowledge, compared with the Control condition.&lt;br /&gt;
&lt;br /&gt;
In study 2 (which included the explicit help-seeking instruction and the Self-Assessment Tutor in addition to the Help Tutor), students in the Treatment condition demonstrated better declarative help-seeking knowledge (compared with Control group students) but no better procedural knowledge.&lt;br /&gt;
&lt;br /&gt;
[[Image:Declarative_knowledge.jpg]]&lt;br /&gt;
[[Image:Procedural_knowledge.jpg]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 5: Improve future domain learning ====&lt;br /&gt;
&lt;br /&gt;
Due to technical difficulties, this goal was not evaluated in either study.&lt;br /&gt;
&lt;br /&gt;
==== Summary of results ====&lt;br /&gt;
&lt;br /&gt;
Overall, the following pattern of results emerges from the studies:&lt;br /&gt;
* The Help Seeking Support Environment accurately detected help-seeking errors during the learning process&lt;br /&gt;
* Students improved their help-seeking behavior while working with the system&lt;br /&gt;
* Students acquired better declarative help-seeking knowledge following the use of the system&lt;br /&gt;
* However, students&#039; domain learning did not improve&lt;br /&gt;
* Also, the improvement in students&#039; help-seeking behavior did not persist beyond the tutoring system.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
These somewhat disappointing results raise an important question: Why did the environment not lead to an improvement in learning and in help-seeking behavior in the paper-test measures? Why did the improved online help-seeking behavior not lead to improved learning gains?&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 1: Students do not have the skills, but we didn&#039;t teach them right. ==== &lt;br /&gt;
One possible explanation may be that the Help Seeking Support Environment imposes excessive cognitive load during problem solving. Clearly, the learning process with the Help Seeking Support Environment is more demanding than with the conventional Cognitive Tutor alone, since more needs to be learned. However, much of the extra content is introduced during the classroom discussion and self-assessment sessions. The only extra content presented during the problem-solving sessions is the Help Tutor’s error messages, but these are not expected to increase the load much, especially given that a prioritization algorithm makes sure students receive only one message at a time (either from the Help Tutor or the [[cognitive tutor]]). Also, if the Help Seeking Support Environment indeed imposed too much cognitive load, it should be expected to hinder learning, which we did not observe.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 2: The role of help seeking in ITS ==== &lt;br /&gt;
Hints in tutoring systems have two objectives: to promote learning of challenging skills, and to help students move forward within the curriculum (i.e., to prevent them from getting stuck). While the latter is achieved easily with both the Cognitive Tutor and the Help Seeking Support Environment, achieving the former is much harder. It is not yet clear what makes a good hint, or how to create an effective [[hint sequence]]. It is possible that the hints, as implemented in the units of the Cognitive Tutor we used, are not optimal. For example, there may be too many levels of hints, with each level adding too little information to the previous one. Also, perhaps the detailed explanations are too demanding with regard to students’ reading comprehension ability. It is quite possible that these hints, regardless of how they are used, are above students&#039; [[zone of proximal development]], and thus do not contribute much to learning. Support for this idea comes from Schworm and Renkl (2002), who found that explanations offered by the system impaired learning when [[self-explanation]] was required. The Geometry Cognitive Tutor prompts for [[self-explanation]] in certain units. Perhaps elaborated hints are redundant, or even damaging, when [[self-explanation]] is required. &lt;br /&gt;
It is also possible that [[help-seeking behavior]] we currently view as faulty may actually be useful and desirable, in specific contexts for specific students. For example, perhaps a student who does not know the material should be allowed to view the bottom-out hint immediately, in order to turn the problem into a solved example. Support for this idea can be found in work by Yudelson et al. (2006), in which medical students in a leading medical school successfully learned by repeatedly asking for more elaborated hints. Such “clicking-through hints” behavior would be considered faulty by the Help Seeking Support Environment. However, this population of students is known to have good metacognitive skills (without them it is unlikely they would have reached their current position). Thus, it seems that sometimes “misusing” help (e.g., [[help abuse]], according to the Help Seeking Support Environment) can be beneficial to some students. Further evidence can be found in Baker et al. (2004), who showed that some (but not all) students who “[[game the system]]” (i.e., click through hints or guess repeatedly) learn just as much as students who do not game. It may be the case that certain gaming behaviors are adaptive, not irrational. Students who use these strategies will insist on viewing the [[bottom out hint]] and will ignore all intermediate hints, whether domain-level or metacognitive. Once intermediate hints are ignored, better [[help-seeking behavior]] according to the Help Seeking Support Environment should have no effect whatsoever on domain [[knowledge component]]s, as indeed was observed.&lt;br /&gt;
It is possible that we are overestimating students’ ability to learn from hints. Our first recommendation is to re-evaluate the role of hints in [[cognitive tutor]]s using complementary methodologies such as log-file analysis (e.g., Chang (2006) uses dynamic Bayes nets to evaluate the contribution of hints in a reading tutor); tracing individual students (to evaluate the different uses students make of hints); experimentation with different types of hints (for example, proactive vs. on demand); and analysis of human tutors who aid students working with [[cognitive tutor]]s.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 3: The focus of metacognitive tutoring in ITS. ====&lt;br /&gt;
The previous hypothesis, focused on students’ tendency to skip hints, suggests that perhaps the main issue is not lack of knowledge but lack of motivation. In other words, &#039;&#039;&#039;perhaps students already have these skills in place, but choose not to use them.&#039;&#039;&#039; For students who ignore intermediate hints, metacognitive messages offer little incentive. While the Help Seeking Support Environment can increase the probability that a proper hint level appears on the screen, it has no influence on whether it is read, or whether the student attempts to understand it. Students may ignore the messages for several reasons. For example, they may habitually click through hints, and may resent the changes that the Help Seeking Support Environment imposes. This idea is consistent with the teachers’ observation that the students were not fond of the Help Seeking Support Environment&#039;s error messages. They may comply with them, in order to make progress, but beyond that will ignore their content. The test data discussed above supports this idea. On 7 out of the 12 hint evaluations (seen in the findings section of goal 4) students scored lower on items with hints than on items without hints. A [[cognitive headroom]] explanation does not account for this difference, since the requested hints did not add much load. A more likely explanation is that students chose to skip the hints because hints were new to them in the given context. Baker (2005) reviewed several reasons why students [[game the system]]. While no clear answer was given, the question is applicable here as well. &lt;br /&gt;
Motivational issues bring us to our final hypothesis. Time preference discounting (Feldstein, 1964) is a term coined in economics that describes behavior in which people prefer a smaller immediate reward over a greater but more distant one. In the tutoring environment, comparing the benefit of an immediate correct answer with the delayed benefit (if any) of acting in a metacognitively correct manner may often lead the student to choose the former. If that is indeed the case, then students may already have the right metacognitive skills in place. The question we should be asking ourselves is not only how to get students to learn the desired metacognitive skills, but mainly how to get students to use them.&lt;br /&gt;
&lt;br /&gt;
=== Connections===&lt;br /&gt;
&lt;br /&gt;
* The Help Tutor attempts to extend traditional tutoring beyond the common domains. In this respect, it is similar to the work of Amy Ogan on tutoring [[FrenchCulture | French Culture]].&lt;br /&gt;
&lt;br /&gt;
* The interaction between the student and the tutor, which is &amp;quot;natural&amp;quot; in the control condition, is guided by the Help Tutor in the treatment condition. This is similar to the scripting manipulation of the [[Rummel Scripted Collaborative Problem Solving]] and [[Walker A Peer Tutoring Addition]] projects.&lt;br /&gt;
&lt;br /&gt;
* Another example of studying the effects of hints is [[Ringenberg Examples-as-Help|Ringenberg&#039;s study]], in which hints are compared to examples. &lt;br /&gt;
&lt;br /&gt;
* Going to do an in-vivo study at a LearnLab site? Check out how to answer [[FAQ for teachers|teacher&#039;s FAQ]]&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
Plans for June 2007 - Dec. 2007:&lt;br /&gt;
* Present the study at the International Conference on Artificial Intelligence in Education&lt;br /&gt;
* Submit the camera-ready copy of the paper to the Journal on Metacognition and Instruction&lt;br /&gt;
* Analyze the log files for metacognitive learning&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
# Aleven, V., &amp;amp; Koedinger, K.R. (2000) Limitations of student control: Do students know when they need help? in proceedings of 5th International Conference on Intelligent Tutoring Systems, 292-303. Berlin: Springer Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2004) Toward tutoring help seeking - Applying cognitive modeling to meta-cognitive skills . in proceedings of 7th International Conference on Intelligent Tutoring Systems, 227-39. Berlin: Springer-Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., Roll, I., McLaren, B.M., Ryu, E.J., &amp;amp; Koedinger, K.R. (2005) An architecture to combine meta-cognitive and cognitive tutoring: Pilot testing the Help Tutor. in proceedings of 12th International Conference on Artificial Intelligence in Education, Amsterdam, The Netherlands: IOS press. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2006). Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. Int Journal of Artificial Intelligence in Education(16), 101-30 [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Baker, R.S., Corbett, A.T., &amp;amp; Koedinger, K.R. (2004) Detecting Student Misuse of Intelligent Tutoring Systems. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 531-40.&lt;br /&gt;
# Baker, R.S., Roll, I., Corbett, A.T., &amp;amp; Koedinger, K.R. (2005) Do Performance Goals Lead Students to Game the System? in proceedings of 12th International Conference on Artificial Intelligence in Education, 57-64. Amsterdam, The Netherlands: IOS Press.&lt;br /&gt;
# Chang, K.K., Beck, J.E., Mostow, J., &amp;amp; Corbett, A. (2006) Does Help Help? A Bayes Net Approach to Modeling Tutor Interventions. in proceedings of Workshop on Educational Data Mining at AAAI 2006, 41-6. Menlo Park, California: AAAI.&lt;br /&gt;
# Feldstein, M.S. (1964). The Social Time Preference Discount Rate in Cost-Benefit Analysis. The Economic Journal 74(294), 360-79&lt;br /&gt;
# Roll, I., Baker, R.S., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (2005) Modeling Students’ Metacognitive Errors in Two Intelligent Tutoring Systems. in L. Ardissono,  (Eds.), in proceedings of User Modeling 2005, 379-88. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Ryu, E., Sewall, J., Leber, B., McLaren, B.M., Aleven, V., &amp;amp; Koedinger, K.R. (2006) Towards Teaching Metacognition: Supporting Spontaneous Self-Assessment. in proceedings of 8th International Conference on Intelligent Tutoring Systems, 738-40. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., Ryu, E., Baker, R.S., &amp;amp; Koedinger, K.R. (2006) The Help Tutor: Does Metacognitive Feedback Improve Students&#039; Help-Seeking Actions, Skills and Learning? in proceedings of 8th International Conference on Intelligent Tutoring Systems, 360-9. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (To appear) Can Help Seeking Be Tutored? Searching for the Secret Sauce of Metacognitive Tutoring. To appear in the proceedings of the International Conference on Artificial Intelligence in Education 2007.&lt;br /&gt;
# Schworm, S., &amp;amp; Renkl, A. (2002) Learning by solved example problems: Instructional explanations reduce self-explanation activity. in proceedings of The 24Th Annual Conference of the Cognitive Science Society, 816-21. Mahwah, NJ: Erlbaum.&lt;br /&gt;
# Yudelson, M.V., Medvedeva, O., Legowski, E., Castine, M., Jukic, D., &amp;amp; Crowley, R.S. (2006) Mining Student Learning Data to Develop High Level Pedagogic Strategy in a Medical ITS. in proceedings of Workshop on Educational Data Mining at AAAI 2006, Menlo Park, CA: AAAI.&lt;br /&gt;
# Roll, I., Aleven, V., &amp;amp; Koedinger, K.R. (2004) Promoting Effective Help-Seeking Behavior through Declarative Instruction. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 857-9. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Empirical Study]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Study_1_results.jpg&amp;diff=7136</id>
		<title>File:Study 1 results.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Study_1_results.jpg&amp;diff=7136"/>
		<updated>2008-02-13T19:36:59Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Study_2_results.jpg&amp;diff=7135</id>
		<title>File:Study 2 results.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Study_2_results.jpg&amp;diff=7135"/>
		<updated>2008-02-13T19:35:56Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7134</id>
		<title>The Help Tutor Roll Aleven McLaren</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7134"/>
		<updated>2008-02-13T19:33:07Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Evaluation of goal 3: Improve domain learning */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Towards Tutoring [[Metacognition]] - The Case of Help Seeking ==&lt;br /&gt;
Ido Roll, Vincent Aleven, Bruce M. McLaren, Kenneth Koedinger&lt;br /&gt;
&lt;br /&gt;
=== Meta-data ===&lt;br /&gt;
&lt;br /&gt;
PI&#039;s: Vincent Aleven, Ido Roll, Bruce M. McLaren&lt;br /&gt;
&lt;br /&gt;
Other Contributors: EJ Ryu (programmer)&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Study # !! Start Date !! End Date !! LearnLab Site !!  # of Students !! Total Participant Hours !! DataShop? &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;1&#039;&#039;&#039; || 2004       || 2004     || Analysis of existing data || 40 || 280 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;2&#039;&#039;&#039; || 2005       || 2005     || Analysis of existing data || 70 || 105 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;3&#039;&#039;&#039; || 5/2005     || 5/2005   || Hampton &amp;amp; Wilkinsburg (Geometry) || 60 || 270 || No, incompatible format&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;4&#039;&#039;&#039; || 2/2006     || 4/2006   || CWCTC (Geometry)          || 84 || 1,008 || Yes&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
While working with a tutoring system, students are expected to regulate their own learning process. However, they often demonstrate an inadequate [[metacognition|metacognitive process]] in doing so. For example, students often ask for help too frequently or not frequently enough. &lt;br /&gt;
In this project we built an Intelligent Tutoring System to teach [[metacognition]], and in particular, to improve students&#039; [[help-seeking behavior]].  Our Help Seeking Support Environment includes three components:&lt;br /&gt;
# Direct [[help-seeking behavior|help seeking]] [[explicit instruction|instruction]], given by the teacher&lt;br /&gt;
# A [[Self-Assessment]] Tutor, to help students evaluate their own need for help&lt;br /&gt;
# The Help Tutor - a domain-independent agent that can be added as an adjunct to a [[cognitive tutor]]. Rather than making help-seeking decisions for the students, the Help Tutor teaches better help-seeking skills by tracing students&#039; actions on a (meta)cognitive [[help-seeking model]] and giving students appropriate feedback. &lt;br /&gt;
&lt;br /&gt;
In a series of [[in vivo experiment]]s, the Help Tutor accurately detected help-seeking errors that were associated with poorer learning and with poorer [[declarative]] and [[procedural]] [[knowledge component]]s of help seeking.  The main findings were that students made fewer help-seeking errors while working with the Help Tutor and acquired better help seeking [[declarative]] [[knowledge component]]s. &lt;br /&gt;
However, we did not find evidence that this led to an improvement in learning at the domain level or to better [[help-seeking behavior]] in a paper-and-pencil environment. &lt;br /&gt;
We pose a number of hypotheses in an attempt to explain these results. We question the current focus of metacognitive tutoring, and suggest ways to reexamine the role of [[help facilities]] and of metacognitive tutoring within Intelligent Tutoring Systems.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&lt;br /&gt;
Teaching [[metacognition]] holds the promise not only of improving current learning in the domain of interest, but also, or even mainly, of accelerating future learning and the successful regulation of independent learning. One example of metacognitive knowledge is help-seeking [[knowledge component]]s: the ability to identify the need for help, and to elicit appropriate assistance from the [[relevant resources|help facilities]].  &lt;br /&gt;
However, considerable evidence shows that metacognitive [[knowledge component]]s are in need of better support. For example, while working with Intelligent Tutoring Systems, students try to &amp;quot;[[game the system]]&amp;quot; or do not [[self-explanation|self-explain]] enough. Similarly, research shows that students&#039; [[help-seeking behavior]] leaves much room for improvement. &lt;br /&gt;
&lt;br /&gt;
==== Shallow help seeking [[knowledge component]]s ====&lt;br /&gt;
Research shows that students do not use their help-seeking knowledge components appropriately. For example, Aleven et al. (2006) show that 30% of students&#039; actions were consecutive fast help requests (a common form of [[help abuse]], termed &#039;[[clicking through hints]]&#039;), without taking enough time to read the requested hints.  &lt;br /&gt;
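This kind of [[help abuse]] can be spotted mechanically in tutor log files. The sketch below is a hypothetical illustration, not the study&#039;s actual detector; the log format and the 3-second threshold are invented for the example.&lt;br /&gt;

```python
# Hypothetical sketch: flag "clicking through hints" in tutor log data.
# The (timestamp, action) log format and the 3-second threshold are
# illustrative assumptions, not taken from the study's actual detector.

FAST_THRESHOLD = 3.0  # seconds; a faster follow-up hint request was likely unread

def count_fast_hint_pairs(events):
    """Count hint requests issued too soon after a previous hint request.

    events: list of (timestamp_in_seconds, action_type) tuples, in order.
    """
    fast_pairs = 0
    prev_time, prev_action = None, None
    for time, action in events:
        if (action == "hint" and prev_action == "hint"
                and FAST_THRESHOLD > time - prev_time):
            fast_pairs += 1
        prev_time, prev_action = time, action
    return fast_pairs

# Toy log: four hint requests in under three seconds, then a real attempt.
log = [(0.0, "hint"), (1.2, "hint"), (2.0, "hint"), (2.9, "hint"), (30.0, "attempt")]
print(count_fast_hint_pairs(log))  # prints 3
```

A real detector would also consider hint level and problem context, but even this crude count captures the rapid-fire pattern described above.&lt;br /&gt;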
Extensive log-file analysis suggests that students apply faulty [[knowledge component]]s such as the following:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[procedural]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
Cognitive aspects:&lt;br /&gt;
  If I don’t know the answer =&amp;gt; &lt;br /&gt;
  I should guess&lt;br /&gt;
&lt;br /&gt;
Motivational aspects:&lt;br /&gt;
  If I get the answer correct =&amp;gt;&lt;br /&gt;
  I achieved the goal&lt;br /&gt;
&lt;br /&gt;
Social aspects:&lt;br /&gt;
  If I ask for help =&amp;gt;&lt;br /&gt;
  I am weak&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[declarative]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
  Asking for hints will always reduce my skill level&lt;br /&gt;
&lt;br /&gt;
  Making an error is better than asking for a hint&lt;br /&gt;
&lt;br /&gt;
  Only weak people ask for help&lt;br /&gt;
&lt;br /&gt;
==== Teaching vs. supporting [[metacognition]] ====&lt;br /&gt;
&lt;br /&gt;
Several systems support students&#039; metacognitive actions in a way that encourages, or even forces, students to learn productively and efficiently. For example, a tutoring system can require the student to self-explain. While this approach is likely to improve domain learning in the supported environment, the effect is not likely to persist beyond the scope of the tutoring system, and therefore is not likely to help students become better future learners. &lt;br /&gt;
&lt;br /&gt;
Towards that end, we chose not to &#039;&#039;&#039;support&#039;&#039;&#039; students&#039; help seeking actions, but to &#039;&#039;&#039;teach&#039;&#039;&#039; them better help-seeking skills. Rather than making the metacognitive decisions for the students (for example, by preventing help-seeking errors or gaming opportunities), this study focuses on helping students refine their Help Seeking [[knowledge component]]s and acquire better [[feature validity]] of their [[help-seeking behavior|help-seeking]] [[metacognition|metacognitive skills]].&lt;br /&gt;
&lt;br /&gt;
By doing so, we examine whether metacognitive knowledge can be taught using familiar conventional domain-level pedagogies.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
See [[:Category:Help Tutor|Help Tutor Glossary]]&lt;br /&gt;
&lt;br /&gt;
=== Research questions ===&lt;br /&gt;
&lt;br /&gt;
# Can conventional and well-established instructional principles in the domain level be used to tutor [[metacognition|metacognitive]] [[knowledge component]]s such as [[help-seeking behavior|Help Seeking]] [[knowledge component]]s?&lt;br /&gt;
# Does the practice of better metacognitive behavior translate, in turn, into better domain learning?&lt;br /&gt;
&lt;br /&gt;
In addition, the project makes the following contributions:&lt;br /&gt;
# An improved understanding of the nature of help-seeking knowledge and its acquisition.&lt;br /&gt;
# A novel framework for the design of goals, interaction and assessment for metacognitive tutoring.&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
&lt;br /&gt;
Two studies were performed with the Help Tutor. In both studies the independent variable was the presence of help-seeking support.&lt;br /&gt;
The control condition used the conventional Geometry Cognitive Tutor:&lt;br /&gt;
&lt;br /&gt;
[[Image:Geometry Cognitive Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
The treatment condition varied between studies:&lt;br /&gt;
* Study one: The Geometry Cognitive Tutor + the Help Tutor&lt;br /&gt;
* Study two: The Geometry Cognitive Tutor + the Help Seeking Support Environment (help seeking explicit instruction, self-assessment tutor, and Help Tutor)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Help Tutor:&#039;&#039;&#039;&lt;br /&gt;
The Help Tutor is a Cognitive Tutor in its own right that identifies recommended types of actions by tracing students’ interaction with the Geometry Cognitive Tutor relative to a metacognitive help-seeking model. When students perform actions that deviate from the recommended ones, the Help Tutor presents a message that stresses the recommended action to be taken. Messages from the metacognitive Help Tutor and the domain-level Cognitive Tutor are coordinated, so that the student receives only the most helpful message at each point [2].   &lt;br /&gt;
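As a rough rendering of what tracing an action against a help-seeking model means, the sketch below classifies a single student action as conforming or as a help-seeking error. The rules, thresholds, and messages are invented for illustration; the actual model is far richer.&lt;br /&gt;

```python
# A minimal, hypothetical rendering of tracing a student action against a
# help-seeking model. The rules, thresholds, and feedback messages below are
# invented for illustration and are much simpler than the real model.

def trace_help_seeking(skill_mastery, action, seconds_spent):
    """Return None if the action fits the model, else a feedback message."""
    if action == "attempt" and 0.4 > skill_mastery:
        return "You seem unsure of this skill - try asking for a hint."
    if action == "hint" and skill_mastery > 0.8:
        return "You probably know this - try solving it on your own first."
    if action == "hint" and 3.0 > seconds_spent:
        return "Slow down and read each hint before asking for the next one."
    return None  # the action conforms to the help-seeking model

# A well-practiced student asking for a hint triggers metacognitive feedback;
# a struggling student asking for a hint after due thought triggers none.
print(trace_help_seeking(0.9, "hint", 12.0))
print(trace_help_seeking(0.2, "hint", 12.0))  # prints None
```

In the real system, any message produced this way would still compete with the domain-level tutor&#039;s feedback, and only the single most helpful message would be shown.&lt;br /&gt;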
&lt;br /&gt;
[[Image:The Help-tutor.jpg]]&lt;br /&gt;
[[image:The Help Seeking Model.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Self-Assessment Tutor:&#039;&#039;&#039;&lt;br /&gt;
The ability to correctly self-assess one’s own knowledge level is correlated with strategic use of help (Tobias and Everson, 2002). The Self-Assessment Tutor is designed to tutor students on &lt;br /&gt;
their self-assessment skills; to help students make appropriate learning decisions based on their self-assessment; and mainly, to give students a tutoring environment, low on cognitive load, in which they can practice using their help-seeking skills.  &lt;br /&gt;
The curriculum used by the Treatment group in study two consists of interleaving Self Assessment and Cognitive Tutor + Help Tutor sessions, with the Self Assessment sessions taking about 10% of the students’ time. During each self-assessment session the student assesses the skills to be practiced in the subsequent Cognitive Tutor section.&lt;br /&gt;
  &lt;br /&gt;
[[Image:The Self-Assessment Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Explicit help-seeking instruction:&#039;&#039;&#039;&lt;br /&gt;
As White and Frederiksen (1998) demonstrated, reflecting in the classroom environment on the desired metacognitive process helps students internalize it. With that goal in mind, we created a short classroom lesson about help seeking with the following objectives: to give students a better declarative understanding of desired and effective help-seeking behavior; to improve their dispositions and attitudes towards seeking help; and to frame the help-seeking knowledge as an important learning goal, alongside Geometry knowledge, for the coming few weeks. The instruction includes a video presentation with examples of productive and faulty help-seeking behavior and the appropriate help-seeking principles. &lt;br /&gt;
&lt;br /&gt;
[[Image:Explicit help-seeking instruction.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
&lt;br /&gt;
The study uses two levels of dependent measures:&lt;br /&gt;
# Directly assessing Help Seeking skills&lt;br /&gt;
# Assessing domain-level learning, and thereby evaluating the contribution of the help-seeking skills.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Assessments of help-seeking knowledge:&lt;br /&gt;
* [[Normal post-test]]: &lt;br /&gt;
** Declarative: hypothetical help-seeking dilemmas&lt;br /&gt;
** Procedural: Help seeking error rate while working with the tutor&lt;br /&gt;
* [[Transfer]]: Ability to use optional hints embedded within certain test items in the paper test.&lt;br /&gt;
&lt;br /&gt;
[[Image:embedded hints.jpg]]&lt;br /&gt;
&lt;br /&gt;
2. Assessments of domain knowledge:&lt;br /&gt;
* [[Normal post-test]]: Problem solving and explanation items like those in the tutor&#039;s instruction.&lt;br /&gt;
* [[Transfer]]: &lt;br /&gt;
** Data insufficiency (or &amp;quot;not enough information&amp;quot;) items.&lt;br /&gt;
** Conceptual understanding items (study two only)&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
&lt;br /&gt;
The combination of explicit help-seeking instruction, timely feedback on help-seeking errors, and raising awareness of knowledge deficits will&lt;br /&gt;
* Improve [[feature validity]] of students&#039; help seeking skills&lt;br /&gt;
and thus, in turn, will&lt;br /&gt;
* Improve learning of domain knowledge by using those skills effectively.&lt;br /&gt;
&lt;br /&gt;
==== Instructional Principles ====&lt;br /&gt;
&lt;br /&gt;
The main principle being evaluated here is whether [[Roll_help seeking principle | instruction should support meta-cognition in the context of problem solving]] by using principles of cognitive tutoring such as:&lt;br /&gt;
* Giving direct instruction&lt;br /&gt;
* Giving immediate feedback on errors&lt;br /&gt;
* Prompting for self-assessment&lt;br /&gt;
&lt;br /&gt;
This utilizes the following instructional principles:&lt;br /&gt;
&lt;br /&gt;
* The Self-Assessment Tutor utilizes the [[Reflection questions]] principle&lt;br /&gt;
* The Help Tutor itself utilizes the [[Tutoring feedback]] principle&lt;br /&gt;
* The Help Seeking Instruction utilizes the [[Explicit instruction]] principle.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
As seen below (adapted from Roll et al. 2006), metacognitive tutoring has the following goals:&lt;br /&gt;
# First, the tutoring system should capture metacognitive errors (in our case, help-seeking errors).&lt;br /&gt;
# Then, it should lead to an improved metacognitive behavior within the tutoring system.&lt;br /&gt;
# This, in turn, should lead to an improvement in the domain learning.&lt;br /&gt;
# The effect should persist beyond the scope of the tutoring system.&lt;br /&gt;
# As a result, students are expected to become better future learners.&lt;br /&gt;
&lt;br /&gt;
[[Image:Roll Pyramid.png]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 1: Capture metacognitive errors ====&lt;br /&gt;
&lt;br /&gt;
In study 1, 17% of students&#039; actions were classified as errors. These errors were significantly negatively correlated with learning (r=-0.42) - the more help-seeking errors captured by the system, the smaller the improvement from pre- to post-test.&lt;br /&gt;
This data suggests that the help-seeking model captures appropriate actions, and that the goal was achieved - the Help Tutor captures help-seeking errors.&lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking and learning.png]]&lt;br /&gt;
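The analysis behind this finding amounts to correlating, per student, the rate of flagged help-seeking errors with the pre- to post-test learning gain. The sketch below shows the computation on made-up numbers; only the reported r=-0.42 comes from the study&#039;s own data.&lt;br /&gt;

```python
# Sketch of the goal-1 analysis: correlate each student's help-seeking error
# rate with their learning gain. The four data points below are invented;
# the study reported r = -0.42 on its own data.
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

error_rates = [0.05, 0.10, 0.20, 0.30]      # fraction of actions flagged as errors
learning_gains = [0.40, 0.35, 0.20, 0.10]   # post-test minus pre-test score
print(pearson_r(error_rates, learning_gains))  # negative, as in the study
```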
&lt;br /&gt;
==== Evaluation of goal 2: Improve metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
Log files from study 1 were analyzed to evaluate whether students improved their help-seeking behavior while working with the Help Tutor. While overall there was only a minor decrease in the error rate (from 19% in the control condition to 16% in the Help Tutor condition), there was a significant decrease in the error rate on first hints, on subsequent hints, and in the proportion of bottom-out hints among all hint sequences. &lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking behavior.png]]&lt;br /&gt;
&lt;br /&gt;
This data suggests that while the system was not able to improve all aspects of the desired metacognitive behavior, it did improve students&#039; behavior on the common types of errors.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 3: Improve domain learning ====&lt;br /&gt;
&lt;br /&gt;
While students&#039; help-seeking behavior improved while working with the Help Tutor (in study 1) or the full Help Seeking Support Environment (in study 2), we did not observe differences in learning between the two conditions in either study 1 or study 2.&lt;br /&gt;
&lt;br /&gt;
Study 1 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study_1_results.jpg]]&lt;br /&gt;
&lt;br /&gt;
Study 2 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study_2_results.jpg]]&lt;br /&gt;
&lt;br /&gt;
* Note: since the tests at times 1 and 2 evaluated different instructional units (Angles vs. Quadrilaterals), lower scores at time 2 do not indicate a decrease in knowledge.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 4: Improve future metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
To evaluate whether the effect of the help-seeking curriculum persists beyond the tutored environment, students&#039; help seeking behavior was evaluated in a transfer environment - the paper and pencil tests.&lt;br /&gt;
&lt;br /&gt;
Hypothetical help seeking dilemmas, such as the one described below, were used to evaluate declarative help-seeking knowledge.&lt;br /&gt;
&lt;br /&gt;
  1. You tried to answer a question that you know, but for some reason the tutor says that your answer is wrong. What should you do? &lt;br /&gt;
  [ ] First I would review my calculations. Perhaps I can find the mistake myself? &lt;br /&gt;
  [ ] The Tutor must have made a mistake. I will retype the same answer again. &lt;br /&gt;
  [ ] I would ask for a hint, to understand my mistake.&lt;br /&gt;
&lt;br /&gt;
Procedural help-seeking skills were evaluated using embedded hints in the tests (see the figure in the Dependent Variables section above).&lt;br /&gt;
&lt;br /&gt;
In study 1 (which included only the Help Tutor component), students in the Treatment condition demonstrated neither better declarative nor better procedural help-seeking knowledge, compared with the Control condition.&lt;br /&gt;
&lt;br /&gt;
In study 2 (which included the explicit help-seeking instruction and the Self-Assessment Tutor in addition to the Help Tutor), students in the Treatment condition demonstrated better declarative help-seeking knowledge (compared with Control group students), but no better procedural knowledge.&lt;br /&gt;
&lt;br /&gt;
[[Image:Declarative_knowledge.jpg]]&lt;br /&gt;
[[Image:Procedural_knowledge.jpg]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 5: Improve future domain learning ====&lt;br /&gt;
&lt;br /&gt;
Due to technical difficulties, this goal was not evaluated in either study.&lt;br /&gt;
&lt;br /&gt;
==== Summary of results ====&lt;br /&gt;
&lt;br /&gt;
Overall, the following pattern of results emerges from the studies:&lt;br /&gt;
* The Help Seeking Support Environment captured inappropriate help-seeking actions during the learning process&lt;br /&gt;
* Students improved their help-seeking behavior while working with the system&lt;br /&gt;
* Students acquired better declarative help-seeking knowledge from working with the system&lt;br /&gt;
* However, students&#039; domain learning did not improve&lt;br /&gt;
* Also, the improvement in students&#039; help-seeking behavior did not persist beyond the tutoring system.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
These somewhat disappointing results raise an important question: Why did the environment not lead to an improvement in learning and in help-seeking behavior in the paper-test measures? Why did the improved online help-seeking behavior not lead to improved learning gains?&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 1: Students do not have the skills, but we didn&#039;t teach them right. ==== &lt;br /&gt;
One possible explanation may be that the Help Seeking Support Environment imposes excessive cognitive load during problem solving. Clearly, the learning process with the Help Seeking Support Environment is more demanding compared with the conventional Cognitive Tutor alone, since more needs to be learned. However, much of the extra content is introduced during the classroom discussion and self-assessment sessions. The only extra content presented during the problem-solving sessions is the Help Tutor’s error messages, but these are not expected to increase the load much, especially given that a prioritization algorithm makes sure students receive only one message at a time (either from the Help Tutor or the [[cognitive tutor]]). Also, if the Help Seeking Support Environment indeed requires too much cognitive load, it should be expected to hinder learning, which we did not observe.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 2: The role of help seeking in ITS ==== &lt;br /&gt;
Hints in tutoring systems have two objectives: to promote learning of challenging skills, and to help students move forward within the curriculum (i.e., to prevent them from getting stuck). While the latter is achieved easily with both the Cognitive Tutor and the Help Seeking Support Environment, the former is much harder to achieve. It is not yet clear what makes a hint a good hint, or how to create an effective [[hint sequence]]. It is possible that the hints, as implemented in the units of the Cognitive Tutor we used, are not optimal. For example, there may be too many levels of hints, with each level adding too little information to the previous one. Also, perhaps the detailed explanations are too demanding with regard to students’ reading comprehension ability. It is quite possible that these hints, regardless of how they are being used, are above students&#039; [[zone of proximal development]], and thus do not contribute much to learning. Support for that idea comes from Schworm and Renkl (2002), who found that explanations offered by the system impaired learning when [[self-explanation]] was required. The Geometry Cognitive Tutor prompts for [[self-explanation]] in certain units. Perhaps elaborated hints are redundant, or even damaging, when [[self-explanation]] is required. &lt;br /&gt;
It is also possible that [[help-seeking behavior]] that we currently view as faulty may actually be useful and desirable, in specific contexts for specific students. For example, perhaps a student who does not know the material should be allowed to view the bottom-out hint immediately, in order to turn the problem into a solved example. Support for that idea can be found in work by Yudelson et al. (2006), in which medical students in a leading medical school successfully learned by repeatedly asking for more elaborated hints. Such “clicking-through-hints” behavior would be considered faulty by the Help Seeking Support Environment. However, this population of students is known to have good metacognitive skills (without them it is unlikely they would have reached their current position). Thus, it seems that sometimes “misusing” help (e.g., [[help abuse]], according to the Help Seeking Support Environment) can be beneficial to some students. Further evidence can be found in Baker et al. (2004), who showed that some (but not all) students who “[[game the system]]” (i.e., click through hints or guess repeatedly) learn just as much as students who do not game. It may be the case that certain gaming behaviors are adaptive, and not irrational. Students who use these strategies will insist on viewing the [[bottom out hint]] and will ignore all intermediate hints, whether domain-level or metacognitive. Once intermediate hints are ignored, better [[help-seeking behavior]] according to the Help Seeking Support Environment should have no effect whatsoever on domain [[knowledge component]]s, as indeed was observed.&lt;br /&gt;
We may also be overestimating students’ ability to learn from hints. Our first recommendation is to re-evaluate the role of hints in [[cognitive tutor]]s using complementary methodologies, such as log-file analysis (e.g., Chang (2006) uses dynamic Bayes nets to evaluate the contribution of hints in a reading tutor), tracing individual students (to evaluate the different uses students make of hints), experimentation with different types of hints (for example, proactive vs. on-demand), and analysis of human tutors who aid students while working with [[cognitive tutor]]s.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 3: The focus of metacognitive tutoring in ITS. ====&lt;br /&gt;
The previous hypothesis, focused on students’ tendency to skip hints, suggests that perhaps the main issue is not lack of knowledge but lack of motivation. In other words, &#039;&#039;&#039;perhaps students already have these skills in place, but choose not to use them.&#039;&#039;&#039; For students who ignore intermediate hints, metacognitive messages offer little incentive. While the Help Seeking Support Environment can increase the probability that a proper hint level appears on the screen, it has no influence on whether that hint is read, or on whether the student attempts to understand it. Students may ignore the messages for several reasons. For example, they may habitually click through hints, and may resent the changes that the Help Seeking Support Environment imposes. This idea is consistent with the teachers’ observation that the students were not fond of the Help Seeking Support Environment&#039;s error messages. They may comply with them in order to make progress, but beyond that will ignore their content. The test data discussed above provides support for this idea. On 7 out of the 12 hint evaluations (see the findings section of goal 4), students scored lower on items with hints than on items without hints. A [[cognitive headroom]] explanation does not account for this difference, since the on-request hints did not add much load. A more likely explanation is that students chose to skip the hints because such hints were new to them in the given context. Baker (2005) reviewed several reasons why students [[game the system]]. While no clear answer was given, the question is applicable here as well. &lt;br /&gt;
Motivational issues bring us to our final hypothesis. Time preference discount (Feldstein, 1964) is a term from economics that describes behavior in which people prefer a smaller immediate reward over a larger, more distant one. In the tutoring environment, weighing the benefit of an immediate correct answer against the delayed benefit (if any) of acting in a metacognitively correct manner may often lead the student to choose the former. If that is indeed the case, then students may already have the right metacognitive skills in place. The question we should be asking ourselves is not only how to get students to learn the desired metacognitive skills – but mainly, how to get students to use them.&lt;br /&gt;
&lt;br /&gt;
=== Connections===&lt;br /&gt;
&lt;br /&gt;
* The Help Tutor attempts to extend traditional tutoring beyond the common domains. In that, it is similar to the work of Amy Ogan on tutoring [[FrenchCulture | French Culture]]&lt;br /&gt;
&lt;br /&gt;
* The manipulation of interaction between the student and the tutor, which is &amp;quot;natural&amp;quot; in the control condition, is guided by the help tutor.  This is similar to the scripting manipulation of the [[Rummel Scripted Collaborative Problem Solving]] and the [[Walker A Peer Tutoring Addition]] projects.&lt;br /&gt;
&lt;br /&gt;
* Another example of studying the effects of hints is [[Ringenberg Examples-as-Help|Ringenberg&#039;s study]], in which hints are compared to examples. &lt;br /&gt;
&lt;br /&gt;
* Going to do an in-vivo study at a LearnLab site? Check out how to answer [[FAQ for teachers|teacher&#039;s FAQ]]&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
Plans for June 2007 - Dec. 2007:&lt;br /&gt;
* Present the study at the International Conference on Artificial Intelligence in Education&lt;br /&gt;
* Submit a camera-ready copy of the paper to the Journal on Metacognition and Instruction&lt;br /&gt;
* Analyze the log files for metacognitive learning&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
# Aleven, V., &amp;amp; Koedinger, K.R. (2000) Limitations of student control: Do students know when they need help? in proceedings of 5th International Conference on Intelligent Tutoring Systems, 292-303. Berlin: Springer Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2004) Toward tutoring help seeking - Applying cognitive modeling to meta-cognitive skills. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 227-39. Berlin: Springer-Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., Roll, I., McLaren, B.M., Ryu, E.J., &amp;amp; Koedinger, K.R. (2005) An architecture to combine meta-cognitive and cognitive tutoring: Pilot testing the Help Tutor. in proceedings of 12th International Conference on Artificial Intelligence in Education, Amsterdam, The Netherlands: IOS press. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2006). Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. Int Journal of Artificial Intelligence in Education(16), 101-30 [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Baker, R.S., Corbett, A.T., &amp;amp; Koedinger, K.R. (2004) Detecting Student Misuse of Intelligent Tutoring Systems. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 531-40.&lt;br /&gt;
# Baker, R.S., Roll, I., Corbett, A.T., &amp;amp; Koedinger, K.R. (2005) Do Performance Goals Lead Students to Game the System? in proceedings of 12th International Conference on Artificial Intelligence in Education, 57-64. Amsterdam, The Netherlands: IOS Press.&lt;br /&gt;
# Chang, K.K., Beck, J.E., Mostow, J., &amp;amp; Corbett, A. (2006) Does Help Help? A Bayes Net Approach to Modeling Tutor Interventions. in proceedings of Workshop on Educational Data Mining at AAAI 2006, 41-6. Menlo Park, California: AAAI.&lt;br /&gt;
# Feldstein, M.S. (1964). The Social Time Preference Discount Rate in Cost-Benefit Analysis. The Economic Journal 74(294), 360-79&lt;br /&gt;
# Roll, I., Baker, R.S., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (2005) Modeling Students’ Metacognitive Errors in Two Intelligent Tutoring Systems. in L. Ardissono,  (Eds.), in proceedings of User Modeling 2005, 379-88. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Ryu, E., Sewall, J., Leber, B., McLaren, B.M., Aleven, V., &amp;amp; Koedinger, K.R. (2006) Towards Teaching Metacognition: Supporting Spontaneous Self-Assessment. in proceedings of 8th International Conference on Intelligent Tutoring Systems, 738-40. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., Ryu, E., Baker, R.S., &amp;amp; Koedinger, K.R. (2006) The Help Tutor: Does Metacognitive Feedback Improve Students&#039; Help-Seeking Actions, Skills and Learning? in proceedings of 8th International Conference on Intelligent Tutoring Systems, 360-9. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (To appear) Can Help Seeking Be Tutored? Searching for the Secret Sauce of Metacognitive Tutoring. To appear in the proceedings of the International Conference on Artificial Intelligence in Education 2007.&lt;br /&gt;
# Schworm, S., &amp;amp; Renkl, A. (2002) Learning by solved example problems: Instructional explanations reduce self-explanation activity. in proceedings of The 24Th Annual Conference of the Cognitive Science Society, 816-21. Mahwah, NJ: Erlbaum.&lt;br /&gt;
# Yudelson, M.V., Medvedeva, O., Legowski, E., Castine, M., Jukic, D., &amp;amp; Crowley, R.S. (2006) Mining Student Learning Data to Develop High Level Pedagogic Strategy in a Medical ITS. in proceedings of Workshop on Educational Data Mining at AAAI 2006, Menlo Park, CA: AAAI.&lt;br /&gt;
# Roll, I., Aleven, V., &amp;amp; Koedinger, K.R. (2004) Promoting Effective Help-Seeking Behavior through Declarative Instruction. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 857-9. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Empirical Study]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Declarative_knowledge.jpg&amp;diff=7133</id>
		<title>File:Declarative knowledge.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Declarative_knowledge.jpg&amp;diff=7133"/>
		<updated>2008-02-13T19:32:08Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7132</id>
		<title>The Help Tutor Roll Aleven McLaren</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7132"/>
		<updated>2008-02-13T19:31:56Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Evaluation of goal 4: Improve future metacognitive behavior */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Towards Tutoring [[Metacognition]] - The Case of Help Seeking ==&lt;br /&gt;
Ido Roll, Vincent Aleven, Bruce M. McLaren, Kenneth Koedinger&lt;br /&gt;
&lt;br /&gt;
=== Meta-data ===&lt;br /&gt;
&lt;br /&gt;
PI&#039;s: Vincent Aleven, Ido Roll, Bruce M. McLaren&lt;br /&gt;
&lt;br /&gt;
Other Contributors: EJ Ryu (programmer)&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Study # !! Start Date !! End Date !! LearnLab Site !!  # of Students !! Total Participant Hours !! DataShop? &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;1&#039;&#039;&#039; || 2004       || 2004     || Analysis of existing data || 40 || 280 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;2&#039;&#039;&#039; || 2005       || 2005     || Analysis of existing data || 70 || 105 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;3&#039;&#039;&#039; || 5/2005     || 5/2005   || Hampton &amp;amp; Wilkinsburg (Geometry) || 60 || 270 || No, incompatible format&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;4&#039;&#039;&#039; || 2/2006     || 4/2006   || CWCTC (Geometry)          || 84 || 1,008 || Yes&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
While working with a tutoring system, students are expected to regulate their own learning process. However, they often demonstrate an inadequate [[metacognition|metacognitive process]] in doing so. For example, students often ask for help too frequently or not frequently enough. &lt;br /&gt;
In this project we built an Intelligent Tutoring System to teach [[metacognition]], and in particular, to improve students&#039; [[help-seeking behavior]].  Our Help Seeking Support Environment includes three components:&lt;br /&gt;
# Direct [[help-seeking behavior|help seeking]] [[explicit instruction|instruction]], given by the teacher&lt;br /&gt;
# A [[Self-Assessment]] Tutor, to help students evaluate their own need for help&lt;br /&gt;
# The Help Tutor - a domain-independent agent that can be added as an adjunct to a [[cognitive tutor]]. Rather than making help-seeking decisions for the students, the Help Tutor teaches better help-seeking skills by tracing students&#039; actions against a (meta)cognitive [[help-seeking model]] and giving students appropriate feedback. &lt;br /&gt;
&lt;br /&gt;
In a series of [[in vivo experiment]]s, the Help Tutor accurately detected help-seeking errors that were associated with poorer learning and with poorer [[declarative]] and [[procedural]] [[knowledge component]]s of help seeking.  The main findings were that students made fewer help-seeking errors while working with the Help Tutor and acquired better help seeking [[declarative]] [[knowledge component]]s. &lt;br /&gt;
However, we did not find evidence that this led to an improvement in learning at the domain level or to better [[help-seeking behavior]] in a paper-and-pencil environment. &lt;br /&gt;
We pose a number of hypotheses in an attempt to explain these results. We question the current focus of metacognitive tutoring, and suggest ways to reexamine the role of [[help facilities]] and of metacognitive tutoring within Intelligent Tutoring Systems.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&lt;br /&gt;
Teaching [[metacognition]] holds the promise not only of improving current learning in the domain of interest, but also, and perhaps mainly, of accelerating future learning and the successful regulation of independent learning. One example of metacognitive knowledge is help-seeking [[knowledge component]]s: the ability to identify the need for help, and to elicit appropriate assistance from the [[relevant resources|help facilities]].  &lt;br /&gt;
However, considerable evidence shows that metacognitive [[knowledge component]]s are in need of better support. For example, while working with Intelligent Tutoring Systems, students try to &amp;quot;[[game the system]]&amp;quot; or do not [[self-explanation|self-explain]] enough. Similarly, research shows that students&#039; [[help-seeking behavior]] leaves much room for improvement. &lt;br /&gt;
&lt;br /&gt;
==== Shallow help seeking [[knowledge component]]s ====&lt;br /&gt;
Research shows that students do not use their help-seeking [[knowledge component]]s appropriately. For example, Aleven et al. (2006) show that 30% of students&#039; actions were consecutive fast help requests (a common form of [[help abuse]], termed &#039;[[clicking through hints]]&#039;), made without taking enough time to read the requested hints.  &lt;br /&gt;
Extensive log-file analysis suggests that students apply faulty [[knowledge component]]s such as the following:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[procedural]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
Cognitive aspects:&lt;br /&gt;
  If I don’t know the answer =&amp;gt; &lt;br /&gt;
  I should guess&lt;br /&gt;
&lt;br /&gt;
Motivational aspects:&lt;br /&gt;
  If I get the answer correct =&amp;gt;&lt;br /&gt;
  I achieved the goal&lt;br /&gt;
&lt;br /&gt;
Social aspects:&lt;br /&gt;
  If I ask for help =&amp;gt;&lt;br /&gt;
  I am weak&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[declarative]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
  Asking for hints will always reduce my skill level&lt;br /&gt;
&lt;br /&gt;
  Making an error is better than asking for a hint&lt;br /&gt;
&lt;br /&gt;
  Only weak people ask for help&lt;br /&gt;
&lt;br /&gt;
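The faulty rules above can be read as condition/action pairs. As a minimal, hypothetical sketch (the rule names, step fields, and conditions are invented for illustration and are not the actual Help Tutor model), such rules could be encoded and matched against logged student actions:&lt;br /&gt;

```python
# Hypothetical sketch: encode help-seeking rules as condition/action
# pairs and check which rules a logged student step matches.

RULES = [
    {"name": "guess-when-stuck", "faulty": True,   # faulty: guess instead of seeking help
     "condition": lambda s: not s["knows_step"] and s["action"] == "attempt"},
    {"name": "hint-when-stuck", "faulty": False,   # desired: ask for a hint when stuck
     "condition": lambda s: not s["knows_step"] and s["action"] == "hint"},
]

def classify(step):
    """Return the names of the rules whose conditions match a logged step."""
    return [r["name"] for r in RULES if r["condition"](step)]

# a student who does not know the step, yet attempts an answer,
# matches the faulty "guess" rule
print(classify({"knows_step": False, "action": "attempt"}))
```
&lt;br /&gt;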
==== Teaching vs. supporting [[metacognition]] ====&lt;br /&gt;
&lt;br /&gt;
Several systems support students&#039; metacognitive actions in a way that encourages, or even forces, students to learn productively and efficiently. For example, a tutoring system can require the student to self-explain. While this approach is likely to improve domain learning in the supported environment, the effect is not likely to persist beyond the scope of the tutoring system, and therefore is not likely to help students become better future learners. &lt;br /&gt;
&lt;br /&gt;
Towards that end, we chose not to &#039;&#039;&#039;support&#039;&#039;&#039; students&#039; help seeking actions, but to &#039;&#039;&#039;teach&#039;&#039;&#039; them better help-seeking skills. Rather than making the metacognitive decisions for the students (for example, by preventing help-seeking errors or gaming opportunities), this study focuses on helping students refine their Help Seeking [[knowledge component]]s and acquire better [[feature validity]] of their [[help-seeking behavior|help-seeking]] [[metacognition|metacognitive skills]].&lt;br /&gt;
&lt;br /&gt;
By doing so, we examine whether metacognitive knowledge can be taught using familiar conventional domain-level pedagogies.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
See [[:Category:Help Tutor|Help Tutor Glossary]]&lt;br /&gt;
&lt;br /&gt;
=== Research questions ===&lt;br /&gt;
&lt;br /&gt;
# Can conventional and well-established instructional principles in the domain level be used to tutor [[metacognition|metacognitive]] [[knowledge component]]s such as [[help-seeking behavior|Help Seeking]] [[knowledge component]]s?&lt;br /&gt;
# Does the practice of better metacognitive behavior translate, in turn, into better domain learning?&lt;br /&gt;
&lt;br /&gt;
In addition, the project makes the following contributions:&lt;br /&gt;
# An improved understanding of the nature of help-seeking knowledge and its acquisition.&lt;br /&gt;
# A novel framework for the design of  goals, interaction and assessment for metacognitive tutoring.&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
&lt;br /&gt;
Two studies were performed with the Help Tutor. In both studies the independent variable was the presence of help-seeking support.&lt;br /&gt;
The control condition used the conventional Geometry Cognitive Tutor:&lt;br /&gt;
&lt;br /&gt;
[[Image:Geometry Cognitive Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
The treatment condition varied between studies:&lt;br /&gt;
* Study one: The Geometry Cognitive Tutor + the Help Tutor&lt;br /&gt;
* Study two: The Geometry Cognitive Tutor + the Help Seeking Support Environment (help seeking explicit instruction, self-assessment tutor, and Help Tutor)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Help Tutor:&#039;&#039;&#039;&lt;br /&gt;
The Help Tutor is a Cognitive Tutor in its own right that identifies recommended types of actions by tracing students’ interaction with the Geometry Cognitive Tutor relative to a metacognitive help-seeking model. When students perform actions that deviate from the recommended ones, the Help Tutor presents a message that stresses the recommended action to be taken. Messages from the metacognitive Help Tutor and the domain-level Cognitive Tutor are coordinated, so that the student receives only the most helpful message at each point [2].   &lt;br /&gt;
&lt;br /&gt;
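The coordination between the two tutors can be pictured as selecting a single message per student action. The following is a hypothetical sketch assuming a simple numeric priority (the actual prioritization algorithm is not described here, and the message texts are invented):&lt;br /&gt;

```python
# Hypothetical sketch: when both the domain-level tutor and the
# metacognitive Help Tutor have feedback for the same student action,
# show only the higher-priority message.

def select_message(candidates):
    """Pick the single most helpful message from competing tutors.

    candidates: list of (priority, source, text) tuples; higher priority wins.
    Returns None when no tutor produced a message.
    """
    if not candidates:
        return None
    return max(candidates)[2]  # tuples compare by priority first

msgs = [
    (1, "cognitive-tutor", "Check the angle sum of the triangle."),
    (2, "help-tutor", "Slow down: read this hint before asking for the next one."),
]
print(select_message(msgs))  # only the help-tutor message is shown
```
&lt;br /&gt;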
[[Image:The Help-tutor.jpg]]&lt;br /&gt;
[[image:The Help Seeking Model.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Self-Assessment Tutor:&#039;&#039;&#039;&lt;br /&gt;
The ability to correctly self-assess one’s own knowledge level is correlated with strategic use of help (Tobias and Everson, 2002). The Self-Assessment Tutor is designed to tutor students on &lt;br /&gt;
their self-assessment skills; to help students make appropriate learning decisions based on their self-assessment; and mainly, to give students a tutoring environment, low on cognitive load, in which they can practice using their help-seeking skills.  &lt;br /&gt;
The curriculum used by the Treatment group in study two consists of interleaving Self Assessment and Cognitive Tutor + Help Tutor sessions, with the Self Assessment sessions taking about 10% of the students’ time. During each self-assessment session the student assesses the skills to be practiced in the subsequent Cognitive Tutor section.&lt;br /&gt;
  &lt;br /&gt;
[[Image:The Self-Assessment Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Explicit help-seeking instruction:&#039;&#039;&#039;&lt;br /&gt;
As White and Frederiksen (1998) demonstrated, reflecting in the classroom environment on the desired metacognitive process helps students internalize it. With that goal in mind, we created a short classroom lesson about help seeking with the following objectives: to give students a better declarative understanding of desired and effective help-seeking behavior; to improve their dispositions and attitudes towards seeking help; and to frame the help-seeking knowledge as an important learning goal, alongside Geometry knowledge, for the coming few weeks. The instruction includes a video presentation with examples of productive and faulty help-seeking behavior and the appropriate help-seeking principles. &lt;br /&gt;
&lt;br /&gt;
[[Image:Explicit help-seeking instruction.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
&lt;br /&gt;
The study uses two levels of dependent measures:&lt;br /&gt;
# Directly assessing Help Seeking skills&lt;br /&gt;
# Assessing domain-level learning, and thereby evaluating the contribution of the help-seeking skills.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Assessments of help-seeking knowledge:&lt;br /&gt;
* [[Normal post-test]]: &lt;br /&gt;
** Declarative: hypothetical help-seeking dilemmas&lt;br /&gt;
** Procedural: Help seeking error rate while working with the tutor&lt;br /&gt;
* [[Transfer]]: Ability to use optional hints embedded within certain test items in the paper test.&lt;br /&gt;
&lt;br /&gt;
[[Image:embedded hints.jpg]]&lt;br /&gt;
&lt;br /&gt;
2. Assessments of domain knowledge:&lt;br /&gt;
* [[Normal post-test]]: Problem solving and explanation items like those in the tutor&#039;s instruction.&lt;br /&gt;
* [[Transfer]]: &lt;br /&gt;
** Data insufficiency (or &amp;quot;not enough information&amp;quot;) items.&lt;br /&gt;
** Conceptual understanding items (study two only)&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
&lt;br /&gt;
The combination of explicit help-seeking instruction, timely feedback on help-seeking errors, and raising awareness of knowledge deficits will&lt;br /&gt;
* Improve [[feature validity]] of students&#039; help seeking skills&lt;br /&gt;
and thus, in turn, will&lt;br /&gt;
* Improve learning of domain knowledge by using those skills effectively.&lt;br /&gt;
&lt;br /&gt;
==== Instructional Principles ====&lt;br /&gt;
&lt;br /&gt;
The main principle being evaluated here is whether [[Roll_help seeking principle | instruction should support meta-cognition in the context of problem solving]] by using principles of cognitive tutoring such as:&lt;br /&gt;
* Giving direct instruction&lt;br /&gt;
* Giving immediate feedback on errors&lt;br /&gt;
* Prompting for self-assessment&lt;br /&gt;
&lt;br /&gt;
This utilizes the following instructional principles:&lt;br /&gt;
&lt;br /&gt;
* The Self-Assessment Tutor utilizes the [[Reflection questions]] principle&lt;br /&gt;
* The Help Tutor itself utilizes the [[Tutoring feedback]] principle&lt;br /&gt;
* The Help Seeking Instruction utilizes the [[Explicit instruction]] principle.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
As seen below (adapted from Roll et al. 2006), metacognitive tutoring has the following goals:&lt;br /&gt;
# First, the tutoring system should capture metacognitive errors (in our case, help-seeking errors).&lt;br /&gt;
# Then, it should lead to an improved metacognitive behavior within the tutoring system.&lt;br /&gt;
# This, in turn, should lead to an improvement in domain learning.&lt;br /&gt;
# The effect should persist beyond the scope of the tutoring system.&lt;br /&gt;
# As a result, students are expected to become better future learners.&lt;br /&gt;
&lt;br /&gt;
[[Image:Roll Pyramid.png]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 1: Capture metacognitive errors ====&lt;br /&gt;
&lt;br /&gt;
In study 1, 17% of students&#039; actions were classified as help-seeking errors. These errors were significantly negatively correlated with learning (r = -0.42): the more help-seeking errors captured by the system, the smaller the improvement from pre- to post-test.&lt;br /&gt;
These data suggest that the help-seeking model captures the appropriate actions, and that the goal was achieved: the Help Tutor captures help-seeking errors.&lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking and learning.png]]&lt;br /&gt;
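The goal-1 analysis can be illustrated with a small sketch (the per-student numbers below are invented, not the study's data): compute each student's help-seeking error rate from the logs, then correlate it with pre-to-post learning gain.&lt;br /&gt;

```python
# Illustrative sketch of correlating help-seeking error rate with
# learning gain; the data values are invented for demonstration.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# per-student help-seeking error rate (errors / logged actions) and
# learning gain (post-test minus pre-test); numbers are invented
error_rate = [0.05, 0.10, 0.17, 0.25, 0.30]
gain = [0.40, 0.35, 0.20, 0.15, 0.05]
print(round(pearson(error_rate, gain), 2))  # strongly negative
```
&lt;br /&gt;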
&lt;br /&gt;
==== Evaluation of goal 2: Improve metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
Log files from study 1 were analyzed to evaluate whether students improved their help-seeking behavior while working with the Help Tutor. While overall there was only a minor decrease in the error rate (from 19% in the control condition to 16% in the Help Tutor condition), there was a significant decrease in the error rate on first hints and on subsequent hints, and in the proportion of bottom-out hints among all hint sequences. &lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking behavior.png]]&lt;br /&gt;
&lt;br /&gt;
These data suggest that while the system was not able to improve all aspects of the desired metacognitive behavior, it did improve students&#039; behavior on the common types of errors.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 3: Improve domain learning ====&lt;br /&gt;
&lt;br /&gt;
Although students&#039; help-seeking behavior improved while they worked with the Help Tutor (in study 1) or the full Help Seeking Support Environment (in study 2), we did not observe differences in learning between the two conditions in either study.&lt;br /&gt;
&lt;br /&gt;
Study 1 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study 1 results.png]]&lt;br /&gt;
&lt;br /&gt;
Study 2 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study 2 results.png]]&lt;br /&gt;
&lt;br /&gt;
* Note: since the tests at times 1 and 2 evaluated different instructional units (Angles vs. Quadrilaterals), lower grades at time 2 do not indicate a decrease in knowledge. &lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 4: Improve future metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
To evaluate whether the effect of the help-seeking curriculum persists beyond the tutored environment, students&#039; help seeking behavior was evaluated in a transfer environment - the paper and pencil tests.&lt;br /&gt;
&lt;br /&gt;
Hypothetical help seeking dilemmas, such as the one described below, were used to evaluate declarative help-seeking knowledge.&lt;br /&gt;
&lt;br /&gt;
  1. You tried to answer a question that you know how to answer, but for some reason the tutor says that your answer is wrong. What should you do? &lt;br /&gt;
  [ ] First I would review my calculations. Perhaps I can find the mistake myself? &lt;br /&gt;
  [ ] The Tutor must have made a mistake. I will retype the same answer again. &lt;br /&gt;
  [ ] I would ask for a hint, to understand my mistake.&lt;br /&gt;
&lt;br /&gt;
Procedural help-seeking skills were evaluated using hints embedded in the tests (see the figure in the Dependent Variables section above).&lt;br /&gt;
&lt;br /&gt;
In study 1 (which included only the Help Tutor component), students in the Treatment condition demonstrated neither better declarative nor better procedural help-seeking knowledge, compared with the Control condition.&lt;br /&gt;
&lt;br /&gt;
In study 2 (which included the explicit help-seeking instruction and the Self-Assessment Tutor in addition to the Help Tutor), students in the Treatment condition demonstrated better declarative help-seeking knowledge (compared with Control-group students), but no better procedural knowledge.&lt;br /&gt;
&lt;br /&gt;
[[Image:Declarative_knowledge.jpg]]&lt;br /&gt;
[[Image:Procedural_knowledge.jpg]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 5: Improve future domain learning ====&lt;br /&gt;
&lt;br /&gt;
Due to technical difficulties, this goal was not evaluated in either study.&lt;br /&gt;
&lt;br /&gt;
==== Summary of results ====&lt;br /&gt;
&lt;br /&gt;
Overall, the following pattern of results emerges from the studies:&lt;br /&gt;
* The Help Seeking Support Environment intervened on inappropriate help-seeking actions during the learning process&lt;br /&gt;
* Students improved their help-seeking behavior while working with the system&lt;br /&gt;
* Students acquired better declarative help-seeking knowledge following their work with the system&lt;br /&gt;
* However, students&#039; domain learning did not improve&lt;br /&gt;
* Also, the improvement in students&#039; help-seeking behavior did not persist beyond the tutoring system.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
These somewhat disappointing results raise important questions: Why did the environment not lead to an improvement in learning and in help-seeking behavior on the paper-test measures? Why did the improved online help-seeking behavior not lead to improved learning gains?&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 1: Students do not have the skills, but we didn&#039;t teach them right. ==== &lt;br /&gt;
One possible explanation may be that the Help Seeking Support Environment imposes excessive cognitive load during problem solving. Clearly, the learning process with the Help Seeking Support Environment is more demanding than with the conventional Cognitive Tutor alone, since more needs to be learned. However, much of the extra content is introduced during the classroom discussion and self-assessment sessions. The only extra content presented during the problem-solving sessions is the Help Tutor’s error messages, which are not expected to increase the load much, especially given that a prioritization algorithm ensures students receive only one message at a time (either from the Help Tutor or the [[cognitive tutor]]). Also, if the Help Seeking Support Environment indeed requires too much cognitive load, it should be expected to hinder learning, which we did not observe.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 2: The role of help seeking in ITS ==== &lt;br /&gt;
Hints in tutoring systems have two objectives: to promote learning of challenging skills, and to help students move forward within the curriculum (i.e., to prevent them from getting stuck). While the latter is achieved easily with both the Cognitive Tutor and the Help Seeking Support Environment, the former is much harder to achieve. It is not yet clear what makes a hint a good hint, or how to create an effective [[hint sequence]]. It is possible that the hints, as implemented in the units of the Cognitive Tutor we used, are not optimal. For example, there may be too many levels of hints, with each level adding too little information to the previous one. Also, perhaps the detailed explanations are too demanding with regard to students’ reading comprehension ability. It is quite possible that these hints, regardless of how they are used, are above students&#039; [[zone of proximal development]], and thus do not contribute much to learning. Support for this idea comes from Schworm and Renkl (2002), who found that explanations offered by the system impaired learning when [[self-explanation]] was required. The Geometry Cognitive Tutor prompts for [[self-explanation]] in certain units. Perhaps elaborated hints are redundant, or even damaging, when [[self-explanation]] is required. &lt;br /&gt;
It is also possible that [[help-seeking behavior]] we currently view as faulty may actually be useful and desirable, in specific contexts for specific students. For example, perhaps a student who does not know the material should be allowed to view the bottom-out hint immediately, in order to turn the problem into a solved example. Support for this idea can be found in work by Yudelson et al. (2006), in which medical students in a leading med school successfully learned by repeatedly asking for more elaborated hints. Such “clicking-through hints” behavior would be considered faulty by the Help Seeking Support Environment. However, this population of students is known to have good metacognitive skills (without them it is unlikely they would have reached their current position). Thus, it seems that sometimes “misusing” help (e.g., [[help abuse]], according to the Help Seeking Support Environment) can be beneficial to some students. Further evidence can be found in Baker et al. (2004), who showed that some (but not all) students who “[[game the system]]” (i.e., click through hints or guess repeatedly) learn just as much as students who do not game. It may be that certain gaming behaviors are adaptive, not irrational. Students who use these strategies will insist on viewing the [[bottom out hint]] and will ignore all intermediate hints, whether domain-level or metacognitive. Once intermediate hints are ignored, better [[help-seeking behavior]] according to the Help Seeking Support Environment should have no effect whatsoever on domain [[knowledge component]]s, as indeed was seen.&lt;br /&gt;
It is possible that we are overestimating students’ ability to learn from hints. Our first recommendation is to re-evaluate the role of hints in [[cognitive tutor]]s using complementary methodologies such as log-file analysis (e.g., Chang et al. (2006) use dynamic Bayes nets to evaluate the contribution of hints in a reading tutor); tracing individual students (to evaluate the different uses students make of hints); experimentation with different types of hints (for example, proactive vs. on-demand); and analysis of human tutors who aid students while they work with [[cognitive tutor]]s.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 3: The focus of metacognitive tutoring in ITS. ====&lt;br /&gt;
The previous hypothesis, focused on students’ tendency to skip hints, suggests that perhaps the main issue is not lack of knowledge, but lack of motivation. In other words, &#039;&#039;&#039;perhaps students already have these skills in place, but choose not to use them.&#039;&#039;&#039; For students who ignore intermediate hints, metacognitive messages offer little incentive. While the Help Seeking Support Environment can increase the probability that a proper hint level appears on the screen, it has no influence on whether that hint is read, or whether the student attempts to understand it. Students may ignore the messages for several reasons. For example, they may habitually click through hints, and may resent the changes that the Help Seeking Support Environment imposes. This idea is consistent with the teachers’ observation that the students were not fond of the Help Seeking Support Environment’s error messages: students may comply with them in order to make progress, but beyond that ignore their content. The test data discussed above support this idea. On 7 of the 12 hint evaluations (see the findings section of goal 4), students scored lower on items with hints than on items without hints. A [[cognitive headroom]] explanation does not account for this difference, since the requested hints did not add much load. A more likely explanation is that students chose to skip the hints because hints were new to them in the given context. Baker et al. (2005) reviewed several reasons why students [[game the system]]; while no clear answer was given, the question is applicable here as well. &lt;br /&gt;
Motivational issues bring us to our final hypothesis. Time preference discount (Feldstein, 1964) is a term coined in economics to describe behavior in which people prefer a smaller immediate reward over a greater but more distant one. In the tutoring environment, comparing the benefit of an immediate correct answer with the delayed benefit (if any) of acting in a metacognitively correct manner may often lead the student to choose the former. If that is indeed the case, then students may already have the right metacognitive skills in place. The question we should be asking ourselves is not only how to get students to learn the desired metacognitive skills, but mainly how to get them to use them.&lt;br /&gt;
&lt;br /&gt;
=== Connections===&lt;br /&gt;
&lt;br /&gt;
* The Help Tutor attempts to extend traditional tutoring beyond the common domains. In that, it is similar to the work of Amy Ogan on tutoring [[FrenchCulture | French Culture]].&lt;br /&gt;
&lt;br /&gt;
* The Help Tutor guides the interaction between the student and the tutor, an interaction that is &amp;quot;natural&amp;quot; in the control condition.  This manipulation is similar to the scripting manipulation of the [[Rummel Scripted Collaborative Problem Solving]] and [[Walker A Peer Tutoring Addition]] projects.&lt;br /&gt;
&lt;br /&gt;
* Another example of studying the effects of hints is [[Ringenberg Examples-as-Help|Ringenberg&#039;s study]], in which hints are compared to examples. &lt;br /&gt;
&lt;br /&gt;
* Going to run an in-vivo study at a LearnLab site? Check out how to answer the [[FAQ for teachers|teachers&#039; FAQ]].&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
Plans for June 2007 - Dec. 2007:&lt;br /&gt;
* Present the study at the International Conference on Artificial Intelligence in Education&lt;br /&gt;
* Submit a camera-ready copy of the paper to the Journal on Metacognition and Instruction&lt;br /&gt;
* Analyze the log files for metacognitive learning&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
# Aleven, V., &amp;amp; Koedinger, K.R. (2000) Limitations of student control: Do students know when they need help? in proceedings of 5th International Conference on Intelligent Tutoring Systems, 292-303. Berlin: Springer Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2004) Toward tutoring help seeking - Applying cognitive modeling to meta-cognitive skills . in proceedings of 7th International Conference on Intelligent Tutoring Systems, 227-39. Berlin: Springer-Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., Roll, I., McLaren, B.M., Ryu, E.J., &amp;amp; Koedinger, K.R. (2005) An architecture to combine meta-cognitive and cognitive tutoring: Pilot testing the Help Tutor. in proceedings of 12th International Conference on Artificial Intelligence in Education, Amsterdam, The Netherlands: IOS press. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2006). Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. Int Journal of Artificial Intelligence in Education(16), 101-30 [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Baker, R.S., Corbett, A.T., &amp;amp; Koedinger, K.R. (2004) Detecting Student Misuse of Intelligent Tutoring Systems. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 531-40.&lt;br /&gt;
# Baker, R.S., Roll, I., Corbett, A.T., &amp;amp; Koedinger, K.R. (2005) Do Performance Goals Lead Students to Game the System? in proceedings of 12th International Conference on Artificial Intelligence in Education, 57-64. Amsterdam, The Netherlands: IOS Press.&lt;br /&gt;
# Chang, K.K., Beck, J.E., Mostow, J., &amp;amp; Corbett, A. (2006) Does Help Help? A Bayes Net Approach to Modeling Tutor Interventions. in proceedings of Workshop on Educational Data Mining at AAAI 2006, 41-6. Menlo Park, California: AAAI.&lt;br /&gt;
# Feldstein, M.S. (1964). The Social Time Preference Discount Rate in Cost-Benefit Analysis. The Economic Journal 74(294), 360-79&lt;br /&gt;
# Roll, I., Baker, R.S., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (2005) Modeling Students’ Metacognitive Errors in Two Intelligent Tutoring Systems. in L. Ardissono,  (Eds.), in proceedings of User Modeling 2005, 379-88. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Ryu, E., Sewall, J., Leber, B., McLaren, B.M., Aleven, V., &amp;amp; Koedinger, K.R. (2006) Towards Teaching Metacognition: Supporting Spontaneous Self-Assessment. in proceedings of 8th International Conference on Intelligent Tutoring Systems, 738-40. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., Ryu, E., Baker, R.S., &amp;amp; Koedinger, K.R. (2006) The Help Tutor: Does Metacognitive Feedback Improve Students&#039; Help-Seeking Actions, Skills and Learning? in proceedings of 8th International Conference on Intelligent Tutoring Systems, 360-9. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (To appear) Can Help Seeking Be Tutored? Searching for the Secret Sauce of Metacognitive Tutoring. To appear in the proceedings of the International Conference on Artificial Intelligence in Education 2007.&lt;br /&gt;
# Schworm, S., &amp;amp; Renkl, A. (2002) Learning by solved example problems: Instructional explanations reduce self-explanation activity. in proceedings of The 24Th Annual Conference of the Cognitive Science Society, 816-21. Mahwah, NJ: Erlbaum.&lt;br /&gt;
# Yudelson, M.V., Medvedeva, O., Legowski, E., Castine, M., Jukic, D., &amp;amp; Crowley, R.S. (2006) Mining Student Learning Data to Develop High Level Pedagogic Strategy in a Medical ITS. in proceedings of Workshop on Educational Data Mining at AAAI 2006, Menlo Park, CA: AAAI.&lt;br /&gt;
# Roll, I., Aleven, V., &amp;amp; Koedinger, K.R. (2004) Promoting Effective Help-Seeking Behavior through Declarative Instruction. in proceedings of 7th International Conference on Intelligent Tutoring Systems, 857-9. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Empirical Study]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7131</id>
		<title>The Help Tutor Roll Aleven McLaren</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=The_Help_Tutor_Roll_Aleven_McLaren&amp;diff=7131"/>
		<updated>2008-02-13T19:30:54Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Evaluation of goal 4: Improve future metacognitive behavior */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Towards Tutoring [[Metacognition]] - The Case of Help Seeking ==&lt;br /&gt;
Ido Roll, Vincent Aleven, Bruce M. McLaren, Kenneth Koedinger&lt;br /&gt;
&lt;br /&gt;
=== Meta-data ===&lt;br /&gt;
&lt;br /&gt;
PIs: Vincent Aleven, Ido Roll, Bruce M. McLaren&lt;br /&gt;
&lt;br /&gt;
Other Contributors: EJ Ryu (programmer)&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Study # !! Start Date !! End Date !! LearnLab Site !!  # of Students !! Total Participant Hours !! DataShop? &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;1&#039;&#039;&#039; || 2004       || 2004     || Analysis of existing data || 40 || 280 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;2&#039;&#039;&#039; || 2005       || 2005     || Analysis of existing data || 70 || 105 || No, old data&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;3&#039;&#039;&#039; || 5/2005     || 5/2005   || Hampton &amp;amp; Wilkinsburg (Geometry) || 60 || 270 || No, incompatible format&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;4&#039;&#039;&#039; || 2/2006     || 4/2006   || CWCTC (Geometry)          || 84 || 1,008 || Yes&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
While working with a tutoring system, students are expected to regulate their own learning process. However, they often demonstrate an inadequate [[metacognition|metacognitive process]] in doing so. For example, students often ask for help too frequently or not frequently enough. &lt;br /&gt;
In this project we built an Intelligent Tutoring System to teach [[metacognition]], and in particular, to improve students&#039; [[help-seeking behavior]].  Our Help Seeking Support Environment includes three components:&lt;br /&gt;
# Direct [[help-seeking behavior|help seeking]] [[explicit instruction|instruction]], given by the teacher&lt;br /&gt;
# A [[Self-Assessment]] Tutor, to help students evaluate their own need for help&lt;br /&gt;
# The Help Tutor - a domain-independent agent that can be added as an adjunct to a [[cognitive tutor]]. Rather than making help-seeking decisions for the students, the Help Tutor teaches better help-seeking skills by tracing students&#039; actions on a (meta)cognitive [[help-seeking model]] and giving students appropriate feedback. &lt;br /&gt;
&lt;br /&gt;
In a series of [[in vivo experiment]]s, the Help Tutor accurately detected help-seeking errors that were associated with poorer learning and with poorer [[declarative]] and [[procedural]] [[knowledge component]]s of help seeking.  The main findings were that students made fewer help-seeking errors while working with the Help Tutor and acquired better help seeking [[declarative]] [[knowledge component]]s. &lt;br /&gt;
However, we did not find evidence that this led to an improvement in learning at the domain level or to better [[help-seeking behavior]] in a paper-and-pencil environment. &lt;br /&gt;
We pose a number of hypotheses in an attempt to explain these results. We question the current focus of metacognitive tutoring, and suggest ways to reexamine the role of [[help facilities]] and of metacognitive tutoring within Intelligent Tutoring Systems.&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
&lt;br /&gt;
Teaching [[metacognition]] not only holds the promise of improving students&#039; current learning of the domain of interest; it can also, perhaps mainly, accelerate future learning and support successful regulation of independent learning. One example of metacognitive knowledge is help-seeking [[knowledge component]]s: the ability to identify the need for help, and to elicit appropriate assistance from the [[relevant resources|help facilities]].  &lt;br /&gt;
However, considerable evidence shows that metacognitive [[knowledge component]]s are in need of better support. For example, while working with Intelligent Tutoring Systems, students try to &amp;quot;[[game the system]]&amp;quot; or do not [[self-explanation|self-explain]] enough. Similarly, research shows that students&#039; [[help-seeking behavior]] leaves much room for improvement. &lt;br /&gt;
&lt;br /&gt;
==== Shallow help seeking [[knowledge component]]s ====&lt;br /&gt;
Research shows that students do not use their help-seeking [[knowledge component]]s appropriately. For example, Aleven et al. (2006) show that 30% of students&#039; actions were consecutive fast help requests (a common form of [[help abuse]], termed &#039;[[clicking through hints]]&#039;), made without taking enough time to read the requested hints.  &lt;br /&gt;
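The kind of log-file classification described here can be sketched as follows. This is an illustrative reconstruction, not the project&#039;s actual detector; the log fields (action, duration_s) and the thresholds are assumptions.&lt;br /&gt;

```python
# Hypothetical sketch: flag "clicking through hints" in tutor log data,
# i.e. runs of consecutive hint requests made too quickly to be read.
# Field names and thresholds are illustrative, not the actual log schema.

def flag_fast_hint_runs(events, min_read_time=2.0, min_run=2):
    """Return indices of hint requests occurring in runs of at least
    min_run consecutive hint requests, each under min_read_time seconds."""
    flagged, run = [], []
    for i, e in enumerate(events):
        if e["action"] == "hint" and e["duration_s"] < min_read_time:
            run.append(i)
        else:
            if len(run) >= min_run:
                flagged.extend(run)
            run = []
    if len(run) >= min_run:       # flush a run that ends the log
        flagged.extend(run)
    return flagged

log = [
    {"action": "hint", "duration_s": 0.8},
    {"action": "hint", "duration_s": 1.1},
    {"action": "hint", "duration_s": 0.9},
    {"action": "attempt", "duration_s": 12.0},
    {"action": "hint", "duration_s": 6.5},   # slow request: probably read
]
print(flag_fast_hint_runs(log))  # the first three hints form a fast run
```

Such a detector only approximates the published analyses, which classified many more error types.&lt;br /&gt;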
Extensive log-file analysis suggests that students apply faulty [[knowledge component]]s such as the following:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[procedural]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
Cognitive aspects:&lt;br /&gt;
  If I don’t know the answer =&amp;gt; &lt;br /&gt;
  I should guess&lt;br /&gt;
&lt;br /&gt;
Motivational aspects:&lt;br /&gt;
  If I get the answer correct =&amp;gt;&lt;br /&gt;
  I achieved the goal&lt;br /&gt;
&lt;br /&gt;
Social aspects:&lt;br /&gt;
  If I ask for help =&amp;gt;&lt;br /&gt;
  I am weak&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Faulty [[declarative]] [[knowledge component]]s:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
  Asking for hints will always reduce my skill level&lt;br /&gt;
&lt;br /&gt;
  Making an error is better than asking for a hint&lt;br /&gt;
&lt;br /&gt;
  Only weak people ask for help&lt;br /&gt;
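The faulty [[knowledge component]]s above read naturally as condition-action production rules. The sketch below illustrates that reading; the rule names and situation fields are hypothetical and are not taken from the Help Tutor&#039;s model.&lt;br /&gt;

```python
# Illustrative only: faulty help-seeking knowledge components expressed as
# condition-action rules. Rule names and situation fields are hypothetical.

faulty_rules = [
    # "If I don't know the answer => I should guess"
    ("guess-when-stuck", lambda s: not s["knows_answer"], "guess"),
    # "If I get the answer correct => I achieved the goal"
    ("correct-means-done", lambda s: s["last_answer_correct"], "stop reflecting"),
    # "If I ask for help => I am weak"
    ("help-means-weak", lambda s: s["asked_for_help"], "feel weak"),
]

def fire(rules, situation):
    """Return the action of every rule whose condition matches the situation."""
    return [action for _, cond, action in rules if cond(situation)]

situation = {"knows_answer": False,
             "last_answer_correct": False,
             "asked_for_help": False}
print(fire(faulty_rules, situation))  # only the guessing rule fires
```

Model tracing, as used by the Help Tutor, works by matching student actions against rules of roughly this shape.&lt;br /&gt;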
&lt;br /&gt;
==== Teaching vs. supporting [[metacognition]] ====&lt;br /&gt;
&lt;br /&gt;
Several systems support students&#039; metacognitive actions in a way that encourages, or even forces, students to learn productively and efficiently. For example, a tutoring system can require the student to self-explain. While this approach is likely to improve domain learning in the supported environment, the effect is not likely to persist beyond the scope of the tutoring system, and therefore is not likely to help students become better future learners. &lt;br /&gt;
&lt;br /&gt;
For that reason, we chose not to &#039;&#039;&#039;support&#039;&#039;&#039; students&#039; help-seeking actions, but to &#039;&#039;&#039;teach&#039;&#039;&#039; them better help-seeking skills. Rather than making the metacognitive decisions for the students (for example, by preventing help-seeking errors or gaming opportunities), this study focuses on helping students refine their help-seeking [[knowledge component]]s and acquire better [[feature validity]] of their [[help-seeking behavior|help-seeking]] [[metacognition|metacognitive skills]].&lt;br /&gt;
&lt;br /&gt;
By doing so, we examine whether metacognitive knowledge can be taught using familiar conventional domain-level pedagogies.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
See [[:Category:Help Tutor|Help Tutor Glossary]]&lt;br /&gt;
&lt;br /&gt;
=== Research questions ===&lt;br /&gt;
&lt;br /&gt;
# Can conventional and well-established instructional principles in the domain level be used to tutor [[metacognition|metacognitive]] [[knowledge component]]s such as [[help-seeking behavior|Help Seeking]] [[knowledge component]]s?&lt;br /&gt;
# Does the practice of better metacognitive behavior translate, in turn, into better domain learning?&lt;br /&gt;
&lt;br /&gt;
In addition, the project makes the following contributions:&lt;br /&gt;
# An improved understanding of the nature of help-seeking knowledge and its acquisition.&lt;br /&gt;
# A novel framework for the design of goals, interaction, and assessment for metacognitive tutoring.&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
&lt;br /&gt;
Two studies were performed with the Help Tutor. In both studies the independent variable was the presence of help-seeking support.&lt;br /&gt;
The control condition used the conventional Geometry Cognitive Tutor:&lt;br /&gt;
&lt;br /&gt;
[[Image:Geometry Cognitive Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
The treatment condition varied between studies:&lt;br /&gt;
* Study one: The Geometry Cognitive Tutor + the Help Tutor&lt;br /&gt;
* Study two: The Geometry Cognitive Tutor + the Help Seeking Support Environment (help seeking explicit instruction, self-assessment tutor, and Help Tutor)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Help Tutor:&#039;&#039;&#039;&lt;br /&gt;
The Help Tutor is a Cognitive Tutor in its own right: it identifies recommended types of actions by tracing students’ interaction with the Geometry Cognitive Tutor against a metacognitive help-seeking model. When students perform actions that deviate from the recommended ones, the Help Tutor presents a message that stresses the recommended action to be taken. Messages from the metacognitive Help Tutor and the domain-level Cognitive Tutor are coordinated, so that the student receives only the most helpful message at each point [2].   &lt;br /&gt;
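The coordination between domain-level and metacognitive messages might look roughly like the following; the priority rule shown (domain feedback wins when both tutors have something to say) is an assumption for illustration, not the published algorithm.&lt;br /&gt;

```python
# Sketch of showing the student at most one feedback message at a time.
# The tie-breaking rule below is an assumption, not the Help Tutor's
# actual prioritization algorithm.

def select_message(domain_msg, metacog_msg):
    """Return the single message to display, or None if neither tutor
    produced feedback for this student action."""
    if domain_msg is not None:
        return domain_msg        # assumed: domain feedback takes priority
    return metacog_msg           # otherwise fall back to the metacognitive one

print(select_message("Check the angle sum.", "Slow down and read the hint."))
print(select_message(None, "Slow down and read the hint."))
```

The point of such a scheme is that metacognitive feedback never competes with domain feedback for the student&#039;s attention.&lt;br /&gt;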
&lt;br /&gt;
[[Image:The Help-tutor.jpg]]&lt;br /&gt;
[[image:The Help Seeking Model.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Self-Assessment Tutor:&#039;&#039;&#039;&lt;br /&gt;
The ability to correctly self-assess one’s own knowledge level is correlated with strategic use of help (Tobias and Everson, 2002). The Self-Assessment Tutor is designed to tutor students on &lt;br /&gt;
their self-assessment skills; to help students make appropriate learning decisions based on their self-assessment; and, mainly, to give students a tutoring environment, low in cognitive load, in which they can practice using their help-seeking skills.  &lt;br /&gt;
The curriculum used by the Treatment group in study two interleaves Self-Assessment and Cognitive Tutor + Help Tutor sessions, with the Self-Assessment sessions taking about 10% of the students’ time. During each self-assessment session the student assesses the skills to be practiced in the subsequent Cognitive Tutor section.&lt;br /&gt;
  &lt;br /&gt;
[[Image:The Self-Assessment Tutor.jpg]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Explicit help-seeking instruction:&#039;&#039;&#039;&lt;br /&gt;
As White and Frederiksen (1998) demonstrated, reflecting in the classroom environment on the desired metacognitive process helps students internalize it. With that goal in mind, we created a short classroom lesson about help seeking with the following objectives: to give students a better declarative understanding of desired and effective help-seeking behavior; to improve their dispositions and attitudes towards seeking help; and to frame help-seeking knowledge as an important learning goal, alongside Geometry knowledge, for the coming few weeks. The instruction includes a video presentation with examples of productive and faulty help-seeking behavior and the appropriate help-seeking principles. &lt;br /&gt;
&lt;br /&gt;
[[Image:Explicit help-seeking instruction.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
&lt;br /&gt;
The study uses two levels of dependent measures:&lt;br /&gt;
# Directly assessing Help Seeking skills&lt;br /&gt;
# Assessing domain-level learning, thereby evaluating the contribution of the help-seeking skills.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Assessments of help-seeking knowledge:&lt;br /&gt;
* [[Normal post-test]]: &lt;br /&gt;
** Declarative: hypothetical help-seeking dilemmas&lt;br /&gt;
** Procedural: Help seeking error rate while working with the tutor&lt;br /&gt;
* [[Transfer]]: Ability to use optional hints embedded within certain test items in the paper test.&lt;br /&gt;
&lt;br /&gt;
[[Image:embedded hints.jpg]]&lt;br /&gt;
&lt;br /&gt;
2. Assessments of domain knowledge:&lt;br /&gt;
* [[Normal post-test]]: Problem solving and explanation items like those in the tutor&#039;s instruction.&lt;br /&gt;
* [[Transfer]]: &lt;br /&gt;
** Data insufficiency (or &amp;quot;not enough information&amp;quot;) items.&lt;br /&gt;
** Conceptual understanding items (study two only)&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
&lt;br /&gt;
The combination of explicit help-seeking instruction, timely feedback on help-seeking errors, and raised awareness of knowledge deficits will&lt;br /&gt;
* Improve [[feature validity]] of students&#039; help seeking skills&lt;br /&gt;
and thus, in turn, will&lt;br /&gt;
* Improve learning of domain knowledge by using those skills effectively.&lt;br /&gt;
&lt;br /&gt;
==== Instructional Principles ====&lt;br /&gt;
&lt;br /&gt;
The main principle being evaluated here is whether [[Roll_help seeking principle | instruction should support meta-cognition in the context of problem solving]] by using principles of cognitive tutoring such as:&lt;br /&gt;
* Giving direct instruction&lt;br /&gt;
* Giving immediate feedback on errors&lt;br /&gt;
* Prompting for self-assessment&lt;br /&gt;
&lt;br /&gt;
This utilizes the following instructional principles:&lt;br /&gt;
&lt;br /&gt;
* The Self-Assessment Tutor utilizes the [[Reflection questions]] principle&lt;br /&gt;
* The Help Tutor itself utilizes the [[Tutoring feedback]] principle&lt;br /&gt;
* The Help Seeking Instruction utilizes the [[Explicit instruction]] principle.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
As seen below (adapted from Roll et al. 2006), metacognitive tutoring has the following goals:&lt;br /&gt;
# First, the tutoring system should capture metacognitive errors (in our case, help-seeking errors).&lt;br /&gt;
# Then, it should lead to an improved metacognitive behavior within the tutoring system.&lt;br /&gt;
# This, in turn, should lead to an improvement in the domain learning.&lt;br /&gt;
# The effect should persist beyond the scope of the tutoring system.&lt;br /&gt;
# As a result, students are expected to become better future learners.&lt;br /&gt;
&lt;br /&gt;
[[Image:Roll Pyramid.png]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 1: Capture metacognitive errors ====&lt;br /&gt;
&lt;br /&gt;
In study 1, 17% of students&#039; actions were classified as errors. These errors were significantly negatively correlated with learning (r = -0.42): the more help-seeking errors the system captured, the smaller the improvement from pre- to post-test.&lt;br /&gt;
This result suggests that the help-seeking model captures meaningful actions, and that the goal was achieved: the Help Tutor captures help-seeking errors.&lt;br /&gt;
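A correlation of this kind, between per-student error rates and pre-to-post gains, can be computed as below. The numbers here are made up for illustration and do not come from the study.&lt;br /&gt;

```python
# Pearson correlation between help-seeking error rate and learning gain.
# The per-student data below are fabricated for illustration only; they
# merely mimic the negative relationship reported in the study.
from statistics import mean

def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

error_rate = [0.10, 0.15, 0.20, 0.25, 0.30]   # hypothetical per-student rates
gain       = [0.40, 0.30, 0.35, 0.20, 0.10]   # hypothetical pre-to-post gains
print(round(pearson_r(error_rate, gain), 2))  # negative: more errors, less gain
```

With real data one would also test the correlation for significance, as the study did.&lt;br /&gt;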
&lt;br /&gt;
[[Image:help-seeking and learning.png]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 2: Improve metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
Log files from study 1 were analyzed to evaluate whether students improved their help-seeking behavior while working with the Help Tutor. While overall there was only a minor decrease in the error rate (from 19% in the control condition to 16% in the Help Tutor condition), there was a significant decrease in the error rate on first hints and on subsequent hints, and in the proportion of bottom-out hints among all hint sequences. &lt;br /&gt;
&lt;br /&gt;
[[Image:help-seeking behavior.png]]&lt;br /&gt;
&lt;br /&gt;
This data suggests that while the system was not able to improve all aspects of the desired metacognitive behavior, it did improve students&#039; behavior on the common types of errors.&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 3: Improve domain learning ====&lt;br /&gt;
&lt;br /&gt;
While students&#039; help-seeking behavior improved while working with the Help Tutor (in study 1) or the full Help Seeking Support Environment (in study 2), we did not observe differences in learning between the two conditions in either study.&lt;br /&gt;
&lt;br /&gt;
Study 1 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study 1 results.png]]&lt;br /&gt;
&lt;br /&gt;
Study 2 results:&lt;br /&gt;
&lt;br /&gt;
[[Image:study 2 results.png]]&lt;br /&gt;
&lt;br /&gt;
* Note that since the tests at times 1 and 2 evaluated different instructional units (Angles vs. Quadrilaterals), lower grades at time 2 do not suggest a decrease in knowledge. &lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 4: Improve future metacognitive behavior ====&lt;br /&gt;
&lt;br /&gt;
To evaluate whether the effect of the help-seeking curriculum persists beyond the tutored environment, students&#039; help seeking behavior was evaluated in a transfer environment - the paper and pencil tests.&lt;br /&gt;
&lt;br /&gt;
Hypothetical help seeking dilemmas, such as the one described below, were used to evaluate declarative help-seeking knowledge.&lt;br /&gt;
&lt;br /&gt;
  1. You tried to answer a question that you know, but for some reason the tutor says that your answer is wrong. What should you do? &lt;br /&gt;
  [ ] First I would review my calculations. Perhaps I can find the mistake myself? &lt;br /&gt;
  [ ] The Tutor must have made a mistake. I will retype the same answer again. &lt;br /&gt;
  [ ] I would ask for a hint, to understand my mistake.&lt;br /&gt;
&lt;br /&gt;
Procedural help-seeking skills were evaluated using the hints embedded in the tests (see the figure in the Dependent Variables section above).&lt;br /&gt;
&lt;br /&gt;
In study 1 (which included only the Help Tutor component), students in the Treatment condition demonstrated neither better declarative nor better procedural help-seeking knowledge compared with the Control condition.&lt;br /&gt;
&lt;br /&gt;
In study 2 (which included the explicit help-seeking instruction and the Self-Assessment Tutor in addition to the Help Tutor), students in the Treatment condition demonstrated better declarative help-seeking knowledge than Control-group students, but no better procedural knowledge.&lt;br /&gt;
&lt;br /&gt;
[[Image:Declarative knowledge.png]]&lt;br /&gt;
[[Image:Procedural_knowledge.jpg]]&lt;br /&gt;
&lt;br /&gt;
==== Evaluation of goal 5: Improve future domain learning ====&lt;br /&gt;
&lt;br /&gt;
Due to technical difficulties, this goal was not evaluated in either study.&lt;br /&gt;
&lt;br /&gt;
==== Summary of results ====&lt;br /&gt;
&lt;br /&gt;
Overall, the following pattern of results emerges from the studies:&lt;br /&gt;
- The Help Seeking Support Environment captured inappropriate help-seeking actions during the learning process&lt;br /&gt;
- Students improved their help-seeking behavior while working with the system&lt;br /&gt;
- Students acquired better declarative help-seeking knowledge from using the system&lt;br /&gt;
- However, students&#039; domain learning did not improve&lt;br /&gt;
- Also, the improvement in students&#039; help-seeking behavior did not persist beyond the tutoring system.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
These somewhat disappointing results raise important questions: Why did the environment not lead to an improvement in learning or in help-seeking behavior on the paper-test measures? Why did the improved online help-seeking behavior not lead to improved learning gains?&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 1: Students do not have the skills, but we didn&#039;t teach them right. ==== &lt;br /&gt;
One possible explanation may be that the Help Seeking Support Environment imposes excessive cognitive load during problem solving. Clearly, the learning process with the Help Seeking Support Environment is more demanding than with the conventional Cognitive Tutor alone, since more needs to be learned. However, much of the extra content is introduced during the classroom discussion and self-assessment sessions. The only extra content presented during the problem-solving sessions is the Help Tutor’s error messages, but these are not expected to increase the load much, especially given that a prioritization algorithm makes sure students receive only one message at a time (either from the Help Tutor or the [[cognitive tutor]]). Also, if the Help Seeking Support Environment indeed imposed too much cognitive load, it would be expected to hinder learning, which we did not observe.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 2: The role of help seeking in ITS ==== &lt;br /&gt;
Hints in tutoring systems have two objectives: to promote learning of challenging skills, and to help students move forward within the curriculum (i.e., to prevent them from getting stuck). While the latter is achieved easily with both the Cognitive Tutor and the Help Seeking Support Environment, achieving the former is much harder. It is not yet clear what makes a hint a good hint, or how to create an effective [[hint sequence]]. It is possible that the hints, as implemented in the units of the Cognitive Tutor we used, are not optimal. For example, there may be too many levels of hints, with each level adding too little information to the previous one. Also, perhaps the detailed explanations are too demanding with regard to students’ reading comprehension ability. It is quite possible that these hints, regardless of how they are used, are above students&#039; [[zone of proximal development]], and thus do not contribute much to learning. Support for that idea comes from Schworm and Renkl (2002), who found that explanations offered by the system impaired learning when [[self-explanation]] was required. The Geometry Cognitive Tutor prompts for [[self-explanation]] in certain units. Perhaps elaborated hints are redundant, or even damaging, when [[self-explanation]] is required. &lt;br /&gt;
It is also possible that [[help-seeking behavior]] we currently view as faulty may actually be useful and desirable, in specific contexts for specific students. For example, perhaps a student who does not know the material should be allowed to view the bottom-out hint immediately, in order to turn the problem into a solved example. Support for that idea can be found in work by Yudelson et al. (2006), in which students at a leading medical school successfully learned by repeatedly asking for more elaborated hints. Such “clicking-through hints” behavior would be considered faulty by the Help Seeking Support Environment. However, this population of students is known to have good metacognitive skills (without them it is unlikely they would have reached their current position). Thus, it seems that sometimes “misusing” help (e.g., [[help abuse]], according to the Help Seeking Support Environment) can be beneficial to some students. Further evidence can be found in Baker et al. (2004), who showed that some (but not all) students who “[[game the system]]” (i.e., click through hints or guess repeatedly) learn just as much as students who do not game. It may be the case that certain gaming behaviors are adaptive, not irrational. Students who use these strategies will insist on viewing the [[bottom out hint]] and will ignore all intermediate hints, whether domain-level or metacognitive. Once intermediate hints are ignored, better [[help-seeking behavior]] according to the Help Seeking Support Environment should have no effect whatsoever on domain [[knowledge component]]s, as indeed was seen.&lt;br /&gt;
It is possible that we are overestimating students’ ability to learn from hints. Our first recommendation is to re-evaluate the role of hints in [[cognitive tutor]]s using complementary methodologies such as log-file analysis (e.g., Chang et al. (2006) use dynamic Bayes nets to evaluate the contribution of hints in a reading tutor); tracing individual students (to evaluate the different uses students make of hints); experimentation with different types of hints (for example, proactive vs. on-demand); and analysis of human tutors who aid students while they work with [[cognitive tutor]]s.&lt;br /&gt;
&lt;br /&gt;
==== Hypothesis 3: The focus of metacognitive tutoring in ITS. ====&lt;br /&gt;
The previous hypothesis, focused on students’ tendency to skip hints, suggests that perhaps the main issue is not lack of knowledge but lack of motivation. In other words, &#039;&#039;&#039;perhaps students already have these skills in place, but choose not to use them.&#039;&#039;&#039; For students who ignore intermediate hints, metacognitive messages offer little incentive. While the Help Seeking Support Environment can increase the probability that a proper hint level appears on the screen, it has no influence on whether the hint is read, or whether the student attempts to understand it. Students may ignore the messages for several reasons. For example, they may habitually click through hints and resent the changes that the Help Seeking Support Environment imposes. This idea is consistent with the teachers’ observation that the students were not fond of the Help Seeking Support Environment&#039;s error messages. Students may comply with the messages in order to make progress, but beyond that ignore their content. The test data discussed above support this idea: on 7 of the 12 hint evaluations (seen in the findings section of goal 4), students scored lower on items with hints than on items without hints. A [[cognitive headroom]] explanation does not account for this difference, since the Request hints did not add much load. A more likely explanation is that students chose to skip the hints because they were new to them in the given context. Baker et al. (2005) reviewed several reasons why students [[game the system]]; while no clear answer was given, the question applies here as well. &lt;br /&gt;
Motivational issues bring us to our final hypothesis. Time preference discounting (Feldstein, 1964) is a term coined in economics to describe behavior in which people prefer a smaller immediate reward over a larger but more distant one. In the tutoring environment, weighing the immediate benefit of a correct answer against the delayed benefit (if any) of acting in a metacognitively sound manner may often lead the student to choose the former. If that is indeed the case, then students may already have the right metacognitive skills in place. The question we should be asking ourselves is not only how to get students to learn the desired metacognitive skills, but mainly how to get students to use them.&lt;br /&gt;
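The time-preference trade-off described above can be sketched numerically. The fragment below (all numbers hypothetical, purely illustrative) applies standard exponential discounting to compare a small immediate payoff with a nominally larger delayed one:

```python
# Hypothetical illustration of time-preference discounting (Feldstein, 1964):
# the present value of a delayed reward shrinks exponentially with delay.
def present_value(reward, discount_rate, delay):
    # Standard exponential discounting: PV = reward / (1 + r)^delay
    return reward / (1.0 + discount_rate) ** delay

immediate = present_value(1.0, 0.5, 0)  # small reward now (a correct answer)
delayed = present_value(3.0, 0.5, 4)    # larger reward four steps later (learning)
print(immediate, delayed)
```

With a sufficiently steep discount rate, the immediate payoff dominates even though the delayed reward is nominally three times larger, mirroring the choice to answer now rather than act metacognitively.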
&lt;br /&gt;
=== Connections===&lt;br /&gt;
&lt;br /&gt;
* The Help Tutor attempts to extend traditional tutoring beyond the common domains. In that respect, it is similar to the work of Amy Ogan on tutoring [[FrenchCulture | French Culture]].&lt;br /&gt;
&lt;br /&gt;
* The interaction between the student and the tutor, which is &amp;quot;natural&amp;quot; in the control condition, is guided by the Help Tutor in the treatment condition. This is similar to the scripting manipulations of the [[Rummel Scripted Collaborative Problem Solving]] and [[Walker A Peer Tutoring Addition]] projects.&lt;br /&gt;
&lt;br /&gt;
* Another example of studying the effects of hints is [[Ringenberg Examples-as-Help|Ringenberg&#039;s study]], which compares hints to examples. &lt;br /&gt;
&lt;br /&gt;
* Going to do an in-vivo study at a LearnLab site? Check out how to answer the [[FAQ for teachers|teachers&#039; FAQ]].&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
Plans for June 2007 - Dec. 2007:&lt;br /&gt;
* Present the study at the International Conference on Artificial Intelligence in Education&lt;br /&gt;
* Submit the camera-ready copy of the paper to the Journal on Metacognition and Instruction&lt;br /&gt;
* Analyze the log files for metacognitive learning&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
# Aleven, V., &amp;amp; Koedinger, K.R. (2000). Limitations of student control: Do students know when they need help? In Proceedings of the 5th International Conference on Intelligent Tutoring Systems, 292-303. Berlin: Springer Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2004). Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In Proceedings of the 7th International Conference on Intelligent Tutoring Systems, 227-39. Berlin: Springer-Verlag. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., Roll, I., McLaren, B.M., Ryu, E.J., &amp;amp; Koedinger, K.R. (2005). An architecture to combine meta-cognitive and cognitive tutoring: Pilot testing the Help Tutor. In Proceedings of the 12th International Conference on Artificial Intelligence in Education. Amsterdam, The Netherlands: IOS Press. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Aleven, V., McLaren, B.M., Roll, I., &amp;amp; Koedinger, K.R. (2006). Toward meta-cognitive tutoring: A model of help seeking with a Cognitive Tutor. International Journal of Artificial Intelligence in Education, 16, 101-30. [[http://www.cs.cmu.edu/~aleven/publications.html#help-seeking pdf]]&lt;br /&gt;
# Baker, R.S., Corbett, A.T., &amp;amp; Koedinger, K.R. (2004). Detecting Student Misuse of Intelligent Tutoring Systems. In Proceedings of the 7th International Conference on Intelligent Tutoring Systems, 531-40.&lt;br /&gt;
# Baker, R.S., Roll, I., Corbett, A.T., &amp;amp; Koedinger, K.R. (2005). Do Performance Goals Lead Students to Game the System? In Proceedings of the 12th International Conference on Artificial Intelligence in Education, 57-64. Amsterdam, The Netherlands: IOS Press.&lt;br /&gt;
# Chang, K.K., Beck, J.E., Mostow, J., &amp;amp; Corbett, A. (2006). Does Help Help? A Bayes Net Approach to Modeling Tutor Interventions. In Proceedings of the Workshop on Educational Data Mining at AAAI 2006, 41-6. Menlo Park, CA: AAAI.&lt;br /&gt;
# Feldstein, M.S. (1964). The Social Time Preference Discount Rate in Cost-Benefit Analysis. The Economic Journal, 74(294), 360-79.&lt;br /&gt;
# Roll, I., Baker, R.S., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (2005). Modeling Students’ Metacognitive Errors in Two Intelligent Tutoring Systems. In L. Ardissono (Ed.), Proceedings of User Modeling 2005, 379-88. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Ryu, E., Sewall, J., Leber, B., McLaren, B.M., Aleven, V., &amp;amp; Koedinger, K.R. (2006). Towards Teaching Metacognition: Supporting Spontaneous Self-Assessment. In Proceedings of the 8th International Conference on Intelligent Tutoring Systems, 738-40. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., Ryu, E., Baker, R.S., &amp;amp; Koedinger, K.R. (2006). The Help Tutor: Does Metacognitive Feedback Improve Students&#039; Help-Seeking Actions, Skills and Learning? In Proceedings of the 8th International Conference on Intelligent Tutoring Systems, 360-9. Berlin: Springer Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
# Roll, I., Aleven, V., McLaren, B.M., &amp;amp; Koedinger, K.R. (To appear). Can Help Seeking Be Tutored? Searching for the Secret Sauce of Metacognitive Tutoring. In Proceedings of the International Conference on Artificial Intelligence in Education 2007.&lt;br /&gt;
# Schworm, S., &amp;amp; Renkl, A. (2002). Learning by solved example problems: Instructional explanations reduce self-explanation activity. In Proceedings of the 24th Annual Conference of the Cognitive Science Society, 816-21. Mahwah, NJ: Erlbaum.&lt;br /&gt;
# Yudelson, M.V., Medvedeva, O., Legowski, E., Castine, M., Jukic, D., &amp;amp; Crowley, R.S. (2006). Mining Student Learning Data to Develop High Level Pedagogic Strategy in a Medical ITS. In Proceedings of the Workshop on Educational Data Mining at AAAI 2006. Menlo Park, CA: AAAI.&lt;br /&gt;
# Roll, I., Aleven, V., &amp;amp; Koedinger, K.R. (2004). Promoting Effective Help-Seeking Behavior through Declarative Instruction. In Proceedings of the 7th International Conference on Intelligent Tutoring Systems, 857-9. Berlin: Springer-Verlag. [[http://www.andrew.cmu.edu/user/iroll/Publications.html pdf]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Empirical Study]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Procedural_knowledge.jpg&amp;diff=7130</id>
		<title>File:Procedural knowledge.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Procedural_knowledge.jpg&amp;diff=7130"/>
		<updated>2008-02-13T19:30:41Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Interest_combined.jpg&amp;diff=7129</id>
		<title>File:Interest combined.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Interest_combined.jpg&amp;diff=7129"/>
		<updated>2008-02-13T19:27:27Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Graph40.jpg&amp;diff=7128</id>
		<title>File:Graph40.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Graph40.jpg&amp;diff=7128"/>
		<updated>2008-02-13T19:24:27Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7127</id>
		<title>REAP Study on Personalization of Readings by Topic (Fall 2006)</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7127"/>
		<updated>2008-02-13T19:21:50Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Findings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== REAP Study on Personalization of Readings for Increased Interest ==&lt;br /&gt;
 &lt;br /&gt;
=== Logistical Information ===&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Contributors&#039;&#039;&#039; || Maxine Eskenazi, Alan Juffs, Michael Heilman, Kevyn Collins-Thompson, Lois Wilson, Jamie Callan   &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || September 11, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || November 21, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Learnlab Courses&#039;&#039;&#039; || English Language Institute Reading 4 (ESL LearnLab) &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || 35 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours (est.)&#039;&#039;&#039; || 270 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Data in Datashop&#039;&#039;&#039; || no &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
This paper discusses the enhancement of the REAP tutor to allow for [[personalization]] of reading materials&lt;br /&gt;
by topic in order to increase interest and motivation. In this work, the term “[[personalization]]” refers to the&lt;br /&gt;
selection of practice readings in order to match a student’s interests. &lt;br /&gt;
&lt;br /&gt;
During each training session with REAP, students work through a series of readings, each of which is followed by&lt;br /&gt;
practice exercises for the target words in the reading. While reading a passage, students are able to access&lt;br /&gt;
dictionary definitions for any word in a reading either by clicking on a highlighted target word or by typing a&lt;br /&gt;
word into a box in the lower-left corner of the screen. The target words in the readings are also highlighted&lt;br /&gt;
because highlighting may increase the use of dictionary definitions, thus encouraging students to&lt;br /&gt;
coordinate multiple sources of information about a word’s meaning—namely, the implicit context around&lt;br /&gt;
words and the explicit definitions of words.&lt;br /&gt;
&lt;br /&gt;
A problem discovered in past studies with REAP is that many students spend only a brief amount of time&lt;br /&gt;
on a reading and do not deeply process the text. Students often only read the dictionary definition for target&lt;br /&gt;
words rather than attempting to process the entire context around the words. Inferring the meaning of&lt;br /&gt;
vocabulary from context is a seemingly important strategy that such students do not use. This behavior is likely due to a desire to perform well on the post-reading practice exercises and the post-test, which can be viewed as forms of extrinsic motivation. Intrinsically&lt;br /&gt;
motivated students who are more interested in a reading are more likely to read the entire text and to use&lt;br /&gt;
context to learn the meaning of unknown vocabulary. Therefore, [[personalization]] that increases intrinsic&lt;br /&gt;
motivation could lead to deeper processing of context and better learning of vocabulary.&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| ||&#039;&#039;&#039;Passive&#039;&#039;&#039; || &#039;&#039;&#039;Active&#039;&#039;&#039; || &#039;&#039;&#039;Interactive&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Explicit (general)&#039;&#039;&#039; || Dictionary Definitions ||  || Practice Exercises&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Implicit (instance)&#039;&#039;&#039; || Interpreting meaning in context while reading || Sentence Production (assessment) || Practice Exercises&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Intrinsic Motivation:&#039;&#039; Motivation to learn for learning&#039;s own sake rather than some external goal.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Extrinsic Motivation:&#039;&#039; Motivation to learn in order to satisfy an external goal, such as completing a task or passing an assessment.&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
&lt;br /&gt;
Do the benefits of [[personalization]] of practice readings by topics of interest outweigh the costs in a tutoring system for ESL vocabulary practice?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
[[Normal post-test]] scores &lt;br /&gt;
&lt;br /&gt;
[[Normal post-test]] scores for practiced words only&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test scores: the same post-test, administered months later.&lt;br /&gt;
&lt;br /&gt;
Evidence of [[Transfer]]: sentence production tasks for target words, correct use of words in writing assignments for other courses.&lt;br /&gt;
&lt;br /&gt;
=== Independent variables ===&lt;br /&gt;
[[Personalization]] of readings by topics of interest.  In the control condition, the tutor did not use potential personal interest as a factor in its selection of reading materials.  In the treatment condition, the tutor did use interest as a factor.  All other selection criteria were the same in both conditions.  Time on task was also the same.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
&lt;br /&gt;
Since intrinsic motivation seems to be important in language learning, the benefits of [[personalization]] will outweigh the costs.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Students in the treatment condition with [[personalization]] performed better on average (M=35.5%, SD=14.9%) in terms of overall post-test scores compared to students in the control condition (M=27.1%, SD=17.2%).  Further analysis, including statistical tests for significance, is forthcoming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:graph40.jpg|500px]]&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
There is evidence that the difference in post-test scores is due to increased interest leading to deeper processing of the reading practice texts.&lt;br /&gt;
&lt;br /&gt;
Responses to questionnaires following each reading show the interest level of students using the REAP tutor.  The questionnaires asked students to indicate, on a scale from one to five, their interest in the preceding text.  The distributions of post-reading interest ratings for students in the treatment and control conditions are shown in Figures 1 and 2.&lt;br /&gt;
&lt;br /&gt;
[[Image:Interest_combined.jpg|700px]]&lt;br /&gt;
&lt;br /&gt;
Students were also given an exit survey during their last week of practice with the tutor that asked them, among other questions, to indicate whether they agreed with the statement, “Most of the readings were interesting.”  The ratings were on a scale from one to five, with five indicating strong agreement and one indicating strong disagreement.  Exit survey interest ratings by students in the treatment condition were significantly higher (p&amp;lt;0.05) than the ratings by students in the control condition.  The mean response for students who received personalized readings was 3.18, while it was 2.65 for students in the control condition.&lt;br /&gt;
&lt;br /&gt;
Further analysis of post-test scores reveals that students did learn more of the words that they actually practiced in REAP.  The post-test contained 40 questions for target vocabulary words.  Many of the students did not practice 40 words, so performance on practiced words alone was analyzed.  Students in the treatment condition scored higher (N=16, M=50.3, SD=20.1) on questions for words seen in readings than did students in the control condition (N=19, M=32.4, SD=18.9).  A two-tailed t-test for independent means verified that this result is statistically significant (t=2.719, df=33, p=0.005).  The difference in scores between the two groups was 17.9% (95% CI = 4.5%, 31.3%), which corresponds to a large effect size of 0.85.  This result indicates that [[personalization]] improved learning for the words that students saw in readings, which is in line with previous findings that intrinsic motivation leads to improved learning.&lt;br /&gt;
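The reported statistics can be approximately reproduced from the summary values alone. The sketch below (plain Python, no external libraries) recomputes the pooled-SD t statistic and Cohen&#039;s d from the reported means, SDs, and group sizes; small discrepancies with the reported t=2.719 and effect size 0.85 presumably reflect rounding of the published summary statistics or a different SD convention.

```python
import math

# Summary statistics reported in the study (practiced-words post-test scores).
n1, m1, s1 = 16, 50.3, 20.1   # treatment (personalized): N, mean %, SD
n2, m2, s2 = 19, 32.4, 18.9   # control

# Pooled standard deviation for a two-sample t-test assuming equal variances.
sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))

# t statistic and degrees of freedom.
t = (m1 - m2) / (sp * math.sqrt(1.0 / n1 + 1.0 / n2))
df = n1 + n2 - 2

# Cohen's d using the pooled SD.
d = (m1 - m2) / sp

print(t, df, d)
```

This yields t close to 2.71 with df=33 and d close to 0.92, broadly consistent with the reported values.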
&lt;br /&gt;
&lt;br /&gt;
[[Image:post_just_practice.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
However, students in the treatment condition that included [[personalization]] saw fewer words in their training sessions (N=16, M=12.0 , SD=1.13) than students in the control condition (N=19, M=16.3, SD=0.87) (t=-2.9, df=33, p=0.006).  Average time on task was essentially the same for students in both conditions.  Students in the treatment condition spent slightly longer on each reading.  The main reason, however, for the difference in the average total number of words practiced was that students for whom the tutor provided personalized instruction saw fewer words (M=3.41, SD=0.55) per practice reading passage than students in the control condition (M=4.07, SD=0.83) (t=2.929, df=33, p=0.006).&lt;br /&gt;
&lt;br /&gt;
Thus, when the tutor used [[personalization]] as a factor in the selection of readings, it chose readings that were less valuable according to other factors.  Specifically, this result shows that by personalizing instruction, the tutor was not able to provide practice for as many words.  Of course, the practice that it did provide was better, as is shown in the previous result that for words students did practice, [[personalization]] appeared to increase learning.&lt;br /&gt;
&lt;br /&gt;
The reduced number of target words per text with personalization is a technical issue which can be avoided in a straightforward manner by increasing the size of the database of readings.  With more readings, the tutor can find texts that both have ample target words and cover topics of personal interest.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:words_per_reading.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
There is a possibility that the students in the treatment condition who were seeing fewer words in each reading were learning more of the words simply because they had fewer to learn per reading.  To rule out this hypothesis, regression analyses (multiple linear regression) were conducted with overall post-test performance and performance for practiced words as the dependent variables.  In both regression analyses, the number of target words per reading was not a significant predictor of performance.  In fact, the number of target words per document was slightly positively correlated with post-test performance in both cases.  This result seems to rule out the possibility that students were learning more target words in the treatment condition because they were seeing fewer words.  &lt;br /&gt;
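The mechanics of this control analysis can be illustrated with a small sketch. The fragment below uses entirely hypothetical data (not the study&#039;s) and fits an ordinary-least-squares regression of post-test score on condition and words-per-reading by solving the normal equations with Cramer&#039;s rule; because the synthetic scores are noiseless, the generating coefficients are recovered exactly.

```python
# Sketch of the control analysis described above: OLS regression of post-test
# score on condition and words-per-reading (all data hypothetical).

def det3(m):
    # Determinant of a 3x3 matrix, expanded along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def ols3(rows, y):
    # rows: list of [1, condition, words_per_reading]; solves (X'X) b = X'y
    # via Cramer's rule (fine for a 3-parameter illustration).
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(3)]
    d = det3(xtx)
    coefs = []
    for k in range(3):
        # Replace column k of X'X with X'y.
        mk = [[xty[i] if j == k else xtx[i][j] for j in range(3)] for i in range(3)]
        coefs.append(det3(mk) / d)
    return coefs

# Hypothetical students: intercept, treatment indicator, mean words per reading.
X = [[1, 1, 3.4], [1, 1, 3.2], [1, 1, 3.6], [1, 0, 4.1], [1, 0, 3.9], [1, 0, 4.2]]
# Scores generated as 20 + 15*condition + 2*words (noiseless, for illustration).
y = [20 + 15 * r[1] + 2 * r[2] for r in X]

b0, b_cond, b_words = ols3(X, y)
print(b0, b_cond, b_words)
```

In the actual analysis one would examine the coefficient (and its significance) on words-per-reading while controlling for condition, which is the comparison the paragraph above describes.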
&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test results showed no reliable differences because of a small sample size.  The test was administered to students who stayed in the ELI in the subsequent semester, which constituted only a fraction of the original sample.&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
&lt;br /&gt;
The following study addresses a different form of personalization, by which interactions with the learner (e.g., instructions, directions) are conducted using casual and direct rather than formal language:&lt;br /&gt;
&lt;br /&gt;
[[Stoichiometry_Study | Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems (McLaren, Koedinger &amp;amp; Yaron)]]&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
Note: a paper on this study has been submitted to International Journal of Artificial Intelligence in Education.&lt;br /&gt;
&lt;br /&gt;
[http://reap.cs.cmu.edu/Papers/heilman_topic_choice_AIED2007_poster_final.pdf Heilman, M., Juffs, A., &amp;amp; Eskenazi, M. (2007). Choosing Reading Passages for Vocabulary Learning by Topic to Increase Intrinsic Motivation. Proceedings of the 13th International Conference on Artificial Intelligence in Education. Marina del Rey, CA. (poster)]&lt;br /&gt;
&lt;br /&gt;
Clark, R. C. and Mayer, R. E. (2003). e-Learning and the Science of Instruction.  Jossey-Bass/Pfeiffer.&lt;br /&gt;
&lt;br /&gt;
Cordova, D. I. &amp;amp; Lepper, M. R. (1996).  Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice.  Journal of Educational Psychology.  Vol. 88, No. 4, 715-730. &lt;br /&gt;
&lt;br /&gt;
Lepper, M.  (1988).  Motivational Considerations in the Study of Instruction.  Cognition and Instruction. 5(4), 289-309.&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Post_just_practice.jpg&amp;diff=7126</id>
		<title>File:Post just practice.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Post_just_practice.jpg&amp;diff=7126"/>
		<updated>2008-02-13T19:20:59Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Words_per_reading.jpg&amp;diff=7125</id>
		<title>File:Words per reading.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Words_per_reading.jpg&amp;diff=7125"/>
		<updated>2008-02-13T19:16:56Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7124</id>
		<title>REAP Study on Personalization of Readings by Topic (Fall 2006)</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7124"/>
		<updated>2008-02-13T19:15:46Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Explanation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== REAP Study on Personalization of Readings for Increased Interest ==&lt;br /&gt;
 &lt;br /&gt;
=== Logistical Information ===&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Contributors&#039;&#039;&#039; || Maxine Eskenazi, Alan Juffs, Michael Heilman, Kevyn Collins-Thompson, Lois Wilson, Jamie Callan   &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || September 11, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || November 21, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Learnlab Courses&#039;&#039;&#039; || English Language Institute Reading 4 (ESL LearnLab) &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || 35 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours (est.)&#039;&#039;&#039; || 270 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Data in Datashop&#039;&#039;&#039; || no &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
This paper discusses the enhancement of the REAP tutor to allow for [[personalization]] of reading materials&lt;br /&gt;
by topic in order to increase interest and motivation. In this work, the term “[[personalization]]” refers to the&lt;br /&gt;
selection of practice readings in order to match a student’s interests. &lt;br /&gt;
&lt;br /&gt;
During each training session with REAP, students work through a series of readings, each of which is followed by&lt;br /&gt;
practice exercises for the target words in the reading. While reading a passage, students are able to access&lt;br /&gt;
dictionary definitions for any word in a reading either by clicking on a highlighted target word or by typing a&lt;br /&gt;
word into a box in the lower-left corner of the screen. The target words in the readings are also highlighted&lt;br /&gt;
because highlighting may increase the use of dictionary definitions, thus encouraging students to&lt;br /&gt;
coordinate multiple sources of information about a word’s meaning—namely, the implicit context around&lt;br /&gt;
words and the explicit definitions of words.&lt;br /&gt;
&lt;br /&gt;
A problem discovered in past studies with REAP is that many students spend only a brief amount of time&lt;br /&gt;
on a reading and do not deeply process the text. Students often only read the dictionary definition for target&lt;br /&gt;
words rather than attempting to process the entire context around the words. Inferring the meaning of&lt;br /&gt;
vocabulary from context is a seemingly important strategy that such students do not use. This behavior is likely due to a desire to perform well on the post-reading practice exercises and the post-test, which can be viewed as forms of extrinsic motivation. Intrinsically&lt;br /&gt;
motivated students who are more interested in a reading are more likely to read the entire text and to use&lt;br /&gt;
context to learn the meaning of unknown vocabulary. Therefore, [[personalization]] that increases intrinsic&lt;br /&gt;
motivation could lead to deeper processing of context and better learning of vocabulary.&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| ||&#039;&#039;&#039;Passive&#039;&#039;&#039; || &#039;&#039;&#039;Active&#039;&#039;&#039; || &#039;&#039;&#039;Interactive&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Explicit (general)&#039;&#039;&#039; || Dictionary Definitions ||  || Practice Exercises&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Implicit (instance)&#039;&#039;&#039; || Interpreting meaning in context while reading || Sentence Production (assessment) || Practice Exercises&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Intrinsic Motivation:&#039;&#039; Motivation to learn for learning&#039;s own sake rather than some external goal.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Extrinsic Motivation:&#039;&#039; Motivation to learn in order to satisfy an external goal, such as completing a task or passing an assessment.&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
&lt;br /&gt;
Do the benefits of [[personalization]] of practice readings by topics of interest outweigh the costs in a tutoring system for ESL vocabulary practice?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
[[Normal post-test]] scores &lt;br /&gt;
&lt;br /&gt;
[[Normal post-test]] scores for practiced words only&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test scores: the same post-test, administered months later.&lt;br /&gt;
&lt;br /&gt;
Evidence of [[Transfer]]: sentence production tasks for target words, correct use of words in writing assignments for other courses.&lt;br /&gt;
&lt;br /&gt;
=== Independent variables ===&lt;br /&gt;
[[Personalization]] of readings by topics of interest.  In the control condition, the tutor did not use potential personal interest as a factor in its selection of reading materials.  In the treatment condition, the tutor did use interest as a factor.  All other selection criteria were the same in both conditions.  Time on task was also the same.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
&lt;br /&gt;
Since intrinsic motivation seems to be important in language learning, the benefits of [[personalization]] will outweigh the costs.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Students in the treatment condition with [[personalization]] performed better on average (M=35.5%, SD=14.9%) in terms of overall post-test scores compared to students in the control condition (M=27.1%, SD=17.2%).  Further analysis, including statistical tests for significance, is forthcoming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:graph40.PNG|500px]]&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
There is evidence that the difference in post-test scores is due to increased interest leading to deeper processing of the reading practice texts.&lt;br /&gt;
&lt;br /&gt;
Responses to questionnaires following each reading show the interest level of students using the REAP tutor.  The questionnaires asked students to indicate, on a scale from one to five, their interest in the preceding text.  The distributions of post-reading interest ratings for students in the treatment and control conditions are shown in Figures 1 and 2.&lt;br /&gt;
&lt;br /&gt;
[[Image:Interest_combined.jpg|700px]]&lt;br /&gt;
&lt;br /&gt;
Students were also given an exit survey during their last week of practice with the tutor that asked them, among other questions, to indicate whether they agreed with the statement, “Most of the readings were interesting.”  The ratings were on a scale from one to five, with five indicating strong agreement and one indicating strong disagreement.  Exit survey interest ratings by students in the treatment condition were significantly higher (p&amp;lt;0.05) than the ratings by students in the control condition.  The mean response for students who received personalized readings was 3.18, while it was 2.65 for students in the control condition.&lt;br /&gt;
&lt;br /&gt;
Further analysis of post-test scores reveals that students did learn more of the words that they actually practiced in REAP.  The post-test contained 40 questions for target vocabulary words.  Many of the students did not practice 40 words, so performance on practiced words alone was analyzed.  Students in the treatment condition scored higher (N=16, M=50.3, SD=20.1) on questions for words seen in readings than did students in the control condition (N=19, M=32.4, SD=18.9).  A two-tailed t-test for independent means verified that this result is statistically significant (t=2.719, df=33, p=0.005).  The difference in scores between the two groups was 17.9% (95% CI = 4.5%, 31.3%), which corresponds to a large effect size of 0.85.  This result indicates that [[personalization]] improved learning for the words that students saw in readings, which is in line with previous findings that intrinsic motivation leads to improved learning.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:post_just_practice.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
However, students in the treatment condition that included [[personalization]] saw fewer words in their training sessions (N=16, M=12.0 , SD=1.13) than students in the control condition (N=19, M=16.3, SD=0.87) (t=-2.9, df=33, p=0.006).  Average time on task was essentially the same for students in both conditions.  Students in the treatment condition spent slightly longer on each reading.  The main reason, however, for the difference in the average total number of words practiced was that students for whom the tutor provided personalized instruction saw fewer words (M=3.41, SD=0.55) per practice reading passage than students in the control condition (M=4.07, SD=0.83) (t=2.929, df=33, p=0.006).&lt;br /&gt;
&lt;br /&gt;
Thus, when the tutor used [[personalization]] as a factor in the selection of readings, it chose readings that were less valuable according to other factors.  Specifically, this result shows that by personalizing instruction, the tutor was not able to provide practice for as many words.  Of course, the practice that it did provide was better, as shown by the previous result that, for the words students did practice, [[personalization]] appeared to increase learning.&lt;br /&gt;
&lt;br /&gt;
The reduced number of target words per text with personalization is a technical issue which can be avoided in a straightforward manner by increasing the size of the database of readings.  With more readings, the tutor can find texts that both have ample target words and cover topics of personal interest.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:words_per_reading.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
There is a possibility that the students in the treatment condition who were seeing fewer words in each reading were learning more of the words simply because they had fewer to learn per reading.  To rule out this hypothesis, regression analyses (multiple linear regression) were conducted with overall post-test performance and performance for practiced words as the dependent variables.  In both regression analyses, the number of target words per reading was not a significant predictor of performance.  In fact, the number of target words per document was slightly positively correlated with post-test performance in both cases.  This result seems to rule out the possibility that students were learning more target words in the treatment condition because they were seeing fewer words.  &lt;br /&gt;
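The regression check described above can be sketched as follows. The data points below are hypothetical placeholders (the study's raw data are not reproduced on this page), chosen only to mirror the qualitative finding of a slight positive, non-significant relationship; the study used multiple linear regression, while this sketch uses a single predictor for brevity.

```python
# Illustrative sketch of the regression check described above: does the
# number of target words per reading predict post-test score?
# The (words_per_reading, posttest_score) pairs are HYPOTHETICAL
# placeholders, not the study's data.
import math

data = [(3.0, 40.0), (3.5, 45.0), (4.0, 38.0), (4.5, 47.0), (5.0, 42.0)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n

sxx = sum((x - mx) ** 2 for x, _ in data)
sxy = sum((x - mx) * (y - my) for x, y in data)

slope = sxy / sxx
intercept = my - slope * mx

# Residual variance and the standard error of the slope
sse = sum((y - (intercept + slope * x)) ** 2 for x, y in data)
se_slope = math.sqrt(sse / (n - 2) / sxx)

# t statistic for H0: slope == 0, with n-2 degrees of freedom; a small
# |t| means words-per-reading is not a significant predictor
t = slope / se_slope
print(f"slope = {slope:.2f}, t({n - 2}) = {t:.2f}")
```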
&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test results showed no reliable differences because of a small sample size.  The test was administered to students who stayed in the ELI in the subsequent semester, which constituted only a fraction of the original sample.&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
&lt;br /&gt;
The following study addresses a different form of personalization, by which interactions with the learner (e.g., instructions, directions) are conducted using casual and direct rather than formal language:&lt;br /&gt;
&lt;br /&gt;
[[Stoichiometry_Study | Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems (McLaren, Koedinger &amp;amp; Yaron)]]&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
Note: a paper on this study has been submitted to the International Journal of Artificial Intelligence in Education.&lt;br /&gt;
&lt;br /&gt;
[http://reap.cs.cmu.edu/Papers/heilman_topic_choice_AIED2007_poster_final.pdf Heilman, M., Juffs, A., &amp;amp; Eskenazi, M. (2007). Choosing Reading Passages for Vocabulary Learning by Topic to Increase Intrinsic Motivation. Proceedings of the 13th International Conference on Artificial Intelligence in Education. Marina del Rey, CA. (poster)]&lt;br /&gt;
&lt;br /&gt;
Clark, R. C. and Mayer, R. E. (2003). e-Learning and the Science of Instruction.  Jossey-Bass/Pfeiffer.&lt;br /&gt;
&lt;br /&gt;
Cordova, D. I. &amp;amp; Lepper, M. R. (1996).  Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice.  Journal of Educational Psychology.  Vol. 88, No. 4, 715-730. &lt;br /&gt;
&lt;br /&gt;
Lepper, M.  (1988).  Motivational Considerations in the Study of Instruction.  Cognition and Instruction. 5(4), 289-309.&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7123</id>
		<title>REAP Study on Personalization of Readings by Topic (Fall 2006)</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=REAP_Study_on_Personalization_of_Readings_by_Topic_(Fall_2006)&amp;diff=7123"/>
		<updated>2008-02-13T19:15:19Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Explanation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== REAP Study on Personalization of Readings for Increased Interest ==&lt;br /&gt;
 &lt;br /&gt;
=== Logistical Information ===&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Contributors&#039;&#039;&#039; || Maxine Eskenazi, Alan Juffs, Michael Heilman, Kevyn Collins-Thompson, Lois Wilson, Jamie Callan   &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || September 11, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || November 21, 2006  &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Learnlab Courses&#039;&#039;&#039; || English Language Institute Reading 4 (ESL LearnLab) &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || 35 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours (est.)&#039;&#039;&#039; || 270 &lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Data in Datashop&#039;&#039;&#039; || no &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
&lt;br /&gt;
This paper discusses the enhancement of the REAP tutor to allow for [[personalization]] of reading materials&lt;br /&gt;
by topic in order to increase interest and motivation. In this work, the term “[[personalization]]” refers to the&lt;br /&gt;
selection of practice readings in order to match a student’s interests. &lt;br /&gt;
&lt;br /&gt;
During each training session with REAP, students work through a series of readings, each of which is followed by&lt;br /&gt;
practice exercises for the target words in the reading. While reading a passage, students are able to access&lt;br /&gt;
dictionary definitions for any word in a reading either by clicking on a highlighted target word or by typing a&lt;br /&gt;
word into a box in the lower-left corner of the screen. The target words in the readings are also highlighted&lt;br /&gt;
because highlighting may increase the use of dictionary definitions, thus encouraging students to&lt;br /&gt;
coordinate multiple sources of information about a word’s meaning—namely, the implicit context around&lt;br /&gt;
words and the explicit definitions of words.&lt;br /&gt;
&lt;br /&gt;
A problem discovered in past studies with REAP is that many students spend only a brief amount of time&lt;br /&gt;
on a reading and do not deeply process the text. Students often only read the dictionary definition for target&lt;br /&gt;
words rather than attempting to process the entire context around the words. Inferring the meaning of&lt;br /&gt;
vocabulary from context is a seemingly important strategy that is not used by such students. This behavior is likely due to a desire to perform well on post-reading practice exercises and the post-test, which can be viewed as forms of extrinsic motivation. Intrinsically&lt;br /&gt;
motivated students who are more interested in a reading are more likely to read the entire text and to use&lt;br /&gt;
context to learn the meaning of unknown vocabulary. Therefore, [[personalization]] that increases intrinsic&lt;br /&gt;
motivation could lead to deeper processing of context and better learning of vocabulary.&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
|+ &lt;br /&gt;
|-&lt;br /&gt;
| ||&#039;&#039;&#039;Passive&#039;&#039;&#039; || &#039;&#039;&#039;Active&#039;&#039;&#039; || &#039;&#039;&#039;Interactive&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Explicit (general)&#039;&#039;&#039; || Dictionary Definitions ||  || Practice Exercises&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Implicit (instance)&#039;&#039;&#039; || Interpreting meaning in context while reading || Sentence Production (assessment) || Practice Exercises&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Intrinsic Motivation:&#039;&#039; Motivation to learn for learning&#039;s own sake rather than some external goal.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Extrinsic Motivation:&#039;&#039; Motivation to learn in order to satisfy an external goal, such as completing a task or passing an assessment.&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
&lt;br /&gt;
Do the benefits of [[personalization]] of practice readings by topics of interest outweigh the costs in a tutoring system for ESL vocabulary practice?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
[[Normal post-test]] scores &lt;br /&gt;
&lt;br /&gt;
[[Normal post-test]] scores for practiced words only&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test scores: the same post-test, administered months later.&lt;br /&gt;
&lt;br /&gt;
Evidence of [[Transfer]]: sentence production tasks for target words, correct use of words in writing assignments for other courses.&lt;br /&gt;
&lt;br /&gt;
=== Independent variables ===&lt;br /&gt;
[[Personalization]] of readings by topics of interest.  In the control condition, the tutor did not use potential personal interest as a factor in its selection of reading materials.  In the treatment condition, the tutor did use interest as a factor.  All other selection criteria were the same in both conditions.  Time on task was also the same.&lt;br /&gt;
&lt;br /&gt;
=== Hypotheses ===&lt;br /&gt;
&lt;br /&gt;
Since intrinsic motivation seems to be important in language learning, the benefits of [[personalization]] will outweigh the costs.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Students in the treatment condition with [[personalization]] performed better on average (M=35.5%, SD=14.9%) in terms of overall post-test scores compared to students in the control condition (M=27.1%, SD=17.2%).  Further analysis, including statistical tests for significance, is forthcoming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:graph40.PNG|500px]]&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
There is evidence that the difference in post-test scores is due to increased interest leading to deeper processing of the reading practice texts.&lt;br /&gt;
&lt;br /&gt;
Responses to questionnaires following each reading show the interest level of students using the REAP tutor.  The questionnaires asked students to indicate on a scale from one to five their interest in the preceding text.  The distributions of post-reading interest ratings for students in the treatment and control conditions are shown in Figures 1 and 2.&lt;br /&gt;
&lt;br /&gt;
[[Image:Interest_combined.PNG|700px]]&lt;br /&gt;
&lt;br /&gt;
Students were also given an exit survey during their last week of practice with the tutor that asked them, among other questions, to indicate whether they agreed with the statement, “Most of the readings were interesting.”  The ratings were on a scale from one to five, with five indicating strong agreement and one indicating strong disagreement.  Exit survey interest ratings by students in the treatment condition were significantly higher (p&amp;lt;0.05) than the ratings by students in the control condition.  The mean response for students who received personalized readings was 3.18, while it was 2.65 for students in the control condition.&lt;br /&gt;
&lt;br /&gt;
Further analysis of post-test scores reveals that students did learn more of the words that they actually practiced in REAP.  The post-test contained 40 questions for target vocabulary words.  Many of the students did not practice 40 words, so performance on practiced words alone was analyzed.  Students in the treatment condition scored higher (N=16, M=50.3, SD=20.1) on questions for words seen in readings than did students in the control condition (N=19, M=32.4, SD=18.9).  A two-tailed t-test for independent means verified that this result is statistically significant (t=2.719, df=33, p=0.005).  The difference in scores between the two groups was 17.9% (95% CI = 4.5%, 31.3%), which corresponds to a large effect size of 0.85.  This result indicates that [[personalization]] improved learning for the words that students saw in readings, which is in line with previous findings that intrinsic motivation leads to improved learning.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:post_just_practice.PNG|400px]]&lt;br /&gt;
&lt;br /&gt;
However, students in the treatment condition that included [[personalization]] saw fewer words in their training sessions (N=16, M=12.0 , SD=1.13) than students in the control condition (N=19, M=16.3, SD=0.87) (t=-2.9, df=33, p=0.006).  Average time on task was essentially the same for students in both conditions.  Students in the treatment condition spent slightly longer on each reading.  The main reason, however, for the difference in the average total number of words practiced was that students for whom the tutor provided personalized instruction saw fewer words (M=3.41, SD=0.55) per practice reading passage than students in the control condition (M=4.07, SD=0.83) (t=2.929, df=33, p=0.006).&lt;br /&gt;
&lt;br /&gt;
Thus, when the tutor used [[personalization]] as a factor in the selection of readings, it chose readings that were less valuable according to other factors.  Specifically, this result shows that by personalizing instruction, the tutor was not able to provide practice for as many words.  Of course, the practice that it did provide was better, as shown by the previous result that, for the words students did practice, [[personalization]] appeared to increase learning.&lt;br /&gt;
&lt;br /&gt;
The reduced number of target words per text with personalization is a technical issue which can be avoided in a straightforward manner by increasing the size of the database of readings.  With more readings, the tutor can find texts that both have ample target words and cover topics of personal interest.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:words_per_reading.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
There is a possibility that the students in the treatment condition who were seeing fewer words in each reading were learning more of the words simply because they had fewer to learn per reading.  To rule out this hypothesis, regression analyses (multiple linear regression) were conducted with overall post-test performance and performance for practiced words as the dependent variables.  In both regression analyses, the number of target words per reading was not a significant predictor of performance.  In fact, the number of target words per document was slightly positively correlated with post-test performance in both cases.  This result seems to rule out the possibility that students were learning more target words in the treatment condition because they were seeing fewer words.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] test results showed no reliable differences because of a small sample size.  The test was administered to students who stayed in the ELI in the subsequent semester, which constituted only a fraction of the original sample.&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
&lt;br /&gt;
The following study addresses a different form of personalization, by which interactions with the learner (e.g., instructions, directions) are conducted using casual and direct rather than formal language:&lt;br /&gt;
&lt;br /&gt;
[[Stoichiometry_Study | Studying the Learning Effect of Personalization and Worked Examples in the Solving of Stoichiometry Problems (McLaren, Koedinger &amp;amp; Yaron)]]&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
Note: a paper on this study has been submitted to the International Journal of Artificial Intelligence in Education.&lt;br /&gt;
&lt;br /&gt;
[http://reap.cs.cmu.edu/Papers/heilman_topic_choice_AIED2007_poster_final.pdf Heilman, M., Juffs, A., &amp;amp; Eskenazi, M. (2007). Choosing Reading Passages for Vocabulary Learning by Topic to Increase Intrinsic Motivation. Proceedings of the 13th International Conference on Artificial Intelligence in Education. Marina del Rey, CA. (poster)]&lt;br /&gt;
&lt;br /&gt;
Clark, R. C. and Mayer, R. E. (2003). e-Learning and the Science of Instruction.  Jossey-Bass/Pfeiffer.&lt;br /&gt;
&lt;br /&gt;
Cordova, D. I. &amp;amp; Lepper, M. R. (1996).  Intrinsic Motivation and the Process of Learning: Beneficial Effects of Contextualization, Personalization, and Choice.  Journal of Educational Psychology.  Vol. 88, No. 4, 715-730. &lt;br /&gt;
&lt;br /&gt;
Lepper, M.  (1988).  Motivational Considerations in the Study of Instruction.  Cognition and Instruction. 5(4), 289-309.&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Summary_Table.jpg&amp;diff=7122</id>
		<title>File:Summary Table.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Summary_Table.jpg&amp;diff=7122"/>
		<updated>2008-02-13T19:12:19Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Visual_Representations_in_Science_Learning&amp;diff=7121</id>
		<title>Visual Representations in Science Learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Visual_Representations_in_Science_Learning&amp;diff=7121"/>
		<updated>2008-02-13T19:11:21Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Summary Table */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Visual Representations in Science Learning ==&lt;br /&gt;
 Jodi Davenport&lt;br /&gt;
 &lt;br /&gt;
===Summary Table===&lt;br /&gt;
Total Studies: 7&amp;lt;br&amp;gt;&lt;br /&gt;
Total Participants: 1652&amp;lt;BR&amp;gt;&lt;br /&gt;
Total Participant hours: 7461&amp;lt;BR&amp;gt;&lt;br /&gt;
&amp;lt;P&amp;gt;&lt;br /&gt;
Additional Contributors&lt;br /&gt;
* David Yaron&lt;br /&gt;
* David Klahr&lt;br /&gt;
* Ken Koedinger&lt;br /&gt;
* Michael Karabinos&lt;br /&gt;
* Gaea Leinhardt&lt;br /&gt;
* Jim Greeno&lt;br /&gt;
* Katie McEldoon&lt;br /&gt;
&lt;br /&gt;
[[Image:Summary Table.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Visual representations, in the forms of diagrams, notation (e.g., equations), graphs and tables are fundamental tools in science instruction and practice. Whether diagrams or notational systems are helpful aids to problem solving depends critically on the content of the visual representation and how learners are able to process the information they contain. Expert/novice studies have demonstrated that different levels of experience result in differential processing of the same stimuli. However, it is not known how students are able to refine initially shallow understandings into meaningful chemical concepts or how the [[coordination]] of multiple representations helps with this process. &lt;br /&gt;
&lt;br /&gt;
The current project seeks to determine when and how the use of multiple representations during instruction and problem solving will lead to [[robust learning]]. To date, 7 studies (4 completed, 3 ongoing) have explored science learning at the two levels (micro and macro) of the theoretical framework. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Microscopic Level: Identifying [[knowledge components]] and developing assessments&#039;&#039;&#039;&lt;br /&gt;
In order to assess robust learning, we conducted studies and collaborated with Chemistry and Education faculty (David Yaron, Gaea Leinhardt &amp;amp; Jim Greeno) to identify key knowledge components of equilibrium and acid base chemistry. For instance, a verbal protocol study has demonstrated that experts and novices differ in their ability to invoke relevant knowledge components in contexts involving different representations (e.g., chemical equations, graphs, and diagrams). We continue to create and revise new forms of assessments of knowledge [[transfer]] that identify which correct and incorrect knowledge components students have and apply as they learn chemistry.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Macroscopic Level: Testing general learning principles&#039;&#039;&#039; &lt;br /&gt;
At the macroscopic level, &#039;&#039;in vivo&#039;&#039; studies test general learning principles. Studies have investigated whether the use of molecular-level diagrams increases [[robust learning]] as measured by [[transfer]] performance and have manipulated conditions to determine what type of instructional prompts will promote active processing. Early studies failed to find a learning advantage for molecular level diagrams and ongoing studies seek to determine what conditions may be required to produce a benefit of multiple representations during instruction.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
Visual representations: [[External representations]] that are used in instruction and problem solving such as diagrams, graphs, and equations&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
Chemical equilibrium is a difficult topic for students to learn as it involves learning a large set of [[knowledge components]] and applying these components flexibly in a variety of situations. As different representations make salient different key features of knowledge components, we hypothesize that instruction that requires the [[coordination]] of multiple representations will lead to more [[robust learning]] as measured by [[transfer]] tests such as conceptual multiple choice questions and open-ended inference questions.&lt;br /&gt;
&lt;br /&gt;
=== Research Questions ===&lt;br /&gt;
Our project seeks to identify the [[knowledge components]] of equilibrium and acid base chemistry and determine when and how  [[coordination]] of different types of representations lead to the acquisition of correct knowledge components and [[robust learning]].&lt;br /&gt;
&lt;br /&gt;
*What are the knowledge components of equilibrium and acid/base chemistry? (CMU, lab study #1, 2006; Chemistry working group)&lt;br /&gt;
*How do experts and novices differ in equilibrium problem solving with multiple representations? (CMU, lab study #1, 2006)&lt;br /&gt;
*Does the presence of molecular level diagrams enhance robust learning of acid/base chemistry? (UBC, in vivo study #1, 2006; CMU, in vivo study #2, 2006)&lt;br /&gt;
*Do molecular level diagrams enhance self explanation leading to robust learning in a tutorial on acids and bases? (CMU, lab study #2, 2006)&lt;br /&gt;
*Do labelled diagrams enhance robust learning as measured by transfer performance? (UBC, in vivo study #3, 2007; CMU, in vivo study #5, 2007)&lt;br /&gt;
*Do virtual lab activities (in which multiple representations must be coordinated) enhance performance on interactive problem solving and transfer questions? (UBC, in vivo study #3, 2007)&lt;br /&gt;
*Does instruction with multiple representations including molecular level diagrams and graphs enhance the acquisition of equilibrium knowledge components leading to robust learning? (CMU, in vivo study #4, 2007)&lt;br /&gt;
&lt;br /&gt;
=== Background and Significance ===&lt;br /&gt;
Robust learning in chemistry is more than learning a set of facts or procedures. Instead, many pieces of knowledge must be acquired and flexibly coordinated in order to make predictions, inferences and analyses of chemical systems. We chose to focus on chemical equilibrium systems as the knowledge is important for understanding processes in chemistry, biology and engineering and the domain is notoriously difficult for students (e.g., Banerjee, 1991; Van Driel et al. 1999). &lt;br /&gt;
&lt;br /&gt;
As the knowledge components of chemical equilibrium are not necessarily explicit in textbooks or instructional materials, we have formed a Chemistry working group with Chemistry, Education and Learning Science faculty to identify core pieces of understanding in this domain. In addition, verbal protocol analyses help us understand how students process instruction and solve chemistry problems.&lt;br /&gt;
&lt;br /&gt;
Our hypothesis is that instruction and practice that includes a variety of representations will lead to robust learning. Multimedia instruction is widely believed to help chemistry learning by providing a bridge between the mathematical procedures and core chemistry concepts. For instance, chemistry education researchers have suggested that the ability to transform between representations leads to increased success in problem solving (Bodner &amp;amp; Domin, 2000) and that including diagrams during classroom instruction leads to improved conceptual understanding (Ardac &amp;amp; Akaygun, 2004; Bunce &amp;amp; Gabel, 2002; Noh &amp;amp; Scharmann, 1997; Sanger &amp;amp; Badger, 2001; Williamson &amp;amp; Abraham, 1995; see Kozma &amp;amp; Russell, 2005 for a review). Though the results of classroom studies are promising, the interventions often lasted many class periods and the content in the experimental and control classes was not tightly controlled, so it is unclear exactly how specific representations influenced learning. Further, assessments that demonstrated greater conceptual understanding did not always show increased problem solving performance.&lt;br /&gt;
&lt;br /&gt;
In a number of laboratory studies, Mayer (Clark &amp;amp; Mayer, 2003), Ainsworth &amp;amp; Loizou (2003) and others have found that instructional materials that include both diagrams and text provide learning benefits over materials that only include text. Will this same learning advantage extend to classroom-based instruction? Studies in chemistry education research have suggested that molecular level diagrams may promote deeper understanding than text or equations. However, these classroom-based studies lacked rigorous controls and assessments of robust learning. The current project investigates if and when the presence of visual representations in addition to text promotes deep conceptual understanding in chemistry.&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
The studies in the project manipulate the presence of visual representations in chemistry instruction and problem solving.&lt;br /&gt;
&lt;br /&gt;
For instance, in the (CMU, 2006) Lab Study #1, experts and novices solved problems in different representational formats. The traditional format, found in many textbooks, involves a chemical equation and text-based setup. The diagram format requires solvers to additionally integrate pictorial information.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:trad_diag.jpg]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In studies testing the role of molecular-level representations in chemistry instruction (Lab Study #2, &#039;&#039;in vivo&#039;&#039; Studies #1, #2, #3 and #5), text is identical in both conditions and the diagram condition supplements the text with pictures.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:mole_pic.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Dependent Variables ===&lt;br /&gt;
Dependent variables include improvement from pre to post test on conceptual multiple choice questions, performance on scaffolded problem solving with tutors and performance on open-ended [[transfer]] questions.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;[[Transfer]] Questions&#039;&#039;&amp;lt;BR&amp;gt;&lt;br /&gt;
Our studies use a range of transfer assessments:&lt;br /&gt;
*Conceptual Multiple Choice &lt;br /&gt;
**E.g. You have 5 weak acids in your laboratory and want to create a series of buffer solutions. If the target pH is 4.3, which weak acid would you use to create your buffer, and would you use more weak base, weak acid, or the same amount of each?&lt;br /&gt;
:::Weak acid A	pKa = 7.35&lt;br /&gt;
:::Weak acid B	pKa = 7.15&lt;br /&gt;
:::Weak acid C	pKa = 6&lt;br /&gt;
:::Weak acid D	pKa = 4.3&lt;br /&gt;
:::Weak acid E	pKa = 6.5&lt;br /&gt;
*Open-ended Transfer Questions&lt;br /&gt;
** E.g., &amp;quot;How could you make a solution of lemon juice that is more acidic than a solution of hydrochloric acid?&amp;quot;&lt;br /&gt;
*Scaffolded Problem Solving with tutors&lt;br /&gt;
*Virtual Laboratory Problems&lt;br /&gt;
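As a concrete check of the buffer question above, the Henderson–Hasselbalch relation (pH = pKa + log10([base]/[acid])) can be applied to the five listed acids. This is an illustrative sketch, not part of the tutor or the assessments; the 0.1 M concentrations are arbitrary equal amounts.

```python
# Henderson-Hasselbalch sketch for the buffer question above:
# pH = pKa + log10([base]/[acid]). With equal amounts of weak acid and
# conjugate base the log term is zero, so pH equals pKa; the pKa = 4.3
# acid (Weak acid D) therefore hits the target pH of 4.3 exactly.
import math

def buffer_ph(pka, base_conc, acid_conc):
    """pH of a weak-acid/conjugate-base buffer (ideal-solution sketch)."""
    return pka + math.log10(base_conc / acid_conc)

acids = {"A": 7.35, "B": 7.15, "C": 6.0, "D": 4.3, "E": 6.5}

# Equal (arbitrary) 0.1 M concentrations of acid and conjugate base
best = min(acids, key=lambda k: abs(buffer_ph(acids[k], 0.1, 0.1) - 4.3))
print(best, buffer_ph(acids[best], 0.1, 0.1))   # D 4.3
```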
&lt;br /&gt;
&#039;&#039;[[Normal post-test]]&#039;&#039; Assessments&amp;lt;BR&amp;gt;&lt;br /&gt;
Include: &lt;br /&gt;
*Definitions: Defining terms such as acid/base&lt;br /&gt;
*True/False questions&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
&lt;br /&gt;
====Expert/Novice Equilibrium Problem Solving: Lab study #1 ====&lt;br /&gt;
N = 16 (CMU, 2006) Lab Study #1&lt;br /&gt;
&lt;br /&gt;
The two aims of this study were 1) to create a taxonomy of knowledge components for equilibrium chemistry and 2) to determine whether problem representation led to differences in application of knowledge components.&lt;br /&gt;
&lt;br /&gt;
A knowledge decomposition was carried out on the transcribed protocols and a taxonomy of equilibrium knowledge components has been created. &lt;br /&gt;
&lt;br /&gt;
=====Findings Summary=====&lt;br /&gt;
To assess whether problem representation influenced the application of knowledge components, experts and novices solved equilibrium problems in different contexts while talking aloud. While experts were equally able to retrieve the correct [[knowledge component]] (in this case that the equilibrium constant, K, was required for problem solving), novices were able to retrieve the correct knowledge component when solving a traditional problem, but were less successful on problems using molecular-level diagrams. There was a significant main effect of expertise, F(1,13) = 9.15, p = .01, and an interaction between expertise and problem type, F(1,13) = 5.205, p = .04. &amp;lt;br&amp;gt; &lt;br /&gt;
&lt;br /&gt;
[[Image:K_exnov.jpg]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Learning with molecular level diagrams: &#039;&#039;In vivo&#039;&#039; studies #1 and #2====&lt;br /&gt;
N = 42 (UBC, 2006) &#039;&#039;in vivo&#039;&#039; Study #1&amp;lt;BR&amp;gt;&lt;br /&gt;
N = 89 (CMU, 2006) &#039;&#039;in vivo&#039;&#039; Study #2&lt;br /&gt;
&lt;br /&gt;
The aim of these studies was to determine whether molecular level diagrams lead to increased performance on transfer questions after reading a tutorial that contained either Text+Diagram or Text Only.&lt;br /&gt;
&lt;br /&gt;
Results (UBC, 2006) &#039;&#039;in vivo&#039;&#039; Study #1&lt;br /&gt;
*An ANCOVA was run on posttest scores with format (Diagram+Text vs. Text Only) as a between-subjects variable and pretest scores as a covariate. No significant effect of condition was found, F(1, 39) = .025, p = .88. Posttest scores were similar regardless of whether students were in the diagrams (M =.72) or text-only (M = .72) conditions.&lt;br /&gt;
&lt;br /&gt;
Results (CMU, 2006) &#039;&#039;in vivo&#039;&#039; Study #2&lt;br /&gt;
*A mixed 2x2 ANOVA was carried out with format (Diagram+Text vs. Text-only) as a between-subjects variable, time of test (Pre vs. Posttest) as a within-subjects variable and multiple-choice accuracy as the dependent variable. Pretest to posttest gains were significant, F(1, 87) = 94.4, p &amp;lt; .001, but there was no effect of format, and no interaction between test-time and format. See table below.&lt;br /&gt;
&lt;br /&gt;
::[[image:Earli_tables.gif]]&lt;br /&gt;
 &lt;br /&gt;
=====Findings summary=====&lt;br /&gt;
To date, results suggest no advantage for the addition of diagrams to text. Specifically, acid/base chemistry tutorials that included molecular-level diagrams did not produce enhanced learning compared with text-only versions of the same tutorials. Future studies will investigate whether labelled diagrams will be more likely to promote [[integration]] of textual material with visual representations, leading to more robust learning of chemistry concepts.&lt;br /&gt;
&lt;br /&gt;
====Molecular level diagrams and self explanation: Lab study #2====&lt;br /&gt;
N = 22 (CMU, 2006) Lab Study #2&lt;br /&gt;
&lt;br /&gt;
As no benefits were found in the &#039;&#039;in vivo&#039;&#039; studies, we conducted a lab study to determine whether there were any processing differences between the Text+Diagram and Text Only conditions. Participants were instructed to self-explain as they read through a tutorial on acid/base chemistry.&lt;br /&gt;
&lt;br /&gt;
=====Findings Summary=====&lt;br /&gt;
* Students made substantial learning gains from pretest to posttest, p &amp;lt; .001. However, no significant main effect of condition was found for either the multiple-choice test or the definition questions, F(1, 20) &amp;lt; 1. Further, no main effect of condition was found for performance on transfer questions.&lt;br /&gt;
:[[Image:CMUabtut.jpg]]&lt;br /&gt;
&lt;br /&gt;
* Transcribed protocols were analysed for time on task and number of words. There were no significant differences between conditions.&lt;br /&gt;
* Verbal protocols were coded for Self Explanations (Noticing Coherence, Elaborations or Principle Based) and Monitoring statements (Positive or Negative). No significant differences in number of self explanations were found between conditions. &lt;br /&gt;
* Correlations: Significant correlations were found between the number of self explanations (SE) and transfer performance. In the Text Only condition, SE total was positively correlated with transfer performance (r = .642); in the Diagrams+Text condition, however, SE total was negatively correlated with transfer performance (r = -.606). These results suggest that the diagrams in this particular study may have generated less germane processing. An ongoing analysis of these data will investigate whether SEs in the Diagram+Text condition were shallower than in the Text Only condition.&lt;br /&gt;
&lt;br /&gt;
====Learning from labelled molecular level diagrams and Virtual Lab activities: &#039;&#039;In vivo&#039;&#039; study #3, UBC, 2007====&lt;br /&gt;
N = 1139 (UBC, 2007) &#039;&#039;in vivo&#039;&#039; Study #3&lt;br /&gt;
&lt;br /&gt;
This ongoing study is an extension of &#039;&#039;in vivo&#039;&#039; Studies #1 and #2. The aims of this study are 1) to determine whether labelled diagrams lead to greater learning advantages and 2) to determine whether Virtual Lab activities enhance problem-solving performance with scaffolded, interactive tutors. The lack of a diagram effect in the earlier studies may have been due to the absence of labels on the diagrams: prior work may have confounded labels with diagrams, and it is possible that only labelled diagrams promote the active processing required for learning gains from the [[coordination]] of representations.&lt;br /&gt;
&lt;br /&gt;
The 2x2 design varies format (Diagrams + Text vs. Text Only) and the order of Virtual Lab and problem-solving tutors (Virtual Lab first vs. Virtual Lab second).&lt;br /&gt;
&lt;br /&gt;
====The influence of multiple representations on knowledge component acquisition in chemical equilibrium: &#039;&#039;In vivo&#039;&#039; study #4====&lt;br /&gt;
N = 172 (CMU, 2007) &#039;&#039;in vivo&#039;&#039; Study #4&lt;br /&gt;
&lt;br /&gt;
Our knowledge decomposition of chemical equilibrium (informed by Lab Studies #1 and #2), as well as discussions with our Chemistry working group, revealed that the key knowledge component of &amp;quot;progress of reaction&amp;quot; is left implicit in many types of traditional instruction. For this ongoing study, two sets of online lectures were created by Prof. Yaron to determine whether instruction that uses multiple representations to convey the notion of progress of reaction leads to more robust learning of chemistry concepts.&lt;br /&gt;
&lt;br /&gt;
[[Transfer]] measures (open-ended responses and conceptual multiple-choice questions) were collected, and measures of [[long term retention]] (a follow-up quiz, in-class &amp;quot;clicker&amp;quot; responses, and exam scores) will be analyzed.&lt;br /&gt;
&lt;br /&gt;
====Molecular level diagrams and robust learning of acid base buffer concepts: &#039;&#039;In vivo&#039;&#039; study #5====&lt;br /&gt;
N = 172 (CMU, 2007) &#039;&#039;in vivo&#039;&#039; Study #5&lt;br /&gt;
&lt;br /&gt;
This ongoing study is an extension of &#039;&#039;in vivo&#039;&#039; studies #1 and #2, which manipulated the presence of molecular level diagrams in a Chemistry tutorial. In addition to collecting pretest and posttest [[transfer]] measures, we are also collecting transfer performance on each page of the tutorial to determine which knowledge components are accessed in the Diagram+Text versus the Text Only condition.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
The knowledge decomposition of chemical equilibrium has revealed a number of correct knowledge components to be acquired by students and incorrect knowledge components to be avoided. This taxonomy has been used to create assessments used in our studies. &lt;br /&gt;
&lt;br /&gt;
Our expert-novice protocol study revealed that experts are more likely than novice solvers to invoke a relevant [[knowledge component]] across different problem types. This result suggests that many students maintain a shallow understanding of chemical systems even after completing a year of college-level chemistry. Current studies are addressing whether instruction that ties the core concept of progress of reaction (identified through the expert/novice studies) to multiple representations will lead to more [[robust learning]].&lt;br /&gt;
&lt;br /&gt;
Our studies testing the learning benefits of visual diagrams during instruction suggest that the large effects of diagrams commonly found in laboratory studies of topics such as bicycle pumps, lightning formation, and disc brakes may be difficult to replicate in educational settings. Active and intentional coordination of representations may be required if diagrams are to increase learning, and the mere presence of diagrams does not guarantee this type of active processing. Current studies seek to determine whether labelled diagrams will enhance the coordination of text and diagrams via [[sense making]]. Further, prior work has not closely mapped the features of to-be-learned material to the information contained in diagrams and text. In our ongoing studies, we are investigating which pieces of information are contained in the text, which are contained in the diagrams, and which features students are able to extract when they are presented with multiple representations.&lt;br /&gt;
&lt;br /&gt;
=== Publications and Presentations ===&lt;br /&gt;
&lt;br /&gt;
Davenport, J.L., McEldoon, K. &amp;amp; Klahr, D. (2007). Depicting invisible processes: The influence of molecular-level diagrams in Chemistry instruction. To be presented at the 29th Annual Meeting of the Cognitive Science Society. August 2007. [http://www.learnlab.org/uploads/mypslc/publications/cogsci07davenport.pdf download]&lt;br /&gt;
&lt;br /&gt;
Davenport, J.L., Yaron, D., Karabinos, M., Klahr, D. &amp;amp; Koedinger, K. (2007). Chemical equilibrium: an evaluation of a new type of instruction. To be presented at the Gordon Conference for Chemistry Education Research and Practice. June 2007. [http://www.learnlab.org/uploads/mypslc/publications/davenport_gordon07.pdf download]&lt;br /&gt;
&lt;br /&gt;
Davenport, J.L., Klahr, D. &amp;amp; Koedinger, K. (2007). The influence of diagrams on chemistry learning. Paper accepted for the 12th Biennial Conference of the European Association for Research on Learning and Instruction. August 2007. [http://www.learnlab.org/uploads/mypslc/publications/davenportearli07.pdf download] &lt;br /&gt;
&lt;br /&gt;
Davenport, J.L., Klahr, D. &amp;amp; Koedinger, K. (2006). The influence of external representations on chemistry problem solving. Poster presented at the Forty-seventh Annual Meeting of the Psychonomic Society in Houston, Texas. November 2006. [http://www.learnlab.org/uploads/mypslc/publications/davenport06.pdf download]&lt;br /&gt;
&lt;br /&gt;
Yaron, D.,  Karabinos, M., Davenport, J. &amp;amp; Leinhardt, G. Virtual labs and scenario-based activities for introductory chemistry. American Chemical Society - Penn-Ohio Regional Meeting, Theil College, Greenville, PA, October 2006.&lt;br /&gt;
&lt;br /&gt;
Yaron, D.,  Karabinos, M., Davenport, J. &amp;amp; Leinhardt, G. Virtual lab activities for introductory chemistry labs, American Chemical Society Annual Meeting, San Francisco, September 2006.&lt;br /&gt;
&lt;br /&gt;
Yaron, D.,  Karabinos, M., Davenport, J. &amp;amp; Leinhardt, G. Virtual lab activities for introductory chemistry. Biennial Conference on Chemical Education, Purdue University, West Lafayette, IN, July 2006.&lt;br /&gt;
&lt;br /&gt;
Yaron, D.,  Karabinos, M., Davenport, J., Cuadros, J., Rehm, E., McCue, W., Dennis, D., Leinhardt, G. and Evans, K. The ChemCollective Virtual Lab and Other Online Materials. Presented at Duke University, Durham, NC, November 2005.&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
&lt;br /&gt;
[[Category:Study]]&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Effect_of_adding_simple_worked_examples_to_problem-solving_in_algebra_learning&amp;diff=7120</id>
		<title>Effect of adding simple worked examples to problem-solving in algebra learning</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Effect_of_adding_simple_worked_examples_to_problem-solving_in_algebra_learning&amp;diff=7120"/>
		<updated>2008-02-13T19:10:32Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Glossary */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;Lisa Anthony, Jie Yang, Kenneth R. Koedinger&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
=== Summary Table ===&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&lt;br /&gt;
| &#039;&#039;&#039;PIs&#039;&#039;&#039; || Lisa Anthony, Jie Yang, &amp;amp; Ken Koedinger&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Other Contributors&#039;&#039;&#039; || n/a&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study Start Date&#039;&#039;&#039; || December 4, 2006&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Study End Date&#039;&#039;&#039; || December 20, 2006&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Site&#039;&#039;&#039; || Central Westmoreland Career &amp;amp; Technology Center (CWCTC)&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;LearnLab Course&#039;&#039;&#039; || Algebra&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Number of Students&#039;&#039;&#039; || 38&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Total Participant Hours&#039;&#039;&#039; || 114&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;DataShop&#039;&#039;&#039; || To be completed ASAP&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
This &#039;&#039;in vivo&#039;&#039; experiment compared differences in learning that occur when students problem solve vs. when they problem solve aided by worked [[example]]s.  Students worked in the standard Cognitive Tutor Algebra lesson on 2-step problems.  Those in the worked examples condition copied the worked example given to them using the solver&#039;s interface the first time they saw a particular problem type (&#039;&#039;i.e.&#039;&#039;, ax+b=c or a/x=c); after that, an analogous example appeared each time the students solved a similar problem.&lt;br /&gt;
&lt;br /&gt;
The hypothesis of this study was that students who were given the worked examples would show improved learning on both normal measures and the [[robust learning]] measures of [[transfer]] and [[accelerated future learning]].  Copying the example the first time students encountered a particular problem type acts as additional scaffolding for solving the problems.&lt;br /&gt;
&lt;br /&gt;
Results are forthcoming.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
Forthcoming, but will probably include:&lt;br /&gt;
* Sample worked-out-example:&lt;br /&gt;
[[Image:lanthony-example-unit9.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
Is robust learning affected by the addition of scaffolded worked examples to the problem-solving process?&lt;br /&gt;
&lt;br /&gt;
=== Background &amp;amp; Significance ===&lt;br /&gt;
...Worked-example studies conducted at the PSLC and beyond...&lt;br /&gt;
&lt;br /&gt;
See VanLehn&#039;s paper on students using examples -- copying vs. as feedback ...&lt;br /&gt;
LeFevre &amp;amp; Dixon ... (1986). Cognition and Instruction.&lt;br /&gt;
&lt;br /&gt;
See Koedinger &amp;amp; Aleven&#039;s Assistance Dilemma explanation ...&lt;br /&gt;
&lt;br /&gt;
=== Independent Variables ===&lt;br /&gt;
One independent variable was used:&lt;br /&gt;
* Inclusion of worked example: present or not present.&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
The inclusion of worked examples during the problem-solving process will have benefits for learning by virtue of the scaffolding provided by having the students copy the example the first time they see a particular problem type.&lt;br /&gt;
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
* &#039;&#039;Near [[transfer]], immediate&#039;&#039;: Students were given a 15-minute post-test after their sessions with the computer tutor had concluded.&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;Near transfer, [[retention]]&#039;&#039;: We intend to analyze the log data from the students&#039; Cognitive Tutor usage in the equation solving unit that followed the 2-step problems, to determine if there was any difference in performance at the start of that lesson.&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;Far transfer&#039;&#039;: Far transfer items such as 3-step problems and literal equations were included on the immediate post-test.&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;[[Accelerated future learning]]&#039;&#039;:  We intend to analyze the log data from the students&#039; Cognitive Tutor usage in the equation solving unit that followed the 2-step problems, to determine if there were learning curve differences during training.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
Final findings in progress.&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
This study is part of the [[Coordinative Learning]] cluster and addresses the examples and explanation sub-group.&lt;br /&gt;
&lt;br /&gt;
The students were given examples throughout their use of the tutor.  On the first instance of a particular problem type, students were asked to copy out a worked example; on subsequent instances, examples remained on the screen while students solved analogous problems.&lt;br /&gt;
&lt;br /&gt;
=== Descendants ===&lt;br /&gt;
&lt;br /&gt;
None.&lt;br /&gt;
&lt;br /&gt;
=== Annotated Bibliography ===&lt;br /&gt;
&lt;br /&gt;
Analysis and write-up in progress.&lt;br /&gt;
&lt;br /&gt;
=== Further Information ===&lt;br /&gt;
Connected to [[Lab study proof-of-concept for handwriting vs typing input for learning algebra equation-solving]] in the [[Refinement and Fluency]] cluster.&lt;br /&gt;
&lt;br /&gt;
=====Plans for June 2007-December 2007=====&lt;br /&gt;
&lt;br /&gt;
* Complete transition of log data to DataShop.&lt;br /&gt;
* Analyze data to determine effect of including examples on pre to post test gains and/or learning curves.&lt;br /&gt;
* Write up results for publication in a learning science conference.&lt;br /&gt;
* Lab study comparing alternative methods of delivering and presenting worked examples is a possible side avenue for the parent project of this study ([[Handwriting Algebra Tutor]]).&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Lanthony-example-unit9.jpg&amp;diff=7119</id>
		<title>File:Lanthony-example-unit9.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Lanthony-example-unit9.jpg&amp;diff=7119"/>
		<updated>2008-02-13T19:10:13Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Rectangle-area_learning_curve.jpg&amp;diff=7118</id>
		<title>File:Rectangle-area learning curve.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Rectangle-area_learning_curve.jpg&amp;diff=7118"/>
		<updated>2008-02-13T19:07:17Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=Using_learning_curves_to_optimize_problem_assignment&amp;diff=7117</id>
		<title>Using learning curves to optimize problem assignment</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=Using_learning_curves_to_optimize_problem_assignment&amp;diff=7117"/>
		<updated>2008-02-13T19:04:33Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: /* Background and significance */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Abstract ===&lt;br /&gt;
This study examined the effectiveness of an educational data mining method – Learning Factors Analysis (LFA) – on improving the learning efficiency in the Cognitive Tutor curriculum. LFA uses a statistical model to predict how students perform in each practice of a knowledge component (KC), and identifies over-practiced or under-practiced KCs. By using the LFA findings on the Cognitive Tutor geometry curriculum, we optimized the curriculum with the goal of improving student learning efficiency. With a control group design, we analyzed the learning performance and the learning time of high school students participating in the Optimized Cognitive Tutor geometry curriculum. Results were compared to students participating in the traditional Cognitive Tutor geometry curriculum. Analyses indicated that students in the optimized condition saved a significant amount of time in the optimized curriculum units, compared with the time spent by the control group. There was no significant difference in the learning performance of the two groups in either an immediate post test or a two-week-later retention test. Findings support the use of this data mining technique to improve learning efficiency with other computer-tutor-based curricula.&lt;br /&gt;
&lt;br /&gt;
=== Glossary ===&lt;br /&gt;
Data mining, intelligent tutoring systems, learning efficiency&lt;br /&gt;
&lt;br /&gt;
=== Research question ===&lt;br /&gt;
How can we achieve higher learning efficiency by reducing unnecessary over-practice in intelligent tutoring settings?&lt;br /&gt;
&lt;br /&gt;
=== Background and significance ===&lt;br /&gt;
Much intelligent tutoring system (ITS) research has focused on designing new features to improve learning gains as measured by the difference between pre- and post-test scores. However, learning time is another principal measure in the summative evaluation of an ITS. Intelligent tutors contribute more to education when they accelerate learning [9]. Bloom&#039;s &amp;quot;two sigma&amp;quot; effect of a model human tutor [4] has been one of the ultimate goals for intelligent tutors to achieve, as has the accelerated-learning effect shown by SHERLOCK, which offered four years&#039; worth of troubleshooting experience in the space of seven days of practice [12]. &lt;br /&gt;
&lt;br /&gt;
Cognitive Tutors are ITSs based on results from cognitive psychology [11]. Students spend about 40% of their class time using the software. The software is built on cognitive models, which represent the knowledge a student might possess about a given subject. The software assesses students&#039; knowledge step by step and presents curricula tailored to individual skill levels [11]. According to Carnegie Learning Inc., by 2006 Cognitive Tutors were in use in over 1300 school districts in the U.S. by over 475,000 secondary school students. With such a large user base, learning efficiency with the tutor is of great importance: if every student saves four hours of learning over one year, nearly two million hours will be saved. To ensure adequate yearly progress, many schools are calling for an increase in instructional time. The reality, however, is that students have a limited amount of total learning time, and teachers have a limited amount of instructional time. Saving an hour of learning time can be better than adding an hour of instructional time because it does not increase students&#039; or teachers&#039; workload. Moreover, if the saved hours are spent on learning other time-consuming subjects, they can improve the learning gains in those subjects. &lt;br /&gt;
&lt;br /&gt;
Educational data mining is an emerging area that provides many potential insights which may improve education theory and learning outcomes. Much educational data mining to date has stopped at the point of yielding new insights, and has not yet come full circle to show how such insights can yield a better ITS that improves student learning [2, 3].&lt;br /&gt;
Learning Factors Analysis (LFA) [5, 6] is a data-mining method for evaluating cognitive models and analyzing student-tutor log data. Combining a statistical model [10], human expertise, and a combinatorial search, LFA can measure the difficulty and the learning rates of knowledge components (KCs), predict student performance at each KC practice opportunity, identify over-practiced or under-practiced KCs, and discover &amp;quot;hidden&amp;quot; KCs interpretable to humans.&lt;br /&gt;
[[Image:LFA_Formula.jpg]]&lt;br /&gt;
P&#039;&#039;ijt&#039;&#039; is the probability that the &#039;&#039;i&#039;&#039;th student gets a step in a tutoring question right at the &#039;&#039;t&#039;&#039;th opportunity to practice the &#039;&#039;j&#039;&#039;th KC. The model says that the log odds of P&#039;&#039;ijt&#039;&#039; equal the overall &amp;quot;smarts&amp;quot; of that student (θi) plus the &amp;quot;easiness&amp;quot; of that KC (βj) plus the amount gained (γj) for each practice opportunity. With this model, we can show the learning growth of students at any current or past moment.&lt;br /&gt;
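As a concrete illustration, the logistic model above can be written in a few lines of Python. The parameter values below are hypothetical, chosen only to mimic an easy KC with a low learning rate.&lt;br /&gt;

```python
# Sketch of the LFA prediction: the log odds of a correct response are
# theta_i (student "smarts") + beta_j (KC "easiness") + gamma_j * t
# (learning rate times practice opportunity).  Parameter values are illustrative.
import math

def p_correct(theta_i, beta_j, gamma_j, t):
    """Predicted probability that student i answers a step on KC j
    correctly at the t-th practice opportunity."""
    logit = theta_i + beta_j + gamma_j * t
    return 1.0 / (1.0 + math.exp(-logit))

# An easy KC (high beta) with a low learning rate (low gamma) barely
# improves with practice -- the over-practice pattern described above.
for t in (1, 5, 10):
    print(t, round(p_correct(theta_i=0.0, beta_j=2.0, gamma_j=0.02, t=t), 3))
```

With these illustrative parameters the predicted success rate starts high and rises only slightly over ten opportunities, which is exactly the signature LFA uses to flag a KC as over-practiced.&lt;br /&gt;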
&lt;br /&gt;
By applying LFA to the student log data from the Area unit of the 1997 Geometry Cognitive Tutor, we found two interesting phenomena. On the one hand, some easy (i.e. high βj) KCs with low learning rates (i.e. low γj) are practiced many times, with few improvements in the later stages of those practices. The KC rectangle-area is an example. This KC characterizes the skill of finding the area of a rectangle, given the base and height. As shown in Figure 1, students have an initial error rate around 12%. After 18 practice opportunities, the error rate drops to only 8%. The average number of practices per student is 10. Spending many practices on an easy skill is not a good use of student time; reducing the amount of practice for this skill may save student time without compromising performance. Other over-practiced KCs include square-area and parallelogram-area. On the other hand, some difficult (i.e. low βj) KCs with high learning rates (i.e. high γj) do not receive enough practice. Trapezoid-area is such an example in the unit: students received a maximum of only 6 practices. Its initial error rate is 76%, and by the end of the 6th practice the error rate remains as high as 40%, far from the level of mastery. More practice on this KC is needed for students to reach mastery. Other under-practiced KCs include pentagon-area and triangle-area. &lt;br /&gt;
&lt;br /&gt;
[[Image: Rectangle-area_learning_curve.jpg]] &lt;br /&gt;
&lt;br /&gt;
[[Image: Pentagon-area_learning_curve.jpg]]&lt;br /&gt;
&lt;br /&gt;
Having students practice less than needed is clearly undesirable in the curriculum. But is over-practice necessary? The old idiom &amp;quot;practice makes perfect&amp;quot; suggests that the more we practice a skill, the better we can apply it. Many teachers believe that giving students more practice problems is beneficial and &amp;quot;would like to have the students work on more practice problems&amp;quot;, even when &amp;quot;[students] were not making any mistakes and were progressing through the tutor quickly&amp;quot; [7].&lt;br /&gt;
We believe that when teachers want more problems so that students can practice unmastered KCs, or useful KCs not covered by the curriculum, more practice is indeed necessary. To support long-term retention of a KC, more practice is also necessary, but it needs to be spread out on an optimal schedule [1, 14]. In the rectangle-area example, however, where all the practice for the KC is allocated within a short period, additional practice becomes over-practice, which is unnecessary once the KC is mastered.&lt;br /&gt;
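The mastery criterion that separates needed practice from over-practice can be sketched with the knowledge-tracing update of Corbett and Anderson [8]. The parameter values below (slip, guess, transit, P(L0), and the 0.95 threshold) are illustrative, not the tutor&#039;s actual settings.&lt;br /&gt;

```python
# Bayesian knowledge-tracing sketch [8]: update P(learned) after each
# observed response and stop assigning practice once P(learned) crosses
# a mastery threshold.  All parameter values are illustrative.
def bkt_update(p_l, correct, slip=0.1, guess=0.2, transit=0.15):
    """One knowledge-tracing step: posterior P(learned) given the
    response, then the chance of learning from this practice."""
    if correct:
        evidence = p_l * (1 - slip) / (p_l * (1 - slip) + (1 - p_l) * guess)
    else:
        evidence = p_l * slip / (p_l * slip + (1 - p_l) * (1 - guess))
    return evidence + (1 - evidence) * transit

p_l = 0.3            # P(L0): prior probability the KC is already known
responses = [True, True, False, True, True, True]
for step, correct in enumerate(responses, start=1):
    p_l = bkt_update(p_l, correct)
    if p_l >= 0.95:  # mastery threshold: practice beyond this is over-practice
        print(f"mastered after {step} practices")
        break
```

Under this scheme, any practice assigned after the threshold is crossed is exactly the over-practice that the optimized curriculum removes.&lt;br /&gt;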
&lt;br /&gt;
=== Dependent variables ===&lt;br /&gt;
&lt;br /&gt;
[[Normal post-test]] -- Shortly after finishing the 6th unit, the students took the post test.&lt;br /&gt;
&lt;br /&gt;
[[Long-term retention]] -- Two weeks after each student finished the post test, we gave each student a retention test.&lt;br /&gt;
&lt;br /&gt;
=== Independent variables ===&lt;br /&gt;
The independent variable was whether students used the version of the tutor with the optimized parameter settings or whether they used the version with the original parameter settings (an [[ecological control group]]).&lt;br /&gt;
&lt;br /&gt;
=== Hypothesis ===&lt;br /&gt;
Over-practice is unnecessary for both short-term and long-term retention. &lt;br /&gt;
&lt;br /&gt;
Reducing over-practice can improve learning efficiency.&lt;br /&gt;
&lt;br /&gt;
=== Findings ===&lt;br /&gt;
The optimized group learned as much as the control group but in less time. The two groups have similar scores on both the pre test and the post test, and the learning gain in both groups is approximately 5 points. To further examine the treatment effect, we ran an ANCOVA on the post test scores, with condition as a between-subjects factor, the pretest scores as a covariate, and an interaction term between the pretest scores and the condition. The post test scores are significantly higher than the pretest scores, p &amp;lt; .01, suggesting that the curriculum overall is effective. Meanwhile, neither the condition nor the interaction is significant, p = 0.772 and p = 0.56 respectively. As shown in Figure 2 (right), we found no significant difference in the retention test scores (p = 0.602, two tailed). The results from the post test and the retention test thus suggest no significant difference between the two groups on either test: over-practice does not lead to a significantly higher learning gain. &lt;br /&gt;
&lt;br /&gt;
[[Image: Pre_post_test.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[Image: Retention_test.jpg]]&lt;br /&gt;
&lt;br /&gt;
The actual learning time in each unit matches our hypotheses. As shown in Table 2, the students in the optimized condition spent less time than the students in the control condition in all the units except the circle unit. The optimized group saved the most time, 14 minutes, in unit 1, with marginal significance, p = .19; 5 minutes in unit 2, p = .01; and 1.92, 0.49, and 0.28 minutes in units 3, 4, and 5 respectively. In unit 6, where we lowered P(L0), the optimized group spent 0.3 more minutes. Notice the percentage of time saved in each unit: students saved 30% of tutoring time in unit 2 (Parallelogram) and 14% in unit 1 (Square). In total, students in the optimized condition saved around 22 minutes, a 12% reduction in total tutoring time.&lt;br /&gt;
&lt;br /&gt;
[[Image: Time_saving_table.jpg]]&lt;br /&gt;
&lt;br /&gt;
=== Explanation ===&lt;br /&gt;
&lt;br /&gt;
[[Knowledge component hypothesis]]&lt;br /&gt;
Over-practice is the practice that occurs after students have mastered the skills under measurement. Once the skills reach the mastery level, any further practice on them leads to very little learning gain, and those gains are not retained over the longer term. By removing over-practice, students can learn as much while saving time.&lt;br /&gt;
&lt;br /&gt;
=== Descendants ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Annotated bibliography ===&lt;br /&gt;
&lt;br /&gt;
[1]	J. R. Anderson, J. M. Fincham and S. A. Douglass, Practice and retention: A unifying analysis Journal of Experimental Psychology: Learning, Memory, and Cognition, 25 (1999), pp. 1120-1136.&lt;br /&gt;
&lt;br /&gt;
[2]	T. Barnes, The Q-matrix Method: Mining Student Response Data for Knowledge, American Association for Artificial Intelligence 2005 Educational Data Mining Workshop, 2005.&lt;br /&gt;
&lt;br /&gt;
[3]	J. E. Beck and B. P. Woolf, Evaluating tutorial actions with a user model, User Modeling and User Adapted Interaction, 2007.&lt;br /&gt;
&lt;br /&gt;
[4]	B. S. Bloom, The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring, Educational Researcher, 13 (1984), pp. 4-16 &lt;br /&gt;
&lt;br /&gt;
[5]	H. Cen, K. Koedinger and B. Junker, Automating cognitive model improvement by a* search and logistic regression, American Association for Artificial Intelligence 2005 Workshop on Educational Datamining, 2005.&lt;br /&gt;
&lt;br /&gt;
[6]	H. Cen, K. Koedinger and B. Junker, Learning Factors Analysis - A General Method for Cognitive Model Evaluation and Improvement, 8th International Conference on Intelligent Tutoring Systems, 2006.&lt;br /&gt;
&lt;br /&gt;
[7]	J. Cital, Business email report to Carnegie Learning Inc., 2006.&lt;br /&gt;
&lt;br /&gt;
[8]	A. T. Corbett and J. R. Anderson, Knowledge tracing: Modeling the acquisition of procedural knowledge, User Modeling and User-Adapted Interaction, 1995, pp. 253-278.&lt;br /&gt;
&lt;br /&gt;
[9]	A. T. Corbett, K. Koedinger and J. R. Anderson, Intelligent Tutoring Systems, in M. G. Helander, T. K. Landauer and P. Prabhu, eds., Handbook of Human-Computer Interaction, Elsevier Science, Amsterdam, The Netherlands, 1997.&lt;br /&gt;
&lt;br /&gt;
[10]	K. Draney, P. Pirolli and M. Wilson, A Measurement Model for a Complex Cognitive Skill, Cognitively Diagnostic Assessment, Erlbaum, Hillsdale, NJ 1995.&lt;br /&gt;
&lt;br /&gt;
[11]	K. R. Koedinger and A. T. Corbett, Cognitive Tutors: Technology Bringing Learning Science to the Classroom, in K. Sawyer, ed., The Cambridge Handbook of the Learning Sciences, Cambridge University Press., 2006, pp. 61-78.&lt;br /&gt;
&lt;br /&gt;
[12]	A. Lesgold, S. Lajoie, M. Bunzo and G. Eggan, Sherlock: A Coached Practice Environment for an Electronics Troubleshooting Job, in J. H. Larkin and R. W. Chabay, eds., Computer-assisted instruction and intelligent tutoring systems: shared goals and complementary approaches, Lawrence Erlbaum Associates, 1988.&lt;br /&gt;
&lt;br /&gt;
[13]	A. Mitrovic, M. Mayo, P. Suraweera and B. Martin, Constraint-based tutors: a success story, 14th Int. Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems IEA/AIE-2001, Springer-Verlag Berlin Heidelberg, Budapest, 2001.&lt;br /&gt;
&lt;br /&gt;
[14]	P. I. Pavlik and J. R. Anderson, Practice and Forgetting Effects on Vocabulary Memory: An Activation Based Model of the Spacing Effect Cognitive Science, 29 (2005), pp. 559-586.&lt;br /&gt;
&lt;br /&gt;
[15]	A. Rafferty and M. Yudelson, Applying LFA for Building Stereotypical Student Models, PSLC Summer School 2006 Projects, Pittsburgh Science of Learning Center, Pittsburgh, PA, 2006.&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Pentagon-area_learning_curve.jpg&amp;diff=7116</id>
		<title>File:Pentagon-area learning curve.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Pentagon-area_learning_curve.jpg&amp;diff=7116"/>
		<updated>2008-02-13T19:02:55Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Int_writing_tutor_hyp1.gif&amp;diff=7109</id>
		<title>File:Int writing tutor hyp1.gif</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Int_writing_tutor_hyp1.gif&amp;diff=7109"/>
		<updated>2008-02-05T04:12:17Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Int writing tutor hyp1.gif&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Probability of a correct response over time, for no transfer and negative transfer&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Int_writing_tutor_hyp2.gif&amp;diff=7108</id>
		<title>File:Int writing tutor hyp2.gif</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Int_writing_tutor_hyp2.gif&amp;diff=7108"/>
		<updated>2008-02-05T04:12:02Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Int writing tutor hyp2.gif&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Probability of a correct response over time, for no transfer (PcNo) and negative transfer (Pc), and probability of an incorrect response (PI)&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Hint.jpg&amp;diff=7107</id>
		<title>File:Hint.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Hint.jpg&amp;diff=7107"/>
		<updated>2008-02-05T04:11:46Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Hint.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;A typical hint in the Geometry Cognitive Tutor&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Geometry_Cognitive_Tutor.jpg&amp;diff=7106</id>
		<title>File:Geometry Cognitive Tutor.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Geometry_Cognitive_Tutor.jpg&amp;diff=7106"/>
		<updated>2008-02-05T04:11:29Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Geometry Cognitive Tutor.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Help-tutor.jpg&amp;diff=7105</id>
		<title>File:The Help-tutor.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Help-tutor.jpg&amp;diff=7105"/>
		<updated>2008-02-05T04:11:13Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:The Help-tutor.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Help_Seeking_Model.jpg&amp;diff=7104</id>
		<title>File:The Help Seeking Model.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Help_Seeking_Model.jpg&amp;diff=7104"/>
		<updated>2008-02-05T04:10:59Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:The Help Seeking Model.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Self-Assessment_Tutor.jpg&amp;diff=7103</id>
		<title>File:The Self-Assessment Tutor.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:The_Self-Assessment_Tutor.jpg&amp;diff=7103"/>
		<updated>2008-02-05T04:10:44Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:The Self-Assessment Tutor.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Explicit_help-seeking_instruction.jpg&amp;diff=7102</id>
		<title>File:Explicit help-seeking instruction.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Explicit_help-seeking_instruction.jpg&amp;diff=7102"/>
		<updated>2008-02-05T04:10:20Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Explicit help-seeking instruction.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Embedded_hints.jpg&amp;diff=7101</id>
		<title>File:Embedded hints.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Embedded_hints.jpg&amp;diff=7101"/>
		<updated>2008-02-05T04:10:06Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Embedded hints.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:AFL_results.JPG&amp;diff=7100</id>
		<title>File:AFL results.JPG</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:AFL_results.JPG&amp;diff=7100"/>
		<updated>2008-02-05T04:01:20Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:AFL results.JPG&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:FTR_results.JPG&amp;diff=7099</id>
		<title>File:FTR results.JPG</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:FTR_results.JPG&amp;diff=7099"/>
		<updated>2008-02-05T04:01:10Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:FTR results.JPG&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_TableScreenShot.jpg&amp;diff=7098</id>
		<title>File:Butcher TableScreenShot.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_TableScreenShot.jpg&amp;diff=7098"/>
		<updated>2008-02-05T04:00:56Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Butcher TableScreenShot.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Screenshot of the Noncontiguous tutor, with verbal explanations only.&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_UnsolvableExplanations.jpg&amp;diff=7097</id>
		<title>File:Butcher UnsolvableExplanations.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_UnsolvableExplanations.jpg&amp;diff=7097"/>
		<updated>2008-02-05T04:00:42Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Butcher UnsolvableExplanations.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Last updated January 2007.&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_FalseExplanations.jpg&amp;diff=7096</id>
		<title>File:Butcher FalseExplanations.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_FalseExplanations.jpg&amp;diff=7096"/>
		<updated>2008-02-05T04:00:28Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Butcher FalseExplanations.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Last updated January 2007.&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_DiagramScreenShot.jpg&amp;diff=7095</id>
		<title>File:Butcher DiagramScreenShot.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_DiagramScreenShot.jpg&amp;diff=7095"/>
		<updated>2008-02-05T04:00:17Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Butcher DiagramScreenShot.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_TableScreenShot2.jpg&amp;diff=7094</id>
		<title>File:Butcher TableScreenShot2.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Butcher_TableScreenShot2.jpg&amp;diff=7094"/>
		<updated>2008-02-05T04:00:06Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Butcher TableScreenShot2.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Mc_res.jpg&amp;diff=7093</id>
		<title>File:Mc res.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Mc_res.jpg&amp;diff=7093"/>
		<updated>2008-02-05T03:59:07Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Mc res.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Def_res.jpg&amp;diff=7092</id>
		<title>File:Def res.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Def_res.jpg&amp;diff=7092"/>
		<updated>2008-02-05T03:58:50Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Def res.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Trans_res.jpg&amp;diff=7091</id>
		<title>File:Trans res.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Trans_res.jpg&amp;diff=7091"/>
		<updated>2008-02-05T03:58:38Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Trans res.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:K_exnov.jpg&amp;diff=7090</id>
		<title>File:K exnov.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:K_exnov.jpg&amp;diff=7090"/>
		<updated>2008-02-05T03:58:05Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:K exnov.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Trad_diag.jpg&amp;diff=7089</id>
		<title>File:Trad diag.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Trad_diag.jpg&amp;diff=7089"/>
		<updated>2008-02-05T03:57:50Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Trad diag.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Mole_pic.jpg&amp;diff=7088</id>
		<title>File:Mole pic.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:Mole_pic.jpg&amp;diff=7088"/>
		<updated>2008-02-05T03:57:36Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:Mole pic.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
	<entry>
		<id>https://learnlab.org/mediawiki-1.44.2/index.php?title=File:CSE_screen_shot.jpg&amp;diff=7087</id>
		<title>File:CSE screen shot.jpg</title>
		<link rel="alternate" type="text/html" href="https://learnlab.org/mediawiki-1.44.2/index.php?title=File:CSE_screen_shot.jpg&amp;diff=7087"/>
		<updated>2008-02-05T03:57:17Z</updated>

		<summary type="html">&lt;p&gt;Gurpreet: uploaded a new version of &amp;quot;Image:CSE screen shot.jpg&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Gurpreet</name></author>
	</entry>
</feed>