Educational Research Methods 2013
Contents
- 1 Research Methods for the Learning Sciences 05-748
- 1.1 Class times
- 1.2 Location
- 1.3 Instructor
- 1.4 Class URLs
- 1.5 Goals
- 1.6 Course Prerequisites
- 1.7 Textbook and Readings
- 1.8 Flipped Homework: Reading Reports and Pre-Class Assignments
- 1.9 Grading
- 1.10 Class Schedule in Brief
- 1.11 Class Schedule with Readings and Assignments
- 1.11.1 Course Intro & Formulating Good Research Questions (Koedinger)
- 1.11.2 Cognitive Task Analysis (Koedinger)
- 1.11.3 Video and Verbal Protocol Analysis (Lovett, Rosé)
- 1.11.4 Cognitive Task Analysis - Revisited (Koedinger)
- 1.11.5 Psychometrics, reliability, Item Response Theory (Junker)
- 1.11.6 Design Research & Qualitative Methods (Koedinger)
- 1.11.7 NO CLASS – Spring break 3-12 and 3-14
- 1.11.8 Surveys, Questionnaires, Interviews (Kiesler)
- 1.11.9 Educational Data Mining -- Learning Curve Analysis (Koedinger)
- 1.11.10 Flex day (Koedinger)
- 1.11.11 Educational Data Mining -- Causal Inference from Data (Scheines)
- 1.11.12 Experimental Research Methods (Koedinger)
- 1.11.13 Wrap-up
Research Methods for the Learning Sciences 05-748
Spring 2013 Syllabus Carnegie Mellon University
Class times
4:30 to 5:50 Tuesday & Thursday
Location
3001 Newell Simon Hall
Instructor
Professor Ken Koedinger
Office: 3601 Newell-Simon Hall, Phone: 412-268-7667
Email: Koedinger@cmu.edu, Office hours by appointment
Class URLs
Syllabus and useful links: learnlab.org/research/wiki/index.php/Educational_Research_Methods_2013
For reading reports: www.cmu.edu/blackboard
Summary Table: [1]
Goals
The goals of this course are to learn data collection, design, and analysis methodologies that are particularly useful for scientific research in education. The course will be organized in modules addressing particular topics including cognitive task analysis, qualitative methods, protocol and discourse analysis, survey design, psychometrics, educational data mining, and experimental design. We hope students will learn how to apply these methods to their own research programs, how to evaluate the quality of application of these methods, and how to effectively communicate about using these methods.
Course Prerequisites
To enroll you must have taken 85-738, "Educational Goals, Instruction, and Assessment," or obtain the permission of the instructor.
Textbook and Readings
"The Research Methods Knowledge Base: 3rd edition" by William M.K. Trochim and James P. Donnelly. You can find it at www.atomicdogpublishing.com/BookDetails.asp?BookEditionID=160
The course registration id is 1620032912010.
Other readings will be assigned in class. See below.
Flipped Homework: Reading Reports and Pre-Class Assignments
We are often going to implement "flipped homework", a variation on the flipped classroom idea you may have heard of. Flipped homework is an assignment completed before the relevant class meeting rather than after it. It helps students (you!) to "problematize" the topic -- to get a better sense of what you don't know and what questions you have. It helps instructors focus the class discussion, avoiding belaboring what students already know and better pursuing student needs and interests.
Students will be asked to write "reading reports" before most class sessions. We will use the discussion board on Blackboard (www.cmu.edu/blackboard) for this purpose.
Unless otherwise directed by instructors, students should make two posts on the readings before 9am on the day of class that those readings are due. If slides for the class are available, please review these as well.
These posts serve multiple purposes: 1) to improve your understanding and learning from the readings, 2) to provide instructors with insight into what aspects of the readings merit further discussion, either because of student need or interest, and 3) as an incentive to do the readings before class!
In general, please come to class prepared to ask questions and give answers.
Your two posts may be original or in response to another post (one of each is nice).
- Original posts should contain one or more of the following:
- something you learned from the reading or slides
- a question you have about the reading or slides or about the topic in general
- a connection with something you learned or did previously in this or another course, or in other professional work or research
- Replies should be an on-topic, relevant response, clarification, or further comment on another student’s post.
You may be asked to do other activities before class, such as answering questions online using the Assistment system, completing parts of an OLI course, or beginning work on an assignment. That way you can come to class with a better appreciation for what you do not understand and need to learn.
Grading
There will be assignments associated with each section of the course. Grades will be determined by your performance on these assignments, by before-class preparation activities including reading reports, by your participation in class, and by a final paper.
- Course work
- 30% Before-class preparation, including reading reports, and in-class participation
- 40% Assignments
- Project & final paper - Due May 10.
- 30% Design a new study based on one or more of these methods that pushes your own research in a new direction.
- Apply a method from the class to your research. You should not choose a method that you already know well.
- Think of it as writing a grant proposal. Because some methods will be introduced after the project proposal date, we are open to a modification in your project to apply the newly introduced method. But, please check with us to get feedback and approval on a proposed change.
- No more than 15 double-spaced pages. Be efficient. Space is always limited in academic publications and you will find it useful to learn to include only what is important. Since this is styled as a grant proposal, please include some literature review and discussion of significance of the area you want to investigate. You should also briefly detail plans for participants, explain specifically how you will apply the method, and describe how you will analyze the data.
Class Schedule in Brief
- Course Intro: Formulating Good Research Questions: Jan 15 (T)
- Cognitive Task Analysis 1: Jan 17, 22, 24 (RTR)
- Video and Verbal Protocol Analysis: Jan 29, 31, Feb 5,7,12,14 (TRTRTR)
- Guest Instructors: Marsha Lovett & Carolyn Rose
- Cognitive Task Analysis 2: Feb 19, 21 (TR)
- Educational Measurement & Psychometrics: Feb 26, 28, Mar 5 (TRT)
- Guest Instructor: Brian Junker
- Educational Design Research: Mar 7 (R)
- NO CLASS – Spring break, Mar 12, 14 (TR)
- Surveys, Questionnaires, Interviews: Mar 19, 21 (TR)
- Guest Instructor: Sara Kiesler
- Educational Data Mining & Learning Curves: March 26, 28, Apr 2 (TRT)
- Flex day: Apr 4 (R)
- Educational Data Mining & Causal Inference: Apr 9, 11, 16 (TRT)
- Guest Instructor: Richard Scheines
- NO CLASS – Spring Carnival, Apr 18 (R)
- Experimental Methods: Apr 23, 25, 30 (TRT)
- Wrap-up: May 2 (R)
Class Schedule with Readings and Assignments
NOTE: This is a "living" document. It carries over some elements from the past course offering that may get changed before the scheduled class period.
Course Intro & Formulating Good Research Questions (Koedinger)
- 1-15
- See your email or www.cmu.edu/blackboard for the pre-class assignment.
- Lecture slides
- Read Trochim Chapter 1, particularly sections 1-2d and 1-4. See above for how to get the book -- but here's Chapter1
- [Optional (re)reading] Nathan, M., & Alibali, M. (2010). Learning sciences. WIREs Cognitive Science. PDF
Cognitive Task Analysis (Koedinger)
- 1-17
- Zhu, X. & Simon, H. A. (1987). Learning mathematics from examples and by doing. Cognition and Instruction, 4(3), 137-166. Zhu&Simon-1987.pdf
- Do a couple of short assignments here: http://Assistment.org. Please create an account, click on "Tutor", then "Enroll in a class", and select "Ken Koedinger" and "Educational Research Methods".
- Slides: CTA1-2013.pdf
- [Optional reading] Zhu, X., Lee, Y., Simon, H. A., & Zhu, D. (1996). Cue recognition and cue elaboration in learning from examples. Proceedings of the National Academy of Sciences, 93, 1346-1351. PNAS-1996-Zhu-Simon.pdf
- 1-22
- Clark, R. E., Feldon, D., van Merriënboer, J., Yates, K., & Early, S. (2007). Cognitive task analysis. In J. M. Spector, M. D. Merrill, J. J. G. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 577–593). Mahwah, NJ: Lawrence Erlbaum Associates. Clarketal2007-CTAchapter.pdf
- One point of reflection for you on the Clark et al reading is to compare and contrast the Cognitive Task Analysis (CTA) methods and output representations recommended with the approach taken by Zhu & Simon. Also, note their examples and claims about the power of CTA for improving instruction. (If you saw Bror Saxberg's PIER talk last year, you may have heard that Kaplan is using CTA, with Clark's advice, to revise and improve their courses.)
- Chapter 2: How Experts Differ From Novices in Bransford, J. D., Brown, A., & Cocking, R. (2000). (Eds.), How people learn: Mind, brain, experience and school (expanded edition). Washington, DC: National Academy Press. HowPeopleLearnCh2.pdf
- Besides being an interesting read, a key point of this reading is the nature of expert knowledge (declarative and procedural) and how it is highly "conditionalized". How is this claim similar or different from Zhu & Simon? The notion of adaptive expertise is also important and interesting.
- As you read the 1-22 and 1-24 readings, be thinking about steps you could take to do a cognitive task analysis, empirical and rational, in a domain of your interest. Think about what tasks you would use, what CTA technique(s) you would apply, and how you might represent the output of your analysis.
- 1-24
- Aleven, V., McLaren, B., Roll, I., & Koedinger, K. R. (2004). Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In J.C. Lester, R.M. Vicari, & F. Parguacu (Eds.) Proceedings of the 7th International Conference on Intelligent Tutoring Systems, 227-239. Berlin: Springer-Verlag. AlevenITS2004.pdf
- Klahr, D., & Carver, S.M. (1988). Cognitive objectives in a LOGO debugging curriculum: Instruction, learning, and transfer. Cognitive Psychology, 20, 362-404. Klahr&carver88.pdf
- Siegler, R.S. (1976). Three aspects of cognitive development. Cognitive Psychology, 8 (4), 481-520, Elsevier. Siegler76.pdf
- Pick one of these readings to focus on and skim the other two. Target your first post on that reading (and make clear which one it was). Your second post can be on any of the three. These readings illustrate the use of Cognitive Task Analysis (CTA) outside of math domains. The Aleven et al. reading provides an example of a CTA at the level of metacognitive skills. The Siegler reading shows a CTA dealing with younger kids. The Klahr & Carver reading shows how CTA can facilitate the design of instruction that achieves a substantial level of transfer. When you skim all three, pay particular attention to 1) what tasks the authors are analyzing, 2) what their goal is, 3) what method(s) of analysis they use, and 4) how they represent the output of their analysis: do they use production rules, goal trees, semantic nets, hierarchical task models, or something else?
- In the first forum (where you posted one of your research topics), reply to your thread with a post that describes an example task that you could productively analyze in your domain of interest. You might also indicate some variations on the task that might help reveal what is most challenging for learners.
- Other possible readings:
- Newell & Simon Human_Problem_Solving.pdf
- Lovett Lovett01CandI.pdf
Video and Verbal Protocol Analysis (Lovett, Rosé)
The plan for these six sessions in 2013, 1-29 to 2-14, is in this document.
By the end of this module, students should be able to:
- Explain what is involved in collecting and analyzing verbal data (including both “hand” and automatic approaches to analysis)
- Recognize when – and explain why – protocol analysis is/is not appropriate to particular research situations.
- Apply protocol analysis methods to already collected and segmented data.
Besides reading and discussing articles, students will complete a coding scheme design assignment.
Four parts of this assignment will be done as homework or in-class work:
- Part A (homework): Between sessions 2 and 3, propose one or more hypotheses and think about how you could use protocol analysis on the given data set to evaluate those hypotheses.
- Part B (homework): By session 5, develop a short coding manual and apply your coding scheme to a subset of the provided data. Bring 2 printouts to class. Also install LightSIDE software on your laptop and make sure it runs (http://www.cs.cmu.edu/~emayfiel/side.html).
- In class Part C: In session 5, swap coding manuals with a classmate and use their coding manual to code the same data they coded (without looking at their codes!), then measure reliability. (A minimal reliability sketch follows this list.)
- Part D (homework): For session 6, prepare data for automatic coding, and bring soft-copy to class along with your laptop.
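For Part C, here is a minimal sketch of computing inter-rater reliability (Cohen's kappa) by hand in R. The code labels below are made up for illustration and are not from the assignment data; the irr package's kappa2() is a packaged alternative if you prefer.

# Minimal sketch: Cohen's kappa for two coders' labels (made-up codes)
coder1 <- c("explain", "explain", "monitor", "other", "monitor", "explain", "other", "monitor")
coder2 <- c("explain", "monitor", "monitor", "other", "monitor", "other",   "other", "monitor")

tab <- table(coder1, coder2)                          # confusion matrix of the two codings
po  <- sum(diag(tab)) / sum(tab)                      # observed agreement
pe  <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # agreement expected by chance
kappa <- (po - pe) / (1 - pe)                         # chance-corrected agreement
kappa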
- Session 1[Jan 29]: Overview of Protocol Analysis
- In this introductory discussion, we will explore the basics of collecting verbal protocol data as well as a high-level view of what’s involved in analyzing such data. We will explore different uses of verbal data.
- Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6(3), 271-315.
- Discussion Questions:
- What are the main contrasts between the approach Chi advocates for analysis of verbal data and how she presents verbal protocol analysis?
- What can be gained from using these approaches? Which, if either, do you have experience with, and if so, can you describe that experience?
- How does Chi present these methodologies as complementary to more formally quantitative methodologies?
Cognitive Task Analysis - Revisited (Koedinger)
- 2-19
- Koedinger, K.R. & Nathan, M.J. (2004). The real story behind story problems: Effects of representations on quantitative reasoning. The Journal of the Learning Sciences, 13 (2), 129-164. Koedinger-Nathan-LS04.pdf
- Optional: Koedinger, K.R., & MacLaren, B. A. (2002). Developing a pedagogical domain theory of early algebra problem solving. CMU-HCII Tech Report 02-100. Accessible via http://reports-archive.adm.cs.cmu.edu/hcii.html KoedingerMacLaren02.pdf
- In addition to think-aloud, another empirical approach to Cognitive Task Analysis is to compare student performance on a space of similar tasks designed to test specific hypotheses about the knowledge demands of those tasks. We have called this approach "Difficulty Factors Assessment" and the Koedinger & Nathan paper provides a nice illustration. Think about how you could create a model of skills needed for these tasks and use it to predict or account for the key differences observed in the data. After coming up with some ideas, compare them with the optional reading, Koedinger & MacLaren. Make at least one related post. (A minimal sketch of this kind of skill model appears after this list.)
- Write another post briefly indicating how you might perform a CTA in a domain of your interest. Which kind(s) of CTA would you employ and why?
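A minimal sketch, in R, of the kind of skill model mentioned above. The skills, the skill-to-task mapping, and the success probabilities are hypothetical placeholders, not values from Koedinger & Nathan; the point is only the mechanics of predicting task difficulty from hypothesized knowledge demands.

# Minimal sketch: predict task difficulty from a hypothesized skill model
# (all skills, mappings, and probabilities below are made up for illustration).
skills <- c(comprehend_verbal      = 0.95,  # assumed P(correct application) per skill
            comprehend_symbolic    = 0.75,
            informal_arithmetic    = 0.90,
            algebraic_manipulation = 0.70)

# Q-matrix: which skills each task type is hypothesized to require (1 = required)
Q <- rbind(story_problem     = c(1, 0, 1, 0),
           word_equation     = c(1, 0, 1, 0),
           symbolic_equation = c(0, 1, 0, 1))
colnames(Q) <- names(skills)

# If skills apply independently, predicted P(task correct) is the product of the
# required skills' probabilities; compare such predictions to the observed data.
predicted <- apply(Q, 1, function(req) prod(skills[req == 1]))
round(predicted, 2)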
- 2-21
- Koedinger, K.R. & McLaughlin, E.A. (2010). Seeing language learning inside the math: Cognitive analysis yields transfer. In S. Ohlsson & R. Catrambone (Eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society. (pp. 471-476.) Austin, TX: Cognitive Science Society. Koedinger-mclaughlin-cs2010.pdf
- One thing that struck me in our last conversations about applying CTA to your work is that it was sometimes difficult to track the connection to the research goal. Let's make that goal explicit here. Make one post that states (one of) your key research question(s) as related to learning research.
- Make another post about the Koedinger & McLaughlin reading. The main point of this reading is to provide an illustration of translating CTA into an instructional innovation and an evaluation of that innovation. In this second post, describe the strategy used to translate from CTA to instructional innovation. (If you are having trouble, first try to describe the key CTA outcome and what the innovation is.)
- Other optional readings
- Rittle-Johnson, B. & Koedinger, K. R. (2001). Using cognitive models to guide instructional design: The case of fraction division. In Proceedings of the Twenty-Third Annual Conference of the Cognitive Science Society (pp. 857-862). Mahwah, NJ: Erlbaum. Rittle-Johnson-Koedinger-cogsci01.pdf
- Koedinger, K. R., Corbett, A. C., & Perfetti, C. (in press). The Knowledge-Learning-Instruction (KLI) framework: Bridging the science-practice chasm to enhance robust student learning. Cognitive Science. KLI-paper-v5.13.pdf
Psychometrics, reliability, Item Response Theory (Junker)
NEW ASSIGNMENTS
- 2-26
Quick introduction to the R statistical language
- Please complete this and bring comments & questions to class on Thurs, Feb 28.
- Please download research_methods_r_assignment.zip from http://www.stat.cmu.edu/~brian/PIER-methods/. The Zip file contains three further files:
- R-preassignment.pdf - instructions for this assignment
- r-tutorial-1.R - examples of statistical things that you will do in R, for this assignment
- thermo11_data_integrated.csv - a data set for the examples. (A first-look sketch in R follows this list.)
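A first-look sketch for the data file above, assuming the zip has been unpacked into your R working directory; nothing here depends on the particular column names in the file.

# Minimal sketch: load and eyeball the assignment data set
thermo <- read.csv("thermo11_data_integrated.csv")  # assumes the zip was unpacked here
str(thermo)       # column names and types
summary(thermo)   # per-column summaries
head(thermo, 5)   # first five rows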
- 2-28
1. From Trochim:
A. Chapter 3 - the vocabulary of measurement
B. Chapter 5 - on constructing scales (it's OK to focus on the material up through section 5.2a; the rest is more of a skim [but I'd be happy to talk about that in class also])
2. On item response theory (IRT), a set of statistical models that are used to construct scales and to derive scores from them, especially in education and psychological research:
A. Harris Article (PDF) Please take and self-score the test at the end of this article. Count each part of question 1 as one point, and each of the remaining three questions as one point (no partial credit!). Bring your 8 scores to class. For example, if you missed 1(c) and 1(d), and you also missed question 4, you would bring the following scores: 1 1 0 0 1 1 1 0. If you missed 1(a) and 1(b) and question 2, bring: 0 0 1 1 1 0 1 1. (Note that the total score is 5 in both cases, but the pattern of rights and wrongs differs; it is the pattern that we are interested in.)
B. Please browse *online* through pp. 1-23 of the PDF at [2]. The math is a bit heavy going, but there are links to apps that illustrate various points in the Harris article. So skim the math and play with the apps. (A small IRT simulation sketch in R follows.)
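A small IRT sketch in R, using simulated scores rather than your class's and the ltm package (an assumption on my part, not part of the assignment), to preview the kinds of models in the reading.

# Minimal sketch: simulate Rasch-style 0/1 responses for 8 items, then fit a
# Rasch model and a 2PL model (under the 2PL, response patterns rather than
# just total scores affect ability estimates).
# install.packages("ltm")   # assumption: the ltm package is available
library(ltm)

set.seed(1)
n_students <- 50; n_items <- 8
ability    <- rnorm(n_students)
difficulty <- seq(-1.5, 1.5, length.out = n_items)
p    <- plogis(outer(ability, difficulty, "-"))              # P(correct) under Rasch
resp <- as.data.frame((matrix(runif(n_students * n_items), n_students) < p) * 1)

rasch_fit <- rasch(resp)       # one difficulty parameter per item
twopl_fit <- ltm(resp ~ z1)    # adds per-item discrimination
coef(rasch_fit)                # estimated item difficulties
factor.scores(twopl_fit)       # ability estimates per observed response pattern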
- 3-5
The assignment for this lecture has two parts.
- (A) An R assignment, TBA. You can email this to me by Fri, Mar 7.
- (B) The readings below.
On Tuesday we will discuss whichever parts of A and/or B seem most interesting.
1. "Psychometric Principles in Student Assessment" by Mislevy et al (Mislevy (PDF))
Read through p. 18. This is a more modern look at some of the same issues that are addressed in Trochim's chapters. The remainder of this paper surveys various probabilistic models for the "measurement model" portion of Mislevy's framework (Figure 1). It is quite interesting, but we will not pursue it.
2. "Cognitive Assessment Models with Few Assumptions..." by Junker & Sijtsma (Junker, Sijtsma (PDF))
Please read up through p 266 only. The math is a bit heavy going so please try to read around it to see what the point of the article is. We will try to look at some of the data in the article as examples in lecture 2.
Design Research & Qualitative Methods (Koedinger)
- 3-7
- Trochim Ch 8 (stop before 8.5), Ch 13 (stop before 13.3)
- Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1). PDF
- Optional: Chapter on Design Research in Handbook of Learning Sciences
NO CLASS – Spring break 3-12 and 3-14
Surveys, Questionnaires, Interviews (Kiesler)
- 3-19
- Reading: Trochim Ch 4 and 5
- You already read Ch 5 for the Psychometrics section, so just review it. For both chapters, answer Trochim's on-line questions before and/or after reading (answering them before gives you goals for reading). For discussion board posts, do one post on how you have used or might use a survey (e.g., of student attitudes) in your own research. Make another post about Chapter 4, such as something you learned, a question you have, or an answer to someone else's question.
- 3-21
- Do the following homework assignment Media:Arm-modQuestEduc.doc. Sara directs: Keep the text that's there and fill in answers, working through it step by step. I'm just as interested in your revisions as in the final version. Est time 45 minutes.
- Readings
- Tourangeau, Roger, and T. Yan. 2007. "Sensitive questions in surveys." Psychological Bulletin, 133(5): 859-883. Media:Tourangeau_SensitiveQuestions.pdf
- Tourangeau, R. (2000). "Remembering what happened: Memory errors and survey reports." In A. Stone, J. Turkkan, C. Bachrach, J. Jobe, H. Kurtzman, & V. Cain (Eds.), The Science of Self-Report: Implications for research and practice (pp. 29-48). Englewood Cliffs, NJ: Lawrence Erlbaum. Media:Tourangeau_RememberingWhatHappened.pdf
Educational Data Mining -- Learning Curve Analysis (Koedinger)
- 3-26
- Readings:
- Ritter, F.E., & Schooler, L. J. (2001). The learning curve. In W. Kintch, N. Smelser, P. Baltes, (Eds.), International Encyclopedia of the Social and Behavioral Sciences. Oxford, UK: Pergamon. RittterSchooler01.pdf
- Stamper, J. & Koedinger, K.R. (2011). Human-machine student model discovery and improvement using data. In J. Kay, S. Bull & G. Biswas (Eds.), Proceedings of the 15th International Conference on Artificial Intelligence in Education, pp. 353-360. Berlin: Springer. Stamper-Koedinger-AIED2011.pdf
- Short assignment: The assignment ( Learning-curve-assignment-2012.doc) is a tutorial on using DataShop to begin analyzing learning curves. This file (Geometry_-_Unit_Area.pdf) will be needed to help answer questions Q6-Q7.
- On the discussion board, 1) post a comment or question about at least one of the two readings and 2) attach your assignment and/or bring a hard copy of it to class. That is, only one post is required (but more than one is welcome). (A minimal curve-fitting sketch in R follows this list.)
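A minimal curve-fitting sketch using simulated data rather than DataShop output: the power law in the Ritter & Schooler reading says error rate falls as a power function of practice opportunity, which becomes a straight line on log-log axes.

# Minimal sketch: fit a power-law learning curve, error = a * opportunity^(-b),
# to simulated data via log-log regression.
set.seed(2)
opportunity <- 1:15
error_rate  <- pmin(1, 0.6 * opportunity^(-0.5) * exp(rnorm(15, sd = 0.1)))

fit <- lm(log(error_rate) ~ log(opportunity))
coef(fit)   # intercept estimates log(a); the slope estimates -b (the learning rate)

plot(opportunity, error_rate, log = "xy",
     xlab = "Opportunity (practice count)", ylab = "Error rate")
lines(opportunity, exp(fitted(fit)))   # the fitted curve is a line on log-log axes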
- 3-28
- Readings:
- Zhang, X., Mostow, J., & Beck, J. E. (2007, July 9). All in the (word) family: Using learning decomposition to estimate transfer between skills in a Reading Tutor that listens. AIED2007 Educational Data Mining Workshop, Marina del Rey, CA AIED2007_EDM_Zhang_ld_transfer.pdf
- Cen, H., Koedinger, K. R., & Junker, B. (2006). Learning Factors Analysis: A general method for cognitive model evaluation and improvement. In M. Ikeda, K. D. Ashley, T.-W. Chan (Eds.) Proceedings of the 8th International Conference on Intelligent Tutoring Systems, 164-175. Berlin: Springer-Verlag. PDF
- Optional: Martin, B., Mitrovic, T., Mathan, S., & Koedinger, K.R. (2011). Evaluating and improving adaptive educational systems with learning curves. User Modeling and User-Adapted Interaction: The Journal of Personalization Research (UMUAI). 21(3), pp. 249-283. file
- 4-2
- Please finish off one of the two exercises you started for last class. See A or B further below. In either case, provide a brief writeup in response to each of the numbered steps and include a summary of the result you achieved (e.g., did you get a more predictive model as measured by AIC, BIC, or cross validation). Turn in this writeup and the supporting file (KC model table or R file) on Blackboard.
- ALSO, make a post about your idea for a course final project. What method might you apply to address what research question?
- No required reading assignment.
- Optional readings:
- Roberts, Seth, & Pashler, Harold. (2000). How persuasive is a good fit? A comment on theory testing. Psychological Review, 107(2), 358 - 367. Media:2000_roberts_pashler.pdf
- Schunn, C. D., & Wallach, D. (2005). Evaluating goodness-of-fit in comparison of models to data. In W. Tack (Ed.), Psychologie der Kognition: Reden and Vorträge anlässlich der Emeritierung von Werner Tack (pp. 115-154). Saarbrueken, Germany: University of Saarland Press. Media:GOF.doc
Do A or B:
A. Modify a KC model in a DataShop dataset
1. What is the DataShop dataset you modified?
2. Describe how you used the HMST procedure (from the Stamper paper) to identify a KC to try to improve.
3. Show how you recoded that KC with new KCs (turn in your modified KC file) and describe why you made the change you did.
4. After importing your new KC model to DataShop, did it improve the predictions (as measured by any of the metrics: AIC, BIC, or cross validation)? (Caution: Make sure your new KC model labels the same number of observations as the KC model you are modifying.)
B. Use R to create an alternative statistical model to AFM
1. Approximate AFM in R using either glm or lmer. How do the parameter estimates and metrics (AIC and BIC) compare with results in DataShop?
2. Modify the regression equation to try to improve the prediction. Some options include: a) adding a student-by-KC interaction (there are just main effects of student and KC in AFM), b) adding student slopes (there is just a KC slope in AFM), c) counting success and failure opportunities separately (both kinds of opportunities are lumped together in AFM), d) using the log of opportunity, e) including step (perhaps as a random effect), ...
3. Turn in your R file including metrics (log-likelihood, parameters, AIC, BIC) for the statistical models you compared.
4. Summarize whether or not your modification changes model fit (log-likelihood), changes the number of parameters (from what to what), and, most importantly, improves prediction (as measured by AIC or BIC).
(A minimal R sketch of option B follows.)
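A minimal R sketch of option B, using simulated data in place of a DataShop export (the column names student, kc, opportunity, and success are placeholders) and a logistic mixed model via lme4::glmer as one way to approximate AFM.

# Minimal sketch: approximate AFM with a logistic mixed model, then try one
# modification (per-student opportunity slopes, option 2b above) and compare fits.
library(lme4)

set.seed(3)
steps <- expand.grid(student = factor(1:30),
                     kc = factor(c("circle-area", "rectangle-area")),
                     opportunity = 1:8)
theta <- rnorm(30)[as.integer(steps$student)]          # simulated student proficiencies
beta  <- ifelse(steps$kc == "circle-area", -0.5, 0.5)  # simulated KC easiness
steps$success <- rbinom(nrow(steps), 1,
                        plogis(theta + beta + 0.15 * steps$opportunity))

# AFM-style model: random student intercepts, KC intercepts, KC-specific learning slopes
afm <- glmer(success ~ kc + kc:opportunity + (1 | student),
             data = steps, family = binomial)

# Modification: also let each student have their own opportunity slope
# (may warn about a singular fit on toy data like this).
afm_b <- glmer(success ~ kc + kc:opportunity + (opportunity | student),
               data = steps, family = binomial)

AIC(afm, afm_b)               # lower AIC suggests better predictive fit
logLik(afm); logLik(afm_b)    # raw model fit for the writeup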
Flex day (Koedinger)
- 4-4 To be used in case of rescheduling or for a student-driven topic.
- And/or for Review of Projects or Past Topics
- Make some progress on your course project and bring your ideas/questions to class. On the discussion board, please reply to my reply about your posted project idea to commit to an idea or elaborate on your plan.
- Regarding methods we have addressed, it seems there were some lingering questions at the end of the last couple of sessions related to statistical analysis and/or design research strategies. We can talk in class about those.
Educational Data Mining -- Causal Inference from Data (Scheines)
- 4-9
- Do Unit 2 in the OLI course Empirical Research Methods
Go to http://oli.web.cmu.edu/openlearning/. In the left tab, go to "Prior work...", then "Empirical Research Methods", click on "Peek In", and complete Unit 2.
- Read Scheines, R., Leinhardt, G., Smith, J., & Cho, K. (2005). Replacing lecture with web-based course materials. Journal of Educational Computing Research, 32(1), 1-26. PDF (A small simulated-confounding sketch in R follows this list.)
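A small simulated-confounding sketch in R (all variable names and effect sizes are invented) illustrating the core issue in causal inference from observational data that Unit 2 and the Scheines et al. paper address: an association produced entirely by a common cause.

# Minimal sketch: a spurious association induced by a common cause, and how
# conditioning on that cause changes the estimate (all values are simulated).
set.seed(4)
n <- 500
prior_ability   <- rnorm(n)
online_practice <- 0.8 * prior_ability + rnorm(n)   # driven by ability; no direct effect on exam
exam_score      <- 1.0 * prior_ability + rnorm(n)   # also driven by ability

coef(lm(exam_score ~ online_practice))                  # naive: practice appears to "help"
coef(lm(exam_score ~ online_practice + prior_ability))  # adjusted: effect near zero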
- 4-11 Continue discussion of Causal Inference from Data & TETRAD
- 4-16 Continue discussion of TETRAD
- 4-18 NO CLASS - Spring Carnival
Experimental Research Methods (Koedinger)
- 4-23
- Reading: Trochim Ch 7
- Slides: ppt
- 4-25
- 4-30
- 5-2
- Reading: Trochim Ch 14
- Optional: Try the ANOVA module of the OLI Statistics course. (A minimal aov() sketch follows.)
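A minimal aov() sketch with invented conditions and scores, previewing the kind of between-groups analysis covered in Trochim Ch 14 and the OLI ANOVA module.

# Minimal sketch: one-way ANOVA comparing three (made-up) instructional conditions
set.seed(5)
condition <- factor(rep(c("control", "worked-examples", "self-explanation"), each = 20))
posttest  <- c(rnorm(20, 70, 10), rnorm(20, 75, 10), rnorm(20, 78, 10))

fit <- aov(posttest ~ condition)
summary(fit)     # F test: is there any difference among condition means?
TukeyHSD(fit)    # pairwise comparisons with family-wise error control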
Wrap-up
If needed, schedule a course wrap-up
Final project is due May 10.