
Teacher Authoring and Metacognition at the PSLC

JCDL 2008 trip continued: On my way out of town I couldn’t resist stopping by the PSLC to attend a lunch meeting where Turadg Aleahmad and Ido Roll were giving practice talks for ITS2008. Turadg presented an online authoring tool designed for teachers to create worked-example math problems. I was surprised to hear that over 500 different users had submitted problems. That is, until I heard that he had posted an invitation on a website offering $10 for each submission. Most of the submissions were unusable.

This vision of providing tools for teachers to create online content is similar to what I envisioned for my dissertation work, which led to the eNLVM. My eyes were soon opened to the fact that most teachers do not have the time or skill to create online content, especially from scratch. I suggested to Turadg that if he wanted to encourage more and better problem submissions, they could provide example problems on which teachers could base similar ones. I also pointed out that there is already a massive supply of math problems in textbooks that could be tapped. He and others present mentioned concerns about copyright. To me, this is not a problem. By looking at a math problem you can extract the essence of the problem, or its “problem type,” and use that to easily generate many more problems of the same type with different cover stories and values. Of course, until you solve a problem it can be difficult to know that it has a solution structure similar to another’s. This is the basis for a project I would like to do some day: a library of math problem generators coupled with math test generators that leverage the problem generators and their alignments with standards and textbooks.
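To make the idea concrete, here is a minimal sketch of what one such problem generator might look like. Everything here is invented for illustration — the function name `generate_rate_problem`, the cover stories, and the value ranges are my assumptions, not anything from the PSLC work. The point is that the “problem type” (here, total = unit price × count) is fixed, while the cover story and numbers vary on each call:

```python
import random

# Hypothetical sketch: one "problem type" captured as a template plus
# value ranges. Each call produces a fresh problem with a different
# cover story and different values, along with its computed answer.

COVER_STORIES = [
    ("apples", "a fruit stand"),
    ("tickets", "a box office"),
    ("stickers", "a school fair"),
]

def generate_rate_problem(rng=random):
    """Generate one unit-rate problem and its answer."""
    item, place = rng.choice(COVER_STORIES)
    price = rng.randint(2, 9)    # unit price in dollars
    count = rng.randint(3, 12)   # number of items bought
    statement = (f"At {place}, {item} cost ${price} each. "
                 f"How much do {count} {item} cost?")
    answer = price * count       # shared solution structure: total = rate * count
    return statement, answer

problem, answer = generate_rate_problem()
```

A test generator could then draw from a library of such functions, using each generator’s tagged alignment to a standard or textbook section to assemble a test — and because every generated instance shares the same solution structure, a scoring key comes for free.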

Ido presented a study that measured metacognition, specifically help-seeking behavior. He began by flaming simple recall as a learning outcome, showing the example of the YouTube video of the child who can point to the countries her parents name. He did this probably because a postdoc sitting in the presentation focuses on fact learning (Chinese). Ido’s study compared a new measure against the “assistment” measure used by Carnegie Learning’s tutors as predictors of learning. It seems to me that they pretty much measured the same thing, and both are somewhat good predictors of learning.

This is an interesting area. Information seeking is a metacognitive skill: knowing when you know enough to proceed and when you don’t; having the will not to take the lazy way out when you do know enough; knowing where to go to find the information you need. The picture is actually much more complex than this. When you are first learning something, or solving a novel problem, it is expected that you would need more information. Better problem solvers and learners recognize this and seek the needed information effectively. As you learn more in an area, you don’t need as much help, and so you should stop relying on it. In a situation where making a wrong decision could cause someone to die, the good problem solver relies on additional sources to verify that what they think is a good decision actually is one :-) .

Measuring information-seeking behavior is an important way to measure problem-solving ability. Unfortunately, school, and even worse, school testing situations, are very unnatural problem-solving situations where information-seeking behavior is called cheating :-)

2 Responses to “Teacher Authoring and Metacognition at the PSLC”

  1. Hi Joel,

    Thanks for coming to our practice talks and giving your input. Conferences are too infrequent an opportunity to connect with people outside our institutional silos. I’ve recently begun a blog to get my research out into the open. (The link you included was to my personal/autobiographic blog, and also my name is spelled with a G, “Turadg”.)

    If anyone is interested in the details of the study Joel referenced, there’s an entry on the blog with the abstract and links to the full paper.

    Joel is right that teachers weren’t very eager to participate in the experiment and most of the participation came from people seeking the money. The $10 figure he cites is incorrect, though. First off, it was designed as a small controlled experiment to test the effect of a particular design intervention, and it included a pseudo-experimental examination of the effects of participant expertise. Participant motivation was not a question in this study. If it were, offering money would be a pretty silly thing to do, because that’s not sustainable. (If you care, participants were offered $4 per qualifying contribution, up to 3, and the URL was leaked out to reach many more people than intended.)

    Crunching the numbers, we found that the design manipulation did not improve the quality of the contributions, but it did increase their length. Figuring out why would take further work that we don’t plan to do. One interesting result regarding participant expertise was that math teachers wrote better problem statements than anyone else, but their solutions were no better, and amateurs’ solutions were often better. This is as judged by two math teachers (blind to the conditions).

    Again, thanks for joining our practice session that day and I hope we can exchange more ideas in the future.


  2. [...] at deeper levels of knowledge than what typical assessments do. I didn’t realize this but, Turadg, whose presentation I attended is one of David’s students. I shared my reaction to Turadg’s study with David: in order [...]
