
JANICE GOBERT

Worcester Polytechnic Institute, Apprendis
Public Discussion
  • Randy Kochevar

    Facilitator
    May 11, 2015 | 06:12 p.m.

    I think the idea of an “intelligent tutor” is really fascinating! How do the outcomes using this system compare to more linear, non-responsive systems?

  • Janice Gobert

    Presenter
    May 12, 2015 | 04:45 p.m.

Hi— a non-responsive system is clearly not as good as one that responds. What, in particular, are you asking about? Its efficacy?

    jg

  • Randy Kochevar

    Facilitator
    May 14, 2015 | 06:31 p.m.

    Sorry – my question wasn’t clear. I was curious how you assess the effectiveness of an “intelligent tutor” as compared to other systems of teaching the same material, but which lack the “intelligence.” It seems intuitively true that it is better – I’m just wondering how you measure the improvement. Thanks!

  • Ben Sayler

    Facilitator
    May 11, 2015 | 11:09 p.m.

    What levels of sophistication have you been able to achieve with the intelligent tutor? Is it at a level that it can rival a human tutor? Could you give an example of a student challenge that the intelligent tutor is good at addressing?

  • Janice Gobert

    Presenter
    May 12, 2015 | 04:47 p.m.

    Hi- the agent Rex, a dinosaur, can scaffold students when they are not testing their stated hypothesis, are changing too many variables at once, make an incorrect inference from their data, fail to select the proper trials to warrant their claims, or give an explanation of their experiment that is too short.
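    The kinds of triggers Janice lists can be sketched as simple rule-based checks over a student's inquiry log. This is only an illustration, not the actual Inq-ITS implementation; all field names and the word-count threshold are hypothetical.

    ```python
    # Hypothetical sketch of rule-based scaffolding triggers like those
    # described above. Field names and thresholds are illustrative only.

    def scaffolding_triggers(log):
        """Return the list of scaffolds an agent might fire for one inquiry attempt."""
        triggers = []
        if log["tested_variable"] != log["hypothesized_variable"]:
            triggers.append("not testing stated hypothesis")
        if log["variables_changed_per_trial"] > 1:
            triggers.append("changing too many variables at once")
        if log["inference"] != log["correct_inference"]:
            triggers.append("incorrect inference from data")
        if not log["selected_controlled_trials"]:
            triggers.append("trials do not warrant the claim")
        if len(log["explanation"].split()) < 15:
            triggers.append("explanation too short")
        return triggers
    ```

    In a real tutor, each trigger would map to a scaffold message from the agent rather than a plain string.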

  • Ben Sayler

    Facilitator
    May 12, 2015 | 10:01 p.m.

    Excellent. That helps.

  • Brian Drayton

    Co-Principal Investigator
    May 12, 2015 | 10:28 a.m.

    The last time I tuned in to intelligent tutoring was, I dunno, 20 years ago, when people were building things like PASCAL tutors. How much more intelligent is your approach than those were — for example, to what extent does the system build a model of me when I’m using it? Is it able to provide “warmer/colder” sorts of guidance (rather than “Sorry! Try again!”)? And have you decided on some kinds of microworlds that are more tractable for this than others? If I were going to read one or two papers to catch up with this line of work, what would you recommend?

  • Janice Gobert

    Presenter
    May 12, 2015 | 04:49 p.m.

    Hi- we are using some analytic techniques that track students’ skill levels over time. The system provides four levels of scaffolding. Our assessment and tracking work across many domains in Physical, Life, and Earth Science.
    jg
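    One common way intelligent tutors track a skill estimate over time is Bayesian Knowledge Tracing (BKT), which updates the probability that a student has mastered a skill after each observed attempt. The sketch below is a standard BKT update plus a hypothetical mapping to four scaffold levels; the parameters and the mapping are illustrative assumptions, not necessarily the analytics this project uses.

    ```python
    # Sketch of Bayesian Knowledge Tracing (BKT), a standard skill-tracking
    # model in the intelligent-tutoring literature. Parameter values and the
    # four-level scaffold mapping are hypothetical.

    def bkt_update(p_known, correct, p_learn=0.1, p_guess=0.2, p_slip=0.1):
        """Update P(skill known) after one observed correct/incorrect attempt."""
        if correct:
            cond = (p_known * (1 - p_slip)) / (
                p_known * (1 - p_slip) + (1 - p_known) * p_guess)
        else:
            cond = (p_known * p_slip) / (
                p_known * p_slip + (1 - p_known) * (1 - p_guess))
        # Account for the chance the skill was learned during this step.
        return cond + (1 - cond) * p_learn

    def scaffold_level(p_known):
        """Map the skill estimate to one of four scaffold levels (4 = most help)."""
        if p_known < 0.25:
            return 4
        if p_known < 0.5:
            return 3
        if p_known < 0.75:
            return 2
        return 1
    ```

    Repeated correct answers push the estimate toward 1 and reduce the amount of scaffolding; errors pull it back down and increase support.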

  • May 12, 2015 | 03:21 p.m.

    I’m asking the same question as Brian. How to catch up? Thanks for your clear explanation and representation of your work.

  • Janice Gobert

    Presenter
    May 12, 2015 | 04:49 p.m.

    What do you mean by “how to catch up?”

    jg

  • May 12, 2015 | 05:37 p.m.

    Papers/articles in the field of intelligent tutoring?

  • Jodi Asbell-Clarke

    Director - EdGE at TERC
    May 12, 2015 | 09:25 p.m.

    Hi Janice – nice video! Is the eye tracking tool you are using customized to Rex, or can it be generalized to other settings such as games? Also, how do you decide what the student should be attending to? Is that always obvious in your tutorials? We are doing similar work in games and I am wondering how it translates.

  • Jackie DeLisi

    Facilitator
    May 14, 2015 | 10:35 a.m.

    Have you learned anything through the eye tracking data about students’ learning through visualizations?

  • Further posting is closed as the showcase has ended.