NSF Awards: 1320064
Our research program focuses on better understanding the social and pragmatic nature of conversation, and on using this understanding to build computational systems that can improve the efficacy of conversation between people, and between people and computers, to the benefit of human learning. To pursue these goals, we draw on approaches from computational discourse analysis and text mining, conversational agents, and computer-supported collaborative learning. Our research has launched and substantially contributed to the growth of two thriving, interrelated areas of research: Automated Analysis of Collaborative Learning Processes and Dynamic Support for Collaborative Learning, in which intelligent conversational agents support collaborative learning in a context-sensitive way. Our approach is always to start by investigating how conversation works and formalizing this understanding in models that are precise enough to be reproducible and that demonstrate explanatory power in connection with outcomes of real-world value. The next step is to adapt, extend, and apply machine learning and text mining technologies in ways that leverage that deep understanding to build computational models capable of automatically applying these constructs to naturally occurring language interactions. Finally, with the technology to automatically monitor naturalistic language communication in place, the next stage is to build interventions that lead to real-world benefits. Our technology has demonstrated significant impact on learning in dozens of classroom studies in math, science, and engineering.
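To make that pipeline concrete, here is a minimal, purely illustrative sketch of the kind of text-mining step it relies on: a classifier that labels chat turns with a discourse construct so that downstream support can be triggered. The label set, training examples, and model choice below are hypothetical stand-ins, not the project's actual models or data.

```python
# Illustrative only: a minimal text-mining pipeline of the kind described above,
# labeling chat turns with a hypothetical discourse construct ("reasoning" vs. "other").
# Training data, labels, and model choice are placeholders, not the project's real system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical hand-annotated examples (in practice, many thousands of turns).
turns = [
    "I think the current increases because resistance drops",
    "lol ok",
    "If we double the voltage the brightness should change, because power depends on V squared",
    "what page are we on",
]
labels = ["reasoning", "other", "reasoning", "other"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(turns, labels)

# Apply the construct to new, naturally occurring turns.
new_turns = ["Maybe it stays the same since the circuit is in parallel"]
print(model.predict(new_turns))  # prints the predicted label for the new turn
```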
Carolina Milesi
Senior Research Scientist
I wonder what grades or school levels the research on computer-supported collaborative learning has focused on. Would the expansion to machine learning be similar for elementary grades, middle school, high school, college, and adult learning?
Carolyn Rose
Associate Professor
In our own research, we have done similar work with middle school, high school, and college-age learners. The models do indeed need to be substantially different. Students have very different skill levels when it comes to articulation of reasoning, perspective taking, negotiating meaning, ideation, decision making, and so on. All of these things figure in. However, we have not found substantial differences in our ability to accommodate the differential needs and abilities of learners at these different levels, as long as we base our decisions about how to support them differently on data. All of our work is iterative and data-driven.
Tamara Moore
Associate Professor
I have often struggled with how to get students to engage meaningfully online. I was hoping to hear more about what you believe the impact of your work on other educators will be. How will you get your ideas propagated?
Carolyn Rose
Associate Professor
Hi Tamara, thanks for your questions!
First, as to the question of how to get learners to engage meaningfully online: our biggest success measure has been the impact of our interventions on pre- to post-test gains, but we have also examined the effect the facilitation agents have on the interaction between learners. Most recently, we have examined the impact of participation in synchronous chat activities in MOOCs, both in terms of their positive effect on discussion behavior compared with the kinds of interaction we see in other social spaces associated with the MOOC, and in terms of their impact on reducing attrition:
Ferschke, O., Howley, I., Tomar, G., Yang, D., Rosé, C. P. (in press). Fostering Discussion across Communication Media in Massive Open Online Courses, Proceedings of Computer Supported Collaborative Learning
Ferschke, O., Yang, D., Tomar, G., Rosé, C. P. (in press). Positive Impact of Collaborative Chat Participation in an edX MOOC, Proceedings of AI in Education
But we also have papers that explore the positive effect of the conversational computer agent facilitators on the intensity of the interaction between learners:
Dyke, G., Adamson, D., Howley, I., & Rosé, C. P. (2013). Enhancing Scientific Reasoning and Discussion with Conversational Agents, IEEE Transactions on Learning Technologies 6(3), special issue on Science Teaching, pp. 240-247.
Adamson, D., Dyke, G., Jang, H. J., & Rosé, C. P. (2014). Towards an Agile Approach to Adapting Dynamic Collaboration Support to Student Needs, International Journal of AI in Education 24(1), pp. 91-121.
Second, as to the impact on educators: Our partnership with the Institute for Learning, which I referred to in the 3-minute video, gave us more insight into the role computer-supported collaborative learning (CSCL) interventions can play in teacher professional development efforts. What we saw over our several years of involvement was that the CSCL activities acted as a catalyst. Since students came into the teacher-led whole-class discussions already having had a lively smaller-scale discussion, they were ready to take on a more agentive role in their interaction in the whole-class setting. That made it easier for the teachers to take up the facilitation techniques they were being trained to use. We have observed teachers feeling somewhat hesitant to include small-group activities in their classes, especially if discipline problems are an issue, because they are concerned about their students becoming unruly. That's where the technology comes in handy. Since we are able to monitor the discussions in real time and use the analysis to trigger the intervention of conversational computer agents, each small group can have its own facilitator. Thus, the technology is in many ways an augmentation of the facilitation the teacher is able to offer in his or her rounds around the classroom.
See:
Clarke, S., Chen, G., Stainton, K., Katz, S., Greeno, J., Resnick, L., Dyke, G., Howley, H., Adamson, D., Rosé, C. P. (2013). The Impact of CSCL Beyond the Online Environment, Proceedings of Computer Supported Collaborative Learning
Third, on dissemination: I am sorry I didn't have as much time to talk about this in the 3-minute video as I had planned, but it's great to have the opportunity to respond to your email here. All of the software we have developed is freely shareable. We have formed an open-source community called DANCE to reach out to MOOC instructors (dance.cs.cmu.edu). As for educators in other sectors, we are part of a Gates Foundation effort called the Digital Learning Network, where we are partnering with three community college systems, in California, Georgia, and Arkansas, to improve their online education offerings. We're also partnering with other large networks related to online learning, including the Open Learning Initiative, the Knowledge Building Community, and the Math Forum. We invite anyone who sees this entry to contact us about collaboration as well. We are always happy to work with new collaborators.
Thanks for the great questions,
Carolyn
Tamara Moore
Associate Professor
Thank you for the reading list. I will work on getting many of these into my library! I love that your software is free. Thank you for your work and for presenting it here.
David Lustick
Associate Professor
This is a topic I really appreciate, as I teach online and rely heavily on weekly discussions on targeted topics. However, the presentation seems more geared toward a professional audience than a general audience. What is the time frame of the discussions described in the video? Are they real-time chat sessions, or do the exchanges take place over a period of days? I am also unclear about what specific problem this software solves. Does it make the teacher's job easier? More efficient? More effective? How is it different from a teacher who actively engages a class of learners in an online discussion?
Carolyn Rose
Associate Professor
Hi David,
Thank you for your excellent questions! The work described in the video focuses mainly on synchronous discussions in which students participate in small groups, usually for 30-45 minutes. But we have also done work supporting asynchronous discussions that extend over much longer periods of time.
The problem the software solves is that it keeps an eye on each group of students in real time, which the teacher is not able to do while the class is divided into small groups, since the teacher can't listen to and participate in more than one discussion at a time. This frees up the teacher to focus comfortably on one group at a time without worrying about chaos ensuing in the other groups in the meantime.
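In very rough terms, the monitoring step can be thought of along the lines of the sketch below. This is not our actual classroom software; the signals, thresholds, and names are hypothetical simplifications meant only to illustrate how real-time analysis can flag a group for an agent's (or the teacher's) attention.

```python
# Illustrative sketch (not the actual classroom software): scan each small group's
# recent chat activity and flag groups where a facilitation agent should step in.
# All names, thresholds, and data structures here are hypothetical.
import time
from collections import defaultdict

# Hypothetical in-memory store of (timestamp, student, text) tuples per group.
now = time.time()
group_messages = defaultdict(list)
group_messages["group-1"] = [(now - 30, "ana", "I think the slope means speed"),
                             (now - 20, "raj", "right, so steeper means faster")]
group_messages["group-2"] = [(now - 400, "ben", "done with question 1")]

def needs_facilitation(messages, now, quiet_after=120, min_speakers=2):
    """Return a reason string if the group looks like it needs help, else None."""
    recent = [m for m in messages if now - m[0] < quiet_after]
    if not recent:
        return "quiet"            # nobody has posted recently
    if len({m[1] for m in recent}) < min_speakers:
        return "one-sided"        # a single student is doing all the talking
    return None

def monitor(groups, trigger_agent):
    """One monitoring pass; a real system would run this continuously."""
    for group_id, messages in groups.items():
        reason = needs_facilitation(messages, time.time())
        if reason:
            trigger_agent(group_id, reason)

# Example agent hook: in practice this would post a facilitation move into the chat.
monitor(group_messages, lambda g, r: print(f"{g}: trigger facilitation agent ({r})"))
```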
In the case of the asynchronous discussions, the work we have done has focused on increasing the rate at which students who start threads attract the appropriate people to participate in the discussion on their thread. You can read about that work in the following article:
Howley, I., Tomar, G., Yang, D., Ferschke, O., & Rosé, C. P. (in press). Expectancy Value Theory of Help Seeking Applied to Features in MOOCs, Proceedings of AI in Education
That paper describes an intervention called the Quick Helper, which leverages a social recommendation algorithm to pick out participants in a learning community who would be good choices to invite to a thread to enhance the discussion. The student who posts the thread starter is presented with the option of having an email sent to these selected people with an invitation and a link to the thread. The details of the social recommendation algorithm are published separately:
Yang, D., Adamson, D., & Rosé, C. P. (2014). Question Recommendation with Constraints for Massive Open Online Courses, in Proceedings of the 8th ACM Recommender Systems Conference
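For readers who want a feel for the idea, here is a deliberately simplified sketch of constrained helper recommendation in the spirit of the Quick Helper. The published algorithm is different and more sophisticated; the scoring rule, constraint, and data below are hypothetical.

```python
# Simplified illustration of constrained helper recommendation: rank candidate helpers
# by topical overlap with the question while avoiding over-inviting any one student.
# This is only a sketch, not the algorithm from the paper cited above.
from collections import Counter

def recommend_helpers(question_terms, candidates, invite_counts, k=3, max_invites=5):
    """Return up to k candidate names, best topical match first."""
    scored = []
    for name, profile_terms in candidates.items():
        if invite_counts[name] >= max_invites:       # constraint: don't overload a helper
            continue
        overlap = len(set(question_terms) & set(profile_terms))
        scored.append((overlap, name))
    return [name for score, name in sorted(scored, reverse=True)[:k] if score > 0]

# Hypothetical data: terms drawn from each candidate's past forum posts.
candidates = {
    "alice": ["gradient", "regression", "loss"],
    "bob":   ["forum", "deadline"],
    "chen":  ["regression", "overfitting", "loss"],
}
invite_counts = Counter({"alice": 5})                # alice already has 5 pending invites

question = ["why", "does", "my", "regression", "loss", "increase"]
print(recommend_helpers(question, candidates, invite_counts))  # e.g. ['chen']
```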
We have done a lot of work on automated analysis of online asynchronous discussions, which enables us to design a wide range of strategies for selecting invitees for this kind of social recommendation approach. Here are just two recent examples:
Xu, W., Yang, D., Wen, M., Koedinger, K. R., & Rosé, C. P. (2015). How does student’s cognitive behavior in MOOC Discussion Forums affect Learning, Proceedings of Educational Data Mining
Yang, D., Wen, M., Howley, I., Kraut, R., & Rosé, C. P. (2015). Exploring the Effect of Confusion in Discussion Forums of Massive Open Online Courses, in Proceedings of the Second ACM Conference on Learning @ Scale
Feel free to post follow up questions!
Best wishes,
Carolyn
David Lustick
Associate Professor
Thanks for the clarification. So, the platform helps a teacher track, record, and evaluate small-group discussions with greater fidelity than the old-fashioned 'analog' method of circulating and listening to groups work.
Does the platform interfere with the quality of the student to student interactions? How do you know?
david
Carolyn Rose
Associate Professor
Hi David,
Thanks so much for your interest! That's an interesting question. The answer would differ depending on the age group, the task, and what you mean by "interfere."
There have been many published media studies comparing communication face-to-face and via video, audio, and chat, with differential effects depending on a myriad of contextual factors. We don't typically include a face-to-face control in our studies. What we have formally evaluated is the extent to which the support we add to the chats improves or impedes communication and learning. We have a series of published positive results showing net improvements in both in the supported conditions; however, not every design we have tested has led to positive effects. Rather, we have learned how important it is to have data-driven design principles. Some principles we have identified that have empirical support in our research are:
As much as possible, the agent’s intervention should be at the discretion of the learner
The agents should show openness to ideas
The agent should seek to break tension through sparing use of light humor or off-topic conversation
The agent should challenge the students and not offer support that is not needed
Best wishes,
Carolyn