Presenters
  • Brian Belland, Associate Professor, Utah State University
  • Jiangyue Gu, Utah State University
  • Nam Kim, Utah State University
  • Mark Weiss, Utah State University

Project: CAREER: Supporting Middle School Students' Construction of Evidence-based Arguments
http://itls.usu.edu/bbelland/grants/career.php
Public Discussion

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 11, 2015 | 06:21 a.m.

    Very solid and important approach. What are some of the research investigations that students carried out? (or what questions are they trying to address?) What challenges did students and teachers encounter when they used your tools?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 11, 2015 | 03:36 p.m.

    Well, one investigation related to how water quality in a local river changed over time and as the river proceeded through the valley. Students then took various stakeholder positions to think about how water quality could be optimized for the people of the valley, and then needed to build arguments about this. In another, students analyzed soil quality in various locations in the valley to argue about what should be done with the vacant land where the soil samples were taken.
    One challenge is for teachers to know which students need the most help. In our most recent redesign, we built in a way for teachers to monitor student progress more easily.
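
    One way such a monitoring view might surface struggling students, purely as a hypothetical illustration (the student names, threshold, and data shape below are invented, not the project's actual tool):

        # Hypothetical sketch: flag students whose scaffold entries lag the class.
        completed_prompts = {"Ana": 7, "Ben": 2, "Cori": 6, "Dev": 1}  # invented data
        total_prompts = 8

        # Flag anyone who has completed less than half of the scaffold prompts.
        needs_help = [name for name, done in completed_prompts.items()
                      if done / total_prompts < 0.5]
        print(needs_help)  # ['Ben', 'Dev']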

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 12, 2015 | 09:34 a.m.

    Monitoring student progress is a great addition to the tool. I also am curious about how you capture the arguments or stakeholder positions across classrooms. I imagine that you are getting a range of responses from students and it might be great to illustrate or map some of them.

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 12, 2015 | 11:03 a.m.

    Well, there are the entries in the scaffolds: students need to write answers to prompts and build their arguments there. There is also the culminating activity, like a persuasive presentation, and then we capture process data through things like video.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 11, 2015 | 11:39 a.m.

    This sounds like important work that is really having an impact. Brian, in your video, you mention that you are working with teachers to help identify their needs, rather than giving them a polished curriculum to implement. I’m curious about the process of working one-on-one with the teachers. How many teachers have you been able to work with? How long do you estimate that you spend working with each teacher? Thanks.

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 11, 2015 | 03:32 p.m.

    We have worked with 7 teachers. The amount of time spent with each teacher is a bit hard to quantify, because there is some one-to-one interaction and some asynchronous online PD (which of course also involves us interacting with teachers, just differently). It also varies a little bit with the level of related experience of the teacher: we worked with three teachers who had never done any inquiry-oriented teaching, while the others had done some, so the first group needed more time. But for all of them, it is a lot more than a one-week workshop. I hope that this answers your question; if not, please let me know.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 12, 2015 | 01:57 p.m.

    Yes, it does. Between your response here and below to Nevin’s question, I’ve got a good sense of how you’ve worked to support teachers through the process. Thanks.

  • Brian Drayton

    Co-Principal Investigator
    May 11, 2015 | 01:37 p.m.

    I’m curious about two things. First, you report significant epistemic gains in high school; how about middle school? (I would expect a big difference, a priori.) Second, are there aspects of argumentation that seem particularly intractable or challenging for HS or MS students in your study? Are you seeing the sort of thing Deanna Kuhn has reported over the years, or does your scaffolding system uncover new issues (e.g., not necessarily related to the understanding of evidence)?

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 12, 2015 | 09:38 a.m.

    Important question: it would be good to take a deeper look at the nature of the argumentation that is emerging from these explorations (and at examples of it). Is there a way to capture the maturity or sophistication of students’ scientific argumentation over time?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 12, 2015 | 10:51 a.m.

    To address Dr. Drayton’s question: in one study, we found that simply the process of engaging in PBL led to significant improvements in epistemic beliefs among higher-achieving and average-achieving 7th grade students, with gains of .81 and .31 standard deviations (SDs), respectively. In another study, we found that the epistemic beliefs of 6th grade students did not improve from pre to post, but that their epistemic aims were substantially more sophisticated when the teacher was part of the microsystem in which they were engaging.
    To address Dr. Guilfoy’s question: there are different approaches. Often, people look at the structure of arguments, e.g., claims-evidence-premises. Clark Chinn recently proposed thinking about argument quality from the perspective of epistemic support.
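
    For readers unfamiliar with gains reported in SD units: a standardized mean difference is simply the pre-to-post change divided by a standard deviation. The sketch below uses made-up scores and divides by the pre-test SD as one common convention; it is not the project's data or analysis code.

        # Illustrative arithmetic only: hypothetical pre/post survey scores.
        from statistics import mean, stdev

        pre_scores = [2.6, 3.4, 2.9, 3.3, 2.8]   # hypothetical pre-unit scores
        post_scores = [3.0, 3.7, 3.2, 3.5, 3.1]  # hypothetical post-unit scores

        # Standardized gain: mean change divided by the pre-test standard deviation.
        gain_in_sds = (mean(post_scores) - mean(pre_scores)) / stdev(pre_scores)
        print(f"gain of {gain_in_sds:.2f} SDs")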

  • Jeffrey Birchfield

    Guest
    May 14, 2015 | 02:15 p.m.

    So I was wondering, what instrument are you using to measure epistemic beliefs? Also, when determining quality of argument, what is your grain size? Are you looking at individual pieces in a student’s argument and quantifying how many instances of evidence are provided for a claim, or simply the sophistication of the claim and its ability to persuade?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 15, 2015 | 04:37 p.m.

    That’s a great question; it is a complicated issue. For epistemic beliefs, we have a self-report survey that is adapted with permission from the one used by:
    Elder, A. D. (1999). An exploration of fifth-grade students’ epistemological beliefs in science and an investigation of their relation to science learning (Doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Full Text. (Publication No. AAT 9929819)
    But we also supplement that with video data of students as they are working during the unit.
    For looking at argument quality, it is again a complicated issue. We have developed various approaches that look at the structure and content of students’ arguments. We are also looking to integrate a new approach advocated by Clark Chinn: looking at the epistemic foundations of students’ arguments.
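
    To make the grain-size question above concrete, here is a purely hypothetical sketch of a coarse, structure-based scoring pass (invented class names, example data, and scoring convention; not the project's actual rubric):

        # Hypothetical sketch: count distinct evidence entries attached to each claim.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Claim:
            text: str
            evidence: List[str] = field(default_factory=list)  # evidence cited for this claim

        def structural_score(claims: List[Claim]) -> int:
            # One simple convention: a point per distinct piece of evidence per claim.
            return sum(len(set(c.evidence)) for c in claims)

        argument = [
            Claim("Nitrate levels rise as the river moves through the valley",
                  evidence=["June sampling data", "state water-quality report"]),
            Claim("The vacant lot's soil is too saline for crops",
                  evidence=["soil conductivity test"]),
        ]
        print(structural_score(argument))  # 3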

  • Nevin Katz

    Facilitator
    Technical Associate
    May 11, 2015 | 01:44 p.m.

    Tremendous project and I really like how you start the conversation with teachers before planning how to conduct the scaffolding. This sort of echoes Debra’s question, but I’m interested in knowing more about the process you use to co-create the curriculum supports and activities with teachers. Do you break the process down into particular phases? Are there definite things that you need to accomplish? That the teacher does?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 12, 2015 | 11:01 a.m.

    We have a generic computer-based scaffold to support the creation of evidence-based arguments, which we continually work to improve based on empirical studies. But we also develop content support by working with the teacher. This is to avoid having students just do a Google search and then unproductively go through the first search results. The type of support teachers need to provide in these kinds of units is contingent scaffolding, so much of what we need to do is help teachers learn to dynamically assess students’ current abilities and customize support on the spot to provide just the right amount. That is one of the hardest things to do as a teacher, and of course we do not think we have all the answers on how to do it, but we help teachers understand the nature of the different kinds of support they can choose from and what kinds of questions they can ask to dynamically assess students, and we try to have other teachers model this through video.
    As for the phases through which we work with teachers, I would say the first phase is really getting to know the teacher, her/his experiences, and which areas of the standards he/she sees as areas in which students need to improve. Next, we propose some ideas for units and work with the teacher to optimize those ideas.
    One report in which we discuss how we worked with one teacher is here:
    http://dx.doi.org/10.1007/s10972-015-9419-2

  • Nevin Katz

    Facilitator
    Technical Associate
    May 12, 2015 | 11:07 a.m.

    Very informative! What do others think of this project? Are there other questions people have for Brian about co-planning with teachers or implementing these learning experiences? Any ideas for Brian on ways to tweak this model?

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 13, 2015 | 09:16 a.m.

    I am wondering if you have done outreach to involve city or county officials, planners, or ecologists to take a look at the students’ work, perspectives, and arguments. It might be interesting for students to know that they will have an opportunity to present their arguments to the people responsible for decision making about these important issues. Also, are you linking topics to critical issues faced locally, globally, or in the news?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 13, 2015 | 11:17 a.m.

    Yes, we have had them present to county commissioners. We have been trying to tie what students do more into local issues, so that students feel they have a greater stake in what they are investigating. It seems to be easier to get student buy-in that there is a problem when it is a local issue that they can get their hands dirty investigating, rather than an issue at a more national or global level. Of course, the issues that they investigate (water quality, soil science, and so forth) are not just local issues; they come up in other areas as well.
    The area of the country where we are has a desert climate, and being in the Salt Lake watershed, there is a lot of land that is unusable for farming due to salt deposits and so forth. So it is comparatively easy to get kids to see that there are issues that directly impact their lives.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 13, 2015 | 07:29 p.m.

    Do you think the increase in student buy-in that you’re seeing in projects with local connections is something that you can measure (or are measuring) as an outcome of your work? Or is this more of an informal observation from your interactions with the students?

  • Brian Belland

    Lead Presenter
    Associate Professor
    May 15, 2015 | 04:33 p.m.

    Well, it is something that has been seen through informal interactions, but also somewhat through interviews. It is also predicted by motivation theory. I wouldn’t say that it is a major thrust of our research, but when you are dealing with students who largely have never done inquiry-based science before, you need to do what you can to make what they are doing potentially personally meaningful.

  • Mark Weiss

    Co-Presenter
    May 13, 2015 | 11:29 a.m.

    Problem design and problem presentation are areas where I think we have much to learn, especially in K-12 environments. In Barrows’s authentic PBL (aPBL), he has the advantage of medical students committed to a profession, clinical cases from real patients, serious consequences if their problem solving isn’t correct, and a propensity among the students to engage in a “willing suspension of disbelief” (Coleridge). Replicating these elements of aPBL in a K-12 environment is challenging, to say the least. Still, it has been amazing to read student writing after a unit or listen to their presentations, as they describe their journey from thinking that the water was “ok” to “the water isn’t really as great as we thought.” Their proposed solutions are often really quite creative and thoughtful.

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 14, 2015 | 04:03 p.m.

    Thanks to everyone for a rich discussion. I learned a lot.

  • Nevin Katz

    Facilitator
    Technical Associate
    May 14, 2015 | 09:37 p.m.

    Thanks all – the dialogue here was terrific.

Further posting is closed as the showcase has ended.