Edys Quellmalz
http://www.wested.org/personnel/edys-quellmalz/
Director Technology Enhanced Assessment and Learning Systems
SimScientist Assessments: Physical Science Links
http://simscientists.org
WestEd

Matt Silberglitt
Senior Research Associate
SimScientist Assessments: Physical Science Links
http://simscientists.org
WestEd
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Jackie DeLisi

    Facilitator
    Research Scientist
    May 11, 2015 | 11:41 a.m.

    This is a very interesting project. Can you say more about how you are establishing the validity of the assessment?

  • Edys Quellmalz

    Lead Presenter
    Director Technology Enhanced Assessment and Learning Systems
    May 11, 2015 | 05:16 p.m.

    We begin with external reviews of alignment, science content, and item quality, then conduct cognitive labs, followed by analyses of reliability and validity. Please go to simscientists.org to find the article published on this (Quellmalz, Timms, et al., 2012).

  • Randy Kochevar

    Facilitator
    Senior Research Scientist
    May 11, 2015 | 06:35 p.m.

    These look like really engaging simulations! What kind of teacher training goes into implementing them in a classroom?

  • Matt Silberglitt

    Co-Presenter
    Senior Research Associate
    May 11, 2015 | 08:01 p.m.

    We provide teacher professional development to participants in our research projects. This has been either a primarily face-to-face workshop, with ongoing support, or, more recently, a web-based program. Each type of teacher professional development is about 8–12 hours total, although the web-based program is designed to allow for more and deeper exploration of model-based learning, formative assessment, and how these strategies apply to SimScientists.

  • Kathy Perkins

    Director
    May 12, 2015 | 12:58 a.m.

    It’s great to hear about how you’ve been approaching new forms of assessment with the sims.
    What types of data are you collecting from the sim interaction, and how important is that to your assessment?
    Also, what are some of the effective teacher facilitation strategies you see teachers using around these tools when engaging in formative assessment?

  • Edys Quellmalz

    Lead Presenter
    Director Technology Enhanced Assessment and Learning Systems
    May 12, 2015 | 01:23 p.m.

    The SimScientists assessments use Evidence-Centered Assessment Design (ECD) to specify the knowledge and skills to be assessed (Student Model), the features of the assessment tasks and questions that will elicit explicit evidence of learning on the targets (Task Models), and the rules for collecting, scoring, aggregating, and reporting the evidence for the targets (Evidence Model). These three components for designing and collecting the student data are ESSENTIAL for using student responses as evidence to support claims about student learning of the specified science knowledge and practices.

    The curriculum-embedded simulation assessments collect data on the targeted NGSS core ideas and practices specified at the outset of the design of the assessments. Student responses are collected on tasks and questions requiring building models; designing, conducting, saving, and interpreting investigations; and writing explanations, critiques, and arguments. The Learning Management System underlying the simulation interactions uses scoring and aggregation rules to produce Progress Reports on the targeted science knowledge and practices. Reports classify student responses to the questions and tasks for these content and practice targets into three categories: “On Track,” “Progressing,” and “Needs Help.” (An illustrative sketch of this aggregation appears after this reply.)

    In the professional development provided before teachers use the SimScientists simulation-based assessments, and then during implementation of the assessments, strategies are shared for using the Progress Reports as a formative assessment resource to adjust instruction. Ideas for grouping students, based on the Progress Reports, for off-line classroom Reflection Activities are also shared and developed during the professional development.

    An end-of-unit simulation-based benchmark assessment provides a summative proficiency report on a student’s performance on the specified content and practice targets.
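As a rough illustration of the scoring and aggregation described in this reply, the sketch below shows one way per-target scores could be rolled up into the three Progress Report categories. It is a minimal, hypothetical example in Python, not the SimScientists Learning Management System; the target names, score scale, and cut points are assumptions.

    # Hypothetical sketch only: target names, the 0.0-1.0 score scale, and the
    # cut points are assumptions, not values from the SimScientists Evidence Model.

    def classify(proportion):
        """Map a per-target score (0.0-1.0) to a Progress Report category."""
        if proportion >= 0.8:
            return "On Track"
        if proportion >= 0.5:
            return "Progressing"
        return "Needs Help"

    def progress_report(scored_responses):
        """Aggregate (target, score) pairs per target, then classify each target."""
        scores_by_target = {}
        for target, score in scored_responses:
            scores_by_target.setdefault(target, []).append(score)
        return {target: classify(sum(scores) / len(scores))
                for target, scores in scores_by_target.items()}

    # Example with two hypothetical targets from an embedded assessment:
    report = progress_report([
        ("building models", 0.9),
        ("building models", 0.7),
        ("interpreting investigations", 0.4),
    ])
    print(report)
    # {'building models': 'On Track', 'interpreting investigations': 'Needs Help'}

In the actual assessments, the collection, scoring, and aggregation rules are specified per target during design as part of the Evidence Model described above.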

  • Ben Sayler

    Facilitator
    Professor of Physical Science and Mathematics
    May 12, 2015 | 08:01 a.m.

    Fascinating work. Having students conduct an experiment by way of simulation seems like an excellent tool for regular instruction — not just as a way to assess. Might teachers be able to use your simulations steadily throughout their teaching, thereby making assessment essentially indistinguishable from instruction?

  • Edys Quellmalz

    Lead Presenter
    Director Technology Enhanced Assessment and Learning Systems
    May 12, 2015 | 01:30 p.m.

    We are developing strands of SimScientists assessments for middle school curricula in life, physical, and Earth science. There are three or four suites of SimScientists assessments in each of those three content areas. In the Reflection Activities following student use of one of the curriculum-embedded simulation assessments, teachers can use the Progress Reports on student performance to provide follow-up, adjusted instruction. One of the strategies is for teachers to use the simulations to review and reinforce content and/or practices needing more attention.

    We are also developing simulation-based instructional modules to provide more review and reinforcement of core ideas, practices, and cross-cutting concepts. Please go to the website, simscientists.org, for more information on the projects and resources.

  • Ben Sayler

    Facilitator
    Professor of Physical Science and Mathematics
    May 13, 2015 | 09:59 a.m.

    Will do. Thanks!

  • Meredith Thompson

    Postdoctoral fellow
    May 13, 2015 | 02:10 p.m.

    The simulations are wonderful. I especially like how the water molecules are vibrating even when water is in the solid state!

    I am curious about the details of the task and evidence models for the assessments. Are they on your website? I have read your J. Ed. Psych. article from 2013 but did not see the specifics on those aspects of the assessment. For example, how is a “designing experiments” simulation scored in a SimScientists assessment? Are the criteria established within the team, or by reference to the NGSS, or elsewhere?

    Thanks!

  • Edys Quellmalz

    Lead Presenter
    Director Technology Enhanced Assessment and Learning Systems
    May 13, 2015 | 02:59 p.m.

    The specifications for the task and evidence models for the simulation tasks are internal design documents and not published. Criteria for scoring the investigations’ designs, trials, and interpretations are developed by our team and referenced to the NGSS.

  • Meredith Thompson

    Postdoctoral fellow
    May 13, 2015 | 03:02 p.m.

    OK! Thank you.

  • Further posting is closed as the showcase has ended.