  1. Michelle Wilkerson-Jerde, The SiMSAM Project, Tufts University
  2. Chelsea Andrews, Graduate Research Assistant, The SiMSAM Project, Tufts University
  3. Brian Gravel, The SiMSAM Project, Tufts University
  4. Yara Shaban, Graduate Research Assistant, The SiMSAM Project, Tufts University
Public Discussion
  • Carolina Milesi

    Facilitator
    May 11, 2015 | 03:50 p.m.

    Do students use SiMSAM differently in a regular classroom or in an after-school environment? Does the effect of SiMSAM vary across these contexts?

  • Michelle Wilkerson-Jerde

    Presenter
    May 12, 2015 | 09:34 p.m.

    Hi Carolina!

    There are certainly differences in how we are able to enact SiMSAM activities in classrooms versus after-school environments. For example, after-school environments often have more fluid attendance, and in many programs students do not meet every day. So we have worked to find strategies that support persistence and continuity in those environments (making videos of the modeled phenomenon that students can consult, creating discussion summaries, re-introducing driving questions). Youth are also more flexible in what they expect from after-school activities and what they think is expected of them, which can lead to richer and more comfortable “science talk” earlier on. We haven’t yet had an opportunity to look deeply at comparing outcomes or modeling patterns across those contexts, but those are some dimensions I’d expect to matter.

  • Edys Quellmalz

    Director Technology Enhanced Assessment and Learning Systems
    May 11, 2015 | 07:40 p.m.

    What measures do you use to monitor learning? What NGSS standards do you assess?

  • Michelle Wilkerson-Jerde

    Presenter
    May 14, 2015 | 09:50 a.m.

    Hi Edys!

    There are some elements of this in my other responses, so I’ll try not to repeat here.

    With respect to NGSS standards specifically, the nature of the tool and tasks we have been using at this stage of the project means we focus a lot on Matter and Its Interactions as a disciplinary core idea. For practices, we have been focusing on developing and using models, and on using mathematics and computational thinking. Specifically, we are interested in the intersection of these two: how kids learn to use the language of SiMSAM – the available functions and ways of “slicing up the world” – as the building blocks for models that help them explore new or different theories about how scientific events take place.

    We do most of this analysis by looking in detail at what students are doing moment to moment during the activity, through things like video and discourse analysis (often synchronized with students’ on-screen actions to see how they are using the tool). But given the exploratory nature of the project, we’re looking not just at whether but at how these understandings come to be over time. So while we do compare against and look for evidence of NGSS ideas and practices, we also try to document other productive ideas and ways of thinking kids might be leveraging – even ones that aren’t strictly aligned with the standards – as long as they are helping students move forward in their understandings.

  • Tamara Moore

    Facilitator
    May 12, 2015 | 01:42 a.m.

    I love the modeling aspects of this project. Can you tell us more about the kinds of questions you ask and where they come from (such as how condensation forms on pop bottles)?

  • Edys Quellmalz

    Director Technology Enhanced Assessment and Learning Systems
    May 12, 2015 | 05:27 p.m.

    Our science team identifies the phenomena that are important to model for the particular middle school life, Earth, and physical science units. We use a systems framework of components, interactions, and emergent system behavior to shape the models to be used, investigated, and built.

  • Janice Gobert

    Associate Professor, CEO
    May 12, 2015 | 05:08 p.m.

    Hi – what are you doing in the form of assessments for this project?

    Janice

  • Michelle Wilkerson-Jerde

    Presenter
    May 12, 2015 | 09:28 p.m.

    Hi Janice! The vast majority of our work involves content analysis of student discourse during the activity – identifying patterns and shifts in students’ content focus, mechanistic reasoning, and engagement in practices as they work to construct and revise their models across media. We do this looking primarily at the student group level, and also looking at the level of classroom discourse when appropriate.

    Depending on the setting, we sometimes administer pre-post tasks as well; these usually focus on content-specific ideas like the particulate nature of matter and random motion. We also conduct clinical interviews with focal students over the course of the activity.

    An example of the content analysis is described in Wilkerson-Jerde, M. H., Gravel, B. E., & Macrander, C. A. (2015). Exploring shifts in middle school learners’ modeling activity while generating drawings, animations, and simulations of molecular diffusion. Journal of Science Education and Technology, 24(2-3), 396-415. doi: 10.1007/s10956-014-9497-5.

  • Edys Quellmalz

    Director Technology Enhanced Assessment and Learning Systems
    May 12, 2015 | 05:35 p.m.

    The project is to design a balanced, multi-level science assessment system composed of curriculum-embedded assessments, end-of-unit summative assessments, and year-end assessments that are simulation-based. We also administer pre/post conventional tests. The process for studying their alignments with NGSS, science quality, and technical quality (reliability and validity) is described in the JRST 2012 article: Quellmalz, E. S., Timms, M. J., Silberglitt, M. D., & Buckley, B. C. (2012). Science assessments for all: Integrating science simulations into balanced state science assessment systems. Journal of Research in Science Teaching, 49(3), 363–393.

  • David Lustick

    Facilitator
    May 13, 2015 | 08:59 a.m.

    I love this project! I love the combination of analog artistry and digital manipulation. Kudos to the team for such impressive work! What is the learning curve for students to become comfortable with the software? How much time is dedicated to learning the technical aspects of the program versus using the program effectively? How are such time investments justified when there are so many learning objectives in a year-long science course? What are some of the other questions students need to address with this approach? The two questions shared in the video were great, but where can we find all the questions? Who came up with the questions, and how were they selected?

  • Michelle Wilkerson-Jerde

    Presenter
    May 13, 2015 | 11:34 a.m.

    Thank you David!

    So far we’ve found that students take to the software relatively easily. They are usually up and running with stop motion animation within a few minutes, and they are usually constructing simulations with chains of interactions within the first or second day. Since the simulations are fundamentally based on student-created objects, there is less of a distinction between “learning the software” and “making the model” – they are learning how to make their objects behave in ways they want. That isn’t to say there aren’t frustrating or difficult aspects, but we work hard to make sure the difficulty is about what is actually happening in the system, and not just about how to make the simulation go. Most of our workshops and classroom/after-school activities take place over 4 or 5 days for one modeling task.

    Time is always an issue for teachers and people interested in classroom practice. We try to think about scientific practices – things like questioning, representing, modeling – and content in an integrated way that is aligned with grade-level curriculum. So we don’t see a given activity as just about one thing like learning to model or program, but also as giving students time to really dig into what things like evaporation or air pressure really are, and to engage in rich science talk and debate. With this approach we are trying to focus not just on this particular activity, but also on helping students and classrooms develop scientific norms and ways of speaking precisely about mechanism and causality that we hope will be useful for other lessons (this is something we’re looking forward to exploring more deeply in the future!), and for how they view science and themselves more broadly.

    Most of our prompts are adapted from or inspired by existing research on representation, molecular theory, and technology-mediated modeling. They all involve invisible molecular processes that Co-PI Gravel has termed “experiential unseens.” Some sources we’ve drawn on so far are the IQWST/MoDeLS projects (for evaporation, condensation, and diffusion questions), the work of Tytler and Johnstone (evaporation/nature of matter), and Wright (sound propagation). Sometimes prompts include physical setups, like building a water still and exploring with materials, then asking why, for example, the water that collects in the still does not contain food coloring or salt even if the source water does. Depending on teacher and program needs, we may develop or adapt questions. The idea of sharing a list of questions somewhere for people to consult is great, and I expect such a list would continue to grow!

    Here is a reference to the “experiential unseens” chapter: Gravel, B. E., Scheuer, N. & Brizuela, B. M. (2013). Using representations to reason about air and particles. In B. Brizuela & B. Gravel (Eds.), Show me what you know: Exploring student representations across STEM disciplines. New York, NY: Teachers College Press.

    Here is a recent paper we resonate with that talks about the connectedness of learning content and modeling practice: Manz, E. (2012). Understanding the codevelopment of modeling practice and ecological knowledge. Science Education, 96(6), 1071-1105.

  • David Lustick

    Facilitator
    May 13, 2015 | 02:30 p.m.

    Michelle,
    Thanks for such a thorough and helpful response. I am particularly interested in the “experiential unseens” concept. In my own work, questions like the ones you have identified can serve as the basis for high-quality “focus questions” – questions that promote sustained reasoning by students over time. What is your project website?

  • Michelle Wilkerson-Jerde

    Presenter
    May 14, 2015 | 09:40 a.m.

    This is great – I really appreciate the focus on questions in your work. We have done some professional development in which teachers develop questions for use with SiMSAM, and your work can inform how we go about that. The project website is http://sites.tufts.edu/simsam/ but there is also a lot of information available at our lab website, extech.tufts.edu.
