
Public Discussion

  • Jessica Hunt

    Facilitator
    May 11, 2015 | 04:07 p.m.

    This is a very interesting project. I am wondering what initial supports are put into place for lesson planning and implementation (i.e., ideas for facilitation, questioning, tasks, etc.) prior to any lesson analysis/reflection. Do you find that there are any specific challenges teachers experience initially?

    I also would love to hear more about the lesson analyses teachers engage in as they reflect on their practice. Are there guiding questions or other means of orienting and/or supporting teachers’ noticing of specific elements of their practice?

  • Jody Bintz

    Co-Presenter
    May 11, 2015 | 07:36 p.m.

    Early in the program, we lay a strong foundation for lesson/video analysis by helping participants connect to key pieces of research (e.g., findings from How Students Learn: Science in the Classroom and from the TIMSS video study) as they deepen their understanding of the two lenses and their associated strategies. The lenses and strategies (along with important science ideas or common student ideas) become the focus of the lesson/video analysis; they are what we expect teachers to notice (or not!). Initially, teachers analyze video of other teachers, then eventually analyze video of one another, all in a study group setting. This progression helps build a community in which they can safely analyze one another's video. The video they analyze of one another is captured when they implement "STeLLA Lesson Plans," which serve as initial supports for learning the lenses/strategies. The lesson plans are specifically designed to call out the strategies, possible teacher questions, and likely student responses.
    As for the specific challenges teachers experience initially: the challenges are as different as the teachers. Commonly, teachers struggle in two main areas: 1) most struggle to deepen their content knowledge, and 2) some struggle with how the program challenges their beliefs about teaching and learning (i.e., some teachers really struggle to develop a student thinking lens). As you may be able to tell, the program is transformative, and that involves cognitive dissonance and time to resolve it and develop a new repertoire of classroom practice.

  • Jessica Hunt

    Facilitator
    May 11, 2015 | 07:40 p.m.

    Thank you for sharing! I see this a lot in my mathematics pre-service teacher education courses (i.e., deepening of content knowledge and beliefs about what it means to teach – and learn – mathematics). It is interesting that you are also experiencing these challenges. Do you find these challenges dissipate over time?

  • Jody Bintz

    Co-Presenter
    May 11, 2015 | 07:48 p.m.

    Each teacher/group is different, but those who grow the most think differently, or very differently, about teaching and learning by the end of the 1-year program. Then it’s time to work with a whole new study group, or, in my favorite world, a group of new PD leaders.

  • Neil Plotnick

    Facilitator
    May 11, 2015 | 09:29 p.m.

    Your analysis indicates that teacher content knowledge was enhanced relative to the control group. What factors do you believe most consistently boost this ability? Does the increase come about holistically through the mechanisms of STeLLA lesson planning, or from the intrinsic nature of collaboration with peers?

  • Christopher Wilson

    Presenter
    May 11, 2015 | 10:31 p.m.

    Hi Neil – that’s a great question, and one we’ll be looking at more in subsequent analyses. We know that deepening teachers’ content knowledge is important, and our two groups of teachers did that in different ways. Our comparison group worked with university science faculty during a summer institute and in sessions throughout the year, whereas the STeLLA teachers spent half of their summer institute with science faculty and half with STeLLA PD leaders working on lesson analysis, then spent their sessions throughout the year examining their own and each other’s teaching through the two STeLLA lenses. Both groups received the same number of hours of professional development. Both groups of teachers and students showed significant gains on their science content tests, but the much greater gains in the STeLLA group (teachers and their students) indicate that deepening content knowledge in the context of analyzing practice (reflecting on student thinking and science content storylines) is very important. We also measured the teachers’ pedagogical content knowledge and classroom practice, and we are currently analyzing those data to see which aspects of teacher learning are most predictive of student learning.

  • Zenaida Aguirre Munoz

    Associate Professor
    May 11, 2015 | 11:28 p.m.

    What measures did you use for PCK and classroom practice?

  • Christopher Wilson

    Presenter
    May 11, 2015 | 11:46 p.m.

    Hi Zenaida. We measured PCK using an instrument in which teachers watch short segments (3-5 minutes) of carefully selected classroom video and are asked to write about what they see in the video. On pretests, we see teachers writing about things like classroom management and the ways in which the classroom is similar or dissimilar to their own. On posttests, teachers move to providing detailed accounts of student thinking and the teacher’s pedagogical moves. A quite detailed and time-consuming rubric is used to score teachers’ responses, so we are working in another grant to use lexical analysis and machine learning to train computers to do the scoring (http://www.nsf.gov/awardsearch/showAward?AWD_ID...). PCK is a tricky construct to measure, though, and we’ve benefited greatly from the work done at the BSCS PCK Summit (http://pcksummit.bscs.org/).
    Classroom practice is being measured by collecting classroom video of the teachers before and after participation in the professional development. A similarly resource-intensive process is involved in scoring these data, and we have no measurement solution to that issue yet!
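    The automated-scoring idea mentioned above can be illustrated with a toy sketch. This is purely hypothetical and not the project's actual pipeline: it stands in for "lexical analysis and machine learning" by comparing a written response to invented rubric-level exemplars with bag-of-words cosine similarity; the rubric levels and exemplar sentences are made up for illustration.

    ```python
    # Hypothetical sketch of automated rubric scoring (NOT the project's pipeline):
    # assign a written response to the rubric level whose exemplar text it most
    # resembles, using a simple bag-of-words cosine similarity.
    from collections import Counter
    import math

    def bow(text):
        """Lowercased bag-of-words vector as a Counter."""
        return Counter(text.lower().split())

    def cosine(a, b):
        """Cosine similarity between two Counter word vectors."""
        dot = sum(a[w] * b[w] for w in set(a) & set(b))
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    # Invented exemplars for two rubric levels: surface observations vs.
    # attention to student thinking.
    EXEMPLARS = {
        "surface": bow("the classroom was noisy and the teacher managed behavior"),
        "student_thinking": bow("the student reasoned that the shadow changes "
                                "because the light source moved"),
    }

    def score(response):
        """Return the rubric level whose exemplar is most similar to the response."""
        vec = bow(response)
        return max(EXEMPLARS, key=lambda level: cosine(vec, EXEMPLARS[level]))

    print(score("students explained their reasoning about why the light "
                "made the shadow longer"))  # -> student_thinking
    ```

    A real scorer would of course need labeled training data, richer lexical features, and validation against human raters; this only illustrates the shape of the approach.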

  • Meredith Thompson

    Post doctoral fellow
    May 13, 2015 | 11:28 a.m.

    Interesting program! Is there an ongoing community in which STeLLA teachers’ teaching is supported by BSCS and by one another? Thank you!

  • Christopher Wilson

    Presenter
    May 13, 2015 | 01:45 p.m.

    Hi Meredith. We continue to work with some of the STeLLA teachers, but we’re currently looking into ways to formalize ongoing collaborations, potentially through researcher-practitioner partnerships. It’s always frustrating to get to the end of grants like this and not be able to continue to support and work with people who have benefited so greatly from the collaboration. We’re hoping we can leverage the research data in that regard. We also have an MSP project in California, in collaboration with Cal Poly Pomona, that has district-wide scaling and sustainability of STeLLA PD as a primary goal from the outset.

  • Stephanie Teasley

    Facilitator
    May 14, 2015 | 09:04 p.m.

    Really exciting project with some significant empirical results. My comments are on the video itself: there are too many moving graphics, and I found myself squinting at the tiny videos surrounded by lots of white space.

  • Further posting is closed as the showcase has ended.

  Christopher Wilson, Senior Science Educator, BSCS (http://www.bscs.org)
  Jody Bintz, Senior Science Educator, BSCS (http://www.bscs.org)
  STeLLA: Science Teachers Learning from Lesson Analysis (http://bscs.org/stella)

Student Learning from the BSCS STeLLA Professional Development Program
NSF Award #: 0918277

The NGSS heighten the need for knowledge about models of professional development (PD) that support high-quality science teaching and learning. While researchers have proposed a consensus model of effective PD (Desimone et al., 2009), few studies examine this model empirically, and fewer still look at impacts on student learning. Also missing from PD research are lines of inquiry that study PD programs in increasingly rigorous ways over time, as mapped out in Borko’s seminal paper (2004) calling for more systematic PD research.

This video describes a rich line of research around an analysis-of-practice PD program that addresses these research gaps. The program develops teachers’ science content knowledge alongside their pedagogical content knowledge (PCK) via lesson analysis, in which teachers examine their own and each other’s teaching through student thinking and science content storyline lenses. The research tests the consensus PD model by exploring impacts on teacher content knowledge, teacher PCK, teaching practice, and student learning via quasi-experimental and experimental studies.