Constructing Explanations and Engaging in Argument from Evidence are two Next Generation Science Standards (NGSS) practices I have heavily emphasized in my classroom over the past few years. My immersion in NGSS professional development that focuses on these practices has allowed me to develop new ways to engage my students and assess their abilities.
I teach seventh grade in a selective enrollment school in Chicago. When I first started teaching, I used a traditional lab report rubric (Figure 1) to help scaffold my students' conclusion writing. The rubric focused on the skills we had worked on since the beginning of the year: collecting and analyzing quantitative and qualitative data, explaining data, and reflecting on the work done in the lab.
In the beginning, some students had difficulty explaining their data; they could only state some numbers or a qualitative change they had observed. As I reflected on their work, I realized they were providing lab analyses that were still very surface level. The assessment structure I used also restricted the explanations they made to lab reports, which happened only a few times per semester.
Modifying the Rubric
After a couple of years of seeing this trend of surface-level explanation, and after considerable professional development on the emergence of the NGSS, I began to change the rubric. My assessment moved to a focus on claims and evidence (Figure 2). I asked students to be more specific in determining what their data were telling them, i.e., whether the data supported or refuted the predictions they had made in the experiment. I also asked them to do more than just restate their data: they had to describe why it supported or refuted their hypothesis.
This approach led to better student work that was more focused on the analysis of specific data points and the explanation of what their data meant. While this approach allowed for student explanations to dig a bit deeper, there was still room for growth.
The third iteration of constructing explanations (Figure 3) came from work to align with our school's push toward proficiency-based learning (sometimes known as standards-based grading). I modified my rubric to target specific skills: the ability to make a claim, cite specific and appropriate evidence, and use reasoning to explain why the evidence supports the claim.
This allowed me to not only break down the skills into more concrete pieces, but also to expand on these same skills for non-lab assignments (Figure 4).
After isolating constructing explanations as a skill for assessing my students, I needed to find ways to provide feedback to them more frequently. Another type of assessment I use is writing environmental decision statements. I took some of what I was already assessing and modified my rubrics to show more clearly that what students were producing was their "claim" (decision) and that their evidence came from research and the consequence diagrams they made.
Now I can use a modified rubric to assess "claim, evidence, and reasoning." Students see the same skill move beyond hands-on labs into decision statement writing, solutions to "mystery" scenarios, and analysis of graphs and other data sets. Modifying my rubrics has allowed me to assess students more frequently than I did in the past, which lets students track their growth over the course of the semester or year.
To me, this is part of the power of the Next Generation Science Standards. They allow us to give students many contexts in which to use, be assessed on, and grow in the practices. Constructing explanations moves beyond writing the conclusion section of a lab report to frequently using the practice to make sense of phenomena in the classroom.