Zebrafish Test Cross
It was here that I first introduced students to the Base Explanation Rubric, which I use to assess their work. I asked students to grade a classmate's lab discussion using the rubric because I was curious to see how students perceived their ability to use claim, evidence, and reasoning. Among the 48 students who completed the peer grading, the averages were 2.9 for claim, 2.7 for evidence, and 2.8 for reasoning out of 3. Interestingly, these peer-assessed values for evidence and reasoning were much higher than any I had observed previously. When I evaluated students' work with the rubric, the 52 students I assessed averaged 2.9 for claim, 2.5 for evidence, and 1.7 for reasoning out of 3. Compared to the CRISPR trifold, students' average evidence scores were higher in both subsequent activities, the Zika virus article and the Zebrafish lab discussion. The greatest difference between my scores and the students' was in the category of reasoning. It is clear to me that students are confused by this component of the rubric and are unsure how to connect the evidence to the claim. According to Moje et al. (2004), reasoning is the most challenging component of a scientific explanation for students (p. 233).
As this was the students' first encounter with the Base Explanation Rubric, they reported to me that they found it vague and challenging to use. Their biggest complaint was that levels 1 and 2 on the rubric were defined identically. From my experience throughout the inquiry, I must agree with the students that these should be separate categories. While assessing students' work throughout the inquiry, I gave a 1 to students on the low end of the combined level 1 & 2 box and a 2 to those on the higher end. This is how I explained it to the students for peer assessment.
Artifact 9: Samples of student lab discussions using CER.