Assessment Team Building Best Practices into Updated Units

Tuesday, March 29, 2022
As Mi-STAR updates its units, the assessment team has been in high gear, adapting to the changes and striving to reflect the highest standards of an evolving pedagogy.

The changes are an opportunity to keep Mi-STAR at the forefront of science education. “I think it's fair to say that the understanding of assessment is changing over time, as the Next Generation Science Standards are new for everyone,” said Gregg Bluth, senior research scientist for assessment and curriculum. “I've been learning from national education conferences about how rapidly the field is evolving.”

Under the standards, assessment has become more than just a snapshot of student knowledge. “We integrate assessments with the curriculum in such a way that students practice what they are learning,” he said. “I've been finding at conferences that what we are doing is a best practice.”

“The team is undertaking three types of revisions—high-, medium-, and low-level—as it revisits its original work,” said Bluth. High-level assessment rebuilds are reserved for units that are being completely revised, such as 6.6: Ecosystems, 7.1: Energy Planning, and 8.1: Bed Bugs. The PrePost and Embedded Assessments for these units are being revised to reflect the new content.

During a high-level revision, the team updates the questions to ensure that all are centered on an engaging phenomenon for students to make sense of, using the three dimensions of NGSS. Questions are scaffolded to assist students in the sense-making process. Finally, the PrePost is shortened to make it more student- and teacher-friendly.

Medium-level revisions are underway for other units, such as 6.1: Water Cycle, 7.3: Designing Dog Crates, and 8.3: Sound and Light. These assessments are being updated to address teacher feedback and incorporate NGSS-aligned best practices. In addition, phenomena and formatting are being refreshed.

Lastly, the team is conducting low-level upgrades for all Mi-STAR units to standardize format and style and provide an assessment inventory table for teacher use. This table provides an at-a-glance view of which NGSS dimensions are covered by each question.

In the past, teachers often relied on straightforward true-false and multiple-choice tests. “They were fast and easy to grade, but that’s not what NGSS is all about,” said Bluth. Instead, the team consolidates questions around phenomena, which are like next-generation story problems that yield insights into the three dimensions of science learning. As Mi-STAR has evolved, so have the phenomena.

“We’re now trying to cover all of a unit’s subcomponents using the fewest phenomena possible,” said Barbara McIntyre, curriculum development and outreach associate. The aim is to guide students through assessment quickly and efficiently. “Each new phenomenon increases the reading load for students, so if we can cover all the subcomponents using just two or three phenomena, that’s ideal.”

Finding just the right phenomenon can be tricky. “Once we had 96 pages of draft questions we tried that didn’t work,” Bluth said.

One phenomenon they use involves ants and aphids living together on the same plants. They’re not fighting; they’re not eating each other. What’s going on? “We make them answer questions that help them make sense of the phenomenon,” said McIntyre. By the time students are done with the assessment, they will have articulated a statement, identified patterns in data, interpreted bar graphs, and read scientific material critically—all part of the NGSS framework.

The team still uses multiple-choice questions, but with a twist. “If we can do multiple-choice questions, we do, because they are easier to grade. But we often ask students why they chose that answer—we use the word ‘describe’ a lot,” she said. “If they get it wrong, the teacher can tell where students are falling apart.”

The team wants to make the teacher's job as easy as possible, so assessments are designed to allow them to pick and choose what works. “We encourage teachers to give the entire assessment bundle, but if that doesn’t work for them, they can refer to the assessment inventory table.”

Included with each unit, the table lists all of the SEPs, DCIs and CCCs in the unit and the lessons that address them. “If the teacher only has time to cover one item, this helps them choose what to focus on,” said McIntyre.

As educational research advances, Mi-STAR’s assessment team plans to adapt the curriculum accordingly. “An advantage we have is the ability to respond quickly to changes,” said Bluth. “Plus, the feedback from our teachers and their students is invaluable.”

So far, the feedback has been both helpful and enthusiastic. “New teachers are eating this up,” he said. “Preservice teachers are getting trained in NGSS, and Mi-STAR is a treat for them.”

In addition to Bluth and McIntyre, Mi-STAR’s assessment team includes Brian Gane, assessment specialist at the University of Illinois at Chicago; and Ashley Poole, a teacher in the Kalamazoo Public Schools.

Image: By viamoi, originally posted to Flickr as “ant aphids,” uploaded using F2ComButton. CC BY 2.0.


Copyright © 2024 Mi-STAR
Mi-STAR was founded in 2015 through generous support provided by the Herbert H. and Grace A. Dow Foundation. Mi-STAR has also received substantial support from the National Science Foundation, the MiSTEM Advisory Council through the Michigan Department of Education, and Michigan Technological University.