This semester I've been purposeful about rewriting my Algebra unit tests to be more in line with the way I teach. Our Algebra team has focused on engaging all students, regardless of their readiness, during our lessons this year. I've seen a huge improvement in student engagement through low-floor, high-ceiling tasks and differentiated instruction. I noticed that while daily lessons were fostering a growth mindset in students, we were still giving fixed-mindset assessments. It's been a huge benefit to collaborate with Kelly McBride, Megan Thornton, and Dave Sladkey on these new tests. In the past it's been hard to talk about assessment with an online community because we don't want to compromise test security. However, if we're going to change teaching practice, which the #mtbos is doing really well, we have to honor students' learning with appropriate measures of understanding.

If you just want to see what kinds of questions we're asking, click on the links to the tests below. If you want to know about the process, rationale, and experiences behind these tests, keep reading.

As we focus on the Standards for Mathematical Practice during class, it's important that we challenge students to use all of these practices during the tests as well. Many years ago we made a department goal to include an error analysis question on every test. I like that students know they will see this type of question and that they are expected to write and explain their understanding on the test. Here's an example of how we modified an error analysis question on our Solving Quadratics Test. Our previous error analysis question for this unit didn't require students to demonstrate understanding; instead, they were expected to memorize a formula. On our new error analysis question, students are demonstrating MP3: Construct viable arguments and critique the reasoning of others.

Besides error analysis, we've added open-ended questions to our tests. These allow students to solve a problem in a way that makes sense to them. We honor students' differences during class and celebrate unique answers. I'd like to be able to celebrate students' creativity on their tests as well.

Exponential Functions Test (I left off the skills part of this test and am just showing the differentiated piece of the test)

**Differentiation**

In our classrooms, we differentiate by readiness and by choice. Differentiating by readiness on a test is something that I'm still wrestling with and trying to wrap my mind around. We have had great success with differentiating by choice on tests. Our first attempt at this was the Exponential Functions test. Most of the test was pretty traditional, covering topics such as determining whether a table represents a linear or exponential function, graphing exponentials, determining end behavior, and an exponential vs. linear growth word problem. Influenced by Jo Boaler, we've been having students analyze patterns during class. Because of this focus, we wanted a pattern question on the test. By giving students a choice of which pattern to analyze, students approached the question with more confidence. They weren't just responding to what I wanted to know; they were creating their own test. It was neat to see students take advantage of choice. Their choices weren't always what I expected, and there was a wide variety across the class in what they picked.

We have tried to bring choice into more of our assessments since the EF unit. Recently we had this example of comparing different functions on our Graphing Quadratics Quiz. Having choice helps students demonstrate their understanding by not penalizing them for getting stuck on one part.

**Desmos**

We use Desmos almost daily in our classrooms. Because of this, we wanted to make Desmos available to students during the test. We're looking forward to Desmos coming out with a test mode which will turn off sharing and Desmos solver features. Since we didn't want to wait for test mode, we decided to build Desmos solver into the assessments, encouraging students to use it to check their work. For our Solving Quadratics Quiz, we asked students to solve each quadratic equation in three ways. They were allowed to use Desmos solver twice on the whole quiz. The students engaged in deep thinking by simply choosing which methods to use on the different questions.

**Test Length**

Instead of testing every skill and type of question that we presented throughout a unit, we wanted to give fewer questions that helped us know which targets students did and didn't understand. An example of this was our Solving Quadratics Test. Instead of having 9 questions where students have to demonstrate solving with a particular method, we created a test that had 2 solving quadratic equation questions where students showed each of the methods at least once. It was interesting to see how students thought through these problems. In the past, students wouldn't check their answers. But now, because they had to use multiple methods per problem, they were forced to check their work. And knowing the correct solutions helped them demonstrate learning on the methods that they were less comfortable with.
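To illustrate what "showing each method at least once" can look like, here's a hypothetical equation (not from our actual test) solved three ways; because the methods must agree, each one checks the others:

```latex
% Solve x^2 - 5x + 6 = 0 three ways; every method should produce the same roots.
\begin{align*}
\text{Factoring:}\quad & x^2 - 5x + 6 = (x - 2)(x - 3) = 0
  \;\Rightarrow\; x = 2 \text{ or } x = 3 \\
\text{Completing the square:}\quad & \left(x - \tfrac{5}{2}\right)^2 = \tfrac{25}{4} - 6 = \tfrac{1}{4}
  \;\Rightarrow\; x = \tfrac{5}{2} \pm \tfrac{1}{2} = 2 \text{ or } 3 \\
\text{Quadratic formula:}\quad & x = \frac{5 \pm \sqrt{25 - 24}}{2} = \frac{5 \pm 1}{2} = 2 \text{ or } 3
\end{align*}
```

A student who gets different answers from two of the methods knows immediately to go back and recheck their work, which is exactly the self-checking habit we were hoping to build.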

**New Types of Questions**

[Images: Old Error Analysis question and New Error Analysis question, side by side]

We've just begun to scratch the surface of what's possible in assessment. These new questions aren't perfect, but they're a start. What kinds of questions are you asking on your tests? How do you differentiate assessments?
