First, a confession: We’ve spent no small amount of professional time thinking of data as a four-letter word.
This is not to say that we were uninterested in knowing our students and where they were in their progress toward mastery of standards. As veteran teachers, we continually sought ways to know our kids, to check in on their learning, and to confer with them to ensure that they were able to reflect and set goals.
But that word – data, and its accomplice data wall – had come to represent something disconnected from those endeavors. It meant high-stakes testing, decision-making that served scores rather than student learning, and shame in the guise of accountability.
So this year, when our principal issued the mission to go forth and create our data walls, she also did something else significant, something so important that its weight cannot be overstated.
“Make them look how you think they should look,” she said.
Here’s the thing about teachers like us: When it comes to the administrative nuts and bolts, we often prefer to be told what to do. Tell us which square to stand in and for how long at duty, set the activity day schedule, tell us which zone to monitor at the dance. Free up our brainpower and time, which are better devoted to instructional decisions.
On the surface, the format of a data wall might seem like one of those decisions we’d prefer not to dwell on. In reality, however, the structure and content of a data wall reveal our mindset about what evidence to collect and how to examine student learning.
In telling us to design our data walls to make them work for us, our principal gave us the room to think through how we evaluate and communicate student progress. We were long overdue for a data wall makeover (and, quite frankly, an attitude adjustment regarding data altogether), and we love a before-and-after, so please consider the next section of this post as Fixer Upper: Assessment Edition.
The Before – a snippet from one of our old data walls (names changed to protect the adolescent):
| STUDENT NAME | Argument Analysis Pre-Assessment 1/9 | “I Have a Dream” Vocabulary Quiz 1/19 |
|---|---|---|
| | RI.8.1 Cite textual evidence<br>RI.8.2 Determine central idea<br>RI.8.3 Analyze how a text makes comparisons and analogies<br>RI.8.4 Determine meaning of words in context<br>RI.8.6 Determine an author’s point of view or purpose | RI.8.4 Determine meaning of words in context |
This data wall organizes data by assignment and is almost 100% a copy-paste job from one of our old gradebooks. The only addition is the notation of standards represented by each column. Color-coding might allow anyone examining the chart to note patterns of trouble but little else. And this is important: a data wall like this is good at identifying kids as problem spots. We’re headed down a slippery slope when we start talking about our “red,” “yellow,” and “green” kids.
Additionally, and particularly in the case of the Argument Analysis Pre-Assessment, it’s impossible to know students’ progress toward mastery of each of the five standards assessed. If we were to hand it over to someone outside of our classrooms, it’s unlikely they could make any decisions for these students without some serious digging. In fact, examining the data wall for ourselves as it stands was not especially useful or informative.
This was going through the motions – completing the assignment of adding data to a spreadsheet to comply with building and district instructions. It did not make us better at meeting student needs.
The After – a glimpse at this year’s data wall in progress:
PRIORITY STANDARD 1: Write arguments to support claims with clear reasons and relevant evidence. [W.8.1]

| LEARNING TARGETS | I can write claims that address a writing prompt. | I can support claims with specific and logical reasoning and evidence. | I can use transitions and embed evidence effectively. |
|---|---|---|---|
| Rachel | 85, 90, 85 | 80, 80, 95 | 75, 80, 85 |
| Robin | 55, 60, 60 | 55, 65, 55 | 55, 55, 70 |
| Stacey | 100, 90, 95 | 95, 95, 95 | 80, 85, 85 |
| Blake | 75, 75, 80 | 90, 85, 85 | 75, 75, 80 |
In designing a data wall that worked for us, we began with our priority standards, established through our district’s curriculum development team. However, we opted to break down the standards even further into learning targets. An examination of our data (by us, an instructional coach, or an administrator) yielded a clearer picture of how to support each kid.
We knew that these targets would be assessed multiple times throughout the grading period, which led to a discussion about the final grade to communicate progress for each target. Should we average? Rick Wormeli had something to say about that. What about other measures of central tendency? What sample size yields enough evidence for us to be confident in what each kid can do?
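For readers who like to see the arithmetic, here is a quick sketch in Python (our illustration, not anything from the data wall itself) of how the choice of measure changes the grade we would report. The scores are Rachel’s from the table above; the code is just a worked example, not a tool we actually use.

```python
from statistics import mean, median

# Rachel's three scores on the second learning target
# ("I can support claims with specific and logical reasoning and evidence"),
# taken from the data wall above. Hypothetical example for illustration.
scores = [80, 80, 95]

print("mean:", mean(scores))      # 85 -- the late 95 pulls the average up
print("median:", median(scores))  # 80 -- the middle score, unmoved by one high result
```

If that 95 is her most recent attempt, neither number may be the right call: a grade that privileges the latest evidence might represent her current mastery better than either the mean or the median, which is part of why averaging deserves the scrutiny Wormeli gives it.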
These questions sparked further discussion of how to design our assessments. Did we need to revamp some old tasks to make sure they were truly assessing what we said they were? And let’s be honest: this kind of evidence gathering and examination takes more time than copy-pasting from a gradebook did. So we took some care with the formatting of our evidence-gathering documents to avoid wasting time hunting around for what kids could do.
Our assessment practice is ever-evolving, and we know that we’re not there yet. But we believe that the questions we continue to ask ourselves will help us to be better for our students. We hope you’ll ask us some, too.
Laren Hammonds and Jamie Thomas are 8th grade language arts teachers at Northridge Middle School.