

In Data-Driven Landscape, Teachers Learn To Interpret Student Test Scores

Alberto G. / Flickr

A few years ago, a new phrase became all the rage in education reform: data-driven. Students take benchmark and standardized tests throughout the year, and the tests generate lots of data. But how do teachers turn those data points into lesson plans?

In the new data-driven landscape, getting ready for the school year doesn't just mean carefully arranging desks and decorating the walls with laminated starbursts. It means looking through — and interpreting — reams of student data. Before students arrive, teachers receive their standardized test scores from the past year; a few weeks into school, they give out and get results from benchmark tests. But do they actually learn how to turn those scores into a teaching plan?

According to Ebony Collor, the answer is no. Collor teaches third-grade science and social studies at Merrydale Elementary in Baton Rouge. She says she has received lots of numbers, but no tools to interpret them.

"They weren't explained to you,” she says. “You were given a sheet saying 'Hey this is your sheet and these are your results. Go do something with it.'”

Collor's mentor and fellow teacher, Michelle Williams, says this isn't an isolated problem.

"It's not something that teachers are taught in college as a part of coursework, to use data to drive instruction,” Williams says.

Williams says the surge in student data doesn't amount to much if teachers don't know how to use it. So she partnered with Associated Professional Educators of Louisiana to lead a workshop on interpreting student data. Collor immediately signed up.

"I don't think I was gonna get it on my own, winging it,” Collor says.

The workshop's 30 spaces filled up fast, because this year, for the first time, teachers in Louisiana have to use student data in their teaching plans. Teachers have long set student learning targets — a vision for what the class should know by the end of the year. But now they're required to base those targets on test scores.

At the workshop, Williams passed out a packet filled with sample test results and broke down all the perplexing terms: core, strategic, intensive, intervention, concentration.

With that plain language in place, Collor could analyze students' academic strengths and weaknesses, and map out what it might take to move them forward.

“I want to see growth,” Collor says. “So the data for me, I think it's a true snapshot of what we need to be doing. It gives us a marker. It gives us a realistic expectation of growth in a year's time.”

But growth isn't just measured in quantitative data, Williams says — a point most teachers at the workshop met with enthusiastic nods. Qualitative data is important too.

“Qualitative data is basically what teachers know about students,” Williams says. Like a tendency to rush through tests, or a stressful situation at home.

“This is not something that's broadcast to the public,” she says. “But children make gains based on where they are.”

Collor left the workshop eager to put both the numbers and her relationships with students to use in the classroom. Her first move? Changing the classroom seating based on data.

“Everybody has different learning levels,” Collor says. “If you had two lower students together, no work was getting done. Or the high students would finish everything so fast you'd have to find more things for them. If I have them mixed heterogeneously, it's more collaborative learning.”

In the past, Collor based seating on her students' personalities. Now, with seating based on data, “everybody's on the same pace and the class flows very harmoniously.”

But when she goes to create her student learning targets — to come up with not just a seating arrangement but an actual plan for the year — she faces another stumbling block. This time the problem isn't a lack of training, but a lack of data. Two days before the learning targets are due, Collor still hasn't seen her students' test results.

“There was a software problem with the new system, where they were scanned on the wrong Scantron,” Collor says. “The numbers were just not right.”

She heard this happened to several teachers, at several schools. The tests her students took to set benchmarks for the year had to get re-scanned. So Collor waits, armed with new interpreting skills, but no data to interpret.

Support for education news on WWNO comes from Baptist Community Ministries and Entergy Corporation.