Teachers Don’t Want All This Useless Data


One of the most frustrating things I’ve ever been forced to do as a teacher is to ignore my students and concentrate instead on the data.

 

I teach 8th grade Language Arts at a high poverty, mostly minority school in Western Pennsylvania. During my double period classes, I’m with these children for at least 80 minutes a day, five days a week.

 

During that time, we read together. We write together. We discuss important issues together. They take tests. They compose poems, stories and essays. They put on short skits, give presentations, draw pictures and even create iMovies.

 

I don’t need a spreadsheet to tell me whether these children can read, write or think. I know.

 

Anyone who has been in the room and paying attention would know.

 

But a week doesn’t go by without an administrator ambushing me at a staff meeting with a computer printout and a smile.

 

Look at this data set. See how your students are doing on this module. Look at the projected growth for this student during the first semester.

 

It’s enough to make you heave.

 

I always thought the purpose behind student data was to help the teacher teach. But it has become an end in itself.

 

It is the educational equivalent of navel gazing – turning all your students into abstractions and trying to teach them from that remove, not as living, breathing beings but as computer models.

 

It reminds me of this quote from Michael Lewis’ famous book Moneyball: The Art of Winning an Unfair Game:

 

“Intelligence about baseball statistics had become equated in the public mind with the ability to recite arcane baseball stats. What [Bill] James’s wider audience had failed to understand was that the statistics were beside the point. The point was understanding; the point was to make life on earth just a bit more intelligible; and that point, somehow, had been lost. ‘I wonder,’ James wrote, ‘if we haven’t become so numbed by all these numbers that we are no longer capable of truly assimilating any knowledge which might result from them.'”

 

The point is not the data. It is what the data reveals. However, some people have become so seduced by the cult of data that they’re blind to what’s right in front of their eyes.

 

You don’t need to give a child a standardized test to assess whether he or she can read. You can just have them read. Nor does a child need to fill in multiple choice bubbles to show whether he or she understands what’s been read. They can simply tell you. In fact, these would be better assessments. Doing otherwise is like testing someone’s driving ability not by putting them behind the wheel but by making them play Mario Kart.

 

The skill is no longer what’s important. The assessment of the skill is.

 

THAT’S what we use to measure success. It’s become the be-all, end-all. It’s the ultimate indicator of both student and teacher success. But it perverts authentic teaching. When the assessment is all that’s important, we lose sight of the actual skills we were supposed to be teaching in the first place.

 

The result is a never ending emphasis on test prep and poring over infinite pages of useless data and analytics.

 

As Scottish writer Andrew Lang put it, “He uses statistics as a drunken man uses lamp posts – for support rather than for illumination.”

 

Teachers like me have been pointing this out for years, but the only response we get from most lawmakers and administrators is to hysterically increase the sheer volume of data and use more sophisticated algorithms with which to interpret it.

 

Take the Pennsylvania Value Added Assessment System (PVAAS). This is the Commonwealth’s method of statistical analysis of students’ test scores on the Pennsylvania System of School Assessment (PSSA) and Keystone Exams, which students take in grades 3-8 and in high school, respectively.

 

It allows me to see:

  • Student scores on each test
  • Student scores broken down by subgroups (how many hit each 20 point marker)
  • Which subgroup is above, below or at the target for growth

 

But perhaps the most interesting piece of information is a prediction of where each student is expected to score next time they take the test.

 

How does it calculate this prediction? I have no idea.

 

That’s the kind of metric they don’t give to teachers. Or taxpayers, by the way. Pennsylvania has paid more than $1 billion for its standardized testing system in the last 8 years. You’d think lawmakers would have to justify that outlay of cash, especially when they’re cutting funding for just about everything else in our schools. But no. We’re supposed to just take that one on faith.

 

So much for empirical data.

 

Then we have the Classroom Diagnostic Tools (CDT). This is an optional computer-based test given three times a year in various core subjects.

 

If you’re lucky enough to have to give this to your students (and I am), you get a whole pile of data that’s supposed to be even more detailed than the PVAAS.

 

But it doesn’t really give you much more than the same information based on more data points.

 

I don’t gain much from looking at colorful graphs depicting where each of my students scored in various modules. Nor do I gain much by seeing this same material displayed for my entire class.

 

The biggest difference between the PVAAS and the CDT, though, is that the CDT allows me to see examples of the kinds of questions individual students got wrong. So, in theory, I could print out a stack of look-alike questions and have them practice endless skill-and-drill exercises until they get them right.

 

And THAT’S education!

 

Imagine if a toddler stumbled walking down the hall, so you had her practice raising and lowering her left foot over and over again! I’m sure that would make her an expert walker in no time!

 

It’s ridiculous. This overreliance on data pretends that we’re engaged in programming robots and not teaching human beings.

 

Abstracted repetition is not generally the best tool for learning complex skills. If you’re teaching the times tables, fine. But most concepts require us to engage students’ interests, to make something real, vital and important to them.

 

Otherwise, they’ll just go through the motions.

 

“If you torture the data long enough, it will confess,” wrote economist Ronald Coase. That’s what we’re doing in our public schools. We’re prioritizing the data and making it say whatever we want.

 

The data justifies the use of data. And anyone who points out that circular logic is called a Luddite, a roadblock on the information superhighway.

 

Never mind that all this time I’m forced to pore over the scores and statistics is less time I have to actually teach the children.

 

Teachers don’t need more paperwork and schematics. We need those in power to actually listen to us. We need the respect and autonomy to be allowed to actually do our jobs.

 

Albert Einstein famously said, “Not everything that can be counted counts, and not everything that counts can be counted.”

 

Can we please put away the superfluous data and get back to teaching?

Data Abuse – When Transient Kids Fall Through the Cracks of Crunched Numbers


I was teaching my classes.

I was grading assignments.

I was procrastinating.

I should have been working on my class rosters.

My principals wanted me to calculate percentages for every student I had taught that year and submit them to the state.

How long had each student been in my grade book? What percentage of the year was each learner in my class before they took their standardized tests?

If I didn’t accurately calculate this in the next few days, the class list generated by the computer would become final, and my evaluation would be affected.

But there I was standing before my students doing nothing of any real value – teaching.

I was instructing them in the mysteries of subject-verb agreement. We were designing posters about the Civil Rights movement. I was evaluating their work and making phone calls home.

You know – goofing off.

I must not have been the only one. Kids took a half-day and the district let us use in-service time to crunch our numbers.

Don’t get me wrong. We weren’t thrown to the wolves. Administrators were very helpful gathering data and researching exact dates for students entering the building and/or transferring schools, just as required by the Commonwealth of Pennsylvania.

But it was in the heat of all this numerological chaos that I saw something in the numbers no one else seemed to be looking for.

Many of my students are transients. An alarming number of my kids haven’t been in my class the entire year. They either transferred in from another school, transferred out, or moved into my class from another one.

A few had moved from my academic level course to the honors level Language Arts class. Many more had transferred in from special education courses.

In total, these students make up 44% of my roster.

“Isn’t that significant?” I wondered.

I poked my head into another teacher’s room.

“How many transient students are on your roster?” I asked.

She told me. I went around from room-to-room asking the same question and comparing the answers.

A trend emerged.

Most teachers who presided over lower level classes (like me) had about the same percentage of transients: approximately 40%. Teachers who taught the advanced levels had a much lower share: 10% or below.

Doesn’t that mean something?

Imagine if you were giving someone simple instructions. Let’s say you were trying to tell someone how to make a peanut butter and jelly sandwich. But in the middle of your instruction, a student has to leave the room and go right next door where someone is already in the middle of trying to explain how to do the same thing.

Wouldn’t that affect how well a student learned?

If someone was trying to give me directions how to get somewhere under those circumstances, I’m willing to bet I’d get lost.

And this assumes the break between Teacher A and Teacher B is minimal, the instruction is disrupted at the same point and both teachers are even giving instruction on the exact same topics.

None of that is usually true.

I did some more digging. Across the entire building, 20% of our students left the district in the course of this school year. About 17% entered mid-year. So at least 37% of our students were transients. That’s 130 children.

The trend holds district wide. Some schools have more transients and some fewer, but across the board 35% – 40% of our students pop in and out over the year.

Taking an even broader view, student mobility is a national problem. Certainly the percentage of student transience varies from district to district, but it is generally widespread.

Nationally, about 13 percent of students change schools four or more times between kindergarten and eighth grade, according to a 2010 Government Accountability Office analysis. One-third of fourth graders, 19 percent of eighth graders, and 10 percent of twelfth graders changed schools at least once over two years, according to the 1998 National Assessment of Educational Progress (NAEP).

And it gets worse if we look at it over a student’s entire elementary or secondary career. In fact, more students moved than remained in a single school, according to a national longitudinal study of eighth graders.

This problem is even more widespread among poor and minority students. The type of school is also a factor. Large, predominantly minority, urban school districts attract the most student mobility. In Chicago public schools, for instance, only about 47 percent of students remained in the same school over a four-year period. Fifteen percent of the schools lost at least 30 percent of their students in only one year.

And this has adverse effects on children both academically and psychologically.

Several studies at both the elementary and secondary levels conclude student mobility decreases test scores and increases the dropout rate.

A 1990s Baltimore study found that “each additional move” was associated with a .11 standard deviation drop in reading achievement. A similar 1990s Chicago study concluded that students with four or more moves showed a .39 standard deviation decline. Highly mobile students were as much as four months behind their peers academically in fourth grade and as much as a full year behind by sixth grade, according to a 1993 Chicago study by David Kerbow.

It just makes sense. These students have to cope with starting over – fitting in to a new environment. They have to adjust to new peers and social requirements.

Moreover, transients have an increased likelihood of misbehaving and participating in violence. After all, it’s easier to act out in front of strangers.

What causes this problem? Most often it is due to parental job insecurity.

Parents can’t keep employment, or jobs dry up, resulting in the need to move on to greener pastures.

In my own district, one municipality we serve is mostly made up of low-cost housing, apartments and slums. It is a beacon for mobility. Few people who haven’t lived here their whole lives put down roots. We’re just another stop on a long and winding road.

“We should be doing something about this,” I thought.

Our legislators should help promote job security. We should make it easier to afford quality housing. We should try to encourage newcomers to become part of the community instead of remaining eternal outsiders.

At our schools, we need resources to help this population make the necessary adjustments. We should encourage them to participate in extra-curricular activities, provide counseling and wraparound services.

But we don’t do any of that.

Instead, we gather mountains of data.

We sort and sift, enter it into a computer and press “submit.”

And off it goes to the Pennsylvania Value Added Assessment System (PVAAS).

We don’t use it to help kids.

We use it to blame school teachers for things beyond their control.

Data has value but that doesn’t mean all data is valuable.

We need to know what we’re looking for, what it means and how to use it to make our world a better place.

Otherwise it’s just a waste of precious class time.

And an excuse to continue ignoring all the children who fall through the cracks.


NOTE: This article also was published on the LA School Report and the Badass Teachers Association blog.