Tuesday 12 May 2015

Assessing NAPLAN writing by computer

A story from today's Age.

Since the National Assessment Program – Literacy and Numeracy (NAPLAN) was introduced in 2008, it has been the subject of controversy. This is largely because the NAPLAN scores achieved by individual students and schools are collected and ranked on the My School website.

The transparency this affords parents, principals and government is promoted as a virtue, but there is another story that NAPLAN does not reveal – incremental improvement in student learning. Winning the NAPLAN lottery is meaningless. It's about hoop jumping and not a lot more.

Schools can confect their NAPLAN results by tutoring for the NAPLAN tests. This is well known inside schools, if not in the wider community, and is actively encouraged by principals. The NAPLAN industry – from online providers to supermarkets where NAPLAN guides are sold – underscores the primary importance of doing well in NAPLAN.

My own students have been prepared for the NAPLAN tests this week; I have given them practice NAPLAN questions and assessed them. I expect the class not to be intimidated by the NAPLAN experience.

Where NAPLAN testing fails is that it is a Neanderthal blunt club of a tool for determining progress. I have been working gently and systematically with students to build their confidence and writing skills. Many have benefited but may not star on NAPLAN. The data may even say they are weak students. The data may say I am a failure as a teacher, and my school may have to "please explain" to parents why some students do not do well.

The central issue with NAPLAN, and the cause of angst in common rooms and staff rooms across the country, is that it does not measure progress. It reports on learning status at a single point in time: three days in one week, every two years. This needs to be kept in mind.

To be fair, NAPLAN was never intended as a means of measuring schools against one another and thereby singling out schools, on the basis of test scores, for praise or purgatory. It has, however, generated something approaching hysteria in some schools.

The essential point of NAPLAN is to identify, on the basis of their results over time, schools that may need special assistance and support in establishing skills. This means appropriate and targeted funding for resources. It was one of the foundation planks of the Gonski reforms: to make educational funding, and thereby learning opportunities, more equitable.

As a secondary-school English teacher, I am concerned that NAPLAN testing does not measure development. What about the boy who sits in the back row of my class, struggling with his writing and reading? His development on a NAPLAN test will seem negligible, but I know he has made significant progress.

He, and many students like him across the country, are understandably nervous about NAPLAN. Their individual academic growth will not be measured. Their confidence will not be measured, and they will be clustered as mere units on a statistical, number-crunched graph.

This round of NAPLAN tests is likely to be the last done with paper and pencil. By 2017 the test's writing component will be assessed by computers. It beggars belief that the drive for data that can be collected and graphed more quickly means the individuality of writing will be compromised.

If parents think this is a good idea, they should think again. Teachers are not supportive of the change, and for good reason. Outsourcing marking to computers removes the assessment of writing quality beyond easily detected errors. It highlights the acute focus on NAPLAN statistics rather than on what a child is trying to express. The equation is simple: error count high, the computer says dumb child; error count low, the computer says bright child.
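To make that caricature concrete, here is a minimal sketch (in Python) of the kind of crude error-count marking teachers fear. It is purely illustrative: the word list, threshold and band labels are invented for the example and do not describe ACARA's actual scoring system.

```python
# A deliberately naive caricature of "error count high -> dumb child" scoring.
# Illustrates the concern described above, not any real NAPLAN marking engine.
# The word list, threshold and labels are invented for illustration only.
import re

KNOWN_WORDS = {"the", "boy", "ran", "to", "school", "and", "wrote", "a", "story"}

def naive_score(essay: str, error_threshold: int = 5) -> str:
    """Band an essay purely by counting 'errors' (here, words not in a whitelist)."""
    words = re.findall(r"[a-z']+", essay.lower())
    errors = sum(1 for word in words if word not in KNOWN_WORDS)
    # Nothing about meaning, voice or imagination is considered; only the tally.
    return "low band" if errors > error_threshold else "high band"

print(naive_score("The boy ran to school and wrote a story."))           # high band
print(naive_score("Majestik creachers sored acros the twylight skye."))  # low band
```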

It is hardly reassuring that the Australian Curriculum, Assessment and Reporting Authority (ACARA) general manager Stanley Rabinowitz says he is "very confident" that enough progress has been made to guarantee the auto-scoring method will work by 2017.

Like many other English teachers, I have marked probably hundreds of thousands of essays over my career. I think I have a good idea of what students are trying to communicate. I correct the errors and write comments to help students improve.

Let there be no mistake: computer marking would lessen my workload instantly. It would also give me printouts of neat data. But it would not allow me to see the germination of insight and understanding, or the surprise of something beautiful written from a tender heart that has taken courage to pen. That's what NAPLAN can't assess. Creativity can't be number-crunched, and computers don't get it.

Christopher Bantick is a senior literature teacher at a Melbourne Anglican grammar school for boys.
