Since its introduction in 2008, the National Assessment Program – Literacy and Numeracy (NAPLAN) scheme has been widely debated by teachers, academics and parents.
A great deal of controversy surrounds the way the Australian Curriculum, Assessment and Reporting Authority (ACARA) uses the NAPLAN results, with test data published on the My School website to form a ranking of all schools across the country. The top-performing schools in the country are clearly identifiable online, along with the poorest-performing schools and their geographical areas.
Children across the country sat the sixth annual NAPLAN tests late last month, with students in years 3, 5, 7 and 9 required to complete a number of tests across a three-day period, evaluating skills in reading, writing, spelling, grammar, punctuation and numeracy.
According to ACARA, the tests are designed to highlight problem areas in learning, allowing teachers and parents to identify areas of student excellence, and areas that need more study.
Principal of Turramurra High School, Stephanie McConnell, says the results of the NAPLAN scheme miss the bigger picture when it comes to evaluating the quality of teaching in schools.
McConnell argues that teachers cater to a diverse range of educational needs every day, and the data provided by the NAPLAN tests is simply too narrow to indicate overall school success.
“NAPLAN testing is good when it’s used for its intended purpose. The way it is now being used is counterproductive for schools, because the results are being used in the wrong way,” McConnell says.
“The snapshot that is provided by NAPLAN becomes a measure of school success, which neglects all of the responsibilities involved in teaching a child. This is really problematic for schools because the data is not being used in a positive way.”
ACARA defines its method of data evaluation as a measure of school accountability. The website treats school performance data as a matter of public record, with the implication that low-ranking schools will be analysed further to determine whether intervention is necessary.
“The reported outcomes of NAPLAN enable the Australian public to develop a general national perspective on student achievement and, more specifically, an understanding of how their schools are performing,” ACARA states on its website.
“Australians can expect education resources to be allocated in ways that ensure that all students achieve worthwhile learning during their time at school.”
Peter Aubusson, associate professor of education at the University of Technology, is one of 128 academics who officially support the ‘Say No to NAPLAN’ campaign. The campaign was founded by the Literacy Educators Coalition, which claims that the tests are being misused for a variety of political agendas.
Aubusson argues that the data provided by the NAPLAN scheme is unreliable, stating that the sample sizes being analysed are far too small to support state-wide and nation-wide conclusions.
“As soon as you analyse information about small populations, such as schools, you’ll find that the data can swing. Teachers have other insights into students, they can take into account whether the data makes sense or not. Outside the classroom, people can look at it in a fairly simplistic way and draw inappropriate conclusions,” Aubusson says.
“NAPLAN has a significant role to play in Australia. It can provide information to students on critical aspects of their learning, and give useful feedback to teachers. It ought to be used for the purpose for which it was designed.”