An education expert has warned against jumping to conclusions about a school based on its NAPLAN results.
- NAPLAN results are out for the first time in two years, after COVID disrupted the 2020 test
- For the first time, school data can be compared with other “similar” schools
- Educators say there is nothing definitive about the results, as each school is unique
The national data was released this week, allowing anyone to go online to see how students in different schools were faring in reading, writing, and numeracy.
Each student gets a detailed individual report, but the results online show broad data for each school with boxes lighting up green if they are above the national average and red if they are below.
“The online publishing takes them out of context, so it makes it very difficult for a community member or even myself to look those up and form any professional assessment,” said Paul Grover, a former teacher and now lecturer in Education at Charles Sturt University in Albury.
Like comparing ‘apples and oranges’
For the first time, the online NAPLAN tool allows you to see how a school is performing in comparison to others, particularly schools that are deemed to be similar.
This can change the way the data appears. For example, the newly opened Greater Shepparton Secondary College (GSSC) sits “well below” on reading, writing and numeracy when compared to all students in Australia. But when compared to “students with similar backgrounds”, the squares light up green, showing they are actually “above” average on all those fronts.
But GSSC principal Barbara O’Brien said she did not find that tool to be particularly useful, as each school in the country was unique.
“When you look at those ‘similar’ schools you say, well, they’re not really like us because we’re far more multicultural, or we’ve got completely different demographic kids. Or we’re a regional school and they might be a metropolitan school, things like that,” she said.
“Sometimes it is comparing apples and oranges; we’re not really that much alike.”
Each school also has a profile that shows things such as faculty numbers, percentage of languages other than English spoken at home, and socio-educational advantage.
This can offer a stark contrast when looking at local schools. For example, 53 per cent of students at GSSC sit in the bottom quarter of socio-educational advantage.
In comparison, Trinity Anglican College in Albury has 54 per cent of students in the top quarter of socio-educational advantage.
Mr Grover agreed it was not particularly useful to directly compare schools, as all schools, teachers, students, and communities were different.
But he believed it was important to highlight the different challenges that some students and schools faced.
“Certainly, the Gonski program was designed to address that very aspect, to ensure that students in schools that had socio-economic factors that were impacting on their capacities to achieve their best would be additionally funded,” he said.
For Ms O’Brien, the socio-economic make-up of students was not a major point of focus.
“I do not consider that as relevant as knowing your community and knowing what it needs,” she said.
“In every community, there are going to be different demographics and different socio-economic parts to the community.
“I see that as a richness to our community, that we have such a range within our school, and that we can learn from each other.”