Saturday, January 30, 2010

Saturday Morning Musings - yet more on the My School web site

Another Saturday morning.

On Thursday the girls left for their South East Asian trip. They have both travelled internationally before, so I am not worried about that beyond the normal risks associated with particular destinations. Facebook also provides a useful mechanism for keeping in touch.

It will be odd, however, to have them away together for such a long time. I guess that I had better get used to it. It's just part of the normal transition process as they get older, but I feel it a bit since I have been the primary child carer since 1996.

I have continued to dig round in the My School web site. Given that I have already done four posts on this, I don't really want to bore readers with a further detailed analysis. However, I thought that I might make a few more purely personal observations since it's still nagging away at me.

Like nearly everybody else in this country, I have been sucked into looking at comparisons between schools. This is actually quite dangerous unless the limitations of the web site's approach are kept firmly in mind.

To begin with, the site focuses on a very limited measure, the NAPLAN (National Assessment Program - Literacy and Numeracy) test results. The results say nothing about the broader education provided by the school.

I actually found myself quite angry on this one. I will deal with the technical problems associated with the use of the test results in a moment. But first, I want to take my old school, The Armidale School (TAS) as an example. I will give you the link to the school web site later.

On the Sydney Morning Herald NSW league table the school's secondary wing ranked 85 in the State on the Index of Community Socio-Educational Advantage (ICSEA). This simply means that TAS parents are generally better off than most, though not as well off as some. On the NAPLAN results, the school ranked 213, quite a long way down the list.

TAS operates in a very competitive environment, especially on the boarding side. The presentation of these results will positively disadvantage the school. Now Ms Gillard would argue tough luck, the school just has to do better in the NAPLAN tests. This captures the silliness of it all.

TAS is a non-selective school, drawing its students from particular cohorts in particular areas, including a still significant proportion of kids from the land. It prides itself on providing a general education. It has quite wonderful facilities in areas like sport, IT (the school has been a leader in the use of IT for more than thirty years) and drama, built up over the years. Now look at the web site to see what I mean.

How should TAS respond to the NAPLAN rankings? Does this mean that the school has to change its educational philosophy and indeed its entry requirements in order to claw its way up the NAPLAN rankings? 

I want now to turn to some general technical problems.

NAPLAN attempts to measure individual performance in two areas, literacy and numeracy, at certain specific points along the school path. It is a limited measure, with just two annual sets of test data at this point.

Entry level cohorts vary to some degree from year to year in all schools, even in selective schools with their very tight entry requirements. This means, in turn, that school results for individual years will vary as individual cohorts move through the school.

This is actually quite important. I found myself wanting to compare the 2008 and 2009 results, reasoning that if the 2009 result was worse than the 2008 result in, say, year 7, then that implied a deterioration in school performance. I suspect that I was not alone here. However, the comparison is quite meaningless unless you actually know the differences between the respective cohorts.

Given the variations in cohorts entering individual schools, as well as the huge differences in school populations between individual schools, arguably the only real measure of relative school performance, even if still a limited one, is the extent to which each school improves the performance of individual cohorts over time. However, there are still problems here.

By its nature, this measure would tend to favour the general rather than the selective school, simply because children entering selective schools are, on average, already at a higher starting point on the NAPLAN scores. There is simply less room for improvement as measured by NAPLAN. Further, this type of measure of school performance by improvement actually says very little about the suitability of schools for individual students. It also still risks twisting school behaviour.
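To make the arithmetic concrete, here is a rough sketch, in Python and with purely hypothetical scores, of what a cohort-improvement measure of this kind might look like. It is not the My School methodology, just an illustration of why a school whose students already start near the top of the scale has less measurable room to improve.

    # Illustrative sketch only - hypothetical scores, not actual NAPLAN data
    # and not the My School methodology. It shows how a cohort "gain" measure
    # might be calculated, and why a high starting point leaves less headroom.

    def mean(scores):
        return sum(scores) / len(scores)

    def cohort_gain(earlier_scores, later_scores):
        """Average improvement of the same cohort between two test points."""
        return mean(later_scores) - mean(earlier_scores)

    # Hypothetical cohorts on a NAPLAN-style 0-1000 scale
    general_y3 = [420, 450, 470, 500, 430]      # broad intake, lower start
    general_y5 = [510, 540, 555, 580, 525]
    selective_y3 = [620, 650, 640, 660, 630]    # selective intake, high start
    selective_y5 = [670, 690, 685, 700, 675]

    print(cohort_gain(general_y3, general_y5))      # 88.0 - larger gain
    print(cohort_gain(selective_y3, selective_y5))  # 44.0 - smaller gain

On these made-up numbers the general school shows roughly twice the gain of the selective school, even though the selective school's students still finish well ahead.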

Speaking just as a parent, I first came across this type of twisting effect at a school speech night a few years back.

After a general statement on the importance of a general education and the need to treat test results with care, the Headmistress effectively devoted the rest of her speech to extolling just how well the school was doing on various test measures. I was frankly appalled, and I wasn't the only one. Not only did it take attention away from the graduating girls, it left most of us wondering just where the school was going in terms of its overall philosophy.

Minister Gillard suggests that the publication of the results may assist parents in choosing a school and in identifying performance problems within schools.

Speaking just as a parent, I wondered how I would have used the results had they been available earlier. It would certainly have affected, at the margin, the pool of schools we looked at in terms of exclusions and inclusions. However, it would probably not have affected our judgements as to school performance. Any parent reasonably close to their children and their school generally knows how well the kids are doing.

This has become yet another long post. However, I do want to make one final point.

While I have expressed strong reservations about the net benefit of the site in terms of Minister Gillard's stated objectives, I have also suggested that it does have some value in terms of what it tells us about patterns.

Those who read this blog will know that I am interested in Aboriginal education.

Few people know, I suspect, that there are more schools in NSW with 20% to 100% Aboriginal students than there are in the Northern Territory.

Because of the demographic work that I have done as well as my fairly detailed knowledge of geography, I was able to identify and track across the State a number of the NSW schools with high proportions of Aboriginal students.

While the NSW Aboriginal community remains disadvantaged compared to the general population, the results seem in no way comparable to the Northern Territory or to Northern Australia more broadly defined.

Consider, for example, Our Lady of Mt Carmel Primary School in the inner Sydney suburb of Waterloo. The school has 122 pupils, of whom 56% are Indigenous. It ranks below average on the ICSEA value. Yet in 2008 and 2009 it generally scored average to above average against all schools nationwide on the NAPLAN tests.

Interesting, isn't it?      
