For the headteacher of a primary school the publication of RAISEonline is one of the seminal moments of the autumn term. For the uninitiated, Raise (I can't keep mixing upper and lower case) is where the statisticians at Ofsted take all your school's data and present it back to you in a variety of charts, diagrams and graphs. There are average point scores, percentages and value added measures. Then there are the colours: the dreaded blue if your results are significantly lower than average and wonderful green if they are significantly above.
Why the excitement? Because not only do I get to see how well we compare with everyone else, but this is the first port of call for Ofsted inspectors when they are going to inspect a school. I am somewhat reassured that under the new schedule published this September, inspectors will be paying far more attention to what they see and find in schools rather than making up their minds beforehand from the Raise information. However, it's bound to make an impression, so Raise is a key document.
There are some really useful charts which break down the children's performance into different groups: gender, ethnicity, SEN status, pupil premium entitlement. Unfortunately, in a small cohort, like my last year's year 6, the numbers involved are often too small to be in any way significant. I am not going to lose too much sleep over why one ethnic group has performed so much worse than the same group nationally when there's only one child in my group and he's got a statement of special educational needs. However, where the data is large enough, it presents schools with really useful information that can be used to target improvements. For example, a couple of years ago the charts showed that although our middle and high achievers at Key Stage 1 were doing really well, as were our very low achievers, those who attained 2c in maths were not always following through to level 4s. This pointed us towards analysing this group in far more detail and bringing in actions which have now addressed the issue.
I was rather excited when I discovered this week that the first unvalidated version was available to peruse, and I logged on with a sense of anticipation. This is because we achieved our best ever results last year, something that filled me with a great deal of pride and satisfaction and reflected the tremendous work done by staff and children over the years. Whilst I believe that primary education is about far more than the results of a series of one-off tests, it would be ridiculous to pretend that the scores don't matter.
Sadly, since those heady days last July, when we rejoiced in our children's efforts, there have been disappointments. The first was over the summer with the publication of national results.
How stupid is this? As someone who wants all children to experience a rich, diverse primary curriculum that provides them with the knowledge, understanding and skills necessary for them to continue in the next stage of their education, I should be rejoicing that SATs scores rose across both Key Stage 1 and 2. I shouldn't be cynically wondering why, after a number of years where scores have remained static, every subject at both key stages rose significantly. I shouldn't be unsurprised that this momentous rise got barely a mention in the media (compared with stories about falling international comparisons, lazy teachers, etc.). I should have been happy - but I wasn't. In a stroke, our excellent results were suddenly not as impressive as they had been when we received them. It's bizarre.
And so to this week's Raise and further frustrations. First, our value added wasn't quite as high as I had anticipated. They've changed the way they work it out from last year, they've included level 6 results and they've given blanket 4bs and 5bs for the writing assessments.
We didn't enter any children for level 6s. I have no problem with children having a go if there is a fair chance they are going to succeed: indeed, nationally 3% of children achieved level 6 in maths. However, are these children really working at level 6 - the level expected of 15 year olds? Have they been fully taught the level 6 curriculum, have they been coached or crammed in order to pass the test, or are they just very bright children who have been able to deduce enough answers to pass it? I am looking forward to seeing this 3% taking their GCSEs at the end of year 7 and heading off to university at age 14 - it's the logical conclusion and, of course, it won't happen. Secondaries must be loving this!
The writing results, based on teacher assessment for the first time this year, became a bone of contention as soon as the government announced how they were going to contribute to overall English scores back in May. For a school like mine, where writing has been a major strength over the last five years, judging everyone to be the equivalent of a level 4b, when the reality is that the majority at level 4 are at level 4a, results in unfair comparisons.
So Raise has been a mixed blessing this year. Much to celebrate, but also much to disappoint after such excellent results earlier in the year.
And finally, there is the data that is just plain wrong. One chart shows that one of my children failed to make two levels progress in English. He made two levels progress in reading (he was a 3c at age 7 and came out with a test score of 5c). He also made over two levels progress in writing (he was a 2b at age 7 and a 4a in his teacher assessment). Clever, that. Two levels in reading, two levels plus in writing, but only one level in English!
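To put numbers on that last example, here is a minimal sketch of the progress arithmetic, using the standard point-score scale where a whole level is worth 6 points, a sub-level 2 points and a 2b scores 15. The function names are my own, purely for illustration.

```python
# Illustrative only: convert sub-levels to the standard point-score
# scale (2c = 13, 2b = 15, 2a = 17, ... each whole level = 6 points)
# and measure progress in whole levels.

SUB_LEVEL_OFFSETS = {'c': 0, 'b': 2, 'a': 4}

def points(sub_level):
    """Convert a sub-level such as '3c' into its point score."""
    level, band = int(sub_level[0]), sub_level[1]
    return level * 6 + 1 + SUB_LEVEL_OFFSETS[band]

def levels_progress(ks1, ks2):
    """Whole levels of progress between a KS1 and a KS2 sub-level."""
    return (points(ks2) - points(ks1)) / 6

print(levels_progress('3c', '5c'))  # reading: exactly 2.0 levels
print(levels_progress('2b', '4a'))  # writing: just over 2.3 levels
```

On any such reckoning, reading and writing each show at least two levels of progress, which is what makes the "one level in English" line on the chart so baffling.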