Claudia Civinini explains the scoring system for language centre inspections
THERE WERE two things I needed to learn to love quickly when I started working at EL Gazette: British Council (BC) inspection reports, and the rankings for language centres. There are not many things I love more than a good story. The BC reports with all their entertaining euphemisms can be pleasant to read, especially their sarcastic use of the word ‘some’ in their dry descriptions of schools’ performance.
One extract from the BC’s summary statements includes, ‘some teachers were able to model… words effectively’. But I still needed a good story. And I found one in the rankings. The carousel of schools contending for pole position, rising stars, schools falling off the radar: the gossip can be quite thrilling. But the story of the rankings is mainly the story of an explosion in the number of accredited centres attaining ‘excellence’.
These days, more than twenty per cent of schools have achieved half or more of the available areas of strength. These are published on the inspection summary statements, and they are what we base our rankings on. As you can see on the following pages, about 110 schools out of a total of 511 accredited schools scored 53 per cent or more. Besides family-run independent language schools achieving their well-deserved place in the sun, we have representatives from across the spectrum. Language school chains attain high and consistent results, often getting most or all of their members into the top 20 per cent – look at Bell, Eurocentres or St Giles, for example. There are 13 university language centres – with the University of Brighton in the lead – many summer school multi-centres and further education colleges, and most boarding schools.
A quick scroll through the EL Gazette archive reveals things have not always been this good. Flashback to 2010, the first year the Gazette published its rankings. The British Council was still operating under its old inspection system, awarding points of excellence instead of strengths. Back then, ranking schools was a tough job. The British Council summaries awarded ‘points of excellence’ for individual areas and ‘areas of excellence’ for the main criteria, and needs for improvement could be noted in both. It was a fiendishly complex system, and it was impossible to calculate the maximum score. To calculate the rankings for the Gazette we simply counted up the ‘excellences’. Since the maximum score ever achieved was nine, we set the benchmark at five. Captained by Wimbledon School of English, the list was a mere 29-member club: only five per cent of all schools had made it. Chains were almost absent, if we exclude Bell and Embassy. How did we get from five to twenty per cent?
Research in the US on three university hospitals shows that for inspection results to improve, results must not only be measured, but also published. With the British Council releasing results, standards improved. Our rankings in 2012 already included over 40 language centres. But a bigger change, with an even greater impact, was about to come. In 2012, the British Council changed its inspection system to the one we know today. Schools are inspected under 15 areas (or 14 if they don’t enrol under-18s). Each area has a number of clear-cut criteria. If a language centre exceeds expectations in more than half of these criteria, it gets a strength in that specific area. Not only did the new system make our job easier (thanks, BC!) but it also allowed schools to clearly understand the BC’s expectations and to concentrate on meeting them. By 2013 there were two systems in play and 73 schools in the list. Top of the old system were English in Chester and ELC Bristol, both members of the new TEN association, which also included schools such as Wimbledon and ELC Brighton. The top score under the new system was 10 out of 15, scored by Bishopstrow College, Hampstead School of English, IH Bristol and the now departed Crest School of English.
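For readers who like their rules spelled out, the scoring logic described above can be sketched in a few lines of code. This is only an illustration of the ‘more than half the criteria’ rule as the article describes it – the area names and pass/fail data below are invented, and the real criteria are defined by the British Council’s inspection scheme.

```python
def strengths(areas: dict) -> int:
    """Count areas of strength: an area counts as a strength when the
    school exceeds expectations in more than half of its criteria."""
    return sum(
        1
        for criteria in areas.values()
        if sum(criteria) > len(criteria) / 2
    )

# Invented example: three areas, each a list of per-criterion
# exceeded-expectations flags.
school = {
    "teaching": [True, True, False],        # 2 of 3 -> strength
    "welfare":  [True, False, False],       # 1 of 3 -> no strength
    "premises": [True, True, True, False],  # 3 of 4 -> strength
}
print(strengths(school))  # 2
```

A school inspected under all 15 areas could thus score anywhere from 0 to 15 strengths, which is the number our rankings count.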
Results grew year after year as schools quickly mastered the new system. It’s now rarer to see a language centre go down the rankings or fail after re-inspection, unless some external element intervenes – a change in the management team, for example. The introduction of the new Care of Under-18s system in 2014, which required language centres to adhere to a set of child safeguarding rules such as police checks on staff, also saw a handful of schools placed under review. But it seems that language centres have mastered that new element as well. It was never a problem for boarding schools, which aced the new requirements, as they were already used to the much stricter national minimum standards for boarding schools.
By 2015, Wimbledon School of English had climbed to the top. Their sceptre is now contested by ELC Bristol, joint first with a perfect score of 15 areas of strength. What will the future bring? Beyond the pride of seeing the industry strive towards ever better standards, we still want to see some action. Another school with a perfect score? Maybe a new name? Surprise us!