Defining KPIs clearly is crucial. Badly defined KPIs have unfortunate side effects. The cobra effect is one of my favourites.
Most people agree that going to school is good for children, so governments focus on truancy as a measure. Sometimes, though, there are legitimate reasons why children cannot get to school. One such reason is snow. A singular focus on truancy backfired spectacularly in the UK, as the Telegraph newspaper explained in this article.
"....pupils unable to get into school because of snow had been listed as unauthorised absences – hitting a school’s truancy figures. This often made it easier for a school to close completely."
So when bad weather prevented some children from getting to school, the school was encouraged to close entirely, because a full closure would not count in the truancy figures. Instead of a few children missing school, every child was deprived of an education for that day.
Defining KPIs properly and avoiding embarrassment
How can you avoid this kind of embarrassing KPI misfire? Three things can help with defining KPIs properly.
- Define your KPIs clearly, in a structured way, and write the definitions down. This KPI definition checklist will help.
- Understand how your measures interact as a package. KPI trees are the ideal tool for this: by showing how different measures relate to one another, they help you spot unexpected outcomes.
- Reverse-brainstorm with a group of subject matter experts. In this example your experts would be head teachers. Ask how they could get a 'good' result with a KPI, whilst completely defeating the intended purpose of the measure. You will be surprised by how energised a group of experts becomes when they are asked to break something.
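To make the "measures interact as a package" idea concrete, here is a minimal sketch of a KPI tree as a data structure. The class, the node names, and the hierarchy are all illustrative assumptions, not something from the article or any KPI tool; the point is simply that laying related measures out as one tree makes loopholes like the closure trick easier to spot.

```python
# Illustrative sketch: a KPI tree where each node is a measure and its
# children are the measures that feed it. All names are hypothetical.

class KPINode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def walk(self, depth=0):
        """Yield (depth, name) pairs so the whole package can be reviewed."""
        yield depth, self.name
        for child in self.children:
            yield from child.walk(depth + 1)

# 'Pupil attendance' is driven by truancy AND by whole-school closures.
# Seeing both branches side by side reveals that pushing truancy down
# can be 'achieved' by closing the school, which hurts the parent KPI.
attendance = KPINode("Pupil attendance", [
    KPINode("Unauthorised absences (truancy)"),
    KPINode("Authorised absences"),
    KPINode("Whole-school closure days"),
])

for depth, name in attendance.walk():
    print("  " * depth + name)
```

Reviewing the printed tree with your subject matter experts is one way to run the reverse-brainstorm above: for each leaf, ask how it could be improved at the expense of its parent.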
Think of these three checks as a kind of 'pre-flight' safety check. Maybe you can prevent your own 'light snow closes schools' outcome?