Central government is starting to explain what it intends for its new ‘Office of Local Government’. The Secretary of State said that the body would analyse data covering areas such as education, recycling and social care, and produce an annual report so that:

“taxpayers will be able to see which councils are going furthest on the environment, which have really transformative children’s services and which are providing the best value for money”. 

Local authorities may be tempted to choose a simple solution to meet the requirements of the new Office of Local Government. The combination of it not being a local priority, a deadline that suits someone else, and unfunded duties to meet while we are all looking for ways to cut services to balance the books means that many Chiefs will be looking for the easiest way to get this off the ‘to do’ list.

The easy way will look something like this: 

  1. Buy software that’s apparently been built for this specific purpose. 
  2. Require each service to input data on a recurring basis. 
  3. Employ someone centrally to chase the updates. 
  4. Record a narrative and actions, and repeat the cycle. 

Sounds simple. Here’s why it’ll be painful.

There isn’t widely used software to do this today because no data-driven organisation works this way. The data you’re reporting won’t come from the software. It’ll likely be a compound measure derived from at least two other data points, stored in different systems and gathered for different (service delivery) purposes. The task of amalgamating and (inevitably) cleaning the data will be added to someone’s existing duties. The reporting period required won’t align with the rhythms of the service. 
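
To make that concrete, here’s a minimal sketch of what amalgamating just one compound measure tends to involve. Everything in it is hypothetical — the system extracts, field names and the measure itself (spend per completed assessment) are invented for illustration — but the pattern of matching, cleaning and forcing data onto someone else’s reporting period will be familiar:

```python
# Hypothetical sketch only: the extract files, field names and the compound
# measure are invented for illustration, not taken from any real system.
import pandas as pd

# Each source system exports data for its own service-delivery purpose,
# with its own identifiers, formats and reporting rhythms.
assessments = pd.read_csv("casework_system.csv", parse_dates=["assessment_date"])
spend = pd.read_csv("finance_system.csv", parse_dates=["period_end"])

# The inevitable cleaning: inconsistent keys, duplicates, missing values.
assessments["person_id"] = assessments["person_id"].str.strip().str.upper()
assessments = assessments.drop_duplicates(subset=["person_id", "assessment_date"])
spend = spend.dropna(subset=["person_id", "net_spend"])

# Force both extracts onto the reporting quarter the central return demands,
# whether or not it matches the rhythm of the service.
assessments["quarter"] = assessments["assessment_date"].dt.to_period("Q")
spend["quarter"] = spend["period_end"].dt.to_period("Q")

# The compound measure: spend per completed assessment, per quarter.
per_quarter = (
    assessments.groupby("quarter").size().rename("assessments")
    .to_frame()
    .join(spend.groupby("quarter")["net_spend"].sum())
)
per_quarter["spend_per_assessment"] = per_quarter["net_spend"] / per_quarter["assessments"]
print(per_quarter)
```

None of that is hard in itself; the point is that it sits outside the reporting software, on top of someone’s day job, every single cycle.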

Data-driven organisations have data visualisation tools and business intelligence platforms that interrogate data in near real time to develop predictive models, identify correlations and causes, and answer questions as they arise. You might know this because you’ve already got one. So your simple solution will mean using sub-scale software, or convoluted and less safe alternatives, creating ever more complexity in your data architecture and ever more arguments about which version is the “truth”.

It’ll be painful because no data-driven organisation focuses so much effort on reporting quarterly in arrears (or worse). They might run quarterly reports, but not to find out that the delivery function had a problem five months earlier. By the time you’ve run the reports, agreed the data and had the meeting, the problem will have evolved. You’ll be watching stars: looking at light from something that happened long ago.

It’ll be painful because it will amplify the less appealing aspects of your culture. It’ll make service performance sovereign, not the experience of the people whose needs it should meet. It’ll drive an individual focus on particular measures rather than a system-wide understanding of what’s happening to people, fragmented across the different parts of your organisation. No one involved will be incentivised to offer an honest appraisal of the systemic issues driving performance. You’ll be trying to bring about more integrated focus and ways of working, while all the time your reporting mechanisms drive wedges between your teams.

It’ll be painful because it will set back your data skills. The incentive to get that data “right” will be huge, given its purpose. So spreadsheets and emails will fly around before data is committed to the reporting system, and hours will be spent making sure the numbers submitted to the central machinery give the right answer. If you’re lucky, the opportunity cost to your organisation will be that those people could have been building predictive models or machine learning algorithms instead. If you’re less fortunate, they will find more rewarding jobs elsewhere.

And after all of this Sisyphean effort, you’ll have a worse understanding of your residents’ experience. Take one metric I know too well: the average waiting time on the phone to speak to a customer services agent. When operational management made this their priority measure for success, they used to activate ‘purple mode’: they answered the phone quickly and wrote down your name and number to call you back. So the metric looked good (the calls were answered quickly), but the time taken to return the call and whether the call resulted in the resident receiving the service they needed weren’t recorded. Those weren’t performance indicators (PIs).

The average waiting time reflects just one element of the customer experience and only the first step in their journey to having their problem sorted. But it was the only PI.

The average waiting time was (and remains) a function of the number of calls coming in and the number of staff available to answer them. But management by PIs doesn’t lend itself easily to asking why. Whether jobs are completed efficiently and correctly is one of the main drivers of that demand. But those are separate PIs, led by other people.
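
That relationship isn’t mysterious: it’s standard call-centre queueing arithmetic. As a rough, hypothetical sketch (the figures below are invented, and real contact centres use proper workforce-management tooling), the Erlang C model shows how sharply average waiting time moves when call volumes or staffing shift:

```python
# A minimal sketch of the Erlang C queueing model, which expresses expected
# waiting time as a function of call volume and staff numbers.
# All figures below are made up for illustration.

def erlang_c_wait(calls_per_hour: float, avg_handle_time_min: float, agents: int) -> float:
    """Return the expected average wait (minutes) for callers."""
    load = calls_per_hour * (avg_handle_time_min / 60)  # offered load in erlangs
    if agents <= load:
        return float("inf")  # queue grows without bound: not enough staff
    # Erlang B via the standard recursion, then converted to Erlang C.
    b = 1.0
    for n in range(1, agents + 1):
        b = (load * b) / (n + load * b)
    prob_wait = (agents * b) / (agents - load * (1 - b))
    return prob_wait * avg_handle_time_min / (agents - load)

# Same call volume (200 calls/hour, 6-minute handle time), two fewer agents:
print(erlang_c_wait(200, 6, 24))  # ~0.45 minutes
print(erlang_c_wait(200, 6, 22))  # ~1.7 minutes
```

In this made-up example, losing two agents against the same call volume roughly quadruples the expected wait — which is why chasing the PI without asking about demand and staffing gets you nowhere.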

All those things are now visible in our business intelligence tool, many of them automatically. It doesn’t ensure that people ask the right questions. But it removes the effort from providing the data. And frees us up to do something with it (identifying people who haven’t accessed the service and targeting the areas where demand drivers need most attention, for example).

Of course, we could ask if any of this is well intentioned. Is comparing the performance of your local authority with one next door going to support levelling up through mobility? Is it going to enhance democratic accountability? Or would a focus on the systemic issues and root causes lead to better outcomes for citizens? That’s for others to answer.

Faced with this choice, the harder and smarter answer would be to:

  1. Start with culture and ways of working: how do you curate the right conversations about problems, performance and progress?
  2. Honour your commitment to the Local Digital Declaration and liberate data from the legacy systems that stop you capturing high-quality data once, at the point where the work is done, and using it many times to join up services and give the full picture of residents’ experience
  3. Use data to deliver better public services that are more personalised and have faster feedback loops
  4. Work in the open, with residents and partners, to show why this approach makes your place one in which citizens can lead better lives, rather than simplifying success down to your accountability to DLUHC’s annual report