There was much to ponder in Dominic Cummings’ now-infamous ‘misfits and weirdos’ blog. However, one thing that stood out to me was that “data scientists” was the first role mentioned in his manifesto for changing the way government gets things done. I would suggest that some more fundamental problems are stopping policy and operations from becoming more data-led.
The reality is that government doesn’t have a problem with data. It has lots of the stuff. However, most of it is out-of-date and isn’t available to analysts, policymakers and delivery teams in a timely way. What we’re really talking about is extracting information and insight from its data. Of course, that’s not a challenge that is unique to UK government. Many large organisations are struggling to take advantage of the opportunities to plan, deliver and learn more dynamically.
Legacy tech restricting government data
The first challenge to overcome is a technical one. Legacy technologies underpin many government services. Significant capital spending is required to replace ageing systems so that all prescription data, for example, becomes available to data scientists. While so much data is constrained in this way, progress will be limited. It means that analysis and insight will remain partial – or even fractional – and silos will remain between services, departments and other parts of the public sector. There are thousands of statisticians and analysts already working in government; the problem is that they’re often relying on technology from 15 or more years ago.
Challenging the status quo
Data-led transformation will also require a cultural change and a challenge to the status quo. Anyone who takes on this challenge will need to break down established ideas of ‘how we do things’ and redefine ‘the art of the possible’ when it comes to leveraging government data. Not only that, they will need to overcome cultural barriers too. There will need to be a change to the idea that one department or another ‘owns’ particular datasets. Policymakers will need to stop thinking that data is something that is provided to them by ‘the weirdos’. Policy, operations and delivery teams will need to realise they all have a role to play and need to work together. Data should be the lifeblood of change.
The need to focus across and between departments (and organisations) may well be what is delaying the recruitment of a CDO (and other roles) and makes it something of a poisoned chalice.
Pull not push
The third challenge is that transformation happens when people in the organisation are initiating change. Change needs to be driven by the user communities, not just pushed out to them. For this to happen, data needs to be available where it can be of value, and the right tools need to be available. If not, government analysts and data scientists will be restricted to taking very similar paths to their predecessors.
The true potential of data
The availability of data – via APIs – is core to making insight part of a feedback loop into policymaking. Access to richer data sets around service delivery and usage will enable the delivery of more and better insights from government data. From the analysis of longitudinal problems, to finding connections and outcomes where policy, processes and action meet, the true potential of data is realised when it is combined across services, departments and organisations. This means joining up delivery and collaborating across central and local government, the rest of the public sector and even with private sector organisations. It should also mean creating a feedback loop into policy. This is definitely new territory and is an order of magnitude more challenging than the task GDS faced when it launched.
Transforming the government’s approach to data
When GDS was established in 2011, it sought to change how government created and delivered digital services. Most observers would agree that, more or less, it has achieved this aim. It has put the user firmly at the heart of service design, and the user’s perspective is now used as a lever to change the way services are presented, even down to the style of language used. GDS started small with 25 exemplars to reinvent how digital was done. There may be a temptation to follow this approach again, but there are some fundamental differences when it comes to data.
Having recently discussed this subject with a member of the GDS team at GDS Tech Talks: Interoperability and Open Standards, we agreed that the approach may need to be more incremental when it comes to data-led transformation. Start with two departments working together to develop a solution with data flowing (securely, of course) between them. This should also mean that some APIs are established so that other teams, departments or organisations can make use of the data. Then look to the next exemplar, which adds in a third department, then a local authority, and so on. The vision of sharing data between the DWP, DfT, DVLA and local authorities for the delivery of the Blue Badge service is a good example of how this might play out, providing a better user experience and richer insights into take-up, usage and impact.
By the time a handful of exemplars have been delivered, new working practices will be established, lessons learned, legacy systems replaced and, more importantly, a range of APIs will be available that can feed further transformation.
So, the reality is that the value is in hard-to-reach places. The role of CDO is actually to coordinate a range of initiatives and partners to make data available and give data scientists, analysts and policymakers a much less partial view of reality. Putting data science into the heart of policy and its implementation is really about creating links between people, systems and departments.
Dan Klein is Chief Data Officer at Valtech. If you’d like to learn more about our approach to delivering data science and analytics for government, contact us today.