Leonardo Mazzone - 08 August 2025

Building digital components in government

I’m reading Platformland by Richard Pope, an illuminating thesis on how to build digital public services fit for the future. The book has given clarity and context to realisations, years in the making, about my experience of working in government as a data professional.

Pope’s perspective was forged in the crucible of the early days of the Government Digital Service (GDS), where he worked as a Product Manager. GDS was formed in 2011 and played a huge role in the UK government’s digitisation successes in the 2010s. The most emblematic achievement was an excellent single government portal, GOV.UK, to which all government departments publish information. However, the assumptions and strategies that were right in 2011 became limiting factors, and a distinct feeling emerged that progress was slowing down.

In passing, the book tells some of this story. But most prominently, it’s a taxonomy of patterns and strategies, as hinted by its subtitle (“an anatomy of next-generation public services”). The first of these classifications identifies distinct digital objects: services, credentials, rules, components and data. These objects can be connected in creative ways to fulfil the needs of the public.

In the words of Pope, services are the “things that the public interact with to get an outcome”. As such, they’re a bit special, sitting at the layer that citizens get to - or often, must - engage with directly. To encourage best practices at this key layer, GDS prescribes the hiring of service owners (a job title that will bemuse digital folks outside the civil service), and maintains a Service Manual and a Service Standard. Compliance is monitored through Service Assessments.

But the other types of digital products are just as critical. For example, components are “the common bits that make services work” (Pope again). Data dashboards, software libraries, internal applications and cloud infrastructure don’t work like public services: they work with them. Some of the principles for services translate and encourage virtuous practices, like the focus on accessibility, agility and open sourcing. Others are odd and inapplicable outside the context for which they were designed. The eager adoption of the Service Manual by Digital and Data professionals has been a force for good, but it has unnecessarily monopolised the conversation about technology in the UK government.

Understanding the past

The traditional cornerstone of the interaction with government bureaucracy is The Form. Thus, as GDS started, a core part of the approach was to turn experiences revolving around paper forms into digital services. This is reflected in the elements of GDS’ design system. Most of them are ways of collecting user inputs in a step-by-step journey which ends with a summary page where users can confirm or amend the information they provided.

It is also reflected in the first stand-alone components GDS built, one for collecting payments (typically at the end of a form journey), and one for notifying users (about the progress of an administrative process, triggered by a form submission). Later on, GDS went all in with a component for creating whole forms.
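To make “component” concrete: this is roughly what consuming the notifications component, GOV.UK Notify, looks like from inside a service’s codebase. A minimal sketch using Notify’s Python client; the API key, template ID and personalisation fields are placeholders, not real values:

    from notifications_python_client.notifications import NotificationsAPIClient

    # Placeholder API key - real keys are issued per service by Notify.
    client = NotificationsAPIClient("your-api-key")

    # Tell a user their form submission is progressing. The template ID is a
    # placeholder; templates are defined in the Notify admin interface.
    client.send_email_notification(
        email_address="applicant@example.com",
        template_id="00000000-0000-0000-0000-000000000000",
        personalisation={"reference": "ABC123"},
    )

A service team writes a few lines like these; the hard parts (deliverability, templates, delivery status) live in the component.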

Platformland eloquently defines the challenge: the work of GDS thus far has been impressive, but it’s weighed down by the structures and metaphors of 20th-century government it had to contend with. Bold new ideas are needed to reinvigorate digital transformation. There are positive signs that GDS is embracing this. And I’d be surprised if any GDS leader hadn’t read Platformland or even spoken to Pope, who was personally involved in some high-profile digital initiatives in the UK government.

The digital periphery of government

More recently, GDS have started enriching their offer through GOV.UK One Login, a component for proving identity, and a digital wallet. But the bigger picture is still one where

  1. a digital centre is in charge of the “common bits” of digital infrastructure
  2. other parts of government build services on top (which often look a lot like forms)

This explains why GDS has little in the way of advice and best practice for things that aren’t services.

Having a digital centre of government makes sense: it can drive systematic change across organisations and focus on the highest-value interventions. However, a central digital agency can’t have the resources, knowledge or mandate to build every component that could be valuable. Pope says:

Common components […] are unlikely to emerge spontaneously from government agencies. The institutional incentives are not strong enough.

That is certainly the status quo, but the book also points out that:

[…] there need to be clear routes for turning “point solutions” (problems solved within individual services) into common components […]. Internally, both Amazon and Google […] can create and fund a new team off the back of something that is happening organically.

Government departments have built a large and capable pool of Digital & Data professionals. They have started building components that are useful internally. These are often open sourced, as encouraged by GDS. And yet, little of the potential for innovation and reuse within government is realised.

Incentives are part of the problem. At the Department for Business and Trade, my team is building Matchbox, a tool to streamline the deduplication and joining of datasets. We started working on this organically, to simplify the analysis of data on UK businesses, but we now believe there is great potential for reusing our product across government. We are doing what we can to encourage adoption: separating the reusable code from our specific internal setup, writing extensive documentation, engaging with communities of practice. But at the end of the day, our performance is assessed against value delivered for the department. I wonder if there could be a role for a digital centre of government in sponsoring and supporting initiatives of this kind.

Lean technical teams

Richard Pope (who is not an engineer) writes:

Implementing the approaches described here is first and foremost a technology intervention, not a design one. You can’t design services that reuse data or make real-time decisions in any systematic way if you don’t have the infrastructure to support doing so.

On the other hand, the Service Manual says:

A team building a government service needs to have people with the following roles or skills either in the team or available to it: product manager; service owner; delivery manager; user researcher; content designer; developer

Of the six roles in the minimum Service Manual team, only one (listed last) can architect a digital system and implement it in code. The focus on these professions might be necessary for products with a large public surface, but it’s inadequate for building internal technologies. It also leads to teams that are too big. Large teams are harder to coordinate, can be affected by social loafing, and have to contend with more noise. Too often, the temptation is to make your team even larger to deal with some of that internally generated noise.

T-shaped skills enable leaner teams. It’s not uncommon in tech start-ups to have two types of roles in a team: engineers (responsible for software, data and infrastructure) and designers (responsible for user research, graphic design and interface design). Engineers and designers manage the product and delivery jointly.

This will be too extreme for larger organisations. However, in the right circumstances, giving greater trust, autonomy and decision-making power to technical staff can improve the quality and efficiency of delivery. No matter how competent and curious a Product Manager you are, if you don’t have experience personally building technology, you can’t think at the right level of abstraction and you’re missing a big piece of the puzzle. For almost a year, I took on a management role and didn’t write a single line of code. I now realise that my ability to make good product decisions around novel technologies was severely affected. I won’t be making this mistake again in my career.

In terms of specialisation, meanwhile, the current approach in government feels extreme in the opposite direction. At the time of writing, there are 51 Digital and Data roles. Many of them have significant overlap. Navigating their differences is overwhelming for non-technical colleagues, and often related professions are deployed interchangeably anyway.

Defining many types of roles is OK when you can pick and choose from them flexibly depending on the specific preferences of your organisation, and on the responsibilities of individual teams. Things break down when teams are formed too prescriptively. For example, content designers or technical writers certainly have the skills to improve internal technical documentation. But deploying them on a product-by-product basis as gatekeepers of all written text would not be a good use of public money. It would also add friction that decreases the velocity of teams, and discourage a sense of collective team ownership over a product as a whole (another typical tenet of tech start-ups).

Just build a thing

Much has been written about the four “Agile project phases” (discovery, alpha, beta, live) outlined by the Service Manual. There are some reasonable arguments in defence of this model.

There is also much controversy. For example, Ben Whitfield-Heap, a former product manager at NHS England, wrote “It’s time to phase out – a rethink of the traditional service standard phases”:

Even though these phases seem to follow a logical order, I started to learn that sticking to them too rigidly stopped the team I was working in from continually evolving and realising value. […] Even for small, low-risk bets, we’re wrung through a time-boxed 10 to 16-week Discovery and Alpha period until we’re allowed to put our chips on the table.

Some (not all) internal technologies are low-risk, meaning it’s cheaper to try and build a thing, see it fail and learn valuable lessons along the way (in the true spirit of Agile[1]).

Too often I’ve heard the argument that you shouldn’t build during a discovery, as that is marrying the team to a solution before the landscape has been properly explored and user needs understood (more on user needs later). I think that is based on a misunderstanding. Building small prototypes is the way engineers explore ideas and possibilities. It’s a good thing to do, as long as these prototypes are understood for what they are: a mechanism for thinking and learning, not sunk development cost. The increasing capabilities of Large Language Models have further lowered the barrier to producing ephemeral code.

Data as an afterthought

As a data practitioner, I have a particular pet peeve here: there is no mention of data models, data validation at source or data reuse in any of GDS’ core technical resources. This affects how teams are formed when a product is designed, and ultimately the quality of the analytical layers you can build on top of your products. Meanwhile, in countries like Estonia it is illegal to store the same government data in multiple places[2], so all infrastructure must be built around interoperability. One can only dream.
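For illustration, validation at source can be as simple as declaring a shared data model that every collection point must satisfy. Here’s a minimal sketch using pydantic; the field names and rules are invented:

    from datetime import date
    from pydantic import BaseModel, Field, field_validator

    # An invented model for business records: validate at the point of
    # collection, so every downstream consumer inherits the same guarantees.
    class BusinessRecord(BaseModel):
        company_number: str = Field(pattern=r"^[A-Z0-9]{8}$")
        name: str
        incorporated_on: date

        @field_validator("name")
        @classmethod
        def normalise_name(cls, value: str) -> str:
            return value.strip().upper()

    # Bad data fails loudly here, not three systems downstream.
    record = BusinessRecord(
        company_number="01234567", name=" Acme Ltd ", incorporated_on=date(1999, 1, 1)
    )
    print(record.name)  # "ACME LTD"

The point is less the library than the principle: the model is defined once and enforced wherever data enters the system.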

The only tool I know of that is popular across the UK government and wasn’t built centrally is Splink, created by the Ministry of Justice. It lets you implement probabilistic models for deduplicating and linking data records. From the GOV.UK article:

The MoJ and its agencies have numerous administrative data systems. These systems were developed at different times for different purposes, and there is no consistent person identifier that is used across systems.

This results in challenges when analysts and researchers need to perform analysis that spans multiple systems, such as understanding journeys through the justice system, or repeat users of justice services.

Matchbox was built to interoperate with Splink: first you build probabilistic models for single datasets and pairs of datasets with Splink, then you combine many of these models and evaluate them using Matchbox. It’s not by chance that multiple UK government departments are investing in data matching technologies. The problems we’re trying to address are endemic, and in part linked to cultural characteristics of this country that I consider positives, at least in their intentions.
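To give a flavour of what “probabilistic model” means here: record pairs are scored by summing evidence from field-by-field comparisons. Below is a hand-rolled sketch of the Fellegi-Sunter scoring at the heart of tools like Splink (which adds blocking, parameter estimation and much more); the fields and m/u probabilities are invented:

    import math

    # For each field: m = P(fields agree | records truly match),
    #                 u = P(fields agree | records don't match).
    # These values are invented; Splink estimates them from the data.
    FIELD_PARAMS = {
        "company_name": {"m": 0.95, "u": 0.01},
        "postcode": {"m": 0.90, "u": 0.05},
    }

    def match_weight(a: dict, b: dict) -> float:
        """Sum of log2 Bayes factors; higher means more likely a match."""
        weight = 0.0
        for field, p in FIELD_PARAMS.items():
            if a.get(field) == b.get(field):
                weight += math.log2(p["m"] / p["u"])  # agreement: evidence for
            else:
                weight += math.log2((1 - p["m"]) / (1 - p["u"]))  # evidence against
        return weight

    a = {"company_name": "ACME LTD", "postcode": "SW1A 1AA"}
    b = {"company_name": "ACME LTD", "postcode": "SW1A 2AA"}
    print(match_weight(a, b))  # strong name evidence outweighs the postcode mismatch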

When you engage with the state in Italy (where I was born), you are often asked questions that seem wholly irrelevant to the task at hand and, frankly, feel like administrative overreach. On the other hand, the UK is a bit of an outlier in not having a national ID system. Some may remember the gripping story of how a National Identity Register was set up and subsequently destroyed, largely due to concerns about its implications for civil liberties.

But the limitations of not having consistent identifiers are too glaring, and addressing them is the driving force behind the new GOV.UK One Login. I look forward to a day when products like the one I’m building will be largely irrelevant. Until then, all we can do is increase the level of sophistication of our data matching approaches, with the right governance and ethical guardrails in place. Ultimately, protecting civil rights by reducing the effectiveness of the state is counterproductive. It’s much better to implement robust systems of accountability.

Nouns and verbs

The Service Manual has fairly strong opinions on how to name your service. For example, good service names:

describe a task, not a technology; do not need to change when policy or technology changes; are verbs not nouns

Examples provided include “Register to vote” and “Get help with court fees”. Personally, I’ve never thought of those as names, but as succinct descriptions of user needs. That’s fine - from the perspective of citizens, services needn’t have names, but purposes. Users don’t have to think of “Register to vote” as a distinct digital entity, they have to experience it as a journey addressing their need.

That said, building services that aren’t named like paper forms (with confounding combinations of digits and letters), or whose names are plucked from 19th-century literature or the Norse pantheon (as privately-educated senior civil servants once favoured) is a great gain for the public.

Outside of services, using verbs and divorcing names from technology is unnecessarily restrictive. A different GDS page gives some more general advice:

Your product name should be self-descriptive […]

These GDS product names clearly communicate their purpose: Travel Advice Publisher; Manuals Publisher; Smart Answers

These GDS product names are ambiguous and possibly confusing: Panopticon; Whitehall; Maslow; Magna Charta

A lack of self-descriptive names can’t be much of a problem for civil servants, who after all use a plethora of commercial software with the most bizarre names. How does “Slack” relate to messaging, “Excel” to spreadsheets, or “Jira” to issue tracking? Sometimes, distinct names are needed to distinguish separate entities that operate in a similar domain. In that case, plainer names are more confusing and ambiguous than the alternative.

From Thesaurus.com:

Common nouns refer to generic things while proper nouns refer to specific things. For example, the noun country is a common noun because it refers to a general, non-specific place. On the other hand, the noun Spain is a proper noun because it refers to a specific country located in Europe.

“Manuals Publisher” has the feel of a common noun, which is not what it wants to be. It can only serve as a proper noun when context restricts its meaning, as in GOV.UK Manuals Publisher. When naming Matchbox, we couldn’t apply GOV.UK branding, so that route to specificity wasn’t open to us.

In summary, we decided our component shouldn’t be a verb. It should be a noun, and an interesting one at that. Obscure service names put distance between the government and the public. Boring component names create ambiguity in an organisation and fail to be memorable for colleagues and leaders.

Elusive user needs

From the Service Manual page on naming services:

If you’re having problems naming your service, it might be because you have not scoped your service correctly. In this case, you should review your user needs.

Like service names, service design revolves around user needs. Other parts of digital infrastructure are also informed by user needs, which they must indirectly enable. However, organisational and technical needs sometimes play a bigger role.

Unlike for data, a significant portion of the Service Manual focusses on user research. “Understand users and their needs” is the first rule of the Service Standard:

Focusing on the user and the problem they’re trying to solve - rather than a particular solution - often means that you learn unexpected things about their needs.

The real problem might not be the one you originally thought needed solving. Testing your assumptions early and often reduces the risk of building the wrong thing.

Pope quotes American computer scientist Rob Kling, who in the 1970s said that

Programmers would often […] impose systems on users. This occurred when they were explicitly insulated from close contact with computer users and when they viewed themselves as change agents “reforming” an inefficient organisation

User-centred design is an essential framework, whose utility is not questioned by anyone I’ve met in government. It is, however, useful to think about when it is insufficient on its own. Pope writes:

User-centred design assumed that users didn’t have incompatible interests. This misalignment of needs is not uncommon in public services, and Universal Credit, as a policy, has many of them. […] Attempting to design for a system that intentionally created burdens for the public using user needs didn’t really work - at least not on its own. Rather than user needs, the Universal Credit digital account was created around the management of the fluctuating administrative burdens the service placed on users.

The misalignment of incentives can also occur when designing internal tools. There is value in trusting individuals to know how to do their job, and building products around what they’re expressing as needs. And there is a place for organisations to steer processes and behaviours in a way that contrasts with established practices and sensibilities.

In the space of technological change, many worthy objectives are difficult to express in terms of user needs. Reducing technical debt, improving the interoperability of systems or reducing reliance on an external supplier all benefit users through multiple levels of indirection.

And yet, at the start of our project, we were frequently asked to frame the value of Matchbox in terms of user needs. This was not an easy feat. Our claim that “users need to match company data reliably and easily” was met with the question “OK, but what do they need to match the data for?”. That is a bit like discussing the value of printers in an office in the 80s and asking “Sure, but what kind of things will need to be printed?”, to which the answer is: all sorts. You can of course start from some well-understood, representative use cases, but that is inevitably reductive, partly because the absence of a capability suppresses the spontaneous emergence of use cases you’re not yet aware of. At some point, you need to rely on the judgement of your technical experts to determine what is a key enabling technology. Otherwise, you risk suppressing whole classes of activity, indirectly harming the state’s ability to cater to user needs, implement effective policy, control costs, and respond to change and crisis.

Charting the future

I am excited about what the future of Digital and Data in government holds. As we think of less burdensome and more effective ways to benefit the public, I hope we also learn to diversify our approaches to digital technology in government. This means:

  1. incentives to experiment at the digital periphery of government, and to turn point solutions into reusable components
  2. a departure from deciding about priorities, plans, resourcing, design, and naming as if everything were a public-facing service
[1] From Wikipedia: ‘Through incremental development, products have room to “fail often and early” throughout each iterative phase instead of drastically on a final release date’.

[2] From Estonia’s Public Information Act: “Establishment of separate databases for the collection of the same data is prohibited.”