
All green! Why passing the government Service Standard assessment matters more than ever
Our government Service Standard assessment results are in — and I’m thrilled to say they’re all green!
Our team, working hand-in-glove with our central government partner The National Archives, passed the assessment with a ‘green’ rating across all 14 points of the Service Standard.
For any team, this is a fantastic achievement. It’s the culmination of months of hard work, user research, and true collaboration.
More than a test: A partnership for better services
From the outside, a service assessment can seem like a daunting exam.
From my experience at the Government Digital Service, I’ve heard from both civil servants and fellow suppliers who describe it as a rigorous and challenging experience. It’s a formal check-in with a panel of cross-government experts who spend a day digging into every aspect of your service — from the technology choices and security to, most importantly, your team’s deep understanding of user needs.
But the process is fundamentally collaborative. It’s there to “help you make the service better.” And that was exactly our experience.
The assessment wasn’t a finish line we had to cross alone but a checkpoint we reached together with our client. The questions from the assessors were tough but fair, pushing us to provide evidence for our decisions and ensuring the user remained at the heart of everything.
For a supplier, this level of scrutiny is invaluable. It fosters a discipline that goes beyond just delivering software. It makes us prove we’ve understood the problem, tested our assumptions with real people, and built something that is not only functional but also accessible, equitable, and secure. It’s a shared benchmark of quality that puts the citizen first.
How do you pass a government Service Standard assessment?
By taking the time to fully understand the problem and working closely with The National Archives, we were able to achieve the right outcome.
Here are some reflections to help other government teams and suppliers pass their service assessments.
- There was a clear problem definition from the start. Working with the service owner, we agreed the problem users were facing and aligned it with the service vision
- As a team, we kept the focus on outcomes rather than outputs, working with the civil service team to adopt the same approach
- We balanced pace with quality, ensuring prototypes were good enough for learning but not over-engineered
- The team fostered trust and openness with the client and with the assessment panel, ensuring the process remained collaborative
Read: Service assessments — A welcome update for government, and for suppliers
Positive feedback
We were thrilled by the positive comments we received. Here’s a quote from a director at The National Archives.
“Special shoutout and kudos for delivering an outstanding walkthrough of design prototypes. The feedback from the multidisciplinary team was exceptional, which included:
- ‘The way you’ve built up your knowledge and your almost ‘magical’ understanding of how we work is incredibly impressive.’
- ‘Your insights and leadership in this space have been truly inspiring to watch.’
- ‘This has been incredibly helpful and it’s great to see what we’ll be getting back.’”
Director for Public Records Access and Government Services at The National Archives
Case study: Secure and cost-effective archiving of court judgments with serverless technology
Building for the future: The Service Standard, disruptive tech and AI
Achieving the standard today is one thing. But maintaining it for the future is another, particularly with the rise of disruptive technologies like artificial intelligence.
A key question for government and suppliers alike is: how do we innovate responsibly?
The Service Standard provides an excellent framework for doing just that.
Principles like “Understand users and their needs” and “Make the service simple to use” are the perfect antidote to technology-led solutions. They force us to ask the right questions before we even begin to think about AI:
- What user problem are we solving?
- Would an AI-driven feature genuinely make this service simpler, faster, and more accessible?
- Or would it add unnecessary complexity?
The standard’s focus on “making the service safe and secure” and having a clear plan for being “supported” provides the guardrails to experiment with AI in a controlled, ethical, and auditable way.
It encourages us to build services that are not only intelligent but also robust and trustworthy.
We look forward to working with our government partners to explore how emerging technologies can deliver even better outcomes for citizens, using the Service Standard as our guide for responsible innovation.
Passing this assessment is a brilliant validation of our team’s approach and our partnership with the government. We’re thrilled with the result, and even more excited to continue the work of building great public services.