The Digital Service Standard sets the benchmark to ensure that the federal government’s websites and online applications are simpler, faster and easier to use.

The standard has been released as an alpha version for comment and is based on the UK Government's Digital by Default Service Standard.

The standard will be supported by a revamped Digital Service Design Guide.

The 16 criteria and how Briarbird supports the standard

There are 16 criteria in the standard, which I’ve listed below, along with the ways in which Briarbird provides support.

1. Understand user needs, conduct research to develop a deep knowledge of who the service users are and what that means for digital and assisted digital service design

We offer the full range of customer research services including interviews, usability testing, co-design workshops, surveys and so on.

2. Establish a sustainable multi-disciplinary team that can design, build, operate and iterate the service, led by an experienced service manager

We provide project team mentoring and help web teams refine their approach and processes.

3. Adopt a user-centred design approach

We use a user-centred approach in all our planning, strategy and specification work, drawing on direct research wherever possible. We also ensure a strong user focus in all our content writing and editing.

4. Establish benchmarks to measure user satisfaction, digital take-up, completion rates and cost per transaction, and report performance publicly

We offer these services, typically as part of developing online strategies.

5. Evaluate what data, tools and systems will be used to build, host, operate and measure the service and how to adopt, adapt or procure them

We don’t offer specialised services in this area.

6. Assess what personal user data and information the service will be providing, using or storing and put in place appropriate measures to address security risks, legal responsibilities and privacy considerations

We don’t offer specialised services in this area.

7. Build the service using agile, iterative and user-centred methods

We provide project managers who utilise these methods.

8. Build the service with common look, feel, tone and function that meets the needs of users

We provide prototyping and design mock-ups, and we are often involved in reworking existing families of websites so that they share a common look and feel.

9. Use web service APIs, open standards and common government solutions where possible and make all new source code open and reusable where appropriate

We don’t offer specialised services in this area.

10. Test the service on all common browsers and devices, using dummy accounts and selecting representative samples of users

We don’t offer specialised services in this area.

11. Integrate the service with any non-digital interactions

We provide the strategic plan and framework that identifies how digital and non-digital interact.

12. Put appropriate assisted digital support in place that's aimed towards those who genuinely need it

We don’t offer services in this area.

13. Consolidate or phase out existing alternative channels where appropriate

We provide the strategic plan and framework that identifies how channels should be aligned or rationalised.

14. Undertake ongoing user research and usability testing to continuously inform service improvement

We offer the full range of customer research services including interviews, usability testing, co-design workshops, surveys and so on.

15. Use data and analytics tools to collect and report performance data, informing continual service improvements

We provide analytical services that allow evidence-based service planning and improvements.

16. Provide ongoing assurance, supported by analytics, that the service is simple and intuitive enough that users succeed first time unaided

We provide analytical services that allow services to be monitored against goals or benchmarks to ensure that users succeed first time unaided.
