
Census 2022 – meeting the Digital Standard

February 14, 2024 | Categories: Digital Assurance Office, Digital Scotland, Technology Assurance Framework

Guest blog by Laura Johnstone, Continuous Improvement team, Digital Assurance Office.

The Digital Assurance Office have been working with the National Records of Scotland to capture and share some of their experiences from the delivery of the Census Programme. This is the third in a series of case studies. You can read the earlier case studies, and our other insights, on our blog.

For over 200 years Scotland has relied on a ten-year Census to underpin national and local decision-making. The 2022 Census was the first predominantly digital Census collection. The Census is a long-term programme and is undertaken by the National Records of Scotland.



Context

The Census Programme was assessed against the Digital First Service Standard (DFSS, referred to here as D1) in 2018. The DFSS was refreshed to become the Digital Scotland Service Standard in 2021.

The DFSS had not been embedded into the Census Programme at the outset. This led to early assessments making a high number of recommendations around end-to-end service design and the adoption of user-centred approaches.

As the programme progressed, significant effort was made to address the D1 requirements. The programme took as many opportunities as possible in the remaining 3 years to make the user experience as good as it could be.

Key activities

• Following early recommendations from the first D1 assessment, a Digital Assurance Manager was embedded within the programme to ensure that:

1. recommendations from the assessment were being managed through a documented action plan
2. D1 activities were scheduled and managed appropriately throughout the programme lifecycle
3. teams were prepared for the assessment process, with rehearsals scheduled ahead of each assessment
4. good digital delivery practices were adopted, with continuous work across the programme to improve digital maturity and understanding of D1.

• A positive working relationship between the NRS Digital Assurance Manager, Digital Assurance Office and Digital Standard Assessment Team helped to plan, design and respond to assessments effectively, taking into account project readiness, resource availability, other assurance activity and interdependencies.

• Being clear on the purpose, scope, ambition and architecture of the digital services within the Census Programme was critical for shaping D1 assessments. The first D1 assessment was for the Complete Census Questionnaire. The Outputs project was subsequently separated from Collect, which helped to clarify the nature of the different services, each with its own users and audiences.

• Being explicitly clear on the scope of each service enabled the Census programme to structure, manage and get maximum value from the subsequent D1 assessments.

• Strong sponsorship from the Senior Responsible Owner (SRO) and NRS Executive Management Board ensured that, where additional resource was needed to respond to the D1 recommendations, support was in place to act and implement quickly.

Key reflections and learning points

1. Understand the Digital Standard. A solid understanding of the Digital Standard as a way to design and deliver a service needs to be embedded from the outset of the project, at all levels in the organisation. Encourage wide understanding and training – it is everyone’s responsibility to meet the Standard.

2. Critically consider the multi-disciplinary, cross-functional team you will need to deliver a digital service and how you will resource it.

3. Have a clear scope and understanding of the digital service you are delivering. Align the programme to deliver around the service and enable collaboration between teams.

4. There needs to be a shared understanding of the requirements, standards, professions and specialisms needed to deliver an end-to-end government digital service. Each profession should be respected and the boundaries between roles understood. Seek specialist support and expertise as required.

5. Value digital assurance. It has been widely acknowledged by the delivery team that the assessment process helped to keep the team focussed. It ensured delivery in accordance with good practice and acted to unify the programme team, giving renewed focus on gaps, issues and actions to be implemented ahead of future assessments.

6. Maintain regular and pro-active engagement with the Digital Assurance Office to ensure that assurance assessments are designed and timed for most impact.

7. Recruit experienced practitioners who have public sector delivery experience using Waterfall, Agile and Hybrid methodologies and therefore know how and when to adapt and change to meet the needs of the programme.

Find out more

The Technology Assurance Framework (TAF) is designed to support programmes and projects to deliver successful outcomes and ensure that the lessons learned from previous experience are reflected and embedded in future practice.

The Digital Assurance Office are working with organisations who have had assurance through the TAF to share insights which might help others deliver digital projects. If you want to get involved – or have thoughts on what insights would be helpful to share – contact us at DigitalAssurance@gov.scot.

For more information about this case study, contact censuscorrespondence@nrscotland.gov.uk.

For advice on designing and delivering high-quality digital services, visit the Scottish Government Digital Support Hub (DSH).

The Scottish Digital Academy (https://digitalacademy.gov.scot) is the public sector centre of expertise for digital capability and can provide information, advice and guidance on developing digital, data and technology skills to support transformation.

For further information and signposting to advice and support on programme and project management contact the Programme and Project Management Centre for Expertise.

The Scottish Government programme and project management principles are available and apply to any project of any size.



Comments

  • Bob Miller says:

    All of which is sound upstream advice, but the bottom line is the Census had a very poor take up rate compared to historic norms and the rest of the UK, so what is being done to learn lessons from that (particularly by going out to those who did not complete it and ask why), and to ensure these are applied?

    • deborahamzil says:

      National Records of Scotland (NRS) has committed to publishing an evaluation report to parliament upon conclusion of the census programme in late 2024. The report will cover the full lifecycle of the programme, reflecting on what worked well and the required lessons learned.

      Posted on behalf of National Records of Scotland.
