Department of Internal Affairs

A. Problem Analysis

 1. What was the problem before the implementation of the initiative?
Historically the independence of State Sector agencies has led to a lack of collaboration across government. Content across the government domain is almost entirely written and delivered based on the structure of government, not on the needs of the user. Content is frequently duplicated and finding the authoritative material can be difficult. Content is also not optimised for people who don't know what to look for. This is not just a problem in New Zealand, but in other developed countries as well. The problem has been described clearly in the OASIS (Organization for the Advancement of Structured Information Standards) standard that informs our approach to the redevelopment:

“Most governments are structured around a set of vertically-integrated silos or stovepipes - agencies, departments, ministries. By and large, it is these silos which the Governments of developed countries have spent billions of dollars "e-enabling" since the 1990s.

“However, this is an ICT investment strategy which is fundamentally not customer-focused, because the needs of citizens, businesses and others cut across the organisational structures and hierarchies of government. It has inevitably resulted in low levels of take-up for e-services. Governments in developed countries are now grappling with the legacy of thousands of fragmented, silo-focused websites…”

The website was a directory of all of New Zealand government. Its primary purpose was to help people find New Zealand government information and services without knowing the structure of government. It was first built in the 1990s. Since then there have been two major site redevelopments, tied to rebrandings with domain name changes. The site was last rebuilt in 2008 with very few improvements or promotion. It was mostly a 'links farm' – providing lists of links to external websites, without providing any real context for the user.
While the site had an average of 59,000 unique visits per month, analysis of user engagement found that it had high rates of task abandonment. More than two-thirds of users failed to click on search results or a content link. As the purpose of the site was to be a referrer (directing people to the right place in government), this is strong evidence that the site failed to meet the needs of the user. A revised version of a directory website for New Zealanders needing government information was needed. We frequently hear that people ‘don’t know where to start when it comes to government’, and many people accessing government information struggle to use a computer, let alone understand what they need. We needed to make it easy for them to find out.

B. Strategic Approach

 2. What was the solution?
The initiative is an update and rebrand of the public online face of the New Zealand government. It places an emphasis on delivering services and information in a user-centred manner; sharing solutions, reusing common technology, and delivering better services for less cost. The site has been designed for New Zealanders – our public, or anyone looking for New Zealand Government information. The goal is to make it easier to interact with government online. We are committed to speaking the language of our users, not the language of government. All of our content is written in plain English and organised around user needs, not government structure.

Main objectives:
Vision: making it easy for people to find, access and use government information online.
Goal 1: Design and deliver information that meets user needs.
Goal 2: Be an authoritative source of cross-government information.
Goal 3: Promote open and transparent government.

The Digital Engagement team within the Department of Internal Affairs (DIA) proposed the solution based on user research and a review of how other jurisdictions were solving the common problem of making it easier for people to find, access and use government information. The site was launched by the Prime Minister and the Minister of Internal Affairs on 29 July 2014. It consists of 17 information hubs taking content from 44 government organisations and integrating it around the topic areas identified by the public. All the information has been written from the user’s perspective in plain language, ensuring accessibility for the broadest audience, including people with disabilities. The information has been tested with users in a number of ways, including via a public beta website launched in August 2013. The site is led by user feedback, focussing on the users’ needs and experience.
Feedback received from the public beta site was incorporated into the design of the production site. The initiative delivers directly to the reforms and strategies below by providing an easy-to-use source of online government information, designed for the customer, not the structure of government. The New Zealand Government launched its Better Public Service (BPS) reform programme in February 2012. The essence of the reforms is to deliver a public sector that is “more innovative, efficient and focused on delivering what New Zealanders really want and expect” with clear expectations that “… getting information from government, should be easy.” The initiative delivers against a number of the Better Public Service goals and priorities including:
• The Chief Executive of the DIA is the Government Chief Information Officer (GCIO), responsible for reforming the use of ICT (Information and Communications Technologies) across the state sector. The all-of-government (AoG) ICT Strategy and Action Plan includes Action 1.1: “Redevelop as the primary entry point for citizens to obtain information, including a mobile enabled version”.
• The Government Information Services (GIS) Group within the DIA provides advisory services and products to support the information service goals for AoG, the GCIO and the DIA. GIS is the trusted advisor and steward for the New Zealand government online domain. It leads, influences and champions the improvement of digital information capability within government, its partners, and within DIA. The Rethink Online strategy includes four principles, including “Design for people’s needs: Cluster online information and services around shared topics and audiences”.
• DIA leads one of the ten BPS Results, with a target to increase the number of digital transactions for New Zealanders and make it easier for people to transact with government online.

 3. How did the initiative solve the problem and improve people’s lives?
The project represents the first transformational change in how government delivers services online to New Zealanders. It took a private sector approach to design and development – user-centred, agile, iterative development – delivering effective online products that customers use. The project applies those techniques to government information delivery online. We developed a live beta (test) website and evolved that into a fully fact-checked, production-ready website. It is innovative because it does not rely on government employees to decide how users want information. We brought together content from 44 different government organisations and created topics that are relevant to our users, making interacting with government easier. The project team explored overseas trends, completed research focussing on user needs, information architecture and user experience, and applied that knowledge to make the information easy to find, access and use. The information currently sits on a website which has been developed using open source code. Interested parties, including government agencies, can reuse the code and build it into whatever they need. All the contact information in the government directory can also be accessed through a public API (Application Programme Interface). (An API is further explained in answer 7.)

C. Execution and Implementation

 4. In which ways is the initiative creative and innovative?
Elements of the action plan

To start, we looked at what other governments were doing online. We found good ideas but didn’t know what would work in New Zealand, so we asked people what they thought. We ran focus groups and user testing to find out how people wanted government information online. In order to develop the site, the GIS team at Internal Affairs proposed taking a phased approach and working in an iterative manner. For our approach we decided to:
1. Base decisions on user needs
2. Build an evidence base
3. Start small and iterate.

We started with an alpha site. This was a wire-frame with basic content that we could test with users to see if our ideas worked. Then we launched our beta site, which was used to gather feedback and for additional testing. Content was grouped in information hubs with labels like ‘driving and transport’ and ‘travel and immigration’. Nothing was grouped by agency; that’s not how the user thinks. The content was based around common tasks which link out to government sites. Agencies fact checked the content before the site went live.

Main activities/chronology

March 2013 – Alpha website: GIS built an 'alpha' site in order to test information design concepts and content models. The alpha site was a limited release site, and it was used as an early prototype. It enabled DIA to demonstrate and test early concepts with other government agencies and user groups.

August 2013 – May 2014 – Beta website, actively tested and iterated: A beta website was launched in August 2013 and tested and refined during this period. This was a publicly available website which allowed users to provide feedback, report problems and contribute to improvements. The site was used as a basis for formal user testing, and feedback from testing and the public was folded into the iterative design process. It delivered an improved website based on user feedback and provided complete, fact-checked content that was supported and managed by the project team.
The beta website provided ‘thin content’ – summary information with links out to agency sites. The ‘thin content’ provides broad context for the information and links out to existing websites. It is more extensive than the ‘links farm’ content that was available on the previous site, and it is accessed through improved information architecture. For example, if the user is looking for information on what they have to do when they have a baby, the ‘thin content’ provides a summary of the things the user must do, and then provides links to where they can do those things.

29 July 2014 – Live website: the site officially went live. This phase provides an online all-of-government presence for citizen-focussed content with an operating model for ongoing development. The website provides context to inform the user’s journey, which may require them to interact with multiple agencies. The processes enable the creation and maintenance of ‘thick content’ in addition to the current ‘thin content’ as part of business as usual. It replaces the government-focussed and often duplicated content spread across many agencies. It also allows agencies to work collaboratively on shared content on the site. Since the launch, the focus has been on continual improvement of the product: enriching content, making it easier for users to find information, and eventually gaining a much more visible online presence.

 5. Who implemented the initiative and what is the size of the population affected by this initiative?
The Digital Engagement team led this development project. The team sits within GIS at the Department of Internal Affairs and includes specialists in government online delivery. Key civil servants on the project include Digital Engagement Manager Laura Sommer, Product Owner Jared Gulian, Information Architect Nathan Wall, and content designers including Victoria Wray, Gail Connelly and Katie Johnston. We also had secondees from the Ministry for Social Development, and developers, business analysts and project management resources from DIA’s Technology Services Group, including Karen Hansen, Nicola Duncan and James Goodman. The support and advocacy of DIA’s Chief Executive, Colin MacDonald, has been instrumental in the delivery of this initiative.

The UK’s GOV.UK team had made their entire code open source, and we adapted the front-end templates – the basic design elements of the site – rather than starting from scratch. One of the great things the UK did was to use ‘responsive design’, so we didn’t have to build a separate mobile site – the code responds to the device. The content was fact-checked by 44 government organisations. The contact details were fact-checked by more than 400 government organisations. SilverStripe (private sector) provides the Common Web Platform (CWP) – the technology required to support the preferred operating model is dependent on the CWP. Optimal Experience (private sector) conducted the user testing and research on our behalf. The Citizens Advice Bureau helped us by providing their data on high-demand services to help us prioritise the information we provide. An ‘International Working Group on Digital’ was also set up to meet regularly and share learnings across jurisdictions. Talking with other governments, we have found that the issues we face are global. It’s been valuable to learn what they have implemented, what’s worked and what the ‘lessons learned’ are.
 6. How was the strategy implemented and what resources were mobilized?
Approved in stages, the development of the site was managed as a complete project, fully funded by the Department of Internal Affairs. A capex budget of $797,839 was allocated for the beta stage in FY12/13 and $1.256m for the move from beta to production in FY13/14. The budget for both stages included all technical and human resource costs as well as the specialist areas (outsourced) of research and user testing. The costs cover 3 main categories:
1. Usability – user testing, research, design elements and customer focus.
2. Content – developing a content strategy and structure, finding relevant content from existing agency websites, fact checking and creating ‘user journeys’ showing how to get something from government.
3. Tech – hosting, infrastructure, development, security and accessibility testing.

Roles within the project team were filled by individuals seconded from the Digital Engagement Team, DIA’s Technology Shared Services group and other public sector agencies. As and when required throughout the project, additional resource was brought in through specialist contractors. Deliverables such as user research and testing were outsourced to specialist private sector organisations, and international colleagues’ experiences were leveraged wherever possible.

The system settings in New Zealand are set up to support individual agencies meeting their objectives. Each agency has its own appropriation, and any cross-agency work relies on agencies voluntarily contributing to the initiative. Over the past few years, agencies have been forced to manage with lower baselines and, with the introduction of the Better Public Service reforms, more and more cross-agency initiatives are asking for contributions. Products like this, where the benefits fall to the public and not necessarily government agencies, do not fit with this funding model. The Central Agencies have recognised that, to deliver the Better Public Service reforms, more sophisticated models are required.
While these models are being developed, funding for the current financial year was required to maintain the momentum of the initiative. A one-off Crown fund was established and an application was made against this for $2.98m to operate and continue to develop the site. This was approved by Ministers in August 2014. This significant investment shows the strategic importance of this initiative and the high expectations of the difference that it will make to the public.

 7. Who were the stakeholders involved in the design of the initiative and in its implementation?
1. User-led
The development of the site was led by user research, testing and feedback. An initial round of research was completed to explore customer expectations, focussing on user needs, information architecture and customer experience. These insights were folded into the design of the beta site, and the findings of two subsequent rounds of user testing were validated and analysed to make improvements and contribute to the subsequent production site. This has been a key success factor for the initiative and is the point of difference with the previous government portal and the majority of agency sites.

2. Content
Content was written based on existing text from agency websites and organised by the information hubs identified through user testing. The text was re-written to meet plain English standards, merged from multiple sources to provide a comprehensive view of the topic area rather than an agency-focused view, and gaps were identified. In some cases, content that did not exist in one place was created.

3. Directory
The new Government Directory has information about more than 400 government organisations along with their contact details. This is one of the aspects of the product whose customers include other government organisations, as it is the only full directory available.

4. Application Programme Interface (API)
The site uses a public API which is available for anyone who wants to reuse content from the site. An API is a set of instructions and standards that allows communication from one site to another. Content from the site can be automatically extracted in a variety of formats and republished in other locations. This ensures republished information is automatically updated when something changes at the source. The benefits of using APIs are transparency and openness: they save time, ensure information is correct, and allow information to be shared through many sources.

5. Transparency and communication
We have kept a running commentary on the project through the Web Toolkit.
Our Minister and senior staff have been briefed at key points, we have posted regularly on a couple of Yammer sites, provided presentations to key groups in government, offered information sessions to web teams, and collaborated with other teams across government and through our own intranet. We have also published a number of blog posts about what we’ve learned, the tools we use, and the approach we’ve taken to writing content, encouraging other agencies to think about their users’ needs.
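The public API described in point 4 above lets anyone extract directory content and republish it elsewhere. As a minimal sketch of how a consumer might work with such an API – the JSON schema, field names and phone numbers below are illustrative assumptions, not the actual schema or data of the production service – a republisher could parse a directory-style response and pull out contact details:

```python
import json

# Hypothetical sample of the kind of JSON a government directory API
# might return. Field names and values are illustrative only, not the
# actual schema or data of the production API.
SAMPLE_RESPONSE = """
{
  "organisations": [
    {"name": "Department of Internal Affairs",
     "phone": "+64 4 000 0000",
     "website": "https://www.dia.govt.nz"},
    {"name": "Example Ministry",
     "phone": "+64 4 000 0001",
     "website": "https://example.govt.nz"}
  ]
}
"""

def list_contacts(payload: str) -> list:
    """Extract the name and phone number of each organisation so the
    directory can be republished elsewhere (e.g. on a library kiosk)."""
    data = json.loads(payload)
    return [{"name": org["name"], "phone": org["phone"]}
            for org in data["organisations"]]

contacts = list_contacts(SAMPLE_RESPONSE)
```

Because the republisher calls the API on each request rather than copying the content, any change at the source flows through automatically – the freshness and transparency benefit described above.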

 8. What were the most successful outputs and why was the initiative effective?
Planning and development of the initiative utilised the Agile development approach; however, the department’s Programme Management Office requires projects to follow its processes and documentation requirements. Following these two methods caused the team some challenges (more in question 9); however, it ensured that the team was able to progress at pace with the support and confidence of key parts of the organisation. Each stage of the initiative was divided into separate projects with a business owner as the project executive and a project board. Combining departmental processes and the Agile methodology, the following documents were developed and approved for each of these stages, and included strict governance processes to monitor and evaluate each stage’s development and implementation activities:
• Business Case
• Product Definition
• Solution Strategic Approach and Solution Architecture
• Risk Assessment and full risk and issues logs
• System Design Specification
• Security Risk Assessment
• Environmental Build Document
• Technical User Guide
• Release Notes
• Install Guide
• Test Reports
• Service Desk Handover Guide
• Service Level Agreement
• Project Plan
• Product Development Plan
• Product Delivery Plan
• Communications Plan
• Security Penetration Testing
• Security Vulnerability Report
• Benefits Realisation Plan
• Status Reports
• Exception Reports
• End Stage Reports
• External review (independent)
• Code Quality Review
• Solution Acceptance Certificate
• System Security Certificate
• Usability Report (independent)
• Engagement Plan
• Research Reports
• Sprint Documentation

Throughout the project stages, monthly project board meetings were held where, utilising the documents and processes noted above, all progress was discussed and any actions to address delays, risks or issues were documented and actioned.

 9. What were the main obstacles encountered and how were they overcome?
The delivery of the initiative encountered some significant obstacles, some overcome and some of which remain work in progress post implementation. The main ones were:

1. Using Agile development methodologies
The Agile Manifesto is based on 4 values that fundamentally differentiate it from the traditional approaches that have such high failure rates:
“Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.”
The difficulty lay in using Agile methodologies whilst delivering the heavy processes, documents and plans of the government’s well-embedded Waterfall approach. The two processes operated in parallel, often duplicating resources and effort. An important outcome has been building an understanding and comfort level within the department that Agile methodologies provide sufficient governance and monitoring, with progress actually more visible to the executive leadership team. The Chief Executive now champions what he sees as a much more effective way of working, and its use, internally, is growing.

2. Integrating information and fact checking
Deconstructing information from so many government organisations and re-constructing it to ensure it was easy to find and consume was a huge task – fact checking took 9 months of a full-time resource to complete. There were constant iterations between agencies very attached to their existing content and the team, who were determined to present integrated information in plain English for the user.

3. Funding and operating model
Systems in NZ are currently set up to support individual agencies with very specific siloed accountabilities.
The funding and operating models for products and services like this one are still being developed, meaning finding sufficient funding to continue the project and the ongoing operation remains a challenge.

D. Impact and Sustainability

 10. What were the key benefits resulting from this initiative?
Delivering better public services within tight financial constraints is one of the Government's four priorities. The better our delivery, the more the public service can respond to the needs and expectations of New Zealanders. The benefits resulting from this initiative to the public are:
• integrated government information, reducing the effort and knowledge required to navigate individual agency websites, improving access to services;
• content that’s easily accessible and meets the needs of the widest range of citizens possible rather than reflecting agency structures and service delivery; and
• access to information enabled across a wide range of online (including mobile) devices, recognising our users’ needs.

The information the site delivers is accurate and authoritative – public sector agencies have all reviewed and approved the content – and the site currently delivers the highest value information as determined through user research and testing. We built content around ‘user topics’ and users now get a more consistent experience, obvious starting points, and easy-to-understand, plain English content. The site is optimised for search engines and is built using ‘responsive design’, which means it provides a consistent experience and is accessible on any device – desktop, laptop, tablet or smart phone. Meeting New Zealand Web Standards ensures that it is accessible to anyone, including those with disabilities.

The impact the site has on the delivery of better public services is measured via:

Time saved
A primary objective has been to deliver information that makes it easier and quicker for people to find what they need. Before creating content (written from the user’s perspective) the team (experienced web professionals) measured how long it took them to find existing information that was spread across many government locations.
One example was finding information for dog owners on fines and infringements – scattered across a number of websites, it took 6½ hours to find all the relevant information online, including reading 5 pieces of legislation. Now people can see it all in one place.

Flesch reading scores
The Flesch Reading Ease test helps determine how easy or hard your text is to read. The test provides a score based on the length of words and sentences, plus the number of syllables. A score of 65 or more is considered plain English. Before publishing, content is evaluated to ensure it meets this score at a minimum. This approach contributed to the site being a finalist in three categories in the 2013 WriteMark Plain English Awards.

Usability
The System Usability Scale (SUS) has become an industry standard and we regularly use it to help us measure how easy the site is for people to use; we consistently achieve scores above average.

Google Analytics
The main performance measure currently utilised is the ‘Google Engagement Score’, indicating users’ engagement level based on how they’re using the site. This data-driven insight into content value is an important measure of the impact on users and is integrated with user feedback to direct product improvements. It will take several months of data gathering before the score is appropriately calibrated. The current score is 38.8% against a target of 50%, highlighting potential issues with the layout of the home page that are currently being user tested.
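The Flesch Reading Ease formula itself is public: score = 206.835 − 1.015 × (words per sentence) − 84.6 × (syllables per word), with higher scores meaning easier reading. The sketch below shows how such a check could be automated; the vowel-group syllable counter is a rough heuristic of our own, not the content team's actual tooling:

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels, treating a
    # trailing silent 'e' as non-syllabic. Real tools use dictionaries.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences)
    - 84.6*(syllables/words). Scores of 65+ read as plain English."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short, everyday words score well above the 65 threshold, while long bureaucratic phrasing scores far below it, which is exactly the gap the plain-English rewrite was meant to close.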
User feedback
We have obtained feedback from people via user testing, working with information intermediaries such as the public libraries and the Citizens Advice Bureau, as well as through seeking direct feedback via the site. The following quotes show the direct impact this initiative is making for the public: “No frilly bits that you don’t need.” “It doesn’t make me feel apprehensive.” “I’d use this so much – it’s so easy!” “Whoever wrote this needs a medal.” “It’s just right there: it gives you all the information, but nothing too long either.” “It’s the sort of website I’d go to even if I didn’t need things – to learn stuff!” “I can do this task quite easily and not be flustered by it.” “Thank you for putting all of this information in one place. I work in a public library and we frequently have questions from users who don't know where to start with accessing government resources online. This site is an asset. It is well laid out, easy to navigate, and I appreciate the attitude behind putting lots of calls for feedback on the pages. I also enjoyed the Analytics section.” “I manage a small rural library branch. I rarely have all the relevant print resources or expensive database subscriptions that a city library may have at hand. This new site absolutely ROCKS and will really, REALLY help baby libraries like us point our more needy (i.e. technically challenged) customers in the right direction. Thank you so much for pulling all this together, it is exactly what was needed.”

 11. Did the initiative improve integrity and/or accountability in public service? (If applicable)
Development of the site was initially influenced by a similar site, the UK’s GOV.UK, which has proved hugely successful with UK users. Inspired by the work of the UK’s Government Digital Service team and their collaborative approach, throughout the design and development of the site we adopted their commitment to user-centred design, feedback and transparency, as well as their design principles. By building on what they had done, and adapting their basic design elements, we were able to save time and money and deliver a better quality website. The site is now in turn influencing government web developers worldwide, including in Australia, Scotland and Canada, with this user-centred and integrated approach.

There are four key lessons we learned about the process of sharing learnings with our overseas counterparts:
1. Don’t start from scratch. Build on what others have already done.
2. Start small and iterate. Adapt as needed.
3. Share what you do, even if it doesn’t work like you expected.
4. Base your decisions on evidence: user testing, feedback and more user testing.

Prior to this development, agencies did not have access to a common platform for the delivery of cross-agency citizen-centred content and continued to work in silos. By implementing this operating model, the site provides a platform for agencies to use and to develop their content collaboratively across government. We have dedicated a lot of time and energy to user testing to make sure we have got it right, and we’ve published the results of our user testing on the Web Toolkit, so anyone can use them. The ‘iterative’ bit is important too. We started with something we know isn’t perfect, because its evolution needs to be driven by users. It will change as more research shows us what’s needed next, as information gets updated or completely re-thought by agencies, as new gadgets get popular, or as feedback tells us how to improve it.
Importantly (and we’ve been told this is revolutionary), we are sharing: we want other agencies to use our research, borrow our content style guide, reuse code and adopt the Common Web Platform (CWP). Many agencies are already looking at improving their websites, so we hope that they take advantage of work already done by us, and others, so we’re all focused on the same goal: meeting users’ needs. We have already had a lot of interest from agencies in the way we’re using analytics and how they can leverage this.

 12. Were special measures put in place to ensure that the initiative benefits women and girls and improves the situation of the poorest and most vulnerable? (If applicable)
Overall the initiative has been a success – it was delivered on time and on budget. The launch of the site by New Zealand’s Prime Minister, Minister of Internal Affairs and GCIO was successful, generating positive media interest nationally and internationally. There was a spike in visitors to the site and it was met with little critical comment. The project team engaged with 44 government agencies to check the accuracy of content, and as a result relationships and processes were established to allow further government content to be improved. The initiative is an excellent example of government delivering a website accessible to visually and cognitively impaired users and completely meeting its own accessibility guidelines (WCAG 2.0). The fortnightly development release cycle was a success and is now included as part of the DIA Change Management Sub Process for CWP. We also achieved excellent results in the Code Quality and Security Assessments.

The main issue with the initiative was not having an operating budget and model in place when the project finished. This was demotivating for the team, and considerable resource was expended internally to secure funding past the project launch. Due to the funding uncertainty, the website was launched without a management structure or strategy. There was considerable internal debate over the process and outcome of the design of the website. Inexperience within the team around Agile meant that problems arose around the acceptance criteria of user stories and roles within the team. There were also some teething problems with development on the Common Web Platform, including the testing environment on CWP not allowing for cross-browser, cross-device test capability, leaving the team dependent on personal devices. A recommendation for future projects would be that a project should not be started unless an operating budget is secured.
The costs and benefits of building a website internally should be compared to the costs and benefits of outsourcing to specialist digital agencies. Having specialist digital agencies pitch for the project would have allowed for contestability of ideas at an early stage. Using copyrighted photos in a website is a long process involving intense negotiations with the vendor, DIA Legal, the DIA Kaumātua for cultural sensitivity, and DIA signoff protocols. It is recommended that any future use of imagery address the signoff process early. The use of specialist Agile trainers early in the project life would have been beneficial, as would the availability of infrastructure for testers to adequately test across browsers and devices. Cohesive planning between development and testing roles should also be factored in early in the development process. Lastly, greater visibility of the project budget and management is recommended, along with considering the use of delegated authorities to the Project Manager.

Contact Information

Institution Name:   Department of Internal Affairs
Institution Type:   Government Agency  
Contact Person:   Brigitte Birch
Title: Incubation Manager  
Telephone/ Fax:   +64 4 470 4595
Institution's / Project's Website:  
Address:   PO Box 805
Postal Code:   6140
City:   Wellington
State/Province:   Wellington
