THREE RIVERS DISTRICT COUNCIL
DATA QUALITY
STRATEGY
MARCH 2008
CONTENTS
1. Introduction
2. Definitions
3. Current Position
4. Characteristics of Data Quality
   4.1. Accuracy
   4.2. Validity
   4.3. Reliability
   4.4. Timeliness
   4.5. Relevance
   4.6. Completeness
5. Data Quality Standards
   5.1. Governance and Leadership
   5.2. Policies
   5.3. Systems and Processes
   5.4. People and Skills
   5.5. Data Use and Reporting
6. Review and Action Plan
Appendix 1: Management Arrangements Scores November 2006
Appendix 2: Action Plan for Improvement
1. Introduction
The Council needs information that is fit for purpose in order to manage services and
account for performance. Information is used throughout the organisation to make
judgements about the efficiency, effectiveness and responsiveness of services and in making
complex decisions about priorities and the use of resources. Service users, and in particular
members of the public, need accessible information to make informed decisions, and
regulators and government departments need reliable information to satisfy their
responsibilities for making judgements about performance and governance.
The 2006 Local Government White Paper, Strong and Prosperous Communities, and the
Local Government and Public Involvement in Health Act 2007 set out a new performance
framework for local services. This places greater reliance on data quality, to provide robust
data for local performance management, and to inform performance assessments. It also
emphasises the need for local public services to use information to reshape services
radically and to account to local people for performance. As increasing reliance is placed on
performance information in performance management and assessment regimes, the need to
demonstrate that the underlying data are reliable has become more critical.
In November 2007 the Audit Commission published Improving information to support
decision making: standards for better quality data. This paper encourages public bodies to
improve the quality of the data used for decision making, presenting a set of clear and
concise standards, based on accepted good practice, which can be adopted on a voluntary
basis.
The Council has published a Policy Statement for Data Quality which outlines a commitment
to data quality through the adoption of the Audit Commission's Standards for Better Data
Quality. This strategy builds on the Policy Statement and outlines an approach to improving
data quality across the Council. Consistent, high-quality, timely and comprehensive
information is vital to support good decision-making and improved service outcomes.
2. Definitions
The terms data, information and knowledge are frequently used interchangeably. This
document, and the standards it introduces, focuses on data: that is, the basic facts from
which information can be produced by processing or analysis.
3. Current Position
During 2006 the Audit Commission implemented a revised approach to the audit of
performance indicators in local government. This required the Council's External Auditors to
conclude on the arrangements for monitoring and reviewing performance, including
arrangements to ensure data quality. A score was attributed, derived from a number of key
lines of enquiry (KLOE) and areas of audit focus and evidence under the following themes:
governance and leadership; policies; systems and processes; people and skills; and data use.
The arrangements for 2005/06 achieved an overall score of two ('adequate performance')
for the Council's management arrangements in respect of data quality. Details of the scores
for each of the key lines of enquiry, which combine into the overall score of two, are shown at
Appendix 1. The subsequent recommendations made by the External Auditor have been
taken into account when developing this Strategy.
4. Characteristics of Data Quality
The Audit Commission has identified six key characteristics of good quality data.
4.1. Accuracy
Data should be sufficiently accurate for the intended use and should be captured only
once, although it may have multiple uses. Data should be captured at the point of activity.
Data is always captured at the point of activity. Performance data is input directly into
PerformancePlus (P+) by the service manager or nominated data entry staff.
Access to P+ for the purpose of data entry is restricted through secure password
controls and limited access to appropriate data entry pages. Individual passwords
can be changed by the user and must under no circumstances be used by
anyone other than that user.
Where appropriate, base data (i.e. denominators and numerators) is input into
the system, which then calculates the result. The base data has been determined in
accordance with published guidance or agreed locally. Calculating results within the
system eliminates calculation errors at this stage of the process, as well as
providing contextual information for the reader.
Data used for multiple purposes, such as population and number of households, is
input once by the system administrator.
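The base-data approach described above can be sketched as follows. This is an illustrative example only: the function name and the figures are hypothetical assumptions, not taken from PerformancePlus or any published indicator.

```python
# Illustrative sketch: calculating an indicator result from base data
# (numerator and denominator) in one place, so the calculation is done
# once by the system rather than by each person entering data.
# The function name and example figures are hypothetical.

def calculate_indicator(numerator: float, denominator: float) -> float:
    """Return the indicator value as a percentage, to one decimal place."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return round(100 * numerator / denominator, 1)

# Hypothetical example: invoices paid on time out of invoices received.
print(calculate_indicator(1842, 1960))  # 94.0
```

Performing the division in a single shared routine, rather than separately in each service area, is what eliminates calculation errors at the data entry stage.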
4.2. Validity
Data should be recorded and used in compliance with relevant requirements, including
the correct application of any rules or definitions. This will ensure consistency between
periods and with similar organisations, measuring what is intended to be measured.
Relevant guidance and definitions are provided for all statutory performance
indicators. Service Heads are informed of any revisions and amendments within 24
hours of receipt from the relevant government department. Local performance
indicators comply with locally agreed guidance and definitions.
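A definition-compliance check of the kind described above can be sketched as follows. The rule shown is a hypothetical illustration, not a published indicator definition.

```python
# Illustrative sketch: a shared validity rule applied at data entry.
# The rule (a percentage-type indicator must lie between 0 and 100)
# is a hypothetical example, not taken from any published guidance.

def is_valid_percentage(value: float) -> bool:
    """Check a percentage-type indicator value against its definition."""
    return 0.0 <= value <= 100.0

print(is_valid_percentage(94.0))   # True
print(is_valid_percentage(104.5))  # False
```

Applying the same rule at every collection point keeps recorded values consistent between periods and between similar organisations.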
Note: PerformancePlus (P+) is performance management software purchased in April 2006 from InPhase Ltd.
4.3. Reliability
Data should reflect stable and consistent data collection processes across collection
points and over time. Progress toward performance targets should reflect real changes
rather than variations in data collection approaches or methods.
Source data is clearly identified and readily available from manual, automated or
other systems and records. Protocols exist where data is provided by a third party,
such as Hertfordshire Constabulary and Hertfordshire County Council.
4.4. Timeliness
Data should be captured as quickly as possible after the event or activity and must be
available for the intended use within a reasonable time period. Data must be available
quickly and frequently enough to support information needs and to influence service or
management decisions.
Performance data is requested to be available within one calendar month of the
end of each quarter and is subsequently reported to the respective Policy and
Scrutiny Panel on a quarterly basis. As part of the ongoing development of
PerformancePlus, it is intended that performance information will be exported through
custom reporting and made available via the Three Rivers DC website. This will
improve access to information and eliminate delays in publishing information through
traditional methods.
4.5. Relevance
Data captured should be relevant to the purposes for which it is to be used. This will
require a periodic review of requirements to reflect changing needs.
We have a duty to collect and report performance information against a wide range of
statutory indicators. These are set out in the context of the Government's White
Paper, Strong and Prosperous Communities. Where appropriate, each service will
identify reliable local performance indicators to manage performance and drive
improvement. These are reviewed on an annual basis to ensure relevance.
4.6. Completeness
Data requirements should be clearly specified based on the information needs of the
organisation and data collection processes matched to these requirements.
5. Data Quality Standards
These Standards reflect the KLOE described in paragraph 3. Below, this Strategy
identifies the extent to which we currently meet these standards and recognises those areas
that are not yet fully developed.
5.5.3. Data will be used appropriately to support the levels of reporting and decision
making needed (for example, forecasting achievement, monitoring service
delivery and outcomes, and identifying corrective actions). Evidence is
provided so that management action is taken to address service delivery
issues identified by reporting.
5.5.4. Data used for external reporting is subject to rigorous verification and to
senior management approval.
5.5.5. All data returns are prepared and submitted on a timely basis, and, where
appropriate, are supported by a clear and complete audit trail.
6. Review and Action Plan
We are currently awaiting the outcome of the annual assessment of data quality undertaken
by our External Auditors during 2007. On receipt of the assessment and report we will
develop and publish an action plan to address those areas requiring further attention, with a
view to improving our overall score against the KLOE.
Appendix 1
Management Arrangements Scores November 2006

Theme and Key Line of Enquiry

Governance and Leadership
1.2
1.3 The body has effective arrangements for monitoring and review of data quality.

Policies
2.1 A policy for data quality is in place, supported by a current set of operational procedures and guidance.
2.2 Policies and procedures are followed by staff and applied consistently throughout the organisation.

Systems and Processes
3.1 There are appropriate systems in place for the collection, recording, analysis and reporting of the data used to monitor performance, and staff are supported in their use of these systems.
3.2 The body has appropriate controls in place to ensure that information systems secure the quality of data used to report on performance.
3.3 Security arrangements for performance information systems are robust, and business continuity plans are in place.
3.4 The body has communicated clearly the responsibilities of staff, where applicable, for achieving data quality.

People and Skills
4.2 The organisation has arrangements in place to ensure that staff with data quality responsibilities have the necessary skills.

Data Use
5.1 The body has put in place arrangements that are focused on ensuring that data supporting performance information is also used to manage and improve the delivery of services.
5.2

Overall Score: 2
Key to scores:
1 = below minimum requirements (inadequate performance)
2 = only at minimum requirements (adequate performance)
3 = consistently above minimum requirements (performing well)
4 = well above minimum requirements (performing strongly)
Appendix 2
Action Plan for Improvement
To be developed on receipt of the 2007 Assessment and Report.