Verifying and Validating Performance Measure Data
National Science Foundation (NSF)
Nexight helped the National Science Foundation (NSF) assess the quality of its publicly reported performance measurement data. Our verification and validation approach strengthened NSF’s performance measurement procedures, improving data quality while also building NSF’s internal measurement capacity.
The Challenge
Establishing Confidence in Measurement Data Quality and Evidence-Building Capacity
Federal agencies are required to publicly report progress toward their goals in their annual performance reports and to ensure that the performance data contained therein is of sufficient quality:
- The Government Performance and Results Act (GPRA) Modernization Act of 2010 requires that agencies describe in their annual performance plan the means used to verify and validate performance information.
- OMB Circular A-11 requires agencies to have verification and validation (V&V) techniques in place to ensure the completeness and reliability of all performance measurement data contained in their annual performance plans and reports.
- OMB Circular A-136 requires that agency heads attest to the completeness and reliability of the performance data they report.
To meet these requirements and be confident that its performance data is complete and reliable, NSF sought to establish a credible and independent V&V process. NSF enlisted Nexight to verify the reliability of the methods NSF used to collect, process, maintain, and report performance data, and to validate NSF’s performance data.
In addition, the Foundations for Evidence-Based Policymaking Act of 2018, along with related circulars and memoranda, including the Presidential Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking, calls for federal agencies to enhance their capacity to build and use evidence. To help NSF develop this capacity, Nexight has leveraged the V&V process to disseminate guidance, model best practices, and coach measure owners.
Our Solution
A Custom Methodology for Verifying and Validating Performance Measure Data
V&V Handbook
Nexight Group implemented a V&V assessment process that is considered a best practice in the federal government. We started by developing an NSF V&V Handbook, which gives the NSF measurement community a reference document describing the V&V process, the V&V criteria on which measures are assessed, and the requirements for meeting those criteria. The five criteria are:
- Complete: The results reported for the measure represent the entire population of elements within the measure’s specified scope.
- Consistent: Data are collected, analyzed, and reported using clearly defined and uniform procedures across people and time periods.
- Accurate: Data collection processes and procedures maximize data quality and minimize the potential for errors in the collection, recording, and reporting of performance information.
- Timely: Data are collected and updated at regular intervals, such that reporting deadlines are met and performance information is current enough to inform decision-making.
- Valid: Data can be used to draw useful and meaningful conclusions about the agency’s performance that inform internal and external agency decisions.
Assessment Report
Nexight Group conducted annual V&V assessments of NSF’s publicly reported measures (12 to 20 measures per year) for six reporting cycles. Each year we discussed the nature and use of each measure with the measure owner, obtained supporting documentation, and determined the completeness and reliability of the data based on the Handbook’s five assessment criteria. We also examined the data files for each measure and confirmed whether the reported data was correct (the sketch following the list below illustrates this kind of check). We prepared third- and fourth-quarter reports identifying each measure’s strengths and weaknesses relative to each criterion and, where appropriate, provided recommendations for improving data quality. Each report included a statement attesting to the extent to which the data are complete and reliable. The V&V reports serve multiple audiences:
- Measure Owners: Helped build measurement capacity by explaining how the V&V criteria in particular, and measurement principles in general, apply to their measures; the measure-specific recommendations in each report helped owners improve their measures.
- Leadership: Provided an overview of the quality of performance information across the agency and general recommendations for improving agency data quality.
- External Stakeholders: Provided assurance that complete and reliable performance data is available for agency decision-making.
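To make the data-file examination step concrete, the sketch below shows, in Python, the kind of spot check that step can involve: independently recomputing a published figure from the underlying records and comparing it with the reported value. The file name, column names, measure definition, and reported figure are all hypothetical illustrations, not NSF data, and the actual assessments were not limited to automated checks of this sort.

```python
# Minimal sketch of a verification spot check: recompute a reported measure
# value from its underlying data file and compare it with the published figure.
# All names and figures below are illustrative assumptions, not NSF data.
import csv

REPORTED_VALUE = 412            # figure as published (hypothetical)
DATA_FILE = "awards_fy2023.csv" # extract supporting the measure (hypothetical)

def recompute_measure(path: str) -> int:
    """Recompute the measure from raw records: count of completed FY2023 awards."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Completeness/scope check: every record should fall within the stated scope.
    in_scope = [r for r in rows if r["fiscal_year"] == "2023"]
    if len(in_scope) != len(rows):
        print(f"Warning: {len(rows) - len(in_scope)} records fall outside the measure's scope.")
    # Accuracy check: tally the statistic independently of the owner's figure.
    return sum(1 for r in in_scope if r["status"] == "completed")

recomputed = recompute_measure(DATA_FILE)
if recomputed == REPORTED_VALUE:
    print(f"Reported value {REPORTED_VALUE} matches the recomputed value.")
else:
    print(f"Discrepancy: reported {REPORTED_VALUE}, recomputed {recomputed}.")
```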
Building Measurement Capacity
Nexight’s approach has been to build NSF’s internal measurement capacity as we conduct the V&V assessments. We have done this through the following products and services:
- Performance Measure Description Form (PMDF): Identifies the scope of each measure, the process used to collect data, potential sources of error, and efforts to mitigate error; serves as a high-level standard operating procedure (SOP) for each measure and is an effective practice for ensuring consistency and reliability, especially when staff turnover occurs.
- Case Studies: Illustrate an important measurement principle relative to meeting the V&V criteria. Each focuses on a fictitious performance measure that encounters a common pitfall or challenge; describes what went wrong, why it’s a problem, and the program’s solution; and concludes with a key take-away that highlights an important principle of data quality (most effective when used in conjunction with a consultation).
- Guidance Documents: Focus on a particular aspect of data quality; include examples and worksheets to help measure owners apply the guidance and meet the V&V criteria (most effective when used in conjunction with a consultation).
- Consultations: Allow programs to bounce ideas off us and allow us to suggest how to enhance the quality of measures before they undergo the V&V assessment.
Lessons Learned Report
At the end of each assessment cycle, we held a lessons-learned session with measure owners to gather their views on what worked and what could be improved, and we documented the findings in a lessons-learned report. More importantly, we used those results to improve all aspects of the process.
Impact
- Improved quality of performance data: Programs have implemented our recommendations to improve measure definitions, clarify measure scope, specify data collection procedures, and enhance quality controls.
- Matured internal assessment and evidence-building capacity: Programs have adopted best practices for developing and using performance measures, including creating SOPs to ensure consistency, setting targets with meaningful thresholds, making assumptions explicit, and specifying data limitations. The V&V process has also given programs a more precise and accurate vocabulary for discussing the completeness, reliability, and usability of performance data, and it has created a forum for frank discussions about how to improve data quality.
- Effected a positive change in attitude toward the formal V&V assessment: Most programs have become willing participants in the process because they recognize its potential to improve the usefulness of their mandated performance measures.