APQC's Open Standards Benchmarking Validation Process

Validation is what sets APQC's benchmarking data apart from other sources. Our approach is unique to APQC and results in high-quality data that you can trust.


APQC's benchmarking validation process consists of 4 steps:

  • Pre-validation checks
  • Logical validation
  • Statistical validation
  • Validation reporting

Every submission APQC receives is subjected to each of the 4 steps. Submissions sometimes go through the validation process two or three times before they are accepted into the Open Standards Benchmarking database, and accepted submissions are regularly reviewed for validity in light of new data.

APQC's Validation Process

Step 1 - Pre-validation Checks

Upon receipt of an Open Standards Benchmarking (OSB) survey or a Rapid Performance Assessment (RPA), our analysts perform a series of pre-validation checks (sketched in code after the list), including:

  1. Membership verification for the submitting entity. Submissions not received from APQC members or from companies that meet APQC's strict participation guidelines will be subject to fees as outlined in the Fee Schedule.
  2. Inclusion of the participating business entity’s name and its North American Industry Classification System (NAICS) code.
  3. Period end-date verification: the period end-date must close a 12-month period that has already ended. OSB does not accept future period end-dates, which would reflect hypothetical, pro forma, or pro rata data.
  4. Completion rate: OSB only processes surveys and RPA submissions that are at least 80% complete.
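
The membership check in item 1 depends on APQC's internal records, but items 2 through 4 are mechanical enough to illustrate with a minimal sketch. Everything below is an assumption made for illustration: the pre_validate function, the dict-based submission, and its field names (entity_name, naics_code, period_end_date, answered, total_questions) are hypothetical, not APQC's actual implementation.

    from datetime import date

    MIN_COMPLETION = 0.80  # OSB only processes submissions that are at least 80% complete

    def pre_validate(submission, today=None):
        """Return a list of pre-validation failures; an empty list means the checks pass.

        `submission` is a hypothetical dict, for example:
        {"entity_name": "Example Co.", "naics_code": "541611",
         "period_end_date": date(2023, 12, 31),
         "answered": 172, "total_questions": 200}
        """
        today = today or date.today()
        failures = []

        # Item 2: the participating entity's name and NAICS code must be present.
        if not submission.get("entity_name") or not submission.get("naics_code"):
            failures.append("missing entity name or NAICS code")

        # Item 3: the period end-date must belong to a closed 12-month period;
        # a future end-date (hypothetical, pro forma, or pro rata data) is rejected.
        if submission["period_end_date"] > today:
            failures.append("period end-date is in the future")

        # Item 4: the submission must be at least 80% complete.
        completion = submission["answered"] / submission["total_questions"]
        if completion < MIN_COMPLETION:
            failures.append(f"completion rate {completion:.0%} is below 80%")

        return failures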

Step 2 - Logical Validation

Once a submission passes the pre-validation checks, the analyst performs a series of logical checks to verify that each response is appropriate to the survey question that prompted it. These checks (two of which are sketched after the list) include, but are not limited to:

  1. Numbers expressed are appropriate to the units of measure.
  2. Allocations sum to the totals.
  3. Measures of subsets are never greater than the measures of whole sets.
  4. Data expressed is consistent with the participating entity’s demographic information.
  5. Consistency of the provided information: APQC uses specific survey design principles to identify and catch typical survey participation errors.
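
Checks 2 and 3 are simple arithmetic constraints, which a short sketch can make concrete. The field names used here (total_cost, cost_allocations, process_fte, total_fte) and the tolerance are illustrative assumptions, not actual OSB survey fields or thresholds.

    def logical_checks(response, tolerance=0.01):
        """Return logical-validation flags for a single response (illustrative only)."""
        flags = []

        # Check 2: allocations must sum to the reported total (within a small tolerance).
        allocated = sum(response["cost_allocations"].values())
        if abs(allocated - response["total_cost"]) > tolerance * response["total_cost"]:
            flags.append("cost allocations do not sum to the reported total cost")

        # Check 3: a measure of a subset can never exceed the measure of the whole set.
        if response["process_fte"] > response["total_fte"]:
            flags.append("process FTEs exceed total FTEs")

        return flags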

Step 3 - Statistical Validation

After the logical validation checks, the analyst calculates every metric value for which the underlying data elements are present and conducts statistical validation on the resulting metric values. APQC employs the interquartile range (IQR) method to establish upper and lower bounds that define the range of acceptable values; metric values that fall outside that range are flagged for further investigation by the analyst. To judge the significance of an outlier, the analyst may compare the submission against multiple peer groups with similar industry, size, and regional characteristics. For each flagged metric, the data elements that feed it are then examined as potential outlier factors, and the analyst compares each suspected factor against data elements that are unlikely to produce outlier metric values before reaching a determination.
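
The core of the IQR flagging step can be sketched in a few lines. The 1.5 × IQR fence below is the conventional textbook multiplier; APQC does not publish the exact multiplier or quartile method it uses, so this sketch is an assumption about the calculation rather than APQC's exact procedure.

    import statistics

    def iqr_bounds(peer_values, k=1.5):
        """Upper and lower bounds of the acceptable range via the interquartile range.

        k is the fence multiplier; 1.5 is the conventional choice, assumed here.
        """
        q1, _median, q3 = statistics.quantiles(peer_values, n=4)
        iqr = q3 - q1
        return q1 - k * iqr, q3 + k * iqr

    def flag_outliers(metric_values, peer_values):
        """Return the metric values falling outside the peer group's acceptable range."""
        low, high = iqr_bounds(peer_values)
        return [v for v in metric_values if v < low or v > high]

Flagged values are not rejected automatically; as described above, a flag only prompts the analyst's further investigation against one or more peer groups.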

Step 4 - Validation Reporting

After the initial validation checks are complete, the analyst delivers a validation report that allows the respondent to update or confirm any responses submitted. The report includes the following (one possible report structure is sketched after the list):

  • All metric values calculated from the data elements provided, along with metric formulas, metric categories, and summary statistics for all participants for comparative analysis.
  • All questions and responses received from the participating business entity, along with any currency conversions applied, any logical and statistical validation flags, and OSB analyst comments.
  • Space for the respondent to update or confirm the submitted responses and to comment on any logical or statistical flags the analyst has identified for further investigation.
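
One way to picture the report's contents is as one record per survey question. The structure below is purely illustrative (APQC defines the actual report layout); it simply mirrors the three bullets above.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ValidationReportLine:
        """One line of a validation report (illustrative structure, not APQC's format)."""
        question: str
        response: str
        converted_response: Optional[float] = None           # after any currency conversion
        metric_values: dict = field(default_factory=dict)    # metric name -> calculated value
        peer_summary: dict = field(default_factory=dict)     # metric name -> participant summary statistics
        metric_formulas: dict = field(default_factory=dict)  # metric name -> formula
        metric_category: str = ""
        logical_flags: list = field(default_factory=list)
        statistical_flags: list = field(default_factory=list)
        analyst_comment: str = ""
        respondent_update: str = ""    # space to update or confirm the response
        respondent_comment: str = ""   # respondent's comments on any flags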