Distinguishing Design Verification from Design Validation in Medical Device Development

Published on December 4, 2025.

Why It Matters

In discussions of medical device performance and regulatory compliance, the terms “Verification” and “Validation” (often abbreviated as V&V) are frequently conflated. Under the FDA Quality System Regulation (21 CFR Part 820) and ISO 13485, however, they are distinct phases of the design control process, each with its own objectives and evidentiary requirements. A clear understanding of this distinction is essential for evaluating the completeness of a Design History File (DHF) and assessing regulatory compliance. Conflating the two stages obscures whether a device was tested against its engineering specifications or against the actual needs of the user.


The Regulatory Distinction

The fundamental difference between verification and validation is best summarized by the standard industry mnemonic:

  • Design Verification: “Did we build the device right?”
  • Design Validation: “Did we build the right device?”

Design Verification is defined in 21 CFR 820.30(f). Its primary objective is to provide objective evidence that the Design Outputs meet the Design Inputs. This is an engineering-focused exercise. It confirms that the device meets the technical specifications established at the beginning of the project. If a Design Input specifies that a catheter must withstand a minimum of 10 Newtons of tensile force, verification involves a mechanical test demonstrating that the catheter withstands at least 10 Newtons (or breaks at a force greater than 10 Newtons, depending on how the specification is written).
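
In code, such a bench-test evaluation reduces to comparing each measured value against the predefined acceptance criterion. The short Python sketch below is purely illustrative: the 10 N threshold comes from the example above, and the measured values are invented placeholders rather than real test data.

    # Illustrative only: evaluate tensile test results against a 10 N acceptance criterion.
    ACCEPTANCE_CRITERION_N = 10.0  # from the hypothetical Design Input above

    peak_force_n = [12.4, 11.8, 13.1, 10.9, 12.0]  # invented peak-force readings, one per unit

    failures = [force for force in peak_force_n if force < ACCEPTANCE_CRITERION_N]

    print(f"Units tested: {len(peak_force_n)}")
    print(f"Minimum observed force: {min(peak_force_n):.1f} N")
    print("Result:", "PASS" if not failures else f"FAIL ({len(failures)} unit(s) below criterion)")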


Design Validation is defined in 21 CFR 820.30(g). Its objective is to ensure the device conforms to defined User Needs and Intended Uses. This phase assesses the device in the hands of the end-user (or a representative user) under actual or simulated use conditions. Validation confirms that the technical specifications (even if met) actually result in a product that functions as intended in a clinical environment. For example, a catheter might meet the 10-Newton tensile requirement (Verification), but if the surgeon cannot maneuver it through tortuous anatomy without hand fatigue, the device may fail Validation.


Testing Methodologies

The testing protocols generated during these phases differ significantly in scope and methodology.


Verification Testing

Verification protocols are typically quantitative and bench-based. They rely heavily on standardized test methods, such as those published by ASTM or ISO. Examples include:

  • Dimensional Analysis: Confirming the device geometry matches the engineering drawings.
  • Mechanical Testing: Fatigue testing, tensile testing, or three-point bend testing to ASTM standards.
  • Software Unit Testing: Confirming that individual code modules produce the correct outputs for defined inputs (a minimal sketch appears after this list).
  • Electrical Safety: Testing to IEC 60601 standards for leakage current or dielectric strength.
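
As an illustration of the software unit testing bullet, the following minimal sketch uses Python’s built-in unittest module. The function check_torque_limit and its 5.0 Nm threshold are hypothetical, invented for this example rather than drawn from any actual device software.

    import unittest

    # Hypothetical device-software function: reports whether applied torque stays within a limit.
    def check_torque_limit(applied_torque_nm: float, limit_nm: float = 5.0) -> bool:
        """Return True if the applied torque does not exceed the allowed limit."""
        return applied_torque_nm <= limit_nm

    class TestCheckTorqueLimit(unittest.TestCase):
        def test_within_limit(self):
            self.assertTrue(check_torque_limit(4.2))

        def test_exceeds_limit(self):
            self.assertFalse(check_torque_limit(5.1))

        def test_boundary_value(self):
            # Boundary case: exactly at the limit is acceptable under the assumed specification.
            self.assertTrue(check_torque_limit(5.0))

    if __name__ == "__main__":
        unittest.main()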


Validation Testing

Validation protocols focus on the efficacy and safety of the device in a use-scenario. These tests must be performed on initial production units or their equivalents, as specified in 21 CFR 820.30(g), which requires validation to be performed “under defined operating conditions on initial production units, lots, or batches, or their equivalents.” Examples include:

  • Human Factors Studies: Observing users interacting with the device to identify use errors or interface difficulties (in accordance with IEC 62366 and guidance such as ANSI/AAMI HE75).
  • Simulated Use Testing: Using an anatomical model to replicate a surgical procedure.
  • Clinical Evaluations: Data collected from human clinical trials (e.g., under an Investigational Device Exemption, IDE) or evaluation of existing clinical data.
  • Packaging and Transport Validation: Ensuring the sterile barrier remains intact after shipping simulation (e.g., ASTM D4169).


The Traceability Matrix

A central component of a compliant Design Control system is the Traceability Matrix. This document serves as the roadmap linking the entire design process. It visually demonstrates the cascade of requirements:

  1. User Needs (What the doctor wants) map to…
  2. Design Inputs (The engineering spec) map to…
  3. Design Outputs (The final drawing/recipe) map to…
  4. Verification Reports (Evidence the spec was met) map to…
  5. Validation Reports (Evidence the user need was met).


In a technical review, the Traceability Matrix is utilized to identify “orphaned” requirements. If a Design Input exists without a corresponding Verification test, the design control process may be considered incomplete. Conversely, if a test is performed that does not map back to a specific Input or User Need, the necessity and relevance of that data may be questioned.
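
This orphan check lends itself to automation once the matrix is maintained in structured form. The sketch below is a minimal illustration under stated assumptions: the identifiers and the in-memory representation are invented for the example and do not reflect any particular quality system tool.

    # Illustrative orphan check for a traceability matrix held in simple Python sets and dicts.
    design_inputs = {"DI.1.0", "DI.2.0", "DI.3.0"}

    # Each verification report lists the Design Input(s) it claims to cover.
    verification_links = {
        "Ver.1.0": {"DI.1.0"},
        "Ver.2.0": {"DI.2.0"},
        # Deliberately no report covering DI.3.0 in this invented data set.
    }

    covered_inputs = set().union(*verification_links.values())

    orphaned_inputs = design_inputs - covered_inputs  # Design Inputs with no verification evidence
    unanchored_reports = {
        report for report, inputs in verification_links.items()
        if not (inputs & design_inputs)               # reports that cite no known Design Input
    }

    print("Orphaned Design Inputs:", sorted(orphaned_inputs))
    print("Verification reports lacking a valid Design Input:", sorted(unanchored_reports))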


Table 1: Example Traceability Matrix (Class II Bone Screw)

R.1
  • User Need: The surgeon must be able to drive the screw without stripping the head.
  • Design Input (Spec): DI.1.0: Driver interface shall withstand 5.0 Nm of torque without deformation.
  • Design Output (Drawing/Spec): DO.1.0: Dwg 101, Hexalobe Recess Dimensions per ISO 10664.
  • Verification (Test Method): Ver.1.0: Torsional Yield Test (ASTM F543).
  • Validation (User Test): Val.1.0: Simulated Use: Cadaver lab with orthopedic surgeons.

R.2
  • User Need: The implant must not corrode inside the body.
  • Design Input (Spec): DI.2.0: Material shall be Ti-6Al-4V ELI per ASTM F136.
  • Design Output (Drawing/Spec): DO.2.0: BOM Item 304, Raw Material Certs.
  • Verification (Test Method): Ver.2.0: Chemical Analysis & Passivation Verification (ASTM F86).
  • Validation (User Test): Val.2.0: Biocompatibility Assessment (ISO 10993-1).

R.3
  • User Need: The sterile barrier must remain intact during shipping.
  • Design Input (Spec): DI.3.0: Seal strength shall be > 1.0 lb/in.
  • Design Output (Drawing/Spec): DO.3.0: Tyvek Pouch Spec #405.
  • Verification (Test Method): Ver.3.0: Peel Strength Test (ASTM F88).
  • Validation (User Test): Val.3.0: Distribution Simulation (ASTM D4169) & Shelf Life.

Litigation Context

When analyzing a medical device file in the context of a product liability claim or patent dispute, technical experts and counsel typically focus on the continuity of the V&V process.

Relevant documents often examined include:

  • The Design History File (DHF): To establish the chronology of testing.
  • Verification Protocols and Reports: To assess whether the sample size was statistically justified and whether the acceptance criteria were defined before testing began (a common sample-size calculation is sketched after this list).
  • Validation Reports: To determine if the testing environment adequately simulated the actual conditions of use.
  • Non-Conformance Reports (NCRs): To identify if any failures occurred during V&V and if they were technically justified or if the design was altered in response.
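
On the sample-size question, one widely used planning approach for attribute (pass/fail) verification testing is the zero-failure “success-run” calculation, n >= ln(1 - C) / ln(R), which gives the number of consecutive passing units needed to demonstrate reliability R at confidence C. The sketch below illustrates the arithmetic generically; it is not a statement of any particular manufacturer’s sampling plan.

    import math

    def success_run_sample_size(confidence: float, reliability: float) -> int:
        """Zero-failure ('success-run') sample size: n >= ln(1 - C) / ln(R)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # Example: demonstrating 90% reliability with 95% confidence requires 29
    # consecutive passing units under this attribute sampling model.
    print(success_run_sample_size(confidence=0.95, reliability=0.90))  # -> 29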

Additionally, experts often examine when specific tests were conducted relative to design changes, manufacturing process modifications, or the reporting of adverse events. The temporal sequence of V&V activities can reveal whether validation was truly completed before commercial distribution or whether it was conducted retrospectively, a significant regulatory and liability consideration.


Document Availability and Discovery Considerations

Under 21 CFR 820.180 (General Requirements) and 21 CFR 820.186 (Quality System Record), manufacturers are required to maintain Design History Files and associated verification and validation documentation as part of their quality system records. These documents must be made readily available during FDA inspections and audits.


Specifically, 21 CFR 820.180 requires that these records be “made readily available for review and copying” by FDA employees. Given this regulatory mandate, manufacturers maintain these records in organized, retrievable systems as a matter of routine business practice.


Consequently, requests for DHF documentation, verification protocols, validation reports, and traceability matrices during litigation discovery typically do not constitute an undue burden. These are not documents that require reconstruction or special compilation. They are maintained in the ordinary course of business to satisfy ongoing regulatory obligations. The organizational systems required for FDA readiness directly facilitate production in response to discovery requests.


Furthermore, FDA Form 483 observations and Warning Letters frequently cite inadequate document control or inability to promptly produce design control records. A manufacturer’s claim of difficulty in producing such records during discovery may itself raise questions about Quality System compliance.