
Verification & Validation: Two Compasses

In software engineering, project success doesn’t rely on a single metric, but on the balance between two complementary forces. As Barry Boehm theorized in the 80s, quality is played out on two distinct fronts: building the product right, and building the right product.


1. Verification: mastering compliance

Verification is an internal control process. It ensures that work products (code, specifications, architecture) strictly adhere to the requirements set during previous phases.

To verify effectively, we use two complementary approaches:

  1. Static techniques (reviews, inspections, static analysis), which examine work products without executing them.
  2. Dynamic techniques (unit and integration tests), which execute the code against its specification.

The fundamental question: are we building the product right?
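As a minimal sketch of the dynamic side, verification can be as simple as a unit test that checks an implementation against its written requirement (the function and the requirement below are hypothetical, chosen only to illustrate the idea):

```python
# Hypothetical requirement: "orders of 100 EUR or more get a 10% discount".
# Verification asks: does the code conform to this written rule?

def apply_discount(total: float) -> float:
    """Return the order total after the volume discount defined in the spec."""
    if total >= 100:
        return round(total * 0.90, 2)
    return total

# Unit tests: does the code do what the requirement says?
assert apply_discount(100.0) == 90.0   # boundary: exactly 100 gets the discount
assert apply_discount(99.99) == 99.99  # just below the threshold: no discount
assert apply_discount(250.0) == 225.0
```

Note that these tests only prove conformance to the spec; they say nothing about whether a 10% discount is the right business decision.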


2. Validation: proving relevance

Validation confronts the software with its end users and its real-world context. It evaluates whether the system meets business needs, often going far beyond what was formally written in the tickets.

It relies primarily on dynamic testing, because it is by manipulating the full system (or a functional prototype) that we can judge its relevance. We are no longer looking at whether “the button does what is written,” but whether “the button actually solves the user’s problem.”

The fundamental question: are we building the right product?
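The contrast with verification can be sketched as an acceptance-style check written in terms of the user's goal rather than any single button's behavior. Everything below (the journey function, the discount code, the amounts) is illustrative, not a real API:

```python
# Hypothetical acceptance check: validation asks whether the user's goal is met,
# not whether each button matches its ticket.

def user_completes_purchase(cart: list, discount_code: str) -> dict:
    """Simulate the end-to-end journey a real user would take."""
    total = sum(cart)
    if discount_code == "WELCOME10":
        total *= 0.90
    return {"paid": round(total, 2), "order_confirmed": total > 0}

# The validation question: did the user actually get what they came for?
outcome = user_completes_purchase([40.0, 25.0], "WELCOME10")
assert outcome["order_confirmed"]   # the journey succeeded end to end
assert outcome["paid"] == 58.5      # the promised discount was really applied
```

In practice this kind of check runs against the full system (or a functional prototype), often driven by exploratory sessions or user acceptance testing rather than a script.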


3. Analysis: why one cannot exist without the other

The power of these concepts lies in their synergy. As experts, we know that an imbalance between these two pillars is often the root cause of project failure.

The technical mirage (successful verification, failed validation)

This is the scenario of an application with irreproachable code, zero detected defects, and an exemplary architecture. If the interface is confusing or if the functionality meets no real user need, the product is a failure. We have “perfectly” built something useless.

The giant with feet of clay (successful validation, failed verification)

Conversely, an application can perfectly solve a critical problem and thrill users during early demos. But if it lacks verification (instability, security flaws, technical debt), it will collapse under its own weight as soon as it needs to scale. Business success is then short-lived.


4. A pragmatic approach: maximizing synergy

To transform quality into a growth lever, these two activities must remain in constant dialogue throughout the product life cycle.

  1. The shift-left approach: by introducing static verification as early as the requirements phase, we secure future validation. A poorly formulated or ambiguous need is a validation defect in the making that can be nipped in the bud.
  2. Total traceability: every verification effort must be linked to business value. If a test does not contribute to system stability (verification) or the success of a user journey (validation), it likely has no real value.
  3. Independent perspective: verification benefits from being rigorous and automated, while validation often requires a more exploratory and empathetic eye, capable of questioning the very relevance of the feature.
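The shift-left idea from point 1 can even be applied to the requirements text itself, before any code exists. Here is a toy sketch that flags ambiguous wording in a requirement statement; the list of ambiguous terms is illustrative, not an industry standard:

```python
# Shift-left sketch: a static check applied to requirement text itself.
# The term list below is a hypothetical example, not an exhaustive rule set.
AMBIGUOUS = {"fast", "user-friendly", "should", "etc.", "as needed", "robust"}

def lint_requirement(text: str) -> list:
    """Return the ambiguous terms found in a requirement statement."""
    lowered = text.lower()
    return sorted(term for term in AMBIGUOUS if term in lowered)

# A vague need is a validation defect in the making:
issues = lint_requirement("The search should be fast and user-friendly.")
assert issues == ["fast", "should", "user-friendly"]
```

Catching "fast" at review time forces the stakeholder to state a measurable target (say, a response-time budget), which is far cheaper than discovering the ambiguity during user acceptance.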

Conclusion

Verification and validation are two sides of the same coin. The former brings the rigor and durability necessary for maintenance, while the latter brings the vision and market fit.

A high-performing quality strategy doesn’t choose a side. It uses verification to guarantee the reliability of the implementation and validation to ensure the real utility of the product. It is in this duality that sustainable software is built, capable of creating real and lasting value.

