In publishing Portable Stimulus And Integrated Verification Flows, where I came up with the graphic that plots different verification techniques as a function of design scope vs. abstraction, I gave myself an opportunity to think more critically about verification than I have in the past. I’ve been recognizing different patterns and habits from my experience and getting a better feel for how they’ve helped or hindered teams I’ve worked with. My most recent navel gazing has led to a new and improved view of how design and testbench code matures over time; a way to quickly summarize quality in a way that’s meaningful and useful.
This is the same diagram I published a few weeks ago but with different classifications overlaid. I call them 5 steps in design maturity: broken, sane, functional, mature and usable. In my mind, every feature we create, every chunk of code we write, progresses through these 5 classifications on its way to delivery.
Broken: Either broken or assumed to be broken, this is where every line of code we write starts. A line of code stays broken until we have unit tests or directed tests that prove otherwise.
Sane: With stimulus that has little or no variation from the happy path, a design successfully demonstrates basic functionality. Stray from the happy path with too much randomization, however, and code likely breaks down.
Functional: As the scope of tests increases beyond the happy path, a design shows it is capable of supporting variation in stimulus and configuration, as well as expected interactions between interdependent variables.
Mature: With a further increase in scope, a design shows it is capable of supporting wild variation in stimulus and configuration, as well as unforeseen interactions between interdependent variables, scenarios and events.
Usable: From broken to mature, tests are applied from an engineering perspective to verify compliance relative to a specification. In the final step, a usable design is one that’s been tested under loads created from a user perspective to gain confidence a design functions well under real-world conditions.
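To make the progression concrete, the five steps above can be sketched as a simple classification function. This is purely illustrative; the evidence flags (happy-path passing, variation passing, and so on) are hypothetical names I've chosen, not anything defined in the post.

```python
from enum import Enum, auto

class Maturity(Enum):
    """The five steps of design maturity."""
    BROKEN = auto()      # no evidence the code works
    SANE = auto()        # happy-path stimulus passes
    FUNCTIONAL = auto()  # expected variation and interactions pass
    MATURE = auto()      # wild, unforeseen variation passes
    USABLE = auto()      # real-world, user-perspective loads pass

def classify(happy_path: bool, expected_variation: bool,
             wild_variation: bool, real_world: bool) -> Maturity:
    # Each step requires all the evidence of the steps before it,
    # mirroring the crawl-walk-run progression described in the post.
    if not happy_path:
        return Maturity.BROKEN
    if not expected_variation:
        return Maturity.SANE
    if not wild_variation:
        return Maturity.FUNCTIONAL
    if not real_world:
        return Maturity.MATURE
    return Maturity.USABLE
```

One design choice worth noting: the steps are strictly ordered, so a design with "wild variation" passing but happy-path tests failing still classifies as broken, which matches the idea that bypassing steps doesn't actually advance maturity.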
I like this idea of a design going through a series of steps with respect to maturity for a few reasons. First, it implies that usable features are produced through a crawl-walk-run-like progression. Features are born broken. Over time, assuming a constructive verification strategy, they grow and mature to the point they become usable. This feels like a natural progression to me; one that’s hard to avoid and better off acknowledged and accepted. Bypassing steps, or taking steps out of order, invites the kind of growing pains that increase the risk of features reaching customers before they are usable.
Second, it associates each step in the progression with specific verification techniques. The association between maturity and verification technique gives verification engineers another reference point for identifying the most productive techniques for a given level of design maturity. This point brings me back to my comment of thinking about verification more critically.
Finally, it gives teams a basis for describing design maturity. With a little practice, it gives them a way to quickly set or convey expectations in a reasonably objective way.
Of course, these 5 steps also apply to verification infrastructure. Though the infrastructure must move through the progression earlier than the design does (i.e. it’s hard to suggest a usable design with broken verification infrastructure ;).
How do these 5 steps measure up to your view of how designs mature over time?