Up until now, we’ve been discussing the justification for using TDD in an ASIC development flow. Hopefully, we’ve convinced you to try it. In this post we’ll introduce a TDD framework that has been developed for SystemVerilog to help you use this design technique.
A couple of weeks ago, just after we got started with TDD month, Neil added the link to the posts on several industry forums, and got this comment from Alex Gnusin on the Verification Guild:
“Is it a Designer responsibility to test each line of code? In this case, there is a need to provide designers with working methodology to verify their code…”
To which Neil responded:
Alex: I’m not sure if you guessed we’d be covering the topic of a working methodology – aka: unit test framework – but if you did, I’d like to thank you for that nice bit of foreshadowing!
Alex is right: a proper framework is pretty important for anyone doing TDD, primarily because it gives you the opportunity to get up and running quickly.
In the software world, there are a number of wildly popular unit test frameworks. JUnit might be the best known (for those doing Java development) but there are about a million others (as you can see with a quick trip over to Wikipedia). A unit test framework is critical for TDD, which is why Rob Saxe and I (both formerly of XtremeEDA) put one together a couple of years ago for people wanting to do TDD with SystemVerilog.
First presented at SNUG San Jose in 2009, SVUnit is a unit test framework that provides:
- structures for creating and running unit tests
- utilities for reporting test status and results
- a test “runner” that executes a set of test suites and reports on their overall success/failure
- scripts to automate the mechanical aspects of creating new unit test code
This post briefly covers the framework’s usage model. Actually, I’ll split the material into three posts: this one covers how to use the framework, the next will discuss where to use it when building your verification components, and the last will discuss how RTL designers can potentially use it. As this post is more a teaser than a full-blown article, you should ask the folks at XtremeEDA for a copy of the SNUG paper that Rob and I created; it gives full details on the framework design and usage model. Of course, the next question you’ll be asking is: where can we get the code? We’re working on that… more details very shortly.
At its most basic, there are three hierarchical classes that you need to know about:
- svunit_testcase: an abstract base class that you derive from and add your TDD code into.
- svunit_testsuite: which aggregates and runs the svunit_testcase objects.
- svunit_testrunner: which aggregates and runs the svunit_testsuite objects, and reports the errors.
The following UML class diagram gives an overview of the three classes and their hierarchical relationship. For those of you unfamiliar with UML, the diagram shows that an svunit_testrunner holds one or more svunit_testsuite objects, each of which holds one or more svunit_testcase objects.
Every testcase that you create derives from the abstract svunit_testcase base class. It has three methods that you’ll need to know about:
- setup(): where you can optionally setup the pre-conditions for the test.
- run_test(): where you add your test code.
- teardown(): where you can optionally restore the verification component back to reasonable defaults.
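Putting those three methods together, a minimal testcase might look like the following sketch. The packet class and its usage are hypothetical, and the exact base-class details (e.g., the constructor signature) may differ from what’s shown here:

```systemverilog
// Sketch of a unit test class; `packet` is a hypothetical class under test.
class packet_unit_test extends svunit_testcase;

  packet my_packet;

  // optionally set up the pre-conditions for the test
  task setup();
    my_packet = new();
  endtask : setup

  // the test code goes here
  task run_test();
    `FAIL_UNLESS(my_packet != null);
  endtask : run_test

  // optionally restore reasonable defaults after the test
  task teardown();
    my_packet = null;
  endtask : teardown

endclass : packet_unit_test
```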
There are also two macros to help in creating the tests:
- FAIL_IF(expression): similar to an assertion, where the testcase fails if the expression is true. e.g., `FAIL_IF(my_packet.calc_crc() == TRUE);
- FAIL_UNLESS(expression): the corollary to FAIL_IF, where the testcase fails if the expression is false. e.g., `FAIL_UNLESS(my_packet.get_parity(data) == 0);
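For instance, a single focused test task might combine both macros (the packet fields and methods here are hypothetical):

```systemverilog
// Hypothetical example: both macros inside one focused test task
task test_packet_checks();
  // fails the testcase if the expression is true
  `FAIL_IF(my_packet.calc_crc() == TRUE);

  // fails the testcase if the expression is false
  `FAIL_UNLESS(my_packet.get_parity(data) == 0);
endtask : test_packet_checks
```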
Then you can start creating unit tests in a class called class_under_test_unit_test (which is stored in a file called class_under_test_unit_test.sv). This class derives from the svunit_testcase base class.
Add your test code into the run_test() function. If required, you can add pre-test and post-test code in the setup() and teardown() tasks, respectively. As discussed in previous posts, the idea is to create a set of tests that each focus on one aspect, or an atomic feature, of your verification class. These tests will incrementally build over time as you develop your verification class. I suggest you create a set of tasks inside your classname_unit_test that are called by the run_test() task. e.g.,
task run_test();
  `INFO("Running Unit Tests for class: packet_bfm:");
  test_pack_bytes();
  test_unpack_bytes();
  test_send_data();
endtask : run_test
There is some “plumbing” code that needs to be created to instantiate and run these testcases. As this is a tedious, repetitive task we created some basic scripts that generate the code for the svunit_testsuite and svunit_testrunner classes. Ideally, all you’ll ever need to write is the unit test code. The script finds all the source files that match the pattern *_unit_test.sv, and then creates a corresponding class (derived from svunit_testsuite class) responsible for instantiating and running your unit test objects. The script then wraps those test suite classes into a test runner class (derived from the svunit_testrunner) class responsible for instantiating and running the test suite objects. At the end of the test, the auto-generated svunit_testrunner object reports on the pass/fail of the unit tests.
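To give a rough feel for what the scripts generate, a testsuite wrapping a single testcase might look something like the following. This is an illustrative sketch only; the real generated code is detailed in the SNUG paper and may differ:

```systemverilog
// Illustrative sketch only; the actual auto-generated code may differ.
class packet_bfm_testsuite extends svunit_testsuite;

  packet_bfm_unit_test packet_bfm_ut;  // the hand-written testcase

  task run();
    packet_bfm_ut = new();
    packet_bfm_ut.setup();     // pre-test conditions
    packet_bfm_ut.run_test();  // the actual unit tests
    packet_bfm_ut.teardown();  // post-test cleanup
  endtask : run

endclass : packet_bfm_testsuite
```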
As discussed in previous posts, the flow of using TDD is to create the tests first, and then add just enough code into the class being tested until the tests pass. The flow of creating a unit test with the svunit framework is similar:
- Create your unit test class derived from svunit_testcase. If the ‘Class Under Test’ does not exist, create an empty class that your unit test class can instantiate.
- Add your test code into run_test() (or, better yet, into a separate task that run_test() calls).
- Create the plumbing code by calling the script to auto-generate the derived svunit_testsuite and svunit_testrunner classes (the script also creates a Makefile to run the tests). Don’t touch any of these auto-generated files since they will be overwritten every time you run the script.
- Run the test (make sure it fails!!).
- Add just enough code into your class under test until the test passes.
- Repeat steps 2 to 5 until all aspects of your class under test are completely tested.
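As a concrete sketch of one trip through that loop, the class under test starts as a stub that makes the first test fail, then gains just enough code to pass. All names here are illustrative:

```systemverilog
// Step 1: a stubbed 'class under test' so everything compiles
class packet_bfm;
  // stub deliberately returns the wrong value; TDD fills it in later
  function int pack_bytes();
    return 0;
  endfunction
endclass

// Step 2: a first unit test written against the stub
class packet_bfm_unit_test extends svunit_testcase;
  packet_bfm bfm;

  task setup();
    bfm = new();
  endtask

  task run_test();
    test_pack_bytes();
  endtask

  // Steps 4 and 5: this fails until pack_bytes() is properly implemented
  task test_pack_bytes();
    `FAIL_UNLESS(bfm.pack_bytes() == 1);
  endtask
endclass
```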
The svunit framework provides you with some tools to make TDD easier to use — you focus on the individual tests; the framework runs those tests. Of course, there is more to this framework, but I would just be repeating the contents of the paper in a post. Just ask our friends at XtremeEDA for a copy of the paper.
This post provides a brief introduction to the SVUnit framework. The next posts will provide some thoughts and guidance on where to use this technique for verification and RTL. As well, I’ll discuss some future directions for this pre-UVM framework.
3 thoughts on “Test Driven Development: Introducing the SVUnit Framework”
That is an interesting post. In my team’s experiments with TDD, we have been using a “home grown” framework, perhaps not as well constructed as yours. I would definitely be interested in any follow up discussion on SVUnit.
I was wondering if you have given any thought to the use of formal verification for the purpose of unit testing. As long as the unit is sufficiently small, the state space should be manageable for the formal execution to not be too long. Also, there is less of a need for a framework and harness infrastructure. Just a thought, and I am curious if anyone out there in AgileSOC land has tried this.
Regarding the SVUnit framework, we will have an announcement very shortly about its availability. I’ll just state that while SVUnit is an excellent starting point, I do have some thoughts on other ‘stuff’ that needs to be added to make it even more usable. The more I use other software-oriented frameworks, the more I realize there is functionality lacking in SVUnit.
As to formal, I have definitely thought that using formal verification is an excellent complement to TDD for unit testing. As you indicate, the state space is small and manageable, no need for harness, and it is quite fast. Unfortunately, I have not yet had the opportunity to use formal in any projects — so I leave any further comments to those who have experience in it. But I definitely think it is a good complement.
The reason I think it is complementary is that it only verifies the RTL side of the equation. If we had a formal checking tool for verification components that would be very cool, but until then a TDD framework for classes will need to be there.
What I’m really not sure about, due to my lack of experience in using formal tools, is the other value of using a TDD framework — where you get to work out the usage model for the unit under test. While a formal tool can verify that the unit is correct by construction, I’m not sure it can help with that design aspect. But that’s just me doing a thought experiment; perhaps someone else has developed a formal workflow that proves me wrong — and I’d be very happy to be proven wrong in this case.
Thanks for the thought-provoking discussion.
Indeed, formal verification (model checking) is an excellent complement to dynamic simulation – especially for the smaller RTL blocks. Combined together, these two approaches do the best verification job I have ever seen 😉
However, formal model checking has special requirements for the code it is dealing with. The code has to be mostly synthesizable (with a few exceptions), for both design and testbench. I am not sure that any object-oriented framework can be synthesized and therefore used for formal analysis.