Up until now, we’ve been discussing the justification for using TDD in an ASIC development flow. Hopefully, we’ve convinced you to try it. In this post we’ll introduce a TDD framework developed for SystemVerilog to help you apply this design technique.
A couple of weeks ago, just after we got started with TDD month, Neil added the link to the posts on several industry forums, and got this comment from Alex Gnusin on the Verification Guild:
“Is it a Designer responsibility to test each line of code? In this case, there is a need to provide designers with working methodology to verify their code…”
To which Neil responded:
Alex: I’m not sure if you guessed we’d be covering the topic of a working methodology – aka: unit test framework – but if you did, I’d like to thank you for that nice bit of foreshadowing!
Alex is right: a proper framework is pretty important for anyone doing TDD, primarily because it gives you the opportunity to get up and running quickly.
In the software world, there are a number of wildly popular unit test frameworks. JUnit might be the best known (for those doing Java development) but there are about a million others (as you can see with a quick trip over to Wikipedia). A unit test framework is critical for TDD; that’s why Rob Saxe and I (both formerly of XtremeEDA) put one together a couple of years ago for people wanting to do TDD with SystemVerilog.
First presented at SNUG San Jose in 2009, SVUnit is a unit test framework that provides:
- structures for creating and running unit tests
- utilities for reporting test status and results
- a test “runner” that executes a set of test suites and reports on their overall success/failure
- scripts to automate the mechanical aspects of creating new unit test code
This post briefly covers the usage model for the framework. Actually, I’ll split the topic across three posts: this one covers how to use the framework, the next will discuss where to use it when building your verification components, and the last will discuss how RTL designers can potentially use it. As this post is more a teaser than a full-blown article, you should ask the folks at XtremeEDA for a copy of the SNUG paper that Rob and I wrote; it gives full details on the framework design and usage model. Of course, the next question you’ll be asking is: where can we get the code? We’re working on that… more details very shortly.
At its most basic, there are three hierarchical classes that you need to know about:
- svunit_testcase: an abstract base class that you derive from and add your TDD code into.
- svunit_testsuite: which aggregates and runs the svunit_testcase objects.
- svunit_testrunner: which aggregates and runs the svunit_testsuite objects, and reports the errors.
The following UML class diagram gives an overview of the three classes and the hierarchical relationship between them. For those of you unfamiliar with UML, this diagram shows that a svunit_testrunner holds one or more svunit_testsuite objects, each of which holds one or more svunit_testcase objects.
Every testcase that you create derives from the abstract svunit_testcase base class. It has three methods that you’ll need to know about:
- setup(): where you can optionally set up the pre-conditions for the test.
- run_test(): where you add your test code.
- teardown(): where you can optionally restore the verification component back to reasonable defaults.
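Based on that description, a testcase skeleton might look something like this. This is a sketch, not code from the paper: the class and member names are illustrative, and I’m assuming the base class needs nothing beyond the three methods listed above.

```systemverilog
// Hypothetical unit test for a packet BFM, derived from the
// svunit_testcase base class described above. All names other than
// svunit_testcase, setup, run_test and teardown are illustrative.
class packet_bfm_unit_test extends svunit_testcase;

  packet_bfm bfm;  // the 'class under test' (illustrative name)

  // Optional: establish the pre-conditions for the tests
  task setup();
    bfm = new();
  endtask

  // Required: your test code goes here
  task run_test();
    // test calls go here
  endtask

  // Optional: restore the verification component to reasonable defaults
  task teardown();
    bfm = null;
  endtask

endclass
```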
There are also two macros to help in creating the tests:
- FAIL_IF(expression): similar to an assertion, where the testcase fails if the expression is true. e.g., FAIL_IF(my_packet.calc_crc() == TRUE);
- FAIL_UNLESS(expression): corollary to FAIL_IF, where the testcase fails if the expression is false. e.g., FAIL_UNLESS(my_packet.get_parity(data) == 0);
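In context, a small test task using both macros might read as follows. Again a sketch: my_packet, pack_bytes and size are hypothetical names, not part of the framework.

```systemverilog
// Illustrative test task; the only framework pieces here are the
// `FAIL_IF and `FAIL_UNLESS macros. Everything else is assumed.
task test_pack_bytes();
  my_packet pkt = new();
  pkt.pack_bytes('{8'hde, 8'had});

  `FAIL_IF(pkt.size() == 0);       // testcase fails if this is true
  `FAIL_UNLESS(pkt.size() == 2);   // testcase fails if this is false
endtask
```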
Then you can start creating unit tests in a class called class_under_test_unit_test (which is stored in a file called class_under_test_unit_test.sv). This class derives from the svunit_testcase base class.
Add your test code into the run_test() task. If required, you can add pre-test and post-test code in the setup() and teardown() tasks, respectively. As discussed in previous posts, the idea is to create a set of tests that each focus on one aspect, or an atomic feature, of your verification class. These tests will incrementally build over time as you develop your verification class. I suggest you create a set of tasks inside your classname_unit_test that are called by the run_test() task. e.g.,
    task run_test();
      `INFO("Running Unit Tests for class: packet_bfm:");
      test_pack_bytes();
      test_unpack_bytes();
      test_send_data();
    endtask : run_test
There is some “plumbing” code that needs to be created to instantiate and run these testcases. As this is a tedious, repetitive task, we created some basic scripts that generate the code for the svunit_testsuite and svunit_testrunner classes. Ideally, all you’ll ever need to write is the unit test code. The script finds all the source files that match the pattern *_unit_test.sv, and then creates a corresponding class (derived from the svunit_testsuite class) responsible for instantiating and running your unit test objects. The script then wraps those test suite classes into a test runner class (derived from the svunit_testrunner class) responsible for instantiating and running the test suite objects. At the end of the run, the auto-generated svunit_testrunner object reports on the pass/fail of the unit tests.
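To give a feel for what the scripts take off your hands, here is a rough sketch of the shape of that generated plumbing. This is emphatically not the actual generated output — the real method names and structure come from the scripts — but it shows the aggregation described above, using only the testcase methods already introduced.

```systemverilog
// Sketch only: the real plumbing is auto-generated by the SVUnit
// scripts and will differ in its details. The suite name and the
// run() task shown here are assumptions.
class packet_bfm_testsuite extends svunit_testsuite;

  packet_bfm_unit_test packet_bfm_ut = new();

  task run();
    // for each aggregated testcase: setup, test, teardown
    packet_bfm_ut.setup();
    packet_bfm_ut.run_test();
    packet_bfm_ut.teardown();
  endtask

endclass
```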
As discussed in previous posts, the flow of using TDD is to create the tests first, and then add just enough code into the class being tested until the tests pass. The flow of creating a unit test with the svunit framework is similar:
- Create your unit test class derived from svunit_testcase. If the ‘Class Under Test’ does not exist, create an empty class that your unit test class can instantiate.
- Add your test code into run_test() (or better yet, into a separate task that run_test() calls).
- Create the plumbing code by calling the script to auto-generate the derived svunit_testsuite and svunit_testrunner classes (the script also creates a Makefile to run the tests). Don’t touch any of these auto-generated files, since they will be overwritten every time you run the script.
- Run the test (make sure it fails!!).
- Add just enough code into your class under test until the test passes.
- Repeat steps 2 to 5 until all aspects of your class under test are completely tested.
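Steps 4 and 5 deserve a concrete illustration: the test fails first, then you add just enough code to make it pass. The names below are all illustrative (crc_calc is not from the paper), and the hard-coded return is deliberate TDD style.

```systemverilog
// Step 4 (the failing test): written first, against a crc_calc class
// that starts out empty, so the first run reports a failure.
task test_crc_of_empty_stream();
  crc_calc c = new();
  `FAIL_UNLESS(c.crc() == 32'hffff_ffff);
endtask

// Step 5 (just enough code to pass): a hard-coded return is fine for
// now; a later test will force the real algorithm into existence.
class crc_calc;
  function bit [31:0] crc();
    return 32'hffff_ffff;
  endfunction
endclass
```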
The svunit framework provides you with some tools to make TDD easier to use — you focus on the individual tests; the framework runs those tests. Of course, there is more to this framework, but I would just be repeating the contents of the paper in a post. Just ask our friends at XtremeEDA for a copy of the paper.
This post provides a brief introduction to the SVUnit framework. The next posts will provide some thoughts and guidance on where to use this technique for verification and RTL. As well, I’ll discuss some future directions for this pre-UVM framework.