Automating digital design and verification tracking with testplanner

Topics: Open ASICs, Open FPGA, Open source tools

In digital design, ensuring systematic verification of all design aspects requires not only proper planning of design verification and test development, but also tools to track design development and verification and make sure they go according to plan. Gaining visibility into the design through interconnected RTL sources, testplans, documentation and test results is crucial for comprehensive validation, especially in large, complex projects such as CHIPS Alliance’s Caliptra RoT project, which Antmicro is participating in.

Open source tooling is a natural fit for collaborative development of this kind, and OpenTitan (another open source Root-of-Trust project) has been showing the way in developing tools and workflows to maintain a coherent system state while enabling users to navigate between different abstraction levels of a design. One of the tools originating in OpenTitan is testplanner, used for parsing testplans written in the HJSON format into a data structure that can be then integrated with documentation. Antmicro has been collaborating with the OpenTitan project for a long time, and is now working with lowRISC (the OpenTitan steward organization) to extend and adapt tools like testplanner for Caliptra and other projects that we’re working on.

In this blog we describe a prototype of a standalone, enhanced testplanner, packaged as its own pip module, with extra features we developed for our needs, such as test result interlinking and source code links. With even more improvements under way, we hope the new, upgraded tool will eventually be adopted across open source RTL projects like OpenTitan and Caliptra.

We will also showcase its application in CHIPS Alliance’s I3C core, where it is used for generating testplans in the project documentation and producing simulation result tables.

Testplanner illustration

Testplanner as a standalone module

The standalone version of testplanner is a utility capable of generating documentation pages for testplans expressed in a machine-readable HJSON format. The testplans consist of testpoints, each of which groups tests that together cover a single feature. Testpoints also carry a description string that can supply further information about the functional details of the test; when a standard convention is used, those descriptions can be further split up into smaller pieces in certain output configurations.

While a collection of testpoints is normally defined as a single HJSON file, an “include” directive lets you generalize certain steps or reuse parts of other flows. Each testpoint is also part of a stage, which aggregates data about testpoints both inside and across testplans.

The use of an HJSON schema - the H standing for “human”, as HJSON is a human-friendly extension of JSON - makes testplans created in this manner both human-writable and machine-readable, letting users automate testplan, documentation and test results management.
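To illustrate the format’s human-friendliness and the include mechanism, here is a minimal sketch of a testplan file. The `import_testplans` key follows OpenTitan’s convention for pulling in shared testpoints; the file names and testpoint contents are hypothetical:

```hjson
// Comments and unquoted strings make HJSON easy to hand-edit.
{
  name: example_ip
  // Hypothetical path; reuses shared testpoints from another testplan.
  import_testplans: ["common/shared_testplan.hjson"]
  testpoints: [
    {
      name: smoke
      desc: "Basic connectivity test."
      stage: V1
      tests: ["example_ip_smoke"]
    }
  ]
}
```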

New features

To generalize testplanner for use in non-OpenTitan projects, we started by extracting the codebase from the monorepo, packaging it for isolated use through Python’s pip, and abstracting out some of the OpenTitan-specific assumptions to accommodate a variety of possible scenarios. The first functional change we introduced was the ability to add arbitrary stage names inside testplan files, as well as making it possible to not assign a stage to a testpoint. We then started prototyping testplans and seeing what additional features our use cases might need.

When working on the I3C core, we used cocotb to implement testbenches. This was in itself an interesting opportunity for extending testplanner, as by default the tool is meant to be consumed in a UVM environment (which we use as well in many other projects, so keeping backwards compatibility was important).

Cocotb produces results in XML, structured similarly to xUnit test outputs. Since testplanner expects simulation results in HJSON, we added a cocotbxml-to-hjson script that converts the cocotb XML test results to HJSON for further processing by testplanner.
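The conversion itself can be sketched as follows. This is an illustrative stand-in for the actual cocotbxml-to-hjson script: the input mimics an xUnit-style report, and the output schema (per-test passing/total counts) is an assumption, not testplanner’s exact format:

```python
# Illustrative sketch of converting an xUnit-style cocotb XML report
# into HJSON-compatible results; the real script's schema may differ.
import json
import xml.etree.ElementTree as ET

def cocotb_xml_to_results(xml_text: str) -> dict:
    """Collect pass/fail status per test case from an xUnit-style report."""
    root = ET.fromstring(xml_text)
    results = []
    for case in root.iter("testcase"):
        failed = case.find("failure") is not None
        results.append({
            "name": case.get("name"),
            "passing": 0 if failed else 1,
            "total": 1,
        })
    return {"test_results": results}

if __name__ == "__main__":
    report = """
    <testsuites>
      <testsuite name="i3c">
        <testcase name="test1" time="0.5"/>
        <testcase name="test2" time="0.7"><failure message="mismatch"/></testcase>
      </testsuite>
    </testsuites>
    """
    # JSON is a subset of HJSON, so plain json.dumps yields valid HJSON.
    print(json.dumps(cocotb_xml_to_results(report), indent=2))
```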

An interesting advantage of using structured data representations for the entire test design and implementation process is that everything can be interlinked, which makes jumping between sources, documentation and results extremely easy. Thus, a second batch of changes to testplanner focused on improving navigation between all the artifacts involved: testplans, test implementations, simulation results and summaries.

Moreover, this internal linking is incremental and reflects the provided data; if, for example, simulation results are not available, corresponding documentation links will not be generated. When documentation link mappings are skipped, the template also adjusts to skip them.

Backend improvements

The generation backend was extended to generate links between implementation files and project documentation, as well as basic navigation between testplans and the summary. We introduced an additional rendering method that can output Excel-compatible .xlsx spreadsheets documenting the testplan (for legacy compatibility). When a description adheres to the convention outlined below, it can optionally be split into multiple columns, while remaining compatible with the regular Markdown backend used for the HTML output.

There are four distinct fields in the descriptions that can be parsed:

  • Testbench - describes the architecture of the test itself,
  • Intent - outlines the desired effect and area of coverage,
  • Stimulus - enumerates the steps that are going to be taken to start and run the testbench, and describes the inputs to the test,
  • Check - describes the tested output and the desired final state of the tested device.

In such a case, each paragraph beginning with one of these section headers covers a single field, while remaining a regular Markdown bullet list or string, as shown in the example below.
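The splitting logic can be sketched in a few lines of Python. This is an illustrative reimplementation of the convention described above, not testplanner’s actual parser:

```python
# Illustrative sketch of splitting a testpoint description into the four
# sections named above; testplanner's real parsing logic may differ.
SECTIONS = ("Testbench", "Intent", "Stimulus", "Check")

def split_description(desc: str) -> dict:
    """Map each recognized section header to the body of its paragraph."""
    fields = {}
    current = None
    for line in desc.splitlines():
        stripped = line.strip()
        header = stripped.rstrip(":")
        if stripped.endswith(":") and header in SECTIONS:
            # Start collecting lines for a new section.
            current = header
            fields[current] = []
        elif current is not None and stripped:
            fields[current].append(stripped)
    return {k: "\n".join(v) for k, v in fields.items()}

if __name__ == "__main__":
    desc = """
    Intent:
    * Showcase that not all sections are required.

    Check:
    * Output matches the model.
    """
    print(split_description(desc))
```

Any text that appears before the first recognized header is simply ignored here; the real tool keeps unmatched text in a default comments column, as described below.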


{
  "name": "Example testplan",
  "testpoints": [
    {
      "name": "Example testpoint",
      "desc": '''
              Testbench:
              * Two modules connected to each other output-to-input

              Intent:
              * Test to prove that the modules are stackable.

              Stimulus:
              * Input a clock signal to the clock input of both modules.
              * Test should feed the first module a randomized sequence of data.
                * Edge cases should be taken into consideration, eg. an empty data lane.

              Check:
              * The output of the first module should be checked for correctness of calculation.
              * The output of the second module should be checked for correctness of calculation based on the results from the first module.
              '''
      "stage": "tests",
      "tests": ["test1", "test2"],
      "tags": [""]
    },
    {
      "name": "Another example testpoint",
      "desc": '''
              Intent:
              * Showcase that not all fields and sections are required.
              '''
      "stage": "",
      "tests": [""],
      "tags": [""]
    },
  ]
}

As we can see, the testplan is divided into smaller testpoints, which in turn can include multiple tests that can be associated with a stage and tagged.

If the desired output format is xlsx, the showcased description will be split into four separate columns, each containing only the relevant section. All of the sections are optional: unmatched text is left in a default comments column, and skipped sections leave their columns empty.

Use case: test management for the I3C project

An example use of testplanner can be found in the I3C project, where we use it to organize and document tests implemented for individual modules as well as the entire design. The usage of testplanner can be found in:

  • The project’s Makefile, where we:
    • Build documentation for tests of the I3C core, which can later be found in the Design Verification chapter of the documentation
    • Convert design simulation results generated by cocotb in XML format into HJSON files with testplanner’s cocotbxml-to-hjson script
    • Enrich documentation for tests of the I3C core, providing links to test implementations, test results and summaries
  • Design Verification chapter of the I3C documentation, which is an autogenerated chapter describing tests based on testplans, linking to implementations of tests where possible
  • Per-testplan test summaries describing the implemented tests, their status and processing time
  • Full testplanning summary demonstrating the implementation status of the testplans
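A Makefile wiring the flow together might look roughly like the sketch below. Target names, paths and command-line flags are hypothetical; consult the I3C project’s actual Makefile for the real invocations:

```make
# Hypothetical targets and flags; the real I3C Makefile differs.
tests:
	# Run the cocotb testbenches, producing xUnit-style XML reports.
	$(MAKE) -C verification/cocotb all

results.hjson: tests
	# Convert the cocotb XML results to HJSON for testplanner.
	cocotbxml-to-hjson results.xml -o results.hjson

docs: results.hjson
	# Render testplan documentation enriched with simulation results.
	testplanner testplan.hjson --sim-results results.hjson -o docs/
```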

Comprehensive verification for hardware-software co-design

With a variety of verification and test planning tools such as testplanner, as well as Coverview and Topwrap, Antmicro can help you ensure effective and comprehensive validation at every stage of a digital design project.

We offer support in adapting and integrating open source tools, including those developed within the OpenTitan and Caliptra projects, with existing workflows to facilitate cross-team collaboration and hardware-software co-design. If you would like to learn more about our engineering services, don’t hesitate to contact us at contact@antmicro.com.
