Static Analysis: Is It Necessary for Your Automatic Build?

Is This Extra Step Necessary?
Static analysis has the potential to become an integral part of your regular development as well as your continuous integration setup. Is it necessary for your automatic build and build security? The short answer: Yes. We will explain why.
There are two main types of analysis used in continuous integration: static and dynamic. Static analysis runs without executing the code; it takes a direct look at your source code, or at already-compiled output files, to do its checking. That is its key advantage over dynamic analysis, which requires compiling and running the code, and over manual review, which requires a human eye and human knowledge to analyze the code. Read: more costly and time-consuming.
This process becomes necessary with medical devices, as it gives a deeper look into the underlying health of your code. Here is some info on the FDA's recommendations for using static analysis, which mention another static analysis tool: CodeSonar. Let's look at how static analysis can eliminate some major problems further down the road.
Benefits of Static Analysis
Here are some of the benefits static analysis brings and the common problems it helps eliminate:
- Catches commonly overlooked bugs that would otherwise show up during manufacturing or, worse yet, in customers' hands
- Gives you a look into the structure of the code
- Reinforces the use of coding standards
- Easy to automate for faster, more frequent tests
- Code coverage – you can be certain every line of code has been analyzed
- Automated tools catch certain classes of issues far more reliably than manual review
- Gives you, the programmer or engineer, more time to do the fun stuff: coding
Here’s an Example
There are many static analysis tools available. The programming language, cost, coding standards, and type of analysis all play a part in which tool you may want to bring into your system. Here is one we are familiar with: Coverity.
Coverity is a bit different from other static analysis tools, such as CPPCheck or Lint, as you send your files up to their cloud for analysis where you can then grab the report. Here’s a quick example of what a report can look like. This example shows Coverity running on a simple template we have on our EmbedOps system.

An example report of Coverity static analysis
Now, this is just a small project without any errors. On a larger codebase, however, you could definitely expect it to pick up resource leaks, dereferences of NULL pointers, use of uninitialized data, memory corruption, buffer overruns, and more.
How Can You Do It?
You can run these programs manually, but you're setting up static analysis because you ultimately want automation. Here's an example of how you can add this tool to a CI/CD pipeline on our EmbedOps system.
First, you will need to add an analysis stage to your CI/CD YAML.
stages:
  - build
  - analysis
  - test
Next, we'll put together a Coverity job template to actually run the analysis. This one is a bit more involved than other tools, so don't let it intimidate you. And if you do use Coverity, hopefully this can help you out. Most tools offer help on their sites for getting integrated into CI/CD pipelines. (Here's Coverity's to prove the point.)
Dojo Five’s EmbedOps Job Template:
.coverity_job_template:
  stage: analysis
  # As this is an analysis job, we would only like it to run on merge requests
  only:
    - merge_requests
  script:
    # Grab Coverity's tools and install them on the machine
    - curl -o /tmp/cov-analysis-linux64.tgz https://scan.coverity.com/download/linux64 --form project=$COVERITY_SCAN_PROJECT_NAME --form token=$COVERITY_SCAN_TOKEN
    - tar xfz /tmp/cov-analysis-linux64.tgz
    - mv $CI_PROJECT_DIR/cov-analysis-linux64-*/ $CI_PROJECT_DIR/sam_d21_cnano.X/
    # Create the xc32 template using the files provided by Microchip
    - ln -fs /opt/microchip/xc32/v2.50/etc/Coverity/mchip_xc32/ $CI_PROJECT_DIR/sam_d21_cnano.X/cov-analysis-linux64-*/config/templates/
    - sam_d21_cnano.X/cov-analysis-linux64-*/bin/cov-configure --template --compiler xc32-gcc --comptype mchip:xc32
    - prjMakefilesGenerator -v ${PROJ_PATH} && pushd ${PROJ_PATH}
    # Run the tools
    - cov-analysis-linux64-*/bin/cov-build --dir cov-int make all
    # Send the compilation up to Coverity's cloud
    - tar cfz cov-int.tar.gz cov-int
    - curl https://scan.coverity.com/builds?project=$COVERITY_SCAN_PROJECT_NAME --form token=$COVERITY_SCAN_TOKEN --form email=$GITLAB_USER_EMAIL --form file=@cov-int.tar.gz --form version="`git describe --tags`" --form description="`git describe --tags` / $CI_COMMIT_TITLE / $CI_COMMIT_REF_NAME:$CI_PIPELINE_ID"
To give a quick breakdown of this:
stage:
– Added to include this job in the analysis stage.
only: - merge_requests
– As a rule of thumb, we find it good practice to run the analysis on merge requests. We don't need it to run on every commit we make, but having it run on a merge request right before the code is merged is the perfect spot to air out those unseen errors!
script:
– The magic. The implementation. This is our script based on Coverity's documentation. To go over it quickly: we grab Coverity's tools, create a compiler template using Microchip's xc32, build the project, and send the results up to Coverity's cloud. You can then log onto Coverity and check out the analysis; each and every merge request will populate a new one.
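One note on wiring this up: because .coverity_job_template starts with a dot, GitLab treats it as a hidden job and won't run it on its own, so a concrete job needs to pull it in with the extends keyword. Here's a minimal sketch of what that could look like; the coverity_scan job name and the variable values are placeholders you would adapt to your own project, and the token should live in your CI/CD settings rather than in the YAML itself:
# Hypothetical job that reuses the hidden template above
coverity_scan:
  extends: .coverity_job_template
  variables:
    # Placeholder values - point these at your own project and Coverity Scan account
    COVERITY_SCAN_PROJECT_NAME: "my-org/my-firmware"
    PROJ_PATH: "sam_d21_cnano.X"
    # COVERITY_SCAN_TOKEN should be set as a masked CI/CD variable in GitLab, not committed here
GitLab merges the template's stage, only, and script sections into coverity_scan, so each merge request pipeline runs the full analysis without repeating any of the boilerplate.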
A couple of notes about Coverity:
- While Coverity supports a bunch of compilers, Microchip’s xc32 is not one of them. Fortunately, Microchip provides its own template specifically for Coverity so we could bring that in. If your compiler isn’t in the supported list and one isn’t provided, you will need to write your own template to use Coverity’s analysis.
- Most static analysis tools do not require sending your code to the cloud, and their output files can be found directly on the machine that ran them. In that case, you would want to add an artifacts: section to your job that gives you downloadable reports/output files to review, rather than needing to log into a cloud.
- For example, CPPCheck would use an artifact in this way. Its artifacts: section would come after the script: section (a fuller job sketch follows after this list):
artifacts:
  name: "$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA-cppcheck-report"
  paths:
    - ${OUTPUT_DIR}
- Free vs. paid: unlimited scans on any size codebase are offered to paid users, while free users are limited to a certain number of scans per week depending on the size of the project. Free plans also can't be used on private repos.
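To make the artifacts approach concrete, here is a minimal sketch of a CPPCheck job that keeps its report on the runner. The cppcheck_analysis job name, the src/ source path, and the OUTPUT_DIR value are illustrative placeholders, and the cppcheck flags are just one reasonable starting point rather than our canonical setup:
# Hypothetical CPPCheck job - reports stay on the runner as downloadable artifacts
cppcheck_analysis:
  stage: analysis
  only:
    - merge_requests
  variables:
    OUTPUT_DIR: "cppcheck-reports"   # placeholder folder for generated reports
  script:
    - mkdir -p ${OUTPUT_DIR}
    # cppcheck writes its XML results to stderr, so redirect that into the report file
    - cppcheck --enable=all --xml --xml-version=2 src/ 2> ${OUTPUT_DIR}/cppcheck-report.xml
  artifacts:
    name: "$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA-cppcheck-report"
    paths:
      - ${OUTPUT_DIR}
When the job finishes, GitLab attaches the ${OUTPUT_DIR} folder to the pipeline as a downloadable artifact, so you can review the findings right on the merge request without ever sending code off your own infrastructure.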
Hopefully, this can get you moving in the right direction for bringing in the wonders of automated static analysis. Most of the work is already done for most tools; you just need to find it and bring it into your pipelines!
So, to answer the question: static analysis, do I really need it? Strictly speaking, no, but your code and your sanity will thank you for it! If you'd like to make sure your firmware is secure, reach out! If you have questions about an embedded project you're working on, Dojo Five can help you with all aspects of your EmbedOps journey. We are always happy to hear about cool projects or interesting problems to solve, so don't hesitate to reach out and chat with us on LinkedIn or through email!