Quality Assurance

wolfSSL Software Development Process and Quality Assurance

The wolfSSL ecosystem consists of several software modules and components, each with specific goals and purposes. 

  • The wolfCrypt cryptography engine is a lightweight crypto library offering a complete, up-to-date set of the most popular algorithms, as well as progressive ones such as HC-128, RABBIT and post-quantum cryptography. wolfCrypt may be used to safely store data at rest and is consumed by most other product offerings from wolfSSL Inc.
  • The wolfSSL embedded TLS library supports the latest standards up to the latest specifications. This library utilizes the algorithms offered by wolfCrypt to implement secure communication protocols and mechanisms to secure data in transit.
  • wolfMQTT is an MQTT client library that secures its sessions by default, running MQTT over TLS.
  • wolfSSH is a client and server library implementation of the SSH 2.0 protocol, designed to fit the most resource-constrained embedded devices while offering full protocol support, including SFTP and SCP, and providing authentication via public key and/or password.
  • wolfBoot is a secure bootloader designed following the open standards for secure remote firmware updates on IoT systems.
  • wolfTPM is a library to access the functionality of TPM 2.0 secure cryptography devices.
  • wolfSentry is an intrusion detection and prevention system designed for small resource-constrained embedded systems.

All our software products are engineered using the quality standards required by our process, which is briefly summarized in this document.

Each step in the software life cycle is regulated by strict rules and testing criteria (including stringent fuzz-based testing) that ensure defects and regressions in the code are detected very early.



Software engineering process overview

From a software engineering standpoint, the life cycle of the design and development of wolfSSL software components is structured in three steps:

  • Identification and analysis of software requirements and specifications
  • API-oriented software design
  • Software module development

Each step is then verified through a specific set of quality control procedures, including:

  • Unit tests and periodic coverage analysis
  • API consistency tests
  • Integration tests on multiple architectures/compilers and use cases/combinations of compile time configuration options
  • Interoperability tests against other implementations
  • Formal algorithm and module verifications
  • Fuzz testing

To improve the safety of the code and to detect potential defects or misbehaviors, additional quality controls are regularly applied to the software modules, in particular static analyzers, dynamic memory diagnostic tools and fuzzers, with more tools being added all the time.

Distributing the source code under a GPL license and exposing the entire development process publicly on the GitHub platform guarantees that hundreds of users, contributors and people interested in the project are constantly aware of every change in the code and of the conversations generated during code reviews. Corporate security organizations, cybersecurity partners and academic researchers make very valuable contributions by constantly studying new vulnerabilities and carefully exploring the potential attack surfaces that may be harder to identify while writing the code. wolfSSL Inc. takes vulnerability management very seriously and has precise and detailed checklists to run in case of an emergency fix and rapid release.

Due to the constantly changing nature of the specifications, the software design process must be flexible enough to accommodate updates in the algorithm implementation, the usage recommendations and the guidelines to implement secure protocols and mechanisms.

NIST distributes specifications and guidelines through “Special Publications” (SP). The National Cybersecurity Center of Excellence (NCCoE), part of NIST, develops these guidelines through collaboration with industry organizations, government agencies and academic institutions. Similarly to IETF publications, the NIST approach consists of releasing frequent updates and amendments to earlier publications, in order to keep the guidelines current. The process of updating the guidelines within NIST is regulated by the “NIST Cryptographic Standards and Guidelines Development Process” (NIST.IR.7977). Cryptographic functions in wolfCrypt follow the latest NIST specifications for algorithms and their implementation process. As explained later, NIST publications and software tools play an important role in the algorithm and module verification phase.

WolfMQTT is implemented upon the specifications provided by OASIS, initially covering MQTT version 3.1.1, approved in December 2015. WolfMQTT has since evolved to support the specifications in OASIS MQTT Version 5.0, approved as the latest MQTT standard in March 2019.


Software requirements and specifications

The guidelines for the implementation of cryptography, secure communication protocols and secure firmware update mechanisms are described by open standards. These standards are maintained and documented by several organizations. WolfSSL software projects import specification documents from three major organizations, namely:

  • IETF, the Internet Engineering Task Force, a large open international community of engineers, in charge of publishing and updating the documentation for the Internet protocols stack, which nowadays also includes secure communication protocols and algorithms used in ciphersuites
  • NIST, the United States National Institute of Standards and Technology, providing guidelines for processes, modules and algorithms, recognized globally as best-practices in cryptography
  • OASIS, the Organization for the Advancement of Structured Information Standards, a global nonprofit consortium that works on the development of open standards for the IoT and data exchange

IETF releases new specifications in the form of “Request for Comments” (RFC) documents. These are individually numbered publications which are published after a peer review process that often requires multiple draft phases. Once an RFC is assigned its unique number, it is never modified again. To update a standard track document that has been published as an RFC, a new RFC is issued that may contain amendments, corrections or replacements of previously published RFCs. Newer RFCs can supersede older ones by making them obsolete or deprecated. RFCs cover the specifications for a large part of wolfSSL communication modules, such as the TLS protocol standard (RFC8446), DTLS (RFC6347), TLS extensions (RFC6066) and several others. The wolfCrypt library follows the recommendations for the implementation of the cryptographic primitives based on the algorithms supported, such as RSA public-key cryptography (RFC8017), ChaCha20/Poly1305 for AEAD (RFC8439) and many others. WolfSSH has been designed and developed upon the specifications of the RFC4250-RFC4254 series, documenting SSH-2 as a proposed internet standard. WolfBoot was initially designed and developed according to the guidelines of draft-ietf-suit-architecture, which later became RFC9019.

Software design

Most of the software components developed by wolfSSL are in the form of a structured library, with an API oriented design. Once functions are part of the API they will never change their signature, their purpose or the meaning of their return values. This ensures compatibility across different versions of the library. If a feature is added to an existing functionality, a new API function is created, which accepts different arguments, or extends specific interfaces. The API function calls are formally documented in the module user manual.

One of the most important aspects to keep in mind during the design phase is the correct meaning, propagation and verification of the error codes across the different layers of the API. Each error code has unique and well-defined meanings that are explained in the manual. This facilitates the identification of run-time errors in the application using the library.

Due to the dynamic, changing nature of the specifications around cryptography and secure protocols, the design process must adapt accordingly. New specifications are analyzed and integrated in the existing module architecture, ensuring that this does not break existing features by keeping the existing API function signatures immutable in time.

Software development and traceability

All software at wolfSSL is developed and maintained following the continuous integration practice, via a centralized git mainline repository. All source code is public and accessible at any time during the development, under a GPL license.

The life-cycle of software components in the wolfSSL ecosystem is different from the typical open-source development process, and it is designed to comply with the modified condition/decision coverage (MC/DC) process. WolfSSL owns and maintains the entire code base, which means that there are strict rules in place regarding changes and updates to the mainline. Modifications to the repositories are only allowed by wolfSSL engineers, who also have to comply with a strict peer-review policy before any change is merged into the master branch.

To contribute to wolfSSL repositories, a developer must submit a “pull request” through GitHub. The request is then reviewed by one or more wolfSSL engineers (depending on the size, the impact and the nature of the patch). This often results in requests for alignment with the purposes in the design documents, changes in the code, re-adaptations and improvements before the code is accepted for merging. Only approved contributors are allowed to submit their code for review. Contributors outside the wolfSSL engineering team must be approved before their code is considered for inclusion.

Quality assurance

The first verification of the functionality of the code is performed locally, on the development PC of the contributor. Git commit and push hooks ensure that the code can be submitted only if it passes a first set of functionality and unit tests. Once the pull request is published, a full round of non-regression and integration tests is automatically started and the status of the pull request is updated with the test results. In order for the pull request to be accepted for inclusion, it must pass the peer review and the non-regression and integration tests. The tests are automatically re-triggered during the review process every time the code in the pull request is modified. For a more detailed insight into our testing and review process, check out Overview of Testing in wolfSSL.

Quality control automation

At wolfSSL we have deployed a hybrid (on-site + cloud-based) infrastructure, using Jenkins to coordinate the workload between the nodes and apply quality control on a regular basis. This includes the execution of software tests every time a contribution is evaluated for inclusion in the mainline, as well as other types of quality control applied on a regular basis (e.g. nightly, weekly).

The hybrid approach is motivated by the portability of the wolfSSL software ecosystem. The software must run on several different hardware architectures and interoperate with specific hardware components, such as hardware crypto modules and TPMs. Using physical machines for some of the Jenkins nodes provides mechanisms to configure and control specific hardware targets, including microcontroller boards that can be configured and programmed automatically.

Some tests require a long time to run. Continuous integration tools are very useful to split the application of the quality control jobs over a longer time, to ensure that every test is performed on a regular basis.

Formal algorithms and modules verification

In order to validate the correctness and the adherence to the standards implemented, wolfSSL software components are tested using tools and procedures recommended by NIST. This includes a full set of functional tests using a set of well-known input values (test vectors) and expected results. The correctness of many cryptographic algorithms can also be verified by inspecting the intermediate results of the calculations.

NIST also issues a series of publications (FIPS 140) coordinating the requirements and standards for cryptographic modules for use by departments and agencies of the federal government. Through the effort of specialized accredited third-party laboratories across the U.S. and Canada, two validation programs are made available to certify compliance with the FIPS 140 regulations. wolfCrypt has achieved FIPS 140-2 certification, and has already applied for the recently approved FIPS 140-3 certificate. FIPS certification requires that the cryptographic module be successfully submitted through both validation programs:

  • Cryptographic Algorithm Validation Program (CAVP) provides validation testing of approved cryptographic algorithms implemented.
  • Cryptographic Module Validation Program (CMVP) certifies the module for use by the Government and regulated industries for securing sensitive data and information.

The same types of tests used in CAVP/CMVP are repeated on the mainline automatically, to ensure that modifications in the code do not impact the integrity of the algorithms and the modules as specified by FIPS 140.

Interoperability tests

One very effective way of verifying that the behavior of the protocols remains the same throughout the continuous integration is running interoperability tests against different implementations of the same standards. The wolfSSL quality control infrastructure provides a number of scheduled tests that use a different implementation as the remote endpoint of the communication, and compare the results of cryptographic operations starting from common vectors.

Unit tests

Unit tests are mandatory for all core modules, and run on the developer’s machine upon new commits.

The coverage of the unit tests is measured on a weekly basis, to ensure that there are no coverage regressions when adding new functionality to the code. WolfSSL developers receive a full code coverage report in their mailboxes every week thanks to the automation provided by the Jenkins infrastructure for continuous integration.

API consistency verification

One specific set of tests verifies that changes in the implementation do not alter the usage of the API from the application development perspective. These tests are never updated, with the only exception of adding new functions to the API. The API must remain consistent across versions, as it is the contract between the application and the library. Verifying all these aspects is the goal of a subset of tests that run nightly to verify adherence to the requirements.

Integration tests

Due to wolfSSL portability, it is necessary to expand the test domain with custom configurations, which requires compiling the software for different architectures and combinations of compile-time configuration options and test applications. Both real and virtual machines are used as targets for running the test suite. Automating the tests on different architectures (x86, ARM, PowerPC, RISC-V, MIPS, …) ensures that architecture-specific regressions or bugs can be detected and identified early in the process, and that the expected behavior remains consistent in all cases. Thanks to the hybrid model of our continuous integration infrastructure, several targets are connected to Jenkins nodes that are in charge of running the software tests through a wide range of specific hardware and software configurations and use cases.

Safety assessment: looking under the hood

The continuous integration infrastructure also automates the execution of several analysis tools.

Static analysis tools look for inconsistencies in the code, exploring all the different combinations of compile-time options and following different code paths. These tools can detect a wide range of programming mistakes, potential errors, and undefined behaviors in the language that may not be covered by the compiler, by applying rigorous checks at the source code level. The tools used by wolfSSL and automated in the CI include cppcheck, the Clang static analyzer (scan-build), Facebook Infer and others.

Memory analysis is performed on a regular basis to look for bugs related to memory handling. WolfSSL uses the Valgrind memcheck tool, Clang sanitizers and other dynamic analysis tools to exercise the code. These tools detect memory errors such as accessing uninitialized or previously freed memory, using undefined or uninitialized values, memory leaks and more.

Fuzzers are a very important resource to improve the robustness of the code against unexpected situations. The goal of these tools is to attempt to cause malfunctions in the code by injecting a large number of random inputs in quick succession. Fuzzing is often a very effective way to detect bugs and vulnerabilities in the code that could otherwise go unnoticed for a long time. At wolfSSL we constantly run fuzzers to feed the API functions and the transport back-end, periodically rotating all the possible seed values for the PRNG regulating the mutation of the generated inputs. With mutation fuzzing, a bug that is triggered with a given seed value can be reproduced by relaunching the same test with the same seed value manually, allowing for easy reproducibility and analysis through instruments and debuggers. Since these kinds of tools must be aware of the application domain, the protocol structures and the characteristics of the data, wolfSSL uses two main fuzzers that have been written for the purpose. WolfFuzz operates over memory buffers and fuzzes the internal cryptography operations. This mechanism allows very fast fuzzing, and the entire range of 4 trillion PRNG seeds is tested in three months. A second tool, the wolfSSL Network Fuzzer, runs over TCP/IP. For this reason it is much slower, but more flexible for testing the code that secures data in motion.

Vulnerability management

At wolfSSL we take vulnerabilities very seriously, and we are committed to releasing a new version of the software within 36 hours of disclosure. This ensures that, in case of responsible disclosure, the vulnerability is fixed well before its details or any proof of concept to reproduce it are made public.

A vulnerability claim triggers emergency procedures, consisting of a Standard Operating Procedure (S.O.P.) that has to be completed to speed up the resolution of the issue and the release of a new version. The vulnerability claim is verified within the first 120 minutes. In this phase, a document is created and distributed internally to the engineering team, describing the issue and all available instruments, along with instructions on how to reproduce it, so that the error can be confirmed and the proposed fix can be assessed to determine whether the issue has been addressed completely. The fix may take between a few minutes and 24 hours, depending on the complexity of the issue. When the fix is ready, it is submitted either in the form of an internal patch or as a public pull request, when it has been determined that the fix will not leak critical information to would-be attackers monitoring the wolfSSL git repository. The patch or public PR is then reviewed by multiple engineers, since code changes are still validated through both manual code reviews and the test procedures. After a few iterations of the review process loop, the automated integration server verifies the fix by running all the necessary pre-release tests. At the end of the verifications, if all the tests pass, a new release is issued, and all users and customers are notified through all available communication channels.

Contact wolfSSL at facts@wolfssl.com with questions about the wolfSSL embedded SSL/TLS library, or to get started with wolfSSL in your project! wolfSSL supports TLS 1.3, FIPS 140-2/3, DO-178, and much more!