# What is Conform?

Conform is a toolset for trace-based testing and verification of cyber-physical systems.

Auxon's SpeQTr language lets you write complex specifications of how your system should behave, building a test suite or directly encoding system requirements. With Conform, you first record system execution using Modality and then evaluate specifications against the recorded execution. This separates exercising your system from verifying it, simplifying each part. You can keep different versions of tests over time, share results across teams, recall past results, and easily evaluate tests against historical data to determine when a problem was introduced. When you encounter a failure, Conform can tell you exactly which part of the specification was violated, putting you well on your way to resolving the issue.

# Core Concepts


Conform is built on top of Auxon's Modality database. You can review Modality's basic concepts in its introduction.

# Specifications

Specifications are part of the SpeQTr language and, as the name suggests, let you specify how you expect your system to behave. You can describe nominal behavior, known recovery modes, and explicitly prohibited behavior. A simple specification could say that you should never observe a catastrophic failure event, or that you should observe a successful completion event. More commonly, you will write specifications with triggers, which let you specify exactly what you expect the system to do, or not do, following each occurrence of a given trigger pattern.

The example below shows a very simple specification. It uses a `prohibited case` block to verify that no battery voltage readings fall outside the allowable range.

```
behavior "Battery voltage"
    prohibited case "Voltage is outside allowable range"
        voltage_reported@battery(_.voltage < 11.5 OR _.voltage > 12.3)
```
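A trigger-based specification could look something like the following sketch. Note that this is purely illustrative: the `when "…" as cmd` trigger clause, the event names (`command_sent`, `command_acked` at a `controller` location), the `nonce` attribute, and the 100ms bound are all assumptions for the sake of example, not drawn from a real system; consult the SpeQTr documentation for the exact trigger syntax.

```
behavior "Command acknowledgement"
    when "a command is sent" as cmd
        command_sent@controller
    nominal case "it is acknowledged in time"
        cmd
            -> within 100ms
        command_acked@controller(_.nonce = cmd.nonce)
    end
end
```

Here the behavior is checked once for every occurrence of the trigger pattern, rather than once for the whole trace.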

# Exercising vs Verifying Your System

Traditional tests consist of a careful orchestration of exercising and verifying your system: you make the system execute some tasks, verify that it performed correctly based on the data available at that point, and repeat. Conform lets you decouple these two aspects of testing, improving both. Your test script now only needs to put your system through its paces while Modality records its execution, without worrying about pauses, assertions, surfacing data for testability, or anything else. The actual suite of tests you run is kept in Conform, which presents a number of advantages: you can iterate on your tests independently of your execution script, easily share or recall test results, and run new tests against previously recorded executions.

# Logical Analysis

Conform's logical analysis capabilities make investigating failing tests much faster and easier. Most specifications have a large number of failure modes, since they can describe complex patterns of events with various filters based on event attributes or the time between events. Without Conform, a failing test typically just tells you that you expected one thing but got another. When a specification fails, however, Conform tells you what made it fail. For example, Conform could tell you that a specification of a complex initialization routine failed because, in one instance, the controller polled for completion more than the allowable number of times. Ordinarily, the first step of investigating a failing test is trial and error to determine which specific part of the larger test caused the failure, before you go figure out how the illegal behavior actually occurred. Conform's logical analysis does that first step for you, letting you get straight to understanding how things went wrong.

Here we see the result of evaluating an example specification. The specification failed, and Conform can tell us why: the specification requires a "message sent" event to be followed by a "message received" event within a certain amount of time, and the "message received" event took too long. This already narrows our investigation, since we know that the message was not entirely lost, but rather the system took too long to acknowledge its receipt.

```
$ conform spec eval --name example
No explicit workspace provided. Using the default workspace.
evaluation of spec "example"
[  ] segment "Run 96d823a3-d5bd-47a8-8700-a250121b598e"
[  ]   behavior "Pipeline messaging"
Failures Detected
Behaviors: [0: Passed] [0: Inconclusive] [1: Failed] (stored as 34119aa2-1130-4322-baa0-52576532f49f)

 1 | behavior 'Pipeline messaging'
 6 |     nominal case 'it is received by the consumer'
 7 |         tx
 8 |            -> within 850us
   |               ^^^^^^^^^^^^ Did not match
 9 |         "Received measurement message"@consumer (_.interaction.remote_nonce = tx.nonce)
10 |     end
   = error: When evaluating the behavior 'Pipeline messaging', no expected cases
            matched. The case 'it is received by the consumer' partially matched,
            but an event matching the subexpression 'within 850us' was not found.
```

This example also shows a second advantage of using Conform. The summary results line shows 1 failure, followed by `(stored as 34119aa2-1130-4322-baa0-52576532f49f)`. Evaluation results are stored under a unique ID in Conform, which makes sharing results across teams and coordinating investigations much easier, letting you get to the actual problem solving more quickly.

# Conform is made by the team at Auxon

If you'd like to start a discussion or schedule a demo, contact us.

Otherwise, move on to Installation.