Workflow for Test Generation
Part of model verification and validation is generating tests that comprehensively assess model behavior. This process involves verifying that the model meets the intended test requirements and checking model coverage to identify missing testing scenarios. Simulink® Design Verifier™ supports these steps through automatic test generation. This topic describes the three important activities in model-based testing. For more information on model-based testing, see Model-Based Design with Simulink.
- Perform requirements-based testing
- Achieve model coverage
- Achieve generated code coverage
During the early phases of development, Simulink projects begin with textual requirements. As you move through the development process, you refine the requirements into specifications that you implement in code. After you develop the model, follow two key steps to generate tests:
- Run the test cases based on the requirements to check if the model's output matches the expected output. 
- Evaluate model coverage by creating test cases that exercise all parts of the model and detect any dead logic.
These steps verify that the model meets its requirements and that the tests fully cover its design. Additionally, by reusing and extending the tests, you can achieve code coverage and verify that the model and the generated code behave consistently.
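The following MATLAB sketch shows one way to script these two steps from the command line. It assumes a model named myModel and a previously saved Simulink Design Verifier data file reqTests_sldvdata.mat, both of which are placeholders, and it relies on the coverageEnabled option of sldvruntestopts; adapt the names and options to your project.

```matlab
% Minimal sketch: run requirements-based tests and measure model coverage.
% 'myModel' and 'reqTests_sldvdata.mat' are placeholder names.
model = 'myModel';
load_system(model);

% Step 1: simulate the model with the requirements-based test cases
runOpts = sldvruntestopts;
runOpts.coverageEnabled = true;                 % also record model coverage
[outData, covData] = sldvruntest(model, 'reqTests_sldvdata.mat', runOpts);

% Step 2: review the achieved coverage to find gaps in the test suite
cvhtml('requirementsTestCoverage', covData);    % generate an HTML coverage report
```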
Compatibility Consideration
For Simulink Design Verifier to generate tests for a model, the model must be compatible with Simulink Design Verifier analysis. For example, a model is incompatible when it uses a variable-step solver. You can run a check to assess your model for compatibility with Simulink Design Verifier. For more information, see Simulink Design Verifier Checks. The compatibility check identifies cases where a model is incompatible with test generation. For more information, see Check Model Compatibility. In such cases, use methods such as block replacement to make the model suitable for analysis. For more information, see Block Replacement Effects on Test Generation.
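As a rough sketch, you can also run the compatibility check programmatically with sldvcompat; the model name below is a placeholder.

```matlab
% Minimal sketch: check whether a model is compatible with
% Simulink Design Verifier analysis ('myModel' is a placeholder).
model = 'myModel';
load_system(model);

if sldvcompat(model)   % returns true if the model is compatible
    disp('Model is compatible with Simulink Design Verifier analysis.')
else
    disp('Model is incompatible. Consider a fixed-step solver or block replacement.')
end
```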
Perform Requirements-Based Testing
Simulink projects begin with textual requirements, which serve as the basis for creating a specification model. This specification model is an executable entity that you can use to perform requirements-based testing with Simulink Design Verifier and Requirements Toolbox™. To perform requirements-based testing in Simulink Design Verifier:
- Manually create test cases based on the textual requirements. 
- Transform the textual requirements into a specification model by using a requirements table. This model acts as an executable entity for testing. For more information on how to construct specification models and use them for requirements-based testing, see Use Specification Models for Requirements-Based Testing. 
- Use Simulink Design Verifier to generate tests for each row in the requirements table, as shown in the sketch after this list. Before you use Simulink Design Verifier, verify that all requirements are consistent and complete.
- Generate and link tests to requirements by using the specification model. For more information, see Generate and Export Tests from Requirements Table Blocks. 
- Use custom blocks or APIs to define intended requirements conditions as testing targets. For more information, see Model Requirements and Isolate Verification Logic with Observers. 
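As a sketch of the test-generation step, these commands run a Simulink Design Verifier analysis on a specification model that contains a Requirements Table block. The model name specModel is a placeholder, and the option values shown are examples rather than required settings.

```matlab
% Minimal sketch: generate tests from a specification model
% ('specModel' is a placeholder name).
specModel = 'specModel';
load_system(specModel);

opts = sldvoptions;
opts.Mode = 'TestGeneration';      % generate test cases rather than prove properties
opts.SaveExpectedOutput = 'on';    % log expected outputs defined by the specification

[status, files] = sldvrun(specModel, opts);   % results are written to files.DataFile
```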
Analyze and Address Missing Model Coverage
After you design a model based on the input-output pairs in the requirements, you can generate test cases that help you check whether the design model accurately reflects the original requirements. Run these test cases on the model and check for consistent outputs. Model coverage helps you identify missing requirements and highlights where to improve the test suite based on coverage metrics. Simulink Design Verifier offers workflows to enhance your test suite by generating tests for model coverage metrics and to detect unreachable parts of your design.
- Generate tests for coverage metrics. Focus on generating tests for condition/decision, MCDC (Modified Condition/Decision Coverage), and relational boundary metrics. You can define these metrics for both the model and C functions. For more information, see Generate Test Cases for Model Decision Coverage and Enhanced MCDC Coverage in Simulink Design Verifier.
- Use model coverage or custom conditions to detect any dead logic. Evaluate and justify any dead logic. For more information, see Dead Logic Detection. 
- Implement a top-off workflow: wrap your model or subsystem under test in an additional harness or "top-off" model and generate tests that fill any coverage gaps left by existing tests, as shown in the sketch after this list. For more information, see Achieve Missing Coverage in Generated Code of RLS and Achieve Missing Coverage in Custom Code.
- Use Simulink Design Verifier within the Simulink Test™ workflow to achieve complete coverage. For more information, see Achieve Missing Coverage in Closed-Loop Simulation Model and Achieve Missing Coverage in Referenced Model. 
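A rough sketch of the top-off workflow at the command line is shown below. It assumes that covData is a cvdata object obtained from running the existing tests (for example, the sldvruntest call shown earlier); myModel and the option values are placeholders.

```matlab
% Minimal sketch: generate only the tests needed to cover objectives that the
% existing tests missed ('myModel' and covData are placeholders/assumed inputs).
model = 'myModel';
load_system(model);

opts = sldvoptions;
opts.Mode = 'TestGeneration';
opts.ModelCoverageObjectives = 'MCDC';   % target condition/decision and MCDC objectives

% Pass the existing coverage data so the analysis targets only uncovered objectives
[status, covDataOut, files] = sldvgencov(model, opts, false, covData);
```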
Analyze and Address Missing Code Coverage
To check whether the design model aligns with the requirements, run your test cases and compare the model output against the expected output for the same inputs. Using Simulink Coverage™, you can identify any uncovered objectives. If the initial requirements-based test cases do not achieve complete coverage, use Simulink Design Verifier to extend the test cases and address the missing code coverage identified in the generated code. If coverage is still incomplete, generate additional test cases or analyze the design and the requirements to address the gaps. This process helps you achieve full coverage. Additionally, Simulink Design Verifier supports logging the expected output during model coverage test generation, which enables back-to-back testing of the model against the generated code in a software-in-the-loop environment.
- Implement a top-off workflow to generate tests that fill any coverage gaps left by existing tests. Use Simulink Design Verifier within the Simulink Test workflow to achieve complete coverage. For more information, see Generate Test Cases for Embedded Coder Generated Code. 
- Log expected outputs for model coverage tests and export these tests to Simulink Test to complete the testing, as shown in the sketch after this list. For more information, see Export Test Cases to Simulink Test.
- Create tests that specifically target code functions associated with a particular subsystem. Check if these tests thoroughly evaluate the functionality and logic of the code functions. For more information on how to generate test cases for a subsystem, see Generate Test Cases for a Subsystem. 
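One possible way to script the export step is to generate tests with expected outputs logged and then import the resulting data file into Simulink Test with sltest.import.sldvData. The model name is a placeholder, and the call uses the function's default import behavior.

```matlab
% Minimal sketch: log expected outputs during test generation and import the
% resulting test cases into Simulink Test ('myModel' is a placeholder).
model = 'myModel';
load_system(model);

opts = sldvoptions;
opts.Mode = 'TestGeneration';
opts.SaveExpectedOutput = 'on';      % log expected outputs for back-to-back testing

[status, files] = sldvrun(model, opts);

% Create Simulink Test test cases from the Simulink Design Verifier results
sltest.import.sldvData(files.DataFile);
```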
Cross Product Interactions
The table describes how Simulink Design Verifier interacts with other products to provide a testing environment:
| Product | Interactions | Related information |
| --- | --- | --- |
| Simulink Coverage | Measure the model and code coverage achieved by generated test cases and identify uncovered objectives | Analyze and Address Missing Model Coverage |
| Simulink Test | Export generated test cases and logged expected outputs to the Test Manager to manage, run, and complete testing | Export Test Cases to Simulink Test |
| Requirements Toolbox | Generate test cases for requirements tables and link the generated tests to requirements | Generate and Export Tests from Requirements Table Blocks |