
Fix Metric Threshold Violations in a Continuous Integration Systems Workflow

This example shows how to use the Metrics Dashboard with open-source tools GitLab® and Jenkins® to test and refine your model in a continuous integration systems workflow. Continuous integration is the practice of merging all developer working copies of project files to a shared mainline. This workflow saves time and improves quality by maintaining version control and automating and standardizing testing.

This example refers to the Airframe example project (openExample('simulink/AirframeProjectExample')) and these files that you must provide:

  • A MATLAB® script that specifies metric thresholds and customizes the Metrics Dashboard.

  • A MATLAB unit test that collects metric data and checks whether there are metric threshold violations.

This example uses the Jenkins continuous integration server to run the MATLAB unit test to determine if there are metric threshold violations. Jenkins archives test results for you to download and investigate locally. GitLab is an online Git™ repository manager that you can configure to work with Jenkins. This diagram shows how Simulink® Check™, GitLab, and Jenkins work together in a continuous integration workflow.

Workflow for continuous integration with GitLab and Jenkins

Project Setup

In addition to the files in the Airframe example project (openExample('simulink/AirframeProjectExample')), you must provide these files:

  • A MATLAB unit test that collects metric data for the project and checks that the model files contain no metric threshold violations. For more information on MATLAB unit tests, see Script-Based Unit Tests. A sketch of such a test appears after this list.

  • A MATLAB script that specifies metric thresholds and customizes the Metrics Dashboard. For more information on how to customize the Metrics Dashboard, see Customize Metrics Dashboard Layout and Functionality.

  • A setup.m file that activates the configuration XML files that define metric thresholds, sets custom metric families, and customizes the Metrics Dashboard layout. For this example, the setup.m script contains this code:

    function setup
        % refresh Model Advisor customizations
        Advisor.Manager.refresh_customizations();

        % set metric configuration with thresholds
        configFile = fullfile(pwd, 'config', 'MyConfiguration.xml');
        slmetric.config.setActiveConfiguration(configFile);

        % set the Metrics Dashboard layout configuration
        uiconf = fullfile(pwd, 'config', 'MyDashboardConfiguration.xml');
        slmetric.dashboard.setActiveConfiguration(uiconf);
    end
    
    On the Project tab, click Startup Shutdown. For the Startup files field, specify the setup.m file.

  • An sl_customization.m file that activates the Model Advisor configuration file to customize the Model Advisor checks. For more information on creating your own Model Advisor configuration, see Configure Compliance Metrics.

  • A run script that executes during a Jenkins build. For this example, this code is in the run.m file (a sketch of the runUnitTest helper that it calls appears after this list):

    % script executed during a Jenkins build
    function run(IN_CI)
        if IN_CI
            % when running in CI, work in the Jenkins workspace folder
            jenkins_workspace = getenv('WORKSPACE');
            cd(jenkins_workspace);
        end

        % open the Simulink project
        slproj = simulinkproject(pwd);

        % execute tests
        runUnitTest();

        slproj.close();

        if IN_CI
            exit
        end
    end
    

  • A cleanup.m file that resets the active metric configuration to the default configuration. For this example, this code is in the cleanup.m file:

    function cleanup    
        rmpath(fullfile(pwd, 'data'));
        Advisor.Manager.refresh_customizations();
    
        % reset active metric configuration to default
        slmetric.config.setActiveConfiguration('');
        slmetric.dashboard.setActiveConfiguration('');
    end
    
    On the Project tab, click Startup Shutdown. For the Shutdown files field, specify the cleanup.m file.

  • A .gitignore file that prevents derived artifacts from being checked into the GitLab repository. This code is in the .gitignore file:

    work/**
    reports/**
    *.asv
    *.autosave
    
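Because you provide the gateway unit test yourself, its contents depend on your metrics and thresholds. The following is a minimal sketch of what a script-based MetricThresholdGateway.m could look like; the model name slproject_f14, the choice of metric, and the block count threshold of 500 are illustrative assumptions, not values taken from the example project.

    % MetricThresholdGateway.m -- minimal sketch of a script-based gateway test.
    % The model name (slproject_f14) and the threshold value (500 blocks) are
    % illustrative assumptions, not values from the example project.

    %% Test 1: block count stays at or below the threshold
    metricEngine = slmetric.Engine();
    setAnalysisRoot(metricEngine, 'Root', 'slproject_f14', 'RootType', 'Model');
    execute(metricEngine);

    % compare each collected result against the threshold
    resCol = getMetrics(metricEngine, 'mathworks.metrics.SimulinkBlockCount');
    for n = 1:length(resCol)
        results = resCol(n).Results;
        for m = 1:length(results)
            assert(results(m).Value <= 500, ...
                'Block count threshold violated in %s', results(m).ComponentPath);
        end
    end
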
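The run.m script above calls a runUnitTest helper, which you also provide. This is one possible sketch of that helper, assuming the gateway test above sits at the project root; the results.tap file name is an assumption chosen to match the reports/*.tap pattern that the Jenkins TAP plugin publishes (see Jenkins Setup):

    function runUnitTest()
        import matlab.unittest.TestSuite
        import matlab.unittest.TestRunner
        import matlab.unittest.plugins.TAPPlugin
        import matlab.unittest.plugins.ToFile

        % make sure the reports folder for derived artifacts exists
        tapFolder = fullfile(pwd, 'reports');
        if ~exist(tapFolder, 'dir')
            mkdir(tapFolder);
        end

        % run the gateway test and stream the results to a .tap file
        % that the Jenkins TAP plugin publishes after the build
        suite = TestSuite.fromFile('MetricThresholdGateway.m');
        runner = TestRunner.withTextOutput;
        runner.addPlugin(TAPPlugin.producingOriginalFormat( ...
            ToFile(fullfile(tapFolder, 'results.tap'))));
        runner.run(suite);
    end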

GitLab Setup

Create a GitLab project for source-controlling your project. For more information, see https://docs.gitlab.com/ee/index.html.

  1. Install the Git client.

  2. Set up a branching workflow. With GitLab, from the main branch, create a temporary branch for implementing changes to the model files. Integration engineers can use Jenkins test results to decide whether to merge a temporary branch into the main branch. For more information, see https://git-scm.com/book/en/v2/Git-Branching-Branching-Workflows.

  3. On the left side bar, under Settings > Repository, protect the main branch by enforcing the use of merge requests when developers want to merge their changes into the main branch.

  4. Under Settings, on the Integrations page, add a webhook to the URL of your Jenkins project. This webhook triggers a build job on the Jenkins server.

Jenkins Setup

Install GitLab and TAP plugins. The MATLAB unit test uses the TAP plugin to stream results to a .tap file. To enable communication of the test status from MATLAB to the Jenkins job, Jenkins imports the .tap file.

Create a Jenkins project. Specify these configurations:

  1. In your Jenkins project, click Configure.

  2. On the General tab, specify a project name.

  3. On the Source Code Management tab, for the Repository URL field, specify the URL of your GitLab repository.

  4. On the Build Triggers tab, select Build when a change is pushed to GitLab.

  5. In the Build Environment section, select Use MATLAB Version and specify the MATLAB root, for example, C:\Program Files\MATLAB\R2022a.

  6. In the Build section, execute MATLAB to call the run script. The run script opens the project and runs all unit tests. For the project in this example, the code is:

    matlab -nodisplay -r "cd /var/lib/jenkins/workspace/'18b Metrics CI Demo'; run(true)"

    For more information, see Continuous Integration Using MATLAB Projects and Jenkins.

  7. On the Post-build Actions tab, configure the TAP plugin to publish TAP results to Jenkins. In the Test Results field, specify reports/*.tap. For Files to archive, specify reports/**,work/**.

    The TAP plugin shows details from the MATLAB unit test in the extended results of the job. The Jenkins archiving infrastructure saves derived artifacts that are generated during a Jenkins build.

Continuous Integration Workflow

After setting up your project, Jenkins, and GitLab, follow the continuous integration workflow.

Phase 1: Feature Development

  1. Create a local clone of the GitLab repository. See Pull, Push, and Fetch Files with Git.

  2. In Simulink, navigate to the local GitLab repository.

  3. Create a feature branch and fetch and check-out files. See Branch and Merge Files with Git and Pull, Push, and Fetch Files with Git.

  4. Make any necessary changes to the project files.

  5. Simulate the model and validate the output in the Simulation Data Inspector.

  6. Run the MATLAB unit tests. For more information, see runtests. A sketch of a local invocation appears after this list.

  7. Add and commit the modified models to the feature branch. See Branch and Merge Files with Git and Pull, Push, and Fetch Files with Git.

  8. Push changes to the GitLab repository. See Branch and Merge Files with Git and Pull, Push, and Fetch Files with Git.

  9. In GitLab, create a merge request. Select the feature branch as the source branch and main as the target branch. Click Compare branches and continue.

  10. If the feature is not fully implemented, mark the merge request as a work in progress by adding the letters WIP: at the beginning of the request. If the merge request is not marked WIP:, it immediately triggers a build after creation.

  11. Click Create merge request.
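
To run the unit tests locally (step 6) before pushing, you can call runtests from the project root. A minimal sketch, assuming the gateway test file described in Project Setup sits at the project root:

    % run the gateway test locally and display a summary of the results
    results = runtests('MetricThresholdGateway.m');
    disp(table(results));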

Phase 2: Qualification by Using Continuous Integration

  1. If the letters WIP: are not at the beginning of the merge request, the push triggers a Jenkins build; in the Jenkins Setup part of this example, you configured Jenkins to build when changes are pushed to GitLab. To remove the letters from an existing merge request, click Resolve WIP status.

  2. Navigate to the Jenkins project. In Build History, you can see the build status.

  3. Click the build.

  4. Click Tap Test Results.

  5. For this example, the MetricThresholdGateway.m unit test did not pass for three metrics because these metrics did not meet their thresholds. To investigate the failures, download the data locally.

    Test results for MetricThresholdGateway

Phase 3: Investigate Quality Issues Locally

  1. Download the archived results to a local Git repository workspace.

  2. Unzip the downloaded files. Copy the reports/ and work/ folders to the corresponding folders in the local repository (see the sketch after this list).

  3. To explore the results, open the project and the Metrics Dashboard (see the sketch after this list).

    Metrics Dashboard showing modeling guideline compliance, model size, and model architecture

  4. To resolve the test failures, make the necessary updates to the models. Push the changes to the feature branch in GitLab.

  5. Integration engineers can use Jenkins test results to decide when it is acceptable to merge the temporary branch into the main branch.
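
Step 2 can also be scripted. A hypothetical snippet, assuming the Jenkins archive was downloaded as archive.zip (the archive name is an assumption) and that the current folder is the root of the local repository:

    % extract the archived build artifacts into a temporary folder
    extracted = fullfile(tempdir, 'jenkins_artifacts');
    unzip('archive.zip', extracted);

    % copy the derived reports/ and work/ folders into the local repository
    copyfile(fullfile(extracted, 'reports'), fullfile(pwd, 'reports'));
    copyfile(fullfile(extracted, 'work'), fullfile(pwd, 'work'));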
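
To open the project and the Metrics Dashboard from the command line (step 3), you can use openProject and metricsdashboard. The model name slproject_f14 is an illustrative assumption; on older releases, simulinkproject can replace openProject.

    % open the project, then open the Metrics Dashboard for the top model
    openProject(pwd);
    metricsdashboard('slproject_f14');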
