Quality Objectives Dashboard

To monitor the quality of your code against predefined software quality thresholds or user-defined thresholds, use the Quality Objectives dashboard.

In the Project Overview dashboard, use the Quality Objectives card to get a quick overview of your progress in achieving a quality objective threshold. From the Threshold drop-down list, select a threshold and view the percentage of findings that you have already addressed. The card also displays the number of findings you still need to address to reach the threshold. Click this number to open the REVIEW perspective and see these findings in the Results List.

For a more comprehensive view, open the Quality Objectives dashboard. In the Summary section, you can use the Threshold drop-down list to pick a threshold and see the remaining open issues, including a breakdown for each category, such as code metrics or coding rules.

In this example Quality Objectives dashboard, 96% of the findings required to achieve threshold SQO2 have been addressed. There are 17 open issues: 12 Code Metrics issues and 5 Systematic issues. Open issues are issues with a review status of Unreviewed, To fix, To investigate, or Other.

The table shows the current progress of code quality for all quality objective thresholds. To view the Results List for a set of open issues, click the corresponding value in the table.

Customize Software Quality Objectives

Depending on the requirements of your team or project, you can customize the thresholds that you use as pass/fail criteria to track the quality of your code. For instance, you might want your team to address all MISRA C®:2012 directives to achieve SQO level 2. To set custom thresholds for the quality objectives, click Quality Objectives Settings on the Quality Objectives dashboard. You must have the Administrator role to customize the quality objective settings. Users with the Owner or Contributor role have a read-only view of the quality objective settings.

Quality Objectives settings view

To change the existing threshold selections, click a findings family, for instance MISRA C:2004, then select a node or expand the node to select individual results. For each family of results, you can view the nodes by group or by category, when available.

When you select nodes in the leftmost part of the table:

  • A box with a check mark indicates that all entries under the node are enabled.

  • A partially filled box indicates that some entries under the node are not enabled.

For the quality objective thresholds under the SQO columns:

  • A box with a check mark indicates that all the entries that are enabled under the node on that row apply to this threshold.

  • A partially filled box indicates that some of the entries that are enabled under the node on that row do not apply to this threshold.

Expanded Language extensions node.

For example, in the previous figure, the Language extensions node is expanded. The check box next to the node is partially filled since rule 2.1 is not enabled. For the thresholds, all the rules that are enabled under the node apply to thresholds SQO5 and SQO6. Rule 2.2 does not apply to SQO4, which is why the check box for SQO4 is partially filled.
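The check box states described above follow a simple tri-state rule: a node is checked when every entry under it is enabled (or applies to the threshold), partially filled when only some are, and unchecked otherwise. As a minimal illustrative sketch of that rule, not the Polyspace Access implementation, the state can be derived like this (names and the example rules are hypothetical):

```python
from enum import Enum

class BoxState(Enum):
    CHECKED = "checked"            # all entries enabled / apply
    PARTIAL = "partially filled"   # some, but not all, entries enabled / apply
    UNCHECKED = "unchecked"        # no entries enabled / apply

def box_state(flags):
    """Derive a node's check box state from the booleans of its entries."""
    flags = list(flags)
    if all(flags):
        return BoxState.CHECKED
    if any(flags):
        return BoxState.PARTIAL
    return BoxState.UNCHECKED

# Example mirroring the figure: rule 2.1 is not enabled under the
# Language extensions node, so the node's selection box is partially filled.
enabled = {"2.1": False, "2.2": True, "2.3": True}
print(box_state(enabled.values()))  # BoxState.PARTIAL
```

The same function applies to the SQO columns: pass it the "applies to this threshold" flags of the enabled entries on a row to get that row's check box state for the threshold.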

For Run-time Checks, you can customize the percentage of findings that you must address or justify for each threshold.

For Code Metrics, you can customize the value of the different metrics for each threshold.

When you make a selection for a threshold, all higher thresholds inherit that selection. For instance, if you select a coding rule for SQO3, the rule is also selected for SQO4, SQO5, and SQO6. By default, when you first enable a node or result, it applies only to SQO6.
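The inheritance rule above means a selection at one threshold propagates to every higher threshold. A minimal sketch of that propagation, purely as a mental model (this is not a Polyspace Access API):

```python
# Quality objective thresholds in ascending order of strictness.
THRESHOLDS = ["SQO1", "SQO2", "SQO3", "SQO4", "SQO5", "SQO6"]

def thresholds_for(selected_at):
    """Return every threshold a rule applies to, given the lowest
    threshold at which the rule was selected: the selection is
    inherited by all higher thresholds."""
    start = THRESHOLDS.index(selected_at)
    return THRESHOLDS[start:]

# Selecting a coding rule for SQO3 also selects it for SQO4-SQO6.
print(thresholds_for("SQO3"))  # ['SQO3', 'SQO4', 'SQO5', 'SQO6']

# A newly enabled node or result defaults to the highest threshold only.
print(thresholds_for("SQO6"))  # ['SQO6']
```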

The changes that you make to the quality objectives thresholds apply to all the projects in Polyspace Access. Before making changes to the settings, make sure that you inform all Polyspace Access users.

To save your changes, click Save. After you save, you cannot recover your previous custom settings. To reload the predefined thresholds, click Back to default.

The quality objectives statistics for a project are recalculated when:

  • You upload a new run for the project.

  • You select a finding and make a change to any of the fields in the Result Details pane.


The Quality Objectives settings and the calculated statistics for a project might be out of sync if you review the statistics after the settings were changed but before the project statistics were recalculated.
