
stopMeasuring

Class: matlab.perftest.TestCase
Namespace: matlab.perftest

Designate end of measurement boundary

Description

stopMeasuring(testcase) designates the end of a measurement boundary. Invoke this method and the startMeasuring method to restrict measurements to the code between the startMeasuring and stopMeasuring method calls. Defining this boundary allows you to exclude setup, verification, and teardown code from the measurement.

The performance framework permits multiple, nonnested calls to the startMeasuring and stopMeasuring methods within each method that is tagged with the Test attribute. When creating bounded performance tests, keep the following in mind:

  • A test method that calls the startMeasuring method must call the stopMeasuring method in the scope of the same test method.

  • A call to the startMeasuring method must have a subsequent call to the stopMeasuring method within the scope of the same test method. Similarly, a call to the stopMeasuring method must have a preceding call to the startMeasuring method.

  • You cannot call the startMeasuring and stopMeasuring methods inside a while loop that has the keepMeasuring method in the condition. Similarly, you cannot have a while loop that has the keepMeasuring condition between calls to startMeasuring and stopMeasuring.

  • If a test method has multiple calls to startMeasuring and stopMeasuring, then the performance framework accumulates and sums the measurements.

If the framework encounters an unsupported use of startMeasuring and stopMeasuring within a test method, such as in the sketch that follows, it marks the corresponding MeasurementResult instance as invalid.
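For illustration, here is a minimal sketch (with a hypothetical class and test name) of one unsupported pattern: a measurement boundary that wraps a while loop driven by the keepMeasuring method. Running such a test produces an invalid MeasurementResult.

classdef invalidBoundaryExample < matlab.perftest.TestCase
    methods(Test)
        function testUnsupportedBoundary(testCase)
            % Unsupported: a keepMeasuring-driven while loop placed between
            % the startMeasuring and stopMeasuring calls. The framework marks
            % the corresponding MeasurementResult instance as invalid.
            testCase.startMeasuring()
            while testCase.keepMeasuring
                magic(100);
            end
            testCase.stopMeasuring()
        end
    end
end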


stopMeasuring(testcase,label) designates the end of a measurement boundary and labels the measurement with label. Specifying a measurement boundary with a label is similar to specifying one without a label. A call to stopMeasuring with a label must have a preceding call to startMeasuring with the same label in the scope of the same test method. If a test method has multiple boundaries with the same label, then the performance framework accumulates the measurements by label and computes the sum. The performance framework does not support nested measurement boundaries.

The label is appended in angle brackets to the test element name in the Samples and TestActivity properties of the MeasurementResult.


Input Arguments


testcase — Instance of the test case, specified as a matlab.perftest.TestCase object.

label — Measurement boundary label, specified as a valid MATLAB identifier. A valid MATLAB identifier is a character vector or string scalar of alphanumeric characters (A–Z, a–z, 0–9) and underscores, such that the first character is a letter and the length is less than or equal to namelengthmax.
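For example, you can check whether a candidate label is a valid MATLAB identifier with the isvarname function (an optional check shown for illustration; the framework does not require it):

isvarname('SVD_1out')   % returns logical 1 (true): valid label
isvarname('1stRun')     % returns logical 0 (false): first character is not a letter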

Examples


Create a performance test class, fprintfTest. The performance testing framework measures the code between the calls to the startMeasuring and stopMeasuring methods. This boundary restricts the framework to measuring only the call to the fprintf function, and excludes the setup, qualification, and teardown code from the measurement.

classdef fprintfTest < matlab.perftest.TestCase
    methods(Test)
        function testPrintingToFile(testCase)
            file = tempname;
            fid = fopen(file, 'w');
            testCase.assertNotEqual(fid, -1, 'IO Problem');
            
            stringToWrite = repmat('abcdef', 1, 1000000);
            
            testCase.startMeasuring();
            fprintf(fid, '%s', stringToWrite);
            testCase.stopMeasuring();
            
            testCase.verifyEqual(fileread(file), stringToWrite);
            fclose(fid);
        end
    end
end
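You can run this test with the runperf function, for example (measured times vary by machine):

results = runperf('fprintfTest');
T = sampleSummary(results)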

Create a performance test class, fprintfTest2. Multiple boundaries (calls to startMeasuring and stopMeasuring) enable the performance framework to measure the code that opens the file, writes to the file, and closes the file.

classdef fprintfTest2 < matlab.perftest.TestCase
    methods(Test)
        function testPrintingToFile(testCase)
            file = tempname;
            
            testCase.startMeasuring();
            fid = fopen(file,'w');
            testCase.stopMeasuring();
            
            testCase.assertNotEqual(fid,-1,'IO Problem');
            stringToWrite = repmat('abcdef',1,1000000);
            
            testCase.startMeasuring();
            fprintf(fid,'%s',stringToWrite);
            testCase.stopMeasuring();
            
            testCase.verifyEqual(fileread(file),stringToWrite);
            
            testCase.startMeasuring();
            fclose(fid);
            testCase.stopMeasuring();
        end
    end
end

Run the performance test and view the sample summary. The performance framework measured that the mean time to open, write to, and close the file for the testPrintingToFile test was approximately 0.02 seconds. Your results might vary.

results = runperf('fprintfTest2');
T = sampleSummary(results)
Running fprintfTest2
........
Done fprintfTest2
__________


T =

  1×7 table

                 Name                  SampleSize      Mean      StandardDeviation      Min        Median       Max   
    _______________________________    __________    ________    _________________    ________    ________    ________

    fprintfTest2/testPrintingToFile        4         0.017003        0.0004943        0.016651    0.016814    0.017736

Create a performance test class, examplePerfTest. The first test has labeled test boundaries for generating an array of random numbers, measuring a call to svd with a single output, and measuring a call to svd with multiple outputs. The second test has an unlabeled boundary around the call to svd.

classdef examplePerfTest < matlab.perftest.TestCase
    methods(Test)
        function testSVD1(testCase)
            testCase.startMeasuring('arrayGen')
            X = rand(1000);
            testCase.stopMeasuring('arrayGen')
            
            testCase.startMeasuring('SVD_1out')
            S = svd(X);
            testCase.stopMeasuring('SVD_1out')
            
            testCase.startMeasuring("SVD_3out")
            [U2,S2,V2] = svd(X);
            testCase.stopMeasuring("SVD_3out")
            
            testCase.verifyEqual(S,diag(S2),'RelTol',1e-14)
        end
        
        function testSVD2(testCase)
            sz = 732;
            X = rand(sz);
            
            testCase.startMeasuring()
            [U,S,V] = svd(X);
            testCase.stopMeasuring()
            
            testCase.verifyTrue(isdiag(S))
            testCase.verifyTrue(issorted(diag(S),'descend'))
            testCase.verifySize(S,[sz sz])
        end
    end
end

Run the performance test and view the sample summary. The labels from testSVD1 are appended in angle brackets to the test element names in the results. Your results might vary.

results = runperf('examplePerfTest');
T = sampleSummary(results)
Running examplePerfTest
..........
..........
..........
..........
Done examplePerfTest
__________


T =

  4×7 table

                   Name                    SampleSize      Mean       StandardDeviation       Min        Median        Max   
    ___________________________________    __________    _________    _________________    _________    _________    ________

    examplePerfTest/testSVD1 <arrayGen>        21        0.0096508        0.0012428        0.0087596    0.0092564    0.013911
    examplePerfTest/testSVD1 <SVD_1out>        21          0.11978        0.0098172          0.10585      0.12274     0.13575
    examplePerfTest/testSVD1 <SVD_3out>        21          0.30664         0.020991          0.26882       0.3051     0.35018
    examplePerfTest/testSVD2                   11          0.13294         0.011135          0.11127      0.13557     0.15162
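To see the labeled measurements for an individual test element, you can also inspect the Samples property of the corresponding MeasurementResult object. For example (a sketch; the exact table variables depend on your MATLAB release):

results(1).Samples      % table of raw measurements; Name includes the <label> suffix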

Version History

Introduced in R2016a