Test Solutions from Learners
When you create assessment items in MATLAB® Grader™, define tests that assess whether the learner solution meets your criteria for success.
Define Tests
To define tests, you can use predefined tests available in the Test Type list or write your own MATLAB code. The predefined test options include:
Variable Equals Reference Solution — Check the existence, data type, size, and value of a variable. This option is available in the Test Type list only for script assessment items. A default tolerance of 1e-4 applies to numeric values.

Function or Keyword is Present — Check for the presence of specified functions or keywords.

Function or Keyword is Absent — Check that learner solutions do not include the specified functions or keywords.
For example, if you want learners to use the taylor function to compute the Taylor series approximation of a given function, specify these options:

In the Test Type list, select Function or Keyword is Present.

Specify the taylor function as the function the learner must use.

Optionally, specify supplemental feedback the learner receives if they do not use the expected function.
For a script assessment item, you can assess a variable using the predefined test type, Variable Equals Reference Solution. For example, assess whether a variable is the right size and has the expected values and data type by comparing it to the variable with the same name in your reference solution.
You can also write MATLAB code by selecting the MATLAB Code option in the Test Type list. Follow these guidelines:

Assessing correctness — You can use functions that correspond to the predefined test types: assessVariableEqual, assessFunctionPresence, and assessFunctionAbsence. Alternatively, you can write a custom test that returns an error for incorrect results.

Variable names in script assessment items — To refer to variables in your reference solution, add the prefix referenceVariables, such as referenceVariables.myvar. To refer to learner variables, use the variable name.

Function names in function assessment items — To call the function in your reference solution, add the prefix reference, such as reference.myfunction. To call the learner function, use the function name.

Variable scope — Variables that you create within the test code exist only inside the assessment test.
For example, suppose that a learner must calculate a value for variable X that can vary by more than the default tolerance. To compare the value from the learner solution to the value in your reference solution, specify the variables and the tolerance by calling the assessVariableEqual function.
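As a minimal sketch of that call in a script assessment item, assuming the learner and the reference solution both define a variable named X (the variable name and the tolerance value here are illustrative):

```matlab
% Compare the learner's X to the reference value, allowing a wider
% absolute tolerance than the 1e-4 default. The name X and the
% 0.01 tolerance are placeholders for your own values.
assessVariableEqual('X', referenceVariables.X, 'AbsoluteTolerance', 0.01);
```

You can use the RelativeTolerance name-value argument instead when the acceptable error scales with the magnitude of the value.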
To test function assessment items, call the learner and reference functions with a test input and compare the outputs. For example, this code checks that a learner function named tempF2C correctly converts a temperature from Fahrenheit to Celsius.
temp = 78;
tempC = tempF2C(temp);
expectedTemp = reference.tempF2C(temp);
assessVariableEqual('tempC',expectedTemp);
Tip

The default error message from the assessVariableEqual function includes the name of the tested variable, such as X and tempC in the previous examples. For function assessment items, this variable is in the assessment test script and not in the learner code. Use meaningful names that the learner can recognize, such as an output in the function declaration.
A single assessment test can include multiple tests. For example, this code checks that a learner implements a function named normsinc that correctly handles both zero and nonzero values. The zero case includes additional feedback for learners using the Feedback name-value argument in the assessVariableEqual function.
nonzero = 0.25*randi([1 3]);
y_nonzero = normsinc(nonzero);
expected_y_nonzero = reference.normsinc(nonzero);
assessVariableEqual('y_nonzero',expected_y_nonzero);

y_zero = normsinc(0);
expected_y_zero = reference.normsinc(0);
assessVariableEqual('y_zero',expected_y_zero, ...
    Feedback='Inputs of 0 should return 1. Consider an if-else statement or logical indexing.');
Give Partial Credit
By default, the software considers a solution correct if all tests pass and considers it incorrect if any test fails. To give partial credit, assign tests relative weights by changing Scoring Method to Weighted. MATLAB Grader calculates the percentage for each test based on the total of the relative weights, so you can define the weights as points or percentages.

For example, setting the relative weight of all tests to 1 gives each test equal weight. Setting some weights to 2 gives those tests twice as much weight as tests with a weight set to 1. Alternatively, you can enter percentages for the weights.
Specify Pretests
Pretests are assessment tests that learners can run to determine if their solution is on the right path before submitting it for grading. Consider using a pretest to guide learners when multiple approaches are correct but your test requires a particular approach or when you set a submission limit on the assignment.
Pretests differ from regular assessment tests in these ways:
Pretest results are not recorded in the gradebook prior to submission.
Running pretests does not count against a submission limit.
Learners can view pretest code and details, including output from MATLAB code tests, regardless of whether the test passes or fails. Make sure that a pretest does not include the assessment item solution.
Like regular assessment tests, pretests run when learners submit their solutions, and pretests contribute to the final grade.
For example, suppose learners must define a system of linear equations that can be organized in multiple ways. Define a pretest to make sure that the learner solution has the expected order and coefficients.
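A pretest for that scenario might check only the existence and size of the coefficient matrix, leaving value checks to the graded tests so that the pretest does not reveal the solution. This sketch assumes the learner script defines a matrix named A; the variable name and the 3-by-3 size are illustrative:

```matlab
% Pretest sketch: confirm that the coefficient matrix has the expected
% shape before submission. Checking only the size avoids exposing the
% solution values, which learners can see in pretest code.
assert(isequal(size(A), [3 3]), ...
    'Define A as a 3-by-3 coefficient matrix with the equations in the given order.')
```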
Learners see an error for a failing test and can modify their solution.
Control Error Display for a Script Assessment Item
In a script assessment item, an initial error can cause subsequent errors. You can encourage the learner to focus on the initial error first.
Selecting the Only Show Feedback for Initial Error option shows detailed feedback for the initial error and, by default, hides details for subsequent errors. The learner can display this additional feedback by clicking Show Feedback.