resume

Syntax

newAggregateResults = resume(AggregateResults)
newAggregateResults = resume(AggregateResults,Name=Value)

Description

newAggregateResults = resume(AggregateResults) resumes a set of hyperparameter optimization problems for an additional default number of iterations, using the same fitting function and optimization settings used to create the AggregateBayesianOptimization object AggregateResults. When you resume the optimization problems, the resume function returns a new AggregateBayesianOptimization object, but does not return any model objects.

newAggregateResults = resume(AggregateResults,Name=Value) specifies additional options using one or more name-value arguments. For example, you can specify which optimization problems in AggregateResults to resume.
Examples
Perform and Resume Hyperparameter Optimization
Load the ovarian cancer data set.
load ovariancancer.mat
Create a HyperparameterOptimizationOptions
object with the following specifications for two hyperparameter optimization problems:
- Use kfoldLoss as the constraint, with bounds of [0, 0.1] for the first problem and [0, 0.15] for the second problem.
- Perform a maximum of 10 objective evaluations for each optimization problem.
- Use the "expected-improvement-plus" acquisition function (for reproducibility).
- Suppress the display of plots.
hpoOptions = hyperparameterOptimizationOptions(ConstraintType="loss", ...
    ConstraintBounds=[0.1; 0.15],MaxObjectiveEvaluations=10, ...
    AcquisitionFunctionName="expected-improvement-plus",ShowPlots=false);
Call the fitctree function to train a binary decision tree classification model and optimize the MinLeafSize hyperparameter for each optimization problem, using the options and constraints in hpoOptions. Because ConstraintType is "loss", the software uses the size of the compact version of the model object as the objective.
rng(0,"twister"); % For reproducibility
[Mdl,hpoResults] = fitctree(obs,grp,OptimizeHyperparameters="MinLeafSize", ...
    HyperparameterOptimizationOptions=hpoOptions);
|=====================================================================================================|
|  Objective        : "CompactModelSize (bytes)"                                                      |
|  Constraint       : "kfoldLoss"                                                                     |
|  Constraint Bounds: [0 0.1]                                                                         |
|=====================================================================================================|
| Iter | Eval result | Objective | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | MinLeafSize |
|=====================================================================================================|
|    1 | Infeas | 40027 | 66.373  | NaN | 40027 | 0.34   | 91 |
|    2 | Infeas | 44103 | 0.60747 | NaN | 40230 | 0.0713 |  1 |
|    3 | Infeas | 40707 | 0.35853 | NaN | 40027 | 0.025  | 22 |
|    4 | Infeas | 42063 | 0.43956 | NaN | 40027 | 0.062  |  6 |
|    5 | Infeas | 40027 | 0.3385  | NaN | 40019 | 0.025  | 79 |
|    6 | Infeas | 42063 | 0.30684 | NaN | 39996 | 0.0759 | 10 |
|    7 | Infeas | 44103 | 0.30746 | NaN | 40055 | 0.0574 |  3 |
|    8 | Infeas | 42063 | 0.32436 | NaN | 40056 | 0.0805 |  7 |
|    9 | Infeas | 42063 | 0.30914 | NaN | 40046 | 0.0759 | 10 |
|   10 | Infeas | 40707 | 0.24694 | NaN | 40040 | 0.062  | 21 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 85.1906 seconds
Total objective function evaluation time: 69.6117

No feasible points were found.

|=====================================================================================================|
|  Objective        : "CompactModelSize (bytes)"                                                      |
|  Constraint       : "kfoldLoss"                                                                     |
|  Constraint Bounds: [0 0.15]                                                                        |
|=====================================================================================================|
| Iter | Eval result | Objective | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | MinLeafSize |
|=====================================================================================================|
|    1 | Best   | 40027 | 0.25714 | 40027 | 40027 | -0.0343  | 38 |
|    2 | Infeas | 44103 | 0.34661 | 40027 | 40247 | 0.012    |  1 |
|    3 | Accept | 40027 | 0.17625 | 40027 | 40027 | -0.00649 | 85 |
|    4 | Infeas | 40707 | 0.40139 | 40027 | 40026 | 0.0259   | 14 |
|    5 | Accept | 40027 | 0.28058 | 40027 | 40009 | -0.0343  | 57 |
|    6 | Accept | 40027 | 0.22424 | 40027 | 40023 | -0.0343  | 55 |
|    7 | Accept | 40027 | 0.27688 | 40027 | 40023 | -0.0343  | 46 |
|    8 | Infeas | 40027 | 0.1058  | NaN   | 40033 | 0.29     | 89 |
|    9 | Accept | 40027 | 0.19001 | NaN   | 40033 | -0.0111  | 80 |
|   10 | Accept | 40707 | 0.34801 | NaN   | 40031 | -0.0204  | 19 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 4.3909 seconds
Total objective function evaluation time: 2.6069

No feasible points were found.
Display a summary of the optimization results.
summary(hpoResults)
Objective: CompactModelSize (bytes)
Constraint: kfoldLoss

                MinObjective    ConstraintAtMinObjective    ConstraintBounds    ConstraintBoundsAreSatisfied    Feasible    LearnerAtMinObjective
                ____________    ________________________    ________________    ____________________________    ________    _____________________

    Result_1       40027               0.12499                 0    0.1                    false                  false      "ClassificationTree"
    Result_2       40027               0.11573                 0    0.15                   true                   false      "ClassificationTree"
The final attained models in the optimization problems are infeasible. The ConstraintAtMinObjective value of the final attained model in the second optimization problem (0.11573) satisfies the constraint bounds, but the model is infeasible because the attained point lies outside the optimization confidence bounds.
Resume the optimization problems for an additional 30 iterations (the default number of additional iterations). Store the results in the AggregateBayesianOptimization object newResults.
newResults = resume(hpoResults);
|=====================================================================================================|
|  Objective        : "CompactModelSize (bytes)"                                                      |
|  Constraint       : "kfoldLoss"                                                                     |
|  Constraint Bounds: [0 0.1]                                                                         |
|=====================================================================================================|
| Iter | Eval result | Objective | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | MinLeafSize |
|=====================================================================================================|
|   11 | Infeas | 40027 | 0.43061 | NaN | 40028 | 0.025  | 36 |
|   12 | Infeas | 42743 | 0.41494 | NaN | 40044 | 0.0435 |  5 |
|   13 | Infeas | 40707 | 0.27159 | NaN | 40040 | 0.062  | 21 |
|   14 | Infeas | 40707 | 0.28638 | NaN | 40022 | 0.0852 | 13 |
|   15 | Infeas | 40027 | 0.25927 | NaN | 40017 | 0.025  | 28 |
|   16 | Infeas | 40707 | 0.25519 | NaN | 40016 | 0.025  | 24 |
|   17 | Infeas | 44103 | 0.32862 | NaN | 40045 | 0.0574 |  3 |
|   18 | Infeas | 44103 | 0.34216 | NaN | 40058 | 0.0574 |  3 |
|   19 | Infeas | 44103 | 0.38356 | NaN | 40061 | 0.0713 |  1 |
|   20 | Infeas | 44103 | 0.34886 | NaN | 40047 | 0.0667 |  2 |
|   21 | Infeas | 42063 | 0.32492 | NaN | 40053 | 0.0805 |  7 |
|   22 | Infeas | 40707 | 0.28227 | NaN | 40053 | 0.0481 | 19 |
|   23 | Infeas | 40707 | 0.27995 | NaN | 40048 | 0.025  | 32 |
|   24 | Infeas | 40707 | 0.25968 | NaN | 40054 | 0.025  | 33 |
|   25 | Infeas | 40707 | 0.23445 | NaN | 40063 | 0.025  | 35 |
|   26 | Infeas | 40027 | 0.27243 | NaN | 40065 | 0.025  | 30 |
|   27 | Infeas | 40027 | 0.24756 | NaN | 40052 | 0.025  | 40 |
|   28 | Infeas | 40027 | 0.23647 | NaN | 40042 | 0.025  | 40 |
|   29 | Infeas | 40707 | 0.27462 | NaN | 40040 | 0.025  | 31 |
|   30 | Infeas | 40027 | 0.17799 | NaN | 40036 | 0.025  | 75 |
|   31 | Infeas | 40707 | 0.2949  | NaN | 40036 | 0.062  | 17 |
|   32 | Infeas | 40707 | 0.19485 | NaN | 40055 | 0.025  | 45 |
|   33 | Infeas | 40027 | 0.18133 | NaN | 40043 | 0.025  | 51 |
|   34 | Infeas | 40707 | 0.28066 | NaN | 40042 | 0.062  | 16 |
|   35 | Infeas | 40027 | 0.17038 | NaN | 40030 | 0.025  | 57 |
|   36 | Infeas | 43423 | 0.37602 | NaN | 40029 | 0.0528 |  4 |
|   37 | Infeas | 40027 | 0.16661 | NaN | 40020 | 0.025  | 63 |
|   38 | Infeas | 42063 | 0.35504 | NaN | 40019 | 0.0898 |  8 |
|   39 | Infeas | 40027 | 0.19436 | NaN | 40017 | 0.025  | 48 |
|   40 | Infeas | 40707 | 0.23338 | NaN | 40016 | 0.025  | 23 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 98.7197 seconds
Total objective function evaluation time: 77.9708

No feasible points were found.
|=====================================================================================================|
|  Objective        : "CompactModelSize (bytes)"                                                      |
|  Constraint       : "kfoldLoss"                                                                     |
|  Constraint Bounds: [0 0.15]                                                                        |
|=====================================================================================================|
| Iter | Eval result | Objective | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | MinLeafSize |
|=====================================================================================================|
|   11 | Infeas | 42063 | 0.34939  | NaN   | 40078 | 0.00277  |   8 |
|   12 | Infeas | 43423 | 0.40105  | NaN   | 40082 | 0.012    |   4 |
|   13 | Infeas | 44103 | 0.43972  | NaN   | 40084 | 0.012    |   1 |
|   14 | Accept | 40707 | 0.28792  | NaN   | 40031 | -0.0204  |  20 |
|   15 | Infeas | 44103 | 0.37072  | NaN   | 40111 | 0.0213   |   2 |
|   16 | Accept | 40707 | 0.32508  | NaN   | 40032 | -0.00186 |  17 |
|   17 | Infeas | 40707 | 0.36003  | NaN   | 40028 | 0.0167   |  11 |
|   18 | Infeas | 40707 | 0.30457  | NaN   | 40028 | 0.0167   |  12 |
|   19 | Infeas | 44103 | 0.45343  | NaN   | 40028 | 0.0213   |   2 |
|   20 | Infeas | 42063 | 0.38948  | NaN   | 40101 | 0.00277  |  10 |
|   21 | Infeas | 40027 | 0.1309   | NaN   | 40103 | 0.29     | 101 |
|   22 | Accept | 40027 | 0.22526  | NaN   | 40106 | -0.0343  |  55 |
|   23 | Accept | 40027 | 0.30185  | NaN   | 40108 | -0.0343  |  55 |
|   24 | Accept | 40027 | 0.20494  | 40027 | 40018 | -0.0343  |  55 |
|   25 | Accept | 40027 | 0.23858  | 40027 | 40011 | -0.0343  |  38 |
|   26 | Accept | 40027 | 0.1978   | 40027 | 40010 | -0.0343  |  46 |
|   27 | Infeas | 40027 | 0.084622 | NaN   | 40095 | 0.29     | 108 |
|   28 | Accept | 40027 | 0.15048  | NaN   | 40097 | -0.0343  |  55 |
|   29 | Accept | 40027 | 0.14118  | 40027 | 40016 | -0.0343  |  55 |
|   30 | Accept | 40027 | 0.18496  | 40027 | 40017 | -0.0343  |  76 |
|   31 | Accept | 40027 | 0.15109  | 40027 | 40015 | -0.0296  |  75 |
|   32 | Accept | 40027 | 0.28697  | 40027 | 40012 | -0.0343  |  38 |
|   33 | Accept | 40027 | 0.19405  | 40027 | 40011 | -0.0343  |  46 |
|   34 | Accept | 40027 | 0.23407  | 40027 | 40010 | -0.0343  |  30 |
|   35 | Accept | 40027 | 0.2241   | 40027 | 40009 | -0.0343  |  30 |
|   36 | Accept | 40027 | 0.22822  | 40027 | 40008 | -0.0343  |  30 |
|   37 | Accept | 40027 | 0.23876  | 40027 | 40007 | -0.0343  |  38 |
|   38 | Accept | 40027 | 0.12949  | 40027 | 40008 | -0.0296  |  75 |
|   39 | Accept | 40027 | 0.14375  | 40027 | 40008 | -0.00649 |  85 |
|   40 | Accept | 40027 | 0.22908  | 40027 | 40008 | -0.0343  |  30 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 19.3236 seconds
Total objective function evaluation time: 10.2084

Best observed feasible point:
    MinLeafSize
    ___________
        38

Observed objective function value = 40027
Estimated objective function value = 40034.0182
Function evaluation time = 0.25714
Observed constraint violations = [ -0.034265 ]

Best estimated feasible point (according to models):
    MinLeafSize
    ___________
        55

Estimated objective function value = 40007.5995
Estimated function evaluation time = 0.19489
Estimated constraint violations = [ -0.034265 ]
By default, the software resumes each optimization problem by calling fitctree with the same hyperparameter optimization options as before, with the exception of MaxObjectiveEvaluations, whose value is increased by 30. The resume function returns a new AggregateBayesianOptimization object, but does not alter the original final attained models in Mdl.
Display a summary of the new results.
summary(newResults)
Objective: CompactModelSize (bytes)
Constraint: kfoldLoss

                MinObjective    ConstraintAtMinObjective    ConstraintBounds    ConstraintBoundsAreSatisfied    Feasible    LearnerAtMinObjective
                ____________    ________________________    ________________    ____________________________    ________    _____________________

    Result_1       40027               0.12499                 0    0.1                    false                  false      "ClassificationTree"
    Result_2       40027               0.11573                 0    0.15                   true                   true       "ClassificationTree"
The summary indicates that, with the newly attained hyperparameter value, the model for the second optimization problem is now feasible.
Display the new attained hyperparameter value for the second optimization problem.
newResults.HyperparameterOptimizationResults{2}.XAtMinObjective
ans=table
MinLeafSize
___________
38
Train a new cross-validated binary decision tree classification model with the new attained hyperparameter value.
optimizedMdl = fitctree(obs,grp,MinLeafSize=38);
Display the compact size of the new model in bytes.
learnersize(optimizedMdl)
ans = 40027
Calculate the classification loss for the cross-validated model.
kfoldLoss(crossval(optimizedMdl))
ans = single
0.1389
The loss lies within the constraint bounds of [0, 0.15].
Resume Hyperparameter Optimization With Modified Variables
This example shows how to resume a set of hyperparameter optimization problems with modified variables. The example uses the gprdata2 data set that ships with your software.
Load the data.
load('gprdata2.mat')
The data set contains simulated data with one predictor variable and a continuous response.
Create a structure that contains the following nondefault settings for the hyperparameter optimization problems:
- Use the compact model size as a constraint, with bounds of [0, 10000] bytes for the first problem and [0, 20000] bytes for the second problem.
- Use the "expected-improvement-plus" acquisition function (for reproducibility).
- Perform a maximum of 10 objective function evaluations for each optimization problem.
- Suppress the display of plots.
hyperopts = struct(AcquisitionFunctionName="expected-improvement-plus", ...
    ConstraintType="size", ConstraintBounds=[10000; 20000], ...
    MaxObjectiveEvaluations=10, ShowPlots=false);
For each hyperparameter optimization problem, train a GPR model and optimize the Sigma hyperparameter within the range [0, 0.01] using the specified optimization settings. Use a squared exponential kernel function with default kernel parameters.
rng(0,'twister'); % For reproducibility
[Mdl,hpoResults] = fitrgp(x,y,KernelFunction="squaredexponential", ...
    OptimizeHyperparameters=optimizableVariable(Sigma=[0 0.01]), ...
    HyperparameterOptimizationOptions=hyperopts);
|=====================================================================================================|
|  Objective        : "kfoldLoss"                                                                     |
|  Constraint       : "CompactModelSize (bytes)"                                                      |
|  Constraint Bounds: [0 10000]                                                                       |
|=====================================================================================================|
| Iter | Eval result | Objective: log(1+loss) | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | Sigma |
|=====================================================================================================|
|    1 | Infeas | 0.29871 | 2.281  | NaN | 0.29871 | 6.01e+03 | 0.0021539  |
|    2 | Infeas | 1.1091  | 2.0392 | NaN | 0.3711  | 6.01e+03 | 0.0057638  |
|    3 | Infeas | 0.29823 | 2.4338 | NaN | 0.3315  | 6.01e+03 | 0.0098837  |
|    4 | Infeas | 1.8412  | 1.8691 | NaN | 0.54369 | 6.01e+03 | 6.9219e-05 |
|    5 | Infeas | 0.29873 | 2.6339 | NaN | 0.76919 | 6.01e+03 | 4.3883e-05 |
|    6 | Infeas | 1.1091  | 2.2547 | NaN | 0.82584 | 6.01e+03 | 0.0052323  |
|    7 | Infeas | 0.29839 | 2.1299 | NaN | 0.75049 | 6.01e+03 | 0.0082039  |
|    8 | Infeas | 0.29845 | 2.1781 | NaN | 0.69398 | 6.01e+03 | 0.0074221  |
|    9 | Infeas | 0.29823 | 2.228  | NaN | 0.65001 | 6.01e+03 | 0.0098713  |
|   10 | Infeas | 1.1091  | 1.7088 | NaN | 0.69592 | 6.01e+03 | 0.0043666  |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 24.1372 seconds
Total objective function evaluation time: 21.7564

No feasible points were found.
|=====================================================================================================|
|  Objective        : "kfoldLoss"                                                                     |
|  Constraint       : "CompactModelSize (bytes)"                                                      |
|  Constraint Bounds: [0 20000]                                                                       |
|=====================================================================================================|
| Iter | Eval result | Objective: log(1+loss) | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | Sigma |
|=====================================================================================================|
|    1 | Best   | 0.3047  | 2.1728 | 0.3047  | 0.3047  | -3.99e+03 | 9.8148e-05 |
|    2 | Best   | 0.30457 | 1.7426 | 0.30457 | 0.30457 | -3.99e+03 | 0.0054459  |
|    3 | Accept | 0.30466 | 2.0936 | 0.30457 | 0.30464 | -3.99e+03 | 0.0029412  |
|    4 | Best   | 0.30434 | 1.747  | 0.30434 | 0.30434 | -3.99e+03 | 0.0089683  |
|    5 | Best   | 0.30425 | 2.3    | 0.30425 | 0.30425 | -3.99e+03 | 0.0099999  |
|    6 | Accept | 0.30425 | 1.7777 | 0.30425 | 0.30425 | -3.99e+03 | 0.0099992  |
|    7 | Accept | 0.30425 | 2.2579 | 0.30425 | 0.30425 | -3.99e+03 | 0.0099979  |
|    8 | Accept | 0.30425 | 2.1081 | 0.30425 | 0.30425 | -3.99e+03 | 0.0099993  |
|    9 | Accept | 0.30425 | 2.0023 | 0.30425 | 0.30425 | -3.99e+03 | 0.0099999  |
|   10 | Accept | 0.30446 | 2.078  | 0.30425 | 0.30425 | -3.99e+03 | 0.0073607  |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 23.0263 seconds
Total objective function evaluation time: 20.28

Best observed feasible point:
      Sigma
    _________
    0.0099999

Observed objective function value = 0.30425
Estimated objective function value = 0.30425
Function evaluation time = 2.3
Observed constraint violations = [ -3990.500000 ]

Best estimated feasible point (according to models):
      Sigma
    _________
    0.0099999

Estimated objective function value = 0.30425
Estimated function evaluation time = 2.0183
Estimated constraint violations = [ -3990.500000 ]
Display a summary of the optimization results.
summary(hpoResults)
Objective: kfoldLoss
Constraint: CompactModelSize (bytes)

                MinObjective    ConstraintAtMinObjective    ConstraintBounds    ConstraintBoundsAreSatisfied    Feasible    LearnerAtMinObjective
                ____________    ________________________    ________________    ____________________________    ________    _____________________

    Result_1      0.29823               16010                  0    10000                  false                  false        "RegressionGP"
    Result_2      0.30425               16010                  0    20000                  true                   true         "RegressionGP"
The final model in the first optimization problem is infeasible, because its compact size lies outside the constraint bounds. The second optimization problem has a minimum objective value of 0.30425 and is feasible. Display the properties of its model.
details(Mdl{2})
  RegressionGP with properties:

                        IsActiveSetVector: [501x1 logical]
                            LogLikelihood: -1.2483e+03
                         ActiveSetHistory: []
                           BCDInformation: []
                                        Y: [501x1 double]
                                        X: [501x1 double]
                                 RowsUsed: []
                                        W: [501x1 double]
                          ModelParameters: [1x1 classreg.learning.modelparams.GPParams]
                          NumObservations: 501
                                 BinEdges: {}
        HyperparameterOptimizationResults: [1x1 BayesianOptimization]
                           PredictorNames: {'x1'}
                    CategoricalPredictors: []
                             ResponseName: 'Y'
                   ExpandedPredictorNames: {'x1'}
                        ResponseTransform: 'none'
                           KernelFunction: 'SquaredExponential'
                        KernelInformation: [1x1 struct]
                            BasisFunction: 'Constant'
                                     Beta: 7.9950
                                    Sigma: 0.0100
                        PredictorLocation: []
                           PredictorScale: []
                                    Alpha: [501x1 double]
                         ActiveSetVectors: [501x1 double]
                                FitMethod: 'Exact'
                            PredictMethod: 'Exact'
                          ActiveSetMethod: 'Random'
                            ActiveSetSize: 501
The final attained model in the second optimization problem has a Sigma value of 0.0100.
Resume the hyperparameter optimization of the second problem, and modify the Sigma variable so that its range is [0, 1]. By default, the software resumes the optimization problem by calling fitrgp with the same hyperparameter optimization options as before, with the exception of MaxObjectiveEvaluations, whose value is increased by 30. The resume function returns the new AggregateBayesianOptimization object newResults, and does not alter the original final attained model in Mdl.
newResults=resume(hpoResults, Results=2, ...
VariableDescriptions={optimizableVariable(Sigma=[0 1])});
|=====================================================================================================|
|  Objective        : "kfoldLoss"                                                                     |
|  Constraint       : "CompactModelSize (bytes)"                                                      |
|  Constraint Bounds: [0 20000]                                                                       |
|=====================================================================================================|
| Iter | Eval result | Objective: log(1+loss) | Objective runtime | BestSoFar (observed) | BestSoFar (estim.) | Constraint1 violation | Sigma |
|=====================================================================================================|
|   11 | Accept | 0.41882  | 1.2293 | 0.30425  | 0.30428  | -3.99e+03 | 0.66292 |
|   12 | Best   | 0.03798  | 1.6116 | 0.03798  | 0.03799  | -3.99e+03 | 0.16757 |
|   13 | Best   | 0.037831 | 1.7601 | 0.037831 | 0.037831 | -3.99e+03 | 0.23883 |
|   14 | Accept | 0.03789  | 1.9903 | 0.037831 | 0.037789 | -3.99e+03 | 0.20216 |
|   15 | Best   | 0.037818 | 1.9654 | 0.037818 | 0.037787 | -3.99e+03 | 0.33607 |
|   16 | Best   | 0.037799 | 1.752  | 0.037799 | 0.037694 | -3.99e+03 | 0.29549 |
|   17 | Accept | 0.21443  | 1.6676 | 0.037799 | 0.037694 | -3.99e+03 | 0.41667 |
|   18 | Accept | 0.42224  | 1.3012 | 0.037799 | 0.037683 | -3.99e+03 | 0.99989 |
|   19 | Accept | 0.037806 | 1.5579 | 0.037799 | 0.037317 | -3.99e+03 | 0.31875 |
|   20 | Accept | 0.037807 | 1.5803 | 0.037799 | 0.037177 | -3.99e+03 | 0.26588 |
|   21 | Accept | 0.037936 | 1.639  | 0.037799 | 0.037141 | -3.99e+03 | 0.18244 |
|   22 | Accept | 0.03781  | 1.6934 | 0.037799 | 0.037274 | -3.99e+03 | 0.32505 |
|   23 | Accept | 0.037855 | 1.7994 | 0.037799 | 0.03726  | -3.99e+03 | 0.22119 |
|   24 | Accept | 0.037944 | 1.5767 | 0.037799 | 0.037259 | -3.99e+03 | 0.1798  |
|   25 | Accept | 0.037801 | 1.6126 | 0.037799 | 0.037253 | -3.99e+03 | 0.27948 |
|   26 | Accept | 0.037817 | 1.8017 | 0.037799 | 0.037248 | -3.99e+03 | 0.25247 |
|   27 | Accept | 0.037809 | 1.6188 | 0.037799 | 0.037416 | -3.99e+03 | 0.32457 |
|   28 | Accept | 0.03781  | 1.8364 | 0.037799 | 0.037507 | -3.99e+03 | 0.32529 |
|   29 | Accept | 0.037841 | 1.8995 | 0.037799 | 0.037506 | -3.99e+03 | 0.23064 |
|   30 | Accept | 0.037946 | 1.7412 | 0.037799 | 0.037507 | -3.99e+03 | 0.17898 |
|   31 | Accept | 0.037803 | 2.4806 | 0.037799 | 0.037508 | -3.99e+03 | 0.27441 |
|   32 | Accept | 0.037801 | 2.6047 | 0.037799 | 0.037494 | -3.99e+03 | 0.30734 |
|   33 | Accept | 0.037816 | 2.0562 | 0.037799 | 0.037494 | -3.99e+03 | 0.25336 |
|   34 | Accept | 0.03787  | 3.0448 | 0.037799 | 0.037494 | -3.99e+03 | 0.21259 |
|   35 | Accept | 0.42015  | 1.3942 | 0.037799 | 0.037475 | -3.99e+03 | 0.83224 |
|   36 | Accept | 0.41878  | 1.6174 | 0.037799 | 0.037454 | -3.99e+03 | 0.54762 |
|   37 | Accept | 0.42198  | 1.9642 | 0.037799 | 0.037435 | -3.99e+03 | 0.91788 |
|   38 | Accept | 0.41896  | 1.489  | 0.037799 | 0.037415 | -3.99e+03 | 0.74565 |
|   39 | Accept | 0.41773  | 1.7524 | 0.037799 | 0.037516 | -3.99e+03 | 0.48318 |
|   40 | Accept | 0.037918 | 2.4197 | 0.037799 | 0.037516 | -3.99e+03 | 0.18954 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 85.0182 seconds
Total objective function evaluation time: 74.7373

Best observed feasible point:
     Sigma
    _______
    0.29549

Observed objective function value = 0.037799
Estimated objective function value = 0.037951
Function evaluation time = 1.752
Observed constraint violations = [ -3990.500000 ]

Best estimated feasible point (according to models):
     Sigma
    _______
    0.32457

Estimated objective function value = 0.037516
Estimated function evaluation time = 1.8176
Estimated constraint violations = [ -3990.500000 ]
The new minimum observed objective value and corresponding Sigma value are 0.037799 and 0.29549, respectively.
Input Arguments
AggregateResults — Aggregate optimization results
AggregateBayesianOptimization object

Aggregate optimization results, specified as an AggregateBayesianOptimization object.
Name-Value Arguments
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.
Example: resume(AggregateResults,Results=[1 3]) resumes the optimization of the first and third optimization problems in AggregateResults.
Note
The MaxTime and MaxObjectiveEvaluations name-value arguments specify additional time or objective evaluations, above the numbers stored in AggregateResults. For example, the default number of evaluations is 30 in addition to the original specification.
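As a sketch of this additive behavior (assuming AggregateResults is an existing AggregateBayesianOptimization object whose problems were originally run with MaxObjectiveEvaluations=10):

```matlab
% Hypothetical sketch: AggregateResults was created with
% MaxObjectiveEvaluations=10 per problem. This call performs
% 15 *additional* evaluations per problem, for 25 in total.
newResults = resume(AggregateResults,MaxObjectiveEvaluations=15);
```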
Results
— Optimization problems to resume
"all" (default) | numeric vector of positive integers | logical vector

Optimization problems in AggregateResults to resume, specified as "all", a numeric vector of positive integers containing indices in the range [1,N], or a logical vector of length N, where N = numel(AggregateResults.HyperparameterOptimizationResults).
Example: Results=[1 3]
Data Types: single | double | logical
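For illustration, a logical vector selects problems the same way as numeric indices. This hedged sketch assumes AggregateResults contains three optimization problems:

```matlab
% Hypothetical sketch: resume only the problems flagged true
% (here, the first and third of three problems).
toResume = logical([1 0 1]);
newResults = resume(AggregateResults,Results=toResume);
```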
VariableDescriptions
— Variables to modify
[] (default) | cell array

Variables to modify, specified as a P-by-1 cell array, where P must be equal to 1 or a value that depends on the contents of Results.

| Contents of Results | Value of P |
|---|---|
| "all" | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
Each cell of VariableDescriptions contains a K-by-1 or 1-by-K array of optimizableVariable objects, where K is the number of optimizable variables in the optimization problem. The software applies the contents of each cell to the AggregateResults.HyperparameterOptimizationResults property of the corresponding optimization problem with the index specified in Results. If P=1, then the software applies the cell contents to all optimization problems with the indices specified in Results.
You can modify only the following properties of a variable in an optimization:
- Range of real or integer variables. For example:
  xvar = optimizableVariable(x=[-10,10]);
  % Modify the range:
  xvar.Range = [1,5];
- Type between "integer" and "real". For example:
  xvar.Type = "integer";
- Transform of real or integer variables between "log" and "none". For example:
  xvar.Transform = "log";
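Putting these pieces together, a hedged sketch follows; the variable name Sigma and the problem index are assumptions for illustration:

```matlab
% Hypothetical sketch: widen the range of the optimizable
% variable "Sigma" to [0 1], and apply the modification only
% to the second optimization problem when resuming.
sigmaVar = optimizableVariable("Sigma",[0 1]);
newResults = resume(AggregateResults,Results=2, ...
    VariableDescriptions={sigmaVar});
```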
Data Types: cell
MaxObjectiveEvaluations
— Maximum number of objective function evaluations
array of positive integers | []
Maximum number of objective function evaluations, specified as a P-by-1 array of positive integers or []. The value of P must be equal to 1 or a value that depends on the contents of Results.
| Contents of Results | Value of P |
|---|---|
| "all" | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
If P=1, the software applies the value of MaxObjectiveEvaluations to all optimization problems in AggregateResults with the indices specified in Results. Otherwise, the software applies each element of MaxObjectiveEvaluations to the corresponding optimization problem with the index specified in Results.
If MaxObjectiveEvaluations is [], the default value depends on the fitting function and optimizer used to create AggregateResults. For more information, see the HyperparameterOptimizationOptions name-value argument description on the documentation pages of the individual fitting functions.
Example: MaxObjectiveEvaluations=40
Data Types: single | double
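For example, a per-problem sketch (assuming AggregateResults contains at least two optimization problems):

```matlab
% Hypothetical sketch: P = numel(Results) = 2, so
% MaxObjectiveEvaluations is a 2-by-1 array. Run 20 additional
% evaluations for problem 1 and 40 additional for problem 2.
newResults = resume(AggregateResults,Results=[1 2], ...
    MaxObjectiveEvaluations=[20; 40]);
```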
MaxTime
— Time limit
[] (default) | P-by-1 array

Time limit for the optimization, specified as [] or a P-by-1 array. The array must contain nonnegative integers or Inf. The value of P must be equal to 1 or a value that depends on the contents of Results.
| Contents of Results | Value of P |
|---|---|
| "all" | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
If P=1, then the software applies the value of MaxTime to all optimization problems in AggregateResults with the indices specified in Results. Otherwise, the software applies each value of MaxTime to the corresponding optimization problem with the index specified in Results.
If MaxTime is [], the default value is Inf.
Example: MaxTime=[30 60]
Data Types: single | double
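A per-problem time budget can be sketched similarly; the indices and times here are assumptions for illustration:

```matlab
% Hypothetical sketch: allow 30 additional seconds for the
% first problem and 60 additional seconds for the third.
newResults = resume(AggregateResults,Results=[1 3], ...
    MaxTime=[30 60]);
```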
Output Arguments
newAggregateResults — Optimization results
AggregateBayesianOptimization object

Optimization results, returned as an AggregateBayesianOptimization object.
Version History
Introduced in R2024b