I'm answering my own question. (I had actually found the answer before asking, after much fiddling, but I thought it might be useful to others.)
NOTE: this is a somewhat disgusting hack. I'd love to know if there is a better way!
Although the "Trained" property of my ensemble has protected set access, and the constituent CompactClassificationTree objects don't expose a public "prune" method, the hidden "Impl" property of each is publicly settable and lets me do what I want. So after using a template tree with pruning enabled to create an initial ensemble, I was able to loop over the resulting set of trained trees and prune each one as follows. (The key is the two uses of "Impl" to reach public implementation objects I could modify.)
% Create a template tree with pruning enabled, then train the ensemble.
tTree = templateTree( ... , 'Prune', 'on');
ensembleCTrees = fitensemble(xdata, labels, 'AdaBoostM2', 10, tTree);

% Copy the ensemble, then prune each trained tree in place by reaching
% through the hidden Impl properties.
prunedEnsemble = ensembleCTrees;
for iTree = 1:prunedEnsemble.NumTrained
    prunedEnsemble.Impl.Trained{iTree}.Impl = prune( ...
        prunedEnsemble.Impl.Trained{iTree}.Impl, 'Level', 20);
end
Then I could try different prune levels and compare prediction performance between the original and pruned ensembles, as in the sketch below.
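For reference, here is a minimal sketch of that comparison, not my exact code: it sweeps a few prune levels and uses the ensemble's standard loss method on a held-out set. The variables xtest and ytest (hypothetical names) are assumed test data, and the prune levels shown are arbitrary.

% Minimal sketch, assuming held-out data xtest/ytest (hypothetical names).
% Compares test-set classification loss of pruned ensembles against the
% unpruned original. A 'Level' larger than a tree's maximum prune level
% may error, so pick levels appropriate to your trees.
baseLoss = loss(ensembleCTrees, xtest, ytest);
pruneLevels = [5 10 20 40];
prunedLoss = zeros(size(pruneLevels));
for iLevel = 1:numel(pruneLevels)
    prunedEnsemble = ensembleCTrees;
    for iTree = 1:prunedEnsemble.NumTrained
        prunedEnsemble.Impl.Trained{iTree}.Impl = prune( ...
            prunedEnsemble.Impl.Trained{iTree}.Impl, ...
            'Level', pruneLevels(iLevel));
    end
    prunedLoss(iLevel) = loss(prunedEnsemble, xtest, ytest);
end
fprintf('Unpruned loss: %g\n', baseLoss);
disp([pruneLevels(:) prunedLoss(:)]);  % prune level vs. test loss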