
How to export RegressionEnsemble to ONNX.

16 views (last 30 days)
Michael on 14 Mar 2024
Answered: Garmit Pant on 4 Jul 2024 at 5:13
exportONNXNetwork only works on neural nets. However, ONNX has support for regression models, as demonstrated here: https://onnx.ai/sklearn-onnx/auto_examples/plot_convert_model.html
Does anyone have ideas on a workflow to get our model(s) out of MATLAB for use in Triton? There will probably need to be an intermediary format.

Answers (1)

Garmit Pant on 4 Jul 2024 at 5:13
Hello Michael
From what I understand, you want to export a MATLAB RegressionEnsemble model for use with Triton.
Your intuition to use an intermediary format is correct. However, MATLAB currently doesn’t support the conversion of ensemble regression learner models like ‘RegressionEnsemble’ to any other format.
The “exportONNXNetwork” function expects a “dlnetwork” object as input. Converting a RegressionEnsemble model to a dlnetwork in MATLAB is not possible because they represent fundamentally different types of models: RegressionEnsemble is an ensemble of decision trees, whereas dlnetwork is used for deep learning models.
If you only need to deploy ensemble learning models in Triton, then training and exporting them with scikit-learn in Python will likely be a better fit for your use case.
Scikit-learn supports exporting many models, including ensemble learning models, to ONNX through the “skl2onnx” package; the skl2onnx documentation provides a list of supported models.
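For illustration, here is a rough sketch of that workflow, along the lines of the sklearn-onnx example linked in the question. The choice of RandomForestRegressor, the feature count, the file name “model.onnx”, and the input name “float_input” are assumptions for the example, not fixed requirements:

# Rough sketch: train a scikit-learn ensemble regressor and convert it to
# ONNX with skl2onnx. Feature count, file name, and input name are
# illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Train an ensemble regressor (playing the role of the MATLAB RegressionEnsemble).
X, y = make_regression(n_samples=500, n_features=10, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Declare the input signature: a float32 tensor with a dynamic batch dimension.
initial_types = [("float_input", FloatTensorType([None, X.shape[1]]))]
onnx_model = convert_sklearn(model, initial_types=initial_types)

# Serialize the ONNX graph to disk so it can go into a Triton model repository.
with open("model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())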
You can also consider using a neural network for your task and exporting that from MATLAB using “exportONNXNetwork”.
You can refer to the MATLAB documentation on the “dlnetwork” object for a further understanding of how to design deep learning networks for various tasks and workflows.
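Whichever route produces the ONNX file, it is worth smoke-testing it locally with onnxruntime (the runtime that Triton’s ONNX backend builds on) before placing it in the Triton model repository. A minimal sketch, assuming the file name, input name, and feature count from the conversion example above:

# Rough sketch: verify the exported ONNX model with onnxruntime before
# deployment. "model.onnx", the input name, and the feature count (10) are
# assumptions carried over from the conversion sketch above.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name  # e.g. "float_input"

# Models converted from scikit-learn typically expect float32 inputs.
dummy = np.random.rand(5, 10).astype(np.float32)
predictions = sess.run(None, {input_name: dummy})[0]
print(predictions.shape)  # one prediction per row of the dummy batch

In a Triton model repository, the file would then typically sit at <model_name>/1/model.onnx, with a config.pbtxt describing the input and output tensors.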
I hope you find the above explanation and suggestions useful!

Release

R2023b
