How do I export a neural network from MATLAB?

MathWorks Support Team on 15 Feb 2017
Edited: Jon Cherrie on 21 Dec 2023
I have a neural network which I trained using MATLAB. I want to export the network so I can use it with other frameworks, for example Caffe. How do I do that?

Accepted Answer

MathWorks Support Team on 8 Dec 2023
Edited: MathWorks Support Team on 20 Dec 2023
You can export a Deep Learning Toolbox network or layer graph to TensorFlow and ONNX using the "exportNetworkToTensorFlow" and "exportONNXNetwork" functions, respectively. For more information about exporting networks to external deep learning platforms, see the documentation for these functions.
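For example, exporting to ONNX is a one-liner. The snippet below is a minimal sketch, assuming a trained network is stored in a variable named net and the Deep Learning Toolbox Converter for ONNX Model Format support package is installed:
% Export the trained network "net" to an ONNX model file.
% Other frameworks that support ONNX import can then load "myNetwork.onnx".
exportONNXNetwork(net, "myNetwork.onnx");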
Alternatively, you can deploy the trained network using MATLAB Compiler SDK.
Using the MATLAB Compiler SDK, you can save the trained network as a MAT file, and write a MATLAB function that loads the network from the file, performs the desired computation, and returns the network's output.
You can then compile your MATLAB function into a shared library to be used in your C/C++, .NET, Java, or Python project.
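A minimal sketch of that workflow is shown below; the variable name net, the file names, and the function name predictWithNet are placeholders for illustration, not a fixed API:
% Save the trained network "net" to a MAT file once, after training.
save("trainedNet.mat", "net");

% --- predictWithNet.m ---
% Deployable function: loads the saved network once and returns predictions.
function scores = predictWithNet(X)
    persistent net
    if isempty(net)
        data = load("trainedNet.mat", "net");
        net = data.net;
    end
    scores = predict(net, X);
end

% Compile the function into a shared library with MATLAB Compiler SDK,
% for example (R2021a or later):
%   compiler.build.cppSharedLibrary("predictWithNet.m")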
You can find more information in the MATLAB Compiler SDK documentation.
Furthermore, the objects that MATLAB uses to represent neural networks are transparent, and you can therefore access all the information that describes your trained network.
For example, a trained convolutional neural network is returned as an object of type "SeriesNetwork". You can then inspect the weights and biases of individual layers:
convnet.Layers(2).Weights
convnet.Layers(2).Bias
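As an illustrative sketch (assuming the trained network is in a variable named convnet, as above), you could collect every layer's learnable parameters and save them to a MAT file that other tools can read:
% Gather weights and biases from every layer that has them.
params = struct();
for i = 1:numel(convnet.Layers)
    layer = convnet.Layers(i);
    if isprop(layer, "Weights")
        params.(matlab.lang.makeValidName(layer.Name)).Weights = layer.Weights;
    end
    if isprop(layer, "Bias")
        params.(matlab.lang.makeValidName(layer.Name)).Bias = layer.Bias;
    end
end
% Save each layer's parameters as a separate variable in the MAT file.
save("convnetParams.mat", "-struct", "params");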
Then, using for example Caffe's MATLAB interface (matcaffe), you should be able to save a convolutional neural network as a Caffe model. The code for the MATLAB interface is available in the Caffe repository and includes a classification demo that shows you how to use the interface.
Please note that the above code is not developed or supported by MathWorks Technical Support. If you have any questions about how to use the code, please contact the project's developers.

More Answers (2)

Maria Duarte Rosa on 25 Jun 2018
Edited: Jon Cherrie on 21 Dec 2023
The exportONNXNetwork function in Deep Learning Toolbox Converter for ONNX Model Format allows one to export a trained deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks that support ONNX model import.
To use the network with TensorFlow, use the exportNetworkToTensorFlow function that is part of the Deep Learning Toolbox Converter for TensorFlow Models.
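For example, a minimal sketch (assuming a trained network in a variable named net and the TensorFlow converter support package installed):
% Export the network as a Python package named "myTFModel".
% The generated package describes how to load the exported model in Python.
exportNetworkToTensorFlow(net, "myTFModel");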
  1 comment
michael scheinfeild on 6 Aug 2018
I still have not had any success importing it into C++ from ONNX; there are many compilation issues.



michael scheinfeild on 14 Apr 2019
After testing ONNX, I found that the output of the convolution layers is not the same as in MATLAB.
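One way to investigate such differences is to re-import the exported ONNX file into MATLAB and compare predictions against the original network on the same input. A rough sketch (the file name, input size, and output-layer type here are assumptions for illustration):
% Re-import the ONNX model exported earlier and compare outputs.
importedNet = importONNXNetwork("myNetwork.onnx", "OutputLayerType", "classification");
X = rand(227, 227, 3, "single");                 % example input matching the network's input size
yOriginal = predict(net, X);                     % "net" is the original MATLAB network
yImported = predict(importedNet, X);
maxDiff = max(abs(yOriginal(:) - yImported(:)))  % small differences from numeric precision are expected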
