importNetworkFromONNX error opening LLM

4 views (last 30 days)
joel brinton on 4 Dec 2023
Commented: joel brinton on 22 Dec 2023
I'm trying to import an ONNX model into MATLAB. First, I exported a Large Language Model (LLM) to ONNX from Hugging Face using the Optimum library. Next, I imported it into MATLAB with the Deep Learning Toolbox Converter for ONNX Model Format. However, during import I get the following error:
>> net = importNetworkFromONNX("model.onnx")
Error using reshape
Number of elements must not change. Use [] as one of the size inputs to automatically calculate the appropriate size for that dimension.
Error in nnet.internal.cnn.onnx.fcn.GraphTranslation/getInitializerValue (line 129)
data = reshape(data, dimVec); % Apply MATLAB shape
...
Error in importNetworkFromONNX (line 77)
Network = nnet.internal.cnn.onnx.importNetworkFromONNX(modelfile, varargin{:});
I think the issue has to do with the lack of support for the external data file, which is required for models greater than 2 GB. ONNX models are built on protobuf, which has a 2 GB limit. When a model is greater than 2 GB, ONNX stores the model weights in a separate external raw data file, called model.onnx_data. I've noted that importNetworkFromONNX() doesn't even attempt to open the associated onnx_data file before it aborts.
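As a quick check (a minimal Python sketch, assuming the onnx package that Optimum already depends on is installed), you can confirm whether the exported model stores its weights externally without loading the full payload:

import onnx
from onnx.external_data_helper import uses_external_data

# Load only the graph structure, skipping the (possibly >2 GB) weight payload.
model = onnx.load("model.onnx", load_external_data=False)

# Count initializers whose data lives in an external file such as model.onnx_data.
external = [t.name for t in model.graph.initializer if uses_external_data(t)]
print(f"{len(external)} of {len(model.graph.initializer)} initializers are stored externally")

If that count is non-zero, the importer has to resolve model.onnx_data as well as model.onnx.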
How can we get large ONNX model support into MATLAB? I've scoured the comments already and, for some reason, no one else seems to have run into this issue.
thanks!

Answers (1)

Ashutosh Thakur on 22 Dec 2023
Hi Joel,
I understand that you are facing issues while importing an ONNX model into MATLAB.
Here are a few suggestions that may help:
  • If possible, try splitting the large model into multiple smaller sub-models so that each one stays under the 2 GB limit, then run the sub-models sequentially, passing the output of one as the input of the next (see the sketch after this list).
  • Based on the error message, it also seems that reshape is being called with a mismatched number of elements; the issue may lie in how this particular ONNX network uses the Reshape operation.
  • If the problem persists, I suggest you reach out to MathWorks Technical Support here: https://www.mathworks.com/support/contact_us.html
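As a rough sketch of the first suggestion, onnx.utils.extract_model from the Python onnx package can cut a graph at an intermediate tensor. The tensor names below (input_ids, logits, and the cut point) are placeholders you would need to replace with names from your own graph (e.g. looked up in Netron), and for a model over 2 GB the model.onnx_data file must sit next to model.onnx:

import onnx.utils

# First half: from the graph inputs up to an intermediate tensor chosen as the cut point.
onnx.utils.extract_model(
    "model.onnx", "model_part1.onnx",
    input_names=["input_ids"],                  # placeholder graph input
    output_names=["/model/Reshape_2_output_0"]  # placeholder cut tensor
)

# Second half: from the cut tensor to the original graph outputs.
onnx.utils.extract_model(
    "model.onnx", "model_part2.onnx",
    input_names=["/model/Reshape_2_output_0"],
    output_names=["logits"]                     # placeholder graph output
)

Whether each half lands under 2 GB depends on where you cut, and extract_model may itself struggle with very large graphs, so treat this only as a starting point.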
I hope this helps!
Thanks.
  1 comment
joel brinton on 22 Dec 2023
Thank you Ashutosh. I will try to see if I can export just one layer of the LLM and keep it under the 2GB limit. I hadn't considered that yet.
I think the error is because some of the Reshape inputs are not being loaded (due to the external data file not being supported). Here is the ONNX node description of the Reshape operation:
node {
  output: "/model/Constant_25_output_0"
  name: "/model/Constant_25"
  op_type: "Constant"
  attribute {
    name: "value"
    t {
      dims: 1
      data_type: 7
      name: "/model/Constant_25_attr::value"
      raw_data: "\377\377\377\377\377\377\377\377"
    }
    type: TENSOR
  }
}
node {
  input: "/model/Concat_3_output_0"
  input: "/model/Constant_25_output_0"
  output: "/model/Reshape_2_output_0"
  name: "/model/Reshape_2"
  op_type: "Reshape"
}
As you can see, the usage of the Reshape operation is standard, but the shape (dimension vector) input comes from raw_data, which I'm assuming is not supported. Keeping the model under 2 GB should prevent my ONNX exporter from using raw_data.
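For reference, data_type 7 is INT64, and those eight 0xFF bytes decode to a single value of -1, i.e. the usual "infer this dimension" target shape [-1]. A quick Python sketch of the decoding:

import struct

# The eight 0xFF bytes from the Constant node above, read as a little-endian int64.
raw = b"\xff\xff\xff\xff\xff\xff\xff\xff"
(dim,) = struct.unpack("<q", raw)
print(dim)  # prints -1, so the Reshape target shape is [-1]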
If importing the model one layer at a time doesn't work, I'll reach out for support.





Version

R2023b
