Code Generation for LSTM Network That Uses Intel MKL-DNN

This example shows how to generate code for a pretrained long short-term memory (LSTM) network that uses the Intel Math Kernel Library for Deep Neural Networks (MKL-DNN). This example generates a MEX function that makes predictions for each step of an input time series. The example demonstrates two approaches. The first approach uses a standard LSTM network. The second approach leverages the stateful behavior of the same LSTM network. This example uses textual descriptions of factory events that can be classified into one of these four categories: Electronic Failure, Leak, Mechanical Failure, and Software Failure. The example uses a pretrained LSTM network. For more information on training the network, see Classify Text Data Using Deep Learning (Text Analytics Toolbox).

Third-Party Prerequisites

This example is supported on Mac®, Linux®, and Windows® platforms. It is not supported for MATLAB Online.

Prepare Input

Load the wordEncoding MAT-file. This MAT-file stores the words encoded as numerical indices. This encoding was performed during the training of the network. For more information, see Classify Text Data Using Deep Learning (Text Analytics Toolbox).
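A minimal load step, assuming the MAT-file stores the word encoding in a variable named enc (the name used by the doc2sequence call later in this example):

load("wordEncoding.mat");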


Create a string array containing the new reports whose event types you want to classify.

reportsNew = [ ...
    "Coolant is pooling underneath sorter."
    "Sorter blows fuses at start up."
    "There are some very loud rattling sounds coming from the assembler."
    "At times mechanical arrangement software freezes."
    "Mixer output is stuck."];

Tokenize the input string by using the preprocessText function.

documentsNew = preprocessText(reportsNew);

Use the doc2sequence (Text Analytics Toolbox) function to convert documents to sequences.

XNew = doc2sequence(enc,documentsNew);
labels = categorical({'Electronic Failure', 'Leak', 'Mechanical Failure', 'Software Failure'});

The lstm_predict Entry-Point Function

A sequence-to-sequence LSTM network enables you to make different predictions for each individual time step of a data sequence. The lstm_predict.m entry-point function takes an input sequence and passes it to a trained LSTM network for prediction. Specifically, the function uses the LSTM network that is trained in the example Classify Text Data Using Deep Learning (Text Analytics Toolbox). The function loads the network object from the textClassifierNetwork.mat file into a persistent variable and then performs prediction. On subsequent calls, the function reuses the persistent object.

function out = lstm_predict(in)

%   Copyright 2020 The MathWorks, Inc.

    % Load the network once and reuse it on subsequent calls.
    persistent mynet;

    if isempty(mynet)
        mynet = coder.loadDeepLearningNetwork('textClassifierNetwork.mat');
    end

    out = predict(mynet, in);
end

To display an interactive visualization of the network architecture and information about the network layers, use the analyzeNetwork function.
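For example, after loading the network into the workspace (using the same MAT-file as the entry-point function above):

net = coder.loadDeepLearningNetwork('textClassifierNetwork.mat');
analyzeNetwork(net)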

Generate MEX

To generate code, create a code configuration object for a MEX target and set the target language to C++. Use the coder.DeepLearningConfig function to create an MKL-DNN deep learning configuration object. Assign it to the DeepLearningConfig property of the code configuration object.

cfg = coder.config('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('mkldnn');

Use the coder.typeof (MATLAB Coder) function to specify the type and size of the input argument to the entry-point function. In this example, the input is of double data type with a feature dimension value of 1 and a variable sequence length.

matrixInput = coder.typeof(double(0),[1 Inf],[false true]);

Generate a MEX function by running the codegen (MATLAB Coder) command.

codegen -config cfg lstm_predict -args {matrixInput} -report
Code generation successful: View report

Run Generated MEX

Call lstm_predict_mex on the first observation.

YPred1 = lstm_predict_mex(XNew{1});

YPred1 contains the probabilities for the four classes. Find the predicted class by calculating the index of the maximum probability.

[~, maxIndex] = max(YPred1);

Associate the index of the maximum probability with the corresponding label and display the classification. From the results, you can see that the network predicted the first event to be a Leak.

predictedLabels1 = labels(maxIndex);

Generate MEX that Accepts Multiple Observations

If you want to perform prediction on many observations at once, you can group the observations together in a cell array and pass the cell array for prediction. The cell array must be a column cell array, and each cell must contain one observation. The sequence lengths of the inputs might vary. In this example, XNew contains five observations. To generate a MEX function that can accept XNew as input, specify the input type to be a 5-by-1 cell array. Specify that each cell be of the same type as matrixInput.

matrixInput = coder.typeof(double(0),[1 Inf],[false true]);
cellInput = coder.typeof({matrixInput}, [5 1]);
codegen -config cfg lstm_predict -args {cellInput} -report
Code generation successful: View report

Run the generated MEX function with XNew as input.

YPred2 = lstm_predict_mex(XNew);

YPred2 is a 5-by-4 matrix, where each row contains the class probabilities for one observation. Find the index with the maximum probability for each of the five inputs and classify them.

[~, maxIndex] = max(YPred2, [], 2);
predictedLabels2 = labels(maxIndex);
     Leak      Mechanical Failure      Mechanical Failure      Software Failure      Electronic Failure 

Generate MEX with Stateful LSTM

Instead of passing the entire time series to predict in a single step, you can run prediction on an input by streaming in one time step at a time and using the function predictAndUpdateState. This function accepts an input, produces an output prediction, and updates the internal state of the network so that future predictions take this initial input into account.

The entry-point function lstm_predict_and_update.m accepts a single-time-step input and processes the input using the predictAndUpdateState function. The predictAndUpdateState function returns a prediction for the input time step and updates the network so that subsequent inputs are treated as subsequent time steps of the same sample. After passing in all time steps, one at a time, the resulting output is identical to the case where all time steps were passed in as a single input.

function out = lstm_predict_and_update(in)

%   Copyright 2020 The MathWorks, Inc.

    % Load the network once and reuse it on subsequent calls.
    persistent mynet;

    if isempty(mynet)
        mynet = coder.loadDeepLearningNetwork('textClassifierNetwork.mat');
    end

    [mynet, out] = predictAndUpdateState(mynet,in);
end

Generate code for lstm_predict_and_update. Because this function accepts a single time step at each call, specify matrixInput to have a fixed sequence dimension of 1 instead of a variable sequence length.

matrixInput = coder.typeof(double(0),[1 1]);
codegen -config cfg lstm_predict_and_update -args {matrixInput} -report
Code generation successful: View report

Run the generated MEX on the first observation.

sequenceLength = size(XNew{1},2);
for i = 1:sequenceLength
    inTimeStep = XNew{1}(:,i);
    YPred3 = lstm_predict_and_update_mex(inTimeStep);
end
clear mex;

Find the index that has the highest probability and map it to the labels.

[~, maxIndex] = max(YPred3);
predictedLabels3 = labels(maxIndex);
