    Import Pretrained Deep Learning Networks into MATLAB

    From the series: Perception

    In this video, Neha Goel joins Connell D’Souza to import networks designed and trained in environments like TensorFlow and PyTorch into MATLAB® using Open Neural Network Exchange (ONNX). You can download ONNX models for popular pretrained deep learning networks from the ONNX Model Zoo.

    A Tiny YOLOv2 ONNX model is imported into MATLAB, and Neha demonstrates how you can use the Deep Network Designer app to edit the model and prepare it for training. This pretrained model is then trained using transfer learning to identify the objects of interest.

    Published: 3 Jan 2020

    Hello and welcome to another episode of the MATLAB and Simulink Robotics Arena. For today's episode, I've got Neha Goel with me again. Today's video is another in our short series on deep learning for object detection.

    In the past couple of videos, we've covered different topics like data preprocessing for deep learning, as well as building and training your own network from scratch in MATLAB. In those videos, we also briefly touched on two important concepts.

    The first concept is importing pretrained networks from other environments. So we're going to use a format called ONNX to import pretrained models from things like TensorFlow or PyTorch. And then we're also going to touch on another important topic called transfer learning. So without wasting too much time, let's jump right into the video. And I'm going to hand it over to Neha, and Neha, take it away.

    So yeah, in today's video, we'll be talking about how we actually import a pretrained YOLOv2 network. So this YOLOv2 network is known as Tiny YOLOv2. And it's an ONNX model.

    So if you click on this link, I'll show you what an ONNX model is. So ONNX is the Open Neural Network Exchange. And in this GitHub link, you will see that it's known as the ONNX Model Zoo. And it has various pretrained neural network models that are available out there on the internet.

    Yes, so for example, I can see MobileNet and ResNet.

    Yeah, so if you scroll down on this, you will see there's an object detection model here that's Tiny YOLOv2. So Tiny YOLOv2 is a smaller version of the YOLOv2 network, which detects 20 classes.

    Gotcha, OK.

    So we will be downloading this model. And we'll be then training it for our data set.

    OK, cool. So let's go ahead and download this model. The next thing we do is we actually need to import the model into MATLAB, right?

    So once we click on that link from the GitHub page, our model is downloaded as a .onnx file. And you will find that file in the Utilities folder here. So here, the main function we use is importONNXNetwork. What this function does is import your downloaded ONNX model, which we have stored here as the model file. And we have to give an output layer type.

    So this output layer type can be a classification or a regression depending upon the type of model we are downloading.

    OK, so I'm guessing the reason why we are using a regression layer is because we are building an object detector?

    Yes.

    OK, gotcha.

    All right, so I'm going to go ahead and run this section real quick. And all that it's going to do is it's going to import the model file and save it to a variable called net. And then we'll use the network analyzer to just visualize what our network looks like.
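
    For reference, the import step being described looks roughly like this. This is a minimal sketch, assuming the downloaded file is named model.onnx and sits in the Utilities folder; the actual script may use different names.

        % Import the Tiny YOLOv2 ONNX file with a regression output layer,
        % then inspect the result in the network analyzer.
        modelfile = fullfile('Utilities', 'model.onnx');   % assumed file name and location
        net = importONNXNetwork(modelfile, 'OutputLayerType', 'regression');
        analyzeNetwork(net)   % shows the roughly 34 layers of Tiny YOLOv2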

    So as we see this, if you scroll down, you'll see that this has around 34 layers. And these are all the layers that are already in Tiny YOLOv2. And the regression layer that we added is at the bottom of the network.

    OK, cool. So let's hop back into MATLAB. What is the next step that we need to take?

    So next up, we have to rebuild or redesign the network for our object classes and then train it. For that, one option is to use the Deep Network Designer app. So this is one of the options that you can use.

    So if you set the design app option to true-- so here, what we'll be doing is importing the network. So with the Import button-- this is the network that we imported.

    So this is the one that we just loaded. So let's go ahead and pull this into the app.

    So can you tell us a little bit about the app while this is opening?

    So in here, in this Deep Network Designer app, there are two options. Either you import a network that you already have in the workspace, which is what we did here, or the other option is designing the whole network from scratch. So I'll show you-- this is the whole network that we actually had. We have all the 34 layers, all the connections, the input image, and the output type.

    So if you want to actually-- if you want to change any parameter, you just click on this layer.

    And these are the options, and you can change them. And so the other thing-- what we will be doing here is importing part of the network. So what we can do is click on a layer in the network, and you just delete this layer.

    So just hit the Delete button.

    Yes, just the Delete button. And if you want to add a new layer, you can just copy and paste a layer here and then connect it.

    Gotcha, OK.

    So this is one of the ways you can actually design the whole network.

    OK, so this is interesting. Because I know in our video on building the network, we actually wrote a bunch of code to define the network. Is there a way in which I can get the code from this? Because I see that I can interactively lay out the network, but do I have to go in and rewrite the code for it after that?

    No, so that's the fun part here. You just click here-- with this Export button, the whole network is exported to the workspace. Or with Generate Code, you can actually generate the whole code. Here, we don't need the code, so we are just using the Export button. And it's exported as layers_1 in the workspace.

    So let's go into the workspace, and we see here that we have layers_1. So once we have that, we store the whole layers_1 as a layer graph, and that's stored in the lgraph variable. Next, we go on to calculating the number of classes.
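
    A minimal sketch of this step, assuming layers_1 is the layer array exported from Deep Network Designer and trainingData is the labeled training table from the earlier videos (the trainingData name is an assumption):

        lgraph = layerGraph(layers_1);          % store the exported layers as a layer graph
        numClasses = width(trainingData) - 1;   % one bounding-box column per class, plus the image file column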

    OK, so the next important topic that we want to talk about, especially with these YOLO networks, is anchor boxes. And I know we spent a significant amount of time on them in our video on building YOLO networks, so we're not going to cover them in detail again. But what we've got here is, if I go ahead and run this section of code, it's going to open up the file that we used to compute these anchor boxes, as well as register those anchor boxes with MATLAB so we can use them for our network. Again, check out our video on designing and training a network-- that's where we talk about anchor boxes a whole lot more.
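
    The utility file mentioned here contains the video's own anchor box computation, so the details live there. As a hedged alternative for reference only, MATLAB also provides estimateAnchorBoxes, which can estimate anchors directly from the labeled boxes; the table layout and the number of anchors below are illustrative assumptions, not the video's values.

        blds = boxLabelDatastore(trainingData(:, 2:end));   % bounding-box columns of the training table (assumed layout)
        anchorBoxes = estimateAnchorBoxes(blds, 4);         % 4 anchors is an illustrative choice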

    So it looks like we've got our layers together. We actually have to assemble our YOLOv2 network right now. So are we going to use a very similar process to what we did earlier?

    Yes. So it's a very similar process to what we used in our last video. So what you see here is we are using the same yolov2Layers function. But we pass in the input image size as 128 by 128 by 3, the number of classes, the anchors-- and this is the network that we imported from the Deep Network Designer app?

    Yep.

    And the important part here is the feature layer. So here we are choosing the feature layer as activation4. And the reason is that, for our number of classes and for this network, we saw that activation4 works best. And after that, the YOLOv2 detection layers are added. So if you run this section, you'll be able to see what our new, final network looks like.
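
    Roughly the call being described, using the layer graph from the Deep Network Designer step, the anchor boxes from the utility file, and activation4 as the feature extraction layer; this is a sketch, not the exact script.

        inputSize = [128 128 3];
        lgraph = yolov2Layers(inputSize, numClasses, anchorBoxes, lgraph, 'activation4');
        analyzeNetwork(lgraph)   % visualize the assembled Tiny YOLOv2 detection network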

    OK, so it's going to create this new lgraph. And then I'll open it in the network analyzer.

    So if you see here, it takes all the layers from the Tiny YOLOv2 network up till activation4. And after that, it adds all the YOLOv2 detection layers.

    OK, so what that yolov2Layers function does is it basically drops all the layers that we had in our original network after activation4?

    Yeah, yeah.

    OK, cool. So we've got a network here. Hopefully we should be able to train this to detect our classes, right?

    Yes.

    So let's hop back into MATLAB real quick. The next thing we're going to talk about is transfer learning. So Neha, do you want to give us a little bit of an overview of what transfer learning is before we go any further?

    Yes, so the basic idea of transfer learning is that you import a pretrained network, and then you retrain that network according to your training options, your classes, and your data set. So that is transfer learning.

    OK, so my understanding is that we're taking a network's ability to identify features, and then with transfer learning, we're sort of attaching our new layers at the end and telling it, OK, hey, these features don't correspond to the original classes that you were trained to identify-- they now correspond to a new set of classes that we are providing.

    Yes.

    OK, gotcha. That makes total sense. So let's actually continue through the transfer learning section of our code. Again, we're not going to have you sit through the training process, because it takes a significant amount of time. But what Neha can do is talk a little bit about our training options, because I know the training options here are a little different from the ones that we used earlier.

    Yeah, so we can set a lot of options in the training options. And in here, I played with some of the learning rate options only. So what you will see here is we are using sgdm as the solver. That is similar to the one that I used before. But the one difference is the learning rate.

    So initially, I had set the learning rate at 0.001, as a constant. But here, I'm having it change while it trains, with every epoch. So if you see here, the learn rate schedule, the drop factor, and the drop period-- these three work as a combination.

    So it says that after every five epochs, it should drop the learning rate by a factor of 0.5. So this is how you can actually drop the learning rate after every five epochs. You could set it to 10 epochs or a different factor, but this combination works well for this data set and this network.
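
    A sketch of the training options being described; the solver, initial learning rate, and drop schedule follow the discussion above, while values such as MaxEpochs and MiniBatchSize are illustrative assumptions.

        options = trainingOptions('sgdm', ...
            'InitialLearnRate', 0.001, ...
            'LearnRateSchedule', 'piecewise', ...   % let the rate change as training progresses
            'LearnRateDropPeriod', 5, ...           % after every five epochs...
            'LearnRateDropFactor', 0.5, ...         % ...multiply the learning rate by 0.5
            'MaxEpochs', 20, ...                    % illustrative value
            'MiniBatchSize', 16);                   % illustrative value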

    Basically, figure out what works best for you at this point.

    Yes.

    OK, so these training options worked best for us. As you see down here, what we're doing is calling the training function. So I'm going to go ahead and run this section of code again. What we're going to do is just load a pretrained detector.
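
    The train-or-load pattern being described looks roughly like this; the doTraining flag, the detector variable name, and the MAT-file name are illustrative assumptions, and trainingData, lgraph, and options come from the earlier steps.

        doTraining = false;
        if doTraining
            detectorTinyYOLOv2 = trainYOLOv2ObjectDetector(trainingData, lgraph, options);
        else
            load('detectorTinyYOLOv2.mat', 'detectorTinyYOLOv2');   % pretrained detector saved earlier
        end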

    But you can very well run the training in your own time. So we should see a Tiny YOLOv2 detector loaded into our workspace. All right, so let's actually go ahead and see how it performs on our test samples.

    So I'm going to run our evaluation section. Again, we're just going to step through our test data set and visualize the outputs. So, as we can see, it's able to identify some of the smaller elements-- again, probably not as well as it did earlier. But I guess that really comes down to the amount of data that we had, right?

    Yes, so it depends upon the type of data you have. A few classes, like the buoys, have less data in comparison to the others. So if we had given a lot more buoy data for training, the performance would have increased a lot more. But for the classes that have a lot of data, you can see the accuracy here is very good.
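
    For reference, stepping through the test set and visualizing detections can look roughly like this; testData and its imageFilename column are assumptions about how the test table is laid out, and the detector name follows the earlier sketch.

        I = imread(testData.imageFilename{1});                    % first test image (assumed table layout)
        [bboxes, scores, labels] = detect(detectorTinyYOLOv2, I); % run the trained detector
        if ~isempty(bboxes)
            I = insertObjectAnnotation(I, 'rectangle', bboxes, cellstr(labels));
        end
        imshow(I)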

    OK, so to sort of round out this video, let's actually go ahead and numerically evaluate how the detector performs. I'm going to run this section. It's going to calculate detection precision and miss rate and give us a plot at the end.

    And we talked in detail about these metrics in our last video. So you can check that out if you want to see which functions we are using and how.
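
    As a hedged sketch of that kind of numerical evaluation, average precision can be computed with evaluateDetectionPrecision; the table layout and variable names here are assumptions, not the exact code from the example.

        numImages = height(testData);
        allBoxes = cell(numImages, 1); allScores = cell(numImages, 1); allLabels = cell(numImages, 1);
        for k = 1:numImages
            I = imread(testData.imageFilename{k});
            [allBoxes{k}, allScores{k}, allLabels{k}] = detect(detectorTinyYOLOv2, I);
        end
        results = table(allBoxes, allScores, allLabels, 'VariableNames', {'Boxes', 'Scores', 'Labels'});
        [ap, recall, precision] = evaluateDetectionPrecision(results, testData(:, 2:end));
        % ap holds the average precision per class; evaluateDetectionMissRate works the same way for miss rate.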

    This is the numerical check for how good or how bad our detector is. So in this way, you see how you can actually go in and import an already trained network from an outside environment into MATLAB. All right, so thank you so much for joining us for this video. I highly recommend that you stick around for the next video, where we're going to talk about how to take networks that have been trained or imported in MATLAB and deploy them onto an NVIDIA Jetson, by using the GPU Coder product to convert them into CUDA code before deploying.

    So stick around for that video. In the meantime, get in touch with us through our email address on the screen and our Facebook group. And we'd love to hear from you. So see you again on the Robotics Arena.