GPU out of memory issue appears with trainNetwork.
I have a Tesla P100 with 16 GB of RAM. Yesterday, I ran trainNetwork() with different layer architectures and a few different input data sets, and it worked. Then I tried a larger input data set, but got an out-of-memory error:
Error using trainNetwork
GPU out of memory. Try reducing 'MiniBatchSize' using the trainingOptions function.
Error in A1_B1_C1a_D2 (line 152)
[net,netinfo] = trainNetwork(trainInput,trainTarget,Layers,options);
Caused by:
Error using gpuArray/hTimesTranspose
Out of memory on device. To view more detail about available memory on the GPU, use 'gpuDevice()'. If the problem persists, reset the GPU by calling 'gpuDevice(1)'.
I tried what the message suggests, but it doesn't help. I have tried many less intensive approaches, rebooted, and even returned to the scripts that used to work fine. Now nothing works.
Any suggestions for troubleshooting hardware faults, or some protective status somewhere?
Accepted Answer
Matt J on 3 May 2023
Edited: Matt J on 3 May 2023
Then I tried a larger input data set, but got an out-of-memory error:
If you make your data larger and larger, you will eventually run out of memory. Maybe reduce the MiniBatchSize setting.
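A minimal sketch of what reducing MiniBatchSize looks like (the solver, batch size, and other settings here are placeholder values, not recommendations; trainInput, trainTarget, and Layers are the variables from your script):

```matlab
% Smaller mini-batches reduce the peak GPU memory needed per iteration,
% at the cost of more iterations per epoch.
options = trainingOptions('adam', ...
    'MiniBatchSize', 16, ...   % try halving this until training fits
    'MaxEpochs', 30, ...
    'Verbose', true);
[net,netinfo] = trainNetwork(trainInput,trainTarget,Layers,options);
```

If it still fails, keep halving MiniBatchSize; if even a batch size of 1 runs out of memory, the network itself is too large for the GPU.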
13 comments
Joss Knight on 13 May 2023
Edited: Joss Knight on 14 May 2023
Seems fairly clear-cut to me. In your first image, fc2 alone takes up 7.4 GB, so you're definitely going to struggle, especially during training, because you need roughly 8 GB for the weights, 8 GB for their gradients, and probably 8 GB more for temporaries while you're updating the weights. You need a smaller network. Try adding more convolution layers rather than relying on a massive fully connected layer to do most of the work. Look at the Total Number of Learnables at the top of the Network Analyzer window and multiply it by 4 to get the number of bytes your network will need.
Your other network is much smaller, a 'mere' 1.4GB for the fully connected layers.
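The arithmetic above can be sketched as follows (numLearnables is a placeholder; read the real value from the Network Analyzer window, and note the 3x factor is only a rough rule of thumb for weights + gradients + temporaries):

```matlab
% Rough estimate of GPU memory needed for a network's learnables.
numLearnables  = 2e9;   % placeholder: Total Number of Learnables from Network Analyzer
bytesPerSingle = 4;     % learnables are stored in single precision

weightsGB  = numLearnables * bytesPerSingle / 1e9;
trainingGB = 3 * weightsGB;   % weights + gradients + update temporaries

fprintf('Weights alone: ~%.1f GB; during training: ~%.1f GB\n', ...
    weightsGB, trainingGB);
```

Compare the training estimate against the GPU's total memory reported by gpuDevice() to see whether a network can plausibly fit.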