
GPU Out of memory on device

7 views (last 30 days)
Nikhat Ali on 9 May 2023
Answered: Himanshu on 25 May 2023
I am using the trainAutoencoder function to encode my images; I have 4099 images in total, each of size 128x128. I am running the code on my GPU machine with the 'UseGPU',true option, because when I tried running the code without it, my CPU was heavily loaded and the system restarted in the middle of the run. But when I use the 'UseGPU',true option in the function call, I get the following error. Please help.
The GPU description:
Code where the error occurred:

Accepted Answer

Himanshu on 25 May 2023
Hello Nikhat,
I understand that you are facing an "Out of memory on device" error while using the "trainAutoencoder" function with GPU acceleration in MATLAB. You are trying to train an autoencoder on a large dataset of 4099 images of size 128x128.
The error message you encountered suggests that your GPU is running out of memory while training the autoencoder. This can happen if the memory required for the computations exceeds the available memory on your GPU.
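Before changing anything, it can help to check how much memory is actually free on the device. The following is a small sketch using the Parallel Computing Toolbox "gpuDevice" function; the printed fields are properties of the returned device object.

% Query the currently selected GPU and report its memory.
g = gpuDevice;
fprintf('GPU: %s\n', g.Name);
fprintf('Total memory:     %.2f GB\n', g.TotalMemory/1e9);
fprintf('Available memory: %.2f GB\n', g.AvailableMemory/1e9);

% If earlier runs left data on the device, resetting it frees that memory.
reset(g);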
To resolve this issue, you can reduce the amount of data processed per training pass. One way to do this with the "trainAutoencoder" function is to implement a mini-batch mechanism yourself: split the training dataset into smaller subsets and train on each subset iteratively, which lowers the memory required at any one time. A sketch of this idea is shown below.
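Here is a minimal sketch of that splitting idea. The variable names ("imgs", "hiddenSize", "chunkSize") are assumptions about your code, and note that "trainAutoencoder" does not accept an existing autoencoder to continue training, so each chunk below produces its own model; in practice you may simply pick a subset size that fits in GPU memory and train on that.

% Sketch only: imgs is assumed to be a 1x4099 cell array of 128x128 images.
hiddenSize = 100;        % assumed hidden layer size; use your own value
chunkSize  = 500;        % number of images per chunk; tune to fit GPU memory
numImages  = numel(imgs);

for startIdx = 1:chunkSize:numImages
    stopIdx = min(startIdx + chunkSize - 1, numImages);
    chunk   = imgs(startIdx:stopIdx);     % smaller subset of the training images

    % Training on the smaller chunk needs far less GPU memory than all 4099 images.
    autoenc = trainAutoencoder(chunk, hiddenSize, ...
        'MaxEpochs', 50, ...
        'UseGPU', true);
end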
If splitting the data doesn't solve the memory issue, you can try reducing the size of the hidden layer in your autoencoder. A smaller hidden layer requires less memory for its weights and computations.
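For example (again a sketch, with "imgs" and the hidden size as assumed placeholders): each 128x128 image has 16384 elements, so the encoder weight matrix alone is hiddenSize x 16384, and halving the hidden size roughly halves that memory.

% Sketch: train with a smaller hidden layer to cut the weight-matrix memory.
smallHiddenSize = 64;    % assumed value; pick the smallest size that still reconstructs well

autoenc = trainAutoencoder(imgs, smallHiddenSize, ...
    'MaxEpochs', 100, ...
    'UseGPU', true);

features = encode(autoenc, imgs);   % encode the images with the trained model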
You can refer to the documentation below to learn more about the "trainAutoencoder" function in MATLAB.

More Answers (0)
