Not using GPU for trainNetwork?
Hello,
I want to train a network using the trainNetwork command, and I have set up the network, options, and data. I have installed the Parallel Computing Toolbox, and my GPU is an NVIDIA Quadro M1000M with compute capability 5.0, which should be sufficient according to https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html. The MATLAB Deep Learning Onramp suggested that the GPU would be used automatically if the toolbox was installed and the GPU was compatible. However, when I run trainNetwork() it does not use the GPU, and calling gpuDeviceTable returns nothing. Does this mean my GPU is actually incompatible, or is there some other way to access it?
Thank you
Accepted Answer
More Answers (1)
yanqi liu
on 23 Mar 2022
Yes — you can run
>> gpuDevice
to check whether MATLAB has selected a usable GPU. Alternatively, in the training options set
'ExecutionEnvironment','gpu'
or
'ExecutionEnvironment','cpu'
to explicitly choose the device type used during training.
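Putting the two suggestions together, a minimal sketch might look like the following (the layer array and training data names `layers`, `XTrain`, and `YTrain` are placeholders for your own setup):

```matlab
% Diagnose GPU visibility first: a count of 0 usually points to a
% driver or CUDA compatibility problem rather than a trainNetwork issue.
gpuDeviceCount
gpuDevice   % errors if no supported GPU can be selected

% Request the GPU explicitly; with 'ExecutionEnvironment','gpu',
% training errors out instead of silently falling back to the CPU.
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'Plots','training-progress');

% net = trainNetwork(XTrain, YTrain, layers, options);
```

If gpuDeviceCount returns 0 here, trainNetwork cannot use the GPU regardless of the options, and the driver/toolbox installation is the place to look.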