Does MATLAB/Simulink support NVIDIA external GPU systems (Thunderbolt 3) for machine learning and deep learning projects, including embedded systems like the Jetson TX2?

Hi there,
We are planning some research projects on machine learning (big-data approaches, predictive modelling, deep learning for object recognition) and plan to use GPU computing for them.
To make the GPU easily available to different users (with their laptops), we are considering an external GPU enclosure (equipped with, e.g., an NVIDIA GeForce 1080 Ti) connected via Thunderbolt 3. Different users could prepare their tasks and then connect the eGPU in turn, only for training their networks.
Does MATLAB/Simulink support this setup?
Does anybody use this hardware setup, perhaps in combination with the GPU Coder add-on and NVIDIA Jetson TX2 embedded hardware?
What are your experiences?
Thank you!
Marcus

Accepted Answer

Joss Knight
Joss Knight on 19 May 2018
Yes, it's supported. This isn't really a question of MATLAB's system requirements; if the CUDA driver can interact with your card then so can MATLAB. You will notice a considerable increase in latency due to the dramatically reduced bandwidth to get data on and off the card, so you need to minimize the amount of CPU/GPU communication.
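For illustration only, a minimal sketch of what "minimizing CPU/GPU communication" can look like in MATLAB; the device index and matrix size below are arbitrary placeholders, not a benchmark:

gpuDeviceCount                        % number of CUDA devices MATLAB can see
d = gpuDevice(1)                      % select device 1 and show its properties

% Keep data on the GPU between operations so the (slower) Thunderbolt
% link is only crossed twice: once in, once out.
X = gpuArray(rand(8000, 'single'));   % one host -> GPU transfer
Y = X * X.';                          % runs entirely on the GPU
Y = Y + 1;                            % still on the GPU, no extra transfer
result = gather(Y);                   % one GPU -> host transfer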

More Answers (1)

Marcus Kreuzer
Marcus Kreuzer on 21 May 2018
Thank you for this information!
Since I'm new to this topic: what do you mean by a "considerable" increase in latency? Could you please give a rough estimate? Say, a comparison (same GPU in all cases) of:
(i) a desktop system with an internal GPU (calculation takes 100 min)
(ii) an eGPU system with optimized CPU/GPU communication (estimated calculation takes 100 + x min)
(iii) an eGPU system with non-optimized CPU/GPU communication (estimated calculation takes 100 + considerably more than x min)
For example, an NVIDIA GTX 1080 Ti is equipped with 11 GB of VRAM. So, to reduce CPU/GPU communication, the aim is to fill this VRAM with data, let the GPU do the calculations, and then bring the results back to the CPU/PC, right? Or do the CPU and GPU have to communicate continuously, establishing a kind of data stream?
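For what it's worth, here is a rough sketch of the pattern I have in mind (all variable names and sizes are made up):

d = gpuDevice;                                        % currently selected GPU
fprintf('Available GPU memory: %.1f GB\n', d.AvailableMemory/1e9);

data    = rand(20000, 1000, 'single');                % prepared on the CPU
gdata   = gpuArray(data);                             % single host -> GPU transfer
gresult = sum(gdata.^2, 2);                           % all computation on the GPU
result  = gather(gresult);                            % single GPU -> host transfer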
Thank you!
Marcus
6 comments
Marcus Kreuzer
Marcus Kreuzer on 20 Jun 2018
Edited: Walter Roberson on 20 Jun 2018
Hi!
I wanted to give some feedback:
We finally decided to buy a workstation equipped with an NVIDIA GTX 1080 Ti. The reason is simple: an external enclosure is pricey, and for a little more you get a desktop PC as well. Also, laptops with a Thunderbolt 3 (USB-C) interface are still rare. Once we push this card to its limits (no one knows when...), we'll buy whatever the "latest" GPU is at that point. The old card will go into an external enclosure for first experiments and student lab exercises.
Another interesting resource may be: https://egpu.io/build-guides/
That community has done a lot of work on using external GPUs with laptops (tutorials, lists of working builds, etc.).
Thank you all for your help!
Marcus
