Running GLMs on GPU

8 views (last 30 days)
Aravind Krishna on 15 Sep 2017
I was wondering if it is possible to fit generalized linear models on a GPU to speed things up, since that is the rate-limiting step in my code: it ends up taking about a week for many models.
The training data is of the order [100x10]. Is this possible with a simple gpuArray? If not, how do I run GLMs on a GPU?
Thanks in advance. (P.S. I have no idea how GPU programming is done.)
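
For illustration, here is a minimal sketch (not from the original post) of one way a GLM fit can run on a GPU. As far as I know, fitglm and glmfit do not accept gpuArray inputs directly, but GLM fitting reduces to iteratively reweighted least squares (IRLS), which is just matrix arithmetic and so can run on gpuArray data. The logistic link, the example data, and the names X, y, and beta are all assumptions made for the sketch:

% Minimal IRLS sketch: a logistic-regression GLM fitted on the GPU.
% All data and variable names below are placeholders, not from the original post.
X = gpuArray(rand(100, 10));               % predictors, the stated [100x10] size
y = gpuArray(double(rand(100, 1) > 0.5));  % example binary response
beta = zeros(size(X, 2), 1, 'gpuArray');   % coefficient vector, kept on the GPU
for iter = 1:25                            % fixed iteration cap for brevity
    eta = X * beta;                        % linear predictor
    mu  = 1 ./ (1 + exp(-eta));            % logistic mean function
    w   = mu .* (1 - mu);                  % IRLS weights
    z   = eta + (y - mu) ./ max(w, eps);   % working response (eps guards w -> 0)
    beta = (X' * (w .* X)) \ (X' * (w .* z));  % weighted least-squares step, all on the GPU
end
beta = gather(beta);                       % bring the coefficients back to the CPU

One caveat: with matrices as small as 100x10, the overhead of transferring data to the GPU can easily outweigh any speedup, so running the many independent model fits in parallel on CPU cores (for example with parfor) may help more than moving a single fit to the GPU.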

Answers (0)
