What to do when you really ARE out of memory?
This question is closed. Reopen it to answer or edit it.
How can I optimize code when the dataset really is just too large?
Currently I need to run TriScatteredInterp on three vectors, each 100,000,000 x 1.
scatteredInterpolant works no better in this instance.
Answers (3)
the cyclist
on 4 Aug 2015
Edited: the cyclist on 4 Aug 2015
2 votes
For very large datasets, processing a random sample of the data will often give satisfactory results.
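A minimal sketch of this idea, assuming the three 100,000,000 x 1 vectors are named `x`, `y`, and `v`, and that `xq`, `yq` are the query points (all names are illustrative, not from the question):

```matlab
% Build the interpolant from a random subsample instead of all 100M points.
n = numel(x);
k = 1e6;                            % subsample size -- tune to available memory
idx = randperm(n, k);               % k indices drawn without replacement
F = scatteredInterpolant(x(idx), y(idx), v(idx), 'linear', 'none');
vq = F(xq, yq);                     % evaluate at the query points
```

If memory is too tight even to hold the index vector comfortably, `datasample` with `'Replace', false` is an alternative way to draw the subsample.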
Walter Roberson
on 4 Aug 2015
1 vote
Store the data in hierarchies such as octrees that allow you to extract a subset that fits within working memory to do the fine-grained work on.
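As a simplified stand-in for the hierarchical (octree) idea, a flat spatial grid already lets you pull one cell's subset into working memory at a time. A sketch, again assuming full-data vectors `x`, `y`, `v` (illustrative names):

```matlab
% Bin points into a coarse 2-D grid so each cell's subset fits in memory;
% the fine-grained work (e.g. local interpolation) then runs per cell.
nbins = 64;                                  % cells per axis
xe = linspace(min(x), max(x), nbins + 1);    % bin edges
ye = linspace(min(y), max(y), nbins + 1);
ix = discretize(x, xe);                      % cell index of each point
iy = discretize(y, ye);
cellId = sub2ind([nbins nbins], ix, iy);     % linear cell id per point
inCell = (cellId == 1);                      % extract one cell's subset...
xs = x(inCell); ys = y(inCell); vs = v(inCell);
% ...and do the fine-grained work on (xs, ys, vs) alone, looping over cells.
```

A true octree refines only the crowded cells recursively, so dense regions get split further while sparse ones stay whole; the uniform grid above is the flat special case.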
Robert Jenkins
on 7 Aug 2015
1 vote