Working on large .db file (database file) using MapReduce?

7 views (last 30 days)
Jørgen Fone Pedersen on 3 Mar 2021
Answered: Samay Sagar on 28 Feb 2025
I have a very large signal stored in a SQLite database file; it is over 100 GB. I have code for extracting the data into a matrix in MATLAB and storing it in a .mat file, which has worked fine for small signals. I would like to do the same with the large signal, but it is too large to fit in memory.
I read that you can work on large files using MapReduce, but I only see examples that use .csv files or other kinds of tabular data.
Is it possible to use MapReduce with this type of file? Or is there another method for working on this signal without loading it into memory in its entirety?

Answers (1)

Samay Sagar on 28 Feb 2025
You can use "mapreduce" with large SQLite databases. Use the "databaseDatastore" function to create a datastore over your database; that datastore can then be passed directly to the "mapreduce" function, which processes the data in chunks instead of loading it all at once.
Refer to the following documentation for more information:
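A minimal sketch of that workflow, assuming Database Toolbox with a JDBC driver for SQLite is available; the file path, table name `signal`, and column name `amplitude` are placeholders, not taken from the question:

```matlab
% Sketch only: requires Database Toolbox and a SQLite JDBC driver on the
% Java class path. Table/column names below are hypothetical.

% Connect to the SQLite file through JDBC so databaseDatastore can use it.
conn = database("", "", "", "org.sqlite.JDBC", ...
                "jdbc:sqlite:/path/to/signal.db");

% Create a datastore over a query; rows are streamed in chunks, so the
% full 100+ GB result never has to fit in memory at once.
dbds = databaseDatastore(conn, "SELECT amplitude FROM signal");

% Run mapreduce; as an example, find the maximum absolute amplitude.
result = mapreduce(dbds, @maxMapper, @maxReducer);
readall(result)

function maxMapper(data, ~, intermKVStore)
    % Each call receives one chunk of rows (a table) from the datastore.
    add(intermKVStore, "maxAbs", max(abs(data.amplitude)));
end

function maxReducer(key, intermValIter, outKVStore)
    % Combine the per-chunk maxima into a single overall maximum.
    m = -inf;
    while hasnext(intermValIter)
        m = max(m, getnext(intermValIter));
    end
    add(outKVStore, key, m);
end
```

The same pattern extends to any per-chunk computation (filtering, resampling, statistics); the mapper only ever sees one chunk, and the reducer combines the intermediate results.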

