
Automatically detect last row of CSV file with dlmread

lexi11 on 27 Nov 2016
Commented: lexi11 on 28 Nov 2016
Hi,
In my code, I load a CSV file every time I get data from a sensor and read individual columns to process in MATLAB.
I have tried:
my_column = dlmread('file1.csv', ',', [1 3 500 3]);
This works fine and I can choose the required columns and rows.
But every time I get data from the sensor, I have to manually insert the last row number (e.g. 500 in the code above). If I need to process many columns, I have to change each dlmread line.
Is there a way to tell MATLAB to use the bottom-most row with data as the last row? (I do not need this for columns, because the number of columns is constant.)
Thanks in advance

Accepted Answer

Walter Roberson on 27 Nov 2016
If I understand correctly, what you want to do is read all of the rows in the file but only a subset of the columns. Unfortunately, dlmread() does not have a syntax for that.
The easiest approach is to dlmread() the entire file and then throw out the extra columns.
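A minimal sketch of that approach (assuming file1.csv has a one-line header, as the row offset of 1 in your original call suggests, and that the column you want is the fourth one, matching the column index 3 in your dlmread range):

```matlab
% Read the entire file; dlmread with row/column offsets reads to end of file,
% so the last row is picked up automatically. Offsets are zero-based:
% row offset 1 skips the header, column offset 0 keeps all columns.
data = dlmread('file1.csv', ',', 1, 0);

% Keep only the fourth column (MATLAB indexing is one-based).
my_column = data(:, 4);
```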
Second easiest would probably be to use fopen() / textscan() / fclose(), having created a format that ignores everything after the third column. For example,
fmt = '%f%f%f%*[^\n]';   % three numeric fields, then skip the rest of the line
fid = fopen('file1.csv', 'rt');
result_cell = textscan(fid, fmt, 'Delimiter', ',', 'CollectOutput', 1);
fclose(fid);
my_column = result_cell{1};   % numeric matrix holding the three columns
4 comments
Walter Roberson on 28 Nov 2016
Sequential files do not store any information about how many lines are in them. The operating system tracks the number of bytes in the file but not the number of lines.
lexi11 on 28 Nov 2016
OK, thank you.


More Answers (1)

dpb on 27 Nov 2016
Edited: dpb on 28 Nov 2016
With a sequential file, no, not really. It will likely be just as fast to read the whole file every time and discard everything except the last record in memory. Only if you let the file grow unbounded, so that memory could eventually become an issue, would I consider anything else; and even then, only once it has proven to be a bottleneck in practice.
Alternatively, you could read the file with fgetl in a loop until feof, keep the last record, and then parse it.
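A sketch of that fgetl loop, assuming the records are comma-separated numeric fields:

```matlab
fid = fopen('file1.csv', 'rt');
last_line = '';
while ~feof(fid)
    line = fgetl(fid);
    if ischar(line) && ~isempty(line)
        last_line = line;   % keep only the most recent record seen so far
    end
end
fclose(fid);

% Parse the comma-separated numeric fields of the final record
% into a column vector.
last_record = sscanf(last_line, '%f,');
```

This avoids holding the whole file in memory, at the cost of still scanning every line on each call.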
In any case, there is no way to read a sequential file by direct-access record except by simulating it. It might be possible to have the instrument write only a single record to the output file rather than appending, and to make the processing function responsible for archiving the data instead. Then the file holds either one record or, with some additional logic, any unprocessed record(s) if one happened to be missed.
Or perhaps you could rearrange the function to retrieve the data directly, skipping the interim file, and again make it responsible for archiving.
But the simplest solution, and the one to try first, is to just read the whole file and keep the record(s) of interest; it is likely both easiest and best (or at least good enough).
