I've just run the following test:
- create a 15 GB timetable (2 variables) and split it into several files (a sketch of this setup follows the list of tests)
- Test 1: 15 GB in 10 files
- Test 2: 15 GB in 100 files
- Test 3: 15 GB in 1000 files
- Test 4: 15 GB in 10000 files
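For reference, here is a minimal sketch of how such a setup could look. The variable names (Var1, Var2), the rows-per-file count, and the output folder are placeholders chosen for illustration, not the exact values used in my test:

```matlab
% Illustrative sketch of the setup step: build a two-variable timetable
% chunk by chunk and save it as N .mat files. Names, sizes and the output
% folder are placeholders; adjust rowsPerFile/N so the files total ~15 GB.
N = 100;                                  % number of files for this run
rowsPerFile = 1e7;                        % rows per file (placeholder)
outDir = fullfile(tempdir, 'tt_chunks');
if ~exist(outDir, 'dir'), mkdir(outDir); end

t0 = datetime('now');
for k = 1:N
    % Row times for this chunk, continuing where the previous chunk ended
    Time = t0 + seconds((k-1)*rowsPerFile : k*rowsPerFile - 1)';
    TT = timetable(Time, rand(rowsPerFile, 1), rand(rowsPerFile, 1), ...
                   'VariableNames', {'Var1', 'Var2'});
    save(fullfile(outDir, sprintf('chunk_%05d.mat', k)), 'TT', '-v7.3');
end
```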
After that, I use fileDatastore to extract the mean value of the first variable (a sketch of this step follows the results). The results are as follows:
- Test 1: 183 seconds
- Test 2: 95 seconds
- Test 3: 95 seconds
- Test 4: 141 seconds
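And a minimal sketch of the measurement step, assuming each .mat file holds a single timetable named TT with first variable Var1, as in the setup sketch above (the read function and the accumulation loop are illustrative, not the exact code that was timed):

```matlab
% Read one file per read() call and keep a running sum, so that only one
% chunk is in memory at a time. Assumes each file stores one timetable TT
% whose first variable is Var1 (placeholder names from the setup sketch).
fds = fileDatastore(outDir, 'ReadFcn', @(f) getfield(load(f, 'TT'), 'TT'), ...
                    'FileExtensions', '.mat');

total = 0;
count = 0;
while hasdata(fds)
    TT = read(fds);                   % one file (one chunk) per iteration
    total = total + sum(TT.Var1);     % first variable of the timetable
    count = count + height(TT);
end
meanVar1 = total / count;
```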
It is clear that there is an optimal "chunk size", but I still have no clue how to determine it in a non-empirical way.
