Date: Apr 10, 2013 2:13 PM
Author: bram
Subject: Downsampling very large text file

Hi all,

For an experiment I measured a vast amount of data because of the high sample rate. I've already extracted the important parts; now I need to downsample the data to get a general trend.

The data is in a text file with 8 tab-separated columns and so many rows that the file is around 50 GB. The first 84 rows contain sensor properties.

I'm trying to read 200,000 rows, average each column over that block into a single value, write that row of averages to a new array, then take the next 200,000 rows, and so on.

Can you guys help me along? I've used MATLAB quite a bit, but I've never analysed data files this large.

I've made a start with textscan, which seems to be working nicely, but I'm having trouble combining the loop with the ~feof(fid) test.
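Here is a rough sketch of the structure I think I need (the file name 'data.txt' is just a placeholder for my actual file):

    fid = fopen('data.txt');                      % placeholder file name
    textscan(fid, '%s', 84, 'Delimiter', '\n');   % skip the 84 sensor-property rows

    blockSize = 200000;                 % rows to average per output point
    fmt = repmat('%f', 1, 8);           % 8 tab-separated numeric columns
    avgs = [];                          % one row of column means per block

    while ~feof(fid)
        C = textscan(fid, fmt, blockSize, 'Delimiter', '\t', 'CollectOutput', true);
        block = C{1};                   % numeric matrix, up to blockSize-by-8
        if isempty(block), break; end
        avgs(end+1, :) = mean(block, 1);   % column-wise mean of this block
    end
    fclose(fid);

The idea is that textscan picks up where it left off on the open file ID, so each pass through the loop grabs the next 200,000 rows. Is this roughly the right way to set up the loop?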

Hope you can help,

Bram