
Topic: Downsampling very large text file
Replies: 7   Last Post: Apr 18, 2013 9:02 AM

bram

Posts: 4
Registered: 4/10/13
Downsampling very large text file
Posted: Apr 10, 2013 2:13 PM

Hi all,

For an experiment I measured vast amounts of data because of the high sample rate. I've already taken out the important parts; now I need to downsample the data to get a general trend.

The data is in text files with 8 columns and so many rows that each file is around 50 GB. Columns are separated by tabs, and the first 84 rows are sensor properties.

I'm trying to read 200,000 rows, average them into a single value per column, write that to a new array, then take the next 200,000 rows, and so on.

Can you help me along? I've used MATLAB quite a bit, but I've never analysed data files this large.

I've made a start with textscan, which seems to be working nicely, but I'm having trouble writing a proper loop in combination with the ~feof test (roughly along the lines of the sketch below).
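This is only a minimal sketch of what I have in mind, not working code: the file name is a placeholder, and the column count, header rows, and block size are the numbers mentioned above.

% Chunked read of a large tab-separated file, averaging each block of rows.
fname     = 'measurement.txt';   % placeholder file name
nCols     = 8;                   % tab-separated columns per row
nHeader   = 84;                  % sensor-property rows to skip
blockSize = 200000;              % rows averaged into one output row

fid = fopen(fname, 'r');
fmt = repmat('%f', 1, nCols);

% Skip the sensor-property header once.
textscan(fid, '%s', nHeader, 'Delimiter', '\n');

avgData = [];                    % one averaged row per block
while ~feof(fid)
    % Read up to blockSize rows; CollectOutput gives one numeric matrix.
    block = textscan(fid, fmt, blockSize, 'Delimiter', '\t', ...
                     'CollectOutput', true);
    block = block{1};
    if isempty(block)
        break
    end
    avgData(end+1, :) = mean(block, 1);  %#ok<AGROW>
end
fclose(fid);

The idea is that textscan picks up where it left off on each pass, so the while ~feof(fid) loop keeps reading blocks until the file is exhausted, and mean(block, 1) collapses each block to a single row.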

Hope you can help,

Bram


