"Sean de " <firstname.lastname@example.org> wrote in message <email@example.com>... > "Alexey" wrote in message <firstname.lastname@example.org>... > > Hi, > > > > I have a 65k-element array of objects of a particular class. > > > > >> whos results > > Name Size Bytes Class Attributes > > > > results 1x65000 3025921848 StrategyResultDef > > > > When I try to save it to a MAT file I get the following error: > > > > >save backtesting/files/technical-obj > > Warning: Variable 'results' cannot be saved to a MAT-file whose version is older > > than 7.3. > > To save this variable, use the -v7.3 switch. > > Skipping... > > > > And the variable 'results' is not saved. > > > > If I do: > > > > >save -v7.3 backtesting/files/technical-obj > > > > I do not get an error but it takes about 4 hours to save the file and its size is 1.5gb. > > > > I want to save the results for later processing but at this rate it's easier for me to re-generate the results every time. > > > > I will have a look at reducing the size of the structure but does anyone have any suggestions as to what is the best way to save the results like this? > > > > Kind Regards, > > Alexey > > Hmm that does seem like a really long time. I frequently save/load 9GB files (structs with 3-3GB fields) and it usually takes a few minutes (MAC r2009b, 64bit). I have found for simple computations it's quicker to recalculate than to save. I.e: write a script that loads/generates what you need, run it, go get a coffee. > > You could also try writing it to a binary file using fwrite and reading it in with fread. This is a pain though since you need to remember: dimensions, classes, bytes to skip if necessary etc.
Thanks for the suggestions! I have tried R2010b and it has the same problem. Once the vector of objects reaches 64k elements it becomes impossible to save: it just takes forever and then fails with "cannot close file" or something like that.
I'll probably consider saving to a database instead, since I want to run searches through the results anyway.