


Re: very large array
Posted:
Jun 25, 2013 1:26 PM


"James Tursa" wrote in message <kqbdh7$amk$1@newscl01ah.mathworks.com>...
> "Lorenzo Quadri" <quadrilo_sub_r@gmail.com> wrote in message <kqbcna$93q$1@newscl01ah.mathworks.com>...
> > dpb <none@non.net> wrote in message <kqaumh$4ko$1@speranza.aioe.org>...
> > > On 6/24/2013 6:26 PM, James Tursa wrote:
> > > > dpb <none@non.net> wrote in message <kq9r5n$21t$1@speranza.aioe.org>...
> > > >>
> > > >> While doing it with a loop is _NOT_ the way for large cases, sometimes
> > > >> it is handy, and time isn't an issue for small array sizes. The way in
> > > >> general to do such things is to start at the end and progress
> > > >> backwards--that way the lower indices aren't affected by the deleted
> > > >> rows...
> > > >>
> > > >> for i=size(dati,1):-1:1
> > > >>   if sum(dati(i,:))<355 && range(dati(i,:))>20
> > > >>     dati(i,:) = [];
> > > >>   end
> > > >> end
> > > >
> > > > I understand what you are saying about using a loop on small datasets,
> > > > but IMO this is never a good way to program in MATLAB, even on small
> > > > datasets....
> > >
> > > I don't recall saying it was a "good" way...just that it won't be
> > > terribly noticeable on small datasets.
> > >
> > > And, yes, no, I'd never program that way in Matlab unless I had some
> > > ulterior motive, as in this posting as a pedagogical tool...
> >
> > I can't do the logical filtering on the dataset while loading it, because it is created by another program as a complete array (MATLAB workspace format).
> > I can iterate over the whole dataset and mark with zeros the rows I want to delete, but then I have to delete those rows as quickly as possible.
>
> The fastest way is likely what dpb already posted. The vectorized logical index method:
>
> ix = sum(dati,2)<355 & range(dati,2)>20;
> dati(ix,:) = [];
>
> James Tursa
Thanks a lot, works great: about 20 seconds to delete 300,000,000 rows. Thanks dpb and Tursa.
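For reference, here is a minimal, self-contained sketch contrasting the two approaches from the thread. The variable name `dati` and the 355/20 thresholds come from the posts above; the small test matrix is invented purely for illustration, and `range` is assumed to be available (it ships with the Statistics Toolbox; `max(X,[],2)-min(X,[],2)` is a toolbox-free equivalent).

```matlab
% Hypothetical 3-row dataset for illustration (not from the thread).
dati = [100 100 100;    % sum=300 (<355), range=0   -> kept (range not >20)
        100 200 100;    % sum=400,        range=100 -> kept (sum not <355)
         10 200  30];   % sum=240 (<355), range=190 -> deleted

% Vectorized deletion (Tursa's suggestion): build one logical row mask,
% then remove every matching row in a single assignment. MATLAB does the
% row compaction once, which is why this scales to very large arrays.
ix = sum(dati,2) < 355 & range(dati,2) > 20;
dati(ix,:) = [];

% Loop version (dpb's pedagogical example): iterate from the last row
% backwards so each deletion cannot shift the indices still to be visited.
% Each `dati(i,:) = []` reallocates the array, so this is only for small data.
%
% for i = size(dati,1):-1:1
%     if sum(dati(i,:)) < 355 && range(dati(i,:)) > 20
%         dati(i,:) = [];
%     end
% end
```

Both versions leave `dati` with the first two rows; only the vectorized form is practical at the 300-million-row scale mentioned above.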



