Date: Jan 24, 2013 2:00 PM
Author: Derek Goring
Subject: Re: NaN filtering question
On Friday, January 25, 2013 7:19:08 AM UTC+13, Andy wrote:

> Hi All,
>
> I have some data from an instrument in a series of 1800 by 600 matrices, with NaNs where the instrument could not collect data. In some spots there is a little bit of data surrounded by NaNs, which looks quite messy when plotted and may not be accurate. I have some code that runs over the matrix, counts the number of NaNs around each point, and then deletes points surrounded by too many NaNs. It works, but I have committed the MATLAB cardinal sin of nesting for loops (gasp!):
>
> for i=3:dimmX-2
>     for j=3:dimmY-2
>         sur=aod(i-2:i+2,j-2:j+2);
>         surNan(i,j)=length(find(isnan(sur)==1));
>     end
> end
> index=find(surNan>surMin);
> aod(index)=NaN;
>
> This of course is very slow, and I need to create averages from many of these matrices. Does anyone know of a better way to do this, or can anyone point me in a faster direction?
>
> Thanks, Andy.

Instead of length(find(isnan(sur)==1)), simply use sum(isnan(sur(:))) or nnz(isnan(sur)). Note the (:) -- since sur is a matrix, sum(isnan(sur)) by itself would return a row vector of per-column counts, not a scalar.

Logical indexing is faster than find. Only use find when you actually need the numeric positions, so:

index=find(surNan>surMin);

should be

index=surNan>surMin;

which gives you a logical mask you can use directly as aod(index)=NaN.

Other than that, your method looks good to me.
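Putting both suggestions together, the loop might look like this (just a sketch; aod, dimmX, dimmY, and surMin are assumed to be defined as in Andy's original code):

```matlab
% Preallocate the NaN-count matrix so it isn't grown inside the loop.
surNan = zeros(size(aod));
for i = 3:dimmX-2
    for j = 3:dimmY-2
        sur = aod(i-2:i+2, j-2:j+2);
        % sur(:) flattens the 5x5 neighbourhood so sum returns a scalar
        % count of NaNs rather than a row vector of column counts.
        surNan(i,j) = sum(isnan(sur(:)));
    end
end
% Logical mask instead of find -- no intermediate index vector needed.
aod(surNan > surMin) = NaN;
```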

BTW, loops are not evil; only eval() is evil.
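That said, if the loop is still too slow over many matrices, the whole NaN count can be vectorized by convolving the isnan mask with a 5x5 box of ones via conv2. A sketch (note one behavioural difference: with 'same' padding, border cells are counted over their truncated neighbourhoods, whereas the loop version skips the two-cell border entirely):

```matlab
nanMask = isnan(aod);
% Each output element is the number of NaNs in the 5x5 window
% centred on that point -- all windows computed in one call.
surNan = conv2(double(nanMask), ones(5), 'same');
aod(surNan > surMin) = NaN;
```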