This question concerns the learnsomb (batch self-organizing map weight learning) function in the Neural Network Toolbox. The function contains a line of code like a = a .* double(rand(size(a))<0.9);, which randomly sets about 10% of the entries of the matrix a to zero. This is effectively like ignoring roughly 10% of the training data on each iteration.
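To illustrate what that line does, here is a small NumPy sketch (not the toolbox code itself) of the same masking operation; the matrix a and the 0.9 threshold mirror the MATLAB snippet:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the activation matrix `a` from the MATLAB code.
a = rng.random((1000, 1000))

# rand(size(a)) < 0.9 produces a logical mask that is 1 with
# probability ~0.9 and 0 with probability ~0.1.
mask = (rng.random(a.shape) < 0.9).astype(float)

# Elementwise multiply: roughly 10% of the entries become zero.
a_masked = a * mask

print(1.0 - (mask.mean()))  # fraction zeroed, close to 0.1
```

On a large matrix the fraction of zeroed entries concentrates tightly around 0.1, which is why the comment in the question describes it as dropping 10% of the values.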
Does anyone know why this is here? I can't think of a good reason for it. It seems to prevent the algorithm from ever converging to a stable solution, since the effective training data changes on every iteration. The documentation doesn't cite any references for where the algorithm came from.