


Re: Performance function with pattern recognition in neural networks
Posted:
Mar 29, 2013 6:09 PM


"William " <william.henson@amec.com> wrote in message <kivgk2$i7$1@newscl01ah.mathworks.com>... > Hi, > > A question for anyone who might know or have an opinion. > > I have set up a neural network to perform a pattern recognition (or classification) but I have found I am getting way too many false negatives compared to what I might actually get with say a Support Vector Machine set up. One possibility I am thinking is that the SVM set up can have harsh penalties for incorrect classifications. So, with this in mind, is there a "best" performance function for pattern recognition with neural networks?? Or am I best to say use use some function on the distance from the hyperplane (or similar)?? > > Cheers
You have given absolutely no information that would let anyone help you. For example:

- Are you using patternnet with tansig/logsig or tansig/softmax?
- What is the dimension of the inputs? How many classes?
- For c classes, does your target matrix contain columns of the c-dimensional unit matrix eye(c), or of eye(c-1)?
- How unbalanced is the data set, i.e., how large is each class? Are the ratios of the class sizes the same as the a priori probabilities of the general population?
- Are the misclassification costs specified, or left at the usual default values of {0,1}?
My a priori advice is to standardize your inputs and remove or modify outliers. Then balance the classes by duplicating cases, with or without added noise, so that each class contains the same number of examples. If you have c classes, the c-dimensional targets and the class indices can be obtained from each other via ind2vec and vec2ind.
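The preprocessing steps above can be sketched in MATLAB roughly as follows (a minimal sketch only; the variable names, the 0.01 noise level, and the duplicate-to-the-largest-class strategy are illustrative assumptions, and the Neural Network Toolbox is assumed for ind2vec/vec2ind):

```matlab
% x: d-by-N input matrix, trueclass: 1-by-N vector of class indices in 1..c

% Standardize inputs to zero mean and unit variance
% (outliers should be removed or modified beforehand).
[x, xsettings] = mapstd(x);

% Convert class indices to c-dimensional unit-vector targets
% (columns of eye(c)) and back again.
t = full(ind2vec(trueclass));   % c-by-N target matrix (ind2vec returns sparse)
trueclass2 = vec2ind(t);        % recovers the original indices

% Balance the classes by duplicating randomly chosen cases, with a
% small amount of added noise, until every class is as large as the
% largest one.
c    = max(trueclass);
Nmax = max(histc(trueclass, 1:c));
for i = 1:c
    idx  = find(trueclass == i);
    ndup = Nmax - numel(idx);
    if ndup > 0
        pick = idx(randi(numel(idx), 1, ndup));
        xdup = x(:, pick) + 0.01*randn(size(x,1), ndup);  % jittered copies
        x         = [x, xdup];
        trueclass = [trueclass, i*ones(1, ndup)];
    end
end
t = full(ind2vec(trueclass));   % rebuild targets for the balanced set
```

After this, the balanced (x, t) pair can be fed to patternnet in the usual way.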
Once the net is trained to yield approximately equal error rates across classes, you can transform the outputs by multiplicative factors to account for the differences in class priors and misclassification costs.
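That output transformation might look like the sketch below. The idea, under the assumption that the trained net's outputs approximate class posteriors for the balanced training set, is to rescale each class's score by the ratio of its true prior to its training prior, and by a per-class cost weight (a simplification of a full cost matrix; the priors and costs shown are made-up illustrative values, and xtest/net are assumed to exist):

```matlab
y = net(xtest);                 % c-by-N outputs; columns ~ class posteriors
ptrain = ones(c,1)/c;           % priors in the balanced training data
ptrue  = [0.90; 0.07; 0.03];    % assumed true population priors (c = 3 here)
cost   = [1; 5; 5];             % assumed per-class misclassification costs

w    = (ptrue ./ ptrain) .* cost;     % multiplicative correction factors
yadj = bsxfun(@times, w, y);          % rescaled scores
assignedclass = vec2ind(yadj);        % classify by the largest adjusted score
```

Raising the cost weight on the underrepresented classes is what trades false negatives against false positives, which is the effect the original poster was getting from the SVM's penalty terms.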
You might find some old posts of mine in comp.ai.neural-nets and CSSM regarding priors and classification costs that will help.
Hope this helps.
Greg



