NN accuracy on test set low

Anitha on 12 Mar 2014
Commented: Greg Heath on 20 Mar 2014
I have implemented a neural network in MATLAB R2013a for character recognition. I used the trainbr function for training. 80% of the samples were used for training and the rest for testing. When I plot the confusion matrix, I get 100% accuracy on the training set, but on the test set the accuracy is very low (around 60%). What could possibly be wrong?

Accepted Answer

Greg Heath on 18 Mar 2014

More Answers (3)

Greg Heath on 13 Mar 2014
Insufficient info:
How many characters?
How many examples for each character?
What are the dimensions of the input and target matrices?
Are the summary statistics of the training and test subsets sufficiently similar?
How many input, hidden and output nodes?
What values of hidden nodes did you try?
How many random weight initializations for each value?
Although trainbr should mitigate the effect of using more hidden nodes than are needed, you still need many trials to establish sufficient confidence intervals.
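One of the checklist items above asks whether the training and test subsets have similar summary statistics. A minimal sketch (not from the thread; it assumes `x` is the input matrix and `tr` is the training record returned by `train`) of how one might check that:

```matlab
% Sketch: compare per-feature summary statistics of the training and
% test subsets produced by train()'s data division.
xtrn = x(:, tr.trainInd);
xtst = x(:, tr.testInd);
% Largest per-feature gaps in mean and standard deviation
meanGap = max(abs(mean(xtrn, 2) - mean(xtst, 2)));
stdGap  = max(abs(std(xtrn, 0, 2) - std(xtst, 0, 2)));
fprintf('max mean gap = %g, max std gap = %g\n', meanGap, stdGap)
```

Large gaps relative to the feature scales would suggest the test subset is not representative of the training data.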
Hope this helps.
Thank you for formally accepting my answer
Greg
  6 comments
Anitha on 16 Mar 2014
Edited: Anitha on 16 Mar 2014
I have run the code as you suggested, with multiple designs, and I am now getting good results. After running 10 trials, I get the best test set performance in trial 8. Now I want to train my final neural network with the parameters of trial 8. The problem is that I have saved the rng state, but when I use the same state to train my net again, I get different results. This is my code:
clear all, close all, clc, plt=0;
tic
load inputdata
load targetdata
x=input;
t=target;
hiddenLayerSize = 30;
net = patternnet(hiddenLayerSize);
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
net.trainFcn='trainbr';
net.trainParam.epochs=100;
rng(4151941)
Ntrials=10
for i = 1:Ntrials
    s{i} = rng;
    [ net tr y e] = train(net,x,t);
    mseval(i) = tr.best_tperf; % Best mseval over all epochs of ith run
    tstind = tr.testInd;
    ytst = y(:,tstind);
    ttst = t(:,tstind);
    plt=plt+1, figure(plt)
    plotconfusion(ttst,ytst)
    title([ ' TEST SET CONFUSION MATRIX. TRIAL = ', num2str(i)] )
    hold off
end
[ minmseval ibest ] = min(mseval);
rng = s{ibest}; % For repeating the best design
bestnet = configure(net,x,t);
bestIW0 = bestnet.IW
bestb0 = bestnet.b
bestLW0 = bestnet.LW
[ bestnet tr y e] = train(net,x,t)
tstind = tr.testInd;
ytst = y(:,tstind);
ttst = t(:,tstind);
plt=plt+1,figure(plt)
plotconfusion(ttst,ytst)
title([ ' OVERALL TEST SET CONFUSION MATRIX= '] )
hold off
save net
view(bestnet)
Greg Heath on 19 Mar 2014
Two mistakes:
1. No configure statement in the loop.
2. Used net instead of bestnet in the last train statement.
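A hedged sketch of the loop with both points applied: `configure` is called inside the loop so every trial starts from fresh random weights, and the final `train` call operates on `bestnet` rather than the stale `net`. (Note that the posted `rng = s{ibest}` assigns to a variable named `rng`; restoring a saved state is done by calling `rng(s{ibest})`.)

```matlab
% Sketch of the corrected design loop (variable names as in the thread).
rng(4151941)
for i = 1:Ntrials
    s{i} = rng;                    % save RNG state BEFORE initialization
    net = configure(net, x, t);    % fresh random initial weights each trial
    [net, tr] = train(net, x, t);
    mseval(i) = tr.best_tperf;     % best test performance of trial i
end
[minmseval, ibest] = min(mseval);
rng(s{ibest});                     % restore the best trial's RNG state
bestnet = configure(net, x, t);    % reproduces that trial's initial weights
[bestnet, tr, y, e] = train(bestnet, x, t);   % train bestnet, not net
```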



Greg Heath on 16 Mar 2014
1. Not necessary to specify default process functions.
2. How did you know my birthdate is 4151941 ??
3. You are reusing the same net for each trial without using CONFIGURE.
Therefore, the initial weights of each trial are the final weights of the last trial.
I suspect that if the design results are not monotonically better, it is because TRAIN is using a new trn/tst division.
4. Use configure after the RNG initialization.
5. An alternate approach is to CONTINUALLY save one or all of
a. the best current RNG state
b. the best current net
c. the best current Wb = getwb(net)
6. I think you should do all three at the same time and compare results
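Point 5 above can be sketched as a best-so-far update inside the trial loop (an illustration assuming the same loop variables as the posted code, not the author's exact method):

```matlab
% Sketch: continually keep the best-so-far state during the trials,
% instead of only recording RNG states and replaying the winner later.
bestperf = Inf;
for i = 1:Ntrials
    s = rng;                       % RNG state before this trial's init
    net = configure(net, x, t);
    [net, tr] = train(net, x, t);
    if tr.best_tperf < bestperf
        bestperf  = tr.best_tperf;
        bestState = s;             % a. best current RNG state
        bestnet   = net;           % b. best current net
        bestWb    = getwb(net);    % c. best current weight/bias vector
    end
end
```

Saving all three lets you cross-check later that replaying `bestState` through `configure` and `train` reproduces `bestWb`.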
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 comments
Anitha on 17 Mar 2014
Hi Greg,
I used an example from one of your previous posts; that is why I set the rng seed to 4151941. I have modified the code as you said, with configure after initialization. I got the best result in trial 4. But when I try to train the final network with these parameters, I still get different results. The weights and biases are all different as well, and the confusion matrix is also different. What should I do?
clear all, close all, clc, plt=0;
tic
load inputdata
load targetdata
x=input;
t=target;
hiddenLayerSize = 30;
net = patternnet(hiddenLayerSize);
net.trainFcn='trainbr';
net.trainParam.epochs=100;
rng(4151941);
Ntrials=15
for i = 1:Ntrials
    s{i} = rng;
    net = configure(net,x,t);
    netIW0{i} = net.IW
    netb0{i} = net.b
    netLW0{i} = net.LW
    [ net tr y e] = train(net,x,t);
    % Best mseval over all epochs of ith run
    tstind = tr.testInd;
    ytst = y(:,tstind);
    ttst = t(:,tstind);
    %mseval(i) = mse(net,ttst,ytst)
    mseval(i) = tr.best_tperf;
    plt=plt+1, figure(plt)
    plotconfusion(ttst,ytst)
    title([ ' TEST SET CONFUSION MATRIX. TRIAL = ', num2str(i)] )
    hold off
end
[ minmseval ibest ] = min(mseval);
rng=s{ibest}; % For repeating the best design
bestnet = configure(net,x,t);
bestIW0 = bestnet.IW
bestb0 = bestnet.b
bestLW0 = bestnet.LW
[ bestnet tr y e] = train(net,x,t);
msetst=tr.best_tperf;
tstind = tr.testInd;
ytst = y(:,tstind);
ttst = t(:,tstind);
fWb=getwb(bestnet);
%msetst= mse(bestnet,ttst,ytst)
plt=plt+1,figure(plt)
plotconfusion(ttst,ytst)
title([ ' OVERALL TEST SET CONFUSION MATRIX= '] )
hold off
save bestnet
view(bestnet)
Greg Heath on 19 Mar 2014
The second train statement contains net instead of bestnet.
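The one-line fix, sketched against the posted code:

```matlab
% Corrected final call: retrain the freshly configured bestnet rather
% than the already-trained net left over from the trial loop.
[ bestnet tr y e] = train(bestnet,x,t);
```

Passing the loop's `net` means training continues from its final weights, so the restored RNG state (and the recorded initial weights) cannot reproduce the best trial.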



Anitha on 19 Mar 2014
Hi Greg, thanks a lot. Got it now.
