How can I set the minimum MSE requirement for neural network training manually rather than using the default requirement?

I wish to stop training automatically when the MSE falls below 0.0001, but the system keeps training until it reaches about 10^-14.

Accepted Answer

Greg Heath on 26 Mar 2014
[ I N ] = size(x) % x: I x N input matrix
[ O N ] = size(t) % t: O x N target matrix
% Reference MSEs (From Naïve Constant Output Model y00 = mean(ttrn,2))
Ntrn = N-2*round(0.15*N) % Default no. of training examples (15% validation, 15% test)
MSEtrn00 = mean(var(ttrn',1)) % Biased training target variance (divide by Ntrn)
MSEtrn00a = mean(var(ttrn',0)) % Degree-of-freedom "A"djusted (divide by Ntrn-1)
% Number of estimation degrees of freedom (See Wikipedia)
Ntrneq = Ntrn*O % prod(size(ttrn)) is number of training equations
Nw = (I+1)*H+(H+1)*O % Number of unknown weights for an I-H-O node topology
Ndof = Ntrneq-Nw % Not useful when Nw >= Ntrneq
etrn = ttrn-ytrn; % Training error
MSEtrn = sse(etrn)/Ntrneq % Biased MSE, equal to mse(etrn)
MSEtrna = sse(etrn)/Ndof % DOFA MSE, equal to Ntrneq*MSEtrn/Ndof
% Normalized MSEs
NMSEtrn = MSEtrn/MSEtrn00
NMSEtrna = MSEtrna/MSEtrn00a
% Practical training goal: NMSEtrna <= 0.01 (independent of target scaling)
net.trainParam.goal = 0.01*Ndof*MSEtrn00a/Ntrneq
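For readers outside MATLAB, the goal computation above can be sketched in NumPy (a translation for illustration only; the function name heath_training_goal is my own, and the toolbox helpers sse/var are replaced by explicit NumPy reductions):

```python
import numpy as np

def heath_training_goal(t_trn, n_input, n_hidden, nmse_target=0.01):
    """Scale-independent MSE training goal from the recipe above.

    t_trn    : (O, Ntrn) array of training targets.
    n_input  : I, number of input nodes.
    n_hidden : H, number of hidden nodes (I-H-O topology).
    """
    n_output, n_trn = t_trn.shape
    n_eq = n_trn * n_output                  # Ntrneq: number of training equations
    n_w = (n_input + 1) * n_hidden + (n_hidden + 1) * n_output  # Nw: unknown weights
    n_dof = n_eq - n_w                       # Ndof: estimation degrees of freedom
    # MSEtrn00a: mean over outputs of the DOF-adjusted target variance (ddof=1)
    mse00a = np.mean(np.var(t_trn, axis=1, ddof=1))
    # Goal such that NMSEtrna = MSEtrna/MSEtrn00a hits nmse_target at convergence
    return nmse_target * n_dof * mse00a / n_eq
```

The returned value corresponds to what would be assigned to net.trainParam.goal; because it scales with the target variance, the stopping criterion stays at NMSEtrna = 0.01 no matter how the targets are scaled.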
Hope this helps.
Thank you for formally accepting my answer
Greg
