Neural Network Toolbox

Improved Generalization

Improving the network’s ability to generalize helps prevent overfitting, a common problem in neural network design. Overfitting occurs when a network has memorized the training set but has not learned to generalize to new inputs. Overfitting produces a relatively small error on the training set but a much larger error when new data is presented to the network.

Neural Network Toolbox provides two solutions to improve generalization:

  • Regularization modifies the network’s performance function (the measure of error that the training process minimizes). By adding a penalty on the magnitudes of the weights and biases to this measure, regularization produces a network that performs well on the training data and exhibits smoother behavior when presented with new data (a sketch follows this list).
  • Early stopping uses two different data sets: the training set, to update the weights and biases, and the validation set, to stop training when the network begins to overfit the data (a second sketch follows this list).
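
The following is a minimal sketch of how regularization might be configured, assuming the fitnet/train interface; the noisy sine data, the 10-neuron hidden layer, and the 0.1 regularization ratio are illustrative choices rather than values from this page.

    % Regularization: penalize weight and bias magnitudes in addition
    % to the mean squared error that training minimizes.
    x = -1:0.05:1;                           % illustrative input data
    t = sin(2*pi*x) + 0.1*randn(size(x));    % noisy targets
    net = fitnet(10, 'trainlm');             % one hidden layer of 10 neurons
    net.performFcn = 'mse';                  % performance function to minimize
    net.performParam.regularization = 0.1;   % weight of the size penalty (illustrative)
    net = train(net, x, t);
    y = net(x);                              % simulate the trained network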
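
Similarly, a minimal sketch of early stopping, assuming the built-in dividerand data division; the 70/15/15 split and the max_fail value are illustrative.

    % Early stopping: hold out a validation set and stop training
    % when validation error rises for several consecutive epochs.
    x = -1:0.05:1;
    t = sin(2*pi*x) + 0.1*randn(size(x));
    net = fitnet(10, 'trainlm');
    net.divideFcn = 'dividerand';         % randomly split the data
    net.divideParam.trainRatio = 0.70;    % used to update weights and biases
    net.divideParam.valRatio   = 0.15;    % used to detect overfitting
    net.divideParam.testRatio  = 0.15;    % held out for independent assessment
    net.trainParam.max_fail    = 6;       % stop after 6 consecutive validation increases
    [net, tr] = train(net, x, t);         % tr records when and why training stopped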