Neural Network

This network uses ReLU activation functions and no regularization.

  1. Compute by hand one forward propagation pass, then the error backpropagation and weight adjustment.
  2. Check yourself by clicking the corresponding text labels in turn.
  3. Keep clicking them in turn, and watch the weights converge.

Also, try changing the training instance and the learning rate; both are user-settable.
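
The hand computation above can be sketched in code. This is a minimal illustration, not the demo's actual implementation: the network shape (2 inputs, 2 ReLU hidden units, 1 linear output), the weights, the training instance, and the learning rate are all made-up assumptions.

```python
import numpy as np

# Hypothetical toy network: 2 inputs -> 2 hidden (ReLU) -> 1 linear output.
# All numeric values below are invented for illustration.
x = np.array([1.0, 0.5])                   # training instance (input)
t = np.array([0.8])                        # target output
lr = 0.1                                   # learning rate

W1 = np.array([[0.2, -0.3], [0.4, 0.1]])   # input -> hidden weights (2x2)
W2 = np.array([[0.5, -0.2]])               # hidden -> output weights (1x2)

# 1. Forward propagation
z1 = W1 @ x
h = np.maximum(z1, 0.0)                    # ReLU activation
y = W2 @ h                                 # linear output

# 2. Error backpropagation for squared error E = 0.5 * (y - t)^2
dy = y - t                                 # dE/dy
dW2 = np.outer(dy, h)                      # gradient w.r.t. W2
dh = W2.T @ dy                             # gradient flowing into hidden layer
dz1 = dh * (z1 > 0)                        # ReLU derivative: 1 where z1 > 0
dW1 = np.outer(dz1, x)                     # gradient w.r.t. W1

# 3. Weight adjustment (one gradient-descent step)
W2 -= lr * dW2
W1 -= lr * dW1
```

Repeating the forward/backward/update cycle, as step 3 of the instructions suggests, shrinks the error on this training instance a little on each pass, which is the convergence the demo lets you watch.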