Neural Network
This network uses ReLU activation functions and no
regularization.
- Compute, by hand, one forward propagation pass, followed by error
backpropagation and a weight update.
- Check yourself by clicking the corresponding text labels in
turn.
- Keep clicking through the cycle and watch the weights
converge.
Also, play with the user-settable training instance and
learning rate.
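The hand computation above can be sketched in code. This is a minimal illustration, not the demo's actual implementation: the input, weights, target, and learning rate below are hypothetical placeholder values (in the demo they are user-settable), and it assumes one ReLU hidden layer, a linear output, and a squared-error loss.

```python
import numpy as np

# Hypothetical values; in the demo the training instance and
# learning rate are user-settable.
x = np.array([1.0, 0.5])          # input (training instance)
W1 = np.array([[0.3, -0.4],
               [0.7,  0.1]])      # input -> hidden weights
W2 = np.array([0.6, -0.3])        # hidden -> output weights
y = 1.0                           # target output
lr = 0.1                          # learning rate

def relu(z):
    return np.maximum(z, 0.0)

# Forward propagation
z1 = W1 @ x                       # hidden pre-activations
h = relu(z1)                      # hidden activations (ReLU)
y_hat = W2 @ h                    # linear output

# Error backpropagation for E = 0.5 * (y_hat - y)^2
delta_out = y_hat - y                     # dE/dy_hat
grad_W2 = delta_out * h                   # dE/dW2
delta_hid = delta_out * W2 * (z1 > 0)     # ReLU derivative gates the error
grad_W1 = np.outer(delta_hid, x)          # dE/dW1

# Weight adjustment (one gradient-descent step)
W2 -= lr * grad_W2
W1 -= lr * grad_W1
```

Repeating the forward/backward/update cycle with these equations is what drives the weights toward convergence as you keep clicking.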