Backpropagation
Learning Rate: 0.5
Hidden Neurons: 4
Reset Network
Forward + Backward Step
Train (100 epochs)
Epoch: 0
Loss: -
Pattern: XOR
Neural Network Training
• Forward pass: compute activations layer by layer
• Backward pass: propagate gradients with the chain rule
• Update: adjust each weight against its gradient, scaled by the learning rate
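The three steps above can be sketched in a minimal NumPy script. This is an illustrative implementation, not the demo's actual code: it assumes a 2-4-1 sigmoid network trained on XOR with mean-squared-error loss and full-batch gradient descent, using the demo's default learning rate of 0.5 and 4 hidden neurons.

```python
import numpy as np

# Minimal 2-4-1 sigmoid network trained on XOR (assumed setup,
# mirroring the demo's defaults: learning rate 0.5, 4 hidden neurons).
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)
    if epoch == 0:
        first_loss = loss

    # Backward pass: chain rule gives the gradient of the loss
    # with respect to every weight and bias.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = d_out @ W2.T * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # Update: move each parameter against its gradient.
    W2 -= lr * d_W2
    b2 -= lr * d_b2
    W1 -= lr * d_W1
    b1 -= lr * d_b1

print(f"final loss: {loss:.4f}")
```

Each pass through the loop is one "Forward + Backward Step"; the demo's "Train (100 epochs)" button simply repeats it 100 times.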
Positive weight
Negative weight
Gradient flow