This script demonstrates a simple neural network training process using gradient descent. The network processes a set of training data, adjusts its weights and biases based on the error, and visualizes the outputs in real time.
- Neural Network Training: Utilizes a basic form of a neural network to process training data.
- Gradient Descent: Employs gradient descent to optimize the network's weights and biases.
- Real-Time Plotting: Visualizes the network's output for each training iteration in real time using Matplotlib.
The script is structured as follows:
- Softplus Activation Function: Defines a `softplus` function used as the activation function in the neural network.
- Derivative Functions: Includes derivative functions (`derW1`, `derW2`, etc.) to compute gradients for gradient descent.
- Initial Setup: Sets up the training data and initializes the weights and biases.
- Plotting Setup: Configures a Matplotlib plot for real-time visualization.
- Training Loop:
  - The main loop runs for a predetermined number of iterations.
  - In each iteration, the network processes the training data.
  - The derivatives of the weights and biases are calculated.
  - The weights and biases are updated using the calculated derivatives.
  - The outputs are visualized in real time on the plot.
- Finalization: Turns off the interactive mode and displays the final plot.
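The structure above can be sketched as follows. This is a minimal illustration, not the script itself: the training data, the single-hidden-unit architecture, and the parameter names (`w1`, `b1`, `w2`, `b2`) are assumptions, and the plotting calls are indicated only as comments.

```python
import numpy as np

def softplus(x):
    # Softplus activation: log(1 + e^x), a smooth approximation of ReLU.
    return np.log(1.0 + np.exp(x))

def softplus_deriv(x):
    # The derivative of softplus is the logistic sigmoid.
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical training data (the actual script's data may differ).
inputs = np.array([0.0, 0.5, 1.0])
targets = np.array([0.0, 1.0, 0.0])

# Initialize weights and biases; the names w1, b1, w2, b2 are illustrative.
rng = np.random.default_rng(0)
w1, b1, w2, b2 = rng.normal(size=4)

learning_rate = 0.05
loss_history = []

for i in range(2000):
    # Forward pass through one softplus hidden unit.
    z = w1 * inputs + b1
    hidden = softplus(z)
    predicted = w2 * hidden + b2

    # Gradients of the sum-of-squared-residuals loss, analogous to the
    # derW1/derW2-style derivative functions described above.
    residual = predicted - targets
    derW2 = np.sum(2.0 * residual * hidden)
    derB2 = np.sum(2.0 * residual)
    derW1 = np.sum(2.0 * residual * w2 * softplus_deriv(z) * inputs)
    derB1 = np.sum(2.0 * residual * w2 * softplus_deriv(z))

    # Gradient descent update.
    w1 -= learning_rate * derW1
    b1 -= learning_rate * derB1
    w2 -= learning_rate * derW2
    b2 -= learning_rate * derB2

    loss_history.append(np.sum(residual ** 2))
    # In the real script, the Matplotlib figure would be redrawn here
    # (e.g. via plt.pause) to show `predicted` in real time.
```

The derivatives follow from the chain rule: the hidden unit's contribution is scaled by `softplus_deriv(z)`, the sigmoid, before updating `w1` and `b1`.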
- Ensure `numpy` and `matplotlib` are installed in your Python environment.
- Adjust the `learning_rate` and the number of iterations as per your requirements.
- Run the script in an environment that supports real-time plotting (like VS Code).
Real-time plotting can slow down the training process, especially for a large number of iterations. Adjust the frequency of plot updates for optimal performance.
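One common way to reduce the plotting overhead is to redraw only every `plot_every` iterations rather than on every pass. The sketch below shows just that throttling pattern; `plot_every` is an illustrative name, and the actual redraw call is left as a comment:

```python
# Redraw the plot every `plot_every` iterations instead of every iteration.
plot_every = 50
n_iterations = 1000

updates = 0  # counts how many redraws would occur
for i in range(n_iterations):
    # ... forward pass, gradient computation, parameter update ...
    if i % plot_every == 0:
        # line.set_ydata(predicted); plt.pause(0.001)  # redraw here
        updates += 1
```

With these settings the plot is refreshed 20 times over 1000 iterations instead of 1000 times, which keeps the visualization responsive without dominating the training time.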