A neuron processing a data point with two features, X1 and X2; the expected output is Y. W1 and W2 are randomly initialized weights, which can be thought of as the neuron's synaptic strengths. Z is the weighted sum of the inputs — the neuron's aggregated synaptic strength for this data point — and is fed to a sigmoid function, which squashes its input into the range [0, 1]. The goal of neural network training is to minimize a cost function by finding optimal weight values, which requires many data points and multiple forward and backward passes. Each backward pass computes how each weight affects the cost, and the weights are updated before the next forward pass. #neuron #genai #calculus #math #neuralnet #sigmoid
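Here is a minimal sketch of that loop for a single neuron, assuming a squared-error cost, a fixed learning rate, and illustrative input values (none of these specifics come from the post itself):

```python
import math
import random

def sigmoid(z):
    """Squash z into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# One data point with two features and its expected output (illustrative values).
x1, x2 = 0.5, -1.2
y = 1.0

# Randomly initialized weights -- the neuron's "synaptic strengths".
w1, w2 = random.uniform(-1, 1), random.uniform(-1, 1)

learning_rate = 0.1

for step in range(100):
    # Forward pass: aggregate synaptic strength, then squash with the sigmoid.
    z = w1 * x1 + w2 * x2
    a = sigmoid(z)

    # Cost for this data point (squared error, assumed here).
    cost = (a - y) ** 2

    # Backward pass: the chain rule gives each weight's impact on the cost.
    # dC/dw = dC/da * da/dz * dz/dw
    dcost_da = 2 * (a - y)
    da_dz = a * (1 - a)          # derivative of the sigmoid
    dcost_dw1 = dcost_da * da_dz * x1
    dcost_dw2 = dcost_da * da_dz * x2

    # Update the weights before the next forward pass.
    w1 -= learning_rate * dcost_dw1
    w2 -= learning_rate * dcost_dw2

print(f"trained weights: w1={w1:.3f}, w2={w2:.3f}, cost={cost:.5f}")
```

With real training data the same update rule runs over many data points, and the cost shrinks as the weights converge toward values that map X1 and X2 to Y.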

A neural network forward pass in action with two data points, each with two features. Can you imagine what it looks like with thousands of data points at a time? #genai #neuron #neuralnet
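One way this scales is to stack the data points as rows of a matrix so a single matrix product computes Z for all of them at once. A rough NumPy sketch, with made-up feature values and random weights as assumptions:

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid: squashes each value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Two data points, each with two features (illustrative values),
# stacked as rows of a matrix.
X = np.array([[0.5, -1.2],
              [0.9,  0.3]])

# One randomly initialized weight per feature.
rng = np.random.default_rng(0)
W = rng.uniform(-1, 1, size=2)

# One matrix-vector product computes Z for every data point at once;
# the same line works unchanged for thousands of rows.
Z = X @ W            # shape (2,)
A = sigmoid(Z)       # activations for both data points

print("Z:", Z)
print("activations:", A)
```

The batched form is why forward passes over thousands of data points stay fast: the per-point loop disappears into one vectorized operation.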