Models of neural computation: Difference between revisions

<math>f_{j}=g\left(\sum_{i}w_{ji}'x_{i}+b_{j}\right)</math>.
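As an illustrative sketch (not part of the article), the neuron response above can be computed in a few lines of Python, assuming a sigmoid for the activation function <math>g</math>:

```python
import math

def neuron_response(w, x, b):
    # Weighted sum of the inputs x under weights w, plus bias b,
    # passed through a sigmoid nonlinearity g(s) = 1 / (1 + e^{-s}).
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-s))

# Example: weights [0.5, -0.3], inputs [1.0, 2.0], bias 0.1
# gives s = 0.5 - 0.6 + 0.1 = 0, so the response is sigmoid(0) = 0.5.
f = neuron_response([0.5, -0.3], [1.0, 2.0], 0.1)
```

Any differentiable squashing function can stand in for the sigmoid; the choice here is only for concreteness.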
 
This response is then fed as input to other neurons, and so on. The goal is to optimize the weights of the neurons so that the output layer produces a desired response for a given set of inputs at the input layer. This optimization of the neuron weights is often performed using the [[backpropagation]] algorithm together with an optimization method such as [[gradient descent]] or [[Newton's method|Newton's method of optimization]]. Backpropagation compares the output of the network with the expected output from the training data, then updates the weights of each neuron to minimize that neuron's contribution to the total error of the network.
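A minimal sketch of one backpropagation step with gradient descent, for a hypothetical two-input, two-hidden-unit, one-output network with sigmoid activations and squared error (the network size, learning rate, and loss are illustrative assumptions, not taken from the article):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def forward(x, W1, b1, W2, b2):
    # Hidden-layer responses, one per row of W1.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    # Single output neuron.
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return h, y

def backprop_step(x, t, W1, b1, W2, b2, lr=0.5):
    h, y = forward(x, W1, b1, W2, b2)
    # Output error term dE/ds for squared error E = (y - t)^2 / 2;
    # y * (1 - y) is the sigmoid derivative.
    delta_out = (y - t) * y * (1.0 - y)
    # Hidden error terms, propagated back through the output weights W2.
    delta_h = [delta_out * w * hi * (1.0 - hi) for w, hi in zip(W2, h)]
    # Gradient descent: move each weight against its error gradient.
    W2 = [w - lr * delta_out * hi for w, hi in zip(W2, h)]
    b2 = b2 - lr * delta_out
    W1 = [[w - lr * d * xi for w, xi in zip(row, x)]
          for row, d in zip(W1, delta_h)]
    b1 = [b - lr * d for b, d in zip(b1, delta_h)]
    return W1, b1, W2, b2

# One training step shrinks the squared error on this example.
x, t = [1.0, 0.0], 1.0
W1, b1 = [[0.2, -0.4], [0.7, 0.1]], [0.0, 0.0]
W2, b2 = [0.5, -0.3], 0.0
_, y0 = forward(x, W1, b1, W2, b2)
W1, b1, W2, b2 = backprop_step(x, t, W1, b1, W2, b2)
_, y1 = forward(x, W1, b1, W2, b2)
```

Each update moves every weight a small step against its contribution to the network's error, which is exactly the "minimize the contribution of each neuron" behaviour described above; a real training loop would repeat this over many input/target pairs.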
 
===Genetic algorithms===