Models of neural computation

<math>f_{j}=g\left(\sum_{i}w_{ji}x_{i}+b_{j}\right)</math>.
 
This response is then fed as input into other neurons, and so on. The goal is to optimize the weights of the neurons so that the output layer produces the desired response for a given set of inputs at the input layer. This optimization of the neuron weights is often performed using the [[Backpropagation|backpropagation algorithm]] together with an optimization method such as [[gradient descent]] or [[Newton's method|Newton's method of optimization]]. Backpropagation compares the output of the network with the expected output from the training data, then updates the weights of each neuron to minimize that neuron's contribution to the total error of the network.
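The training loop described above can be sketched as follows. This is a minimal illustration, not the article's own method: it assumes a sigmoid activation ''g'', a squared-error loss, and the XOR problem as toy training data; the layer sizes, learning rate, and iteration count are arbitrary choices.

```python
import numpy as np

def g(x):
    """Sigmoid activation function."""
    return 1.0 / (1.0 + np.exp(-x))

def g_prime(y):
    """Derivative of the sigmoid, written in terms of its output y = g(x)."""
    return y * (1.0 - y)

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input layer
T = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs (XOR)

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)  # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)  # hidden -> output weights
lr = 0.5                                             # gradient-descent step size

loss_before = float(np.mean((g(g(X @ W1 + b1) @ W2 + b2) - T) ** 2))

for _ in range(20000):
    # Forward pass: each neuron outputs g(sum_i w_ji x_i + b_j).
    H = g(X @ W1 + b1)
    Y = g(H @ W2 + b2)
    # Backward pass (backpropagation): propagate the output error back
    # through the network to get each weight's contribution to the error.
    dY = (Y - T) * g_prime(Y)
    dH = (dY @ W2.T) * g_prime(H)
    # Gradient-descent update: move each weight against its error gradient.
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

loss_after = float(np.mean((Y - T) ** 2))
```

After training, the network's squared error on the training set is lower than at initialization, illustrating how the weight updates reduce the total error of the network.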
 
===Genetic algorithms===