In this approach the strength and type, excitatory or inhibitory, of synaptic connections are represented by the magnitude and sign of weights, that is, numerical [[coefficients]] <math>w</math> in front of the inputs <math>x</math> to a particular neuron. The response of the <math>j</math>-th neuron is given by a nonlinear, usually "[[sigmoid function|sigmoidal]]", function <math>g</math> applied to the weighted sum of its inputs:

:<math>y_j = g\left(\sum_i w_{ij} x_i\right)</math>
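A minimal sketch of this response in Python (the function names and the choice of the logistic sigmoid for <math>g</math> are illustrative assumptions, not part of the model above):

<syntaxhighlight lang="python">
import math

def sigmoid(a):
    # Logistic sigmoid, a common choice for the nonlinearity g.
    return 1.0 / (1.0 + math.exp(-a))

def neuron_response(weights, inputs):
    # y_j = g(sum_i w_ij * x_i): the weighted sum of the inputs passed through g.
    # Positive weights model excitatory synapses, negative weights inhibitory ones.
    return sigmoid(sum(w * x for w, x in zip(weights, inputs)))

print(neuron_response([0.5, -1.2, 0.8], [1.0, 0.3, 0.6]))  # ≈ 0.65
</syntaxhighlight>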
This response is then fed as input into other neurons, and so on. The goal is to optimize the weights of the neurons so that the network produces a desired response at the output layer for a given set of inputs at the input layer. This optimization of the neuron weights is often performed using the [[Backpropagation|backpropagation algorithm]] together with an optimization method such as [[Gradient descent|gradient descent]] or [[Newton's method in optimization|Newton's method]]. Backpropagation compares the output of the network with the expected output from the training data, then updates the weight of each neuron so as to reduce that neuron's contribution to the total error of the network.
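As a concrete illustration of this training loop, the following is a minimal sketch assuming a two-layer sigmoidal network with a squared-error loss, trained by plain gradient descent on the XOR problem; the network size, learning rate, and training set are illustrative choices, not prescribed by the algorithm:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Illustrative training data: the XOR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # expected outputs

# Randomly initialized weights and biases for a 2-4-1 network.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

lr = 0.5  # gradient-descent step size
for _ in range(10000):
    # Forward pass: each layer applies g to a weighted sum of its inputs.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # Backward pass (backpropagation): compare the output with the expected
    # output and propagate the error derivatives back through each layer.
    dY = (Y - T) * Y * (1 - Y)      # error signal at the output layer
    dH = (dY @ W2.T) * H * (1 - H)  # error signal at the hidden layer

    # Gradient-descent update: adjust each weight to reduce its
    # contribution to the total squared error of the network.
    W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0, keepdims=True)

print(np.round(Y, 2))  # should approach [[0], [1], [1], [0]]
</syntaxhighlight>

Here the gradients are written out by hand for clarity; full-scale implementations typically compute them by automatic differentiation instead.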