# neural-networks.io


# Training algorithm

### Network architecture

Consider the following single-layer architecture:

where:

- \( x_i \) are the inputs of the network
- \( o_j \) are the outputs of the network
- \( f(\Sigma) \) is the activation function
- \( w_{ij} \) is the weight connecting input \( x_i \) to neuron \( j \) (i.e. associated with output \( o_j \))
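The forward pass of this architecture can be sketched as follows; the function name and the choice of \( \tanh \) as activation are illustrative assumptions, not part of the text:

```python
import numpy as np

def forward(x, W, f=np.tanh):
    """Forward pass of a single-layer network (illustrative sketch).

    x: input vector of shape (N,)
    W: weight matrix of shape (N, M), with W[i, j] = w_ij
    f: activation function (tanh here, assumed for the example)
    Returns the output vector o of shape (M,).
    """
    S = x @ W     # S_j = sum_i w_ij * x_i
    return f(S)   # o_j = f(S_j)

# Example: 3 inputs, 2 outputs, all weights zero -> every output is f(0)
x = np.array([0.5, -1.0, 2.0])
W = np.zeros((3, 2))
o = forward(x, W)
```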

### Algorithm

Initialize the weights \( w_{ij} \) with arbitrary values.

Repeat:

- Pick a training example \( \langle x, \check{o} \rangle \) (\( x \) is the input, \( \check{o} \) is the expected output)
- Compute the weighted sum for each neuron: \( S_j = \sum\limits_{i=1}^N w_{ij}x_i \)
- Compute the outputs (\( o \)) of the network: \( o_j = f(S_j) \)
- For each output, compute the error: \( \delta_j = \check{o}_j - o_j \)
- Update each synaptic weight of the network: \( w_{ij} = w_{ij} + \eta \, \delta_j \, x_i \, \frac{df(S_j)}{dS_j} \)

Until the training set is empty.
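The loop above can be sketched in Python. This is a minimal illustration, not the article's reference implementation: the sigmoid activation, the learning rate, and the AND-function dataset are all assumptions made for the example.

```python
import numpy as np

def sigmoid(S):
    # Assumed activation f(S); its derivative is f'(S) = f(S) * (1 - f(S))
    return 1.0 / (1.0 + np.exp(-S))

def train(examples, n_inputs, n_outputs, eta=0.5, epochs=1000, seed=0):
    """Delta-rule training loop for a single-layer network (sketch)."""
    rng = np.random.default_rng(seed)
    # Initialize weights w_ij with arbitrary (small random) values
    W = rng.normal(scale=0.1, size=(n_inputs, n_outputs))
    for _ in range(epochs):
        for x, o_hat in examples:      # training example <x, ǒ>
            S = x @ W                  # S_j = sum_i w_ij * x_i
            o = sigmoid(S)             # o_j = f(S_j)
            delta = o_hat - o          # δ_j = ǒ_j - o_j
            dfdS = o * (1.0 - o)       # df(S_j)/dS_j for the sigmoid
            # w_ij <- w_ij + η δ_j x_i f'(S_j)
            W += eta * np.outer(x, delta * dfdS)
    return W

# Example: learn the AND function; the first input is a constant 1 (bias)
data = [(np.array([1.0, a, b]), np.array([float(a and b)]))
        for a in (0, 1) for b in (0, 1)]
W = train(data, n_inputs=3, n_outputs=1, eta=1.0, epochs=5000)
```

Since AND is linearly separable, this single-layer network can learn it; after training, \( o > 0.5 \) only for the input \( (1, 1) \).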