# Perceptron

The perceptron was the first algorithm proposed in the history of neural networks. The word perceptron is nowadays associated with its graphical representation:

The perceptron is the graphical representation of a mathematical function composed of two parts. The left side is the weighted sum of the inputs $$x_i$$, while the right side is a general function, called the activation function, chosen according to the problem the network will have to solve. It is usually recommended to pick a differentiable function. The global transfer function of a perceptron is given by:

$$y(x_i,w_i)= f(w_1 x_1 + w_2 x_2 + \dots + w_N x_N)$$

$$y(x_i,w_i)= f\left(\sum_{i=1}^N w_i x_i\right)$$

The perceptron, also called a neuron, is the basic building block. By assembling perceptrons it becomes possible to create more complex networks.
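The transfer function above can be sketched in a few lines of Python. This is a minimal illustration, not a full implementation: the function names `perceptron` and `sigmoid` are chosen here for clarity, and the sigmoid is just one example of a differentiable activation function.

```python
import math

def perceptron(x, w, f):
    """Transfer function of a perceptron: y = f(sum_i w_i * x_i)."""
    weighted_sum = sum(w_i * x_i for w_i, x_i in zip(w, x))
    return f(weighted_sum)

def sigmoid(z):
    """A common differentiable activation function."""
    return 1.0 / (1.0 + math.exp(-z))

# Two inputs with their weights; the weighted sum is
# 0.2 * 1.0 + (-0.4) * 0.5 = 0.0, and sigmoid(0.0) = 0.5.
y = perceptron([1.0, 0.5], [0.2, -0.4], sigmoid)
```

With a different activation function (for example a step function), the same structure reproduces the classical binary perceptron.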