# Perceptron

In the last article, we talked about the Perceptron, a model proposed before modern activation functions came along. It was developed by Frank Rosenblatt back in the 1950s-60s. The Perceptron is very simple in approach: it takes several binary inputs, multiplies each of them by a weight, and produces a single binary output.

The best way to learn is by example. Suppose you have to decide whether to go to the office in the morning, and two conditions must be fulfilled before you go:

1. It must be a working day, because working on weekends is a big no-no.
2. The weather must be good, because you cycle to the office to stay slim and health conscious.

In reality, the weather is not that big a deal, so how do we decide how important each parameter is in determining the output? This is where weights come in: a weight is a number that defines the importance of a parameter in determining the output. If both parameters were equally important, both would get equal weights. Let's give the working day a weight of 7 and the weather a weight of 2. Now for the calculation:

Total = (Input 1) x (Weight 1) + (Input 2) x (Weight 2)

Before calculating, let's assume it is a working day, so input 1 is high, or 1. Let's also assume the weather is fine, so input 2 is also high, or 1.

Total = (Weekday) x (Weight 1) + (Weather) x (Weight 2)

Total = (7)(1) + (2)(1)

Total = 7 + 2 = 9

But 9 is just a number; how do we turn it into an output? This is where the threshold comes in. If the total is greater than the threshold, the output is high; otherwise it is low. Let's assume the threshold is 6. Since 9 is greater than the threshold, the output is 1 and we have to go to the office.
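The whole decision above can be sketched in a few lines of Python. The weights (7 and 2), inputs, and threshold (6) are the values from the worked example; the function name `perceptron` is just a label for illustration.

```python
def perceptron(inputs, weights, threshold):
    # Weighted sum of the binary inputs
    total = sum(i * w for i, w in zip(inputs, weights))
    # Output is high (1) only if the total exceeds the threshold
    return 1 if total > threshold else 0

# Working day = 1, good weather = 1; weights 7 and 2, threshold 6
output = perceptron([1, 1], [7, 2], threshold=6)
print(output)  # total is 7 + 2 = 9, which is greater than 6, so output is 1
```

Try flipping the working-day input to 0: the total drops to 2, which is below the threshold, so the output becomes 0 and you stay home.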

Now let's come to a real-world setting. In reality, the weights start out as random values, and the network then adjusts them based on the output errors it made with the previous weights. This is called training the network. In the next article we'll discuss the sigmoid function.
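The training idea above can be sketched with the classic perceptron learning rule: start from random weights, compare the output against the target, and nudge each weight by the error. The update rule, learning rate, and the AND-style example data are assumptions for illustration; the article itself doesn't specify them.

```python
import random

def train(samples, epochs=20, lr=0.1, threshold=0.5):
    random.seed(0)  # fixed seed so the sketch is reproducible
    # Start with random weights, one per input
    weights = [random.uniform(-1, 1) for _ in range(len(samples[0][0]))]
    for _ in range(epochs):
        for inputs, target in samples:
            total = sum(i * w for i, w in zip(inputs, weights))
            output = 1 if total > threshold else 0
            error = target - output  # -1, 0, or +1
            # Adjust each weight in proportion to its input and the error
            weights = [w + lr * error * i for w, i in zip(weights, inputs)]
    return weights

# Learn an AND-like decision: output 1 only when both inputs are 1
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = train(data)
```

After training, the learned weights reproduce the target outputs for all four input pairs, which is exactly what "adjusting weights based on output errors" achieves.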