Now that you understand the design of a perceptron, think about how it can be used for simple learning tasks. To start with, consider a simple **binary classification** task and spend a few minutes thinking about how the perceptron can work as a classifier.

In the following lecture, you will understand how the perceptron can act as a classifier.

You saw how the perceptron works as a classifier. Each weight represents the importance of the corresponding feature for classification. You might have also noticed that the professor used a **sign function**. The sign function is similar to the step function – it outputs +1 when the input is greater than 0 and -1 otherwise. In a binary classification setting, +1 and -1 represent the two classes.
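The sign-function perceptron described above can be sketched in a few lines of Python. This is a minimal illustration, not the professor's exact formulation; the example weights, inputs and bias below are made-up values chosen only to show both outputs.

```python
def sign(z):
    """Sign function: +1 if the input is greater than 0, -1 otherwise."""
    return 1 if z > 0 else -1

def perceptron(inputs, weights, bias):
    """Classify by passing the weighted sum of inputs plus bias through the sign function."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return sign(weighted_sum + bias)

# Illustrative values: two binary features, weights reflecting their importance.
print(perceptron([1, 0], [0.6, 0.4], -0.5))  # 0.6 - 0.5 = 0.1 > 0, so class +1
print(perceptron([0, 1], [0.6, 0.4], -0.5))  # 0.4 - 0.5 = -0.1, so class -1
```

Note that the two possible outputs, +1 and -1, correspond to the two classes of a binary classification problem.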

This is a simple exercise that will help you better understand how a perceptron works.

Consider a perceptron model that decides whether or not to go to a sushi place. Three factors affect the decision: Distance, Cost and Company. These three variables are the inputs to the perceptron. Suppose each input can only be 0 or 1, and the weights you assign to the variables add up to 1.

A sample set of weights could be w = [0.5, 0.3, 0.2] (for Distance, Cost and Company, respectively).

For each input, the rules for assigning 1 and 0 are as follows – these are arbitrary mappings that you have chosen to keep the model simple:

| Factor | 1 | 0 |
| --- | --- | --- |
| Distance | < 8 km | >= 8 km |
| Cost | <= Rs 2000 for two | > Rs 2000 for two |
| Company | > 2 friends | <= 2 friends |

Assume that the **bias** value is -0.7. The sushi place is 5 km away, and three of your friends are ready to accompany you. Also, the cost for two is Rs 2500.

From this exercise, you would have realised that when the weighted sum of inputs, w1x1 + w2x2 + w3x3, crosses a **threshold** (0.7 here, i.e. the negative of the bias), you decide that you'll go to the restaurant; otherwise, you wouldn't go.
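The exercise above can be worked through in code. This is a sketch using the weights, bias and mapping rules stated earlier; the variable names are my own. Encoding the scenario gives Distance = 1 (5 km < 8 km), Cost = 0 (Rs 2500 > Rs 2000) and Company = 1 (3 friends > 2 friends).

```python
def sign(z):
    """+1 if the input is greater than 0, -1 otherwise."""
    return 1 if z > 0 else -1

weights = [0.5, 0.3, 0.2]   # Distance, Cost, Company
bias = -0.7

# Scenario encoded with the table's rules:
# Distance: 5 km < 8 km               -> 1
# Cost: Rs 2500 > Rs 2000 for two     -> 0
# Company: 3 friends > 2 friends      -> 1
inputs = [1, 0, 1]

weighted_sum = sum(w * x for w, x in zip(weights, inputs))
print(weighted_sum)                 # 0.7

decision = sign(weighted_sum + bias)
print(decision)                     # 0.7 - 0.7 = 0, which does not cross 0, so -1
```

The weighted sum lands exactly on the threshold of 0.7, so it does not *cross* it: the perceptron outputs -1, and you decide not to go.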

In the next segment, you will understand how a perceptron can perform binary classification in detail.
