04 Jan

Diving into Deep Learning Part I: Perceptrons

This new year I am going to be working on deep learning. I have decided to experiment with deep learning techniques in biomedical information retrieval. I have read about neural networks, worked with them on some projects, and even implemented my own neural network library in JavaScript. But this time I am taking baby steps and reviewing all the basic concepts along the way.

I am using Neural Networks and Deep Learning, a free online collection of essays that is being turned into a book, as my reference point. I am also using Deeplearning.net's reading list. Deep Learning by Bengio et al. is a work in progress; the draft is freely available online, and I am using that too.

Using Perceptrons to Implement Logic Gates

Logic gates are simple to understand, and they are often used to introduce students to neural networks. If we want to implement an OR gate, we essentially have to reproduce the following truth table:

x  y  z
0  0  0
0  1  1
1  0  1
1  1  1

A perceptron is defined by two parameters: the weight vector w and the bias b. It outputs 1 if w·x + b > 0 and 0 otherwise. Now consider the perceptron shown in the following figure, which implements the OR gate. Try different values of x and y and observe the output; you'll find that it mimics the OR gate.

Figure: Perceptron for OR gate
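
Here is a minimal sketch of the idea in Python. The weights (1, 1) and bias -0.5 are one choice that reproduces the OR truth table; the figure above may use different values.

```python
def perceptron(x, w, b):
    """Return 1 if the weighted sum of inputs plus the bias is positive, else 0."""
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if activation > 0 else 0

w = (1, 1)   # weight vector (assumed values; your figure may differ)
b = -0.5     # bias (assumed value)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(x, w, b))
# Prints 0, 1, 1, 1 for the four input pairs, matching the OR truth table above.
```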