Thursday, February 16, 2012

HEBBIAN LEARNING: THE TEACHER OF A NEURON

HEBBIAN BREEZE

The development of various methodologies to train a neuron has a significant place in the history of learning algorithms, producing good neuron citizens for the artificial intelligence community. Hebbian learning is one of the most primitive forms of learning. It is widely regarded as unsupervised learning, though a smaller community treats some of its variants as supervised.

My first encounter with Hebbian learning ended in a mishap: I misunderstood almost all of its basic concepts. But the timely intervention of good friends and lecturers made me reasonably clear about what it means to train a neuron, what learning is, and so on.

As an instance of Hebbian learning, we can use the iris data set. The iris data set contains three groups: Iris-virginica, Iris-versicolor, and Iris-setosa. As human beings, if we were given all the parts of every flower in each group, we would naturally start learning to identify them. We would keep an eye on certain parts of the flower, especially the inflorescence or the petal colour. At the end of this period we would be put to a test, in which we have to pick the flower of the right group under all kinds of combinations: identify the flower's group from the petal colour, or place a given flower into the right group, and so on.
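For concreteness, the grouping test above can be pictured with labelled feature vectors. The samples below are hand-typed values in the style of the iris measurements (sepal length, sepal width, petal length, petal width in cm), not copied from the official file, and the nearest-mean test is only my sketch of the exercise described above, not part of Hebbian learning itself:

```python
# Hand-typed samples in the style of the iris data set
# (sepal length, sepal width, petal length, petal width).
# Illustrative values only, not the official data file.
iris_samples = {
    "Iris-setosa":     [[5.1, 3.5, 1.4, 0.2], [4.9, 3.0, 1.4, 0.2]],
    "Iris-versicolor": [[7.0, 3.2, 4.7, 1.4], [6.4, 3.2, 4.5, 1.5]],
    "Iris-virginica":  [[6.3, 3.3, 6.0, 2.5], [5.8, 2.7, 5.1, 1.9]],
}

def closest_group(flower, samples):
    """The 'test': assign an unlabeled flower to the group whose
    average features are closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    def mean(vectors):
        return [sum(col) / len(col) for col in zip(*vectors)]
    return min(samples, key=lambda g: dist(flower, mean(samples[g])))

print(closest_group([5.0, 3.4, 1.5, 0.2], iris_samples))
```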

The whole phenomenon is copied and abstracted onto a single neuron. Here we have to become a neuron. All the sense organs of an abstract neuron are its inputs, received from the external environment in the form of numbers (normalised values). We look at certain inputs with utmost affection (weights). We are trained with the data set of each group to estimate the final weights of the required group (updating the weights with the Hebbian equation). Thus we learn to identify the 'flowers' (group them successfully into the matched group).
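The "Hebbian equation" mentioned here, in its simplest form, strengthens each weight in proportion to the product of its input and the neuron's output: w_i ← w_i + η·x_i·y. A minimal sketch of one update step, where the learning rate, starting weights, and sample values are illustrative choices of mine:

```python
def hebbian_step(weights, inputs, eta=0.1):
    """One plain Hebbian update for a single neuron."""
    # Neuron output: weighted sum of the (normalised) inputs.
    y = sum(w * x for w, x in zip(weights, inputs))
    # Hebb's rule: each weight grows in proportion to the
    # correlation between its input and the neuron's output.
    return [w + eta * x * y for w, x in zip(weights, inputs)]

w = [0.2, 0.4, 0.1, 0.3]      # initial "affection" for each input
x = [0.51, 0.35, 0.14, 0.02]  # one normalised flower sample
w = hebbian_step(w, x)
print(w)
```

With positive inputs and a positive output, every weight grows a little after the step, which is the "neurons that fire together, wire together" intuition in miniature.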

The whole process of Hebbian learning is a systematic one: each group's data set has to be trained separately, and its final updated weights stored for reference. These final weights are the key points in identifying missing properties to some extent; that is, given the desired output and the weights, we can recover a missing input value, and it will be close to some of the values presented as inputs. That is how we say a neuron has learned, to some extent, to identify the group, by performing a comparison with the desired output.
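The per-group procedure described above can be sketched roughly as follows: train one weight vector per group with the plain Hebbian rule, store the final weights, then label a new flower by whichever stored vector responds most strongly. The samples, learning rate, and epoch count are my own illustrative choices; and since plain Hebbian weights grow without bound, the sketch compares responses after normalising each stored weight vector to unit length — an addition of mine, not something the post specifies:

```python
def hebb_train(samples, eta=0.1, epochs=20):
    """Train one weight vector on one group's samples with plain Hebb."""
    w = [0.1] * len(samples[0])
    for _ in range(epochs):
        for x in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))
            w = [wi + eta * xi * y for wi, xi in zip(w, x)]
    return w

# Illustrative normalised samples for two of the three iris groups.
groups = {
    "Iris-setosa":    [[0.51, 0.35, 0.14, 0.02], [0.49, 0.30, 0.14, 0.02]],
    "Iris-virginica": [[0.63, 0.33, 0.60, 0.25], [0.58, 0.27, 0.51, 0.19]],
}
# Each group is trained separately; final weights are stored for reference.
stored = {name: hebb_train(xs) for name, xs in groups.items()}

def respond(w, x):
    # Normalised response, so unbounded weight growth cannot dominate.
    norm = sum(wi * wi for wi in w) ** 0.5
    return sum(wi * xi for wi, xi in zip(w, x)) / norm

def identify(x):
    """Compare the neuron's response under each stored weight vector."""
    return max(stored, key=lambda name: respond(stored[name], x))

print(identify([0.50, 0.34, 0.15, 0.02]))
print(identify([0.60, 0.30, 0.55, 0.22]))
```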
