Artificial intelligence is one of the most transformative branches of computer science. It emerged when man began to wonder how his own brain works, and how the features of the brain could be adopted to solve the most complicated problems. In fact, many eminent researchers, through their outstanding work, have proposed numerous models of the human thought process.
Computational intelligence, artificial neural networks, and the design of neurons, ganglia, and the like are among the most challenging tasks happening at the back end of all this research.
Most neural network studies start with the study of biological neurons, the major components of the human brain, where amazing, mind-boggling mysteries unfold. It is astonishing that these cells carry more than gigabytes of information about the various learning processes of a human being, right from birth, and that they are where all information exchange happens. Given how much importance the neuron has in our lives, it is worth pausing to give worshipful respect to the creator of all these structures.
Classification algorithms built on neurons include the perceptron model, alpha-LMS, steepest descent, back-propagation, and others. The perceptron model involves training a neuron on inputs from the external environment, with a specific weight given to each of these inputs. When we map the perceptron onto our daily routine, it can be visualized as the process we go through when we start learning a new environment.
In this process of learning, we naturally give more importance to certain inputs and constantly observe them over the course of our learning. A similar thing happens in perceptron training: in each iteration we raise the importance of certain inputs by increasing their weights, and lower the importance of others, through the process of weight updating. There are different systematic ways to carry out this critical process: the alpha-LMS approach, back-propagation, steepest descent, and so on. Each of these methods has its own merits and demerits, and the demerits are pointers to further research in this area. The two update rules used in this post are sketched just below.
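As a minimal illustration of the two rules before the full programs (the values here are made up for demonstration and are not from the book), this is how a single weight update looks for one training pattern:

% One weight update for a single pattern. xk is an augmented input
% [bias x1 x2], dk the desired output (0 or 1), w the current weights,
% and eta the learning rate. All values here are illustrative.
xk = [1 2 0.75];  dk = 0;  w = [0 0 0];  eta = 1;

% Perceptron rule: update only when the pattern is misclassified.
y = xk*w';
if y >= 0 && dk == 0
    w = w - eta*xk;    % fired when it should not have: subtract the input
elseif y <= 0 && dk == 1
    w = w + eta*xk;    % failed to fire when it should have: add the input
end

% alpha-LMS rule: always move the weights in proportion to the error,
% with the input normalized by its squared length.
ek = dk - xk*w';
w = w + eta*ek*xk/(norm(xk)^2);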
One simple classification problem using the perceptron is described below. The MATLAB simulation code is also included; it carries out the weight-update process with both the perceptron approach and alpha-LMS.
The problem is to design a neuron, using both the perceptron approach and alpha-LMS, that can distinguish between two triangles located on either side of a line. The inputs are the various training points, and the output is the final set of weights given to each input pattern. The expected output is 1 if the point is above the line and 0 if the point is below it. The image of the problem specification is shown above.
The MATLAB code for perceptron learning and for alpha-LMS is shown below. If we execute it, we get the weights at which the neuron is ready to classify whether a point is above or below the line.
The perceptron code:
% Training patterns: each row is [bias x1 x2]. The original first row repeated
% the class-1 point (2,2) and left p one row longer than d, so it is dropped
% here; p and d now both describe 33 patterns.
p = [1 2 .75
1 2.25 .75
1 2.75 .75
1 2.125 .75
1 3 .75
1 3.25 .75
1 3.5 .75
1 4 .75
1 2.25 .5
1 2.5 .25
1 2.75 0
1 3 -.25
1 3.25 -.25
1 4 .25
1 1 1
1 1.125 1
1 1.25 1
1 1.25 1
1 1.5 1
1 2 1
1 2.25 1
1 2.5 1
1 2.75 1
1 3 1
1 1.25 1.25
1 1.5 .5
1 1.75 1.75
1 2 2
1 2.25 1.75
1 2.5 1.5
1 2.75 .5
1 3 1
1 2.125 1.75];
d = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1];
w = [0 0 0];                % initial weights [bias w1 w2]
eta = 1;                    % learning rate
up = zeros(1, size(p,1));   % per-pattern update flags for this epoch
update = 1;
epoch = 0;
while update == 1 && epoch < 1000   % epoch cap guards against non-separable data
    epoch = epoch + 1;
    for i = 1:size(p,1)
        y = p(i,:)*w';                 % linear activation for pattern i
        if y >= 0 && d(i) == 0         % fired, but the point is below the line
            w = w - eta*p(i,:);
            up(i) = 1;
        elseif y <= 0 && d(i) == 1     % did not fire, but the point is above the line
            w = w + eta*p(i,:);
            up(i) = 1;
        else
            up(i) = 0;                 % correctly classified: no update
        end
    end
    numberofupdates = up * up';        % number of patterns updated this epoch
    if numberofupdates > 0
        update = 1;                    % at least one update: run another epoch
    else
        update = 0;                    % a clean pass: training has converged
    end
end
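Once the loop terminates, w holds the final weights. As a quick usage sketch (the test point below is made up for illustration, not part of the training set), a new point (x1, x2) is classified by the sign of its activation:

% Classify a hypothetical test point (2.5, 1.75) with the learned weights.
test = [1 2.5 1.75];       % augmented input: [bias x1 x2]
y = test*w';
if y >= 0
    disp('above the line (class 1)')
else
    disp('below the line (class 0)')
end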
The alpha-LMS code:
% Same training set as above, again without the stray duplicate of point (2,2).
p = [1 2 .75
1 2.25 .75
1 2.75 .75
1 2.125 .75
1 3 .75
1 3.25 .75
1 3.5 .75
1 4 .75
1 2.25 .5
1 2.5 .25
1 2.75 0
1 3 -.25
1 3.25 -.25
1 4 .25
1 1 1
1 1.125 1
1 1.25 1
1 1.25 1
1 1.5 1
1 2 1
1 2.25 1
1 2.5 1
1 2.75 1
1 3 1
1 1.25 1.25
1 1.5 .5
1 1.75 1.75
1 2 2
1 2.25 1.75
1 2.5 1.5
1 2.75 .5
1 3 1
1 2.125 1.75];
d = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1];
w = [0 0 0];                % initial weights [bias w1 w2]
eta = 1;                    % alpha-LMS step size (stable for 0 < eta < 2)
up = zeros(1, size(p,1));   % per-pattern misclassification flags
update = 1;
epoch = 0;
while update == 1 && epoch < 1000   % epoch cap in case the error never settles
    epoch = epoch + 1;
    for i = 1:size(p,1)
        y = p(i,:)*w';                   % linear output for pattern i
        ek = d(i) - y;                   % error against the desired output
        % Input normalized by its squared length (renamed from pi to xk so
        % MATLAB's built-in constant pi is not shadowed).
        xk = p(i,:)/(norm(p(i,:))^2);
        w = w + eta*ek*xk;               % alpha-LMS weight update
        % LMS drives the outputs toward the targets 0 and 1, so 0.5 (not 0)
        % is the sensible threshold for counting misclassifications here.
        if y >= 0.5 && d(i) == 0
            up(i) = 1;
        elseif y < 0.5 && d(i) == 1
            up(i) = 1;
        else
            up(i) = 0;
        end
    end
    numberofupdates = up * up';
    if numberofupdates > 0
        update = 1;                      % some pattern still misclassified
    else
        update = 0;                      % all patterns on the right side of 0.5
    end
end
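Because alpha-LMS drives the linear output toward the numeric targets 0 and 1 rather than toward a sign, the trained neuron is naturally read out with a 0.5 threshold. A usage sketch along the same lines as above (the test point is again made up):

% Threshold the alpha-LMS output at 0.5 to obtain the class label.
test = [1 2.5 1.75];
y = test*w';
label = y >= 0.5;          % logical 1 if above the line, 0 if below
fprintf('predicted class: %d\n', label)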
The views above follow the descriptions in the neural networks book by Dr. Sathish Kumar and are subject to change in case of any error in the concepts; pointers to errors are most welcome.