
Neural Network & GA Assignment

R.Bhalamurugan
1671210007

Tool Used: nntool

Literature: Analyzing Students' Cognitive Load to Prioritize English Public Speaking

Training Data:

Using command-line programming with the same data:


% B_Input_N = [0.0948 0.0395 0.0352 0.0416 0.0273 0.0434 0.0312 0.0704 0.0447 0.0719 0.0743 0.1551 0.0732 0.032 0.0174 0.1299 0.0576 0.0885 0.0606 0.0371 0.2232 0.0184 0.0184 0.0834 0.1061 0.0282 0.0819 0.0241 0.0292;
% 0.1292 0.0445 0.0399 0.1094 0.0273 0.0285 0.3531 0.0474 0.1192 0.0315 0.0381 0.3353 0.3001 0.0933 0.0174 0.0537 0.0254 0.0414 0.0273 0.0371 0.3704 0.0184 0.2587 0.0345 0.3463 0.0282 0.0372 0.0684 0.0672;
% 0.2087 0.0395 0.0485 0.0416 0.0273 0.0259 0.1304 0.1853 0.0635 0.0936 0.0298 0.198 0.0934 0.1411 0.032 0.1299 0.0298 0.0885 0.0899 0.0371 0.1189 0.0355 0.0352 0.0401 0.2335 0.0282 0.042 0.0241 0.3761;
% 0.3162 0.0395 0.0813 0.2528 0.0273 0.2252 0.0337 0.2751 0.0593 0.0324 0.1356 0.0354 0.032 0.0361 0.1377 0.1299 0.0468 0.1277 0.0348 0.0371 0.1189 0.1264 0.0309 0.0345 0.1421 0.0662 0.3789 0.0241 0.2266;
% 0.055 0.0863 0.0674 0.0416 0.0704 0.0293 0.0207 0.0412 0.0411 0.048 0.0354 0.0314 0.0397 0.0825 0.2706 0.0537 0.0565 0.0324 0.0329 0.0865 0.0272 0.0624 0.229 0.0345 0.0519 0.0459 0.0216 0.3413 0.0769;
% 0.0743 0.1205 0.1068 0.0416 0.0704 0.0555 0.0388 0.0521 0.0368 0.4326 0.0487 0.0452 0.0989 0.3544 0.2706 0.0239 0.1474 0.0513 0.0393 0.0915 0.0272 0.0355 0.1953 0.0834 0.0304 0.107 0.0216 0.1871 0.1364;
% 0.029 0.21 0.079 0.2528 0.4005 0.1177 0.0741 0.164 0.1222 0.0261 0.2477 0.0857 0.1192 0.0361 0.0458 0.4014 0.2899 0.0233 0.3652 0.1768 0.0272 0.1338 0.11 0.3653 0.0524 0.1606 0.0725 0.0916 0.0292;
% 0.0443 0.21 0.3696 0.1094 0.1961 0.3568 0.2188 0.1015 0.3283 0.2307 0.3162 0.0466 0.0507 0.1925 0.1377 0.0537 0.1936 0.2006 0.1361 0.32 0.0598 0.3653 0.0687 0.182 0.0203 0.3839 0.2305 0.1871 0.0292;
% 0.0486 0.21 0.1725 0.1094 0.1536 0.1177 0.0993 0.0628 0.1849 0.0333 0.0743 0.0672 0.1927 0.032 0.0706 0.0239 0.153 0.3463 0.2138 0.1768 0.0272 0.2042 0.0538 0.1426 0.017 0.1517 0.1139 0.0521 0.0292];
% Training patterns (domain values)
%
% B_Target_N = [0.6485 1 0.677 0.926 0.4998 0.578 0.5666 0.919 0.7361 0.441 0.6865 0.3864 0.7723 0.9032 0.824 0.376 0.927 0.413 0.5314 0.7901 0 0.5414 0.8833 0.6167 0.3423 0.6218 0.3359 0.6291 0.5247;
% 19 31 20 29 8 14 13 28 22 7 21 5 23 27 25 4 30 6 11 24 1 12 26 15 3 16 2 18 9];
% Training targets (range values)
B_Input_N = xlsread('NNet.xlsx', 1);  % training inputs (sheet 1)
B_Target_N = xlsread('NNet.xlsx', 2); % training targets (sheet 2)
P = B_Input_N;
T = B_Target_N;
% Two-layer feed-forward network: 10 tansig hidden neurons, one purelin output
net = newff(minmax(P),[10 1],{'tansig' 'purelin'});
%Plot the original data points and the untrained output
Y = sim(net,P);
figure(1);
plot(P,Y,'o',P,T,'p');
title('Data and Untrained Network Output');
%Train the network and plot the results
net.trainParam.goal = 0.01;   % 0 is the default, which is too small
net.trainParam.epochs = 50;   % for our sample, don't train too long
net.trainParam.lr = 0.001;    % learning rate
net = train(net,P,T);
X = xlsread('NNet.xlsx', 3);  % test inputs (sheet 3)
% X=[0.064 0.0386;
% 0.064 0.071;
% 0.0474 0.0802;
% 0.1333 0.0386;
% 0.0329 0.3353;
% 0.0329 0.0386;
% 0.3781 0.1766;
% 0.2301 0.1307;
% 0.0173 0.0906];
%New Domain Points

Y = sim(net,X); %Network Output


figure(2);
% plot(P,T,'p');
%hold on;
plot(X,Y,'-o');
title('Output after training');
%hold off;
%plot(P,T(1,:),X,Y(1,:));
% hold on;
% plot(P,T(2,:),X,Y(2,:));

% hold off;
% An alternative way to test training: postreg
figure(3)
Tout=sim(net,P); %Get network output for the training domain
[m,b,r]=postreg(T,Tout); %Performs a linear regression
disp(Y); % show the network output for the test points

Output:

Genetic Algorithm:
Tool Used: gatool
Fitness Function:
function y = Bga(x)
% ga minimizes the fitness value, so negating (x^2 - 1) maximizes x^2 - 1
y = -(x^2 - 1);
end
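
The function handle above can be passed straight to ga. A minimal sketch, assuming the Global Optimization Toolbox is on the path and a single optimization variable (the default bounds and initial range then apply):

```matlab
% Minimize Bga over one variable; since Bga negates (x^2 - 1),
% ga effectively searches for large values of x^2 - 1.
[x, fval] = ga(@Bga, 1);
```

Note that x^2 - 1 is unbounded above, so without explicit lower and upper bounds the result depends on ga's default population range and stopping criteria.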

Initial Population:
function Pp = Pop(~)
% Custom creation function: returns the initial population,
% one individual per row (here a single 1-by-10 individual).
% Pp = [0.0948 0.0395 0.0352 0.0416 0.0273 0.0434 0.0312 0.0704 0.0447 0.0719 0.0743 0.1551 0.0732 0.032 0.0174 0.1299 0.0576 0.0885 0.0606 0.0371 0.2232 0.0184 0.0184 0.0834 0.1061];
Pp = [0.0948 0.0395 0.0352 0.0416 0.0273 0.0434 0.0312 0.0704 0.0447 0.0719];
end

Program:
function [x,fval,exitflag,output,population,score] = ...
    B_GA(nvars,PopulationSize_Data,CrossoverFraction_Data,Generations_Data, ...
    TolFun_Data,InitialPopulation_Data)
% Start with the default options
options = gaoptimset;
% Modify options settings
options = gaoptimset(options,'PopulationType', 'custom');
options = gaoptimset(options,'PopulationSize', PopulationSize_Data);
options = gaoptimset(options,'CrossoverFraction', CrossoverFraction_Data);
options = gaoptimset(options,'Generations', Generations_Data);
options = gaoptimset(options,'TolFun', TolFun_Data);
options = gaoptimset(options,'InitialPopulation', InitialPopulation_Data);
options = gaoptimset(options,'CreationFcn', @Pop);
options = gaoptimset(options,'FitnessScalingFcn', @fitscalingrank);
options = gaoptimset(options,'CrossoverFcn', @crossovertwopoint);
options = gaoptimset(options,'MutationFcn', @mutationadaptfeasible);
options = gaoptimset(options,'Display', 'iter');
options = gaoptimset(options,'PlotFcns', ...
    { @gaplotbestf @gaplotbestindiv @gaplotscores });
options = gaoptimset(options,'OutputFcns', { [] });
[x,fval,exitflag,output,population,score] = ...
    ga(@Bga,nvars,[],[],[],[],[],[],[],options);
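
B_GA can then be invoked from the command line. The argument values below are illustrative only (they are not taken from the assignment); nvars is set to 10 to match the 1-by-10 individual returned by Pop:

```matlab
% 10 variables, population of 25, 80% crossover, 100 generations,
% function tolerance 1e-6, initial population supplied by Pop
[x, fval] = B_GA(10, 25, 0.8, 100, 1e-6, Pop());
```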
