MATLAB Neural Networks and Their Applications: the BP Network as an Example
Speaker: Wang Maozhi, Associate Professor

1 A Prediction Problem
Given: a set of reference input and output data (see the attachment)
Goal: predict the outputs corresponding to another set of inputs
Background: omitted

2 BP Networks

3 The newff Command in MATLAB
NEWFF Create a feed-forward backpropagation network.
Syntax
net = newff
net = newff(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF)

Parameters of the newff command
NET = NEWFF creates a new network with a dialog box.
NEWFF(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF) takes,
PR  - Rx2 matrix of min and max values for R input elements.
Si  - Size of ith layer, for Nl layers.
TFi - Transfer function of ith layer, default = 'tansig'.
BTF - Backprop network training function, default = 'trainlm'.
BLF - Backprop weight/bias learning function, default = 'learngdm'.
PF  - Performance function, default = 'mse'.
and returns an N layer feed-forward backprop network.

Parameter notes
The transfer functions TFi can be any differentiable transfer function such as TANSIG, LOGSIG, or PURELIN.
The training function BTF can be any of the backprop training functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.

Parameter notes
*WARNING*: TRAINLM is the default training function because it is very fast, but it requires a lot of memory to run. If you get an "out-of-memory" error when training, try one of these:
(1) Slow TRAINLM training, but reduce memory requirements, by setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
(2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
(3) Use TRAINRP, which is slower but more memory efficient than TRAINBFG.
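The first workaround above is a one-line setting before training; a minimal sketch, assuming net has already been created with newff:

    net.trainParam.mem_reduc = 2;   % trade TRAINLM speed for lower memory use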
Parameter notes
The learning function BLF can be either of the backpropagation learning functions, LEARNGD or LEARNGDM.
The performance function PF can be any of the differentiable performance functions such as MSE or MSEREG.
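A minimal sketch of the syntax above; the input ranges, layer sizes, and transfer functions here are made-up illustration values, not the lecture's data:

    pr  = [0 10; -1 1];                          % Rx2 min/max matrix for R = 2 inputs
    net = newff(pr,[5 1],{'tansig' 'purelin'});  % 5 hidden neurons, 1 linear output

With no further arguments the defaults apply: trainlm for training, learngdm for learning, and mse as the performance function.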
4 The train Command in MATLAB
TRAIN Train a neural network.
Syntax
[net,tr,Y,E,Pf,Af] = train(NET,P,T,Pi,Ai,VV,TV)
Description
TRAIN trains a network NET according to NET.trainFcn and NET.trainParam.

Input parameters
TRAIN(NET,P,T,Pi,Ai) takes,
NET - Network.
P   - Network inputs.
T   - Network targets, default = zeros.
Pi  - Initial input delay conditions, default = zeros.
Ai  - Initial layer delay conditions, default = zeros.
VV  - Structure of validation vectors, default = [].
TV  - Structure of test vectors, default = [].

Output parameters
and returns,
NET - New network.
TR  - Training record (epoch and perf).
Y   - Network outputs.
E   - Network errors.
Pf  - Final input delay conditions.
Af  - Final layer delay conditions.

Notes
Note that T is optional and need only be used for networks that require targets.
Pi and Pf are also optional and need only be used for networks that have input or layer delays.

Input data structures
The cell array format is easiest to describe. It is most convenient for networks with multiple inputs and outputs, and allows sequences of inputs to be presented:
P  - NixTS cell array, each element P{i,ts} is an RixQ matrix.
T  - NtxTS cell array, each element T{i,ts} is a VixQ matrix.
Pi - NixID cell array, each element Pi{i,k} is an RixQ matrix.
Ai - NlxLD cell array, each element Ai{i,k} is an SixQ matrix.
Y  - NOxTS cell array, each element Y{i,ts} is a UixQ matrix.
E  - NtxTS cell array, each element E{i,ts} is a VixQ matrix.
Pf - NixID cell array, each element Pf{i,k} is an RixQ matrix.
Af - NlxLD cell array, each element Af{i,k} is an SixQ matrix.

Where:
Ni = net.numInputs
Nl = net.numLayers
Nt = net.numTargets
ID = net.numInputDelays
LD = net.numLayerDelays
TS = number of time steps
Q  = batch size
Ri = net.inputs{i}.size
Si = net.layers{i}.size
Vi = net.targets{i}.size
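For a static single-input network the cell arrays above reduce to plain matrices: P is an RxQ matrix with one sample per column and T is VxQ. A toy sketch (the data here are invented for illustration):

    p = [0 1 2 3 4 5; 5 4 3 2 1 0];     % R = 2 inputs, Q = 6 samples
    t = sum(p,1);                       % one target per sample
    net = newff(minmax(p),[5 1],{'tansig' 'purelin'});
    net.trainParam.epochs = 50;
    [net,tr] = train(net,p,t);          % tr holds the training record (epoch, perf)
    y = sim(net,p);                     % network outputs for the training inputs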
5 Implementation
Data processing and preparation
Convert the Word data to TXT file format
Read the data with dlmread
Decide whether to normalize the data (see the sketch below)
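A sketch of the reading step, assuming the Word tables were saved as delimited text files named in.txt and out.txt (hypothetical names), one sample per row:

    in  = dlmread('in.txt');    % inputs: one sample per row, 8 columns
    out = dlmread('out.txt');   % matching targets
    % optional normalization: scale each input column by its maximum
    % in = in ./ repmat(max(in), size(in,1), 1);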
Creating the network
Prepare everything needed to call newff
Build the pr matrix (see the sketch below)
Decide on the network structure: the number of layers and the number of neurons in each layer
Choose the transfer function of each layer
Pay attention to the meaning of each parameter
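The pr matrix simply pairs each input's minimum and maximum. With the samples stored one per row in the matrix in, it can be built in one line (a sketch; the toolbox helper minmax does the same for data stored one sample per column):

    pr = [min(in)' max(in)'];   % Rx2 matrix: one [min max] row per input element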
Training the network
Prepare the data for calling train
Choose the input samples
Choose the target outputs
Set the training parameters (e.g. the number of epochs): net.trainParam.epochs=100
Call the training command: net=train(net,p,t);

Simulating the outputs
Call y=sim(net,p) to simulate the outputs
Plot the results for comparison

Inspecting network parameters and weights
net
Referencing and viewing the fields of net (see the sketch below)
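The network is a structure whose fields can be referenced directly; a sketch of typical inspections after training:

    net                % print the whole network object
    net.IW{1,1}        % input weights into layer 1
    net.LW{2,1}        % weights from layer 1 into layer 2
    net.b{1}           % bias vector of layer 1
    net.trainParam     % training parameters currently in effect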
6 Prediction and Analysis
Run sim on the new inputs
Retrain and run sim again
Plot the results for comparison (a sketch of all three steps follows)
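A sketch of these steps, assuming p_pre holds the new inputs in the same orientation as the training inputs p (the variable name is an assumption):

    y_pre = sim(net,p_pre);                 % predict outputs for the new inputs
    net = train(net,p,t);                   % retrain on the same data
    y_pre2 = sim(net,p_pre);                % predict again after retraining
    plot(y_pre'); hold on; plot(y_pre2');   % compare the two prediction runs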
7 Program Implementation

clc
clear all
clear net

load data;      % provides the training inputs (in) and targets (out)
load data_pre;  % inputs for the prediction stage

c1=in(:,1); c2=in(:,2); c3=in(:,3); c4=in(:,4);
c5=in(:,5); c6=in(:,6); c7=in(:,7); c8=in(:,8);

c1_max=max(c1); c2_max=max(c2); c3_max=max(c3); c4_max=max(c4);
c5_max=max(c5); c6_max=max(c6); c7_max=max(c7); c8_max=max(c8);

% optional normalization (divide each column by its maximum):
% c1=c1/c1_max; c2=c2/c2_max; c3=c3/c3_max; c4=c4/c4_max;
% c5=c5/c5_max; c6=c6/c6_max; c7=c7/c7_max; c8=c8/c8_max;
% in(:,1)=c1; in(:,2)=c2; in(:,3)=c3; in(:,4)=c4;
% in(:,5)=c5; in(:,6)=c6; in(:,7)=c7; in(:,8)=c8;

c1_min=min(c1); c2_min=min(c2); c3_min=min(c3); c4_min=min(c4);
c5_min=min(c5); c6_min=min(c6); c7_min=min(c7); c8_min=min(c8);

% Rx2 matrix of per-input [min max] ranges required by newff
pr=[c1_min,c1_max;c2_min,c2_max;c3_min,c3_max;c4_min,c4_max;
    c5_min,c5_max;c6_min,c6_max;c7_min,c7_max;c8_min,c8_max];

% newff and train expect one sample per column, so a transpose
% (p=in'; t=out';) may be needed depending on how the data are stored
p=in;
t=out;

% three layers with 8, 11 and 4 neurons, all using logsig
net=newff(pr,[8 11 4],{'logsig' 'logsig' 'logsig'});
net.trainParam.epochs=100;
net=train(net,p,t);
y=sim(net,p);
% plot(t);
% figure;
% plot(y);
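Note that the script loads data_pre but never uses it; the prediction step presumably follows the same pattern as above. A sketch, assuming data_pre supplies a matrix in_pre of new inputs with the same orientation as p (the variable name is an assumption):

    y_pre = sim(net,in_pre);    % predicted outputs for the new input set
    figure; plot(y_pre');       % inspect the predictions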