Recurrent Neural Networks (Lecture 10 course slides, 神经网络之-递归神经网络课件.ppt)


Recurrent Neural Networks

Classification of NNs
Neural networks fall into two broad classes:
- Feedforward NNs
- Recurrent NNs

The retina as a basic information-processing system
- The retina contains three layers of neural cells (from the bottom up): an outer layer, a middle layer, and a final layer.
- Light information passes from the photoreceptors through the bipolar cells to the ganglion cells; the axons of the ganglion cells converge into the optic nerve, which leaves the eyeball.
- Horizontal cells and amacrine cells modulate the responses of the bipolar and ganglion cells through lateral connections.

Feedforward NNs
The retina can be read as a three-layer feedforward network: ganglion cell layer - inner nuclear layer - outer nuclear layer.
- Within each layer, there are no connections between neurons.
- Each layer finishes its computation and then passes the result to the next layer.

For a layer with input x = (x_1, ..., x_R), weight matrix W, and activation f, the layer computes a = f(Wx); stacking such layers (with per-layer activations f^1, f^2, f^3 and states a^1, a^2, a^3, as in the slide's diagram) gives a^{m+1} = f^{m+1}(W^{m+1} a^m).

Recurrent NNs
Recurrent NNs contain feedback among neurons. How can we derive mathematical models of RNNs?
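As a quick illustration of the layer-by-layer computation just described, here is a minimal Python (NumPy) sketch of a feedforward pass; the layer sizes, the random weights, and the choice of tanh as the activation are assumptions for the example, not values from the slides.

```python
import numpy as np

def feedforward(x, weights, f=np.tanh):
    """Propagate input x through successive layers: a_{m+1} = f(W_{m+1} a_m).

    Neurons within a layer are not connected to each other; each layer
    computes its output and passes it on to the next layer.
    """
    a = x
    for W in weights:
        a = f(W @ a)  # one layer: weighted sum followed by activation
    return a

# Example: a 3-layer network with assumed random weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 5)),
           rng.standard_normal((3, 4)),
           rng.standard_normal((2, 3))]
x = rng.standard_normal(5)
print(feedforward(x, weights))
```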

Recurrent NNs: a two-neuron example
Consider two neurons with activations f_1, f_2, states x_1(k), x_2(k) at step k, and feedback weights w_{11}, w_{12}, w_{21}, w_{22}. The net inputs are

  n_1 = w_{11} x_1(k) + w_{12} x_2(k)
  n_2 = w_{21} x_1(k) + w_{22} x_2(k)

so the next states are

  x_1(k+1) = f_1(w_{11} x_1(k) + w_{12} x_2(k))
  x_2(k+1) = f_2(w_{21} x_1(k) + w_{22} x_2(k))

In general, for n neurons,

  x_i(k+1) = f_i( \sum_{j=1}^{n} w_{ij} x_j(k) ),   i = 1, ..., n,

or in vector form

  x(k+1) = f(W x(k)).
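To make the componentwise and matrix forms concrete, here is a minimal sketch checking that the two formulations agree; the specific weights and the tanh activation are assumptions for illustration.

```python
import numpy as np

W = np.array([[0.5, -0.3],
              [0.2,  0.4]])  # assumed feedback weights w_ij
f = np.tanh                   # assumed activation for both neurons

def step_componentwise(x):
    # x_i(k+1) = f_i( sum_j w_ij x_j(k) )
    return np.array([f(W[0, 0]*x[0] + W[0, 1]*x[1]),
                     f(W[1, 0]*x[0] + W[1, 1]*x[1])])

def step_vector(x):
    # x(k+1) = f(W x(k))
    return f(W @ x)

x = np.array([1.0, -2.0])
assert np.allclose(step_componentwise(x), step_vector(x))
print(step_vector(x))
```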

Discrete-Time RNNs
Adding a bias vector b to each neuron gives the discrete-time RNN model

  x(k+1) = f(W x(k) + b).

Network computing: starting from an initial state x(0), the network generates the sequence x(0), x(1), ..., x(8), ... . If this sequence converges to a state x*, the RNN computes a map from input to output:

  RNN: input x(0) -> output x*.

Computing: Discrete or Continuous?
Computation can run in discrete time or in continuous time. How can we derive continuous-time computing models of RNNs?
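The "network computing" idea, iterate the update until the state stops changing, can be sketched as follows; the weights, bias, tolerance, and tanh activation are assumptions for the example.

```python
import numpy as np

def run_rnn(x0, W, b, f=np.tanh, tol=1e-10, max_steps=10_000):
    """Iterate x(k+1) = f(W x(k) + b) from x(0) until (hopefully) convergence.

    Returns the final state x*, which plays the role of the network's output.
    Convergence is not guaranteed for arbitrary W; this sketch simply stops
    when successive states are closer than tol.
    """
    x = x0
    for _ in range(max_steps):
        x_next = f(W @ x + b)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x  # did not converge within max_steps

W = np.array([[0.3, 0.1], [0.0, 0.2]])  # assumed: small weights, so the map contracts
b = np.array([0.5, -0.5])
print(run_rnn(np.zeros(2), W, b))       # the computed output x*
```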

From Discrete Computing to Continuous Computing
The idea is to change the time step. First rewrite the discrete update as the current state plus a correction:

  x(t+1) = f(W x(t) + b)
  x(t+1) = x(t) + 1 · [ -x(t) + f(W x(t) + b) ]

Then replace the unit step by a step of size Δt:

  x(t+Δt) = x(t) + Δt [ -x(t) + f(W x(t) + b) ],

so that

  ( x(t+Δt) - x(t) ) / Δt = -x(t) + f(W x(t) + b).

Letting Δt -> 0 gives the continuous computing RNN

  dx(t)/dt = -x(t) + f(W x(t) + b).

Recurrent NNs: the general model
A general RNN model is

  da(t)/dt = g(a(t), p(t), t),

where a(t) is the network state, p(t) is the network input, and t is the network time.

What is the output of an RNN? If the state converges, the output is its limit:

  a(t) -> a*  as  t -> ∞.
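The derivation above is exactly the explicit Euler scheme read in reverse. A minimal sketch with an assumed two-neuron network shows the discrete update with step size Δt approaching the continuous trajectory as Δt shrinks:

```python
import numpy as np

W = np.array([[0.0, 0.8], [0.8, 0.0]])  # assumed weights
b = np.array([0.1, -0.1])               # assumed bias
f = np.tanh

def euler_trajectory(x0, dt, T):
    """x(t+dt) = x(t) + dt * (-x(t) + f(W x(t) + b)); dt = 1 is the discrete RNN."""
    x, n = x0.copy(), int(T / dt)
    for _ in range(n):
        x = x + dt * (-x + f(W @ x + b))
    return x

x0 = np.array([1.0, -1.0])
for dt in (1.0, 0.1, 0.01, 0.001):
    print(dt, euler_trajectory(x0, dt, T=10.0))  # values settle as dt -> 0
```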

Convergence of RNNs
For the model da(t)/dt = g(a(t), p(t), t) with output a(t) -> a*: does the state converge?

Equilibrium point: a* is an equilibrium point if

  g(a*, p(t), t) = 0  for all t ≥ 0.

Trajectories
Given any initial condition a(0) = a_0, there is a trajectory a(t; 0, a_0), t ≥ 0, in the state space R^n. By uniqueness of solutions, if a_0^1 ≠ a_0^2, then a(t; 0, a_0^1) ≠ a(t; 0, a_0^2) for any t ≥ 0: trajectories starting from different initial conditions never meet.

A Simple Example

  da(t)/dt = -a(t) + p,   a(t) = e^{-t} a(0) + (1 - e^{-t}) p.
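As a one-line check that the stated solution really solves the example (a routine verification, not from the slides):

```latex
% Verify a(t) = e^{-t} a(0) + (1 - e^{-t}) p solves da/dt = -a + p:
\frac{d}{dt}\Bigl(e^{-t}a(0) + (1-e^{-t})p\Bigr)
  = -e^{-t}a(0) + e^{-t}p
  = -\bigl(e^{-t}a(0) + (1-e^{-t})p\bigr) + p
  = -a(t) + p,
\qquad a(0) = e^{0}a(0) + (1-e^{0})p = a(0).
```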

Equilibrium Points
For da(t)/dt = g(a(t), p(t), t), the point a* is an equilibrium point if g(a*, p(t), t) = 0 for all t ≥ 0.

In the simple example da(t)/dt = -a(t) + p, setting the right-hand side to zero gives the unique equilibrium a* = p, and the solution a(t) = e^{-t} a(0) + (1 - e^{-t}) p shows that a(t) -> p = a* as t -> ∞.

Convergence of RNNs: Attractors
An equilibrium to which all nearby trajectories converge is an attractor. Does each trajectory of an RNN converge to an equilibrium? Two methods:
1. Solving the differential equation directly;
2. The energy method.

Method One: Solving Differential Equations
A simple example: da(t)/dt = -a(t) + p has the explicit solution a(t) = e^{-t} a(0) + (1 - e^{-t}) p, so every trajectory converges to the equilibrium a* = p.
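A minimal numerical sketch of this example (p, a(0), and the integration step are assumed values) confirming that every trajectory approaches a* = p:

```python
import numpy as np

p, a0, dt, T = 2.0, -1.0, 1e-3, 10.0

# Integrate da/dt = -a + p with explicit Euler.
a = a0
for _ in range(int(T / dt)):
    a += dt * (-a + p)

closed_form = np.exp(-T) * a0 + (1 - np.exp(-T)) * p
print(a, closed_form, p)  # all three agree closely; a(t) -> a* = p
```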

Linear RNNs

  da(t)/dt = -a(t) + W a(t) + p,   a(t) -> a*.

For two neurons, written componentwise:

  da_1(t)/dt = -a_1(t) + a_2(t) + p_1
  da_2(t)/dt = a_1(t) - a_2(t) + p_2

http://hebb.mit.edu/people/seung/index.html

How the brain keeps the eyes still
H. S. Seung, "How the brain keeps the eyes still," Proc. Natl. Acad. Sci. USA, vol. 93, pp. 13339-13344, 1996.

  v_i = v_i^0 + k_i E

where v_i is the firing rate, v_i^0 is the firing rate at central gaze, and k_i is the position sensitivity (E is the horizontal eye position).

ABSTRACT: The brain can hold the eyes still because it stores a memory of eye position. The brain's memory of horizontal eye position appears to be represented by persistent neural activity in a network known as the neural integrator, which is localized in the brainstem and cerebellum. Existing experimental data are reinterpreted as evidence for an "attractor hypothesis" that the persistent patterns of activity observed in this network form an attractive line of fixed points in its state space. Line attractor dynamics can be produced in linear or nonlinear neural networks by learning mechanisms that precisely tune positive feedback.

Line Attractor (Seung 1996)

  dx_i(t)/dt = -x_i(t) + \sum_{j=1}^{n} w_{ij} x_j + b_i,

with the steady activities encoding eye position through x_i = k_i E.
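Here is a minimal sketch of line-attractor dynamics in the linear case; the weight construction W = ξξᵀ/|ξ|² and all numbers are illustrative assumptions, not Seung's fitted model. With feedback tuned so that Wξ = ξ, the dynamics dx/dt = -x + Wx have eigenvalue 0 along ξ (persistent activity) and -1 transverse to it (decay onto the line).

```python
import numpy as np

n = 5
rng = np.random.default_rng(1)
xi = rng.standard_normal(n)           # assumed direction of the line attractor
W = np.outer(xi, xi) / (xi @ xi)      # feedback tuned so that W xi = xi

def simulate(x0, dt=1e-2, T=20.0):
    # dx/dt = -x + W x : eigenvalue 0 along xi (persistent), -1 transverse (decaying)
    x = x0.copy()
    for _ in range(int(T / dt)):
        x += dt * (-x + W @ x)
    return x

x0 = rng.standard_normal(n)
x_final = simulate(x0)
# The final state lies on the line spanned by xi, at the projection of x0:
print(np.allclose(x_final, (xi @ x0) / (xi @ xi) * xi, atol=1e-6))
```

Any value of the projection c = ξᵀx/|ξ|² persists indefinitely; this persistence is what lets the network store an analog memory such as eye position.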

Reading text (文字阅读)
- How the eye moves during reading: although people always read text in a fixed order, capturing readers' gaze fixations reveals that visual attention jumps around; the gaze concentrates on specific content only when the brain finds a shape close to past experience and memory.
- Completion (Gestalt) theory: human cognition has a powerful ability to "fill in" what is missing.
  1. A deliberately scrambled Chinese sentence from the slide: "研表究明,汉字序顺并不定一影阅响读!事证实明了当你看这完句话之后才发字现都乱是的" (roughly: "Research shows that the order of Chinese characters does not necessarily affect reading! This sentence proves it: only after you finish reading it do you notice that the characters are scrambled.")
  2. "Hvae a ncie day Hpoe you konw the ifnomariton." (deliberately scrambled English)

Reading experiment (阅读实验)

Linear RNNs: pattern analysis and synthesis
H. S. Seung, "Pattern analysis and synthesis in attractor neural networks," in K.-Y. M. Wong, I. King, and D.-Y. Yeung, editors, Theoretical Aspects of Neural Computation: A Multidisciplinary Perspective, Singapore, 1997. Springer-Verlag.

Analysis and Synthesis.

Abstract: The representation of hidden variable models by attractor neural networks is studied. Memories are stored in a dynamical attractor that is a continuous manifold of fixed points, as illustrated by linear and nonlinear networks with hidden neurons. Pattern analysis and synthesis are forms of pattern completion by recall of a stored memory. Analysis and synthesis in the linear network are performed by bottom-up and top-down connections. In the nonlinear network, the analysis computation additionally requires rectification nonlinearity and inner product inhibition between hidden neurons.

Energy function (shown graphically on the slides; the equation is not recoverable from the extracted text).
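To give the "analysis by bottom-up, synthesis by top-down connections" idea a concrete form, here is a minimal sketch; the choice of a single matrix U used in both directions, and all dimensions, are assumptions for illustration rather than the paper's actual network.

```python
import numpy as np

rng = np.random.default_rng(2)
U = np.linalg.qr(rng.standard_normal((8, 3)))[0]  # assumed: 3 orthonormal columns

def analysis(x):
    # bottom-up: infer hidden variables from a visible pattern
    return U.T @ x

def synthesis(h):
    # top-down: generate a visible pattern from hidden variables
    return U @ h

# Pattern completion: a noisy pattern is mapped back near the stored manifold.
x = synthesis(np.array([1.0, -0.5, 2.0])) + 0.01 * rng.standard_normal(8)
x_completed = synthesis(analysis(x))
print(np.linalg.norm(x - x_completed))  # small: x was (nearly) on the manifold
```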

Representing part-whole relationships in recurrent neural networks
V. Jain, V. Zhigulin, and H. S. Seung, "Representing part-whole relationships in recurrent neural networks," Adv. Neural Info. Proc. Syst. 18, 563-570 (2006).

Abstract: There is little consensus about the computational function of top-down synaptic connections in the visual system. Here we explore the hypothesis that top-down connections, like bottom-up connections, reflect part-whole relationships. We analyze a recurrent network with bidirectional synaptic interactions between a layer of neurons representing parts and a layer of neurons representing wholes. Within each layer, there is lateral inhibition. When the network detects a whole, it can rigorously enforce part-whole relationships by ignoring parts that do not belong. The network can complete the whole by filling in missing parts. The network can refuse to recognize a whole, if the activated parts do not conform to a stored part-whole relationship. Parameter regimes in which these behaviors happen are identified using the theory of permitted and forbidden sets. The network behaviors are illustrated by recreating Rumelhart and McClelland's "interactive activation" model.

Method Two: Energy Functions Method

Energy Function Method
- Lyapunov method (A. M. Lyapunov)
- Stability theory
- LaSalle invariance principle
- Energy function

applied to the RNN model da(t)/dt = g(a(t), p(t), t).
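Before the theorem on the next slide, it helps to state what "energy function" means here; the following is the standard Lyapunov-style condition, filled in because the slide's own equation is garbled in the extracted text:

```latex
% E is an energy function for dx/dt = f(x) if it decreases along trajectories:
\frac{d}{dt}E\bigl(x(t)\bigr) \;=\; \nabla E\bigl(x(t)\bigr)^{\top} f\bigl(x(t)\bigr) \;\le\; 0
\quad \text{for all } t \ge 0,
% with equality only at equilibrium points, i.e. where f(x) = 0.
```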

Energy Function Method
Theorem. Consider the RNN model dx(t)/dt = f(x(t)). Suppose that each trajectory of the RNN is bounded, and that there exists an energy function E(x) with dE(x(t))/dt ≤ 0 along every trajectory. Then each trajectory converges to an equilibrium point.

Example

  dx(t)/dt = -x(t) + (5 ln 2 / 3) · (e^{x(t)} - e^{-x(t)}) / (e^{x(t)} + e^{-x(t)})

Writing g(x) = (e^x - e^{-x}) / (e^x + e^{-x}), the model reads dx(t)/dt = -x(t) + (5 ln 2 / 3) g(x(t)).

Equilibrium points: 0, ln 2, -ln 2. (Check: g(ln 2) = (2 - 1/2) / (2 + 1/2) = 3/5, so (5 ln 2 / 3) g(ln 2) = ln 2.)

An energy function:

  V(x) = \int_0^x s g'(s) ds - (1/2)(5 ln 2 / 3) g(x)^2,

so that

  V'(x) = g'(x) ( x - (5 ln 2 / 3) g(x) ),   g'(x) = 4 / (e^x + e^{-x})^2 > 0,

and along trajectories dV/dt = V'(x) · dx/dt = -g'(x) (dx/dt)^2 ≤ 0.
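A quick numerical sketch (step size, horizon, and initial condition assumed) checking the equilibria and that V never increases along a trajectory:

```python
import numpy as np

c = 5 * np.log(2) / 3
g = np.tanh                                     # (e^x - e^{-x}) / (e^x + e^{-x})
dg = lambda x: 4 / (np.exp(x) + np.exp(-x))**2  # g'(x)

rhs = lambda x: -x + c * g(x)                   # dx/dt
for x_star in (0.0, np.log(2), -np.log(2)):
    assert abs(rhs(x_star)) < 1e-12             # the three equilibrium points

def V(x, n=2000):
    """V(x) = int_0^x s g'(s) ds - (c/2) g(x)^2 (trapezoid rule for the integral)."""
    s = np.linspace(0.0, x, n)
    w = s * dg(s)
    integral = (w[:-1] + w[1:]).sum() * (s[1] - s[0]) / 2
    return integral - 0.5 * c * g(x)**2

x, dt, energies = 2.0, 1e-3, []
for k in range(20_000):
    if k % 20 == 0:
        energies.append(V(x))
    x += dt * rhs(x)

print(x)  # approaches the attractor ln 2 ~ 0.6931
print(all(b <= a + 1e-9 for a, b in zip(energies, energies[1:])))  # V never increases
```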

Moreover V'(0) = 0 and V''(0) = 1 - 5 ln 2 / 3 < 0, so x = 0 is a local maximum of V and hence an unstable equilibrium, while x = ln 2 and x = -ln 2 are local minima of V. Since every trajectory is bounded and V decreases along trajectories, each trajectory of dx(t)/dt = -x(t) + (5 ln 2 / 3) g(x(t)) converges to an equilibrium, and the attractors are ±ln 2.

Reference Books
1. Zhang Yi and K. K. Tan, Convergence Analysis of Recurrent Neural Networks, Kluwer Academic Publishers, ISBN 1-4020-7694-0, 2004.
2. H. J. Tang, K. C. Tan, and Zhang Yi, Neural Networks: Computational Models and Applications, Springer-Verlag, ISBN 978-3-540-69225-6, 2007.

Thank you for watching!
