AI&NN Notes  Chapter 9  Feedback Neural Networks

9.1 Basic Concepts

Attractor: a state toward which the system evolves in time, starting from certain initial conditions.
Basin of attraction: the set of initial conditions which initiates an evolution terminating in the attractor.
Fixed point: an attractor that is in the form of a unique point in state space.
Limit cycle: an attractor that consists of a periodic sequence of states.

Hopfield Network and its Basic Assumptions

[Figure: single-layer feedback network of n neurons; each output v_j is fed back through weights w_ij to the inputs of the other neurons, with thresholds T_i and external inputs i_i.]

1. 1 layer, n neurons
2. T_i - threshold of neuron i
3. w_ij - weight from neuron j to neuron i
4. v_j - output of neuron j
5. i_i - external input to the i-th neuron
The total input of the i-th neuron is

net_i = \sum_{j=1, j \neq i}^{n} w_{ij} v_j + i_i - T_i = W_i^t v + i_i - T_i,   for i = 1, 2, ..., n,

where

W_i = [w_{i1}  w_{i2}  \cdots  w_{in}]^t,   v = [v_1  v_2  \cdots  v_n]^t.

The complete matrix description of the linear portion of the system shown in the figure is given by

net = W v + i - t,

where net = [net_1  net_2  \cdots  net_n]^t and i = [i_1  i_2  \cdots  i_n]^t are the vectors containing the activations and the external inputs of the neurons, respectively, and t = [T_1  T_2  \cdots  T_n]^t is the threshold vector. W is an n \times n matrix containing the network weights:

W = \begin{bmatrix} W_1^t \\ W_2^t \\ \vdots \\ W_n^t \end{bmatrix}
  = \begin{bmatrix}
      0      & w_{12} & w_{13} & \cdots & w_{1n} \\
      w_{21} & 0      & w_{23} & \cdots & w_{2n} \\
      w_{31} & w_{32} & 0      & \cdots & w_{3n} \\
      \vdots & \vdots & \vdots & \ddots & \vdots \\
      w_{n1} & w_{n2} & w_{n3} & \cdots & 0
    \end{bmatrix},

with w_{ij} = w_{ji} and w_{ii} = 0.
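As a quick numerical illustration of the matrix form net = Wv + i - t above, here is a minimal NumPy sketch; the weight values, external inputs and thresholds are arbitrary placeholders, not taken from the notes:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)

# Symmetric weight matrix with zero diagonal, as assumed for the Hopfield network.
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

v = np.sign(rng.normal(size=n))   # bipolar output vector, entries in {-1, +1}
i_ext = np.zeros(n)               # external inputs i
t = np.zeros(n)                   # thresholds T_i

net = W @ v + i_ext - t           # total input of every neuron at once
print(net)
```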
9.2 Discrete-Time Hopfield Network

Assuming that the neuron's activation function is sgn, the transition rule of the i-th neuron would be

v_i \leftarrow \begin{cases} +1, & \text{if } net_i > 0 \text{ (excitatory state)} \\ -1, & \text{if } net_i < 0 \text{ (inhibitory state)} \end{cases}

If, at a given time, only a single neuron is allowed to update its output, so that only one entry in the vector v is allowed to change, this is asynchronous operation, under which each element of the output vector is updated separately while taking into account the most recent values of the elements that have already been updated and remain stable. (*)

Based on (*), the update rule of a discrete-time recurrent network, for one value of i at a time, becomes

v_i^{k+1} = \mathrm{sgn}\left( W_i^t v^k + i_i - T_i \right),   for random i, i = 1, 2, ..., n and k = 0, 1, 2, ...

where k denotes the index of the recursive update. This is referred to as asynchronous stochastic recursion of the Hopfield network. The update process continues until all n entries of v have been updated, and the recursive computation continues until the output node vector remains unchanged with further iterations.
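A minimal sketch of the asynchronous stochastic recursion described above; the function name async_recall and the stopping test (a full sweep with no change) are my own choices, not part of the notes:

```python
import numpy as np

def async_recall(W, v0, i_ext=None, t=None, rng=None, max_sweeps=100):
    """Asynchronous Hopfield recall: one randomly chosen neuron updates at a time."""
    n = len(v0)
    v = np.array(v0, dtype=float).copy()
    i_ext = np.zeros(n) if i_ext is None else i_ext
    t = np.zeros(n) if t is None else t
    rng = np.random.default_rng() if rng is None else rng

    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):           # random update order
            net_i = W[i] @ v + i_ext[i] - t[i]
            if net_i != 0:                     # keep the previous output when net_i == 0
                new_vi = 1.0 if net_i > 0 else -1.0
                if new_vi != v[i]:
                    v[i] = new_vi
                    changed = True
        if not changed:                        # no entry changed in a full sweep: stable state
            break
    return v
```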
Similarly, for synchronous operation we have

v^{k+1} = \Gamma\left[ W v^k + i - t \right],   for all neurons, k = 0, 1, ...

where \Gamma[\cdot] denotes the sgn operator applied to each component and all neurons change their outputs simultaneously.

Geometrical Explanation

The output vector v is one of the vertices of the n-dimensional cube [-1, 1]^n in E^n space. The vector moves during the recursions from vertex to vertex, until it stabilizes in one of the 2^n vertices available. The movement is from a vertex to an adjacent vertex, since the asynchronous update mode allows only a single component of the n-tuple output vector to change at a time. The final position of v as k \to \infty is determined by the weights, thresholds, inputs, and the initial vector v^0, as well as by the order of transitions.
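For contrast, a single synchronous step under the same assumptions could be sketched as below. Unlike the asynchronous mode analysed below, synchronous updating is not guaranteed to reach a fixed point and may end up cycling between two states:

```python
import numpy as np

def sync_step(W, v, i_ext=None, t=None):
    """One synchronous update: every component of v is recomputed from the old v."""
    n = len(v)
    i_ext = np.zeros(n) if i_ext is None else i_ext
    t = np.zeros(n) if t is None else t
    net = W @ v + i_ext - t
    return np.where(net > 0, 1.0, np.where(net < 0, -1.0, v))   # keep v_i when net_i == 0
```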
To evaluate the stability property of the dynamical system of interest, a computational energy function is defined in the n-dimensional output space v^n. If the increments of a certain bounded, positive-valued computational energy function under the transition rule are found to be non-positive, then the function can be called a Lyapunov function, and the system would be asymptotically stable. The scalar-valued energy function for the discussed system is the quadratic form

E = -\frac{1}{2} v^t W v - i^t v + t^t v

or, equivalently,

E = -\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1, j \neq i}^{n} w_{ij} v_i v_j - \sum_{i=1}^{n} i_i v_i + \sum_{i=1}^{n} T_i v_i.
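The quadratic energy translates directly into code; a small helper like the following (the name energy is mine) is handy for checking the descent property numerically:

```python
import numpy as np

def energy(W, v, i_ext=None, t=None):
    """E = -1/2 v^T W v - i^T v + t^T v for a bipolar state vector v."""
    n = len(v)
    i_ext = np.zeros(n) if i_ext is None else i_ext
    t = np.zeros(n) if t is None else t
    return -0.5 * v @ W @ v - i_ext @ v + t @ v
```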
The energy function in asynchronous mode. Assume that output node i has been updated at the k-th instant, so that v_i^{k+1} - v_i^k = \Delta v_i. Computing the energy gradient vector:

\nabla_v E = -\frac{1}{2}(W^t + W) v - i + t = -W v - i + t,   since W^t = W.

The energy increment becomes

\Delta E = (\nabla_v E)^t \Delta v = \left( -W_i^t v - i_i + T_i \right) \Delta v_i,

because only the i-th output is updated, so (\Delta v)_j = 0 for all j \neq i. This can be rewritten as

\Delta E = -\left( \sum_{j=1, j \neq i}^{n} w_{ij} v_j + i_i - T_i \right) \Delta v_i,

or briefly

\Delta E = -net_i \, \Delta v_i.

Note that when net_i < 0 the transition rule gives \Delta v_i \le 0, and when net_i > 0 it gives \Delta v_i \ge 0, so net_i \Delta v_i is always non-negative. In other words, the corresponding energy changes \Delta E are non-positive, provided that w_{ij} = w_{ji}.

Further, we can show that the non-increasing energy function has a minimum. Since W is indefinite because of its zero diagonal, E has neither a minimum nor a maximum in unconstrained output space. However, E is obviously bounded on the set consisting of the 2^n vertices of the n-dimensional cube. Thus, E finally has to reach its minimum under the update algorithm.
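As a sanity check of the descent argument, the following throwaway script (zero inputs and thresholds assumed for brevity) tracks E after every single-neuron update and confirms it never increases:

```python
import numpy as np

def E(W, v):
    return -0.5 * v @ W @ v          # energy with i = 0 and t = 0

rng = np.random.default_rng(1)
n = 8
W = rng.normal(size=(n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
v = np.sign(rng.normal(size=n))      # random bipolar start state

trace = [E(W, v)]
for _ in range(5):                   # a few asynchronous sweeps
    for i in rng.permutation(n):
        net_i = W[i] @ v
        if net_i != 0:
            v[i] = 1.0 if net_i > 0 else -1.0
        trace.append(E(W, v))

# The energy is non-increasing along the asynchronous trajectory.
assert all(b <= a + 1e-12 for a, b in zip(trace, trace[1:]))
print(trace[0], "->", trace[-1])
```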
Example of a recursive asynchronous update of a corrupted digit 4:

[Figure: five pixel maps (a)-(e), where (a) k = 0, (b) k = 1, (c) k = 2, (d) k = 3, (e) k = 4.]

The initial map is a destroyed digit 4 with 20% of the pixels randomly reversed. For k ≥ 4, no changes are produced at the network output, since the system has arrived at one of its stable states.

9.3 Gradient-Type Hopfield Network

Consider the continuous-time, single-layer feedback network. One of its models is given below.

[Figure: electrical model with capacitances C_1, ..., C_n, leakage conductances g_1, ..., g_n, coupling conductances w_ij, external currents i_1, ..., i_n, node potentials u_1, ..., u_n and outputs v_1, ..., v_n.]

It consists of n neurons, each mapping its input u_i into the output v_i through the activation function f(u_i).
The conductance w_ij connects the output of the j-th neuron to the input of the i-th neuron. It is also assumed that w_ij = w_ji and w_ii = 0. The KCL equation for the input node having potential u_i can be obtained as

i_i + \sum_{j=1, j \neq i}^{n} w_{ij} v_j - u_i \left( \sum_{j=1, j \neq i}^{n} w_{ij} + g_i \right) = C_i \frac{du_i}{dt}.

Defining

G_i = \sum_{j=1, j \neq i}^{n} w_{ij} + g_i,   C = diag[C_1, ..., C_n],   G = diag[G_1, ..., G_n],

we then have

C \frac{du(t)}{dt} = W v(t) - G u(t) + i,   v(t) = f(u(t)).

It can be shown that

\frac{dE}{dt} = -\left( C \frac{du}{dt} \right)^t \frac{dv}{dt},

which is non-positive because f is monotonically increasing, so du_i/dt and dv_i/dt always have the same sign. It follows that the changes of E in time are in the general direction toward lower values of the energy function in v^n space - the stability condition.
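A rough forward-Euler sketch of the state equation C du/dt = Wv - Gu + i; the choice v = tanh(u) for the activation f, the step size and the parameter handling are assumptions for illustration only:

```python
import numpy as np

def simulate_gradient_hopfield(W, g, C, i_ext, u0, dt=1e-3, steps=5000):
    """Forward-Euler integration of C du/dt = W v - G u + i, with v = f(u) = tanh(u)."""
    G = W.sum(axis=1) + g            # G_i = sum_{j != i} w_ij + g_i  (w_ii = 0)
    u = np.array(u0, dtype=float).copy()
    for _ in range(steps):
        v = np.tanh(u)               # neuron outputs
        du = (W @ v - G * u + i_ext) / C
        u = u + dt * du
    return u, np.tanh(u)
```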
The outer product rule is as follows. For a given set of vectors M = {U_1, ..., U_m}, where U_k = (x_1^k, ..., x_n^k)^t, write

W = \sum_{k=1}^{m} \left( U_k U_k^t - I \right)
  = \sum_{k=1}^{m} \begin{bmatrix}
      0            & x_1^k x_2^k & \cdots & x_1^k x_n^k \\
      x_2^k x_1^k  & 0           & \cdots & x_2^k x_n^k \\
      \vdots       &             & \ddots & \vdots      \\
      x_n^k x_1^k  & x_n^k x_2^k & \cdots & 0
    \end{bmatrix},

and this can be implemented by the following procedure:
(1) Set W = 0.
(2) For k = 1 to m, input U_k and do w_ij = w_ij + x_i^k x_j^k for every connected pair (i, j), i ≠ j.

Check to see if it is reasonable:
1) Suppose that U_1, ..., U_m are orthogonal and m < n. Then

W U_k = \sum_{i=1}^{m} (U_i U_i^t - I) U_k = (U_k U_k^t U_k - U_k) + \sum_{i \neq k} (U_i U_i^t U_k - U_k).

Since U_k^t U_k = n for bipolar vectors and U_i^t U_k = 0 for i ≠ k, this gives

W U_k = (n - 1) U_k - (m - 1) U_k = (n - m) U_k,

so that

sgn(W U_k) = sgn((n - m) U_k) = U_k,   k = 1, 2, ..., m.

Hence the U_k are stable states.
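A small sketch of the outer-product storage rule and of the orthogonality check above; the stored patterns are rows of a Sylvester-Hadamard matrix, chosen only because they are conveniently bipolar and mutually orthogonal:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n, m = 8, 3
U = hadamard(n)[1:m + 1]             # m mutually orthogonal bipolar patterns

# Outer-product rule: W = sum_k (U_k U_k^T - I)
W = sum(np.outer(u, u) - np.eye(n) for u in U)

# Since m < n, W U_k = (n - m) U_k, so sgn(W U_k) = U_k: each stored pattern is stable.
for u in U:
    assert np.array_equal(np.sign(W @ u), u)
print("all", m, "stored patterns are stable states")
```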