Chapter 4: Channel Capacity (teaching slides)

Review of the properties of the mutual information function:

Property 1 (relationship between the average mutual information and the channel input probability distribution): I(X;Y) is an upper convex (cap-shaped, i.e. concave) function of the channel input probability distribution p(x).

Property 2 (relationship between the average mutual information and the channel transition probability distribution): I(X;Y) is a lower convex (cup-shaped, i.e. convex) function of the channel transition probability distribution p(y|x).

What is a channel? The channel is the carrier that transmits messages, the passage through which the signal passes. The information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if they call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the transmitter and the receiver is the channel.

4.1 The model and classification of the channel

In this part we mainly introduce two topics: channel models and channel classifications.

4.1.1 Channel models

We can treat the channel as a converter that transforms events; the channel model can be drawn as a block diagram: input X, channel (transition probabilities), output Y. The Binary Symmetric Channel (BSC) is the simplest channel model.

We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities

P{Y = y_j | X = x_i} = P(y_j | x_i),  i = 0, 1, ..., Q-1,  j = 0, 1, ..., q-1.

This channel is known as a Discrete Memoryless Channel (DMC).
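The slides describe the DMC only by its diagram. As a small sketch of my own (the NumPy code and variable names below are illustrative, not from the slides), a DMC can be stored as a row-stochastic transition matrix, and the output distribution then follows as q(y_j) = sum_i p(x_i) P(y_j|x_i):

import numpy as np

# Transition matrix of a BSC with crossover probability eps = 0.1:
# rows are inputs x in {0, 1}, columns are outputs y in {0, 1}.
eps = 0.1
P = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

# Input distribution p(x); any distribution over {0, 1} works here.
p = np.array([0.75, 0.25])

# Output distribution q(y_j) = sum_i p(x_i) * P(y_j | x_i).
q = p @ P
print(q)   # -> [0.7 0.3] for eps = 0.1 and p = (0.75, 0.25)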

4.1.2 Channel classifications

Channels can be classified from several points of view (shown as a tree diagram on the original slide):

1) User type: two-user channel (point to point); multi-user channel (network).
2) Parameter type: constant-parameter channel; changeable-parameter channel.
3) Type of media: solid media (open wire, symmetrical balanced cable, coaxial cable, fine coaxial cable); air media (long wave, AM, shortwave, FM, mobile, horizon relay, microwave, tropospheric scattering (对流层), ionospheric (电离层), satellite, light); mixed media (waveguide, cable).
4) Signal/interference type: the signal may be discrete, continuous, semi-discrete or semi-continuous, memoryless or with memory; the channel may be without interference or with interference (thermal noise in linear superposition, impulse noise, intermodulation, multiplicative fading, inter-symbol interference).

4.2 Channel doubt degree and average mutual information
4.2.1 Channel doubt degree
4.2.2 Average mutual information
4.2.3 Properties of the mutual information function
4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree

Assume the random variable X denotes the input set of the channel and the random variable Y denotes the output set of the channel. The channel doubt degree (equivocation) is

H(X|Y) = E[H(X|b_j)] = -Σ_{i,j} p(a_i b_j) log p(a_i|b_j),

where

H(X|b_j) = Σ_{i=1}^{n} p(a_i|b_j) I(a_i|b_j) = -Σ_{i=1}^{n} p(a_i|b_j) log p(a_i|b_j).

The meaning of the channel doubt degree H(X|Y) is that, when the receiving terminal gets the message Y, it is the average uncertainty that still remains about the source X. In fact, this remaining uncertainty comes from the noise in the channel.

Since H(X|Y) ≤ H(X), if the average uncertainty of the source X is H(X), then on receiving the output message Y we obtain more or less information that eliminates part of the uncertainty about X. So we have the following concept of average mutual information.

4.2.2 Average mutual information

The average mutual information is the entropy of the source X minus the channel doubt degree:

I(X;Y) := H(X) - H(X|Y).

It is the average information the receiver can obtain about X from every symbol it receives.
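As a numerical companion to these definitions (my own helper functions, not part of the slides), H(X|Y) and I(X;Y) can be computed directly from p(x) and p(y|x); the function and variable names are illustrative:

import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector; zero entries are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) = H(X) - H(X|Y) for a discrete channel given p(x) and p(y|x).

    Assumes every output symbol has nonzero probability under p_x.
    """
    p_xy = p_x[:, None] * P_y_given_x          # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                     # output distribution q(y)
    p_x_given_y = p_xy / p_y                   # columns give p(x|y)
    mask = p_xy > 0
    # Channel doubt degree H(X|Y) = -sum_{x,y} p(x,y) log p(x|y)
    H_X_given_Y = -np.sum(p_xy[mask] * np.log2(p_x_given_y[mask]))
    return entropy(p_x) - H_X_given_Y

print(mutual_information(np.array([0.5, 0.5]),
                         np.array([[0.9, 0.1], [0.1, 0.9]])))   # -> about 0.531 bit/symbol

For a BSC with crossover probability 0.1 and an equiprobable input, this returns I(X;Y) = 1 - H(0.1), roughly 0.531 bit/symbol.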

4.2.3 Properties of the mutual information function

Property 1: relationship between the mutual information and the channel input probability distribution. I(X;Y) is an upper convex (concave) function of the channel input probability distribution P(X). This is shown in Fig. 4.5 and Fig. 4.6.

[Fig. 4.5. I(X;Y) is an upper convex function of P(X). Fig. 4.6. A message passing through the channel.]

E.g. 4.1 Consider a binary channel whose input probability distribution is

[X; P(X)] = [0, 1; ω, 1-ω],

and whose channel matrix is

P = [p_11 p_12; p_21 p_22] = [p̄ p; p p̄],  with p̄ = 1 - p,

where p is the probability of a transmission error. Then the mutual information is

I(X;Y) = H(Y) - H(Y|X)
       = H(Y) - Σ_{X,Y} p(x) p(y|x) log 1/p(y|x)
       = H(Y) - Σ_X p(x) (p log 1/p + p̄ log 1/p̄)
       = H(Y) - (p log 1/p + p̄ log 1/p̄)
       = H(Y) - H(p).

And we can get the following results:

P(y=0) = P(x=0) P(y=0|x=0) + P(x=1) P(y=0|x=1) = ω p̄ + (1-ω) p,
P(y=1) = P(x=0) P(y=1|x=0) + P(x=1) P(y=1|x=1) = ω p + (1-ω) p̄.

So, with ω̄ = 1 - ω,

H(Y) = -(ω p̄ + ω̄ p) log(ω p̄ + ω̄ p) - (ω p + ω̄ p̄) log(ω p + ω̄ p̄).

The average mutual information curve is shown in Fig. 4.7.

[Fig. 4.7. Mutual information of the binary symmetric channel as a function of the input distribution.]

From the diagram we can see that when the input symbols are equiprobable (ω = 1/2), the average mutual information I(X;Y) reaches its maximum value, and only then does the receiver get the largest amount of information from every symbol it receives.

Property 2: relationship between the mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (convex) function of the channel transition probability distribution p(Y|X).

[Fig. 4.8. I(X;Y) is a lower convex function of P(Y|X).]

E.g. 4.2 (This is the follow-up of E.g. 4.1.) Consider the same binary channel. When the source distribution (ω, ω̄) is fixed, the average mutual information is

I(X;Y) = H(ω p̄ + ω̄ p) - H(p),

and I(X;Y) is a lower convex function of p, as the curve of the mutual information of the fixed binary source shows. From the diagram we can see that, once the binary source is fixed, different values of the channel parameter p give different mutual information I(X;Y). When p = 1/2 we have I(X;Y) = 0, which means the receiver gets the least information from this channel: all the information is lost on the way, and this channel has the largest noise.
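A quick numerical check of Examples 4.1 and 4.2 (a self-contained sketch of my own, not from the slides): for a fixed crossover probability p, I(X;Y) is largest at ω = 1/2, and for a fixed source it drops to zero at p = 1/2:

import numpy as np

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_mutual_information(w, p):
    """I(X;Y) = H(w*(1-p) + (1-w)*p) - H(p) for the BSC of Examples 4.1/4.2."""
    q = w * (1 - p) + (1 - w) * p      # P(y = 0)
    return binary_entropy(q) - binary_entropy(p)

# Example 4.1: with p fixed, the maximum over w is reached at w = 1/2.
ws = np.linspace(0, 1, 101)
rates = [bsc_mutual_information(w, 0.1) for w in ws]
print(ws[int(np.argmax(rates))], max(rates))    # -> 0.5, about 0.531 bit/symbol

# Example 4.2: with the source fixed, I(X;Y) = 0 at p = 1/2 (the noisiest channel).
print(bsc_mutual_information(0.6, 0.5))         # -> 0.0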

Property 3: if the channel input is discrete and without memory, i.e.

p(x) = p(x_1 x_2 ... x_N) = Π_{i=1}^{N} p(x_i),

then I(X;Y) ≥ Σ_{i=1}^{N} I(X_i; Y_i).

Property 4: if the channel is discrete and without memory, i.e.

p(y|x) = p(y_1 y_2 ... y_N | x_1 x_2 ... x_N) = Π_{i=1}^{N} p(y_i|x_i),

then I(X;Y) ≤ Σ_{i=1}^{N} I(X_i; Y_i).

(Remember these results.)

4.2.4 Relationship between entropy, channel doubt degree and average mutual information

I(X;Y) = H(X) - H(X|Y)
I(X;Y) = H(X) + H(Y) - H(X,Y)
I(X;Y) = I(Y;X) ≥ 0
H(X,Y) ≤ H(X) + H(Y)
H(X,Y) = H(Y|X) + H(X) = H(X|Y) + H(Y)

E.g. 4.3 There is a source

[X; P(X)] = [x_1, x_2; 0.6, 0.4].

Its messages pass through a channel with noise. The symbols received at the other end of the channel are Y = {y_1, y_2}, and the channel transfer matrix is

P = [5/6 1/6; 1/4 3/4].

Please calculate:
(1) The self-information contained in the events x_i (i = 1, 2).
(2) The information about x_i (i = 1, 2) that the receiver gets when it observes the message y_j (j = 1, 2).
(3) The entropy of the source X and of the received Y.
(4) The channel doubt degree H(X|Y) and the noise entropy H(Y|X).
(5) The average mutual information obtained by the receiver when it receives Y.

Solution:

(1):
I(x_1) = -log2 p(x_1) = -log2 0.6 = 0.737 bit
I(x_2) = -log2 p(x_2) = -log2 0.4 = 1.322 bit

(2):
p(y_1) = p(x_1) p(y_1|x_1) + p(x_2) p(y_1|x_2) = 0.6 × 5/6 + 0.4 × 1/4 = 0.6
p(y_2) = p(x_1) p(y_2|x_1) + p(x_2) p(y_2|x_2) = 0.6 × 1/6 + 0.4 × 3/4 = 0.4

I(x_1; y_1) = log2 [p(y_1|x_1)/p(y_1)] = log2 [(5/6)/0.6] = 0.474 bit
I(x_1; y_2) = log2 [p(y_2|x_1)/p(y_2)] = log2 [(1/6)/0.4] = -1.263 bit
I(x_2; y_1) = log2 [p(y_1|x_2)/p(y_1)] = log2 [(1/4)/0.6] = -1.263 bit
I(x_2; y_2) = log2 [p(y_2|x_2)/p(y_2)] = log2 [(3/4)/0.4] = 0.907 bit

(3):
H(X) = -Σ_i p(x_i) log2 p(x_i) = -(0.6 log 0.6 + 0.4 log 0.4) = 0.971 bit/symbol
H(Y) = -Σ_j p(y_j) log2 p(y_j) = -(0.6 log 0.6 + 0.4 log 0.4) = 0.971 bit/symbol

(4):
H(Y|X) = -Σ_{i,j} p(x_i) p(y_j|x_i) log2 p(y_j|x_i)
       = -(0.6 × 5/6 log 5/6 + 0.6 × 1/6 log 1/6 + 0.4 × 1/4 log 1/4 + 0.4 × 3/4 log 3/4)
       = 0.715 bit/symbol
H(X|Y) = H(X) + H(Y|X) - H(Y) = 0.971 + 0.715 - 0.971 = 0.715 bit/symbol

(5):
I(X;Y) = H(X) - H(X|Y) = 0.971 - 0.715 = 0.256 bit/symbol
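The figures of Example 4.3 can be reproduced directly with NumPy (an illustrative sketch of my own, not part of the slides):

import numpy as np

p_x = np.array([0.6, 0.4])
P = np.array([[5/6, 1/6],
              [1/4, 3/4]])

p_xy = p_x[:, None] * P                       # joint distribution p(x, y)
p_y = p_xy.sum(axis=0)                        # -> [0.6, 0.4]
H_X = -np.sum(p_x * np.log2(p_x))             # -> 0.971 bit/symbol
H_Y_given_X = -np.sum(p_xy * np.log2(P))      # -> 0.715 bit/symbol
H_X_given_Y = H_X + H_Y_given_X + np.sum(p_y * np.log2(p_y))   # H(X)+H(Y|X)-H(Y) -> 0.715
I_XY = H_X - H_X_given_Y                      # -> 0.256 bit/symbol
print(p_y, H_X, H_Y_given_X, H_X_given_Y, I_XY)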

4.3 Discrete channel without memory

Three groups of variables describe the channel:
(1) the channel input probability space {X^K, p(x)};
(2) the channel output probability space {Y^K, q(y)};
(3) the channel transfer probability P(Y|X).

So the channel can be represented by {X^K, P(Y|X), Y^K}. This can be pictured as a sequence x = (x_1 ... x_K), with each x_k taken from an input alphabet of n letters, entering the channel, and a sequence y = (y_1 ... y_K), with each y_k taken from an output alphabet of m letters, leaving it; the input space contains n^K sequences and the output space m^K sequences. The channel transfer matrix is

P = [ P(y_1|x_1)      ...  P(y_{m^K}|x_1)
      ...
      P(y_1|x_{n^K})  ...  P(y_{m^K}|x_{n^K}) ].

When K = 1 it degenerates to the single-message channel, and when n = m = 2 it degenerates to the binary single-message channel. If, in addition, it satisfies symmetry, it constitutes the most commonly used BSC.

[Fig. 4.11. Binary symmetric channel: input X with distribution (p_0, p_1), transfer matrix P(y|x), output Y with distribution (q_0, q_1).]

4.4 Channel capacity
4.4.1 The concept of channel capacity
4.4.2 Discrete channel without memory and its channel capacity
4.4.3 Continuous channel and its channel capacity

4.4.1 The concept of channel capacity

The capacity of a channel is defined as the maximum value of the average mutual information:

C := max_{p(x)} I(X;Y).

The unit of the channel capacity C is bit/symbol or nat/symbol. From the property mentioned before, we know that I(X;Y) is an upper convex function of the probability distribution p(x) of the input variable X. For a specific channel there always exists a source that maximizes the information carried by every message transmitted through the channel; that is, the maximum of I(X;Y) exists. The corresponding probability distribution p(x) is called the optimum input distribution.
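As an illustration of this definition (a sketch of my own, reusing the mutual_information helper above), the capacity of the asymmetric channel of Example 4.3 can be estimated by a brute-force search over the one-parameter family of binary input distributions:

import numpy as np

# Channel of Example 4.3; search over input distributions (w, 1 - w).
P = np.array([[5/6, 1/6],
              [1/4, 3/4]])
ws = np.linspace(0.0, 1.0, 1001)
rates = [mutual_information(np.array([w, 1.0 - w]), P) for w in ws]
best = int(np.argmax(rates))
print(ws[best], rates[best])
# -> w close to 0.51, C about 0.264 bit/symbol: a little more than the 0.256 bit/symbol
#    obtained with the particular source (0.6, 0.4) of Example 4.3; the optimum input
#    is not exactly uniform because this channel is not symmetric.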

4.4.2 Discrete channel without memory and its channel capacity

Classification of the discrete message-sequence channel: for a discrete channel without memory the following relationship holds:

P(Y|X) = Π_{k=1}^{K} P(y_k|x_k)   (no memory)
       = [P(y|x)]^K               (stationary).

According to Property 4 of the mutual information I(X;Y) of a message sequence, for the discrete channel without memory we have

I(X;Y) ≤ Σ_{k=1}^{K} I(X_k;Y_k).

Note: only when the source is also without memory may the equality in this formula be satisfied. So we can get the following deduction, which gives the formula of the channel capacity C:

C_K = max_{p(x)} I(X;Y) ≤ max_{p(x)} Σ_{k=1}^{K} I(X_k,Y_k) ≤ Σ_{k=1}^{K} max_{p(x_k)} I(X_k,Y_k) = Σ_{k=1}^{K} C_k = K C   (stationary).

Theorem (discrete channel without memory). Assume the transmission probability matrix of the discrete channel without memory is Q. The sufficient conditions under which an input letter probability distribution p* makes the mutual information I(p;Q) achieve its maximum value are

I(x = a_k; Y)|_{p = p*} = C,  when p(a_k) > 0,
I(x = a_k; Y)|_{p = p*} ≤ C,  when p(a_k) = 0,

where

I(x = a_k; Y) = Σ_{j=1}^{J} q(b_j|a_k) log [q(b_j|a_k) / p(b_j)]

is the average mutual information when the source letter a_k is sent, and C is the channel capacity of this channel.

Understanding this theorem:
Firstly, under this kind of distribution, each letter whose probability is above zero provides mutual information C, and each letter whose probability is zero provides mutual information lower than or equal to C.
Secondly, only under this kind of distribution can I(p;Q) attain the maximum value C.
Thirdly, I(X;Y) is the average of I(x = a_k; Y); that is to say, it satisfies the equation

I(X;Y) = Σ_k p(a_k) I(x = a_k; Y).

(1) If we want to enhance I(X;Y), enhancing p(a_k) may be a good idea.
(2) However, once p(a_k) is enhanced, I(x = a_k; Y) may be reduced.
(3) Adjusting p(a_k) repeatedly makes all the I(x = a_k; Y) equal to C.
(4) At this point I(X;Y) = C.
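The repeat-adjustment idea in steps (1) to (3) is essentially what the Blahut-Arimoto algorithm does. The slides give no code, so the following is a minimal sketch of my own for a generic DMC with transition matrix Q; the function name and iteration count are illustrative:

import numpy as np

def channel_capacity(Q, iters=200):
    """Blahut-Arimoto iteration for a DMC with row-stochastic matrix Q = p(y|x).

    Returns (C in bits, optimum input distribution). Minimal sketch:
    fixed iteration count, assumes all entries of Q are positive.
    """
    n = Q.shape[0]
    p = np.full(n, 1.0 / n)                    # start from the uniform input
    for _ in range(iters):
        q_y = p @ Q                            # current output distribution
        # D_k = exp( sum_j Q[k, j] * ln( Q[k, j] / q_y[j] ) )
        D = np.exp(np.sum(Q * np.log(Q / q_y), axis=1))
        p = p * D
        p /= p.sum()                           # renormalize the input distribution
    q_y = p @ Q
    C = np.sum(p[:, None] * Q * np.log2(Q / q_y))
    return C, p

# Check on the symmetric channel of Example 4.4: C is about 0.082 bit, p* = (1/2, 1/2).
Q = np.array([[2/3, 1/3],
              [1/3, 2/3]])
print(channel_capacity(Q))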

The theorem only provides a sufficient condition for p(x) to make I(X;Y) = C; it does not give the concrete distribution or the value of C. But it may help to get the value of C for several kinds of channels in simple situations.

E.g. 4.4 Assume the transmission matrix of a binary discrete symmetric channel is

P = [2/3 1/3; 1/3 2/3].

(1) If P(0) = 3/4, P(1) = 1/4, please calculate H(X), H(X|Y), H(Y|X) and I(X;Y).
(2) Please calculate the capacity of the channel and the input probability distribution that achieves it.

Solution:

(1):
H(X) = -(3/4 log2 3/4 + 1/4 log2 1/4) = 0.811 bit/symbol

p(y_1) = p(x_1) p(y_1|x_1) + p(x_2) p(y_1|x_2) = 3/4 × 2/3 + 1/4 × 1/3 = 0.5833
p(y_2) = p(x_1) p(y_2|x_1) + p(x_2) p(y_2|x_2) = 3/4 × 1/3 + 1/4 × 2/3 = 0.4167

H(Y) = -Σ_j p(y_j) log2 p(y_j) = -(0.5833 log2 0.5833 + 0.4167 log2 0.4167) = 0.980 bit/symbol

H(Y|X) = -Σ_{i,j} p(x_i) p(y_j|x_i) log2 p(y_j|x_i)
       = -(3/4 × 2/3 lg 2/3 + 3/4 × 1/3 lg 1/3 + 1/4 × 1/3 lg 1/3 + 1/4 × 2/3 lg 2/3) × log2 10
       = 0.918 bit/symbol

H(X|Y) = H(X) + H(Y|X) - H(Y) = 0.811 + 0.918 - 0.980 = 0.749 bit/symbol
I(X;Y) = H(X) - H(X|Y) = 0.811 - 0.749 = 0.062 bit/symbol

(2): (application; see Example 3.6)

C = max I(X;Y) = log2 m - H_mi = log2 2 + (2/3 lg 2/3 + 1/3 lg 1/3) × log2 10 = 1 - 0.918 = 0.082 bit/symbol,

achieved with P(x_i) = 1/2, where m is the size of the output symbol set and H_mi is the entropy of a row vector of the channel matrix. Indeed, for a symmetric channel

H(Y|X) = -Σ_{i,j} p(x_i) p(y_j|x_i) log p(y_j|x_i) = Σ_i p(x_i) [-Σ_j p(y_j|x_i) log p(y_j|x_i)] = Σ_i p(x_i) H_mi = H_mi,

and H(Y) attains its maximum log2 m with the equiprobable input, so C = log2 m - H_mi.

4.4.3 Continuous channel and its channel capacity

Outline: characteristics of the continuous channel; the analog channel; basic knowledge of the additive channel; the Shannon formula; usage of the Shannon formula.

Characteristics of the continuous channel.
Characteristic 1: the time is discrete, while the value range is continuous.
Characteristic 2: at each moment it is a single random variable whose value is continuous.

Analog channel; basic knowledge of the additive channel.

Y = X + N, where X is the channel input, N is the channel noise and Y is the channel output. If two of X, Y, N have Gaussian distributions, then the third is also Gaussian. The differential entropy of a Gaussian random variable depends only on its variance σ² and has nothing to do with the mean value.

[Fig. 4.22. The additive channel.]

Shannon formula: when a generally stationary random-process source X(t, w), band-limited to F and time-limited to T, passes through a white Gaussian channel of limited noise power P_N, the channel capacity is

C = FT log2(1 + P_s/P_N) = FT log2(1 + S/N).

This is the famous Shannon formula for the continuous channel.

When T = 1 the capacity is C = F log2(1 + S/N).

Derivation: assume Y = X + N, where X and N are independent continuous random variables and N ~ N(0, σ_N²). Since

p(y|x) = p_n(y - x),

we have

I(X;Y) = H(Y) - H(Y|X) = H(Y) - H(N).

By the maximum entropy theorem for limited average power (among all random variables with a given average power, the Gaussian one has the largest differential entropy), H(Y) is maximized when Y is Gaussian with power P_s + P_N, so per sample

C_s = max_{p(x)} I(X;Y) = max_{p(x)} H(Y) - H(N)
    = 1/2 log 2πe(P_s + P_N) - 1/2 log 2πe P_N
    = 1/2 log [(P_s + P_N)/P_N]
    = 1/2 log(1 + P_s/P_N).

Due to the limited bandwidth F of X(t, w) and according to the Nyquist sampling theorem, the continuous signal X(t, w) is equivalent to 2F discrete samples per second. That is,

C_t = 2F · C_s = F log2(1 + P_s/P_N)  bit/s.

Considering a time duration T:

C_T = 2FT · C_s = FT log2(1 + P_s/P_N)  bit.
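A one-line numerical sketch of the Shannon formula (my own illustration; the 3 kHz bandwidth and 30 dB SNR are example numbers, not from the slides):

import numpy as np

def shannon_capacity(F_hz, snr_linear):
    """C = F * log2(1 + S/N) in bit/s for a band-limited AWGN channel."""
    return F_hz * np.log2(1.0 + snr_linear)

# A 3 kHz channel with a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)                      # 30 dB -> S/N = 1000
print(shannon_capacity(3000.0, snr))       # -> about 2.99e4 bit/s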

[Fig. 4.23. The Shannon formula.]

Usage of the Shannon formula. In analog communications, frequency modulation outperforms amplitude modulation: the wider the frequency band, the stronger the resistance to disturbance. In digital communications, the pseudo-noise (PN) code directly spreads the signal: the wider the bandwidth, the stronger the resistance to disturbance.

Another form of the Shannon formula:

C = FT log2(1 + S/N) = FT log2(1 + S/(N_0 F)),

where N_0 is the noise power density in unit bandwidth (so N = N_0 F), E_b = S T_b is the energy per bit, and E_b/N_0 is the normalized signal-to-noise ratio. When E_b/N_0 ≪ 1, using ln(1 + x) ≈ x,

FT log2(1 + E_b/N_0) ≈ FT (E_b/N_0) nat = FT (E_b/N_0)/ln 2 bit;

that is, when the SNR is very low, the channel capacity is approximately determined by the signal-to-noise ratio.

E.g. 4.6 In photo transmission, every frame has about 2.25 × 10^6 pixels. In order to reproduce a good image we need about 16 brightness levels. Assuming an equal probability distribution of the brightness levels, please calculate the channel bandwidth required to transmit 30 images per second (the signal-to-noise ratio is 30 dB).

Solution: the channel capacity of the additive white Gaussian noise (AWGN) channel per unit time is

C_t = lim_{T→∞} C/T = W log2(1 + S/N)   (bit/s).

The required information transmission rate is

R = 2.25 × 10^6 × log2 16 × 30 = 2.7 × 10^8 bit/s,

and transmission is possible only if R ≤ C_t = W log2(1 + S/N). Since 10 lg(S/N) = 30 dB, we have S/N = 10^3, so

W ≥ 2.7 × 10^8 / log2(1 + 10^3) ≈ 2.7 × 10^7 (Hz).

(See also Example 3.8, p. 26, as an application example.)

End of Chapter 4
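The arithmetic of Example 4.6 can be checked with a few lines (a sketch of my own, not part of the slides):

import numpy as np

# Example 4.6: bandwidth needed to send 30 frames/s of 2.25e6 pixels,
# 16 equiprobable brightness levels, over an AWGN channel with 30 dB SNR.
pixels_per_frame = 2.25e6
bits_per_pixel = np.log2(16)               # 4 bit per pixel
frames_per_second = 30
rate = pixels_per_frame * bits_per_pixel * frames_per_second   # -> 2.7e8 bit/s

snr = 10 ** (30 / 10)                      # 30 dB -> S/N = 1000
W = rate / np.log2(1 + snr)                # W >= R / log2(1 + S/N)
print(rate, W)                             # -> 2.7e8 bit/s, about 2.7e7 Hz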
