Francis T. S. Yu - Entropy and Information Optics


[...] absent from the optical standpoint. As a result of the recent advances in modern information optics and optical communication, the relationship between optics and entropy information has grown more rapidly than ever. Although everyone seems to know the word information, a fundamental theoretic concept may not be the case. Let us now define the meaning of information. Actually, information may be defined [...]

[...] joint probability of a_i and b_j. However, if I(a_i; b_j) < 0, then P(a_i, b_j) < P(a_i) P(b_j); that is, there is a lower joint probability of a_i and b_j.

I(b_j) \triangleq -\log_2 P(b_j)    (1.14)

I(a_i) and I(b_j) are defined as the respective input and output self-information of event a_i and event b_j. In other words, I(a_i) and I(b_j) represent the amount of information provided at the input and output of the information channel [...]

[...] whichever comes first. If the information channel is noiseless, then the mutual information I(a_i; b_j) is equal to I(a_i), the input self-information of a_i. However, if the information channel is deterministic, then the mutual information is equal to I(b_j), the output self-information of b_j. Moreover, if the input and output of the information channel are statistically independent, then no information can be transferred [...]

[...]

H(B/A) = -[(1 - p - q) \log_2 (1 - p - q) + p \log_2 p + q \log_2 q]    (1.89)

Thus the channel capacity is

C = (1 - q)[1 - \log_2 (1 - q)] + [(1 - p - q) \log_2 (1 - p - q) + p \log_2 p]    (1.90)

We see that, if p = 0, the channel capacity can be written

C = 1 - q    (1.91)

The capacity is equal to that of a noiseless binary channel minus the erasure probability q. We now consider information transmission through [...]

[...] From the relationship of Eq. (1.48) and the conditional entropies of Eqs. (1.32) and (1.33), we have

I(A; B) = H(A) - H(A/B)    (1.49)

and

I(A; B) = H(B) - H(B/A)    (1.50)

Equations (1.49) and (1.50) are of interest to us in the determination of the mutual information (the amount of information transfer). For example, if H(A) is considered the average amount of information provided at the input [...]

[...] output entropy H(B). It can be readily checked that the maximum value of H(B) occurs when the input events are equiprobable, that is, P(a_1) = P(a_2) = 1/2. Thus the output probability distribution is

P(b_1) = P(b_2) = (1/2)(1 - q),    P(b_3) = q    (1.87)

We see that H(B) can be evaluated:

H(B) = (1 - q)[1 - \log_2 (1 - q)] - q \log_2 q    (1.88)

From Eq. (1.86) we have the conditional entropy H(B/A):

H(B/A) = -[(1 - p - q) \log_2 (1 - p - q) + p \log_2 p [...]

[...] sections provides a very useful application of entropy information to optics. Readers who are interested in a rigorous treatment of information theory are referred to the classic papers by Shannon [1-3] and the text by Fano [4]. Information theory has two general orientations: one developed by Wiener [5, 6], and the other by Shannon [1-3]. Although both Wiener and Shannon share a common probabilistic basis [...]

[...]

I(a_i; b_j) = I(a_i) - I(a_i/b_j)    (1.15)

and

I(a_i; b_j) = I(b_j) - I(b_j/a_i)    (1.16)

From the definition of the self-information of the point (a_i, b_j) in the product ensemble AB,

I(a_i b_j) \triangleq -\log_2 P(a_i, b_j)    (1.17)

one can show that

I(a_i; b_j) = I(a_i) + I(b_j) - I(a_i b_j)    (1.18)

Conversely,

I(a_i b_j) = I(a_i) + I(b_j) - I(a_i; b_j)    (1.19)

In concluding this section, we point out that, for the mutual information I(a_i; b_j) (i.e., the amount of information [...]
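As a quick numerical illustration of the erasure-channel results above, the following minimal Python sketch evaluates Eqs. (1.88)-(1.90) for a binary channel with error probability p and erasure probability q, driven by equiprobable inputs, and confirms that the capacity reduces to C = 1 - q of Eq. (1.91) when p = 0. The function and variable names are illustrative assumptions, not taken from the text.

    from math import log2

    def plog(x):
        # convention: 0 log2 0 = 0
        return x * log2(x) if x > 0 else 0.0

    def erasure_channel_capacity(p, q):
        """Capacity of Eq. (1.90), written as H(B) - H(B/A) at equiprobable input."""
        H_B = (1 - q) * (1 - log2(1 - q)) - plog(q)             # Eq. (1.88)
        H_B_given_A = -(plog(1 - p - q) + plog(p) + plog(q))    # Eq. (1.89)
        return H_B - H_B_given_A

    print(erasure_channel_capacity(0.0, 0.1))   # p = 0: Eq. (1.91) gives 1 - q = 0.9
    print(erasure_channel_capacity(0.05, 0.1))  # with errors present, C drops below 1 - q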
[...] average mutual information between the input and output sequences of α^n and β^n can be written

I(A^n; B^n) = H(B^n) - H(B^n/A^n)    (1.67)

where B^n is the output product space. We also see that, from Eq. (1.32), the entropy of B^n can be written

H(B^n) = H(B_1) + H(B_2/B_1) + H(B_3/B_2 B_1) + \cdots + H(B_n/B_{n-1} \cdots B_1)    (1.68)

where

H(B_i/B_{i-1} \cdots B_1) = -\sum_{B^n} P(\beta^n) \log_2 P(\beta_i/\beta_{i-1} \cdots \beta_1)    (1.69)

The conditional entropy of [...]

[...] and output, then it is said to be a doubly uniform channel, or simply a uniform channel. In the following, we evaluate the capacity for a special type of uniform channel, namely, an n-ary symmetric channel. Let the transition probability matrix of an n-ary symmetric channel be

P = \begin{bmatrix} 1-p & \dfrac{p}{n-1} & \cdots & \dfrac{p}{n-1} \\ \dfrac{p}{n-1} & 1-p & \cdots & \dfrac{p}{n-1} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac{p}{n-1} & \dfrac{p}{n-1} & \cdots & 1-p \end{bmatrix}    (1.78)

To evaluate the channel capacity, we first evaluate the average mutual information [...]

[...] abstract artistic to very sophisticated scientific uses. The purpose of this text is to discuss the relationship between optics and information transmission. However, it is emphasized that [...]

[...] explosion of high-speed, high-data-rate, and high-capacity communication systems. This volume discusses the fundamentals and the applications of entropy and information optics by means of a sampling [...]

[...] detailed discussion of optics and information, we devote this first chapter to the fundamentals of information transmission. However, it is noted that entropy information was not originated [...]
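To make the n-ary symmetric channel of Eq. (1.78) concrete, here is a small Python sketch in the same illustrative spirit as the previous one: it builds the transition matrix and evaluates the capacity with equiprobable inputs, for which the output is also equiprobable, so that I(A; B) = H(B) - H(B/A) = log_2 n minus the entropy of any single row of the matrix. The helper names are assumptions for illustration only.

    from math import log2

    def nary_transition_matrix(n, p):
        # Eq. (1.78): 1 - p on the diagonal, p/(n - 1) everywhere else
        off = p / (n - 1)
        return [[1 - p if i == j else off for j in range(n)] for i in range(n)]

    def nary_symmetric_capacity(n, p):
        row = nary_transition_matrix(n, p)[0]
        H_row = -sum(x * log2(x) for x in row if x > 0)   # H(B/A), identical for every input
        return log2(n) - H_row                            # H(B) = log2 n at equiprobable input

    print(nary_symmetric_capacity(2, 0.0))   # noiseless binary channel: 1 bit
    print(nary_symmetric_capacity(4, 0.1))   # 4-ary symmetric channel, p = 0.1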
