Principles of Communications


Principles of Communications
By: Vinh Dang Quang

Course information
  • Lecturer: MSc Dang Quang Vinh
  • Mail: dg_vinh@yahoo.com
  • Mobile: 0983692806
  • Duration: 30 hrs

Outline
  • Basic concepts
  • Information
  • Entropy
  • Joint and Conditional Entropy
  • Channel Representations
  • Channel Capacity

Basic concepts: What is Information Theory?
  • Information theory asks how much information
      • … is contained in a signal?
      • … can a system generate?
      • … can a channel transmit?
  • Used in many fields: communications, computer science, economics, …
  • Example: Barcelona 0-3 SLNA

Information
  • Let xj be an event with probability p(xj).
  • If xj occurred, we gain I(xj) = log_a[1/p(xj)] = −log_a p(xj) units of information.
  • The base a of the logarithm sets the unit:
      • a = 10 → the measure of information is the hartley
      • a = e → the measure of information is the nat
      • a = 2 → the measure of information is the bit
  • Example 10.1 (page 669)

Entropy
  • H(X) = −∑ p(x) log p(x)
  • Entropy = information = uncertainty.
  • If a signal is completely predictable, it has zero entropy and carries no information.
  • Entropy = average number of bits required to transmit the signal.

Entropy example 1
  • Random variable with a uniform distribution over 32 outcomes.
  • H(X) = −∑ (1/32) log2(1/32) = log2 32 = 5 bits.
  • Number of bits required to index 32 outcomes = log2 32 = 5 bits!
  • Therefore H(X) = number of bits required to represent a random event.
  • How many bits are needed for:
      • the outcome of a coin toss?
      • "tomorrow is a Wednesday"?
      • "US tops Winter Olympics tally"?

Entropy example 2
  • Horse race with 8 horses, with winning probabilities 1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64.
  • Entropy: H(X) = 2 bits.
  • How many bits do we need?
      • (a) Index each horse: log2 8 = 3 bits.
      • (b) Assign shorter codes to the more probable horses: 0, 10, 110, 1110, 111100, 111101, 111110, 111111 → average description length = 2 bits!

Entropy
  • We need at least H(X) bits to represent X.
  • H(X) is a lower bound on the required description length.
  • Entropy = uncertainty of a random variable.
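A minimal Python sketch (not part of the original slides) that re-derives the numbers in the entropy examples above; the function names and structure are illustrative only.

```python
import math

def self_information(p, base=2):
    """I(x) = -log_a p(x); base 2 gives bits, base e nats, base 10 hartleys."""
    return -math.log(p, base)

def entropy(probs, base=2):
    """H(X) = -sum p(x) log_a p(x), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Example 1: uniform distribution over 32 outcomes -> H(X) = log2(32) = 5 bits.
print(self_information(1 / 32))      # ≈ 5.0 bits for one of 32 equally likely outcomes
print(entropy([1 / 32] * 32))        # ≈ 5.0 bits

# Example 2: the 8-horse race -> H(X) = 2 bits.
horse_probs = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]
print(entropy(horse_probs))          # ≈ 2.0 bits

# Fixed-length indexing of 8 horses needs log2(8) = 3 bits, but the variable-length
# code 0, 10, 110, 1110, 111100, 111101, 111110, 111111 reaches the entropy bound:
code_lengths = [1, 2, 3, 4, 6, 6, 6, 6]
print(sum(p * l for p, l in zip(horse_probs, code_lengths)))   # 2.0 bits on average
```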
Joint and conditional entropy
  • Joint entropy: H(X,Y) = −∑x ∑y p(x,y) log p(x,y) — a simple extension of entropy to two random variables.
  • Conditional entropy: H(Y|X) = ∑x p(x) H(Y|X=x) = −∑x ∑y p(x,y) log p(y|x) — "What is the uncertainty of Y if X is known?"
  • Easy to verify:
      • If X and Y are independent, then H(Y|X) = H(Y).
      • If Y = X, then H(Y|X) = 0.
      • H(Y|X) is the extra information needed about Y once X is known.
  • Fact (chain rule): H(X,Y) = H(X) + H(Y|X).

Mutual Information
  • I(X;Y) = H(X) − H(X|Y) = the reduction in uncertainty about X due to knowledge of Y.
  • I(X;Y) = ∑x ∑y p(x,y) log [ p(x,y) / (p(x) p(y)) ]
  • "How much information about Y is contained in X?"
  • If X and Y are independent, then I(X;Y) = 0.
  • If X and Y are the same, then I(X;Y) = H(X) = H(Y).
  • Mutual information is symmetric and non-negative.

Mutual Information (continued)
  • Relationship between entropy, joint entropy and mutual information: I(X;Y) = H(X) + H(Y) − H(X,Y).
  • I(X;Y) is a useful measure of similarity between X and Y, widely used in image and signal processing.
  • Medical imaging example: MI-based image registration.
  • Why MI? It is insensitive to gain and bias.
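The following Python sketch (again not from the slides) illustrates joint entropy, conditional entropy, and mutual information on a small, made-up 2x2 joint distribution; the conditional entropy and mutual information are obtained from the chain rule H(X,Y) = H(X) + H(Y|X) and the identity I(X;Y) = H(Y) − H(Y|X).

```python
import math

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint pmf p(x, y), stored as p_xy[x][y] (values chosen for the example).
p_xy = {0: {0: 0.4, 1: 0.1},
        1: {0: 0.1, 1: 0.4}}

p_x = {x: sum(row.values()) for x, row in p_xy.items()}            # marginal of X
p_y = {y: sum(p_xy[x][y] for x in p_xy) for y in p_xy[0]}          # marginal of Y

H_X = H(p_x.values())
H_Y = H(p_y.values())
H_XY = H([p for row in p_xy.values() for p in row.values()])       # H(X,Y)

H_Y_given_X = H_XY - H_X        # chain rule: H(X,Y) = H(X) + H(Y|X)
I_XY = H_Y - H_Y_given_X        # I(X;Y) = H(Y) - H(Y|X) = H(X) - H(X|Y)

print(H_X, H_Y)                 # 1.0 1.0
print(H_XY)                     # ≈ 1.72 bits
print(H_Y_given_X)              # ≈ 0.72 bits (uncertainty left in Y after seeing X)
print(I_XY)                     # ≈ 0.28 bits (0 if independent, H(X) if Y = X)
```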


Table of contents

  • Principles of Communications

  • Course information

  • Outline

  • Basic concepts What is Information Theory?

  • Information

  • Entropy

  • Entropy example 1

  • Entropy example 2

  • Slide 9

  • Joint and conditional entropy

  • Mutual Information

  • Slide 12

  • Slide 13
