1 Illustration of the TSP algorithm

2 Key idea
[...]

3 [...]

4 Application to convolutional code
• The two encoder output bits are XOR (⊕) combinations of the current input bit and the encoder state bits [...]

5 Use encoder state space (Trellis Diagram)
[figure: trellis diagram of the encoder state space]

6 [...]

7 Viterbi Decoder action
[...]

8 [...]

9 Convolutional Codes
[...]

10 [...]

[...]
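The encoder and decoder slides above are only partially recoverable, so here is a minimal runnable sketch of what they describe: a rate-½ convolutional encoder and a hard-decision Viterbi decoder. The generators are an assumption on my part: the distance spectrum quoted later in the deck (T(D) = D^5 + 2D^6 + 4D^7 + …, d_free = 5) matches the classic constraint-length-3 code with octal generators (7, 5), so the sketch uses that code; the function names and test message are illustrative choices, not the slides' own.

```python
# Sketch of a rate-1/2 convolutional encoder and a hard-decision Viterbi
# decoder over its 4-state trellis, assuming the classic (7, 5) octal
# generators (consistent with the transfer function quoted later).

G = (0b111, 0b101)  # assumed generator polynomials, constraint length 3

def encode(bits):
    """Encode a bit sequence; the state is the last two input bits."""
    state, out = 0, []
    for u in bits:
        reg = (u << 2) | state                 # register = [u, s1, s2]
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = (reg >> 1) & 0b11              # next state = (u, s1)
    return out

def viterbi(received):
    """Hard-decision Viterbi decoding: keep the best path into each state."""
    INF = float("inf")
    metric = {0: 0, 1: INF, 2: INF, 3: INF}    # path metric per state
    paths = {s: [] for s in metric}
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric = {s: INF for s in metric}
        new_paths = dict(paths)
        for s in metric:                       # previous state
            if metric[s] == INF:
                continue
            for u in (0, 1):                   # hypothesised input bit
                reg = (u << 2) | s
                branch = [bin(reg & g).count("1") % 2 for g in G]
                ns = (reg >> 1) & 0b11
                m = metric[s] + sum(a != b for a, b in zip(branch, r))
                if m < new_metric[ns]:         # survivor selection
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)         # best end state
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 0]                    # trailing zeros flush the state
code = encode(msg)
code[3] ^= 1                                   # inject a single channel error
assert viterbi(code) == msg                    # the decoder corrects it
```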
12 Transfer Function
• The transfer function:
  T(D,L,N) = D^5 L^3 N / (1 − D L (1+L) N)

13 Transfer Function (cont'd)
• Performing long division:
  T(D,L,N) = D^5 L^3 N + D^6 L^4 N^2 + D^6 L^5 N^2 + D^7 L^5 N^3 + …
• If interested in the Hamming distance property of the code only, set N = 1 and L = 1 to get the distance transfer function:
  T(D) = D^5 + 2 D^6 + 4 D^7 + …
  There is one code sequence of weight 5, therefore d_free = 5. There are two code sequences of weight 6, four code sequences of weight 7, …

14 performance
• PERFORMANCE: the theoretical uncoded BER is given by
  P_uncoded ≈ Q(√(E_b / (N_0 / 2)))
  where E_b is the energy per information bit; for the uncoded channel, E_s/N_0 = E_b/N_0, since there is one channel symbol per bit.
• For the coded channel with rate k/n, n E_s = k E_b and thus E_s = E_b k/n. The loss in signal-to-noise ratio is thus −10 log_10(k/n) dB; for rate ½ codes we thus lose 3 dB in SNR at the receiver [...]
• The event error probability is defined as the probability that the decoder selects a code sequence that was not transmitted.
  [figure: correct and incorrect paths through a trellis node]
• For two codewords at distance d, the Pairwise Error Probability is
  PEP(d) = Σ_{i=(d+1)/2}^{d} C(d,i) p^i (1−p)^(d−i) ≤ 2^d (1−p)^d (p/(1−p))^(d/2) = (√(4 p (1−p)))^d
• The upper bound for the event error probability is given by
  P_event ≤ Σ_{d=d_free}^{∞} A(d) PEP(d)
  where A(d) is the number of codewords at distance d.

15 performance
• Using T(D,L,N), we can formulate this as
  P_event ≤ T(D,L,N) evaluated at L = N = 1, D = 2√(p(1−p))
• The bit error rate (not probability) is written as
  P_bit ≤ dT(D,L,N)/dN evaluated at L = 1, N = 1, D = 2√(p(1−p))

16 The constraint length of the ½ convolutional [...]
• [...] proportional to 2^K (number of different states)

17 Markov model example
[figure: Markov model example, from Huang et al.]

18 Markov Model
• What is the probability of 5 consecutive up days?
• Sequence is up-up-up-up-up, i.e., state sequence is 1-1-1-1-1
• P(1,1,1,1,1) = π_1 a_11 a_11 a_11 a_11 = 0.5 × (0.6)^4 = 0.0648

19 Application to Hidden Markov Models
Definition: The HMM is a finite set of states, each of which is associated with a probability distribution. Transitions among the states are governed by a set of probabilities called transition probabilities. In a particular state, an outcome or observation can be generated according to the associated probability distribution. It is only the outcome, not the state, that is visible to an external observer; the states are therefore "hidden" to the outside, hence the name Hidden Markov Model.
EXAMPLE APPLICATION: speech recognition and synthesis

20 HMM
[figure: three-state HMM showing transition probabilities among states 1, 2, 3, initial state probabilities, and per-state output probabilities P(up), P(down), P(no-change)]

21 transition [...]
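The long division on slide 13 can be reproduced mechanically. Below is a minimal sketch (all names are mine, not the slides') that expands T(D,L,N) = D^5 L^3 N / (1 − D L (1+L) N) as a truncated power series via the geometric series 1/(1−x) = 1 + x + x^2 + …; the leading terms reproduce the expansion and, at L = N = 1, the distance spectrum quoted above.

```python
# Expand T(D,L,N) as a truncated power series. Terms are dicts mapping
# exponent triples (d, l, n) to integer coefficients.

def mul(a, b, max_d=8):
    """Multiply two sparse polynomials, dropping powers of D above max_d."""
    out = {}
    for (d1, l1, n1), c1 in a.items():
        for (d2, l2, n2), c2 in b.items():
            if d1 + d2 <= max_d:
                key = (d1 + d2, l1 + l2, n1 + n2)
                out[key] = out.get(key, 0) + c1 * c2
    return out

x = {(1, 1, 1): 1, (1, 2, 1): 1}         # x = D L (1+L) N = DLN + DL^2N
geom = {(0, 0, 0): 1}                     # running power x^k
series = {}                               # accumulates 1 + x + x^2 + ...
for _ in range(8):
    for key, c in geom.items():
        series[key] = series.get(key, 0) + c
    geom = mul(geom, x)

T = mul({(5, 3, 1): 1}, series)           # multiply by numerator D^5 L^3 N
for (d, l, n), c in sorted(T.items()):
    print(f"{c} * D^{d} L^{l} N^{n}")     # leading terms match slide 13

# Setting L = N = 1 collapses this to T(D) = D^5 + 2 D^6 + 4 D^7 + ...
```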
$B+(' (>+,!&)&'-4'( J 8'!&'9 18 Markov model example Figure from Huang et al, via 19 Markov Model . What is the probability of 5 consecutive up days? . Sequence is up-up-up-up-up I.e., state sequence is 1-1-1-1-1 .

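Slide 18's Markov-chain computation, written out as code; the values π_1 = 0.5 and a_11 = 0.6 are the slide's own.

```python
# Slide 18: probability of the state sequence 1-1-1-1-1 (five consecutive
# "up" days) in a Markov chain with initial probability pi_1 = 0.5 and
# self-transition probability a_11 = 0.6.

pi_1, a_11 = 0.5, 0.6
p = pi_1 * a_11 ** 4
print(p)   # 0.0648
```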
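Slide 19's definition pairs naturally with the forward algorithm, which computes the probability of an observation sequence under an HMM. Since slide 20's diagram survives only as number residue, every probability below is a hypothetical placeholder; only the three-state structure and the up/down/no-change outputs come from the slides.

```python
# Sketch of the forward algorithm for a three-state HMM like slide 20's.
# All numbers are hypothetical placeholders (rows normalized to sum to 1).

import numpy as np

pi = np.array([0.5, 0.3, 0.2])                 # initial state probabilities
A = np.array([[0.6, 0.2, 0.2],                 # transition probabilities
              [0.3, 0.5, 0.2],
              [0.4, 0.3, 0.3]])
B = np.array([[0.7, 0.2, 0.1],                 # P(up/down/no-change | state)
              [0.1, 0.6, 0.3],
              [0.3, 0.3, 0.4]])

UP, DOWN, NO_CHANGE = 0, 1, 2
obs = [UP, UP, DOWN]

# alpha[i] = P(observations so far, current state = i)
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
print(alpha.sum())                             # P(up, up, down)
```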
