Introduction to Thermodynamics and Statistical Physics, Part 2


Chapter 1. The Principle of Largest Uncertainty (continued)

where l = 0, 1, 2, ..., L. A stationary point of σ occurs iff for every small change δ\bar{p}, which is orthogonal to all vectors \bar{\nabla}g_0, \bar{\nabla}g_1, \bar{\nabla}g_2, ..., \bar{\nabla}g_L, one has

0 = \delta\sigma = \bar{\nabla}\sigma \cdot \delta\bar{p} .   (1.39)

This condition is fulfilled only when the vector \bar{\nabla}σ belongs to the subspace spanned by the vectors {\bar{\nabla}g_0, \bar{\nabla}g_1, \bar{\nabla}g_2, ..., \bar{\nabla}g_L} [see also the discussion below Eq. (1.12) above]. In other words, only when

\bar{\nabla}\sigma = \xi_0 \bar{\nabla}g_0 + \xi_1 \bar{\nabla}g_1 + \xi_2 \bar{\nabla}g_2 + \cdots + \xi_L \bar{\nabla}g_L ,   (1.40)

where the numbers ξ_0, ξ_1, ..., ξ_L, which are called Lagrange multipliers, are constants. Using Eqs. (1.2), (1.5) and (1.37) the condition (1.40) can be expressed as

-\log p_m - 1 = \xi_0 + \sum_{l=1}^{L} \xi_l X_l(m) .   (1.41)

From Eq. (1.41) one obtains

p_m = \exp(-1 - \xi_0) \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) .   (1.42)

The Lagrange multipliers ξ_0, ξ_1, ..., ξ_L can be determined from Eqs. (1.5) and (1.37)

1 = \sum_m p_m = \exp(-1 - \xi_0) \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) ,   (1.43)

\langle X_l \rangle = \sum_m p_m X_l(m) = \exp(-1 - \xi_0) \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) X_l(m) .   (1.44)

Using Eqs. (1.42) and (1.43) one finds

p_m = \frac{\exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right)}{\sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right)} .   (1.45)

In terms of the partition function Z, which is defined as

Z = \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) ,   (1.46)

one finds

p_m = \frac{1}{Z} \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) .   (1.47)

Using the same arguments as in section 1.1.2 above [see Eq. (1.16)] it is easy to show that at the stationary point that occurs for the probability distribution given by Eq. (1.47) the entropy obtains its largest value.

1.2.1 Useful Relations

The expectation value ⟨X_l⟩ can be expressed as

\langle X_l \rangle = \sum_m p_m X_l(m) = \frac{1}{Z} \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) X_l(m) = -\frac{1}{Z} \frac{\partial Z}{\partial \xi_l} = -\frac{\partial \log Z}{\partial \xi_l} .   (1.48)

Similarly, ⟨X_l^2⟩ can be expressed as

\langle X_l^2 \rangle = \sum_m p_m X_l^2(m) = \frac{1}{Z} \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) X_l^2(m) = \frac{1}{Z} \frac{\partial^2 Z}{\partial \xi_l^2} .   (1.49)

Using Eqs. (1.48) and (1.49) one finds that the variance of the variable X_l is given by

\left\langle (\Delta X_l)^2 \right\rangle = \left\langle (X_l - \langle X_l \rangle)^2 \right\rangle = \frac{1}{Z} \frac{\partial^2 Z}{\partial \xi_l^2} - \left( \frac{1}{Z} \frac{\partial Z}{\partial \xi_l} \right)^2 .   (1.50)

However, using the following identity

\frac{\partial^2 \log Z}{\partial \xi_l^2} = \frac{\partial}{\partial \xi_l} \frac{1}{Z} \frac{\partial Z}{\partial \xi_l} = \frac{1}{Z} \frac{\partial^2 Z}{\partial \xi_l^2} - \left( \frac{1}{Z} \frac{\partial Z}{\partial \xi_l} \right)^2 ,   (1.51)

one finds

\left\langle (\Delta X_l)^2 \right\rangle = \frac{\partial^2 \log Z}{\partial \xi_l^2} .   (1.52)

Note that the above results, Eqs. (1.48) and (1.52), are valid only when Z is expressed as a function of the Lagrange multipliers, namely

Z = Z(\xi_1, \xi_2, ..., \xi_L) .   (1.53)

Using the definition of entropy (1.2) and Eq. (1.47) one finds

\sigma = -\sum_m p_m \log p_m = -\sum_m p_m \log\left( \frac{1}{Z} \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) \right) = \sum_m p_m \left( \log Z + \sum_{l=1}^{L} \xi_l X_l(m) \right) = \log Z + \sum_{l=1}^{L} \xi_l \sum_m p_m X_l(m) ,   (1.54)

thus

\sigma = \log Z + \sum_{l=1}^{L} \xi_l \langle X_l \rangle .   (1.55)

Using the above relations one can also evaluate the partial derivative of the entropy σ when it is expressed as a function of the expectation values, namely

\sigma = \sigma(\langle X_1 \rangle, \langle X_2 \rangle, ..., \langle X_L \rangle) .   (1.56)

Using Eq. (1.55) one has

\frac{\partial \sigma}{\partial \langle X_l \rangle} = \frac{\partial \log Z}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \langle X_{l'} \rangle \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \xi_{l'} \frac{\partial \langle X_{l'} \rangle}{\partial \langle X_l \rangle} = \frac{\partial \log Z}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \langle X_{l'} \rangle \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \xi_l = \sum_{l'=1}^{L} \frac{\partial \log Z}{\partial \xi_{l'}} \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \langle X_{l'} \rangle \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \xi_l ,   (1.57)

thus using Eq. (1.48) one finds

\frac{\partial \sigma}{\partial \langle X_l \rangle} = \xi_l .   (1.58)
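As a concrete illustration of Eqs. (1.46)-(1.48) and (1.52), the short Python sketch below builds Z for a toy problem with a single constrained quantity and checks the moment relations by finite differences. The state values X(m) and the value of the Lagrange multiplier ξ are arbitrary choices made only for this illustration; nothing here is taken from the text beyond the formulas themselves.

```python
import numpy as np

# Toy example: one constrained quantity X_1 with hypothetical values X(m)
# over M = 4 microstates; xi is an arbitrarily chosen Lagrange multiplier.
X = np.array([0.0, 1.0, 2.0, 5.0])
xi = 0.7

def log_Z(xi):
    """log of the partition function Z(xi) = sum_m exp(-xi * X(m)), Eq. (1.46)."""
    return np.log(np.sum(np.exp(-xi * X)))

# LUE probabilities, Eq. (1.47)
p = np.exp(-xi * X - log_Z(xi))

# Moments computed directly from the distribution
mean_direct = np.sum(p * X)
var_direct = np.sum(p * X**2) - mean_direct**2

# Moments from derivatives of log Z, Eqs. (1.48) and (1.52),
# approximated by central finite differences
h = 1e-5
mean_from_Z = -(log_Z(xi + h) - log_Z(xi - h)) / (2 * h)
var_from_Z = (log_Z(xi + h) - 2 * log_Z(xi) + log_Z(xi - h)) / h**2

print(mean_direct, mean_from_Z)   # the two values should agree closely
print(var_direct, var_from_Z)
```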
1.2.2 The Free Entropy

The free entropy σ_F is defined as the term log Z in Eq. (1.54)

\sigma_F = \log Z = \sigma - \sum_{l=1}^{L} \xi_l \sum_m p_m X_l(m) = -\sum_m p_m \log p_m - \sum_{l=1}^{L} \xi_l \sum_m p_m X_l(m) .   (1.59)

The free entropy is commonly expressed as a function of the Lagrange multipliers

\sigma_F = \sigma_F(\xi_1, \xi_2, ..., \xi_L) .   (1.60)

We have seen above that the LUE maximizes σ for given values of the expectation values ⟨X_1⟩, ⟨X_2⟩, ..., ⟨X_L⟩. We show below that a similar result can be obtained for the free entropy σ_F with respect to given values of the Lagrange multipliers.

Claim. The LUE maximizes σ_F for given values of the Lagrange multipliers ξ_1, ξ_2, ..., ξ_L.

Proof. As before, the normalization condition is expressed as

0 = g_0(\bar{p}) = \sum_m p_m - 1 .   (1.61)

At a stationary point of σ_F, as we have seen previously, the following holds

\bar{\nabla}\sigma_F = \eta \bar{\nabla}g_0 ,   (1.62)

where η is a Lagrange multiplier. Thus

-(\log p_m + 1) - \sum_{l=1}^{L} \xi_l X_l(m) = \eta ,   (1.63)

or

p_m = \exp(-\eta - 1) \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right) .   (1.64)

This result is the same as the one given by Eq. (1.42). Taking into account the normalization condition (1.61) one obtains the same expression for p_m as the one given by Eq. (1.47). Namely, the stationary point of σ_F corresponds to the LUE probability distribution. Since

\frac{\partial^2 \sigma_F}{\partial p_m \partial p_{m'}} = -\frac{1}{p_m} \delta_{m,m'} < 0 ,   (1.65)

one concludes that this stationary point is a maximum point [see Eq. (1.16)].

1.3 The Principle of Largest Uncertainty in Statistical Mechanics

The energy and number of particles of state e_m are denoted by U(m) and N(m) respectively. The probability that state e_m is occupied is denoted as p_m. We consider below three cases (see Table 1.1). In the first case (microcanonical distribution) the system is isolated and its total energy U and number of particles N are constrained, that is, for all accessible states U(m) = U and N(m) = N. In the second case (canonical distribution) the system is allowed to exchange energy with the environment, and we assume that its average energy ⟨U⟩ is given. However, its number of particles is constrained, that is N(m) = N. In the third case (grandcanonical distribution) the system is allowed to exchange both energy and particles with the environment, and we assume that both the average energy ⟨U⟩ and the average number of particles ⟨N⟩ are given. However, in all cases, the probability distribution {p_m} is not given.

Table 1.1. The microcanonical, canonical and grandcanonical distributions.

                                  energy                    number of particles
  microcanonical distribution     constrained: U(m) = U     constrained: N(m) = N
  canonical distribution          average ⟨U⟩ is given      constrained: N(m) = N
  grandcanonical distribution     average ⟨U⟩ is given      average ⟨N⟩ is given

According to the principle of largest uncertainty in statistical mechanics the LUE is employed to estimate the probability distribution {p_m}, namely, we will seek a probability distribution which is consistent with the normalization condition (1.1) and with the given expectation values (energy, in the second case, and both energy and number of particles, in the third case), and which maximizes the entropy.
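The claim that the constrained maximum of σ is attained by the exponential form (1.47) can also be checked numerically. The sketch below is a toy with hypothetical state values X(m) and a hypothetical target ⟨X⟩: it maximizes the entropy directly over {p_m} subject to the constraints (1.5) and (1.37) with scipy's SLSQP optimizer, then compares the result with the LUE distribution.

```python
import numpy as np
from scipy.optimize import minimize, brentq

# Hypothetical toy problem: 4 states, one observable X(m), target <X> = 1.5
X = np.array([0.0, 1.0, 2.0, 5.0])
X_target = 1.5

def neg_entropy(p):
    # sigma = -sum_m p_m log p_m ; we minimize its negative
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},           # normalization, Eq. (1.5)
    {"type": "eq", "fun": lambda p: np.sum(p * X) - X_target},  # given <X>, Eq. (1.37)
]
p0 = np.full(4, 0.25)
res = minimize(neg_entropy, p0, bounds=[(1e-12, 1.0)] * 4, constraints=constraints)
p_numeric = res.x

# LUE form (1.47): p_m = exp(-xi X(m)) / Z, with xi tuned so that <X> = X_target
def mean_X(xi):
    w = np.exp(-xi * X)
    return np.sum(w * X) / np.sum(w)

xi = brentq(lambda s: mean_X(s) - X_target, -10.0, 10.0)
p_lue = np.exp(-xi * X) / np.sum(np.exp(-xi * X))

print(np.round(p_numeric, 4))
print(np.round(p_lue, 4))   # the two distributions should agree closely
```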
1.3.1 Microcanonical Distribution

In this case no expectation values are given. Thus we seek a probability distribution which is consistent with the normalization condition (1.1), and which maximizes the entropy. The desired probability distribution is

p_1 = p_2 = \cdots = 1/M ,   (1.66)

where M is the number of accessible states of the system [see also Eq. (1.18)]. Using Eq. (1.2) the entropy for this case is given by

\sigma = \log M .   (1.67)

1.3.2 Canonical Distribution

Using Eq. (1.47) one finds that the probability distribution is given by

p_m = \frac{1}{Z_c} \exp(-\beta U(m)) ,   (1.68)

where β is the Lagrange multiplier associated with the given expectation value ⟨U⟩, and the partition function is given by

Z_c = \sum_m \exp(-\beta U(m)) .   (1.69)

The term exp(-βU(m)) is called the Boltzmann factor. Moreover, Eq. (1.48) yields

\langle U \rangle = -\frac{\partial \log Z_c}{\partial \beta} ,   (1.70)

Eq. (1.52) yields

\left\langle (\Delta U)^2 \right\rangle = \frac{\partial^2 \log Z_c}{\partial \beta^2} ,   (1.71)

and Eq. (1.55) yields

\sigma = \log Z_c + \beta \langle U \rangle .   (1.72)

Using Eq. (1.58) one can express the Lagrange multiplier β as

\beta = \frac{\partial \sigma}{\partial U} .   (1.73a)

The temperature τ = 1/β is defined as

\frac{1}{\tau} = \beta .   (1.74)

Exercise 1.3.1. Consider a system that can be in one of two states having energies ±ε/2. Calculate the average energy ⟨U⟩ and the variance ⟨(ΔU)²⟩ in thermal equilibrium at temperature τ.

Solution: The partition function is given by Eq. (1.69)

Z_c = \exp\left( \frac{\beta \varepsilon}{2} \right) + \exp\left( -\frac{\beta \varepsilon}{2} \right) = 2 \cosh\left( \frac{\beta \varepsilon}{2} \right) ,   (1.75)

thus using Eqs. (1.70) and (1.71) one finds

\langle U \rangle = -\frac{\varepsilon}{2} \tanh\left( \frac{\beta \varepsilon}{2} \right) ,   (1.76)

and

\left\langle (\Delta U)^2 \right\rangle = \left( \frac{\varepsilon}{2} \right)^2 \frac{1}{\cosh^2 \frac{\beta \varepsilon}{2}} ,   (1.77)

where β = 1/τ.

[Figure: plot of -tanh(1/x) as a function of x.]
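Here is a minimal numerical check of Exercise 1.3.1: the closed forms (1.76) and (1.77) are compared with a direct evaluation of the two-state canonical distribution. The values of ε and τ below are arbitrary illustrative choices.

```python
import numpy as np

# Numerical check of Exercise 1.3.1: two states with energies +eps/2 and -eps/2.
eps, tau = 1.0, 0.35          # arbitrary illustrative values
beta = 1.0 / tau
E = np.array([+eps / 2, -eps / 2])

# Canonical distribution, Eqs. (1.68)-(1.69)
w = np.exp(-beta * E)
Zc = w.sum()
p = w / Zc

U_avg = np.sum(p * E)
U_var = np.sum(p * E**2) - U_avg**2

# Closed forms, Eqs. (1.76) and (1.77)
U_avg_exact = -(eps / 2) * np.tanh(beta * eps / 2)
U_var_exact = (eps / 2) ** 2 / np.cosh(beta * eps / 2) ** 2

print(U_avg, U_avg_exact)   # identical to machine precision
print(U_var, U_var_exact)
```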
1.3.3 Grandcanonical Distribution

Using Eq. (1.47) one finds that the probability distribution is given by

p_m = \frac{1}{Z_{gc}} \exp(-\beta U(m) - \eta N(m)) ,   (1.78)

where β and η are the Lagrange multipliers associated with the given expectation values ⟨U⟩ and ⟨N⟩ respectively, and the partition function is given by

Z_{gc} = \sum_m \exp(-\beta U(m) - \eta N(m)) .   (1.79)

The term exp(-βU(m) - ηN(m)) is called the Gibbs factor. Moreover, Eq. (1.48) yields

\langle U \rangle = -\left( \frac{\partial \log Z_{gc}}{\partial \beta} \right)_\eta ,   (1.80)

\langle N \rangle = -\left( \frac{\partial \log Z_{gc}}{\partial \eta} \right)_\beta ,   (1.81)

Eq. (1.52) yields

\left\langle (\Delta U)^2 \right\rangle = \left( \frac{\partial^2 \log Z_{gc}}{\partial \beta^2} \right)_\eta ,   (1.82)

\left\langle (\Delta N)^2 \right\rangle = \left( \frac{\partial^2 \log Z_{gc}}{\partial \eta^2} \right)_\beta ,   (1.83)

and Eq. (1.55) yields

\sigma = \log Z_{gc} + \beta \langle U \rangle + \eta \langle N \rangle .   (1.84)

1.3.4 Temperature and Chemical Potential

Probability distributions of macroscopic parameters in statistical mechanics are typically extremely sharp and narrow. Consequently, in many cases no distinction is made between a parameter and its expectation value. That is, the expression for the entropy in Eq. (1.72) can be rewritten as

\sigma = \log Z_c + \beta U ,   (1.85)

and the one in Eq. (1.84) as

\sigma = \log Z_{gc} + \beta U + \eta N .   (1.86)

Using Eq. (1.58) one can express the Lagrange multipliers β and η as

\beta = \left( \frac{\partial \sigma}{\partial U} \right)_N ,   (1.87)

\eta = \left( \frac{\partial \sigma}{\partial N} \right)_U .   (1.88)

The chemical potential µ is defined as

\mu = -\tau \eta .   (1.89)

In the definition (1.2) the entropy σ is dimensionless. Historically, the entropy was defined as

S = k_B \sigma ,   (1.90)

where

k_B = 1.38 \times 10^{-23} \, \mathrm{J\,K^{-1}}   (1.91)

is the Boltzmann constant. Moreover, the historical definition of the temperature is

T = \frac{\tau}{k_B} .   (1.92)

When the grandcanonical partition function is expressed in terms of β and µ (instead of in terms of β and η), it is convenient to rewrite Eqs. (1.80) and (1.81) as (see homework exercise 14 of chapter 1)

\langle U \rangle = -\left( \frac{\partial \log Z_{gc}}{\partial \beta} \right)_\mu + \tau \mu \left( \frac{\partial \log Z_{gc}}{\partial \mu} \right)_\beta ,   (1.93)

\langle N \rangle = \lambda \frac{\partial \log Z_{gc}}{\partial \lambda} ,   (1.94)

where λ is the fugacity, which is defined by

\lambda = \exp(\beta \mu) = e^{-\eta} .   (1.95)

1.4 Time Evolution of Entropy of an Isolated System

Consider a perturbation which results in transitions between the states of an isolated system. Let Γ_rs denote the resulting rate of transition from state r to state s. The probability that state s is occupied is denoted as p_s. The following theorem (the H theorem) states that if for every pair of states r and s

\Gamma_{rs} = \Gamma_{sr} ,   (1.96)

then

\frac{d\sigma}{dt} \geq 0 .   (1.97)

Moreover, equality holds iff p_s = p_r for all pairs of states for which Γ_sr ≠ 0.

To prove this theorem we express the rate of change in the probability p_r in terms of the transition rates

\frac{dp_r}{dt} = \sum_s p_s \Gamma_{sr} - \sum_s p_r \Gamma_{rs} .   (1.98)

The first term represents transitions to state r, whereas the second one represents transitions from state r. Using property (1.96) one finds

\frac{dp_r}{dt} = \sum_s \Gamma_{sr} (p_s - p_r) .   (1.99)

The last result and the definition (1.2) allow calculating the rate of change of the entropy

\frac{d\sigma}{dt} = -\frac{d}{dt} \sum_r p_r \log p_r = -\sum_r \frac{dp_r}{dt} (\log p_r + 1) = -\sum_r \sum_s \Gamma_{sr} (p_s - p_r)(\log p_r + 1) .   (1.100)

On the other hand, using Eq. (1.96) and exchanging the summation indices allows rewriting the last result as

\frac{d\sigma}{dt} = \sum_r \sum_s \Gamma_{sr} (p_s - p_r)(\log p_s + 1) .   (1.101)

Thus, using both expressions (1.100) and (1.101) yields

\frac{d\sigma}{dt} = \frac{1}{2} \sum_r \sum_s \Gamma_{sr} (p_s - p_r)(\log p_s - \log p_r) .   (1.102)

In general, since log x is a monotonically increasing function,

(p_s - p_r)(\log p_s - \log p_r) \geq 0 ,   (1.103)

and equality holds iff p_s = p_r. Thus, in general

\frac{d\sigma}{dt} \geq 0 ,   (1.104)

and equality holds iff p_s = p_r holds for all pairs of states satisfying Γ_sr ≠ 0. When σ becomes time independent the system is said to be in thermal equilibrium. In thermal equilibrium, when all accessible states have the same probability, one finds using the definition (1.2)

\sigma = \log M ,   (1.105)

where M is the number of accessible states of the system.

Note that the rates Γ_rs, which can be calculated using quantum mechanics, indeed satisfy the property (1.96) for the case of an isolated system.
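The H theorem lends itself to a simple numerical illustration. The sketch below integrates the master equation (1.98) for a randomly chosen symmetric rate matrix (the specific rates, the initial distribution and the time step are assumptions made only for this demo) and verifies that σ(t) never decreases and approaches log M, in line with Eqs. (1.104) and (1.105).

```python
import numpy as np

# Illustration of the H theorem: evolve dp_r/dt = sum_s Gamma_sr (p_s - p_r)
# with symmetric, randomly chosen rates and watch sigma(t) grow.
rng = np.random.default_rng(0)
M = 5                                   # number of accessible states
G = rng.uniform(0.0, 1.0, size=(M, M))
Gamma = (G + G.T) / 2                   # enforce Gamma_rs = Gamma_sr, Eq. (1.96)
np.fill_diagonal(Gamma, 0.0)

p = rng.uniform(0.0, 1.0, size=M)
p /= p.sum()                            # arbitrary normalized initial distribution

def entropy(p):
    return -np.sum(p * np.log(p))

dt, steps = 1e-3, 4000
sigmas = [entropy(p)]
for _ in range(steps):
    dp = Gamma.T @ p - Gamma.sum(axis=1) * p   # dp_r/dt, Eq. (1.98)
    p = p + dt * dp
    sigmas.append(entropy(p))

sigmas = np.array(sigmas)
print(np.all(np.diff(sigmas) >= -1e-12))   # sigma never decreases, Eq. (1.104)
print(sigmas[-1], np.log(M))               # approaches log M, Eq. (1.105)
```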
1.5 Thermal Equilibrium

Consider two isolated systems denoted as S_1 and S_2. Let σ_1 = σ_1(U_1, N_1) and σ_2 = σ_2(U_2, N_2) be the entropy of the first and second system respectively, and let σ = σ_1 + σ_2 be the total entropy. The systems are brought to contact and now both energy and particles can be exchanged between the systems. Let δU be an infinitesimal energy, and let δN be an infinitesimal number of particles, which are transferred from system 1 to system 2. The corresponding change in the total entropy is given by

\delta\sigma = \left[ -\left( \frac{\partial \sigma_1}{\partial U_1} \right)_{N_1} + \left( \frac{\partial \sigma_2}{\partial U_2} \right)_{N_2} \right] \delta U + \left[ -\left( \frac{\partial \sigma_1}{\partial N_1} \right)_{U_1} + \left( \frac{\partial \sigma_2}{\partial N_2} \right)_{U_2} \right] \delta N .

[...] the total chemical potential µ_tot is given by

\mu_{tot} = \mu_{int} + \mu_{ex} ,   (1.110)

where µ_int is the internal chemical potential. For example, for particles having charge q in the presence of an electric potential V one has

\mu_{ex} = qV ,   (1.111)

whereas, for particles having mass m in a constant gravitational field g one has

\mu_{ex} = mgz .   (1.112)

[...] temperature τ and chemical potential µ, the grandcanonical free energy obtains its smallest possible value. Our main results are summarized in Table 1.2 below.

Table 1.2. Summary of main results.

  general:
    Z = \sum_m \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right),   p_m = \frac{1}{Z} \exp\left( -\sum_{l=1}^{L} \xi_l X_l(m) \right),
    \langle X_l \rangle = -\partial \log Z / \partial \xi_l,   \left\langle (\Delta X_l)^2 \right\rangle = \partial^2 \log Z / \partial \xi_l^2,
    \sigma = \log Z + \sum_{l=1}^{L} \xi_l \langle X_l \rangle,   \xi_l = \left( \partial \sigma / \partial \langle X_l \rangle \right)_{\{\langle X_n \rangle\}_{n \neq l}}

  microcanonical:
    p_m = 1/M,   \sigma = \log M

  canonical:
    Z_c = \sum_m \exp(-\beta U(m)),   p_m = \frac{1}{Z_c} \exp(-\beta U(m)),
    \langle U \rangle = -\partial \log Z_c / \partial \beta,   \left\langle (\Delta U)^2 \right\rangle = \partial^2 \log Z_c / \partial \beta^2,
    \sigma = \log Z_c + \beta \langle U \rangle,   \beta = (\partial \sigma / \partial U)_N

  grandcanonical:
    Z_{gc} = \sum_m \exp(-\beta U(m) - \eta N(m)),   p_m = \frac{1}{Z_{gc}} \exp(-\beta U(m) - \eta N(m)),
    \langle U \rangle = -(\partial \log Z_{gc} / \partial \beta)_\eta,   \langle N \rangle = -(\partial \log Z_{gc} / \partial \eta)_\beta,
    \left\langle (\Delta U)^2 \right\rangle = (\partial^2 \log Z_{gc} / \partial \beta^2)_\eta,   \left\langle (\Delta N)^2 \right\rangle = (\partial^2 \log Z_{gc} / \partial \eta^2)_\beta,
    \sigma = \log Z_{gc} + \beta \langle U \rangle + \eta \langle N \rangle,   \beta = (\partial \sigma / \partial U)_N,   \eta = (\partial \sigma / \partial N)_U

1.7 Problems Set 1

Note: Problems 1-6 are taken from the book by Reif, chapter 1.

[...] the molecule from its starting point?

7. A multiple choice test contains 20 problems. The correct answer for each problem has to be chosen out of 5 options. Show that the probability to pass the test (namely to have at least 11 correct answers) using guessing only is 5.6 × 10^{-4}.

8. Consider a system of [...] (M is considered in this approach as a continuous random variable). Use Stirling's formula

N! = (2\pi N)^{1/2} N^N \exp\left( -N + \frac{1}{12N} \right)   (1.119)

to show that

f(M) = \frac{1}{\sqrt{2\pi m^2 N}} \exp\left( -\frac{M^2}{2 m^2 N} \right) .   (1.120)

Use this result to evaluate the expectation value and the variance of M.

9. Consider a one dimensional random walk. The probabilities of transiting to the right and left are p and q = 1 - p respectively. The step size [...]
a) [...]
\langle X \rangle = aN(2p - 1) = aN(p - q) .   (1.121)
b) Show that the variance \left\langle (X - \langle X \rangle)^2 \right\rangle is given by
\left\langle (X - \langle X \rangle)^2 \right\rangle = 4 a^2 N p q .   (1.122)

10. A classical harmonic oscillator of mass m and spring constant k oscillates with amplitude a. Show that the probability density function f(x), where f(x)dx is the probability that the mass would be found in the interval dx at x, is given by

f(x) = \frac{1}{\pi \sqrt{a^2 - x^2}} .   (1.123)

11. [...]

[...] = \exp(\beta\mu) \, \frac{\partial \log Z_{gc}}{\partial \lambda} ,   (1.127)

where

\lambda = \exp(\beta \mu)   (1.128)

is the fugacity.

15. Consider an array of N distinguishable two-level (binary) systems. The two-level energies of each system are ±ε/2. Show that the temperature τ of the system is given by

\tau = \frac{\varepsilon}{2 \tanh^{-1} \left( -\frac{2 \langle U \rangle}{N \varepsilon} \right)} ,   (1.129)

where ⟨U⟩ is the average total energy [...]

[...] of the energy of the system is given by

\left\langle (\Delta U)^2 \right\rangle = \frac{N (\hbar\omega/2)^2}{\sinh^2 \frac{\beta \hbar\omega}{2}} .   (1.132)

17. Consider a lattice containing N non-interacting atoms. Each atom has 3 non-degenerate energy levels E_1 = -ε, E_2 = 0, E_3 = ε. The system is at thermal equilibrium at temperature τ.
a) Show that the average energy of the system is

\langle U \rangle = -\frac{2 N \varepsilon \sinh(\beta \varepsilon)}{1 + 2 \cosh(\beta \varepsilon)} ,   (1.133)

where β = 1/τ.
b) Show that the variance of the energy of the system is given by

\left\langle (U - \langle U \rangle)^2 \right\rangle = \frac{2 N \varepsilon^2 \left[ \cosh(\beta \varepsilon) + 2 \right]}{\left[ 1 + 2 \cosh(\beta \varepsilon) \right]^2} .   (1.134)

18. Consider a one dimensional chain containing N ≫ 1 sections (see figure). Each section can be in one of two possible states. In the first one the section contributes a length a to the total length of the chain, whereas in the other state the section has no contribution to the total length of the chain. The total length of the chain is N α, and the [...]
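Several of the closed-form answers quoted above can be verified by direct enumeration. As one example, the following sketch checks the three-level-lattice results (1.133) and (1.134) as reconstructed here; the values of N, ε and τ are arbitrary illustrative choices.

```python
import numpy as np

# Numerical check of the three-level-lattice results quoted in problem 17,
# Eqs. (1.133)-(1.134); the values of N, eps and tau below are arbitrary.
N, eps, tau = 100, 1.0, 0.8
beta = 1.0 / tau
E = np.array([-eps, 0.0, eps])          # single-atom levels

p = np.exp(-beta * E)
p /= p.sum()                            # canonical distribution for one atom

U1 = np.sum(p * E)                      # single-atom average energy
var1 = np.sum(p * E**2) - U1**2         # single-atom energy variance

# N non-interacting atoms: means and variances are additive
U_exact = -2 * N * eps * np.sinh(beta * eps) / (1 + 2 * np.cosh(beta * eps))
var_exact = 2 * N * eps**2 * (np.cosh(beta * eps) + 2) / (1 + 2 * np.cosh(beta * eps))**2

print(N * U1, U_exact)        # agree to machine precision
print(N * var1, var_exact)
```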
