
UNIVERSITY OF ECONOMICS AND BUSINESS

COURSE ASSIGNMENT

Student: Nguyen Huong Giang
Student's ID: 22055812

Instructor: Dr. Nguyen Quoc Hung

HANOI, 2023


TABLE OF CONTENTS

ABSTRACT 3

II. BIOGRAPHIES OF PHYSICISTS 3

III. BACKGROUND OF QED 5

IV. DEVELOPMENT OF THE THEORY 5

1. Nuclear force electrodynamics theory (Feynman, Schwinger) 6

2. Theory of quantum renormalization electrodynamics (Tomonaga) 10

3. Mathematical formulation 11

VI. IMPLICATIONS OF THE NOBEL PRIZE IN PHYSICS IN 1965 17

REFERENCE 20


TABLE OF FIGURES

Figure 1: The elementary components of Feynman Diagrams for Quantum Electrodynamics 7

Figure 2: Compton scattering 8

Figure 3: Addition of probability amplitudes as complex numbers 9

Figure 4: Multiplication of probability amplitudes as complex numbers 9

Figure 5: One-loop contribution to the vacuum polarization function 17

Figure 6: One-loop contribution to the electron self-energy function 17

Figure 7: One-loop contribution to the vertex function 17


ABSTRACT

This scientific report provides an overview of the groundbreaking research that led to the Nobel Prize in Physics in 1965, awarded to Sin-Itiro Tomonaga, Julian Schwinger, and Richard Feynman for their contributions to the development of quantum electrodynamics (QED). The report discusses the limitations of the old model of QED, which only included the exchange of individual photons, and the groundbreaking discoveries made by Tomonaga, Schwinger, and Feynman regarding the complex nature of electron-electron scattering and the exchange of multiple photons. The report also highlights the importance of the new QED model in accurately describing high-energy physics phenomena. The report concludes by emphasizing the significance of the Nobel Prize-winning research in advancing the field of physics and contributing to our understanding of the fundamental principles of the universe.

I INTRODUCTION

The Nobel Prize in Physics is one of the five Nobel Prizes established by the Swedish inventor Alfred Nobel, whose will endowed awards in chemistry, physics, medicine, literature, and peace. After his death in 1896, the idea of creating awards to honor significant contributions in these fields was implemented by the Nobel Foundation. The Nobel Prize in Physics is awarded annually to honor significant achievements in physics. Initially, the award was given to those who made significant contributions to inventions in the field of physics, but it has since expanded to include those who have made important contributions to discoveries and theories in the field.

In 1965, the Nobel Prize in Physics was awarded to three physicists - Richard P. Feynman, Julian Schwinger, and Tomonaga Shin'ichirō - for their pioneering work in the field of quantum electrodynamics (QED). QED is the study of the interaction between electromagnetic radiation and matter, and it is one of the most successful and accurate theories in physics. The contributions of Feynman, Schwinger, and Tomonaga to QED were groundbreaking, and their work laid the foundation for many advances in the field, including the development of the Standard Model of particle physics.

In this report, I will explore the contributions of these three Nobel laureates to the field of QED, their impact on modern physics and technology, and the significance of their work in the broader context of scientific discovery. We will examine the historical context of their discoveries, the methodology and techniques used in their research, and the implications and applications of their findings. Overall, the work of Feynman, Schwinger, and Tomonaga represents a major milestone in the history of physics, and their contributions have had a profound impact on our understanding of the universe and the fundamental laws that govern it. By exploring their discoveries and their enduring legacy, we can gain a deeper appreciation for the power and potential of scientific exploration and discovery.


II BIOGRAPHIES OF PHYSICISTS

Richard Feynman (1918-1988) was an American physicist, and one of the three scientists awarded the Nobel Prize in Physics in 1965. He was born in New York and graduated from the Massachusetts Institute of Technology (MIT) in 1939. He then continued his studies and research at Princeton University, where he received his Ph.D. in 1942. During World War II, Feynman worked on the Manhattan Project and participated in the development of the atomic bomb. After the war, Feynman worked at Cornell University and later moved to the California Institute of Technology. He made significant contributions to the theory of nuclear physics, quantum electrodynamics, and quantum optics. Feynman also developed a unique method for calculating quantum electrodynamic effects, known as the Feynman diagram. In addition, Feynman was a talented and renowned educator. He wrote many books on physics and other subjects but is best known for "The Feynman Lectures on Physics," a collection of his lectures at the California Institute of Technology, where he taught from 1950 until his death. This book has become an important teaching resource in the field of physics and is considered one of the greatest works on physics in the 20th century.

Julian Schwinger (1918-1994) was an American physicist, and one of the three scientists awarded the Nobel Prize in Physics in 1965. He was born in New York and graduated from Columbia University in 1936, where he also received his Ph.D. in 1939. Schwinger made significant contributions to quantum electrodynamics and was one of the first to develop renormalization theory in quantum field theory. Schwinger taught at Harvard University from 1945 to 1972 and then moved to the University of California, Los Angeles. He made significant contributions to the theory of quantum electrodynamics and nuclear physics and helped create important computational tools to study the interactions between particles. In addition, Schwinger developed applications of quantum electrodynamics to other problems, such as statistical physics and quantum statistical mechanics. He also developed new research methods to solve problems in nuclear physics and quantum optics.

Tomonaga Shin'ichirō (1906–1979) was a Japanese physicist and the last of the three scientists to win the Nobel Prize in Physics in 1965. He was born in Tokyo and graduated from Kyoto University in 1929. He then continued his studies and research at the University of Leipzig in Germany and received his Ph.D. in 1938. Tomonaga returned to Japan and became a professor at Tokyo Bunrika University (later the Tokyo University of Education), where he continued his research on the theory of electromagnetic interactions with matter and electrons and made significant contributions to quantum electrodynamics. He developed the renormalization method of calculation to solve the problem of infinities in quantum electrodynamic theory. Tomonaga also helped determine how quantum electrodynamics and Albert Einstein's theory of relativity could be combined to form a relativistic quantum electrodynamic theory. He also developed applications of this theory to other problems, including the theory of nuclear structure.

III BACKGROUND OF QED

Scientists needed a quantum-mechanical explanation of light when they observed phenomena that could not be explained by the classical theory of electromagnetic radiation developed by the British physicist James Clerk Maxwell in the 1860s. This classical theory described the behavior of light as waves of vibrating electric and magnetic fields. However, after the German physicist Heinrich Hertz discovered the photoelectric effect in 1887, experiments showed that the energy of the electrons produced by light depended only on the wavelength of the light, not on its intensity. The German physicist Max Planck's further work on light suggested that it may come in tiny packets, or quanta, of energy, while Einstein proposed in 1905 that light could be composed of particles called photons.

Despite initial skepticism from most physicists, the American physicist Arthur Compton's discovery of the Compton effect in 1923 provided evidence that light has momentum, a property usually associated with particles rather than with electromagnetic waves. This led physicists to develop a new description of light and its interaction with particles, while also working on the important new description of matter known as quantum mechanics. Many of the same physicists, including Planck, Einstein, the Danish physicist Niels Bohr, and the German physicist Arnold Sommerfeld, worked on both problems from 1900 to 1922. The resulting quantum-mechanical explanation of light and matter describes both in terms of waves and particles.
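For concreteness, two standard textbook relations summarize this quantum picture of light (they are quoted here as background and are not formulas from the original report): the energy of a single photon and the Compton wavelength shift,

E = h\nu = \frac{hc}{\lambda}, \qquad \Delta\lambda = \lambda' - \lambda = \frac{h}{m_e c}\left(1 - \cos\theta\right),

where h is Planck's constant, \nu and \lambda are the frequency and wavelength of the light, m_e is the electron mass, and \theta is the angle through which the photon is scattered. The first relation expresses Planck's and Einstein's quanta; the second is the shift Compton measured in 1923, which only makes sense if the photon carries momentum h/\lambda.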

IV DEVELOPMENT OF THE THEORY

In 1926, the German physicists Max Born, Werner Heisenberg, and Ernst Pascual Jordan published the first quantum electrodynamics (QED) theory, which explained that the energy and momentum of the electric and magnetic fields in a light ray come in bundles called photons. The British physicist Paul A. M. Dirac applied the rules of the quantum theory to electromagnetic radiation in 1927, resulting in a theory that explained how atoms emit and absorb photons. Dirac also constructed a description of electrons in 1928 that was consistent with both quantum mechanics and the special theory of relativity; this was important in reconciling quantum mechanics and relativity, since descriptions involving photons and velocities near the speed of light must involve special relativity. Dirac's equation predicted the existence of antimatter.
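For reference, Dirac's 1928 relativistic wave equation for the electron can be written (in a standard natural-units form, quoted here for context rather than taken from the report) as

\left(i\gamma^\mu \partial_\mu - m\right)\psi = 0,

where the \gamma^\mu are the four Dirac matrices, m is the electron mass, and \psi is a four-component spinor. Its negative-energy solutions are what led Dirac to predict the positron, the antiparticle of the electron.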

In the 1930s, physicists such as Heisenberg, Wolfgang Pauli, and J. Robert Oppenheimer added more corrections to QED to make the theory more accurate, but their corrections introduced some troubling infinite terms into the equations of QED. Physicists struggled for almost two decades to remove the infinite terms from the QED equations and keep the theory consistent.


In the late 1940s, Willis Lamb and Robert Retherford discovered the Lamb shift, which showed that the interaction between light and electrons was more complicated than previously believed. This led physicists to redefine QED's description of the electron in a process called renormalization, which removed the infinite terms plaguing the theory and made the QED equations match the new experimental results.

Physicists made changes to QED to account for the Lamb shift, resulting in a set of equations that required the addition of a series of terms, each of which violated the special theory of relativity. In the late 1940s, Richard Feynman, Julian Schwinger, and Tomonaga Shin'ichirō developed versions of QED that were consistent with the special theory of relativity. Feynman's method allowed physicists to represent particle interactions with simple diagrams, called Feynman diagrams. Later, Freeman Dyson showed that the two approaches produced the same results, and that Feynman's approach could be derived from the equations of Schwinger and Tomonaga. Feynman, Schwinger, and Tomonaga won the 1965 Nobel Prize in Physics for their work on QED, which has been one of the most successful theories of modern physics, showing remarkable agreement with experimental results.

1 Nuclear force electrodynamics theory (Feynman, Schwinger)

Towards the end of his life, Richard Feynman delivered a series of lectures on quantum electrodynamics (QED) for the general public. These lectures were later transcribed and published as the book "QED: The Strange Theory of Light and Matter" in 1985. This book is considered a classic non-mathematical exposition of QED, and in it Feynman presented three basic actions that form the foundation of QED.

The first action is the movement of a photon from one place and time to another. The second action is the movement of an electron from one place and time to another. The third action is the emission or absorption of a photon by an electron at a specific place and time. These three actions are visually represented by the three basic elements of Feynman diagrams: a wavy line for the photon, a straight line for the electron, and a junction of two straight lines and a wavy one for a vertex representing the emission or absorption of a photon by an electron.

Feynman diagrams are a form of visual shorthand that allow physicists to represent complex interactions between particles and provide a way to calculate the probabilities of different outcomes. Feynman's presentation of QED in "QED: The Strange Theory of Light and Matter" has become a significant contribution to the popular understanding of quantum mechanics and of the nature of light and matter.
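As a purely illustrative aid (not part of Feynman's notation or of any standard library), the three basic elements can be encoded as simple Python records; the class names below are invented for this sketch.

```python
from dataclasses import dataclass

Point = tuple  # a spacetime point: (t, x, y, z)

@dataclass(frozen=True)
class PhotonLine:    # wavy line: a photon moves between two spacetime points
    start: Point
    end: Point

@dataclass(frozen=True)
class ElectronLine:  # straight line: an electron moves between two spacetime points
    start: Point
    end: Point

@dataclass(frozen=True)
class Vertex:        # junction of two electron lines and one photon line:
    point: Point     # the electron emits or absorbs a photon here

# A diagram is just a collection of these elements, e.g. a single emission:
# an electron comes in, emits a photon at the vertex, and both move on.
emission = [
    ElectronLine(start=(0, 0, 0, 0), end=(1, 0, 0, 0)),
    Vertex(point=(1, 0, 0, 0)),
    ElectronLine(start=(1, 0, 0, 0), end=(2, 1, 0, 0)),
    PhotonLine(start=(1, 0, 0, 0), end=(2, -1, 0, 0)),
]
```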


Figure 1: The elementary components of Feynman diagrams for Quantum Electrodynamics

Richard Feynman introduced a unique shorthand for numerical quantities known as probability amplitudes, in addition to the visual shorthand for the actions in quantum electrodynamics (QED). These amplitudes are complex numbers that describe the probability of a particular event occurring in a quantum system. The probability of a specific outcome is calculated by squaring the absolute value of the total probability amplitude.

Feynman developed a shorthand notation to represent the probability amplitudes associated with the movement of particles. For a photon moving from one place and time A to another place and time B, the associated probability amplitude is written as P(A to B) in Feynman's notation. This quantity depends solely on the momentum and polarization of the photon.

Similarly, for an electron moving from one place and time C to another place and time D, the associated probability amplitude is written as E(C to D). However, this quantity depends not only on the momentum and polarization of the electron but also on a constant known as n, which Feynman referred to as the "bare" mass of the electron. This constant is related to the electron's measured mass but is not identical to it.

Finally, Feynman introduced a quantity known as j to describe the probability amplitude for an electron to emit or absorb a photon. This quantity is sometimes referred to as the "bare" charge of the electron and is a constant related to the electron's measured charge e but not identical to it.

Suppose we have an electron at a specific place and time, labeled A, and a photon at a different place and time, labeled B. A common question in physics is to determine the probability of finding the electron at a later time and a different place, labeled C, and the photon at yet another place and time, labeled D. The most straightforward way to achieve this is for the electron to move from A to C and the photon to move from B to D. These are simple actions known as elementary processes. To calculate the probability of both processes happening together, we need to know the probability amplitudes of each sub-process, denoted E(A to C) and P(B to D). We can then estimate the overall probability amplitude by multiplying these two values, using the multiplication rule for amplitudes of independent sub-processes. Once we have the estimated overall probability amplitude, we can calculate the estimated probability by squaring its absolute value.

There are alternative ways in which the end result of finding the electron at C and the photon at D could occur. For example, the electron could move to a different place and time, labeled E, where it absorbs the photon, before moving on and emitting a new photon at F. The electron then moves to C while the new photon moves to D. To calculate the probability of this more complex process, we need to know the probability amplitudes of each individual action involved: three electron actions, two photon actions, and two vertices (one emission and one absorption). We can estimate the total probability amplitude by multiplying the probability amplitudes of each action for any chosen positions of E and F. To find the actual probability, we then need to add up the probability amplitudes for all possible positions of E and F, which in practice requires integration.
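The bookkeeping just described can be mimicked in a few lines of code. The sketch below is a toy model under explicit assumptions: the functions P and E are invented stand-ins (not the real Feynman propagators), j is an arbitrary number standing for the vertex amplitude, and the integration over the intermediate points E and F is replaced by a sum over a small grid.

```python
import cmath
import itertools

# Toy stand-ins for Feynman's photon amplitude P(A to B) and electron
# amplitude E(A to B).  The functional forms are invented purely so the
# bookkeeping can be shown; they are NOT the physical propagators.
def P(a, b):
    dt, dx = b[0] - a[0], b[1] - a[1]
    return cmath.exp(1j * (dt - dx)) / 10.0

def E(a, b):
    dt, dx = b[0] - a[0], b[1] - a[1]
    return cmath.exp(1j * (dt + 0.5 * dx)) / 10.0

j = 0.1  # toy "bare charge": the amplitude factor for each vertex

A, B, C, D = (0, 0), (0, 1), (4, 0), (4, 1)  # (time, position) labels

# Simplest way: the electron goes A -> C and the photon goes B -> D.
# Amplitudes of the two sub-processes multiply.
amplitude = E(A, C) * P(B, D)

# Next alternative: the electron absorbs the photon at some point E_ and
# emits a new photon at some point F_.  Multiply the amplitudes of all
# seven actions, then sum over the possible intermediate points (a crude
# stand-in for the integration mentioned in the text).
grid = [(t, x) for t in range(1, 4) for x in (0, 1)]
for E_, F_ in itertools.product(grid, repeat=2):
    amplitude += (E(A, E_) * P(B, E_) * j   # electron reaches E_, absorbs the photon
                  * E(E_, F_) * j           # electron travels on to F_, emits a photon
                  * E(F_, C) * P(F_, D))    # electron goes to C, new photon goes to D

probability = abs(amplitude) ** 2           # probability = |total amplitude|^2
print(probability)
```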

Another possibility is that the electron first moves to a different place and time, labeled G, where it emits a photon that moves to D, while the electron moves on to H, where it absorbs the first photon, before finally moving on to C. This process is known as Compton scattering.

Figure 2: Compton scattering

a. Probability amplitudes

Quantum mechanics presents a significant departure from traditional probability calculations. While probabilities are still represented by real numbers, as they are in our daily lives, the way in which probabilities are calculated is different. In quantum mechanics, probabilities are computed as the square of the absolute value of probability amplitudes, which are complex numbers.

For a given process, if two probability amplitudes, v and w, are involved, the probability of the process will be given either by

P = |v + w|^2  (when v and w describe alternative ways the process can happen),

or by

P = |v\,w|^2  (when v and w describe consecutive, independent steps of one process).

The theory of complex numbers employs addition and multiplication as its basic operations, which are illustrated in the figures below. To find the sum of two complex numbers, the second arrow's starting point is placed at the end of the first arrow; the resulting sum is represented by a third arrow that goes directly from the beginning of the first arrow to the end of the second arrow. When multiplying two complex numbers represented by arrows, the length of the product arrow is equal to the product of the two lengths. The direction of the product arrow is determined by adding the angles through which each of the two arrows has been turned relative to a reference direction; this yields the angle by which the product arrow is turned relative to the reference direction.

Figure 3: Addition of probability amplitudes as complex numbers

Figure 4: Multiplication of probability amplitudes as complex numbers
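A brief numerical illustration of these two operations, with arbitrary made-up amplitudes:

```python
import cmath

# Two probability amplitudes as complex numbers ("arrows").
v = 0.3 * cmath.exp(1j * cmath.pi / 6)   # length 0.3, turned 30 degrees
w = 0.4 * cmath.exp(1j * cmath.pi / 3)   # length 0.4, turned 60 degrees

# Alternative ways of reaching the same outcome: ADD the amplitudes,
# then take the squared absolute value to get a probability.
p_alternatives = abs(v + w) ** 2

# Consecutive independent steps of one process: MULTIPLY the amplitudes.
# The lengths multiply (0.3 * 0.4 = 0.12) and the angles add (30 + 60 = 90 degrees).
vw = v * w

print(p_alternatives)
print(abs(vw), cmath.phase(vw))  # 0.12 and pi/2, as the arrow picture predicts
```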


An important detail associated with the polarization of electrons is that electrons are fermions and follow Fermi-Dirac statistics. As a result, if we have the probability amplitude for a complex process involving multiple electrons, we must also include the complementary Feynman diagram in which two electron events are exchanged. In this case, the resulting amplitude is the reverse, or negative, of the first. For example, consider the case of two electrons starting at A and B and ending at C and D. The amplitude would be calculated as the "difference" between E(A to D) × E(B to C) and E(A to C) × E(B to D). This is in contrast to our everyday idea of probabilities, where we would expect a sum.
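A minimal sketch of this antisymmetrisation rule, reusing the idea of a toy electron amplitude (the function E below is an invented stand-in, not the real propagator):

```python
import cmath

def E(a, b):
    """Toy electron amplitude between two (time, position) points.
    The functional form is invented for illustration only."""
    dt, dx = b[0] - a[0], b[1] - a[1]
    return cmath.exp(1j * (dt + 0.5 * dx)) / 10.0

A, B, C, D = (0, 0), (0, 2), (3, 0), (3, 2)

# Two indistinguishable electrons start at A and B and end at C and D.
# Because electrons are fermions, the two ways of pairing the start and
# end points contribute with opposite signs (a "difference", not a sum).
amplitude = E(A, D) * E(B, C) - E(A, C) * E(B, D)
print(abs(amplitude) ** 2)
```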

b. Propagators

To complete the calculation, it is necessary to determine the probability amplitudes for the photon and the electron, denoted P(A to B) and E(C to D), respectively. These probability amplitudes are obtained by solving the Dirac equation, which describes the behavior of the electron's probability amplitude, and Maxwell's equations, which describe the behavior of the photon's probability amplitude. These solutions are known as Feynman propagators. To facilitate understanding and communication, the Feynman notation is often translated into the notation commonly used in the standard literature:

E(C to D) → S_F(x_D − x_C),    P(A to B) → D_F(x_B − x_A),

where a shorthand symbol such as x_A stands for the four real numbers that give the time and the position in three dimensions of the point labeled A.
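For completeness, the momentum-space forms of these propagators, as they appear in standard QED references (quoted here as background, not reproduced from the original report), are

S_F(p) = \frac{i\left(\gamma^\mu p_\mu + m\right)}{p^2 - m^2 + i\varepsilon} \quad \text{(electron)}, \qquad D_F^{\mu\nu}(q) = \frac{-i\,g^{\mu\nu}}{q^2 + i\varepsilon} \quad \text{(photon, Feynman gauge)},

where m is the electron mass, g^{\mu\nu} is the metric tensor, and the small imaginary part i\varepsilon encodes Feynman's boundary conditions; the position-space propagators S_F(x_D − x_C) and D_F(x_B − x_A) above are their Fourier transforms.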

c. Mass renormalization

In the early days of Feynman's approach, a significant problem arose that impeded progress for two decades. Although the approach was based on three fundamental "simple" actions, the rules required that all possible Feynman diagrams with the given endpoints be taken into account when calculating the probability amplitude for an electron to move from point A to point B. This meant that there could be many ways for the electron to travel, such as emitting and absorbing photons at various points along the way. The result was a fractal-like situation in which a line could break up into a collection of "simple" lines, each of which could be further composed of simpler lines, and so on without end. This complexity presented a significant challenge to handle. The situation became even more serious when it was found that the simple correction mentioned above led to infinite probability amplitudes, which was a disaster. To address this issue, the technique of renormalization was developed over time. However, Feynman himself remained dissatisfied with the solution, describing it as a "dippy process." Despite these difficulties, Feynman's approach has revolutionized the field of quantum mechanics and continues to be a valuable tool for understanding the behavior of subatomic particles.
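The essence of mass renormalization can be stated in one schematic relation (a summary of the standard textbook result, not a formula taken from the report):

m_{\text{measured}} = n + \delta m(\Lambda),

where n is the "bare" mass appearing in E(A to B) and \delta m(\Lambda) is the self-energy correction generated by diagrams in which the electron emits and reabsorbs photons. The correction \delta m diverges as the cutoff \Lambda is removed, but only the finite combination m_measured, the experimentally determined electron mass, enters physical predictions; the bare charge j is treated in the same way, yielding the measured charge e.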
