Discoveries in physics in the first quarter of the 21st Century.

The Best Physics of the Century (So Far)

Our century is now a quarter complete, from Y2K to today (2000 – 2025).  What have been the greatest discoveries in Physics so far? And what do they portend for the rest of the century?

Every century of physics tends to have its own character:

The 1600’s were the time of Galileo, Descartes, Huygens, Leibniz, and Newton, who created the science of dynamics out of nothing.

The 1700’s were the time of du Châtelet, Maupertuis, Euler, Lagrange, and D’Alembert, who constructed mathematical physics on the foundation of the calculus.

The 1800’s were the time of Young, Fresnel, Hamilton, Maxwell, Boltzmann, and Lord Kelvin, who completed the program of classical physics.

The 1900’s were the time of Einstein and Bohr, who invented relativistic and quantum physics and launched the grand program of unified forces.

Now we come to the 2000’s. What will this century be known for?

Two topics at the top of physicists’ minds today are quantum and AI (and there is even quantum AI). But AI is merely a tool (though an important one that is radically changing how physics is done), and quantum is a catch-all (almost everything is quantum at its core).

So, what are the greatest breakthroughs of the 21st Century so far?  And what do they portend for the eventual “character” of 21st-Century physics when seen in the rear-view mirror of history by the year 2100?

Single-photon Quantum Information (2001)

The century started on July 24, 2000, when Nature received a landmark paper from Emanuel Knill and Raymond Laflamme at Los Alamos National Laboratory in the United States together with Gerard Milburn of the University of Queensland, Australia (collectively known as KLM). This little-heralded paper proposed a radical new idea in quantum information, an idea that would have profound effects on the development of quantum science over the coming quarter-century.

The idea was simply that quantum logic could be performed with single photons and linear optics [1]. Up to that point, most research on quantum optical computing was trying to get photons to interact with each other (which they really don’t like to do) in nonlinear media like crystals or trapped atoms. What KLM showed was that quantum information could be manipulated in general ways without interactions. The paper proposed a technique that could perform quantum logic in a universal way using only linear optical elements like single-photon sources, beam splitters, phase shifters, and single-photon detectors, introducing the novel idea of “measurement-based” quantum computing.
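To see the kind of interference the KLM scheme exploits, here is a minimal Python sketch (mine, not from the paper) of two-photon Hong-Ou-Mandel interference at a 50:50 beam splitter, computed with the standard permanent formula for passive linear optics. It is not the full KLM protocol, just the photon bunching that makes measurement-based linear-optical logic possible.

```python
import numpy as np
from math import factorial

# A 50:50 beam splitter acting on the two optical modes (one common convention)
U = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

def two_photon_amplitude(U, n_in, n_out):
    """Transition amplitude <n_out| U |n_in> for two photons passing through
    a passive linear-optical network U, via the permanent formula."""
    rows = [i for i, n in enumerate(n_out) for _ in range(n)]
    cols = [j for j, n in enumerate(n_in) for _ in range(n)]
    M = U[np.ix_(rows, cols)]
    perm = M[0, 0] * M[1, 1] + M[0, 1] * M[1, 0]   # 2x2 permanent, written out
    norm = np.sqrt(np.prod([factorial(n) for n in n_in]) *
                   np.prod([factorial(n) for n in n_out]))
    return perm / norm

inp = (1, 1)  # one photon entering each input port
for out in [(2, 0), (1, 1), (0, 2)]:
    p = abs(two_photon_amplitude(U, inp, out)) ** 2
    print(f"P{out} = {p:.2f}")
# Expected: P(2, 0) = 0.50, P(1, 1) = 0.00, P(0, 2) = 0.50  (Hong-Ou-Mandel bunching)
```

Two identical photons entering opposite ports never exit in coincidence; that purely interferometric effect, combined with measurement and feed-forward, is the resource the KLM scheme builds on.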

Fig. 0. LOQC circuitry from the KLM paper.

In the quarter-century since the publication of the KLM paper, LOQC has steadily progressed via the development of single-photon sources and detectors. Today, numerous start-ups are pursuing LOQC, notably Xanadu in Toronto, Canada, and PsiQuantum in Palo Alto, USA, and Brisbane, Australia. By 2100, this century will likely be viewed as the time when applications of quantum information reached their maturity. Where the 20th century was the century of discovery of quantum phenomena, the 21st will be the century when quantum information was reduced to practice.

Solar Neutrino Oscillation (2001)

The sun is fueled by the fusion of hydrogen, which generates electron neutrinos. The reaction looks like

p + p → ²H + e⁺ + νₑ

where p is a proton (hydrogen), ²H is deuterium (a hydrogen nucleus with an extra neutron), e⁺ is a positron (the anti-matter form of an electron) and νₑ is an electron neutrino. This reaction accounts for 99% of the neutrinos generated by the Sun, as calculated by the theoretical astrophysicist John Bahcall of the Institute for Advanced Study in Princeton. Already by the late 1960’s it was suspected that too few of the neutrinos were being detected compared to predictions, so Bahcall teamed with Raymond Davis of Brookhaven National Lab to build an experiment to detect the flux of solar neutrinos. To shield the detector from cosmic rays, the experiment was placed at the 4850-foot level of the Homestake Gold Mine in Lead, South Dakota, and operated from 1970 to 1994. The deficit of solar neutrinos was confirmed, and it was huge: fully two-thirds of the expected solar neutrinos were missing!

The simplest solution to the missing solar neutrinos was that they just weren’t there because, on their way to Earth from the Sun, they had converted to something else that was not detectable. This conversion from one particle to another is possible if neutrinos have a non-zero (but extremely small) mass. If so, then an electron neutrino can convert to a muon neutrino, and if the distance is far enough, they can convert back. In other words, the nature of the neutrino particle is that its identity oscillates. This is called solar neutrino oscillation, and by the time the neutrinos have arrived at Earth, two-thirds of them have converted to muon and tau neutrinos.
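The flavor conversion can be captured in a toy two-flavor vacuum-oscillation formula, P(νₑ→νₑ) = 1 − sin²(2θ) sin²(1.27 Δm² L/E). The sketch below uses illustrative parameter values close to the measured solar ones and ignores the matter (MSW) effects inside the Sun that modify the real answer.

```python
import numpy as np

def survival_probability(L_over_E, delta_m2=7.5e-5, theta=np.radians(33.4)):
    """Two-flavor vacuum survival probability P(nu_e -> nu_e).
    L_over_E in km/GeV, delta_m2 in eV^2, theta in radians (illustrative values)."""
    return 1.0 - np.sin(2 * theta) ** 2 * np.sin(1.27 * delta_m2 * L_over_E) ** 2

for L_over_E in [0.0, 1e4, 2e4, 1e7]:   # km/GeV
    print(f"L/E = {L_over_E:>10.0f} km/GeV   P(nu_e survives) = "
          f"{survival_probability(L_over_E):.2f}")

# At the huge L/E of the Sun-Earth distance the rapid oscillations average out,
# leaving <P> = 1 - 0.5*sin^2(2*theta) ~ 0.58 in this vacuum toy model;
# matter effects inside the Sun push the measured survival down to about 1/3.
```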

There was a general reluctance to accept neutrino oscillations because it represented a departure from the Standard Model of particle physics and introduced uncomfortably small masses for neutrinos that otherwise behave like massless particles. Two experiments put these qualms to rest: the Super-Kamiokande experiment in Japan and the Sudbury Neutrino Observatory (SNO) experiment in Canada. By the early years of the century, neutrino oscillations had been confirmed.

Fig. 1. Electron neutrinos (black) convert to muon (blue) and tau (red) neutrinos as a function of distance relative to their energy. The value of L/E for solar neutrinos and the Earth is much larger than plotted here, so the effects average out to a net deficit of electron neutrinos. From Ref.

By 2100, the mystery of the ultra-small neutrino masses will likely have been solved. If the answer falls within the Standard Model, then this may be the crowning achievement that “completes” the Standard Model. If the answer falls outside the Standard Model, then this may be the beginning of a new chapter in high-energy physics.

WMAP and Planck (2003)

The Big Bang may have occurred 13.7 billion years ago, but that Bang echoes to this day across the Universe. At its inception, the reverberations were incredibly hot, but they have now cooled to a mere 3 kelvin. In 1987, Paul Richards and Andrew Lange at the University of California at Berkeley recorded the peak of the Planck black-body spectrum during a sounding rocket flight that carried a far-infrared spectrometer to the edge of space. (The dichroic bandpass filters in their spectrometer were the first far-infrared metamaterials. I designed and built them as a young grad student at Berkeley! [2]) This experiment was followed by the COBE satellite, which measured the presence of minuscule fluctuations in the temperature, representing the original heterogeneity of the universe just after the Big Bang.
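A quick check with Wien’s displacement law shows why a far-infrared instrument was the right tool for finding the peak of that 3-kelvin black-body spectrum (the 2.725 K value below is the modern measured CMB temperature):

```python
# Wien's displacement law: the wavelength at which a black body at temperature T peaks.
b = 2.898e-3        # Wien's displacement constant, meter-kelvin
T_cmb = 2.725       # present-day temperature of the cosmic microwave background, kelvin

lambda_peak = b / T_cmb
print(f"Peak wavelength ~ {lambda_peak * 1e3:.2f} mm")   # ~1.06 mm: far-infrared / millimeter-wave
```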

COBE flew for a year, followed in 1998 by the BOOMERanG experiment, led by Andrew Lange, which was suspended from a high-altitude balloon circling the South Pole for ten days. This experiment discovered the literal echoes of the Big Bang: acoustic oscillations, in other words, the “sound” of the Bang. It also established that the universe is gravitationally “flat”, which is a direct consequence of cosmic inflation. Once again, these findings were followed by a satellite experiment, the WMAP mission in 2003, which mapped these oscillations over the entire sky. Even finer resolution was obtained by the Planck mission in 2013, measuring higher harmonics of the sound oscillations. These oscillations in the early universe helped seed regions of slightly higher density that condensed into galaxies, leading to the large-scale structure of the universe that we see today.

Fig. 2. Successively higher resolution views of the echoes of the Big Bang from COBE (1992) to WMAP (2003) to Planck (2013). From Ref.

The 21st Century will likely be known as the time when the physics of the early universe was finally pinned down, and maybe even of what came before. The answers may tell us if there are parallel universes in a much larger multiverse.

Exoplanets (2009)

The Earth is not alone in the Universe. It is not even alone in our little neighborhood of the Milky Way. Within 50 light years it is estimated that there are about 1000 Earth-sized planets in the habitable zone of their respective stars. Why is 50 light years significant? It is because, within this century, the technology to explore those planets is likely to be developed. With the right designs, an unmanned probe could reach 50 light years from Earth within a century, and the time to call back home is only 50 years. So if a probe is launched in the year 2100, we could be receiving transmissions from the new planet by the year 2250.
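The timeline is simple arithmetic. A back-of-the-envelope sketch, assuming a purely illustrative cruise speed of half the speed of light, reproduces the dates above:

```python
# Back-of-the-envelope timeline for a probe sent to a planet 50 light-years away.
# The cruise speed of 0.5 c is purely an assumption for illustration.
distance_ly   = 50.0
probe_speed_c = 0.5        # fraction of the speed of light (assumed)
launch_year   = 2100

travel_time  = distance_ly / probe_speed_c   # years in transit
signal_time  = distance_ly                   # light needs 1 year per light-year
arrival_year = launch_year + travel_time
first_signal = arrival_year + signal_time

print(f"Probe arrives:            {arrival_year:.0f}")   # 2200
print(f"First data received back: {first_signal:.0f}")   # 2250
```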

This estimate of 1000 New Earths is the result of a quiet revolution in planetary science that has been unfolding over the past quarter century. The first exoplanet orbiting a Sun-like star was confirmed in 1995 by Michel Mayor and Didier Queloz. Today, as of the writing of this blog, there are 6,278 confirmed exoplanets. Most of these were discovered by the Kepler satellite that was launched in 2009.

Fig. 3. An artist’s rendition of several of the Earth-sized planets discovered by the Kepler satellite. From Ref.

By 2100, we will know where all the exoplanets are that are within 50 light years of Earth, and we will know which ones are potentially habitable. It may even happen that signs of life on one of these planets will have been discovered. If so, then it is hard to imagine humankind NOT launching probes to visit those planets. If the right propulsion technology is developed, then those probes could be signaling back information from those planets as early as the year 2250…if anyone is still here to listen.

The Higgs (2012)

The crowning achievement of high-energy physics may also have been the last nail in the coffin. Throughout the second half of the 20th century, high-energy physics took the lion’s share of the money and attention showered on physics. Beginning in the aftermath of the Manhattan Project, the search for the fundamental constituents of our universe at first found more and more particles, creating a “zoo” that resisted easy classification, until quarks were proposed that simplified the whole thing down into what is now called the Standard Model of particle physics.

But one piece of the puzzle was still missing–the explanation of why particles have the masses they do. This missing piece was supplied by the theoretical physicist Peter Higgs in 1964, who proposed that point-like massless particles interacted with a “field” that permeated space. The interaction energy was equivalent to mass through Einstein’s famous E = mc², and the quantization of the field predicted the existence of a “Higgs boson”. The search for the “Higgs”, as it is called for short, became the Holy Grail of physics at the end of the last century.

Fig. 4. Feynman diagrams that involve the generation of Higgs particles.

The discovery of the Higgs boson was announced on the 4th of July in 2012 [3]. It capped 80 years of progress in high-energy particle physics since the discovery of the positron in 1932. But it may also have been the last. In the dozen-plus years since 2012, there have been no new “major” discoveries at the Large Hadron Collider (LHC). Most high-energy talks since then have been about speculative experiments seeking deviations from the Standard Model, but so far there is nothing new.

In the year 2100, looking back, the era of high-energy physics may be relegated to the 20th century, with the Higgs just a finishing touch that tipped over into the new millennium … Or sometime in the next 75 years there will be a discovery that goes beyond standard physics and opens a new chapter in the field. We will have to wait to see.

Gravitational Waves (2015)

Where were you on Feb. 11, 2016 at 10:30 am? Can you remember? I can! I was in a conference room in the Physics Building on the Purdue University campus, waiting with a small crowd of physicists for a news conference to begin. Everyone knew it would be something big. It was. They announced the first detection of a gravitational wave by the LIGO detector (the Laser Interferometer Gravitational-Wave Observatory). In a way, it was anti-climactic because we all knew that LIGO would eventually see one. But it was also immensely dramatic, because it was the most sensitive measurement ever made by mankind. The displacement of the mirrors in the interferometer caused by the passing gravitational wave was a tiny fraction of the radius of a proton, yet the signal was as clear as a bell. It came from the merger of two roughly 30-solar-mass black holes in a galaxy far, far away.
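The claim about the mirror displacement is easy to check with the publicly quoted numbers for that first event: a peak strain of about 10⁻²¹ and a 4 km arm length.

```python
# Order-of-magnitude check of the LIGO mirror displacement for the first event.
h             = 1.0e-21    # peak dimensionless strain (publicly quoted value)
arm_length    = 4.0e3      # LIGO arm length in meters
proton_radius = 0.84e-15   # proton charge radius in meters

delta_L = h * arm_length
print(f"Mirror displacement: {delta_L:.1e} m")                        # ~4e-18 m
print(f"Fraction of a proton radius: {delta_L / proton_radius:.1e}")  # ~5e-3
```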

Fig. 5. The two LIGO recordings (at Hanford and at Livingston) of the first detected gravitational wave. From Ref.

By the year 2100, looking back, multi-messenger astronomy will have been a key part of the physics of the 21st century. Multi-messenger astronomy is the detection of a single astronomical event across many channels, possibly including visible light, infrared, ultraviolet, x-rays, neutrinos, and gravitational waves. The field is just beginning and has a long way to go to integrate all these different ways of seeing into a complete picture of what happens out in the universe.

Topological physics (2016)

Of all the topics of this blog, this one is perhaps the most abstract. When we think of geometry, it is natural to think in terms of the symmetries that objects have. The last century was the pinnacle of geometric physics, where Einstein showed that gravity is a geometric property of warped space, where group theory classified all the ways that objects can be constructed and behave, and symmetry breaking was invoked to explain the hierarchy of physical forces.

The new century will be the time of topological physics, where the symmetries of matter may not even matter, but the way that properties of matter are connected does. By “property of matter” I mean, for example, the electronic states of a solid-state material, where the states are excluded from portions of state space, creating topology in abstract spaces. Such topological properties govern how freely currents can flow on surfaces but not in the bulk, or vice versa. In quantum systems, topological properties can protect quantum information from decoherence, which is the bane of most real-world implementations of quantum computers. For instance, by “braiding anyons” it is possible to create qubits that resist dephasing.
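To make “topology in abstract spaces” a little more concrete, here is a minimal sketch using the Su-Schrieffer-Heeger (SSH) chain, a textbook two-band model that is not discussed in this post but illustrates the idea: an integer winding number of the Bloch Hamiltonian distinguishes a trivial insulator from a topological one with protected edge states.

```python
import numpy as np

def ssh_winding_number(t1, t2, n_k=2001):
    """Winding number of the Su-Schrieffer-Heeger (SSH) chain:
    0 in the trivial phase (t2 < t1), 1 in the topological phase (t2 > t1)."""
    k = np.linspace(-np.pi, np.pi, n_k)
    hx = t1 + t2 * np.cos(k)     # components of the two-band Bloch vector
    hy = t2 * np.sin(k)
    phase = np.unwrap(np.arctan2(hy, hx))
    return round((phase[-1] - phase[0]) / (2 * np.pi))

print(ssh_winding_number(t1=1.0, t2=0.5))   # 0 -> trivial insulator
print(ssh_winding_number(t1=0.5, t2=1.0))   # 1 -> topological phase with protected edge states
```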

Fig. 6. Evidence for the braiding of anyons in the solid state, from experiments at Purdue. From Ref.

The importance of topology in physics was recognized with the 2016 Nobel Prize to David J. Thouless, F. Duncan M. Haldane, and J. Michael Kosterlitz for “Theoretical discoveries of topological phase transitions and topological phases of matter.”

Images of Black Holes (2019)

Why hasn’t this gotten a Nobel Prize yet? The imaging of black holes is a tour de force, requiring a telescope with the diameter of a planet and the collaboration of scientists from across that planet to make it all work.

The physics is straightforward. Everyone knows that bigger telescopes have better resolution, so the logical limit is a telescope the size of the Earth. This is accomplished by using interferometric detection, with data from widely spread millimeter-wave telescopes synchronized by atomic clocks in a network of telescopes known as the Event Horizon Telescope (EHT). The results are constructed numerically, as shown below.
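A one-line estimate with the Rayleigh criterion shows why an Earth-sized millimeter-wave array is just barely enough: at the EHT’s 1.3 mm observing wavelength, a baseline of roughly one Earth diameter gives a resolution of a few tens of microarcseconds, comparable to the apparent size of the M87 black hole shadow.

```python
import numpy as np

# Diffraction-limited resolution of an Earth-sized telescope at the EHT wavelength.
wavelength = 1.3e-3     # meters (EHT observes at 1.3 mm)
baseline   = 1.27e7     # meters, roughly the diameter of the Earth

theta_rad = 1.22 * wavelength / baseline          # Rayleigh criterion
theta_uas = np.degrees(theta_rad) * 3600e6        # radians -> microarcseconds
print(f"Angular resolution ~ {theta_uas:.0f} microarcseconds")
# ~26 microarcseconds, comparable to the ~40 microarcsecond shadow of M87*
```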

Fig. 7. The EHT images (left) compared to the model (middle) and the blurred model (right) of the black hole in the M87 galaxy. From Ref.

The next logical step for this kind of imaging is a telescope array that is bigger than the Earth … much bigger! This could be accomplished with an array of Lagrange-point satellites, improving the resolution of the images. By the end of this century, we may be imaging the black holes in all the near-by galaxies.

More to Come?

What are the greatest outstanding problems of physics that may yet yield to solutions within this century? It is impossible to say for certain without a crystal ball, but there are some that are likely to be resolved in the next 75 years:

Dark Matter: This is the 500-pound gorilla in the room. If most of the tangible universe is made of this stuff, then we had better get around to detecting it!

Dark Energy: This is the other 500-pound gorilla in the room. If most of the intangible universe is made of this stuff, then we had better get a good understanding of it.

Quantum Gravity: Of the four forces of physics (gravity, electromagnetic, weak nuclear and strong nuclear) gravity stands apart in several ways, one of which is that there is no quantum theory for it. We have 75 years to fix this if it is to be a crowning achievement of 21st-Century physics.

The Evolution of Life: I didn’t include any biophysics in my list of the best physics of the century mainly because I cannot point to a single revolutionary breakthrough of physics in this area. There has been a lot of good progress on the microphysics of biological systems, but nothing like discovering a Higgs boson. This could change if the origins of life turn out to be physics-based rather than just some chemistry.

The Evolution of Intelligence: I think physics has more to say on the evolution of intelligence than on the evolution of life. Intelligence is the quintessential complex system, and the methods of theoretical physics may yet provide a clear answer to the question of “What is Intelligence?”.

The Early Universe: This is just starting now with the James Webb Telescope peering into the dark depths of history–nearly to the Big Bang itself.

Multiple Dimensions: String theory likes to live in 11-dimensional space, so what other parts of our physical universe live there too? Dark Matter? Dark Energy? Do all the extra dimensions need to be compact?

The Arrow of Time: The physics of time is possibly the greatest unsolved problem in physics. Why does it only go one way?

Singularity Physics: What happens at the center of a Black Hole? Do wormholes provide hyperspace bypasses? These questions may yet get answers from theoretical physics, though likely not from the laboratory unless it is from an AMO (atomic, molecular, and optical) analog.

References

[1] E. Knill, R. Laflamme and G. J. Milburn, A Scheme for Efficient Quantum Computation with Linear Optics, Nature 409 (6816), 46–52 (2001).

[2] D. Nolte, A. Lange and P. Richards, Far-infrared dichroic bandpass filters, Applied Optics 24 (10), 1541–1545 (1985).

[3] CERN, CERN experiments observe particle consistent with long-sought Higgs boson, press release (July 4, 2012), https://home.cern/news/press-release/cern/cern-experiments-observe-particle-consistent-long-sought-higgs-boson; ATLAS Collaboration, Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC, Physics Letters B 716 (1), 1–29 (2012), https://doi.org/10.1016/j.physletb.2012.08.020; CMS Collaboration, Observation of a new boson at a mass of 125 GeV with the CMS experiment at the LHC, Physics Letters B 716 (1), 30–61 (2012), https://doi.org/10.1016/j.physletb.2012.08.021

Book Preview: Interference and the Story of Optical Interferometry

Interference: The History of Optical Interferometry and the Scientists Who Tamed Light is available from Oxford University Press and at Amazon and Barnes & Noble.

The synopses of the first chapters can be found in my previous blog. Here are previews of the final chapters.

Chapter 6. Across the Universe: Exoplanets, Black Holes and Gravitational Waves

Stellar interferometry is opening new vistas of astronomy, exploring the wildest occupants of our universe, from colliding black holes half-way across the universe (LIGO) to images of neighboring black holes (EHT) to exoplanets near Earth that may harbor life.

Image of the supermassive black hole in M87 from Event Horizon Telescope.

Across the Universe: Gravitational Waves, Black Holes and the Search for Exoplanets describes the latest discoveries of interferometry in astronomy, including the use of nulling interferometry in the Very Large Telescope Interferometer (VLTI) to detect exoplanets orbiting distant stars.  The much larger Event Horizon Telescope (EHT) used long-baseline interferometry and the closure-phase technique advanced by Roger Jennison to make the first image of a black hole.  The Laser Interferometer Gravitational-Wave Observatory (LIGO) represented a several-decade-long drive to detect the gravitational waves first predicted by Albert Einstein a hundred years ago.

Chapter 7. Two Faces of Microscopy: Diffraction and Interference

From the astronomically large dimensions of outer space to the microscopically small dimensions of inner space, optical interference pushes the resolution limits of imaging.

Ernst Abbe.

Two Faces of Microscopy: Diffraction and Interference describes the development of microscopic principles starting with Joseph Fraunhofer and the principle of diffraction gratings that was later perfected by Henry Rowland for high-resolution spectroscopy.  The company of Carl Zeiss advanced microscope technology after enlisting the help of Ernst Abbe who formed a new theory of image formation based on light interference.  These ideas were extended by Fritz Zernike in the development of phase-contrast microscopy.  The ultimate resolution of microscopes, defined by Abbe and known as the Abbe resolution limit, turned out not to be a fundamental limit, but was surpassed by super-resolution microscopy using concepts of interference microscopy and structured illumination.

Chapter 8. Holographic Dreams of Princess Leia: Crossing Beams

The coherence of laser light is like a brilliant jewel that sparkles in the darkness, illuminating life, probing science and projecting holograms in virtual worlds.

Ted Maiman

Holographic Dreams of Princess Leia: Crossing Beams presents the history of holography, beginning with the original ideas of Dennis Gabor, who invented optical holography as a means to improve the resolution of electron microscopes.  Holography became mainstream after the demonstrations by Emmett Leith and Juris Upatnieks using lasers, which were first demonstrated by Ted Maiman at Hughes Research Lab after suggestions by Charles Townes on the operating principles of the optical maser.  Dynamic holography takes place in crystals that exhibit the photorefractive effect, which are useful for adaptive interferometry.  Holographic display technology is under development, using ideas of holography merged with light-field displays that were first developed by Gabriel Lippmann.

Chapter 9. Photon Interference: The Foundations of Quantum Communication and Computing

What is the image of one photon interfering? Better yet, what is the image of two photons interfering? The answer to this crucial question laid the foundation for quantum communication.

Leonard Mandel.

Photon Interference: The Foundations of Quantum Communication moves the story of interferometry into the quantum realm, beginning with the Einstein-Podolsky-Rosen paradox and the principle of quantum entanglement that was refined by David Bohm, who tried to banish uncertainty from quantum theory.  John Bell and John Clauser pushed the limits of what can be known from quantum measurement as Clauser tested Bell’s inequalities, confirming the fundamental nonlocal character of quantum systems.  Leonard Mandel pushed quantum interference into the single-photon regime, discovering two-photon interference fringes that illustrated deep concepts of quantum coherence.  Quantum communication began with quantum cryptography and developed into quantum teleportation that can provide the data bus of future quantum computers.

Chapter 10. The Quantum Advantage: Interferometric Computing

There is almost no technical advantage better than having exponential resources at hand. The exponential resources of quantum interference provide that advantage to quantum computing which is poised to usher in a new era of quantum information science and technology.

David Deutsch.

The Quantum Advantage: Interferometric Computing describes the development of quantum algorithms and quantum computing, beginning with the first quantum algorithm, invented by David Deutsch as a side effect of his attempt to prove the many-worlds interpretation of quantum theory.  Peter Shor found a quantum algorithm that could factor the products of primes, threatening all secure communications in the world.  Once the usefulness of quantum algorithms was recognized, quantum computing hardware ideas developed rapidly into quantum circuits supported by quantum logic gates.  The limitation of optical interactions, which hampered the development of controlled quantum gates, led to the proposal of linear optical quantum computing and boson sampling in a complex cascade of single-photon interferometers that has been used to demonstrate quantum supremacy, also known as quantum computational advantage, using photonic integrated circuits.


From Oxford Press: Interference

Stories about the trials and toils of the scientists and engineers who tamed light and used it to probe the universe.

Frontiers of Physics: The Year in Review (2022)

Physics forged ahead in 2022, making a wide range of advances. From a telescope far out in space to a telescope that spans the size of the Earth, from solid state physics and quantum computing at ultra-low temperatures to particle and nuclear physics at ultra-high energies, the year saw a number of firsts. Here’s a list of eight discoveries of 2022 that define the frontiers of physics.

James Webb Space Telescope

“First Light” has two meanings: the “First Light” that originated at the beginning of the universe, and the “First Light” that is collected by a new telescope. At the beginning of this year, the James Webb Space Telescope (JWST) saw both types of first light, and with it came first surprises.

NASA image of the Carina Nebula, a nursery for stars.

The JWST has found that galaxies are too well formed too early in the universe relative to current models of galaxy formation. Almost as soon as the JWST began forming images, it acquired evidence of massive galaxies dating from when the universe was only a few hundred million years old. Existing theories of galaxy formation did not predict such large galaxies so soon after the Big Bang.

Another surprise came from images of the Southern Ring Nebula. While the Hubble did not find anything unusual about this planetary nebula, the JWST found cold dust surrounding the white dwarf that remained after the dying star shed its outer layers. This dust was not supposed to be there, but it may be coming from a third member of the intra-nebular environment. In addition, the ring-shaped nebula contained masses of swirling streams and ripples that are challenging astrophysicists who study stellar death and nebula formation to refine their current models.

Quantum Machine Learning

Machine learning—the training of computers to identify and manipulate complicated patterns within massive data—has been on a roll in recent years, ever since efficient training algorithms were developed in the early 2000’s for large multilayer neural networks. Classical machine learning can take billions of bits of data and condense it down to understandable information in a matter of minutes. However, there are types of problems that even conventional machine learning might take the age of the universe to calculate, for instance calculating the properties of quantum systems based on a set of quantum measurements of the system.

In June of 2022, researchers at Caltech and Google announced that a quantum computer—Google’s Sycamore quantum computer—could calculate properties of quantum systems using exponentially fewer measurements than would be required to perform the same task using conventional computers. Quantum machine learning uses the resource of quantum entanglement that is not available to conventional machine learning, enabling new types of algorithms that can exponentially speed up calculations of quantum systems. It may come as no surprise that quantum computers are ideally suited to making calculations of quantum systems.

External view of Google’s Sycamore quantum computer. From Science News.

A Possible Heavy W Boson

High-energy particle physics has been in a crisis ever since 2012, when it reached the pinnacle of a dogged half-century search for the fundamental constituents of the universe. The Higgs boson was the crowning achievement, and was supposed to be the vanguard of a new frontier of physics uncovered by CERN. But little new physics has emerged, even though fundamental physics is in dire need of new results. For instance, dark matter and dark energy remain unsolved mysteries despite making up the vast majority of all there is. Therefore, when physicists at Fermilab announced that the W boson, a particle that carries the weak nuclear interaction, was heavier than predicted by the Standard Model, some physicists heaved sighs of relief. The excess mass could signal higher-energy contributions that might lead to new particles or interactions … if the excess weight holds up under continued scrutiny.

Science magazine. April 8, 2022

Imaging the Black Hole at the Center of the Milky Way

Imagine building a telescope the size of the Earth. What could it see?

If it detected in the optical regime, it could see a baseball on the surface of the Moon. If it detected at microwave frequencies, then it could see the material swirling around distant black holes. This is what the Event Horizon Telescope (EHT) can do. In 2019, it revealed the first image of a black hole: the super-massive black hole at the core of the M87 galaxy, 53 million light years away. They accomplished this Herculean feat by combining the signals of microwave telescopes from across the globe interferometrically to create an effective telescope aperture the size of the Earth.

The next obvious candidate was the black hole at the center of our own galaxy, the Milky Way. Even though our own black hole is much smaller than the one in M87, ours is much closer, and both subtend about the same solid angle. The challenge was observing it through the swirling stars and dust at the core of our galaxy. In May of 2022, the EHT unveiled the first image of our own black hole, showing the radiation emitted by the in-falling material.
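The statement that the two black holes subtend about the same angle follows from a rough estimate of the “shadow” size, about 2√27 GM/c² across, divided by the distance; the masses and distances below are illustrative values close to the published ones.

```python
import numpy as np

G, c  = 6.674e-11, 2.998e8     # SI units
M_sun = 1.989e30               # kg
pc    = 3.086e16               # meters per parsec
RAD_TO_UAS = np.degrees(1.0) * 3600e6   # radians -> microarcseconds

def shadow_angular_diameter(mass_msun, distance_pc):
    """Approximate angular diameter of a black hole 'shadow',
    about 2*sqrt(27)*GM/c^2 across for a non-spinning hole."""
    diameter = 2 * np.sqrt(27) * G * mass_msun * M_sun / c**2   # meters
    return diameter / (distance_pc * pc) * RAD_TO_UAS

# Illustrative masses and distances close to the published values
print(f"M87*:   {shadow_angular_diameter(6.5e9, 16.8e6):.0f} microarcseconds")
print(f"Sgr A*: {shadow_angular_diameter(4.1e6, 8.2e3):.0f} microarcseconds")
# Both come out near 40-50 microarcseconds, which is why the EHT can image both.
```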

BBC image of the black hole at the core of our Milky Way galaxy.

Tetraneutrons

Nuclear physics is a venerable part of modern physics that harkens back to the days of Bohr and Rutherford and the beginning of quantum physics, but in recent years it has yielded few new surprises (except at the RHIC collider which smashes heavy nuclei against each other to create quark-gluon plasma). That changed in June of 2022, when researchers in Germany announced the successful measurement of a tetraneutron–a cluster of four neutrons bound transiently together by the strong nuclear force.

Neutrons are the super-glue that holds together the nucleons in standard nuclei. The force is immense, strong enough to counteract the Coulomb repulsion of protons in a nucleus. For instance, Uranium-238 has 92 protons crammed within a volume of about 10 femtometer radius. It takes 146 neutrons to bind these together without flying apart. But neutrons don’t tend to bind to themselves, except in “resonance” states that decay rapidly. In 2012, a dineutron (two neutrons bound in a transient resonance state) was observed, but four neutrons were expected to produce an even more transient resonance (a three-neutron state is not allowed). When the German group created the tetraneutron, it had a lifetime of only about 1×10⁻²¹ seconds, so it is extremely ephemeral. Nonetheless, studying the properties of the tetraneutron may give insights into both the strong and weak nuclear forces.

Hi-Tc superconductivity

When Bednorz and Müller discovered Hi-Tc superconductivity in 1986, it set off both a boom and a crisis. The boom was the opportunity to raise the critical temperature of superconductivity above 23 K, the world record held by Nb3Ge since it was set in 1973. The crisis was that the new Hi-Tc materials violated the established theory of superconductivity explained by Bardeen-Cooper-Schrieffer (BCS). There was almost nothing in the theory of solid state physics that could explain how such high critical temperatures could be attained. At the March Meeting of the APS the following year, in 1987, the session on the new Hi-Tc materials and possible new theories became known as the Woodstock of Physics, where physicists camped out in the hallway, straining their ears to hear the latest ideas on the subject.

One of the ideas put forward at the session was superexchange, proposed by Phil Anderson. The superexchange of two electrons is related to their ability to hop from one lattice site to another. If the hops are coordinated, then there can be an overall reduction in their energy, creating a ground state of long-range coordinated electron hopping that could support superconductivity. Anderson was perhaps the physicist best situated to suggest this theory because of his close familiarity with what was, even then, known as the Anderson Hamiltonian, which explicitly describes the role of hopping in solid-state many-body phenomena.
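The energy scale of superexchange comes from second-order perturbation theory in a half-filled Hubbard model: virtual hops of amplitude t onto an already-occupied site with repulsion U lower the energy of anti-aligned neighbors by J = 4t²/U. The parameter values in this sketch are only illustrative, roughly cuprate-like, and are not taken from the Oxford experiment.

```python
# Textbook superexchange scale for a half-filled Hubbard model: J = 4 t^2 / U.
# The numbers below are illustrative, roughly cuprate-like, and are assumptions.
t = 0.40    # nearest-neighbor hopping amplitude, eV (assumed)
U = 3.5     # on-site Coulomb repulsion, eV (assumed)

J = 4 * t**2 / U
print(f"Superexchange J ~ {J * 1000:.0f} meV")   # a couple of hundred meV: strongly antiferromagnetic
```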

Ever since, the idea of superexchange has been floating around the field of Hi-Tc superconductivity, but no one had been able to pin it down conclusively, until now. In a paper published in PNAS in September of 2022, an experimental group at Oxford presented direct observations of the spatial density of Cooper pairs in relation to the spatial hopping rates—where hopping was easiest, the Cooper-pair density was highest, and vice versa. This experiment provides almost indisputable evidence in favor of Anderson’s superexchange mechanism for Cooper pair formation in the Hi-Tc materials, laying to rest the crisis launched 36 years ago.

Holographic Wormhole

The holographic principle of cosmology proposes that our three-dimensional physical reality—stars, galaxies, expanding universe—is like the projection of information encoded on a two-dimensional boundary—just as a two-dimensional optical hologram can be illuminated to recreate a three-dimensional visual representation. This 2D-to-3D projection was first proposed by Gerard ’t Hooft, inspired by the black hole information paradox, in which the entropy of a black hole scales as the surface area of the black hole instead of its volume. The holographic principle was expanded by Leonard Susskind in 1995 based on string theory and is one path to reconciling quantum physics with the physics of gravitation in a theory of quantum gravity—one of the Holy Grails of physics.

While it is an elegant cosmic idea, the holographic principle could not be viewed as anything down to Earth, until now. In November 2022 a research group at Caltech published a paper in Nature describing how they used Google’s Sycamore quantum computer (housed at UC Santa Barbara) to manipulate a set of qubits into creating a laboratory-based analog of an Einstein-Rosen bridge, also known as a “wormhole”, through spacetime. The ability to use quantum information states to simulate a highly-warped spacetime analog provides the first experimental evidence for the validity of the cosmological holographic principle. Although the simulation did not produce a physical wormhole in our spacetime, it showed how quantum information and differential geometry (the mathematics of general relativity) can be connected.

One of the most important consequences of this work is the proposal that ER = EPR (Einstein-Rosen = Einstein-Podolsky-Rosen). The EPR paradox of quantum entanglement has long been viewed as a fundamental paradox of physics that requires instantaneous non-local correlations among quantum particles that can be arbitrarily far apart. Although EPR violates local realism, it is a valuable real-world resource for quantum teleportation. By demonstrating the holographic wormhole, the recent Caltech results show how quantum teleportation and gravitational wormholes may arise from the same physics.

Net-Positive-Energy from Nuclear Fusion

Ever since nuclear fission was harnessed to generate energy, the idea of tapping the even greater potential of nuclear fusion to power the world has been a dream of nuclear physicists. Nuclear fusion energy would be clean and green and could help us avoid the long-run disaster of global warming. However, achieving that dream has been surprisingly frustrating. While nuclear fission was harnessed for energy (and weapons) within only a few years of discovery, and a fusion “boost” was added to nuclear destructive power in the so-called hydrogen bomb, sustained energy production from fusion has remained elusive.

In December of 2022, the National Ignition Facility (NIF) focused the power of 192 pulsed lasers onto a deuterium-tritium pellet, causing it to implode and the nuclei to fuse, releasing about 50% more energy than it absorbed. This was the first time that controlled fusion released net positive energy—about 3 million Joules out from 2 million Joules in—enough energy to boil about 3 liters of water. This accomplishment represents a major milestone in the history of physics and could one day provide useful energy. The annual budget of the NIF is about 300 million dollars, so there is a long road ahead (probably several more decades) before this energy source can be scaled down to an economical level.

NIF image: Laser fusion experiment yields record energy at LLNL’s National Ignition Facility.

By David D. Nolte Jan. 16, 2023