
Friday, 11 March 2022

The Crisis in Western Physics

This article appeared in Quanta Magazine this past March 4. I am republishing it for the information of the more than ten thousand scientists in our country. It contains no mathematical equations; any pedant can read it. The article is, however, rather long. It discusses the crisis at the very top of Western physics. Let the so-called social sciences, which limp along behind Western physics, carry on as they please.

 

I do not agree with every point made here. This so-called reality is a product of the search for truth. Kuhn's paradigm talk is of no use.

 

 

A Deepening Crisis Forces Physicists to Rethink Structure of Nature’s Laws

By NATALIE WOLCHOVER

For three decades, researchers hunted in vain for new elementary particles that would have explained why nature looks the way it does. As physicists confront that failure, they’re reexamining a longstanding assumption: that big stuff consists of smaller stuff.

In his 1962 book The Structure of Scientific Revolutions, the philosopher of science Thomas Kuhn observed that scientists spend long periods taking small steps. They pose and solve puzzles while collectively interpreting all data within a fixed worldview or theoretical framework, which Kuhn called a paradigm. Sooner or later, though, facts crop up that clash with the reigning paradigm. Crisis ensues. The scientists wring their hands, reexamine their assumptions and eventually make a revolutionary shift to a new paradigm, a radically different and truer understanding of nature. Then incremental progress resumes.

 

For several years, the particle physicists who study nature’s fundamental building blocks have been in a textbook Kuhnian crisis.

 

The crisis became undeniable in 2016, when, despite a major upgrade, the Large Hadron Collider in Geneva still hadn’t conjured up any of the new elementary particles that theorists had been expecting for decades. The swarm of additional particles would have solved a major puzzle about an already known one, the famed Higgs boson. The hierarchy problem, as the puzzle is called, asks why the Higgs boson is so lightweight — a hundred million billion times less massive than the highest energy scales that exist in nature. The Higgs mass seems unnaturally dialed down relative to these higher energies, as if huge numbers in the underlying equation that determines its value all miraculously cancel out.
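As a quick sanity check of that "hundred million billion" figure, here is a minimal Python back-of-the-envelope; the two input values are standard published numbers, not figures taken from this article:

```python
# Back-of-the-envelope check: how much lighter is the Higgs than the Planck scale?
higgs_mass_gev = 125.0        # measured Higgs boson mass, GeV
planck_energy_gev = 1.22e19   # Planck energy, GeV (the highest energy scale in nature)

print(f"{planck_energy_gev / higgs_mass_gev:.1e}")  # ~1e17, i.e. a hundred million billion
```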

 

The extra particles would have explained the tiny Higgs mass, restoring what physicists call “naturalness” to their equations. But after the LHC became the third and biggest collider to search in vain for them, it seemed that the very logic about what’s natural in nature might be wrong. “We are confronted with the need to reconsider the guiding principles that have been used for decades to address the most fundamental questions about the physical world,” Gian Giudice, head of the theory division at CERN, the lab that houses the LHC, wrote in 2017.

 

At first, the community despaired. “You could feel the pessimism,” said Isabel Garcia Garcia, a particle theorist at the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara, who was a graduate student at the time. Not only had the $10 billion proton smasher failed to answer a 40-year-old question, but the very beliefs and strategies that had long guided particle physics could no longer be trusted. People wondered more loudly than before whether the universe is simply unnatural, the product of fine-tuned mathematical cancellations. Perhaps there’s a multiverse of universes, all with randomly dialed Higgs masses and other parameters, and we find ourselves here only because our universe’s peculiar properties foster the formation of atoms, stars and planets and therefore life. This “anthropic argument,” though possibly right, is frustratingly untestable.

 

Many particle physicists migrated to other research areas, “where the puzzle hasn’t gotten as hard as the hierarchy problem,” said Nathaniel Craig, a theoretical physicist at UCSB.

Nathaniel Craig and Isabel Garcia Garcia have probed how gravity might help reconcile nature’s vastly different energy scales. (Photo: Jeff Liang)

 

Some of those who remained set to work scrutinizing decades-old assumptions. They started thinking anew about the striking features of nature that seem unnaturally fine-tuned — both the Higgs boson’s small mass, and a seemingly unrelated case, one that concerns the unnaturally low energy of space itself. “The really fundamental problems are problems of naturalness,” Garcia Garcia said.

 

Their introspection is bearing fruit. Researchers are increasingly zeroing in on what they see as a weakness in the conventional reasoning about naturalness. It rests on a seemingly benign assumption, one that has been baked into scientific outlooks since ancient Greece: Big stuff consists of smaller, more fundamental stuff — an idea known as reductionism. “The reductionist paradigm … is hard-wired into the naturalness problems,” said Nima Arkani-Hamed, a theorist at the Institute for Advanced Study in Princeton, New Jersey.

 

Now a growing number of particle physicists think naturalness problems and the null results at the Large Hadron Collider might be tied to reductionism’s breakdown. “Could it be that this changes the rules of the game?” Arkani-Hamed said. In a slew of recent papers, researchers have thrown reductionism to the wind. They’re exploring novel ways in which big and small distance scales might conspire, producing values of parameters that look unnaturally fine-tuned from a reductionist perspective.

 

“Some people call it a crisis. That has a pessimistic vibe associated to it and I don’t feel that way about it,” said Garcia Garcia. “It’s a time where I feel like we are on to something profound.”

 

What Naturalness Is

 

The Large Hadron Collider did make one critical discovery: In 2012, it finally struck upon the Higgs boson, the keystone of the 50-year-old set of equations known as the Standard Model of particle physics, which describes the 17 known elementary particles.

 

The discovery of the Higgs confirmed a riveting story that’s written in the Standard Model equations. Moments after the Big Bang, an entity that permeates space called the Higgs field suddenly became infused with energy. This Higgs field crackles with Higgs bosons, particles that possess mass because of the field’s energy. As electrons, quarks and other particles move through space, they interact with Higgs bosons, and in this way they acquire mass as well.

 

After the Standard Model was completed in 1975, its architects almost immediately noticed a problem.

When the Higgs gives other particles mass, they give it right back; the particle masses shake out together. Physicists can write an equation for the Higgs boson’s mass that includes terms from each particle it interacts with. All the massive Standard Model particles contribute terms to the equation, but these aren’t the only contributions. The Higgs should also mathematically mingle with heavier particles, up to and including phenomena at the Planck scale, an energy level associated with the quantum nature of gravity, black holes and the Big Bang. Planck-scale phenomena should contribute terms to the Higgs mass that are huge — roughly a hundred million billion times larger than the actual Higgs mass. Naively, you would expect the Higgs boson to be as heavy as they are, thereby beefing up other elementary particles as well. Particles would be too heavy to form atoms, and the universe would be empty.

 

For the Higgs to depend on enormous energies yet end up so light, you have to assume that some of the Planckian contributions to its mass are negative while others are positive, and that they’re all dialed to just the right amounts to exactly cancel out. Unless there’s some reason for this cancellation, it seems ludicrous — about as unlikely as air currents and table vibrations counteracting each other to keep a pencil balanced on its tip. This kind of fine-tuned cancellation physicists deem “unnatural.”
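Schematically, in textbook notation (a sketch of the standard bookkeeping, not an equation quoted in the article), the Higgs mass-squared is a sum of enormous, independently signed terms:

```latex
m_H^2 \;=\; m_{\text{bare}}^2 \;+\; \underbrace{c_1\,\Lambda^2 + c_2\,\Lambda^2 + \cdots}_{\text{Planck-scale corrections}},
\qquad \Lambda \sim M_{\text{Pl}} \approx 10^{19}\ \text{GeV},
```

where the coefficients are order-one numbers of either sign. Reproducing the observed 125 GeV requires the right-hand terms to cancel to roughly one part in 10^34 (the mass ratio of 10^17, squared, since the corrections enter as mass-squared).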

 

Within a few years, physicists found a tidy solution: supersymmetry, a hypothesized doubling of nature’s elementary particles. Supersymmetry says that every boson (one of two types of particle) has a partner fermion (the other type), and vice versa. Bosons and fermions contribute positive and negative terms to the Higgs mass, respectively. So if these terms always come in pairs, they’ll always cancel.
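In the standard one-loop sketch (textbook supersymmetry, with details beyond what the article states), a fermion loop and its partner scalar loops contribute with opposite signs:

```latex
\delta m_H^2 \;\simeq\; -\,\frac{|\lambda_f|^2}{8\pi^2}\,\Lambda^2
\;+\; 2\times\frac{\lambda_s}{16\pi^2}\,\Lambda^2 ,
```

where the first term is a fermion loop, the second comes from its two partner scalars, and unbroken supersymmetry fixes the couplings so that λ_s = |λ_f|², making the dangerous Λ² pieces cancel identically.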

 

The search for supersymmetric partner particles began at the Large Electron-Positron Collider in the 1990s. Researchers assumed the particles were just a tad heavier than their Standard Model partners, requiring more raw energy to materialize, so they accelerated particles to nearly light speed, smashed them together, and looked for heavy apparitions among the debris.

Meanwhile, another naturalness problem surfaced.

 


 

The fabric of space, even when devoid of matter, seems as if it should sizzle with energy — the net activity of all the quantum fields coursing through it. When particle physicists add up all the presumptive contributions to the energy of space, they find that, as with the Higgs mass, injections of energy coming from Planck-scale phenomena should blow it up. Albert Einstein showed that the energy of space, which he dubbed the cosmological constant, has a gravitationally repulsive effect; it causes space to expand faster and faster. If space were infused with a Planckian density of energy, the universe would have ripped itself apart moments after the Big Bang. But this hasn’t happened.

 

Instead, cosmologists observe that space’s expansion is accelerating only slowly, indicating that the cosmological constant is small. Measurements in 1998 pegged its value as a million million million million million times lower than the Planck energy. Again, it seems all those enormous energy injections and extractions in the equation for the cosmological constant perfectly cancel out, leaving space eerily placid.
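That "million million million million million" is 10^30. A minimal Python check using standard values; the dark-energy scale below is the fourth root of the measured vacuum energy density, an input and convention assumed here rather than given in the article:

```python
# Compare the measured dark-energy scale with the Planck energy.
dark_energy_scale_gev = 2.3e-12  # fourth root of observed vacuum energy density (~2.3 meV)
planck_energy_gev = 1.22e19

print(f"{planck_energy_gev / dark_energy_scale_gev:.1e}")  # ~5e30, the quoted ballpark
```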


Both of these big naturalness problems were evident by the late 1970s, but for decades, physicists treated them as unrelated. “This was in the phase where people were schizophrenic about this,” said Arkani-Hamed. The cosmological constant problem seemed potentially related to mysterious, quantum aspects of gravity, since the energy of space is detected solely through its gravitational effect. The hierarchy problem looked more like a “dirty-little-details problem,” Arkani-Hamed said — the kind of issue that, like two or three other problems of the past, would ultimately reveal a few missing puzzle pieces. “The sickness of the Higgs,” as Giudice called its unnatural lightness, was nothing a few supersymmetry particles at the LHC couldn’t cure.

 

In hindsight, the two naturalness problems seem more like symptoms of a deeper issue.

“It’s useful to think about how these problems come about,” said Garcia Garcia in a Zoom call from Santa Barbara this winter. “The hierarchy problem and the cosmological constant problem are problems that arise in part because of the tools we’re using to try to answer questions — the way we’re trying to understand certain features of our universe.”

 

Reductionism Made Precise

 

Physicists come by their funny way of tallying contributions to the Higgs mass and cosmological constant honestly. The calculation method reflects the strange nesting-doll structure of the natural world.

 

Zoom in on something, and you’ll discover that it’s actually a lot of smaller things. What looks from afar like a galaxy is really a collection of stars; each star is many atoms; an atom further dissolves into hierarchical layers of subatomic parts. Moreover, as you zoom in to shorter distance scales, you see heavier and more energetic elementary particles and phenomena — a profound link between high energies and short distances that explains why a high-energy particle collider acts like a microscope on the universe. The connection between high energies and short distances has many avatars throughout physics. For instance, quantum mechanics says every particle is also a wave; the more massive the particle, the shorter its associated wavelength. Another way to think about it is that energy has to cram together more densely to form smaller objects. Physicists refer to low-energy, long-distance physics as “the IR,” and high-energy, short-distance physics as “the UV,” drawing an analogy with infrared and ultraviolet wavelengths of light.
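The wave-particle statement can be made precise with two standard relations (ordinary quantum mechanics, not formulas from the article): a particle of mass m has a quantum (Compton) wavelength, and a length scale has a corresponding energy,

```latex
\lambda \;=\; \frac{h}{mc},
\qquad\text{equivalently}\qquad
E \;\sim\; \frac{\hbar c}{\ell},
```

so heavier, higher-energy quanta correspond to shorter lengths, which is exactly the UV/IR dictionary the paragraph describes.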

 

In the 1960s and ’70s, the particle physics titans Kenneth Wilson and Steven Weinberg put their finger on what’s so remarkable about nature’s hierarchical structure: It allows us to describe goings-on at some big, IR scale of interest without knowing what’s “really” happening at more microscopic, UV scales. You can, for instance, model water with a hydrodynamic equation that treats it as a smooth fluid, glossing over the complicated dynamics of its H2O molecules. The hydrodynamic equation includes a term representing water’s viscosity — a single number, which can be measured at IR scales, that summarizes all those molecular interactions happening in the UV. Physicists say IR and UV scales “decouple,” which lets them effectively describe aspects of the world without knowing what’s going on deep down at the Planck scale — the ultimate UV scale, corresponding to a billionth of a trillionth of a trillionth of a centimeter, or 10 billion billion gigaelectron-volts (GeV) of energy, where the very fabric of space-time probably dissolves into something else.
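Those two Planck-scale numbers follow directly from the fundamental constants; here is a minimal Python reproduction, assuming standard CODATA values rather than anything from the article:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
J_PER_GEV = 1.602176634e-10

l_planck = math.sqrt(hbar * G / c**3)   # Planck length, meters
E_planck = math.sqrt(hbar * c**5 / G)   # Planck energy, joules

print(f"Planck length: {l_planck * 100:.1e} cm")         # ~1.6e-33 cm
print(f"Planck energy: {E_planck / J_PER_GEV:.1e} GeV")  # ~1.2e19 GeV
```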

 

 

Kenneth Wilson, an American particle and condensed matter physicist who was active from the 1960s until the 2000s, developed a formal mathematical method for describing how properties of a system change depending on the scale at which they are measured. (Photo: Cornell University Faculty Biographical Files, #47-10-3394. Division of Rare and Manuscript Collections, Cornell University Library.)

 

“We can do physics because we can remain ignorant about what happens at short distances,” said Riccardo Rattazzi, a theoretical physicist at the Swiss Federal Institute of Technology Lausanne.

 

Wilson and Weinberg separately developed pieces of the framework that particle physicists use to model different levels of our nesting-doll world: effective field theory. It’s in the context of EFT that naturalness problems arise.

 

An EFT models a system — a bundle of protons and neutrons, say — over a certain range of scales. Zoom in on protons and neutrons for a while and they will keep looking like protons and neutrons; you can describe their dynamics over that range with “chiral effective field theory.” But then an EFT will reach its “UV cutoff,” a short-distance, high-energy scale at which the EFT stops being an effective description of the system. At a cutoff of 1 GeV, for example, chiral effective field theory stops working, because protons and neutrons stop behaving like single particles and instead act like trios of quarks. A different theory kicks in.

 


Importantly, an EFT breaks down at its UV cutoff for a reason. The cutoff is where new, higher-energy particles or phenomena that aren’t included in that theory must be found.

In its range of operation, an EFT accounts for UV physics below the cutoff by adding “corrections” representing these unknown effects. It’s just like how a fluid equation has a viscosity term to capture the net effect of short-distance molecular collisions. Physicists don’t need to know what actual physics lies at the cutoff to write these corrections; they just use the cutoff scale as a ballpark estimate of the size of the effects.
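In the usual notation (a textbook-style sketch, not an equation from the article), these corrections appear as a tower of terms suppressed by powers of the cutoff:

```latex
\mathcal{L}_{\text{EFT}}
\;=\; \mathcal{L}_{\text{light fields}}
\;+\; \sum_i \frac{c_i}{\Lambda^{\,d_i-4}}\,\mathcal{O}_i ,
```

where each operator has mass dimension d_i > 4, the coefficients are order-one numbers, and Λ is the cutoff; the higher the cutoff, the smaller the correction, which is the viscosity-style bookkeeping just described.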

 

Typically when you’re calculating something at an IR scale of interest, the UV corrections are small, proportional to the (relatively smaller) length scale associated with the cutoff. The situation changes, though, when you’re using EFT to calculate a parameter like the Higgs mass or the cosmological constant — something that has units of mass or energy. Then the UV corrections to the parameter are big, because (to have the right units) the corrections are proportional to the energy — rather than the length — associated with the cutoff. And while the length is small, the energy is high. Such parameters are said to be “UV-sensitive.”

 

The concept of naturalness emerged in the 1970s along with effective field theory itself, as a strategy for identifying where an EFT must cut off, and where, therefore, new physics must lie. The logic goes like this: If a mass or energy parameter has a high cutoff, its value should naturally be large, pushed higher by all the UV corrections. Therefore, if the parameter is small, the cutoff energy must be low.
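That logic can be captured in one line (a standard schematic, with the loop factor an assumption beyond the article's wording): a mass parameter m with cutoff Λ receives corrections

```latex
\delta m^2 \;\sim\; \frac{g^2}{16\pi^2}\,\Lambda^2
\quad\Longrightarrow\quad
\Lambda \;\lesssim\; \frac{4\pi}{g}\,m \ \ \text{for naturalness},
```

so measuring a small m points to new physics at a correspondingly low cutoff.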

 

Some commentators have dismissed naturalness as a mere aesthetic preference. But others point to occasions when the strategy revealed precise, hidden truths about nature. “The logic works,” said Craig, a leader of recent efforts to rethink that logic. Naturalness problems “have always been a signpost of where the picture changes and new things should appear.”

 

What Naturalness Can Do

 

In 1974, a few years before the term “naturalness” was even coined, Mary K. Gaillard and Ben Lee made spectacular use of the strategy to predict the mass of a then-hypothetical particle called the charm quark. “The success of her prediction and its relevance to the hierarchy problem are wildly underappreciated in our field,” Craig said.

 

That summer of ’74, Gaillard and Lee were puzzling over the difference between the masses of two kaon particles — composites of quarks. The measured difference was small. But when they tried to calculate this mass difference with an EFT equation, they saw that its value was at risk of blowing up. Because the kaon mass difference has units of mass, it’s UV-sensitive, receiving high-energy corrections coming from the unknown physics at the cutoff. The theory’s cutoff wasn’t known, but physicists at the time reasoned that it couldn’t be very high, or else the resulting kaon mass difference would seem curiously small relative to the corrections — unnatural, as physicists now say. Gaillard and Lee inferred their EFT’s low cutoff scale, the place where new physics should reveal itself. They argued that a recently proposed quark called the charm quark must be found with a mass of no more than 1.5 GeV.
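A hedged, order-of-magnitude reconstruction of that argument in Python, using the textbook GIM-mechanism scaling for the kaon mass difference; the formula, conventions, and all input values below are standard reference numbers rather than figures from this article, and order-one factors are ignored:

```python
import math

# Schematic GIM scaling:
#   delta_mK ~ (G_F^2 / (6 pi^2)) * f_K^2 * m_K * m_c^2 * sin^2(th_c) * cos^2(th_c)
# Invert for the charm mass m_c given the measured mass difference.
G_F = 1.166e-5        # Fermi constant, GeV^-2
f_K = 0.16            # kaon decay constant, GeV (convention-dependent)
m_K = 0.498           # neutral kaon mass, GeV
delta_mK = 3.5e-15    # measured K_L - K_S mass difference, GeV
sin_c, cos_c = 0.22, 0.975  # Cabibbo angle

prefactor = (G_F**2 * f_K**2 * m_K * sin_c**2 * cos_c**2) / (6 * math.pi**2)
m_c = math.sqrt(delta_mK / prefactor)
print(f"charm mass scale: ~{m_c:.1f} GeV")  # ~1.6 GeV, the Gaillard-Lee ballpark
```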

 

The charm quark showed up three months later, weighing 1.2 GeV. The discovery ushered in a renaissance of understanding known as the November revolution that quickly led to the completion of the Standard Model. In a recent video call, Gaillard, now 82, recalled that she was in Europe visiting CERN when the news broke. Lee sent her a telegram: CHARM HAS BEEN FOUND.

 

 

In 1974, Mary K. Gaillard (pictured here in the 1990s) and Ben Lee used a naturalness argument to predict the mass of a hypothetical elementary particle called the charm quark. The charm was discovered months later. (Photo: AIP Emilio Segrè Visual Archives)

 

Such triumphs led many physicists to feel certain that the hierarchy problem, too, should herald new particles not much heavier than those of the Standard Model. If the Standard Model’s cutoff were up near the Planck scale (where researchers know for sure that the Standard Model fails, since it doesn’t account for quantum gravity), then the UV corrections to the Higgs mass would be huge — making its lightness unnatural. A cutoff not far above the mass of the Higgs boson itself would make the Higgs about as heavy as the corrections coming from the cutoff, and everything would look natural. “That option has been the starting point of the work that has been done in trying to address the hierarchy problem in the last 40 years,” said Garcia Garcia. “People came up with great ideas, like supersymmetry, compositeness [of the Higgs], that we haven’t seen realized in nature.”

 

Garcia Garcia was a few years into her particle physics doctorate at the University of Oxford in 2016 when it became clear to her that a reckoning was in order. “That’s when I became more interested in this missing component that we don’t normally incorporate when we discuss these problems, which is gravity — this realization that there’s more to quantum gravity than we can tell from effective field theory.”

 

Gravity Mixes Everything Up

 

Theorists learned in the 1980s that gravity doesn’t play by the usual reductionist rules. If you bash two particles together hard enough, their energies become so concentrated at the collision point that they’ll form a black hole — a region of such extreme gravity that nothing can escape. Bash particles together even harder, and they’ll form a bigger black hole. More energy no longer lets you see shorter distances — quite the opposite. The harder you bash, the bigger the resulting invisible region is. Black holes and the quantum gravity theory that describes their interiors completely reverse the usual relationship between high energies and short distances. “Gravity is anti-reductionist,” said Sergei Dubovsky, a physicist at New York University.

 

Quantum gravity seems to toy with nature’s architecture, making a mockery of the neat system of nested scales that EFT-wielding physicists have grown accustomed to. Craig, like Garcia Garcia, began to think about the implications of gravity soon after the LHC’s search came up empty. In trying to brainstorm new solutions to the hierarchy problem, Craig reread a 2008 essay about naturalness by Giudice, the CERN theorist. He started wondering what Giudice meant when he wrote that the solution to the cosmological constant problem might involve “some complicated interplay between infrared and ultraviolet effects.” If the IR and the UV have complicated interplay, that would defy the usual decoupling that allows effective field theory to work. “I just Googled things like ‘UV-IR mixing,’” Craig said, which led him to some intriguing papers from 1999, “and off I went.”

 


UV-IR mixing potentially resolves naturalness problems by breaking EFT’s reductionist scheme. In EFT, naturalness problems arise when quantities like the Higgs mass and the cosmological constant are UV-sensitive, yet somehow don’t blow up, as if there’s a conspiracy between all the UV physics that nullifies their effect on the IR. “In the logic of effective field theory, we discard that possibility,” Craig explained. Reductionism tells us that IR physics emerges from UV physics — that water’s viscosity comes from its molecular dynamics, protons get their properties from their inner quarks, and explanations reveal themselves as you zoom in — never the reverse. The UV isn’t influenced or explained by the IR, “so [UV effects] can’t have a conspiracy to make things work out for the Higgs at a very different scale.”

 

The question Craig now asks is: “Could that logic of effective field theory break down?” Perhaps explanations really can flow both ways between the UV and the IR. “That’s not totally pie in the sky, because we know that gravity does that,” he said. “Gravity violates the normal EFT reasoning because it mixes physics at all length scales — short distances, long distances. Because it does that, it gives you this way out.”

 

How UV-IR Mixing Might Save Naturalness

 

Several new studies of UV-IR mixing and how it might solve naturalness problems refer back to two papers that appeared in 1999. “There is a growth of interest in these more exotic, non-EFT-like solutions to these problems,” said Patrick Draper, a professor at the University of Illinois, Urbana-Champaign, whose recent work picks up where one of the 1999 papers left off.

Draper and his colleagues study the CKN bound, named for the authors of the ’99 paper, Andrew Cohen, David B. Kaplan and Ann Nelson. The authors thought about how, if you put particles in a box and heat it up, you can only increase the energy of the particles so much before the box collapses into a black hole. They calculated that the number of high-energy particle states you can fit in the box before it collapses is proportional to the box’s surface area raised to the three-fourths power, not the box’s volume as you might think. They realized that this represented a strange UV-IR relationship. The size of the box, which sets the IR scale, severely limits the number of high-energy particle states within the box — the UV scale.

 

 

They then realized that if their same bound applies to our entire universe, it resolves the cosmological constant problem. In this scenario, the observable universe is like a very large box. And the number of high-energy particle states it can contain is proportional to the observable universe’s surface area to the three-fourths power, not the universe’s (much larger) volume.

 

That means the usual EFT calculation of the cosmological constant is too naive. That calculation tells the story that high-energy phenomena should appear when you zoom in on the fabric of space, and this should blow up the energy of space. But the CKN bound implies that there may be far, far less high-energy activity than the EFT calculation assumes — meaning precious few high-energy states available for particles to occupy. Cohen, Kaplan and Nelson did a simple calculation showing that, for a box the size of our universe, their bound predicts more or less exactly the tiny value for the cosmological constant that’s observed.
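A minimal Python version of that "simple calculation," under the usual reading of the CKN argument; the inputs are standard cosmological numbers, order-one factors are dropped, and this is a sketch rather than the authors' exact computation:

```python
# CKN-style bound: vacuum energy in a Hubble-size box must not collapse it into
# a black hole, which gives (up to O(1) factors)  rho_max ~ (M_planck * H0)^2.
M_planck = 1.22e19   # Planck energy, GeV
H0 = 1.5e-42         # Hubble rate (~70 km/s/Mpc) converted to GeV

rho_bound = (M_planck * H0) ** 2   # GeV^4
rho_observed = 2.7e-47             # measured dark-energy density, GeV^4

print(f"bound:    {rho_bound:.1e} GeV^4")     # ~3e-46
print(f"observed: {rho_observed:.1e} GeV^4")  # same ballpark, within an order of magnitude
```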

 

Their calculation implies that big and small scales might correlate with each other in a way that becomes apparent when you look at an IR property of the whole universe, such as the cosmological constant.

 

Draper and Nikita Blinov confirmed in another crude calculation last year that the CKN bound predicts the observed cosmological constant; they also showed that it does so without ruining the many successes of EFT in smaller-scale experiments.

 

The CKN bound doesn’t tell you why the UV and IR are correlated — why, that is, the size of the box (the IR) severely limits the number of high-energy states within the box (the UV). For that, you probably need to know quantum gravity.

 

Other researchers have looked for answers in a specific theory of quantum gravity: string theory. Last summer, the string theorists Steven Abel and Keith Dienes showed how UV-IR mixing in string theory might address both the hierarchy and cosmological constant problems.

A candidate for the fundamental theory of gravity and everything else, string theory holds that all particles are, close up, little vibrating strings. Standard Model particles like photons and electrons are low-energy vibration modes of the fundamental string. But the string can wiggle more energetically as well, giving rise to an infinite spectrum of string states with ever-higher energies. The hierarchy problem, in this context, asks why corrections from these string states don’t inflate the Higgs, if there’s nothing like supersymmetry to protect it.

 

Video: The Standard Model of particle physics is the most successful scientific theory of all time. In this explainer, Cambridge University physicist David Tong recreates the model, piece by piece, to provide some intuition for how the fundamental building blocks of our universe fit together. (Video: Emily Buder/Quanta Magazine; Kristina Armitage and Rui Braz for Quanta Magazine)

 

Dienes and Abel calculated that, because of a different symmetry of string theory called modular invariance, corrections from string states at all energies in the infinite spectrum from IR to UV will be correlated in just the right way to cancel out, keeping both the Higgs mass and the cosmological constant small. The researchers noted that this conspiracy between low- and high-energy string states doesn’t explain why the Higgs mass and the Planck energy are so widely separated to begin with, only that such a separation is stable. Still, in Craig’s opinion, “it’s a really good idea.”

 

The new models represent a growing grab bag of UV-IR mixing ideas. Craig’s angle of attack traces back to the other 1999 paper, by the prominent theorist Nathan Seiberg of the Institute for Advanced Study and two co-authors. They studied situations where there’s a background magnetic field filling space. To get the gist of how UV-IR mixing arises here, imagine a pair of oppositely charged particles attached by a spring and flying through space, perpendicular to the magnetic field. As you crank up the field’s energy, the charged particles accelerate apart, stretching the spring. In this toy scenario, higher energies correspond to longer distances.
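In the standard "dipole" picture behind such setups (a known result for charge pairs in a strong magnetic field, stated here as background rather than drawn from the article), the transverse size of the pair grows linearly with its momentum:

```latex
\Delta x \;\sim\; \frac{p}{qB},
```

so pumping in more energy makes the object bigger, not smaller: the IR and UV trade places, just as the toy scenario suggests.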

 


Seiberg and company found that the UV corrections in this situation have peculiar features that illustrate how the reductionist arrow can be spun round, so that the IR affects what happens in the UV. The model isn’t realistic, because the real universe doesn’t have a magnetic field imposing a background directionality. Still, Craig has been exploring whether anything like it could work as a solution to the hierarchy problem.

 

Craig, Garcia Garcia and Seth Koren have also jointly studied how an argument about quantum gravity called the weak gravity conjecture, if true, might impose consistency conditions that naturally require a huge separation between the Higgs mass and the Planck scale.

 

Dubovsky, at NYU, has mulled over these issues since at least 2013, when it was already clear that supersymmetry particles were very tardy to the LHC party. That year, he and two collaborators discovered a new kind of quantum gravity model that solves the hierarchy problem; in the model, the reductionist arrow points to both the UV and the IR from an intermediate scale. Intriguing as this was, the model only worked in two-dimensional space, and Dubovsky had no clue how to generalize it. He turned to other problems. Then last year, he encountered UV-IR mixing again: He found that a naturalness problem that arises in studies of colliding black holes is resolved by a “hidden” symmetry that links low- and high-frequency deformations of the shape of the black holes.

 


Like other researchers, Dubovsky doesn’t seem to think any of the specific models discovered so far have the obvious makings of a Kuhnian revolution. Some think the whole UV-IR mixing concept lacks promise. “There is currently no sign of a breakdown of EFT,” said David E. Kaplan, a theoretical physicist at Johns Hopkins University (no relation to the author of the CKN paper). “I think there is no there there.” To convince everyone, the idea will need experimental evidence, but so far, the existing UV-IR mixing models are woefully short on testable predictions; they typically aim to explain why we haven’t seen new particles beyond the Standard Model, rather than predicting that we should. But there’s always hope of future predictions and discoveries in cosmology, if not from colliders.

 

Taken together, the new UV-IR mixing models illustrate the myopia of the old paradigm — one based solely on reductionism and effective field theory — and that may be a start.

“Just the fact that you lose reductionism when you go to the Planck scale, so that gravity is anti-reductionist,” Dubovsky said, “I think it would be, in some sense, unfortunate if this fact doesn’t have deep implications for things which we observe.”

 
