Month: May 2021

Standard Model challenged by new measurement

The following article appeared in UConn Today on May 20, 2021.

Physicists are one step closer to describing an anomaly, known as the muon g-2 anomaly, that could challenge the fundamental laws of physics. It seems the muon may be breaking what have been understood to be the laws of physics, and the findings announced on April 7 were met with much excitement and speculation about what they might mean. UConn physics researchers Professor Thomas Blum and Assistant Professor Luchang Jin helped pioneer the theoretical physics behind the findings, and they recently met with UConn Today to help explain the excitement.

What is a muon, and how do you study them?

Blum: A muon is a “fundamental particle,” meaning it’s an elementary particle like an electron or a photon. Muons are unstable, so they don’t live very long. Unlike electrons, which we can hold onto and measure for as long as we want, muons give us only a little bit of time to take measurements.

The way researchers perform the experiment is by slamming particles into other particles to create the muons, which they eventually collect into a beam. This beam of muons travels at almost the speed of light, so the muons live longer than they would if they were at rest. That’s Einstein’s theory of relativity in action.
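For a rough sense of the size of this effect (the numbers below are standard reference values for the muon lifetime and the g-2 storage-ring beam, not figures quoted in the interview): a muon at rest lives about 2.2 microseconds, while the muons in the ring are boosted by a Lorentz factor of roughly 29, so

\[
\tau_{\text{lab}} = \gamma\,\tau_{0} \approx 29.3 \times 2.2\,\mu\text{s} \approx 64\,\mu\text{s},
\]

which is what gives the experimenters enough time to store the beam and watch the decays.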

The researchers put the muons into what’s called a storage ring where, eventually, they decay into other particles, and it’s those other particles that are detected in the experiment.

Muons have a property called a magnetic moment, which acts like a little compass needle that points along the magnetic field it sits in. The storage ring holds a uniform magnetic field, and as the muons circulate, their magnetic moment precesses with respect to their direction of travel because of the interaction with that field. If there were no anomaly, the moment would stay perfectly aligned with the direction of travel as the muons go around the ring.

It’s that precession that they’re measuring, because the precession is proportional to the strength of the magnetic moment. We can measure this magnetic moment extremely precisely in experiments, and we can calculate its value theoretically very precisely, to less than one-half part per million. Then we can compare the two and see how well they agree.
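In textbook form (a simplified expression that ignores the electric-field and beam-dynamics corrections handled in the real experiment), the quantity being measured is the anomaly

\[
a_{\mu} \equiv \frac{g-2}{2},
\]

and the spin precesses relative to the direction of travel at the anomalous frequency

\[
\omega_{a} = a_{\mu}\,\frac{eB}{m_{\mu}},
\]

so measuring the precession frequency \(\omega_{a}\) and the magnetic field \(B\) determines \(a_{\mu}\).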

Can you explain the excitement surrounding these results?

Blum: For a long time, almost 20 years, the best measurement had been done at Brookhaven National Lab on Long Island, where they measured this magnetic moment very precisely and found that it didn’t agree with our best fundamental theory, the Standard Model of particle physics. The discrepancy wasn’t big enough, though, to say definitively whether something was wrong with the Standard Model.

The new results are from a new experiment done to measure the magnetic moment even more precisely. That effort has been going on at Fermilab outside of Chicago for a few years now, and they just announced these results in early April. Their measurement is completely compatible with the Brookhaven value, and if you take the two together, then the disagreement with the Standard Model gets even worse: it now stands at 4.2 standard deviations.
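For reference, the publicly quoted 2021 numbers (from the Fermilab announcement and the Muon g-2 Theory Initiative’s Standard Model prediction, not figures taken from this interview) show where the 4.2 comes from, in units of \(10^{-11}\):

\[
a_{\mu}^{\text{exp}} = 116\,592\,061(41), \qquad
a_{\mu}^{\text{SM}} = 116\,591\,810(43),
\]
\[
\Delta a_{\mu} = 251 \pm 59
\quad\Rightarrow\quad
\frac{251}{\sqrt{41^{2}+43^{2}}} \approx 4.2\ \text{standard deviations}.
\]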

People are very excited, because this could possibly signal that there is new physics in the universe that we don’t know about yet. The new physics could be new particles that we’ve never seen before, or new interactions beyond the ones we already know about, and it could explain the difference between what’s measured and what’s calculated. So that’s what everybody’s excited about.

Can you tell us about the Standard Model?

Jin: The Standard Model describes electromagnetic interactions between charged particles. It also describes the so-called weak interactions, which are responsible for nuclear decay; the weak interactions become more important in high-energy collisions and unify with the electromagnetic interactions. Lastly, the Standard Model describes the strong interactions, which bind quarks into nucleons and nuclei.

Basically, the Standard Model describes everything around us, from things happening in our daily lives to the high-energy proton collisions in the Large Hadron Collider. The major exception is gravity, which is far too weak to matter for individual particles; we only feel it because gravitational forces always add up, and there are a lot of massive objects around us. The Standard Model also doesn’t include dark matter, if we actually do have that in the universe.

People believe, and I think this is really true, that the Standard Model cannot possibly describe everything to extremely high precision, especially when we accelerate subatomic particles to very high energies. However, it has not been clear how high the energy, or how fine the precision, has to be before we see some discrepancy. We know the upper bound, usually referred to as the Planck scale, where the Standard Model has to fail because it leaves out gravity. But the Planck scale is so high that there is little hope of performing experiments at that energy. It would be very nice to find a concrete example where the Standard Model actually misses something, and the g-2 anomaly is a very good candidate.
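For scale (standard reference values, not figures from the interview), the Planck energy is

\[
E_{\text{Planck}} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2\times 10^{19}\ \text{GeV},
\]

roughly fifteen orders of magnitude above the \(1.3\times 10^{4}\) GeV collision energy of the Large Hadron Collider.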

What roles did you each have in this research?

Jin: Theoretically, we decompose g-2 into contributions from the different types of interactions. At present, most of those contributions are obtained by analytic calculation. Other experimentally measurable quantities, ones that have little to do with the muon magnetic moment experiments in terms of what is actually measured, can be related to the muon g-2 value through the Standard Model, so to a large extent the result can still be viewed as a theory prediction. Blum pioneered the first lattice calculation of one of these contributions, the hadronic vacuum polarization, which doesn’t use experimental data at all.
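Schematically, and with rounded values taken from the 2020 Muon g-2 Theory Initiative white paper rather than from this interview (in units of \(10^{-11}\)), the decomposition looks like

\[
a_{\mu}^{\text{SM}} = a_{\mu}^{\text{QED}} + a_{\mu}^{\text{EW}} + a_{\mu}^{\text{HVP}} + a_{\mu}^{\text{HLbL}}
\approx 116\,584\,719 + 154 + 6\,845 + 92 = 116\,591\,810,
\]

where the hadronic vacuum polarization (HVP) and hadronic light-by-light (HLbL) terms carry by far the largest uncertainties, which is why lattice calculations of them matter so much.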

Blum: Jin came up with new methods to compute the hadronic light-by-light contribution, which allowed us, with colleagues at Brookhaven National Lab, Columbia University, and Nagoya University, to compute it completely for the first time without experimental input. What Jin and I are doing, along with a host of other theorists around the world, is trying to calculate the value of this magnetic moment more precisely on the theory side, so that we can have an even better comparison with the experimental measurements.

Jin: The Standard Model itself has only a few parameters, most of which we know very, very precisely. These include the masses of the fundamental particles. In principle, as one might imagine, the theory prediction of the muon g-2 is just a very complicated expression in terms of these numbers. We are not yet able to evaluate it completely that way, but maybe soon we can. We expect that as we continue to improve our calculations, and as computers continue to get faster, the last digits of that determination will become more accurate.

To dig deeper into the science behind the findings, read Blum and Jin’s feature article in CERN Courier.

Prof. Battersby’s research featured in UConn Today article

Professor of Physics Cara Battersby (center) talks to attendees at a solar eclipse viewing in 2017.

Professor Cara Battersby’s research is featured in the article “The Study of Big Data: How CLAS Researchers Use Data Science,” published by UConn Today.

Prof. Battersby’s work focuses on describing and studying the center of the Milky Way galaxy, which she calls an “experimental playground” for the distant cosmos. Her work uses spectroscopy of the galaxy’s center, analyzing the light collected there to understand the chemical makeup of the region, as well as its temperature and the velocities of the objects within it.

Battersby works with data from the Submillimeter Array, a facility of eight radio telescopes situated atop Maunakea in Hawaii. The array can collect up to a terabyte of data every day, and Battersby’s project used 61 days of data.

Battersby refers to her computer as “her laboratory,” and she makes sure the students in her classes treat theirs the same way. In her courses, she often assigns programming and analysis problems, like using a large data set to determine the material composition of the Sun.

“We have a lot of the tools to train students in data science,” she says. “Research is moving in that direction, and students in our programs are prepared for it.”