
Last April, Los Alamos National Laboratory nudged open the security gates and welcomed outside researchers from biophysics, plasma physics, materials science, Earth systems studies, and more for an unusual monthlong conference.
The goal of the inaugural Scale Bridging Meeting and Workshop was for interdisciplinary scientists to share the challenges they face and the tricks they employ in solving complex computational problems. The Los Alamos organizers also hoped that the gathering would lead to advances in the simulations that physicists use to understand—and thus maintain and modernize—nuclear weapons.
“Often, what you find in science in general is you have these silos, and people are making advancements in their own silo and often reinventing things that other fields have already developed,” says Jesse Capecelatro, an engineer at the University of Michigan who attended the meeting. “I think this cross-fertilization is really important for advancing science as a whole, and that was sort of the vibe.”
The fact that academic researchers were conversing with scientists whose work is at least partially classified added intrigue to the proceedings.
Making a guest list
Chris Fryer, a computational physicist at Los Alamos who co-organized the meeting, says the idea came in part from a historical perspective on the lab’s role in the computational sciences landscape. The lab’s secrecy and siloed nature were, perhaps, fine when the Department of Energy was the powerhouse in computation, with world-class supercomputers and computational methods. “Now they’re used everywhere,” says Fryer.
And everywhere, people without security clearances are coming up with clever ideas. “We can’t isolate ourselves because we are now a small fraction of all the computational scientists in the world,” Fryer says. At the same time, Los Alamos scientists are tackling “stuff that computational scientists across all disciplines are worried about,” he says, such as innovative computational methods and algorithms that could make more accurate models of nuclear weapons or airplane wings.
One way to foster an exchange of knowledge, Los Alamos officials thought, would be to host a long, intensive computational workshop in the style of the Aspen Center for Physics, which brings experts together for weeks-long collaboration sessions on focused physics topics. “Los Alamos seemed like a great place to do that with our prowess and long history in computing,” says Aimee Hungerford, the deputy leader for the lab’s computer, computational, and statistical sciences division.
The organizers settled on the topic of bridging scales: connecting small size and time scales to large ones in a computational problem. Los Alamos scientists saw scale-bridging problems popping up and plaguing their work on nuclear weapons and on basic physics. And they knew that the same issues plagued researchers in other fields.
The organizers advertised the workshop and also looked to their home turf for potential attendees. Los Alamos scientists study many topics, including pandemics, clean energy, and drug design, in addition to nuclear weapons and the related scientific disciplines that inform their design and function. Fryer asked his topically diverse lab colleagues to recommend thinkers in their fields.

In the end, that meant attendees like Paul Ricker, a computational astrophysicist at the University of Illinois Urbana-Champaign who researches active galactic nuclei, the bright centers of distant galaxies where supermassive black holes are releasing energy in the form of relativistic jets. To grok the jets, he has to understand galaxy clusters that are around 3 million light-years across, galaxies that are perhaps 300 000 light-years in diameter, and black holes roughly the size of our solar system. And he hasn’t yet.
Capecelatro studies fluid dynamics and turbulence and their applications in fields such as renewable energy, disease transmission, and space exploration. “One of the beautiful things is, we actually have a set of equations that describe exactly how fluids move around and interact,” he says. But the huge range of time and length scales makes them intractable. “Even though we know the equations, there’s no analytic solution, and we don’t have any computer big enough to solve them.”
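The equations Capecelatro has in mind are the Navier–Stokes equations of fluid dynamics. In their incompressible form, one common way to write them, they read

\[
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0,
\]

where \(\mathbf{u}\) is the fluid velocity, \(p\) the pressure, \(\rho\) the density, \(\mu\) the viscosity, and \(\mathbf{f}\) any body force. Turbulence couples the largest flows governed by these equations to ever-smaller eddies, which is where the scale problem bites.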
The invite list included many people who were familiar not only with scale bridging but also with the lab, its scientists, and its sometimes controversial work. Capecelatro was a postdoc funded by the National Nuclear Security Administration; Ricker did unclassified work at Los Alamos on DOE’s Accelerated Strategic Computing Initiative, which was established after the US stopped explosive testing of nuclear weapons and needed better simulations of them. Other attendees had used DOE supercomputers—outside scientists can collaborate with lab researchers on projects and so be included on applications for time on the machines.
Knowledge diffusion
Each morning from the end of April through the end of May, the attendees commuted from their hotels and Airbnbs in town and met for an hour or so to chat about what they’d been pondering overnight. Then they outlined goals for the coming day, went off to think more about them, and reconvened in the evening to talk about what they’d learned.
The discussions weren’t always smooth. For instance, the environmental scientists in attendance described the concept of diffusion in terms of Darcy’s law, which is used to describe the flow of a fluid through a porous medium; astrophysicists and others had no familiarity with that term. “This is why it’s good to bring people together, because at some point it’s like, ‘I don’t understand what you’re saying. Write up the equation on the board,’ ” says Fryer. “And you write the equation, and you go, ‘So we do have a common language. It’s called math.’ ”
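For reference, Darcy’s law says the volumetric flux \(\mathbf{q}\) of a fluid seeping through a porous medium is proportional to the pressure gradient driving it,

\[
\mathbf{q} = -\frac{k}{\mu}\,\nabla p,
\]

where \(k\) is the permeability of the medium and \(\mu\) is the fluid’s dynamic viscosity. The law plays the role of a diffusion-like transport relation in groundwater and subsurface-flow models.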
Once they got their lexicon under control, the researchers went over the tools they’ve been using to bridge scales. Those include stochastic methods, like Monte Carlo simulations, and finite-volume methods that take a continuous equation and break it down into small parts that can be represented on a grid.
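As a generic illustration of the finite-volume idea, not code from the workshop and with every name and parameter invented here, the sketch below discretizes the one-dimensional linear advection equation \(\partial_t u + a\,\partial_x u = 0\) into cell averages and updates each cell from the fluxes through its faces:

```python
import numpy as np

# Minimal 1D finite-volume solver for linear advection, u_t + a u_x = 0.
# Illustrative only: first-order upwind fluxes on a periodic unit-interval grid.

a = 1.0                        # constant advection speed
n_cells = 200                  # number of finite-volume cells
dx = 1.0 / n_cells             # cell width
dt = 0.5 * dx / abs(a)         # time step satisfying the CFL stability condition

x = (np.arange(n_cells) + 0.5) * dx       # cell-center coordinates
u = np.exp(-200 * (x - 0.3) ** 2)         # initial condition: a Gaussian pulse

def step(u):
    """Advance the cell averages by one time step using upwind face fluxes."""
    # Flux through each cell's left face, taken from the upwind side (periodic).
    flux_left = a * np.roll(u, 1) if a > 0 else a * u
    # The right-face flux of cell i is the left-face flux of cell i+1.
    flux_right = np.roll(flux_left, -1)
    # Finite-volume update: the cell average changes by the net flux per cell width.
    return u - dt / dx * (flux_right - flux_left)

for _ in range(400):
    u = step(u)
```

Each cell exchanges fluxes only with its immediate neighbors, which is what lets finite-volume schemes scale to enormous grids.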

During discussions, people came across new methods that weren’t common in their own disciplines. Ricker is keen on heterogeneous multiscale modeling. In hydrodynamic simulations, Ricker explains, you move forward in time in finite steps, and you see how the system changes at each step. “The time step that you take is really long compared with the characteristic time scales of the small scale,” he says.
To account for that, Ricker learned through discussions, you can do a sort of sub-simulation. “In between one of these giant steps, you actually run local simulations of the small scales that are resolved but that don’t cover the entire domain, that just cover the small region, and then only go for the duration of that one step,” he says. Within the giant step, the local sub-simulations take many small steps and make a prediction. Ricker is interested in seeing how the technique might benefit his work on active galactic nuclei.
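That recipe is the core of heterogeneous multiscale modeling. The toy sketch below, a generic illustration with made-up dynamics and parameters rather than anything from the workshop, advances a slow variable in giant macro steps while a brief, fully resolved micro simulation inside each step supplies the averaged force:

```python
import numpy as np

# Toy heterogeneous multiscale modeling (HMM) sketch.
# A slow variable X obeys dX/dt = f(X, y), where the fast variable y relaxes on
# a time scale eps toward an X-dependent equilibrium. Rather than resolving y
# for the whole run, each giant macro step launches a short micro simulation,
# averages the force it produces, and hands that average back to the macro step.

eps = 1e-4            # fast time scale
dT = 0.05             # macro ("giant") time step, much larger than eps
dt = eps / 10         # micro time step, small enough to resolve the fast dynamics
n_micro = 200         # micro steps per macro step (about 20 fast time scales)
n_macro = 20          # macro steps: integrates out to t = n_macro * dT = 1

def f_slow(X, y):
    return -y * X                        # force on the slow variable

def g_fast(X, y):
    return -(y - np.cos(X)) / eps        # fast relaxation toward y = cos(X)

X, y = 1.0, 0.0
for _ in range(n_macro):
    # Micro solver: resolve the fast variable locally, with X held fixed,
    # only for the short duration needed inside this one macro step.
    forces = []
    for _ in range(n_micro):
        y += dt * g_fast(X, y)
        forces.append(f_slow(X, y))
    # Macro solver: one giant forward-Euler step using the time-averaged force,
    # discarding the initial transient of the micro run.
    X += dT * np.mean(forces[n_micro // 2:])
```

The micro solver never spans the whole domain or the whole simulated time, which is exactly the saving Ricker describes.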
Other astrophysicists embraced a technique employed by materials scientists. Following a supernova explosion, radiation travels through and interacts with the clumpy stellar wind. The x-ray photons that astronomers detect are often more energetic than calculations predict because those calculation methods don’t account for the small-scale interactions that trigger shocks and energize the outgoing radiation.
Materials scientists at the meeting described ways to preserve multiple effects that they quantify from the microscale. To do that, they break down a material into imagined components; each piece has its own characteristics that together determine the properties of the whole. An initial calculation might involve how pairs of adjacent pieces interact; subsequent iterations might involve groups of three, then four. With each refinement, the model of the whole grows more accurate while retaining the information of its parts. If astronomers can similarly break down a supernova remnant into such pieces, they may be able to capture the multiscaled interactions between the radiation and circumstellar material in the same simulation.
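The article does not name the materials-science method, but the description, single pieces, then pair corrections, then triplets and quadruplets, reads like a many-body or cluster-type expansion. Under that assumption, and with entirely hypothetical numbers and function names, a toy version looks like this:

```python
from itertools import combinations
import math

# Toy many-body (cluster-type) expansion sketch.
# A hypothetical whole-system property E(subset) is nonlinear in its pieces, so
# it cannot be built from the pieces alone. The expansion rebuilds the full
# value from single pieces, then pair corrections, then triplet corrections,
# growing more accurate with each order while keeping per-piece information.

pieces = [0.8, 1.3, 0.5, 2.1, 1.7]      # hypothetical per-piece attributes

def E(subset):
    """Property of a group of pieces; the square root makes interactions matter."""
    return math.sqrt(sum(pieces[i] for i in subset))

def correction(subset, cache):
    """n-body correction: E(subset) minus every lower-order contribution."""
    total = E(subset)
    for k in range(1, len(subset)):
        for sub in combinations(subset, k):
            total -= cache[sub]
    cache[subset] = total
    return total

indices = range(len(pieces))
exact = E(tuple(indices))
cache, approx = {}, 0.0
for order in range(1, len(pieces) + 1):
    for group in combinations(indices, order):
        approx += correction(group, cache)
    print(f"truncated at {order}-body terms: {approx:.4f}   (exact: {exact:.4f})")
```

Each additional order folds more of the pieces’ joint behavior back into the total, mirroring the refinement the materials scientists described.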
The back-and-forth learning flowed between disciplines and between academic and national lab scientists. Sometimes, the lab scientists have pinned down more detailed physics in their simulations because they’re dealing with real-world problems of high consequence and can’t abide the large error bars of some astrophysics calculations. “They have a lot of practical problems to address,” Ricker says. But that has a flip side. “There’s a problem focus that I think is less true in academia, where you’re more wide ranging, and maybe if an interesting idea comes up, then you’re willing to go off in this direction.”
That intellectual freedom can lead to more creativity. And academics’ more frequent and less managed interactions with students and colleagues can make them better at rendering their ideas comprehensible to people from different backgrounds. “We don’t have anything we can’t talk about, and there’s obviously stuff that they can’t talk about,” says Ricker about national lab scientists like Fryer. That secrecy can limit both sides’ ability to collaborate. “The type of problems that they’re working on, I don’t have clearance to know a lot of those details,” says Capecelatro. “And so it’s interesting to be in a setting where you have to sort of guess why they care about certain things.”
The workshop’s results, however, will be wide open. Fryer is writing up the conclusions for publication in a peer-reviewed journal. He hopes that scientists, particularly early-career researchers, from any relevant discipline can learn to bridge scales from the month on the mesa.