Assistant professor Patrick Rafter was sitting in his University of South Florida office in April when he saw an unusual banner across his computer screen. The federal database he had accessed for his research that day would be removed in a matter of weeks. Rafter had visited the NOAA-maintained Index to Marine and Lacustrine Geological Samples (IMLGS) regularly over his 20-year career studying Earth’s climate.

An emergency video call soon followed. Rafter listened as NOAA staff spoke with university research leaders about finding a new home for the database, a browsable catalog of the lake and marine sediment cores kept at institutions around the world. Listed in the database were scientific samples worth “millions and millions of dollars,” says Val Stanley, the Antarctic core curator at Oregon State University’s repository. Without it, scientists looking for cores would need to contact each of the nearly 30 repositories around the world individually.

The IMLGS is one of at least 25 databases and products that NOAA has announced for retirement since April. Some of the databases would still be available online, but no longer updated, while others would be removed from public websites and available only upon archive request. The databases include historic earthquake recordings, satellite readings of cloud radiative properties, and a tool for studying billion-dollar disasters. Rick Spinrad, former NOAA administrator under Joe Biden, says that climate-related physical sciences data seem particularly at risk.

Still more data could be taken offline or no longer updated because of proposed cuts in President Trump’s fiscal year 2026 budget request, which would significantly reduce or cut science programs at NSF, NASA, the Environmental Protection Agency, the National Park Service, NOAA, and the Department of Agriculture. Experts around the world use the data in myriad ways: scientists, for their research; companies, for developing products; lawmakers, for crafting legislation; and nongovernmental organizations and nonprofits, for creating and improving community services.

Grassroots data rescue efforts to protect such vital resources began in Trump’s first term (see Physics Today, March 2017, page 31). Those initiatives have continued and expanded. The volunteer-led Data Rescue Project, the nonprofit Public Environmental Data Project, and groups housed at universities have downloaded more than a thousand federal public datasets, webpages, and online tools.

Sediment cores at the US Geological Survey’s Pacific Coastal and Marine Science Center in Santa Cruz, California, and other repositories around the world are no longer searchable online after NOAA removed its public core database in May. (Photo by Rex Sanders, USGS Pacific Coastal and Marine Science Center.)


But backing up data for safekeeping is often only the first step. Ideally, datasets would have dedicated staff to curate them and to support continued data collection.

Yuri Ralchenko sounded the alarm in mid-March. The atomic spectroscopy group he led at NIST was being shut down ahead of potential reduction-in-force staff cuts in the Department of Commerce, which oversees the institute. “The very first scientific paper from the National Bureau of Standards [NIST’s original name] back in 1904 was on spectra of mixed gases … . Unfortunately, the story of atomic spectroscopy at NIST is coming to an end,” Ralchenko wrote in an email to the scientific community.

The group oversaw the Atomic Spectra Database, a collection of critically evaluated reference spectra of neutral and ionized atoms. Specialists in medicine, the semiconductor industry, astronomy, and other fields used the database; it received about 70 000 search requests per month. If the team of six technical staff lost their jobs, the database would shutter. Hundreds of emails poured in, and a change.org petition racked up almost 5600 signatures. At a time when tens of thousands of federal workers were losing their jobs, the six-person group attracted a large outpouring of support.

The outreach paid off. The University of Maryland, through a collaboration with NASA’s Goddard Space Flight Center, agreed to employ the team. But the transition will take time, Ralchenko says, because the researchers must move their laboratory equipment and install it in a new laboratory at Goddard. The Atomic Spectra Database will still live on NIST’s website, but it won’t receive any new updates. By the end of the year, the team plans to release a mirrored version of the data repository at Goddard, likely under a new name, that will prioritize astronomy and astrophysics applications going forward.

A spectrograph from NIST’s former atomic spectroscopy group will not be relocated to the team’s new home at NASA’s Goddard Space Flight Center, since it is too large and delicate to move. (Photo courtesy of Yuri Ralchenko.)


The team’s instrumental research program, which provides some of the information for the database, will reach full operation in about a year and a half, says Ralchenko. “Since we will be busy with the move, we’ll not be able to produce new results in the meantime.”

For the IMLGS, salvation also came rather quickly. About two weeks after the initial group video call, a new home was announced: the NSF-funded SESAR2 (System for Earth and Extraterrestrial Sample Registration), an online service hosted at Columbia University’s Lamont–Doherty Earth Observatory. SESAR2 went to work creating a mirror of the frozen NOAA database, but its plans for allowing future additions were listed as “TBD.”

University of South Florida’s Rafter was glad to see that IMLGS would live on, but he couldn’t wrap his head around why the change was necessary. “Public data should be in the public domain,” he says. “And run by the feds.”

Brittany Janis, executive director of the nonprofit Open Environmental Data Project, says that some physical sciences data may stay accessible but frozen in time.

C. David Keeling began sampling atmospheric carbon dioxide levels in 1958 at Mauna Loa Observatory in Hawaii. The data record, now called the Keeling curve, shows average CO2 concentrations increasing from roughly 313 parts per million in the first measurement to nearly 430 ppm by April 2025. It has served as one of the key indicators of climate change.

Keeling died in 2005, and his son, Ralph Keeling, took up the work as a professor at Scripps Institution of Oceanography at the University of California, San Diego. Ralph is not worried that the data could be lost because they have a relatively small digital footprint. But he does fear for the future of the data record: Trump’s FY 2026 budget request would cut NOAA’s programs that support taking measurements for the Keeling curve.

The agency also maintains air measurements at more than 50 stations around the world and provides calibrated CO2 samples to hundreds of groups conducting their own measurements, says Keeling. When considering the full scope of federally funded climate- and environment-related observations, he says, “there’s no way that private philanthropy or other organizations can take over more than a fraction of what’s going on.”

Janis says that if the federal government abdicates its role in collecting and hosting data, private companies may selectively pick up the slack, but the products could be proprietary and thus out of reach for the research community and the public.

This article was originally published online on 13 June 2025.
