In the event of a nuclear security incident—for example, if investigators were to discover stolen nuclear material—a number of pressing questions would arise.
Chiefly, who is responsible for the illicit activity? Is it state-sponsored or the work of terrorists?
To help answer these questions, and ultimately hold the perpetrators accountable, experts in nuclear forensics try to learn as much as possible about the nuclear material or weapon, as quickly as possible.
One way to start is by measuring the gamma rays produced by the material.
These measurements reveal important information about the material’s composition, such as the type of isotopes present, which is instrumental in guiding an investigation.
“Knowing these details about a material helps investigators deduce how it could’ve gotten into someone’s hands to make a bomb,” says Arrielle Opotowsky, a PhD student in nuclear engineering and engineering physics. “Analyzing these measurements points you to the reactor technology that created the material, and that’s specific enough to individual countries that you can really narrow down who is responsible just from that one piece of information.”
Scientists perform these measurements on a sample of the material in a lab, but it can take months to complete this complex analysis. In her nuclear forensics research, which is supported by a U.S. Department of Homeland Security fellowship, Opotowsky is exploring a way to potentially speed up this process using a computational approach rather than physical experiments.
Specifically, she’s researching machine learning methods that could quickly determine where a nuclear material came from based on its gamma-ray measurements. This machine learning approach could ultimately enable investigators to rapidly assess the material using a small instrument out in the field.
“It won’t replace an actual scientist doing experiments, but it might be good enough to give some preliminary results quickly, and that can help guide an investigation in the right direction sooner,” says Opotowsky, a member of Grainger Professor of Nuclear Engineering Paul Wilson’s research group.
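The classification task Opotowsky describes can be pictured with a minimal sketch. Everything below is invented for illustration: the binned "spectra," the reactor-type labels, and the nearest-neighbor matching are stand-ins for the actual machine learning methods and real gamma-ray data used in her research.

```python
import numpy as np

# Hypothetical training set: each row is a simplified gamma-ray
# "spectrum" (counts in four energy bins), labeled by the reactor
# type assumed to have produced the material. All values are
# illustrative, not real measurements.
train_spectra = np.array([
    [120.0, 40.0, 15.0,  5.0],   # reactor type A
    [118.0, 42.0, 14.0,  6.0],   # reactor type A
    [ 60.0, 90.0, 30.0, 10.0],   # reactor type B
    [ 62.0, 88.0, 32.0,  9.0],   # reactor type B
])
train_labels = ["A", "A", "B", "B"]

def classify(spectrum):
    """Nearest-neighbor match: normalize each spectrum so only the
    relative isotopic signature matters, then return the label of
    the closest training example."""
    def norm(s):
        return s / np.linalg.norm(s)
    dists = [np.linalg.norm(norm(spectrum) - norm(t)) for t in train_spectra]
    return train_labels[int(np.argmin(dists))]

# A made-up "field measurement" whose shape resembles type B.
measured = np.array([61.0, 89.0, 31.0, 10.0])
print(classify(measured))  # -> B
```

A fielded tool would use far richer spectra and a trained model rather than raw nearest-neighbor lookup, but the pipeline shape is the same: measure, extract a signature, and match it against known reactor technologies.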
Computational tools are also a powerful resource for preventing the spread of nuclear weapons—and EP PhD student Katie Mummah’s research promises to aid international nonproliferation efforts.
Mummah is also a member of Wilson’s research group, which has developed nuclear fuel cycle simulation tools. Mummah is harnessing those software tools to provide additional insight into how nuclear material is passing through various pathways, from one facility to another, throughout its lifecycle. This information, such as how quickly material is accumulating in a certain pathway, can help alert the nonproliferation community to possible red flags.
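The kind of pathway monitoring this supports can be sketched as a toy inventory tracker. The facility names, transfer amounts, and alert threshold below are all hypothetical; real fuel cycle simulation tools like those developed in Wilson's group model far more detail.

```python
from collections import defaultdict

# Toy record of nuclear material moving between fuel-cycle facilities.
# Tuples are (timestep, from_facility, to_facility, kilograms).
# All names and quantities are invented for illustration.
transfers = [
    (1, "mine",       "conversion", 100.0),
    (1, "conversion", "enrichment",  90.0),
    (2, "conversion", "enrichment",  95.0),
    (2, "enrichment", "fuel_fab",    20.0),
    (3, "conversion", "enrichment",  97.0),
    (3, "enrichment", "fuel_fab",    18.0),
]

ALERT_KG = 150.0  # hypothetical accumulation threshold

inventory = defaultdict(float)  # mine acts as a source, so it goes negative
alerts = []
for t, src, dst, kg in transfers:
    inventory[src] -= kg
    inventory[dst] += kg
    # Flag a facility whose inventory grows past the threshold --
    # material accumulating in one pathway faster than it leaves.
    if inventory[dst] > ALERT_KG:
        alerts.append((t, dst, inventory[dst]))

print(alerts)  # enrichment accumulates past the threshold at timesteps 2 and 3
```

The alert list is the "red flag" output: a facility where material is piling up faster than it is being passed along, which is exactly the sort of pattern the nonproliferation community would want surfaced early.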
Now, the EP department is growing its research and education footprint in nuclear security by participating in two new consortia—the Consortium for Enabling Technologies and Innovation (ETI), and the Consortium for Monitoring, Technology and Verification (MTV).
The U.S. Department of Energy’s National Nuclear Security Administration (NNSA) is funding both consortia, which link basic research at universities with the capabilities of the U.S. national laboratories to advance the NNSA’s nuclear science, security and nonproliferation goals and educate its future workforce.
In the ETI consortium, composed of 12 universities and 10 national laboratories and led by Georgia Tech, Wilson is leading the data science research area.
Researchers in this area will investigate methods for combining and analyzing many different kinds of data to detect behaviors and anomalies in the actions of both nation states and non-state actors that telegraph nuclear weapons proliferation.
Wilson says that data could come from a wide variety of places, including satellite images, U.S. intelligence services, process data measured at nuclear facilities, news stories and social media feeds, and nuclear safeguards put in place by mutual agreement among countries.
“We’re interested in modern algorithms that can combine all of this data and infer whether people are acting in accordance with how we expect them to act,” Wilson says.
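One simple way to picture this data fusion is a weighted score combining several sources, with a statistical flag for periods that deviate from baseline behavior. The source names, weights, and values below are all invented for illustration; the actual algorithms of interest are far more sophisticated.

```python
import statistics

# Hypothetical weekly "activity" indicators from several sources,
# each already scaled to 0-1. Values and weights are made up.
observations = {
    "satellite":    [0.10, 0.12, 0.11, 0.13, 0.45],
    "process_data": [0.20, 0.19, 0.21, 0.22, 0.60],
    "open_media":   [0.05, 0.06, 0.04, 0.07, 0.08],
}
weights = {"satellite": 0.4, "process_data": 0.4, "open_media": 0.2}

def fused_scores(obs, w):
    """Combine the sources into one weighted score per week."""
    n = len(next(iter(obs.values())))
    return [sum(w[k] * obs[k][i] for k in obs) for i in range(n)]

def flag_anomalies(scores, k=2.0):
    """Flag weeks whose fused score exceeds the mean of the earlier
    baseline weeks by more than k standard deviations."""
    baseline = scores[:-1]
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return [i for i, s in enumerate(scores) if s > mu + k * sigma]

scores = fused_scores(observations, weights)
print(flag_anomalies(scores))  # -> [4]: the final week deviates from baseline
```

Here the final week's jump across two of the three sources pushes the fused score past the baseline threshold, mirroring the idea of inferring whether observed behavior matches expectations.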
ETI is designed to draw top researchers from other fields and disciplines into the area of nonproliferation research, and UW-Madison is making a significant contribution to the consortium’s interdisciplinary strength with five principal investigators across ETI’s three core research areas.
Dan Thoma, materials science and engineering professor and director of the Grainger Institute for Engineering at UW-Madison, is participating in the consortium’s advanced manufacturing area, which is focused on recognizing when others might be using new manufacturing technologies for covert nuclear weapons purposes.
Electrical and Computer Engineering Professor Rob Nowak is contributing his expertise in signal processing, machine learning, optimization and statistics. ECE Assistant Professor Andreas Velten is researching novel light-collection methods that could enable smaller, lower-energy radiation detection systems.
And Phil Townsend, a professor of forest and wildlife ecology at UW-Madison, will contribute his expertise in airborne hyperspectral imaging of plants. His research aims to identify how changes to foliage in an environment could signal nuclear weapons proliferation.
“I was able to bring all these researchers together with their diverse expertise, and as a result, UW-Madison is one of the major participants in ETI,” Wilson says.
Additionally, Wilson is participating in the MTV consortium as a principal investigator. That consortium, a partnership of 14 universities led by the University of Michigan, seeks to improve U.S. capabilities to monitor the nuclear fuel cycle.
In his cross-cutting role as the nuclear policy lead, Wilson’s goal is to ensure that students understand how their fundamental research is relevant to the actual monitoring and detection of nuclear proliferation. In particular, he will provide insight on how policy issues play a key role in determining which technologies will ultimately be implemented.
“Every time a new technology is available doesn’t necessarily mean it’s going to be deployed in the field,” he says. “That’s because international safeguards are put in place through an open, negotiated process, and the countries and companies that are being monitored have to agree to certain technologies being deployed in their facilities. And they may have legitimate reasons, such as protecting proprietary business information, for opposing a technology. So I want to help the students understand that these policy issues matter and it’s not just a purely technical pursuit.”