In 1958, Bell Labs physicist Philip Anderson proposed a theory about the absence of wave diffusion in disordered media, work that earned him a share of the 1977 Nobel Prize in Physics. Since then, the idea, known as Anderson localization, has remained just that: a theory. That's because experimental confirmation of the concept has proved extremely tricky, and the computational firepower needed to fully simulate the phenomenon has been prohibitively expensive.
But now Zongfu Yu, a professor of electrical and computer engineering at the University of Wisconsin-Madison, working with colleagues at Yale University, the Missouri University of Science & Technology, CNRS in France and the startup company Flexcompute, has used cutting-edge techniques developed at UW-Madison to successfully simulate Anderson localization using computational electromagnetics. The research not only settles a longstanding question in physics but also provides a roadmap for tackling similarly intractable problems. It was published in June in the journal Nature Physics.
“For many decades, people have worked on Anderson localization and tried to prove this theory experimentally and using computational methods,” says Yu. “But it has always been elusive. Being able to provide compelling evidence to settle the debate is tremendously useful for the community.”
Anderson localization is a phenomenon believed to affect all sorts of waves, including electromagnetic, electron, seismic, and water waves. In general, if a wave encounters an obstacle, it will scatter, but it will also continue to propagate beyond the object. Anderson localization proposes that if the wave encounters multiple obstacles in a disordered material, the wave will scatter and bounce back and forth between the objects, becoming trapped and producing a localized wave.
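In quantitative terms, the textbook signature of localization (a standard result, not something specific to this study) is that the trapped wave's envelope decays exponentially away from some center point, and transmission through a sample falls off exponentially with its thickness:

```latex
% Exponential envelope of a localized wave, with localization length \xi,
% and the corresponding scaling of transmission T with sample thickness L.
|\psi(\mathbf{r})| \sim e^{-|\mathbf{r} - \mathbf{r}_0|/\xi},
\qquad
T(L) \sim e^{-L/\xi}
```

Here $\psi$ is the wave field, $\mathbf{r}_0$ is the center of the localized mode, and $\xi$ is the localization length. A wave spreading by ordinary diffusion, by contrast, shows only a much gentler power-law falloff of transmission, which is one reason the two regimes are hard to tell apart in noisy measurements.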
Yu says that experimental researchers have found some indirect evidence for the phenomenon, but because the signals produced in the experiments are so subtle, they are often contaminated or drowned out by statistical “noise.”
To prove the theory using computational simulation has also proven difficult. “You not only need a very large computational domain to accommodate the large size of the problem, but you also need to have fine resolution to resolve all the wave physics,” says Yu. “More importantly, a single hero simulation will not be sufficient. You need a systematic study looking at different parameters, and probing the physics from different angles. That requires thousands of ultra-fast simulations, which makes it practically difficult and time consuming.”
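To give a sense of scale, here is a back-of-envelope estimate in Python. The numbers are illustrative assumptions of ours, not parameters from the paper, but they show how quickly a fully resolved 3D simulation grid grows:

```python
# Back-of-envelope estimate of 3D FDTD cost. All numbers below are
# illustrative assumptions, not the actual parameters used in the study.

domain_wavelengths = 100    # assumed domain span: 100 wavelengths per side
cells_per_wavelength = 20   # assumed resolution needed to capture wave physics
fields_per_cell = 6         # Ex, Ey, Ez, Hx, Hy, Hz stored in 3D FDTD
bytes_per_value = 4         # single-precision floats

cells_per_side = domain_wavelengths * cells_per_wavelength
total_cells = cells_per_side ** 3
memory_gb = total_cells * fields_per_cell * bytes_per_value / 1e9

print(f"grid: {cells_per_side}^3 cells = {total_cells:.1e}")
print(f"field storage alone: ~{memory_gb:.0f} GB per run")
```

Under these assumptions a single run must track roughly 8 billion grid cells and about 190 GB of field data, and the systematic study Yu describes multiplies that cost by thousands of runs.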
For this project, however, the team was able to use Tidy3D, a tool developed by Flexcompute, a company Yu co-founded. Tidy3D models electromagnetic waves using the finite-difference time-domain (FDTD) method, a powerful technique for simulating nanoscale optical devices. By running this algorithm on the advanced computing chips available in the cloud, the same chips that power AI, the team was able to speed up its calculations by orders of magnitude and run its simulations thousands of times, giving a much better picture of what's going on. "Our progress was enabled by this new computational power we developed here in Madison that allows people to study huge problems in a high-throughput fashion," says Yu.
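To illustrate the core idea of the method, here is a generic one-dimensional teaching sketch, not Tidy3D's implementation; every size and material value in it is an assumption of ours. FDTD steps the electric and magnetic fields forward in time in a leapfrog pattern on a staggered grid:

```python
import numpy as np

# Minimal 1D FDTD sketch in normalized units: leapfrog updates of E and H
# on a staggered (Yee) grid, with random high-index scatterers standing in
# for a disordered medium. Illustrative only; not the study's setup.

nx, nt = 400, 1000        # grid cells and time steps (arbitrary choices)
dx = 1.0                  # normalized cell size
dt = 0.5 * dx             # time step within the Courant stability limit

ez = np.zeros(nx)         # electric field, sampled on cell edges
hy = np.zeros(nx - 1)     # magnetic field, staggered half a cell over

# Disordered permittivity: random high-index scatterers in a background of 1.
rng = np.random.default_rng(0)
eps = np.ones(nx)
eps[rng.choice(np.arange(10, nx - 10), size=40, replace=False)] = 12.0

for t in range(nt):
    hy += (dt / dx) * (ez[1:] - ez[:-1])                      # curl of E
    ez[1:-1] += (dt / (dx * eps[1:-1])) * (hy[1:] - hy[:-1])  # curl of H
    ez[5] += np.exp(-((t - 30) ** 2) / 100.0)                 # soft Gaussian source

print("peak |Ez| after", nt, "steps:", np.abs(ez).max())
```

The real computation is the same leapfrog idea extended to three dimensions and billions of cells, which is where the GPU-class cloud chips Yu mentions pay off.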
In fact, Susan Hagness, a professor of electrical and computer engineering at UW-Madison, coauthored Computational Electrodynamics: The Finite-Difference Time-Domain Method, the textbook Yu relied on for the simulation. "We used exactly the same method written in the book, but implemented it with a newer generation of computing chips," says Yu. "This new technology enabled simulations to run 100 times faster than traditional computing technology."
The simulation confirmed Anderson's theory more than half a century after it was first proposed. It also explained why experimentalists have had such a difficult time: the simulations suggest that the effects of Anderson localization were simply not observable in previous experiments. The work did, however, identify a non-intuitive setup that may make it possible to demonstrate the phenomenon experimentally.
Yu says this type of computing resource could also be applied to many other electromagnetic physics problems, such as optimizing the complex optical signals used in data centers and developing nanostructured lenses for applications like chip-based lidar, a remote sensing technology. "Those are just a few problems these computational methods can be used for," he says. "The discoveries enabled by ever more powerful computing never fail to surprise and excite us."
Zongfu Yu is the Jack St. Clair Kilby Associate Professor and H.I. Romnes Fellow. Susan Hagness is the Philip Dunham Reed Professor and chair of electrical and computer engineering.
Other authors include Professor Alexey Yamilov of Missouri University of Science & Technology, Sergey E. Skipetrov of CNRS, Tyler Hughes and Momchil Minkov of Flexcompute, and Professor Hui Cao of Yale University.
The authors acknowledge support from the National Science Foundation under grants DMR-1905442, DMR-1905465 and the Office of Naval Research under grant N00014-20-1-2197.