
NSF funding to reduce sim-to-real gap for AI-enabled robotics

Written By: Caitlin Scott



Mechanical Engineering professor Dan Negrut recently received a four-year NSF award for the project "Collaborative Research: Differentiable and Expressive Simulators for Designing AI-enabled Robots." Negrut is the Bernard A. and Frances M. Weideman Professor of Mechanical Engineering and co-Principal Investigator on the project with colleague Karen Liu of Stanford University.

The project produces knowledge and establishes algorithms that enable computer simulation to reduce design time and costs in engineering intelligent and safe robots. The award supports fundamental research to reduce, and whenever possible eliminate, the sim-to-real gap, unlocking the full potential of AI-enabled robots. The project also advances the state of the art in robotic technologies and broadens participation in computing among high-school students from underrepresented groups. The project is supported by the cross-directorate Foundational Research in Robotics program, jointly managed and funded by the Directorates for Engineering (ENG) and Computer and Information Science and Engineering (CISE).

Special thanks to students Asher Elmquist and Aaron Young, who laid the groundwork for this simulation-in-robotics NSF project. Asher will be graduating this semester with a PhD, and Aaron has been admitted this year into the PhD program at MIT.

The project period is September 2022 – August 2026. The total project award is $0.94 million, with UW-Madison receiving $423,000.

[Images: physical ART vehicle and real scenario; digital ART vehicle (dART) and simulated scenario]

The autonomous research testbed (ART) is a software framework and 1/6th-scale physical vehicle that supports the study of simulation in robotics by letting the same autonomy algorithms control both the physical ART vehicle (top left) and the digital ART vehicle (dART, top right). The vehicle's dynamics and sensors are modeled in simulation, using Project Chrono, to provide synthetic data streams that mimic those produced by the real vehicle. To further facilitate comparison between simulation and reality, real scenarios (e.g., lower left) can be replicated in simulation (lower right) to reveal where simulation can be improved to better predict reality. Images provided by Asher Elmquist.
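The key idea above — one autonomy stack driving either the physical vehicle or its digital twin through an identical interface — can be sketched in a few lines. This is a minimal, hypothetical illustration; the class and method names are invented for this sketch and are not the actual ART or Project Chrono APIs.

```python
# Sketch of the "same autonomy stack, two backends" idea behind ART/dART.
# All names here are hypothetical stand-ins, not the real ART interfaces.

class PhysicalVehicle:
    """Stand-in for the real 1/6th-scale ART vehicle's driver interface."""
    def read_sensors(self):
        return {"speed": 1.2, "heading": 0.05}  # placeholder telemetry

    def apply(self, throttle, steering):
        self.last_cmd = (throttle, steering)


class SimulatedVehicle:
    """Stand-in for the dART digital twin: same interface, synthetic data."""
    def read_sensors(self):
        return {"speed": 1.2, "heading": 0.05}  # synthetic stream mimicking reality

    def apply(self, throttle, steering):
        self.last_cmd = (throttle, steering)


def autonomy_step(vehicle, target_speed=1.5):
    """One step of a toy proportional controller.

    Because both backends expose the same interface, this function is
    unchanged whether it drives the physical vehicle or the simulator,
    which is what makes direct sim-vs-real comparisons possible.
    """
    obs = vehicle.read_sensors()
    throttle = 0.5 * (target_speed - obs["speed"])  # P-control on speed
    steering = -0.8 * obs["heading"]                # steer back toward heading 0
    vehicle.apply(throttle, steering)
    return throttle, steering
```

With identical sensor readings, `autonomy_step(PhysicalVehicle())` and `autonomy_step(SimulatedVehicle())` issue identical commands; any divergence on real hardware then points at modeling error in the simulator rather than at the controller.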