HUGSIM: A Real-Time, Photo-Realistic and Closed-Loop Simulator for Autonomous Driving
Hongyu Zhou, Longzhong Lin, Jiabao Wang, Yichong Lu, Dongfeng Bai, Bingbing Liu, Yue Wang, Andreas Geiger, Yiyi Liao
2024-12-03
Summary
This paper introduces HUGSIM, a new simulator designed to evaluate autonomous driving algorithms in a realistic and interactive environment.
What's the problem?
As autonomous driving technology advances, it's important to test how well these systems perform in real-world scenarios. However, traditional methods often evaluate individual components separately rather than assessing the entire system's performance in a cohesive way. This makes it difficult to understand how well these algorithms will work together in practice.
What's the solution?
HUGSIM addresses this problem by creating a closed-loop simulator that uses real-time, photo-realistic graphics to evaluate autonomous driving algorithms. It lifts 2D images into 3D environments, allowing for realistic navigation and interaction. The simulator includes features like dynamic object rendering and the ability to simulate various driving scenarios, making it possible to test how well different algorithms perform under different conditions. HUGSIM also provides a benchmark of over 400 scenarios built on more than 70 sequences from KITTI-360, Waymo, nuScenes, and PandaSet, ensuring comprehensive evaluation.
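The "closed loop" idea above can be sketched in a few lines: at each tick the simulator produces an observation, the driving policy returns a control command, and the simulator advances the ego state accordingly. The sketch below is a minimal illustration with an assumed kinematic model and a made-up policy interface, not HUGSIM's actual API (where the observation would be a photo-realistic rendering rather than the raw state).

```python
import math
from dataclasses import dataclass


@dataclass
class EgoState:
    x: float = 0.0    # longitudinal position (m)
    y: float = 0.0    # lateral position (m)
    yaw: float = 0.0  # heading (rad)
    v: float = 0.0    # speed (m/s)


def step(state: EgoState, accel: float, steer: float, dt: float = 0.1) -> EgoState:
    """Advance the ego vehicle one tick with a simple kinematic bicycle model
    (2.5 m wheelbase assumed for illustration)."""
    v = max(0.0, state.v + accel * dt)
    yaw = state.yaw + v * math.tan(steer) / 2.5 * dt
    return EgoState(
        x=state.x + v * math.cos(yaw) * dt,
        y=state.y + v * math.sin(yaw) * dt,
        yaw=yaw,
        v=v,
    )


def run_closed_loop(policy, n_steps: int = 50) -> EgoState:
    """Closed loop: observation -> policy -> control command -> new state.
    In HUGSIM the observation fed to the policy would be a rendered image;
    here the raw state stands in for it."""
    state = EgoState()
    for _ in range(n_steps):
        accel, steer = policy(state)
        state = step(state, accel, steer)
    return state


# A trivial policy: accelerate until ~5 m/s, then coast straight ahead.
final = run_closed_loop(lambda s: (1.0 if s.v < 5.0 else 0.0, 0.0))
```

The key contrast with open-loop evaluation is that the policy's own commands determine the next observation, so small errors compound over time, which is exactly what HUGSIM is designed to expose.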
Why it matters?
This research is significant because it offers a more effective way to test and improve autonomous driving systems. By using HUGSIM, developers can better understand how their algorithms will behave in real-world situations, leading to safer and more reliable self-driving cars. This could ultimately enhance the development of autonomous vehicles and their integration into everyday life.
Abstract
In the past few decades, autonomous driving algorithms have made significant progress in perception, planning, and control. However, evaluating individual components does not fully reflect the performance of entire systems, highlighting the need for more holistic assessment methods. This motivates the development of HUGSIM, a closed-loop, photo-realistic, and real-time simulator for evaluating autonomous driving algorithms. We achieve this by lifting captured 2D RGB images into the 3D space via 3D Gaussian Splatting, improving the rendering quality for closed-loop scenarios, and building the closed-loop environment. In terms of rendering, we tackle challenges of novel view synthesis in closed-loop scenarios, including viewpoint extrapolation and 360-degree vehicle rendering. Beyond novel view synthesis, HUGSIM further enables the full closed-loop simulation, dynamically updating the ego and actor states and observations based on control commands. Moreover, HUGSIM offers a comprehensive benchmark across more than 70 sequences from KITTI-360, Waymo, nuScenes, and PandaSet, along with over 400 varying scenarios, providing a fair and realistic evaluation platform for existing autonomous driving algorithms. HUGSIM not only serves as an intuitive evaluation benchmark but also unlocks the potential for fine-tuning autonomous driving algorithms in a photorealistic closed-loop setting.
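For context on the rendering backbone: 3D Gaussian Splatting (the general technique the abstract refers to, not a HUGSIM-specific formulation) represents the scene as a set of 3D Gaussians and renders a pixel color by alpha-blending the depth-sorted Gaussians that overlap it:

\[
C \;=\; \sum_{i \in \mathcal{N}} c_i \,\alpha_i \prod_{j=1}^{i-1} \left(1 - \alpha_j\right),
\]

where \(\mathcal{N}\) is the set of Gaussians covering the pixel ordered front to back, \(c_i\) is the color of Gaussian \(i\), and \(\alpha_i\) its opacity after 2D projection. Because this rasterization is fast, it is what makes real-time, photo-realistic closed-loop rendering feasible.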