
[ICRA2021] Automatic Hyper-Parameter Tuning for Black-box LiDAR Odometry

Automatic Hyper-Parameter Tuning for Black-box LiDAR Odometry
Kenji Koide, Masashi Yokozuka, Shuji Oishi, and Atsuhiko Banno
National Institute of Advanced Industrial Science and Technology (AIST), Japan

IEEE International Conference on Robotics and Automation (ICRA2021)

koide3

June 28, 2024

Transcript

  1. Automatic Hyper-Parameter Tuning for Black-box LiDAR Odometry
     Kenji Koide, Masashi Yokozuka, Shuji Oishi, and Atsuhiko Banno
     National Institute of Advanced Industrial Science and Technology (AIST), Japan
  2. Odometry Estimation
     LiDAR odometry and visual odometry. Examples: Engel et al., Direct Sparse Odometry (visual odometry); Pan et al., MULLS: Versatile LiDAR SLAM via Multi-metric Linear Least Square (LiDAR odometry).
  3. Tuning is important
     Odometry estimation/SLAM frameworks involve many hyper-parameters (e.g., downsample resolution, map resolution, keyframe interval, ...).
     Many parameters need to be tuned depending on the sensor and environment (e.g., indoor/outdoor, mechanical rotating/solid-state LiDAR).
     Estimation quality largely depends on the choice of parameters. [Figure: estimation result without parameter tuning]
  4. Tuning is difficult
     The Google Cartographer tuning guide (https://google-cartographer-ros.readthedocs.io/en/latest/tuning.html) says: "Tuning Cartographer is unfortunately really difficult. The system has many parameters many of which affect each other."
     MULLS, a state-of-the-art LiDAR SLAM framework (https://github.com/YuePanEdward/MULLS), involves over 80 parameters. It is well documented, but you still need to understand in detail how it works. Some other frameworks don't even provide documentation...
     Odometry estimation methods are surprisingly complex, and parameter tuning is difficult.
  5. Typical tuning process
     Pick a parameter set, run the algorithm, wait for a while (KITTI: ~40 mins), see the result, and check whether it is satisfying; if not, repeat with another parameter set, e.g., "params": { "voxel_resolution": 0.222, "keyframe_interval": 0.123, ... } -> "params": { "voxel_resolution": 0.444, "keyframe_interval": 0.456, ... }.
     This is a very time-consuming and tedious process: • a large amount of human effort • a sub-optimal parameter set.
  6. Automating the tuning process
     • One may think it can easily be automated.
     • Grid/random search are commonly used.
     • These naïve methods work well if the number of parameters is small (e.g., 1 or 2).
     • SLAM frameworks involve many parameters (e.g., N > 80); with only three candidate values per parameter, a grid over 80 parameters already has 3^80 ≈ 10^38 combinations.
     • Grid/random search cannot find a good parameter set in a reasonable time (a minimal random-search sketch follows below).
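
As a rough, hypothetical illustration (not from the slides), a naive random-search loop over such a space could look like the Python sketch below; run_odometry_and_score is an assumed stand-in for running the odometry pipeline on a dataset and returning an error metric such as the average RTE.

```python
import random

# Hypothetical search space; a real SLAM framework may expose 80+ parameters.
SEARCH_SPACE = {
    "voxel_resolution": (0.05, 2.0),   # downsampling resolution [m]
    "keyframe_interval": (0.1, 5.0),   # keyframe spacing
    # ... many more parameters ...
}

def run_odometry_and_score(params):
    """Assumed stand-in: run the black-box odometry pipeline with `params`
    and return an error to minimize (e.g., average RTE). One call can take
    tens of minutes on a dataset like KITTI."""
    raise NotImplementedError

def random_search(n_trials):
    best_params, best_error = None, float("inf")
    for _ in range(n_trials):
        params = {name: random.uniform(lo, hi) for name, (lo, hi) in SEARCH_SPACE.items()}
        error = run_odometry_and_score(params)
        if error < best_error:
            best_params, best_error = params, error
    return best_params, best_error
```

Since each trial costs a full odometry run and the samples carry no information from previous trials, this kind of search becomes impractical as the number of parameters grows.
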
  7. Automatic Hyper-Parameter Tuning with SMBO
     • Our proposal is to introduce a technique called Sequential Model-based Optimization (SMBO) to the parameter tuning problem.
     • SMBO is a technique that is becoming popular for parameter tuning of neural networks.
     • SMBO smartly samples the parameter set to be evaluated, efficiently exploring the parameter space with fewer evaluations.
     • It is a black-box optimization technique; knowledge of the objective function is not required.
     • From the evaluation results, we found that SMBO tends to suffer from overfitting.
     • To overcome this problem, we propose simple data augmentation techniques for LiDAR data.
  8. Sequential Model-Based Optimization
     Loop: sample a parameter set, run the algorithm, evaluate the score, and pick the next sample by maximizing an acquisition function that balances optimization and exploration.
     Acquisition function: Expected Improvement, EI_{y^*}(x) = \int_{-\infty}^{y^*} (y^* - y) \, p(y|x) \, dy, where y^* is a quantile of the observed scores (see [Bergstra, 2011] for details).
     Tree-structured Parzen Estimator (TPE): models p(x|y) with two densities, one for trials better than y^* and one for the rest, and proposes the candidate x that maximizes EI.
     We use the TPE implementation in optuna [Akiba, 2019] (a minimal usage sketch follows below).
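
A minimal sketch of this loop using optuna's TPE sampler is shown below; the parameter names and the run_odometry_and_score function are illustrative assumptions (the same hypothetical stand-in as above), not the paper's actual tuning code.

```python
import optuna

def objective(trial):
    # TPE proposes the next parameter set by maximizing expected improvement.
    params = {
        "voxel_resolution": trial.suggest_float("voxel_resolution", 0.05, 2.0),
        "keyframe_interval": trial.suggest_float("keyframe_interval", 0.1, 5.0),
    }
    # Run the black-box odometry pipeline and return the error to minimize.
    return run_odometry_and_score(params)

study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=42))
study.optimize(objective, n_trials=100)
print(study.best_params, study.best_value)
```

Each finished trial updates the TPE densities, so later proposals concentrate on promising regions of the parameter space instead of sampling uniformly.
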
  9. Problem: Overfitting
     We simply applied SMBO to LeGO-LOAM on the KITTI dataset (training: Seq 00-05, with 01/02 excluded; test: Seq 06-10; number of tuned parameters: 6).
     [Table: average RTEs on the train and test sets; the train set improved while the test set deteriorated.] An overly aggressive parameter set was selected.
     Overfitting can be a serious problem because: the amount of training data tends to be small for the odometry estimation problem, and estimation corruption results in a catastrophic error.
  10. Data Augmentation
     Three simple data augmentation techniques are introduced: random range noise, random transformation noise, and reversing the scan order (see the sketch after this item).
     Four augmented sequences are generated for each training sequence. By injecting perturbations, we let SMBO find a robust parameter set.
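
A minimal sketch of what these perturbations could look like for a scan stored as an N x 3 numpy array; the noise magnitudes, the yaw-only rotation, and the function names are illustrative assumptions, not the paper's exact augmentation settings.

```python
import numpy as np

def add_range_noise(points, sigma=0.02):
    """Random range noise: jitter each point's distance along its viewing ray."""
    ranges = np.linalg.norm(points, axis=1, keepdims=True)
    directions = points / np.maximum(ranges, 1e-6)
    return directions * (ranges + np.random.normal(0.0, sigma, size=ranges.shape))

def add_transformation_noise(points, trans_sigma=0.1, yaw_sigma_deg=1.0):
    """Random transformation noise: apply a small random rigid transform to the whole scan."""
    yaw = np.deg2rad(np.random.normal(0.0, yaw_sigma_deg))
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    t = np.random.normal(0.0, trans_sigma, size=3)
    return points @ R.T + t

def reverse_order(scans):
    """Reversing order: replay the sequence of scans backwards."""
    return list(reversed(scans))
```

Evaluating each candidate parameter set on several such perturbed copies of every training sequence nudges SMBO toward parameter sets that remain stable under sensor noise and pose jitter, rather than ones tuned to quirks of a single recording.
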
  11. Evaluation on KITTI
     We fine-tuned the parameters of three SLAM frameworks with different architectures (LeGO-LOAM, hdl_graph_slam, SuMa).
     The combination of SMBO and data augmentation successfully improved the accuracy of all the methods on both the training and test sets.
  12. Real Environment (Velodyne HDL32e)
     We optimized the parameters of LeGO-LOAM in real environments. [Results: Improved / Improved / Deteriorated]
  13. Real Environment (Velodyne HDL32e)
     We optimized the parameters of LeGO-LOAM in real environments. [Results: Improved / Improved / Improved]
  14. Conclusion
     • SMBO was applied to hyper-parameter tuning for LiDAR odometry estimation.
     • Simple data augmentation techniques were proposed to prevent overfitting.
     • The combination of SMBO and data augmentation successfully improved the accuracy of several LiDAR odometry frameworks.
     Code available: https://github.com/SMRT-AIST/automatic_tuning