We present a class of information-metric optimization methods for Bayesian inverse problems. Two information metrics are studied: the Fisher-Rao metric and the Wasserstein-2 metric. Focusing on the Wasserstein-2 metric, we introduce accelerated gradient flows and Newton flows, and we formulate practical discrete-time particle algorithms for them. To address the curse of dimensionality, we further introduce a projected Wasserstein gradient method. Numerical experiments, including PDE-constrained Bayesian inference and parameter estimation in COVID-19 modeling, demonstrate the effectiveness of the proposed methods. This is based on joint work with Yifei Wang and Peng Chen.
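To make the particle viewpoint concrete, the sketch below shows a generic Langevin-type particle discretization of the Wasserstein-2 gradient flow of the KL divergence toward a toy Gaussian posterior. It is only a minimal illustration under assumed choices (the 2D Gaussian target, step size, and particle count are placeholders), not the accelerated, Newton, or projected variants described above.

```python
import numpy as np

# Minimal sketch: Langevin-type particle discretization of the Wasserstein-2
# gradient flow of KL(rho || pi), where pi is a posterior with density
# proportional to exp(-V(x)). The toy Gaussian target, step size, and
# particle count are illustrative assumptions, not choices from the abstract.

def grad_V(particles, precision):
    """Gradient of V(x) = 0.5 x^T A x for a Gaussian toy posterior (A symmetric)."""
    return particles @ precision.T

def particle_step(particles, precision, step, rng):
    """One discrete-time update: drift along -grad V plus diffusion noise."""
    noise = rng.standard_normal(particles.shape)
    return particles - step * grad_V(particles, precision) + np.sqrt(2.0 * step) * noise

def run_flow(n_particles=500, n_steps=2000, step=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    precision = np.array([[2.0, 0.5], [0.5, 1.0]])   # toy posterior precision matrix
    particles = 3.0 * rng.standard_normal((n_particles, 2))  # crude prior samples
    for _ in range(n_steps):
        particles = particle_step(particles, precision, step, rng)
    return particles

if __name__ == "__main__":
    samples = run_flow()
    print("empirical mean:", samples.mean(axis=0))
    print("empirical covariance:\n", np.cov(samples.T))
```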