Abstract. This paper proposes a novel data-driven approach for inertial navigation, which learns to estimate trajectories of natural human motions using only the inertial measurement unit (IMU) in every smartphone. The key observation is that human motions are repetitive and
consist of a few major modes (e.g., standing, walking, or turning). Our
algorithm regresses a velocity vector from the history of linear accelerations and angular velocities, then corrects low-frequency bias in the
linear accelerations, which are integrated twice to estimate positions. We
have acquired training data with ground truth motion trajectories across
multiple human subjects and multiple phone placements (e.g., in a bag
or a hand). Qualitative and quantitative evaluations demonstrate that our simple algorithm outperforms existing heuristic-based approaches and is, to our surprise, even comparable to full visual-inertial navigation. To our knowledge, this paper is the first to introduce supervised training for inertial navigation, potentially opening up a new line of research on data-driven inertial navigation. We will publicly share our code and data to facilitate further research.
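To make the regress-then-double-integrate pipeline concrete, here is a minimal sketch. It assumes a pre-trained, scikit-learn-style regressor `velocity_model` (a hypothetical name) that maps a window of raw IMU readings to a velocity vector; the window length and the proportional bias-correction scheme are illustrative simplifications under our assumptions, not the paper's exact formulation.

```python
import numpy as np

def estimate_trajectory(accel, gyro, dt, velocity_model, window=200):
    """Illustrative sketch: regress velocity from the IMU history,
    correct low-frequency acceleration bias, then integrate twice
    to obtain positions. accel and gyro are (n, 3) arrays sampled
    at interval dt seconds."""
    n = len(accel)
    velocity = np.zeros((n, 3))
    position = np.zeros((n, 3))
    bias = np.zeros(3)  # slowly varying acceleration bias estimate

    for i in range(1, n):
        # First integration: bias-corrected acceleration -> velocity.
        velocity[i] = velocity[i - 1] + (accel[i] - bias) * dt
        # Second integration: velocity -> position.
        position[i] = position[i - 1] + velocity[i] * dt

        if i >= window and i % window == 0:
            # Regress velocity from the recent IMU history (hypothetical
            # feature layout: flattened accelerations and angular rates).
            features = np.hstack([accel[i - window:i].ravel(),
                                  gyro[i - window:i].ravel()])
            v_pred = velocity_model.predict(features[None, :])[0]
            # Nudge the bias so the integrated velocity tracks the
            # regressed one (a simple proportional correction).
            bias += (velocity[i] - v_pred) / (window * dt)
            velocity[i] = v_pred
    return position
```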