EKF-based self-attitude estimation with DNN learning landscape information

Ryota Ozaki, Yoji Kuroda

Research output: Contribution to journal › Article › peer-review


This paper presents an EKF-based self-attitude estimation method with a DNN (deep neural network) that learns landscape information. The method integrates gyroscopic angular velocity and DNN inference in the EKF. The DNN predicts a gravity vector in the camera frame: its input is a camera image, and its outputs are a mean vector and a covariance matrix of the gravity. It is trained and validated with a dataset of images and corresponding gravity vectors. The dataset is collected in a flight simulator, because various gravity vectors can be obtained there easily, although the method is not limited to UAVs. Using a simulator removes the limitation on the amount of data that can be collected with ground truth. The validation shows that the network can predict the gravity vector from only a single-shot image, and that the covariance matrix expresses the uncertainty of the inference. This covariance matrix is used to weight the inference in the EKF. Flight data of a drone is also recorded in the simulator, and the EKF-based method is tested on it, showing that the method suppresses accumulated error by integrating the network outputs.
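To make the fusion scheme concrete, here is a minimal sketch of an EKF that predicts attitude with gyro angular velocity and corrects it with a gravity-direction measurement whose covariance is supplied externally (in the paper, by the DNN). This is an illustrative reconstruction, not the authors' code: it assumes a simple roll/pitch Euler-angle state and a unit gravity vector in the body/camera frame, whereas the paper's actual state representation and frames may differ.

```python
import numpy as np

def f_predict(x, omega, dt):
    """Propagate roll/pitch with gyro angular velocity (Euler-angle kinematics)."""
    r, p = x
    wx, wy, wz = omega
    r_dot = wx + np.sin(r) * np.tan(p) * wy + np.cos(r) * np.tan(p) * wz
    p_dot = np.cos(r) * wy - np.sin(r) * wz
    return x + dt * np.array([r_dot, p_dot])

def h_gravity(x):
    """Predicted unit gravity direction in the body/camera frame (assumed convention)."""
    r, p = x
    return np.array([-np.sin(p),
                     np.sin(r) * np.cos(p),
                     np.cos(r) * np.cos(p)])

def H_jacobian(x):
    """Jacobian of h_gravity with respect to [roll, pitch]."""
    r, p = x
    return np.array([
        [0.0,                    -np.cos(p)],
        [np.cos(r) * np.cos(p),  -np.sin(r) * np.sin(p)],
        [-np.sin(r) * np.cos(p), -np.cos(r) * np.sin(p)],
    ])

def ekf_step(x, P, omega, dt, Q, z_mean, z_cov):
    """One EKF cycle: gyro prediction, then gravity-vector update.

    z_mean, z_cov play the role of the DNN's predicted gravity mean
    and covariance; z_cov directly sets the measurement noise R,
    so uncertain inferences are down-weighted automatically.
    """
    # Predict (state Jacobian approximated as identity for a short dt)
    x = f_predict(x, omega, dt)
    P = P + Q
    # Update with the inferred gravity direction
    H = H_jacobian(x)
    S = H @ P @ H.T + z_cov
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z_mean - h_gravity(x))
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Because the DNN's covariance enters as the measurement noise, frames where the landscape gives little attitude cue (large covariance) barely correct the gyro integration, while confident inferences pull the estimate strongly and suppress drift.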

Original language: English
Article number: 9
Journal: ROBOMECH Journal
Issue number: 1
Publication status: Published - Dec 2021


  • Attitude estimation
  • Deep learning
  • Extended Kalman filter
  • Mobile robotics

