From the AHRS sensor fusion orientation filter to a 3D graphical rotating cube

In a comment from Ciskje on my sensor fusion implementation, I've been asked to explain how the Processing graphical demo uses rotations to visualize the orientation information coming from the sensor fusion algorithm.

The question was, basically: why are the yaw, pitch and roll angles used in a seemingly "strange" order in the following lines of code from my Processing graphical cube example?

rotateZ(-Euler[2]); // phi - roll
rotateX(-Euler[1]); // theta - pitch
rotateY(-Euler[0]); // psi - yaw

The following explanation assumes some basic knowledge of rotation matrices and 3D coordinate systems. For a good introduction to these topics, I strongly suggest reading Chapter 2 of "Robot Modeling and Control" by Mark W. Spong.

Demonstration

The first thing to understand is that three coordinate systems are involved here:

  • OsXsYsZs attached to the sensor array
  • OwXwYwZw attached to the real world, with Xw pointing to the Earth's North and Zw pointing to the sky
  • OmXmYmZm the coordinate system of the graphics program; it is usually attached to the top-left corner of the monitor, with Xm pointing right, Ym pointing down and Zm pointing forward, toward the monitor.

Note that when Xw and Zm are aligned (i.e. the monitor points toward the Earth's North at your location) there is a simple relationship between OwXwYwZw and OmXmYmZm: a point pw defined in the coordinate system W with coordinates pw = [pw_x, pw_y, pw_z] can be expressed in the coordinate system M as pm = [-pw_y, -pw_z, pw_x].
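This mapping is easy to sanity-check in code. A minimal sketch (in Python rather than Processing; the function name w_to_m is my own):

```python
def w_to_m(pw):
    """Map a point from the world frame W (Xw north, Zw up) to the
    monitor frame M, per pm = [-pw_y, -pw_z, pw_x]."""
    pw_x, pw_y, pw_z = pw
    return [-pw_y, -pw_z, pw_x]

# A point one unit north of the origin (along Xw) lies along Zm:
print(w_to_m([1, 0, 0]))  # -> [0, 0, 1]
# A point one unit up (along Zw) lies along -Ym, i.e. up on the screen:
print(w_to_m([0, 0, 1]))  # -> [0, -1, 0]
```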

The sensor fusion algorithm running on the Arduino computes a quaternion representing the orientation of OsXsYsZs with respect to OwXwYwZw, and from the quaternionToEuler function in the Processing code we get the Euler angles expressed in the aerospace sequence: yaw (ψ – psi), pitch (θ – theta) and roll (φ – phi).
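For reference, a conversion along the lines of quaternionToEuler can be sketched as below; this is the textbook aerospace (Z-Y-X) formula, assuming a scalar-first unit quaternion, and the actual function in the Processing sketch may use a different element order or sign convention:

```python
from math import atan2, asin

def quaternion_to_euler(q0, q1, q2, q3):
    """Aerospace (Z-Y-X) Euler angles from a unit quaternion
    q = q0 + q1*i + q2*j + q3*k (scalar first -- an assumption here).
    Returns (psi, theta, phi) = (yaw, pitch, roll) in radians."""
    psi = atan2(2 * (q0 * q3 + q1 * q2), 1 - 2 * (q2 ** 2 + q3 ** 2))  # yaw
    theta = asin(2 * (q0 * q2 - q3 * q1))                              # pitch
    phi = atan2(2 * (q0 * q1 + q2 * q3), 1 - 2 * (q1 ** 2 + q2 ** 2))  # roll
    return psi, theta, phi
```

The identity quaternion (1, 0, 0, 0) yields (0, 0, 0), and a pure 90° yaw rotation yields ψ = π/2 with zero pitch and roll.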

We can use them in the following way: supposing that OsXsYsZs and OwXwYwZw are initially aligned, we can rotate OsXsYsZs so that it assumes the orientation described by the quaternion (or by the yaw, pitch and roll angles) as follows:

  1. rotate by ψ around Zs
  2. rotate by θ around the new Ys
  3. rotate by φ around the new Xs

After doing the above, OsXsYsZs will be oriented with respect to OwXwYwZw according to the orientation described by the quaternion output of the sensor fusion algorithm.
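In matrix terms, each step of this body-axis (intrinsic) sequence is appended on the right, so the composite rotation is R = Rz(ψ)·Ry(θ)·Rx(φ). A minimal numeric sketch (in Python; the helper names are mine):

```python
from math import cos, sin

def Rx(a):
    return [[1, 0, 0], [0, cos(a), -sin(a)], [0, sin(a), cos(a)]]

def Ry(a):
    return [[cos(a), 0, sin(a)], [0, 1, 0], [-sin(a), 0, cos(a)]]

def Rz(a):
    return [[cos(a), -sin(a), 0], [sin(a), cos(a), 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def aerospace(psi, theta, phi):
    """Intrinsic Z-Y-X sequence: yaw about Zs, then pitch about the new Ys,
    then roll about the new Xs -> R = Rz(psi) * Ry(theta) * Rx(phi)."""
    return matmul(matmul(Rz(psi), Ry(theta)), Rx(phi))
```

With θ = φ = 0, for instance, the composite reduces to a plain rotation about Zs by ψ.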

The same result can also be produced by rotations about the fixed axes of the W coordinate system, applied in the reverse order:

  1. rotate by φ around Xw
  2. rotate by θ around Yw
  3. rotate by ψ around Zw
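The equivalence of the two orderings can be checked numerically: for fixed-axis rotations the first rotation applied ends up as the rightmost matrix factor, so this sequence builds exactly the same composite Rz(ψ)·Ry(θ)·Rx(φ). A sketch (in Python; the helper names are mine):

```python
from math import cos, sin

def Rx(a):
    return [[1, 0, 0], [0, cos(a), -sin(a)], [0, sin(a), cos(a)]]

def Ry(a):
    return [[cos(a), 0, sin(a)], [0, 1, 0], [-sin(a), 0, cos(a)]]

def Rz(a):
    return [[cos(a), -sin(a), 0], [sin(a), cos(a), 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def compose_fixed_axes(steps):
    """Fixed-axis (extrinsic) composition: each new rotation acts on the
    already-rotated object, so it pre-multiplies the composite so far."""
    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for step in steps:
        R = matmul(step, R)
    return R

psi, theta, phi = 0.3, -0.5, 1.1

# phi about Xw first, then theta about Yw, then psi about Zw ...
extrinsic = compose_fixed_axes([Rx(phi), Ry(theta), Rz(psi)])
# ... equals the intrinsic aerospace sequence Rz(psi) * Ry(theta) * Rx(phi)
intrinsic = matmul(matmul(Rz(psi), Ry(theta)), Rx(phi))
```

Both paths produce the same matrix for any choice of the three angles.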

However, we are interested in rotating the graphical cube, which is defined in the OmXmYmZm coordinate system, so a slightly different approach is needed. From the relationship between OwXwYwZw and OmXmYmZm described above, we can see that there is a corresponding relationship between rotations made in the W and M coordinate systems:

  • a rotation φ around Xw corresponds to a rotation of -φ around Zm
  • a rotation θ around Yw corresponds to a rotation of -θ around Xm
  • a rotation ψ around Zw corresponds to a rotation of -ψ around Ym
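Correspondences of this kind can be verified numerically; for instance, for the last one: rotating a point by ψ about Zw and then mapping it into M gives the same result as mapping the point first and then rotating it by -ψ about Ym (a Python sketch under the pm = [-pw_y, -pw_z, pw_x] mapping above, with right-handed rotation matrices; helper names are mine):

```python
from math import cos, sin

def w_to_m(pw):
    # pm = [-pw_y, -pw_z, pw_x], as derived above
    return [-pw[1], -pw[2], pw[0]]

def rot_z(p, a):
    """Right-handed rotation of point p by angle a about the Z axis."""
    x, y, z = p
    return [x * cos(a) - y * sin(a), x * sin(a) + y * cos(a), z]

def rot_y(p, a):
    """Right-handed rotation of point p by angle a about the Y axis."""
    x, y, z = p
    return [x * cos(a) + z * sin(a), y, -x * sin(a) + z * cos(a)]

psi = 0.7
pw = [1.0, -2.0, 0.5]
path1 = w_to_m(rot_z(pw, psi))    # rotate psi about Zw, then map to M
path2 = rot_y(w_to_m(pw), -psi)   # map to M, then rotate -psi about Ym
# path1 and path2 agree element by element
```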

Given the above, and remembering that in Processing rotations are performed with respect to the monitor frame, we can conclude that, in order to align a graphical cube defined in OmXmYmZm with the orientation of the sensor coordinate system OsXsYsZs described by the quaternion (and hence the Euler angles) coming from the sensor fusion algorithm, we have to apply the following rotations:

  1. a rotation of -φ around Zm
  2. a rotation of -θ around Xm
  3. a rotation of -ψ around Ym

which is exactly the sequence of rotateZ, rotateX and rotateY calls shown at the beginning, as we wanted to demonstrate.
