Design and Implementation of an Exercise Monitoring System
Using 3-Dimensional Skeleton Tracking
Group 09
Buzukira Chabikuli
Joshua Fast
Kurtis Gibson
Mark Kolynchuk
Kristian Torres
ECE 4600 Group Design Project
Undergraduate Electrical and Computer Engineering
Advisor: Dr. Jun Cai
University of Manitoba
Winnipeg, MB Canada
March 18, 2017
Abstract
This report presents the design, implementation, and verification of a wireless exercise
monitoring system which utilizes inertial measurement units (IMUs), Bluetooth
communication, and an iOS application. The system's purpose is to track and display
exercise completion percentage, and to provide a visualization of exercise motion
through a manipulatable 3D model on iOS.

Our design presents a hardware setup that is tailored to be wearable with minimal
impact on exercise motion. Four inertial measurement sensors are placed across a user's
upper body to collect motion data, which is then transmitted over Bluetooth 4.0 to an
iOS device. An algorithm is subsequently used to derive usable joint angles from
quaternion data. The exercise motions are graphically displayed in an iOS application as
a 3D human model. Exercise completion levels are successfully displayed as a percentage,
within reasonable error, for five different upper body exercises in the iOS application.
Near real-time wireless motion tracking is also achieved through efficient wireless data
transmission combined with precise graphical rendering.
1 Acknowledgments
We would like to thank our advisor, Dr. Jun Cai, for the opportunity to partake in this
project, as well as for all the help and suggestions he provided throughout the design process.
A huge thank you goes to both Marcia Friesen and Robert McCleod for providing our
group with the Apple hardware (the iPhone 6 and iPad Mini 2), a licenced Apple developer
account, and access to a Mac Pro machine. Without them, we would not have been able to
complete our project.
We would like to thank Dr. Ahmad Byagowi for his helpful suggestions regarding
quaternions.

We would also like to thank the staff at North Centennial Recreation and Leisure
Facility for providing the group a board room for meetings and a pool CPR mannequin to act
as a dummy user during graphical rendering testing.
Lastly, we would like to extend our gratitude to the University of Manitoba Faculty of
Electrical and Computer Engineering for their continued support over the years.
2 Contributions
Table 1: Tasks, Milestones, and Division of Labour

| Milestone | Module | Task | Member(s) Responsible |
|---|---|---|---|
| Design & Simulation | Framework | Simulation of sensor data processing algorithms. | Josh |
| | Hardware | Determine hardware components & architecture. | Mark & Christian |
| Prototyping & Testing | Hardware | Sensor functionality (individual & multiplexed). | Mark |
| | Hardware | Bluetooth module testing. | Christian, Kurtis & Kristian |
| | Hardware | Data encapsulation and transmission. | Mark, Christian, Kurtis & Kristian |
| | Framework | Creating the framework layout. | Kurtis |
| | Framework | Implementation of algorithms and exercise monitoring logic within framework. | Kurtis & Josh |
| | Application | Bluetooth module. | Christian & Kurtis |
| | Application | iOS application layout. | Kurtis & Kristian |
| | Application | Data handling. | Kurtis & Kristian |
| | Application | 3-D rendering within the application. | Kristian |
| Integration & Validation | Hardware | Hardware optimization and fabrication. | Mark, Josh & Kristian |
| | Application & Hardware | Integration and validation of application and hardware. | Christian, Josh, Mark, Kristian & Kurtis |
| | Application & Framework | Integration of application and framework. | Kristian, Kurtis & Josh |
| Final Report | Writing | Drafting, editing, revising, and documenting the report. | Christian, Josh, Mark, Kristian & Kurtis |
Contents
1 Acknowledgments ii
2 Contributions iii
3 Introduction 1
3.1 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
3.2 System Performance Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . 2
4 Hardware 3
4.1 BNO055 Inertial Measurement Sensor . . . . . . . . . . . . . . . . . . . . . 3
4.2 Sensor Output and Drift . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
4.3 Sensor Calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
4.4 Sensor Placement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
4.5 TCA9548A I2C Multiplexer . . . . . . . . . . . . . . . . . . . . . . . . . 7
4.6 Bluno Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.7 Power and Battery Life . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
5 Mathematics of Orientations 11
5.1 Euler Angles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5.1.1 Gimbal Lock . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5.2 Quaternions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.2.1 Body Joint Angles from Quaternions . . . . . . . . . . . . . . . . . . 14
5.3 Accuracy of IMU Orientation Data . . . . . . . . . . . . . . . . . . . . . . . 15
5.4 Exercise Completion Level Calculations . . . . . . . . . . . . . . . . . . . . 17
6 Communication 17
6.1 Integrated Bluetooth . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
6.2 Data Encapsulation and Bluno Transmission . . . . . . . . . . . . . . . . . 18
6.3 iOS Bluetooth Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
7 Software Framework 20
7.1 Overview of ExerciseMotionTracker framework . . . . . . . . . . . . . . . . 20
8 Three-Dimensional Modelling 25
8.1 3D Modelling Application: Blender . . . . . . . . . . . . . . . . . . . . . . . 25
8.2 Initial Skeleton Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
8.3 Armature & Rigging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
8.4 Posing vs. Animation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
8.5 Exporting Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
9 Graphical Rendering Environments 31
9.0.1 OpenGL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
9.1 iOS Scene Kit API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
9.2 Scene Kit within XCode. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
10 iOS Application 34
10.1 STiOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
10.2 Overview of the app layout . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
10.2.1 Integration of the Three Dimensional Model into the Application . . 37
10.2.2 Model Manipulation . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
11 Conclusion 40
A Body Joint Angle Calculation Results 43
B Project Budget Summary 48
C Schematics 49
D Software Repository 52
D.1 Arduino Code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
D.2 Scene Kit Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
D.3 Updating the Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
D.4 GitHub Repository Links . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
List of Tables
1 Tasks, Milestones, and Division of Labour . . . . . . . . . . . . . . . . . . . iii
2 Summary of Project Technical Specification. . . . . . . . . . . . . . . . . . . 2
3 Mapping between the sensors and the BodyJoints. . . . . . . . . . . . . . . 22
4 Project Budget Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5 Budget from External Funds. . . . . . . . . . . . . . . . . . . . . . . . . . . 48
List of Figures
1 Complete Hardware Schematic. . . . . . . . . . . . . . . . . . . . . . . . . . 3
2 Photo of BNO055 inertial measurement unit. . . . . . . . . . . . . . . . . . 3
3 BNO055 available data output and drift over time. . . . . . . . . . . . . . . 5
4 Upper body IMU placement. . . . . . . . . . . . . . . . . . . . . . . . . . . 6
5 BNO055 inertial measurement unit secured to wrist with velcro. . . . . . . 6
6 Photo of TCA9548A I2C Multiplexer. . . . . . . . . . . . . . . . . . . . . . 7
7 Photo of DFRobot Bluno Board. . . . . . . . . . . . . . . . . . . . . . . . . 8
8 Arduino Code Flowchart. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
9 Photo of the Battery. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
10 Euler angle visualization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
11 Rotations through the 2D Imaginary Plane. . . . . . . . . . . . . . . . . . . 12
12 Photographic measurement of elbow joint angle during a bicep curl. . . . . 15
13 Inertial measurement unit derived body joint angle measurements during a
bicep curl. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
14 Bluetooth scanning protocol flow chart. . . . . . . . . . . . . . . . . . . . . 19
15 Bluetooth transmission flow chart. . . . . . . . . . . . . . . . . . . . . . . . 20
16 Exercise Motion Tracker framework class diagram. . . . . . . . . . . . . . . 21
17 Delegate Class Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
18 SensorRcv Flowchart. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
19 A graphical representation of the initial skeletal design of a human arm. . . 26
20 Anatomy of a blender armature node. . . . . . . . . . . . . . . . . . . . . . 27
21 Anatomy of a blender armature node. . . . . . . . . . . . . . . . . . . . . . 27
22 The entire rig used for a three dimensional human model. . . . . . . . . . . 28
23 Anatomy of a Blender armature node. . . . . . . . . . . . . . . . . . . . . . 29
24 The local coordinate system of the BNO055 sensor. . . . . . . . . . . . . . . 31
25 The local coordinate system of an armature node. . . . . . . . . . . . . . . 31
26 The viewing pipeline that highlights the steps of OpenGL during the graph-
ical rendering. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
27 Using Scene Kit’s physics engine to apply a rotational force to a node’s
coordinate axes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
28 The icon that is shown on the iOS springboard. . . . . . . . . . . . . . . . . 34
29 Application Login Page. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
30 Application patient list page. . . . . . . . . . . . . . . . . . . . . . . . . . . 36
31 The main session screens of the STiOS application. . . . . . . . . . . . . . . 36
32 The 3D model as shown in the STiOS application on an iPad Mini 2. . . . . 37
33 Importing the model into the Scene Kit Environment. . . . . . . . . . . . . 38
34 The 3D model as shown in the STiOS application on an iPad Mini 2. . . . . 39
35 Angle between left and right elbows during a Flyes exercise. . . . . . . . . . 43
36 Angle between left and right wrists during a Flyes exercise. . . . . . . . . . 44
37 Left elbow angle during an overhead press exercise. . . . . . . . . . . . . . . 44
38 Right elbow angle during an overhead press exercise. . . . . . . . . . . . . . 45
39 Left elbow angle during a one-arm row exercise. . . . . . . . . . . . . . . . . 45
40 Right elbow angle during a one-arm row exercise. . . . . . . . . . . . . . . . 46
41 Left elbow angle during triceps exercise. . . . . . . . . . . . . . . . . . . . . 46
42 Right elbow angle during triceps exercise. . . . . . . . . . . . . . . . . . . . 47
43 TCA9548 I2C multiplexer schematic. . . . . . . . . . . . . . . . . . . . . . . 49
44 BNO055 inertial measurement unit schematic. . . . . . . . . . . . . . . . . . 50
45 Arduino Bluno schematic. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
46 Bluno Arduino code. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Glossary
3D Three-dimensional
API Application Program Interface
ATT Attribute Protocol
BLE Bluetooth Low Energy
EMF Electromagnetic Field
GATT Generic Attribute Profile
I2C Inter-Integrated Circuit
IMU Inertial Measurement Unit
iOS Apple Mobile Operating System
LiPo Lithium Polymer
OSX Macintosh Operating System X
STiOS Skeleton Tracker for the iOS
UUID Universally Unique Identifier
3 Introduction
3.1 System Overview
Our project is a wireless exercise monitoring system that displays a 3D skeleton model
which follows a user’s movement. The two functions of the device are:
1. To track a user's range of motion in order to display an exercise completion percentage.
2. To provide a visualization of exercise motion through a manipulatable 3D model on
an iOS device.
The project is broken down into three separate modules: the hardware system, the software
framework and the iOS application.
The design of the hardware system consists of four inertial measurement units, each of
which includes a 3D gyroscope, a 3D accelerometer, and a 3D magnetometer. These sensors
are mounted on upper body limb segments and physically connected through a multiplexer
to a Bluno microcontroller. The Bluno is responsible for reading the sensors and
transmitting the collected inertial data via Bluetooth 4.0 to an iOS device.

The software framework splits the received sensor quaternion data into its four w, x, y,
and z components and translates them into usable joint angles. By comparing the calculated
body joint angles to known ranges of motion for specified exercises, the framework
calculates the range of motion for each selected exercise and expresses it as a percentage.
The framework also maps individual sensor orientations to specific limb segments in order
to facilitate 3D modelling.
The iOS application is responsible for receiving sensor data from the hardware via
Bluetooth 4.0. Upon successful reception, the data is passed to the framework, which
interprets and stores it. Once the appropriate sensor quaternion components are handled,
the iOS application is able to graphically render a manipulatable 3D human model by
changing the orientation of the associated body part frame-by-frame in order to simulate
real-time movement. The graphical rendering is handled by an OSX application programming
interface, Scene Kit, a natively integrated environment within Apple's libraries. In
addition to interfacing with the hardware and framework, the application is also designed
to allow a user to navigate the interface, set up a Bluetooth 4.0 connection, and visually
observe a 3D model mimic the motions made by the hardware.
Our wearable technology will ultimately prove useful in the fields of injury
rehabilitation and biomechanics within sport. With the use of inertial sensors, our system
will allow experts to analyze a user's range of motion throughout various exercises more
accurately, beyond the capabilities of visual assessment.
3.2 System Performance Metrics
Project design requirements along with their measured and achieved values are seen in Table
2.
Table 2: Summary of Project Technical Specification.

| Performance Metric | Proposed Value(s)/Range | Achieved Value(s)/Range | Comment |
|---|---|---|---|
| Number of Inertial Sensors | 3-8 | 4 (up to 8) | The current hardware setup makes use of 4 IMUs. Our design, however, is expandable and allows for the addition of 4 more sensors, totaling 8 IMUs. |
| Operating Voltage | 5V | 5V | No comment. |
| Operating Time (Battery Life) | 30 minutes | 15 hours | During continuous exercise, the 2500mAh battery has minimal impact on wearability (90g weight). |
| Minimum Operating Range (Bluetooth) | 3 meters | 20 meters | Large physical obstructions have been noted to reduce signal integrity. |
| Number of Exercises | 5+ | 5 | The application allows for selection of bicep curls, flyes, bench presses, tricep extensions and overhead presses. |
| Exercise Completion Levels | 3+ | 100 | Completion levels are represented as a percentage as the user moves through known ranges of motion. |
4 Hardware
Figure 1: Complete Hardware Schematic.
The schematic in Figure 1 represents the complete hardware setup of our wireless exercise
monitoring system. It comprises four concurrently operating BNO055 IMU sensors
which collect various forms of motion data. These sensors communicate via I2C protocol
and are wired through a TCA9548A I2C multiplexer to the Bluno microcontroller. The
Bluno functions as a master which facilitates the collection, formatting and transmission of
motion data over Bluetooth 4.0 to our iOS application.
4.1 BNO055 Inertial Measurement Sensor
Figure 2: Photo of BNO055 inertial measurement unit. [1]
The need to track exercise motion is an inherent design requirement. As such, an inertial
sensor, specifically the Adafruit Industries 9-DOF Absolute Orientation IMU Fusion
Breakout BNO055, was selected for our design. A close-up of the breakout is shown in
Figure 2 and its technical details are as follows:
• Dimensions: 20mm x 27mm x 4mm / 0.8” x 1.1” x 0.2”
• Header holes begin 4mm from the mounting holes
• Mounting hole dimensions: 20mm x 12mm apart
• Uses I2C address 0x28 (default) or 0x29
• Operating Voltage: 3-5V
• Weight: 3g
The Adafruit breakout was chosen as it provides ease of access to the Bosch BNO055
IMU. Bosch's BNO055 sensor integrates a triaxial 14-bit accelerometer, a triaxial 16-bit
gyroscope, and a triaxial geomagnetic sensor, and makes use of a 32-bit Cortex M0+
running Bosch's Sensortec sensor fusion software [2]. The fusion software gathers raw
accelerometer, gyroscope and magnetometer data and provides usable quaternions or Euler
angles as outputs, which perfectly suit our exercise motion tracking needs.

The breakout is also conveniently pre-configured to make use of the BNO055's inherent
I2C communication capabilities (refer to Appendix C for a detailed schematic). The serial
data (SDA) and clock (SCL) pinouts provide efficient data transfer and support the use of
multiple devices.
4.2 Sensor Output and Drift
To confirm the usability of sensor data, individual BNO055 sensor testing was done for
various movements and rotational speeds, which helped to verify the functional limits of
each sensor and explore their various output types. Our design makes use of the quaternion
values to track motion; however, each BNO055 sensor is also able to provide the following
data output types [2]:
• Acceleration [m/s2] (tri-axial)
• Linear acceleration [m/s2] (excludes gravity)
• Gravity vector [m/s2] (excludes linear acceleration due to motion)
• Magnetic field strength [µT]
• Angular rate [dps/rps]
• Euler angles [degrees/radians] (yaw, roll, pitch)
• Quaternions [quaternion units] (w,x,y,z)
The sensor data generated has been verified in Figure 3 as having minimal standstill
drift over a period of one hour. This reinforces our selection of the BNO055 IMU and
further supports the reliability of received data. For a more detailed validation of
quaternion data output during exercise motion, refer to Section 5.
Figure 3: BNO055 available data output and drift over time.
4.3 Sensor Calibration
Testing of individual sensors revealed that the BNO055 IMUs require calibration prior to
each use in order to ensure data accuracy. The calibration values for the accelerometer
and gyroscope remain constant and have been hard-coded into the appropriate registers.
The magnetometer, however, requires new calibration values after each power-on as a
result of varying EMF in the surrounding environment [2]. Moving each sensor through a
simple figure-eight motion provides sufficient self-calibration values within a handful of
seconds.

The work involved in concurrently calibrating multiple magnetometers through our
multiplexer unfortunately prohibits the immediate use of our wearable device: the user
must first perform a short calibration motion before data is broadcast. Once powered on
and calibrated, each BNO055 undergoes continual self-recalibration while transmitting
data. Even with the introduction of obtrusive EMF, the BNO055 will recalibrate its
magnetometer within a handful of seconds provided the sensor is subjected to motion. This
provides added assurance that received data is reliable and reinforces our choice of the
BNO055s to capture motion data.
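As a concrete illustration, each sensor's calibration state can be read from a single status register and unpacked in software. The sketch below assumes the CALIB_STAT register layout described in the BNO055 datasheet (register 0x35: four 2-bit fields, where 3 means fully calibrated); the struct and function names are our own, not part of the project's firmware.

```cpp
#include <cassert>
#include <cstdint>

// Decoded BNO055 calibration levels, each in the range 0 (uncalibrated)
// to 3 (fully calibrated).
struct CalibStatus { int sys, gyro, accel, mag; };

// Unpack the CALIB_STAT register byte: bits 7:6 system, 5:4 gyroscope,
// 3:2 accelerometer, 1:0 magnetometer (per the BNO055 datasheet).
CalibStatus decodeCalibStat(uint8_t reg) {
    return { (reg >> 6) & 0x3, (reg >> 4) & 0x3, (reg >> 2) & 0x3, reg & 0x3 };
}
```

Polling this register after the figure-eight motion is one way to decide when the magnetometer field has reached level 3 and data broadcast can begin.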
4.4 Sensor Placement
Figure 4: Upper body IMU placement.
Given the design requirement to track multiple upper body exercises and display the motion
of limb segments, placement of the BNO055 sensors on the user is crucial to ensure the
collected inertial data is usable. Our design makes use of four sensors, one on each
segment of the user's upper limbs. This is seen in Figure 5 and was done to facilitate the
calculation of relative joint angles. The sensors are specifically placed on the rigid
portions of the user's limbs, which helps to reduce unwanted sensor movement.
Figure 5: BNO055 inertial measurement unit secured to wrist with velcro.
The decision was also made to implement Velcro straps rather than sewing sensors
directly on to clothing. This helps to secure conduit and reduce unwanted sensor movement.
Throughout testing, this change has proven to substantially reduce error and improve the
overall accuracy of data, as discussed in Section 5.
4.5 TCA9548A I2C Multiplexer
Figure 6: Photo of TCA9548A I2C Multiplexer. [3]
Although our chosen IMU sensors make use of I2C communication, they all share the same
hardware ID. This prohibits the use of a common I2C bus as the sensors are indistinguishable
from one another. To overcome this issue, our design provides independent I2C channels
for each sensor. The TCA9548A I2C multiplexer, seen in Figure 6, is required to collect
data across all four concurrently operating I2C channels. Its technical specifications are as
follows (See also Appendix C for detailed schematic):
• Selectable I2C address 0x70-0x77
• Operating Voltage: 1.65-5.5V
• Weight: 1.8g
• Product Dimensions: 30.6mm x 17.6mm x 2.7mm / 1.2” x 0.7” x 0.1”
The use of an eight channel I2C multiplexer allows our design to support up to eight
independent sensors and presents a layer of expandability to our system. The option to add
an additional four sensors is further supported by the fact that our 3D model represents a
fully manipulable body and is not limited to upper body exercises.
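The channel selection itself is a single-byte I2C write: the TCA9548A routes the upstream bus to downstream channel n when bit n of its control register is set. A minimal sketch of computing that control byte follows; the helper names are our own, and on the Bluno side the byte would be written with the Arduino Wire library as shown in the trailing comment.

```cpp
#include <cassert>
#include <cstdint>

// TCA9548A default I2C address (selectable 0x70-0x77 via the address pins).
constexpr uint8_t kMuxAddress = 0x70;

// Control byte that selects downstream channel 0-7: bit n routes channel n.
uint8_t tcaControlByte(int channel) {
    return static_cast<uint8_t>(1u << channel);
}

// On the Bluno (Arduino) side the selection would be performed as:
//   Wire.beginTransmission(kMuxAddress);
//   Wire.write(tcaControlByte(channel));
//   Wire.endTransmission();
```

Because each bit is an independent enable, a polling loop simply writes a new control byte before reading each BNO055, so all four (or up to eight) sensors can share the single upstream bus despite their identical hardware IDs.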
4.6 Bluno Board
Figure 7: Photo of DFRobot Bluno Board.
The DFRobot Bluno board, seen in Figure 7, was chosen as our microcontroller board due to
the availability of Arduino sensor libraries and the fact that it comes pre-manufactured
with an on-board Bluetooth 4.0 module. Its technical specifications are as follows (See also
Appendix C for detailed schematic):
• On-board BLE chip: TI CC2540
• Baud Rate: 9600 - 115200 bps
• Transmission range (Open Space): more than 70m
• Transmission range (Office): about 20m
• Power Supply: USB powered or external 7V-12V DC
• Output Current (I/O pin): 40mA
• Output Current (Power pin): 200mA
• Microcontroller: Atmega328
• Size: 60mm x 53mm / 2.36” x 2.08”
• Weight: 30g
The flowchart shown in Figure 8 represents the code used to poll across all available
IMUs, collect instantaneous quaternions, and prepare the motion data for transmission.
A more detailed discussion regarding data transmission and the Bluno's Bluetooth 4.0
functionality can be found in Section 6. Refer also to Appendix D for the full Arduino code.
Figure 8: Arduino Code Flowchart.
4.7 Power and Battery Life
Figure 9: Photo of the Battery.
In order to power the hardware and meet the 30 minute operating time design requirement,
a 7.4V 2500mAh lithium polymer battery is used. Our battery of choice, manufactured by
DFRobot, has the following technical specifications and can be seen in Figure 9:
• 7.4V 2-cell pack
• 2500mAh of charge
• 1C continuous discharge rate (5C Max)
• Arduino compatible DC2.1 power jack
• Size: 103mm x 34mm x 15mm / 4.05” x 1.34” x 0.59”
• Weight: 90g
• Capacity: 18.5Wh
The battery connects directly to the Bluno's on-board regulator, which steps down the
provided 7.4V to an operational 5V. Marginal power losses are incurred as a result.
Nevertheless, using the formula below, the battery life is calculated as roughly 15 hours.

Battery Life [hours] = (Battery Capacity [mAh] / Device Consumption [mA]) · 0.8   (1)

(2500 mAh / 133 mA) · 0.8 ≈ 15 hours   (2)
The factor of 0.8 accounts for various losses and provides a more conservative battery
life estimate. This far exceeds the practical design requirement of a 30 minute operating
time and has minimal impact on wearability (weight).
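As a quick sanity check, Equation (1) can be expressed directly in code; the function name is ours, and the 133 mA draw is the measured figure quoted above.

```cpp
#include <cassert>
#include <cmath>

// Battery life estimate per Equation (1): capacity over average current draw,
// derated by 0.8 to account for regulator and discharge losses.
double batteryLifeHours(double capacity_mAh, double draw_mA) {
    return (capacity_mAh / draw_mA) * 0.8;
}
```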
5 Mathematics of Orientations
5.1 Euler Angles
The first output of the BNO055 inertial measurement unit (IMU) that was considered for
use was the Euler angle absolute orientation output. The inertial measurement unit has a
local coordinate frame originating at the center of the sensor. The current orientation of
the IMU can be described by relating the local IMU frame to a global reference frame,
called the world frame [6].

The most common example used to describe Euler angles is that of an airplane. An
airplane can perform three independent rotations: roll (a rotation about an axis from nose
to tail); pitch (a rotation that elevates the nose up or down about an axis through the
wings of the plane); and yaw (a rotation that turns the nose of the airplane left or right
about a vertical axis). This concept is neatly illustrated in Figure 10.
Figure 10: Euler angle visualization. [7]
5.1.1 Gimbal Lock
There are several problems that arise when utilizing Euler angles to track orientations,
the first of which is the phenomenon of gimbal lock. To understand gimbal lock, first
consider a Euler angle rotation in three-dimensional space represented with matrices in
the following form:
$$\text{Rotation} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos x & -\sin x \\ 0 & \sin x & \cos x \end{bmatrix} \cdot \begin{bmatrix} \cos y & 0 & \sin y \\ 0 & 1 & 0 \\ -\sin y & 0 & \cos y \end{bmatrix} \cdot \begin{bmatrix} \cos z & -\sin z & 0 \\ \sin z & \cos z & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (3)$$
However, a problem is encountered for rotations at certain angles, such as a rotation in
the y-dimension by 90 degrees, or π/2 radians.

$$\text{Rotation} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos x & -\sin x \\ 0 & \sin x & \cos x \end{bmatrix} \cdot \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ -1 & 0 & 0 \end{bmatrix} \cdot \begin{bmatrix} \cos z & -\sin z & 0 \\ \sin z & \cos z & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (4)$$

$$\text{Rotation} = \begin{bmatrix} 0 & 0 & 1 \\ \sin(x+z) & \cos(x+z) & 0 \\ -\cos(x+z) & \sin(x+z) & 0 \end{bmatrix} \quad (5)$$

We now have an equation that depends only on the sum x + z: a rotation in x or in z has
the same result, so one rotational degree of freedom has been lost. For this reason the
quaternion output of the IMU is the preferred absolute orientation output from
the BNO055 IMU.
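The collapse in Equation (5) can be checked numerically. The sketch below (our own helper names, using the same R = Rx · Ry · Rz convention as Equation (3)) multiplies out the three rotation matrices and shows that, with y fixed at 90 degrees, any two (x, z) pairs with the same sum x + z yield identical rotation matrices, while they differ for a non-degenerate y:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Plain 3x3 matrix product.
Mat3 mul(const Mat3& a, const Mat3& b) {
    Mat3 c{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                c[i][j] += a[i][k] * b[k][j];
    return c;
}

Mat3 rotX(double t) {
    double c = std::cos(t), s = std::sin(t);
    return {{{1, 0, 0}, {0, c, -s}, {0, s, c}}};
}
Mat3 rotY(double t) {
    double c = std::cos(t), s = std::sin(t);
    return {{{c, 0, s}, {0, 1, 0}, {-s, 0, c}}};
}
Mat3 rotZ(double t) {
    double c = std::cos(t), s = std::sin(t);
    return {{{c, -s, 0}, {s, c, 0}, {0, 0, 1}}};
}

// Combined rotation R = Rx(x) * Ry(y) * Rz(z), as in Equation (3).
Mat3 euler(double x, double y, double z) {
    return mul(mul(rotX(x), rotY(y)), rotZ(z));
}

bool approxEqual(const Mat3& a, const Mat3& b) {
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            if (std::fabs(a[i][j] - b[i][j]) > 1e-9) return false;
    return true;
}
```

With y = π/2, euler(0.3, π/2, 0.5) and euler(0.7, π/2, 0.1) are the same matrix (both sums equal 0.8), which is exactly the lost degree of freedom described above.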
5.2 Quaternions
Quaternions are fundamental in our design of the IMU data processing methods. To begin
to explain quaternions, we will start by considering imaginary numbers in a 2-dimensional
plane as in Figure 11 below.
Figure 11: Rotations through the 2D Imaginary Plane.
If we multiply a complex number by the imaginary number i, we rotate it through 90
degrees (π/2 radians). For example, take an arbitrary complex number p:
p = 1 + i   (6)

and multiply by the imaginary number i,

q = p · i = (1 + i) · i = -1 + i   (7)

and again multiply by i, to get a new complex number r,

r = q · i = (-1 + i) · i = -1 - i   (8)

and again multiply by i, to get a new complex number s,

s = r · i = (-1 - i) · i = 1 - i   (9)

and again multiply by i, to get a new complex number t, equivalent to the original
complex number p:

t = s · i = (1 - i) · i = 1 + i = p   (10)

Clearly we would get the same net rotation as in Figure 11 if we had multiplied
by -i instead of +i throughout.
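This four-step cycle is easy to verify with std::complex; the snippet below (our own naming) reproduces Equations (6)-(10):

```cpp
#include <cassert>
#include <complex>

using Cx = std::complex<double>;

// Multiplying by i rotates a complex number 90 degrees counter-clockwise
// in the plane of Figure 11; four such multiplications return the original.
Cx rotate90(Cx p) {
    return p * Cx(0.0, 1.0);
}
```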
Using this concept, it is then possible to perform any arbitrary rotation through the
above complex plane by first defining a complex number n:

n = cos(θ) + i · sin(θ)   (11)

This is referred to as a rotor; if we then multiply any complex number d by the rotor n,
we get the following:

d = a + b · i   (12)

n = cos(θ) + i · sin(θ)   (13)

a′ + b′ · i = d · n = (a · cos(θ) - b · sin(θ)) + (a · sin(θ) + b · cos(θ)) · i   (14)

Which can also be represented in matrix form as:

$$\begin{bmatrix} a' & -b' \\ b' & a' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \cdot \begin{bmatrix} a & -b \\ b & a \end{bmatrix} \quad (15)$$
This relationship can be extended into three dimensions by adding two more imaginary
units to generate a relation of the following form:

Quaternion = w + x · i + y · j + z · k   (16)

where Hamilton's famous equation, derived in 1843, states that

i² = j² = k² = i · j · k = -1   (17)

and

i · j = k,  j · k = i,  k · i = j,  j · i = -k,  k · j = -i,  i · k = -j   (18)

i² = j² = k² = -1   (19)

allowing us to track rotations and orientations in three dimensions [8].
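These product rules are exactly what a quaternion multiplication routine encodes. The following is our own minimal sketch (not the framework's implementation) of the Hamilton product:

```cpp
#include <cassert>

struct Quat { double w, x, y, z; };

// Hamilton product, encoding i*j = k, j*k = i, k*i = j and their negated
// reversals from Equations (17)-(18).
Quat mulQ(const Quat& a, const Quat& b) {
    return {
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
    };
}
```

Note the product is non-commutative: swapping the operands flips the sign of the vector part's cross terms, matching j · i = -k in Equation (18).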
5.2.1 Body Joint Angles from Quaternions
For our application, it is necessary to convert the quaternions into a more usable form.
We converted the quaternion data into body joint angles using the quaternion data from
two sensors,

Q₁ = [s₁, x₁ · i + y₁ · j + z₁ · k]   (20)

Q₂ = [s₂, x₂ · i + y₂ · j + z₂ · k]   (21)

The quaternion data is then normalized,

Q′₁ = Q₁ / √(s₁² + x₁² + y₁² + z₁²)   (22)

Q′₂ = Q₂ / √(s₂² + x₂² + y₂² + z₂²)   (23)

Taking the conjugate of one of the normalized quaternions gives,

Q′₂* = [s′₂, -x′₂ · i - y′₂ · j - z′₂ · k]   (24)

Then the dot product of the normalized quaternion Q′₁ and the normalized conjugate
of Q′₂ is

Q′₁ · Q′₂* = |Q′₁| · |Q′₂*| · cos(θ) = s′₁ · s′₂ - x′₁ · x′₂ - y′₁ · y′₂ - z′₁ · z′₂   (25)

cos(θ) = (s′₁ · s′₂ - x′₁ · x′₂ - y′₁ · y′₂ - z′₁ · z′₂) / (|Q′₁| · |Q′₂*|)   (26)

Since we are using unit-norm quaternions, this simplifies to

cos(θ) = s′₁ · s′₂ - x′₁ · x′₂ - y′₁ · y′₂ - z′₁ · z′₂   (27)

and the desired body joint angle is then

θ = arccos(s′₁ · s′₂ - x′₁ · x′₂ - y′₁ · y′₂ - z′₁ · z′₂)   (28)
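Equations (20)-(28) translate directly into code. The sketch below is our own illustration of the method, not the framework's actual implementation; it normalizes both quaternions, forms the conjugate dot product, and applies the arccosine, clamping the cosine against floating-point overshoot:

```cpp
#include <cassert>
#include <cmath>

struct Quat { double w, x, y, z; };

// Body joint angle between two sensor orientations via the quaternion
// dot-product method of Equations (20)-(28).
double jointAngle(const Quat& a, const Quat& b) {
    double na = std::sqrt(a.w * a.w + a.x * a.x + a.y * a.y + a.z * a.z);
    double nb = std::sqrt(b.w * b.w + b.x * b.x + b.y * b.y + b.z * b.z);
    // Dot product of Q1 with the conjugate of Q2, normalized (Eq. 26-27).
    double c = (a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z) / (na * nb);
    // Clamp before arccos to guard against rounding slightly past +/-1.
    if (c > 1.0) c = 1.0;
    if (c < -1.0) c = -1.0;
    return std::acos(c);  // Eq. (28), in radians
}
```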
5.3 Accuracy of IMU Orientation Data
To determine if the above quaternion dot product method of determining body joint angles
was executed with acceptable accuracy, an experiment was performed that used Matlab
to produce the body joint angle with BNO055 data. The body joint angles calculated in
Matlab were then compared to the corresponding body joint angles obtained from video of
the exercise that produced the data.
Figure 12 and 13 illustrate the results of the quaternion dot product method compared
to the actual body joint angle obtained from photographic measurements.
Figure 12: Photographic measurement of elbow joint angle during a bicep curl.
Figure 13: Inertial measurement unit derived body joint angle measurements during a bicep
curl.
Body joint angles were calculated using quaternion IMU data and compared to a number
of photographs similar to Figure 12. The difference between body joint angles from
photographs and quaternion-derived joint angles was approximately 10-18%. This accuracy
experiment could be expanded during further development to find a statistically
significant body joint angle range of error. However, compared to other literature
regarding human body tracking with IMUs, this initial accuracy check agrees with the
level of error that is expected. In Human Motion Analysis with Wearable Inertial Sensors,
Xi Chen was able to produce a body joint angle error of approximately 8% using a similar
IMU setup [9]. The inertial measurement units used by Chen were secured in a more
permanent way, eliminating some jostling of the sensors during exercise movement. The
portion of the error not due to how securely the sensors are fastened is largely due to
muscle vibrations and movements during flexion and extension. Since the sensors are
secured directly to the user's body, any body tremors or movements will be conducted
directly to the inertial measurement unit, where error in the current body joint angle
can be introduced.

Body joint angle graphs for all five exercises and corresponding data from this Matlab
experiment can be found in Appendix A.
5.4 Exercise Completion Level Calculations
Using the concepts presented in the previous two sections, it is possible to track
completion levels for various exercises. Using a pre-determined range of motion for each
exercise, the completion level can be calculated with a simple relation:

Completion Level (%) = ((range of motion - change in elbow joint angle) / range of motion) · 100%   (29)

The iOS application utilizes this relation to display the completion level as a percentage
for five upper body exercises, meeting the design requirements.
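Equation (29) is a one-line computation; the sketch below is our own illustration (the 120-degree range of motion used in the assertions is a made-up example value, not a figure from the report):

```cpp
#include <cassert>
#include <cmath>

// Completion level per Equation (29): the fraction of the exercise's known
// range of motion that remains, expressed as a percentage.
double completionPercent(double rangeOfMotionDeg, double angleChangeDeg) {
    return (rangeOfMotionDeg - angleChangeDeg) / rangeOfMotionDeg * 100.0;
}
```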
6 Communication
6.1 Integrated Bluetooth
As discussed in Section 4.6, the hardware used in our design is the DFRobot Bluno micro-
controller. The Bluno microcontroller is a combination of an Arduino Uno microcontroller
and a TI CC2540 BLE (Bluetooth Low Energy) chip. Data transmission between the Bluno
board and an iOS device is done via BLE. BLE is a low power version of Bluetooth 4.0 that
is used for devices which run for a long period of time on power sources [11]. We chose the
Bluno board over Arduino shields for a few reasons, the first being that the BLE chip is
already embedded onto the Bluno board and comes equipped with its own GATT (Generic
Attributes) profile. This is required for successful connection and data transmission. It also
removes the need for external Bluetooth hardware. Secondly, BLE is ideal for our project
because it has a low power consumption for small periodic data exchange. Transmission
range was confirmed through testing to be reliable up to 20 metres, provided no large phys-
ical barriers exist.
6.2 Data Encapsulation and Bluno Transmission
The Bluno board's integrated Bluetooth module operates by directly transmitting any data
written to its serial port over a Bluetooth Low Energy (BLE) connection. As a result, the
Bluno's characteristic value is continuously updated, and each update notifies the iOS
application. When the iOS application is notified of an update to the Bluno's characteristic
value, a method of Apple's Core Bluetooth CBPeripheralDelegate protocol is called. To
facilitate transmission, our design makes use of a protocol which avoids byte stream cutoff.
This allows for effective transfers of sensor readings from the Bluno board to the receiving
iOS application and ensures values are interpreted correctly. The protocol is formatted as:

/sensor#,quatW,quatX,quatY,quatZ~

A formatted data stream, using this protocol, for sensor zero would be represented as:

/s0,0.1234,-0.4321,0.1234,-0.4321~
When an update is received by the iOS application, the values are parsed character-by-
character into a string. Upon initial receipt of a string, indicated by the "/" delimiter, data
is stored into a buffer and a flag is set to indicate that all subsequent characters are to be
buffered. Once the final "~" delimiter is received, the flag is reset to indicate buffering is
complete and a full sensor data string is ready for use. To identify the substrings, the
constructed data string is partitioned with the "," delimiter. The first substring indicates
to the application which of the four sensors has been received, while the other four
substrings are known to be the w, x, y and z quaternion values, respectively. Sensor numbers
and data substrings are mapped directly to limb segments as seen in Table 3 in Section 7.1.
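The framing and delimiting steps described above can be sketched in Swift. The shipped application parses the stream character-by-character inside a Core Bluetooth delegate callback; this batch-style function and its names are assumptions made for demonstration only.

```swift
import Foundation

// Illustrative parser for the "/sensor#,w,x,y,z~" framing described above.
// The real application buffers characters as they arrive; this version
// assumes a complete packet is already available.
func parseSensorPacket(_ packet: String) -> (sensor: String, quat: [Float])? {
    // A complete packet is framed by "/" and "~".
    guard packet.hasPrefix("/"), packet.hasSuffix("~") else { return nil }
    let body = packet.dropFirst().dropLast()
    let parts = body.split(separator: ",").map {
        $0.trimmingCharacters(in: .whitespaces)
    }
    guard parts.count == 5 else { return nil }
    // Fields 1-4 must parse as the w, x, y, z quaternion components.
    let values = parts[1...].compactMap { Float($0) }
    guard values.count == 4 else { return nil }
    return (sensor: parts[0], quat: values)
}
```

Rejecting packets with missing fields mirrors the flag-based buffering above: a value is only used once a complete, well-delimited string has arrived.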
6.3 iOS Bluetooth Framework
The iOS framework used in the application development is Apple's Core Bluetooth
framework, which is responsible for establishing connections and handling data transactions
between the two connected devices. The Core Bluetooth framework contains the
CBCentralManager and CBPeripheral classes. The CBCentralManager class is responsible for
managing, scanning for, discovering, and connecting to advertised peripherals. The
CBPeripheral class is used to discover, explore, and interact with services available on a
remote peripheral. This is also where the UUID (Universally Unique Identifier) of the
peripheral device is specified and where the data handling processes take place. A UUID is
a 128-bit number used to uniquely identify information in computer systems.
Our Bluetooth framework also makes use of the BLE GATT profile to ensure proper
connection and data transmission. GATT defines the way two BLE devices transfer data
back and forth using Services and Characteristics [10], and it comes into play once a
connection has been established between two devices. Services are used to break data up
into logical entities and can contain one or more Characteristics; each Service has its own
UUID to distinguish it from other Services. Characteristics are the lowest level of the
GATT hierarchy and encapsulate the actual data exchanged with the BLE peripheral.
Data exchange follows a client-server model in which the Bluno board acts as the server
and the iOS device acts as the client. The GATT server holds the ATT (Attribute Protocol)
lookup data and the Service and Characteristic definitions, and the GATT client sends
requests to the server. ATT is used to store Services, Characteristics, and other related
data in a lookup table.
Figure 14 displays the process to initiate scanning for peripherals.
Figure 14: Bluetooth scanning protocol flow chart.
Figure 15 displays the overall process of Bluetooth transmission.
Figure 15: Bluetooth transmission flow chart.
7 Software Framework
7.1 Overview of ExerciseMotionTracker framework
The Exercise Motion Tracker Framework is an iOS framework developed in Swift that
provides several classes enabling an iOS application to use data from the BNO055 IMU
sensors to track a user's motion. The designed framework serves two main functions: 1) to
translate the raw sensor data into a usable format for rendering the 3D human model, and
2) to calculate completion levels for a user-selected exercise to display a user's motion in
real time.
Figure 16: Exercise Motion Tracker framework class diagram.
The essential building block required for the Exercise Motion Tracker Framework to
function is a data structure called Quaternion (capitalized in this context because it refers
to a data structure; the lower-case quaternion refers to the mathematical term). It was
decided early in the design phase to make a custom Quaternion structure, as the graphical
rendering only required a few quaternion operations. The Quaternion structure contains
four instance variables: the components w, x, y and z. Quaternion also contains three
instance methods: 1) getUnitQuaternion(), which returns a normalized version of the
quaternion on which it is called; 2) getConjugate(), which returns the quaternion's
conjugate; and 3) getDotProduct(q2: Quaternion), which takes another quaternion instance
and returns a float representing the dot product between the associated quaternion and
the parameter quaternion.
Another essential basic class within the framework is the BodyJoint class, the leftmost
node in Figure 16. The diagram shows that BodyJoint structures are made to represent
actual segments of the user's body through two encapsulated members: the orientation
instance variable and the getAngleBetweenJoints method. The orientation instance
variable is of type Quaternion, holding all four quaternion components (w, x, y, and z).
Its sole method, getAngleBetweenJoints(otherJoint: BodyJoint), is essential to the
exercise monitoring algorithm, as it calculates the dot product between the BodyJoint
instance's orientation and the passed BodyJoint instance's orientation. The dot product is
calculated by normalizing both Quaternions, taking the conjugate of one of the two, and
applying the dot product algorithm to the conjugate and the other normalized Quaternion.
The result is equal to the cosine of the angle between the two Quaternions, and taking the
inverse cosine yields the angle.
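The Quaternion structure and the angle calculation just described can be sketched as follows. The method names follow the report, but the bodies are our assumptions about a straightforward implementation.

```swift
import Foundation

// Minimal sketch of the custom Quaternion structure described above.
struct Quaternion {
    var w, x, y, z: Float

    // Normalized (unit-length) copy of this quaternion.
    func getUnitQuaternion() -> Quaternion {
        let n = (w*w + x*x + y*y + z*z).squareRoot()
        return Quaternion(w: w/n, x: x/n, y: y/n, z: z/n)
    }

    // Conjugate: the vector part is negated.
    func getConjugate() -> Quaternion {
        Quaternion(w: w, x: -x, y: -y, z: -z)
    }

    // Four-component dot product with another quaternion.
    func getDotProduct(_ q2: Quaternion) -> Float {
        w*q2.w + x*q2.x + y*q2.y + z*q2.z
    }
}

// Angle between two joint orientations: normalize both, conjugate one,
// take the dot product, and invert the cosine.
func angleBetweenJoints(_ a: Quaternion, _ b: Quaternion) -> Float {
    let d = a.getUnitQuaternion().getConjugate()
             .getDotProduct(b.getUnitQuaternion())
    // Clamp against floating-point drift before taking acos.
    return Float(acos(Double(max(-1, min(1, d)))))
}
```

The clamp before the inverse cosine is a defensive detail we added: accumulated floating-point error can push the dot product fractionally outside [−1, 1], where acos is undefined.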
The main class of the Exercise Motion Tracker Framework is the Skeleton class, the second
node in Figure 16. This class is used both by the iOS application, to update the 3D human
model, and by the ExerciseMonitor class. The Skeleton class contains a Swift dictionary
called bodyJoints that represents the different segments of the user's body. The
bodyJoints dictionary has keys of type String and values of type BodyJoint. In the current
version of the application, there are four keys representing four body parts: rightForearm,
rightBicep, leftForearm and leftBicep. We designed the ExerciseMotionTracker framework
to scale easily, so more BodyJoints can be added as needed. This dictionary structure
provides an easy way for the iOS application to look up the orientation data for a given
body segment, as the only requirement is the key under which the data is stored.
The Skeleton class also contains a method called updateFromSensors(sensorData: [String:Float]).
Its parameter is a dictionary called sensorData which contains the latest orientation
update for each BNO055 sensor as a quaternion. The keys of sensorData are simple
strings formatted as an s followed by the sensor number, followed by the quaternion
component (for example, s0x for sensor zero, component x). The sensorData values are
turned into Quaternions and used to update the orientations of the BodyJoint instances in
the bodyJoints dictionary. We created the mapping between the sensors and the
BodyJoints shown in Table 3.
Table 3: Mapping between the sensors and the BodyJoints. Note the scalability of this
approach: from the standpoint of the framework, additional sensors can be added and
mapped to further BodyJoints.
Sensor Identifier BodyJoint
s0w, s0x, s0y, s0z rightForearm
s1w, s1x, s1y, s1z rightBicep
s2w, s2x, s2y, s2z leftBicep
s3w, s3x, s3y, s3z leftForearm
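The Table 3 mapping could be applied inside updateFromSensors(sensorData:) roughly as follows. Quaternion and BodyJoint here are simplified stand-ins for the framework classes, and the internal mapping dictionary is an assumption about one reasonable implementation.

```swift
// Sketch of updateFromSensors(sensorData:) applying the Table 3 mapping.
struct Quaternion { var w, x, y, z: Float }
final class BodyJoint {
    var orientation = Quaternion(w: 1, x: 0, y: 0, z: 0)
}

final class Skeleton {
    var bodyJoints: [String: BodyJoint] = [
        "rightForearm": BodyJoint(), "rightBicep": BodyJoint(),
        "leftBicep": BodyJoint(), "leftForearm": BodyJoint(),
    ]
    // Table 3: sensor prefix -> BodyJoint key. Scaling up only means
    // adding entries here (and to bodyJoints).
    private let sensorToJoint = [
        "s0": "rightForearm", "s1": "rightBicep",
        "s2": "leftBicep", "s3": "leftForearm",
    ]

    func updateFromSensors(sensorData: [String: Float]) {
        for (sensor, jointName) in sensorToJoint {
            // Only update a joint when all four components are present.
            guard let joint = bodyJoints[jointName],
                  let w = sensorData[sensor + "w"],
                  let x = sensorData[sensor + "x"],
                  let y = sensorData[sensor + "y"],
                  let z = sensorData[sensor + "z"] else { continue }
            joint.orientation = Quaternion(w: w, x: x, y: y, z: z)
        }
    }
}
```

Keeping the mapping in one dictionary localizes the change needed when more sensors are added, which matches the scalability point made in the table caption.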
An important instance variable of the Skeleton class is skeletonDelegate, which is of
type SkeletonDelegate. SkeletonDelegate is a custom Swift protocol that enables any class
conforming to it to receive Skeleton updates. This is necessary to communicate between
iOS view controllers and pass variables between them, which would not be possible under
normal circumstances. The SkeletonDelegate protocol consists of just one method,
updateBodyJoints(sensorData: [String:BodyJoint]). Whenever the Skeleton receives a
sensor update and updates its bodyJoints dictionary, it calls
skeletonDelegate.updateBodyJoints(). This allows every class implementing the
SkeletonDelegate protocol to receive the new Skeleton orientation.
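The delegate pattern just described can be sketched in a few lines. The types below are simplified stand-ins for the framework classes; the counter in the stand-in controller exists only to make the notification observable.

```swift
// Sketch of the SkeletonDelegate pattern: any conforming class receives
// bodyJoints updates, which is how orientation data crosses
// view-controller boundaries.
final class BodyJoint {}

protocol SkeletonDelegate: AnyObject {
    func updateBodyJoints(sensorData: [String: BodyJoint])
}

final class Skeleton {
    // weak reference avoids a retain cycle with the view controller.
    weak var skeletonDelegate: SkeletonDelegate?
    var bodyJoints: [String: BodyJoint] = ["rightForearm": BodyJoint()]

    // Called after a sensor packet has been applied to bodyJoints.
    func notifyDelegate() {
        skeletonDelegate?.updateBodyJoints(sensorData: bodyJoints)
    }
}

// Stand-in for a view controller that re-poses the 3D model on update.
final class RenderingController: SkeletonDelegate {
    var updatesReceived = 0
    func updateBodyJoints(sensorData: [String: BodyJoint]) {
        updatesReceived += 1
    }
}
```

Declaring the protocol with AnyObject allows the delegate reference to be weak, the usual Swift convention for delegate ownership.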
Figure 17: Delegate Class Diagram.
In our application, formatted sensor information strings are received periodically through
Core Bluetooth. As these sensor strings arrive they are parsed and stored in a sensorData
dictionary, and a flag is set for each sensor to indicate that its data has been received and
stored. Once all the sensor flags are set, the sensorData dictionary is passed to the Skeleton
instance. After the Skeleton instance updates as outlined above, it notifies its
SkeletonDelegate (a visual representation of this process is shown in Figure 18). The
SkeletonDelegate in our case is the GameViewController, which oversees loading and
rendering the 3D human model. Once its updateBodyJoints() method is called, it uses the
sensorData dictionary passed to it to update the model's orientation (described in detail in
Section 8.4, Posing vs. Animation).
Figure 18: SensorRcv Flowchart.
The class that facilitates tracking of exercise motion is the ExerciseMonitor class, the
third node in Figure 16. The ExerciseMonitor class has two instance variables: a
Skeleton-typed variable called skeleton and an ExerciseMonitorDelegate-typed instance
called exerciseMonitorDelegate. The skeleton variable is used to access the user's
BodyJoint orientations. The exerciseMonitorDelegate functions in the same way as the
skeletonDelegate outlined in Figure 18, allowing communication between separate view
controllers. The ExerciseMonitor class has an updateDelegates() method that informs any
classes conforming to the ExerciseMonitorDelegate protocol that a new update is available.
The ExerciseMonitor class also contains an instance method called getPercentComplete()
which returns the current exercise completion level as a percentage.
The ExerciseMonitor class is meant to be subclassed by specific exercise classes that
override the getPercentComplete() method, relying on polymorphism, the object-oriented
programming concept that allows a subclass to extend or completely replace an inherited
method's behaviour. The framework is structured this way to give developers the
flexibility to expand it, allowing more exercises to be added easily once their specific
details are known. The criteria for exercise completion could be determined through
testing, and it would even be possible to create some sort of ExerciseMonitor classloader
to allow users to (safely) create their own ExerciseMonitor subclasses.
The ExerciseMonitor subclass that we implemented in our application is the BicepCurl
class. We chose the bicep curl exercise as a starting point since it is simple and only
requires two sensors to demonstrate. The getPercentComplete() method of the BicepCurl
class works as follows: 1) the orientation Quaternions for the rightForearm and rightBicep
are taken from the Skeleton instance; 2) the curl angle is determined by calling
getAngleBetweenJoints on the rightForearm BodyJoint with the rightBicep BodyJoint as a
parameter; 3) lastly, the percentComplete is calculated by subtracting the minimum angle
from the curl angle and then dividing by the difference between the minimum and maximum
angles. The minimum and maximum curl angles were determined by testing the algorithm
within a Matlab simulation.
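The subclassing scheme can be sketched as below. The angle bookkeeping is simplified, and the minimum/maximum curl angles shown are placeholder values, not the bounds derived from our Matlab testing.

```swift
// Sketch of the ExerciseMonitor subclassing scheme: BicepCurl overrides
// getPercentComplete() via polymorphism. Bounds are placeholders.
class ExerciseMonitor {
    var currentJointAngle: Float = 0       // radians, supplied by Skeleton
    func getPercentComplete() -> Float { 0 }  // overridden per exercise
}

final class BicepCurl: ExerciseMonitor {
    let minAngle: Float = 0.2              // placeholder lower bound
    let maxAngle: Float = 2.4              // placeholder upper bound

    override func getPercentComplete() -> Float {
        // (curl angle - min) / (max - min), clamped to [0, 1].
        let fraction = (currentJointAngle - minAngle) / (maxAngle - minAngle)
        return min(max(fraction, 0), 1) * 100
    }
}
```

Because dispatch is dynamic, code that holds an ExerciseMonitor reference automatically runs the BicepCurl implementation, which is what lets new exercises be dropped in without touching the rest of the framework.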
8 Three-Dimensional Modelling
The following sections report on the modules of the project that involve the iOS
application, STiOS (Skeleton Tracker for iOS), as well as the three-dimensional modelling
and graphic rendering. Each section covers the initial design of the associated module, its
initial implementation, design changes, and integration complications, challenges, and
achievements.
8.1 3D Modelling Application: Blender
The graphical 3D human model was created using the modelling program Blender, which
allows for the creation of custom 3D models. Blender also includes a feature that creates
a skeletal rig that can be used along with a model to perform tasks such as animation,
posing, and deformation. Created models can be exported from Blender in a variety of file
formats to suit a developer's needs. Lastly, Blender is free and open source, allowing for
custom modifications and content creation.
8.2 Initial Skeleton Design
The model required in our design had to be based on a skeletal rig. The rig (skeleton)
was created using armature nodes (bones) that can be graphically manipulated via
transformations. The user wears BNO055 sensors that produce (normalized) quaternion
values for the associated limb segments. Initial designs proposed that skeletal shapes be
placed at the joints to act as pivot points to adjacent nodes, as seen in Figure 19. Through
updates to their positional coordinates, the cube joint bones mimic movement through
three-dimensional space. However, this poses a problem because the shapes are not
connected. Quaternion values indicate a change of orientation for an entire node relative
to connected nodes. Without physically connecting the nodes, the quaternion values may
not perform correct transformations relative to adjacent bones, causing shifts to arbitrary
positions relative to the origin. This may distort any model attached to the shapes, which
is not a desirable result.
Figure 19: A graphical representation of the initial skeletal design of a human arm, where
the cubes represent joints and the yellow lines represent the connections between each joint
cube (which will be replaced by the model). Note that the connections are not set and are
theoretical.
8.3 Armature & Rigging
Motion is enabled in Blender through a process called rigging. Rigging involves creating
a combination of bones that bind to a model, allowing a user to manipulate, animate, and
pose a three-dimensional model. Each bone is referred to as an armature node, shown in
Figure 20. An important feature of the armature nodes is that each has its own individual
coordinate system.
Figure 20: Anatomy of a Blender armature node. Each bone has an orientation and its own
coordinate system. The coordinate system of the bone (object coordinates) is independent
of the space coordinates (world coordinates).

Figure 21: Anatomy of a Blender armature node. Each bone has an orientation and its own
coordinate system.
To create the full armature (skeleton), each armature node is extruded from its head to
create another connected armature node. Repeating this process yields the entire armature
of a limb, as seen in Figure 21. The full skeletal rig can be created by repeating this same
process for the other limbs, spine, and collarbone, as shown in Figure 22. Entering pose
mode then manipulates the position and orientation of not only each bone, but all bones
extruded from it. The bones follow a hierarchy that applies appropriate transformations
stemming from the original transformation applied to the bone from which each was
extruded. At this stage, however, each limb acts independently of each adjacent limb; for
example, rotating both shoulders in opposite directions does not rotate the spine as a
human skeleton would.
Figure 22: The entire rig used for a three dimensional human model.
Fortunately, Blender includes a mechanism for handling situations like the one mentioned
above, called parenting. Recall that each armature node follows the transformation of the
bone it was extruded from. This can be applied to the node that begins each limb chain
(referred to as a root node), in this case the upper arm bones (left and right), the upper leg
bones (left and right), and each collarbone. By parenting each root to a disconnected
armature node, a similar effect is achieved, where movement of the parent bone
manipulates the limb root parented to it. This is observable in Figure 22, where each limb
root is connected to another bone via a dashed line.
Figure 23: Anatomy of a Blender armature node. Each bone has an orientation and its own
coordinate system.
The last task performed on the armature was to correct the orientation of each node.
Recall that each individual node has its own coordinate axes. This coordinate system is
independent of the space coordinate system and is moved, shifted, and rotated with each
transformation applied to a node. A simple adjustment has to be made to rotate each node
so that its positive Z direction points forward from the model, as shown in Figure 23.
With a consistent forward direction, applying the quaternion values yields consistent
results whenever the armature is posed.
The full rig shown in Figure 22 was expanded past the initial design of upper arm
manipulation to allow for further expandability. Through the expanded rig, each armature
node can be accessed through Xcode. Once the hardware expands to include sensors for
more bones, the gathered data can be mapped to the rest of the rig to allow for full
real-time body tracking.
8.4 Posing vs. Animation
Initially, the plan was to animate the model within Blender so that when specific points
during an exercise were hit (indicated by a signal from the framework), the arm would
follow. This would have kept the model within appropriate joint ranges at all times and
prevented it from exceeding certain constraints. Unfortunately, animation is fixed, and the
problem statement requires body tracking. This forced the model to constantly update the
position and orientation of its armature nodes; in modelling terms, this is referred to as
posing.
A major advantage of posing is reduced latency during rendering. The hardware tested
includes the iPhone 6 and iPad Mini 2. These devices contain an Apple A8 and an Apple
Cyclone processor respectively, which run at 1.1 to 1.3 GHz, clock speeds that can handle
quick graphical rendering under lower graphical loads. Manipulation of the skeletal rig and
deformation of the model can be processed very quickly, as opposed to following a
programmatically preset motion after waiting for a command. This avoids having to
synchronize the framework with an animation, reducing frame-rate latency.
Another important advantage of posing is the use of normalized quaternions to manipulate
the orientation of the model's armature nodes. This makes the model directly compatible
with the BNO055 sensors, which return normalized quaternion values to the phone. Posing
can easily be done within Scene Kit by changing the appropriate armature orientation to
match the values given by the sensors, setting the orientation property to a four-component
vector composed of the w, x, y, and z components.
Despite the compatibility of the model and the sensors, one notable disadvantage of the
posing approach is the need to match the coordinate system of the BNO055 sensor with
that of the armature. Two things have to be considered: the hardware implementation, to
physically mount the sensors correctly, and the rig implementation within the model, to
programmatically set the armatures into agreement with the hardware. The orientations of
the sensors and of an armature are shown in Figures 24 and 25 respectively. To make the
two agree, the top of the sensor has to face the same direction as the head of the armature,
and the armature has to be rotated about the z-axis to line up the z and x axis orientations.
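One way to express this alignment in code is to compose every incoming sensor quaternion with a fixed correction rotation that maps the sensor frame onto the armature frame. The 90-degree rotation about z used below is an assumption for demonstration; the actual offset depends on how the sensors are physically mounted.

```swift
// Illustrative frame-alignment step for matching the BNO055 frame to the
// armature frame. The correction rotation shown is an assumed example.
struct Quaternion { var w, x, y, z: Float }

// Hamilton product a * b (apply rotation b, then rotation a).
func multiply(_ a: Quaternion, _ b: Quaternion) -> Quaternion {
    Quaternion(
        w: a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        x: a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        y: a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        z: a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w)
}

// cos(45°) = sin(45°) = √2/2, giving a 90-degree rotation about the z-axis.
let halfSqrt2 = Float(0.5).squareRoot()
let correction = Quaternion(w: halfSqrt2, x: 0, y: 0, z: halfSqrt2)

func alignToArmature(_ sensor: Quaternion) -> Quaternion {
    multiply(correction, sensor)
}
```

Baking the mounting offset into one constant quaternion keeps the sensor firmware and the rendering code unchanged when the physical mounting is adjusted.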
Figure 24: The local coordinate system of the BNO055 sensor where the short edge closest
to the dot is the top edge of the sensor.
Figure 25: The local coordinate system of an armature node where the local coordinate
system is shown on the armature head while the world coordinate system is shown at its
root. The world coordinate system is as follows: red is the x-axis, green is the y-axis, and
blue is the z axis.
8.5 Exporting Options
To export both the model and its rig, the model has to be exported as a COLLADA file
(.dae). Blender supports exporting to the COLLADA format, allowing the model to be
integrated into the Xcode project.
9 Graphical Rendering Environments
9.0.1 OpenGL
The initial proposal required that the system include a way to display a 3D skeleton that
models a user's movements. Before a skeleton can be created, the application must set up
an environment that supports the OpenGL application program interface (API). The
OpenGL libraries were included in the design plan to aid in handling all 3D graphical
tasks on the graphics processing unit of the iOS devices.
OpenGL allows for 3D rendering within the application by manipulating a 3D model that
is imported into the iOS application project. Manipulation is done through
programmatically prompted transformations of the object such as translation, rotation,
scaling, shearing, and reflection. Transformations are also applied to the object's
coordinate system to fit the model within the appropriate iOS device's screen. Finally, the
transformed model is stored in the iOS device's frame buffer, which is rasterized, or drawn,
onto the screen. A graphical representation of how OpenGL works is shown in Figure 26.
Figure 26: The viewing pipeline highlighting the steps OpenGL performs during the
graphical rendering of an object to a screen. Note the different coordinate transformations
that take place prior to rasterization [13].
Although the OpenGL libraries allow 3D models to be displayed through the graphics unit
of a given iOS device, creating the 3D human model itself cannot be done
programmatically.
9.1 iOS Scene Kit API
Scene Kit was the appropriate design choice for the graphical rendering environment, as it
meets all of our design needs. Our application facilitates the modelling of human
movement through Scene Kit's physics engine, which can change the orientation of an
imported 3D model through the use of vector data. An example is shown in Figure 27,
where an arbitrary rotational force is applied to a basic primitive's coordinate axes through
Scene Kit's physics engine. Using the natively integrated Scene Kit libraries, the
application can be programmed to perform as expected.
Figure 27: Using Scene Kit’s physics engine to apply a rotational force to a node’s coordinate
axes. [12]
9.2 Scene Kit within Xcode
Xcode is Apple's integrated development environment (IDE) used for software
development for macOS, iOS, and the other operating systems exclusive to Apple devices,
using either the Objective-C or the Swift programming language. The project uses Xcode
and Swift to develop the application for the iPhone 6 and iPad Mini 2. Xcode also allows
the use of Scene Kit to set up the graphical environment where the model is displayed,
manipulated, and interacted with.
The three-dimensional scene is set up during loading of the rendering environment. The
snippet of code that sets up the scene is included in Appendix D, with comments explaining
the flow of the code. The integration of the three-dimensional model is covered in the
following section.
10 iOS Application
10.1 STiOS
The application has been given the name STiOS, an abbreviation for "Skeleton Tracker for
iOS". This name was chosen because it describes the function of the application: it tracks
the motion of a skeleton and displays it on an iOS device. The associated icon is included
in Figure 28 and was created by Kristian Torres.
Figure 28: The icon that is shown on the iOS springboard (or home screen).
10.2 Overview of the app layout
The initial view within the application is a sign-in screen, seen on the left of Figure 29.
This screen allows a clinician or administrator to sign in to an account by entering the
email address associated with the account and the account password. Users using the
application for the first time must first create an account. This can be done by touching
the "Create an Account" text to go to the registration screen, on the right of Figure 29.
The registration screen has three fields to fill in: "Email Address", "Password", and
"Confirm Password". The "Password" and "Confirm Password" fields must be identical or
the account will not be created. Email addresses can be tagged with an identification that
distinguishes between a professional who needs access to all patients under their care and
a user who only gets access to their personal information.
Figure 29: STiOS Login Page (left). STiOS Registration Page (right)
These authentication features were implemented within our application by embedding
Firebase, a Google-owned platform that provides mobile and web applications with several
powerful features, such as data storage, login authentication, and security features. We
chose Firebase for authentication within our application for a few reasons. First, it cut
down on the time spent implementing authentication features such as password
constraints. Secondly, Firebase authentication provides a level of security that would be
very difficult to implement ourselves; protecting confidential information is of utmost
concern in sensitive fields and professions such as health care.
The screen that follows successful authentication is the patient list, shown in the left
portion of Figure 30. For this we once again used a Firebase service, this time Firebase's
cloud-based real-time database. On this screen the user can add patients using the "+"
button at the top of the screen. This brings up the new patient screen (the right portion of
Figure 30). The new patient screen contains the fields "First Name", "Last Name", "Lower
Arm Length", "Upper Arm Length", and "Shoulder Width". These fields are included to
allow for customizability and personalization of the 3D model during use, for a more
relatable experience. For now, one generic model is included, as the application is still a
prototype.

When a patient is added, an entry is added to the remote database and to the patient
list screen's table view. On re-entry of the application with the same log-in credentials, the
patient list entries are retrieved from the remote database and the patient list is populated.
Upon choosing a patient from the list, the application stores the patient data in a structure
called Patient and proceeds to the main session tabbed view, the middle figure in Figure 31.
Figure 30: Application patient list page (left). Application measurement entry page (right).
The transition from the left screen to the right screen is made with the button in the top
right corner of the left figure.
Figure 31: The main session screens of the STiOS application include: Exercise Selection
Screen (left), Graphical Rendering Screen (middle), and Exercise Completion Display Screen
(right). Each adjacent screen can be accessed by pressing its associated tab space at the
bottom of the screen.
10.2.1 Integration of the Three Dimensional Model into the Application
Integrating the model into the Xcode project was possible because the model was exported
as a COLLADA (.dae) file. Once the model's .dae file is added to the project, the program
simply assigns the scene variable to the model file (see line 2 in Appendix D.2). When the
application is built, the model can be displayed in the rendering view, shown in Figure 32.
Tapping the model zeros it to the position in Figure 32. The manipulation is shown in
Appendix D.3; for example, line 41 is the portion of the code that manipulates the
orientation with the data received from the Bluno board.
Figure 32: The 3D model as shown in the STiOS application on an iPad Mini 2 in its initial
position (left) and zeroed out position (right)
This function, updateBodyJoints, is called whenever the iDevice receives new information
from the Bluno. As a result, the model constantly updates the nodes that are defined.
This is possible because exporting the model and rig into the COLLADA format exposes,
in a hierarchy, the nodes that were created in the Blender application, as shown in
Figure 33.
Figure 33: Importing the model into the Scene Kit Environment as a collada file exposes
the model rig, allowing developers to see the armatures that are hidden within the model.
10.2.2 Model Manipulation
With the model integrated into the application, real-time tracking needed to be verified
with the Bluno and the BNO055 sensors. This was achieved through the updateBodyJoints
method being called whenever the iDevice receives new data from the Bluno. The rate of
data collection was increased by raising the polling rate of the sensors, which provided a
constant flow of complete data to the iDevice.

Figure 34 shows the result of orientation manipulation using the data received from the
Bluno. The constant flow of complete data, which removed the latency caused by waiting
for complete data to arrive, together with error checking that prevents empty values from
being passed to the model, allows for near real-time tracking. The framework also displays
a percentage completion of the exercises in a one-to-one ratio with the model's orientation
changes.
Figure 34: The 3D model as shown in the STiOS application on an iPad Mini 2.
11 Conclusion
In conclusion, successful planning and appropriate design changes allowed the project to be
completed on time and within the allocated budget (Appendix B). The implementation of our
wireless exercise monitoring system has met the following specifications, as presented in
the initial design:
1. The use of IMUs to collect inertial data of a user’s motion.
2. Successful transmission of IMU data via Bluetooth 4.0.
3. Translation of sensor data into usable values through the software framework.
4. The creation of an interactive iOS application.
5. Calculation and display of exercise completion percent in the iOS application.
6. Wireless manipulation of a 3D model in the iOS application to simulate exercise motion.
The accuracy of the body joint angles was verified using photographs and video, and is
consistent with values reported by authors of similar research literature [9]. Our design's
hardware solution and robust 3D model also lend themselves to future full-body scalability,
which may be accomplished through the implementation of additional sensors and the
extension of the joint angle algorithms.
References
[1] Kevin Townsend. "Adafruit Learning System", 2017. [Online].
Available: https://learn.adafruit.com/product/assets/24585. [Accessed 17-03-2017].
[2] Bosch Sensortec. (2016, June). "BNO055 Intelligent 9-axis absolute orientation
sensor" [Online].
Available: https://cdn-learn.adafruit.com/assets/assets/000/036/832/original/BST_BNO055_DS000_14.pdf. [Accessed 17-03-2017].
[3] Lady Ada. "Adafruit Learning System", 2017. [Online].
Available: https://learn.adafruit.com/product/assets/27696. [Accessed 17-03-2017].
[4] DFRobot. (2017, Mar.). "Bluno SKU:DFR0267" [Online].
Available: https://www.dfrobot.com/wiki/index.php/Bluno_SKU:DFR0267. [Accessed 17-03-2017].
[5] Texas Instruments. (2015, Jan.). "TCA9548A Low-Voltage 8-Channel I2C Switch With
Reset" [Online].
Available: https://cdn-shop.adafruit.com/datasheets/tca9548a.pdf. [Accessed 17-03-2017].
[6] Yan-Bin Jia, "Rotations in Space", Computer Science 477/577 Notes, 2016.
[7] CH Robotics, "Understanding Euler Angles", CH Robotics LLC. [Online].
Available: http://www.chrobotics.com/library/understanding-euler-angles. [Accessed 17-03-2017].
[8] Jack Kuipers, "Quaternions and Rotation Sequences", Geometry, Integrability and
Quantization, Coral Press, 2000, pp. 127-143.
[9] Xi Chen, "Human Motion Analysis with Wearable Inertial Sensors", Ph.D. diss.,
University of Tennessee, 2013. [Online].
Available: http://trace.tennessee.edu/utk_graddiss/2407. [Accessed 17-03-2017].
[10] Kevin Townsend. "Introduction to Bluetooth Low Energy". [Online].
Available: https://learn.adafruit.com/introduction-to-bluetooth-low-energy/gatt. [Accessed 17-03-2017].
[11] Bluetooth. "What is Bluetooth?". [Online].
Available: http://www.bluetooth.com/what-is-bluetooth-technology/how-it-works/low-energy. [Accessed 17-03-2017].
[12] Chris Language. "Scene Kit with Swift Part 3: Physics". [Online].
Available: https://www.raywenderlich.com/128711/scene-kit-tutorial-swift-part-3-physics.
[13] University of Manitoba, COMP 3490 Unit 1 Class Notes, University of Manitoba,
Winnipeg, 2017.
A Body Joint Angle Calculation Results
This appendix contains Matlab-generated graphs of body joint angles during exercises. The
Matlab program uses the IMU quaternion data to generate the body joint angles over five
repetitions of each exercise.
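The core of that computation can be illustrated with a short sketch (shown here in Swift rather than Matlab; the tuple-based quaternion representation is an assumption for illustration). For unit quaternions q1 and q2, the angle of the relative rotation between them is 2·acos(|⟨q1, q2⟩|):

```swift
import Foundation

// Sketch: angle (in radians) between two unit-quaternion orientations,
// each given as (x, y, z, w) components.
func angleBetween(_ q1: (x: Double, y: Double, z: Double, w: Double),
                  _ q2: (x: Double, y: Double, z: Double, w: Double)) -> Double {
    // Inner product of the two quaternions; |dot| handles the double cover
    // (q and -q describe the same rotation).
    let dot = q1.x * q2.x + q1.y * q2.y + q1.z * q2.z + q1.w * q2.w
    // Clamp against floating-point drift outside acos's domain [-1, 1].
    let clamped = max(-1.0, min(1.0, abs(dot)))
    return 2.0 * acos(clamped)
}

// Example: identity orientation vs a 90° rotation about the z-axis.
let identity = (x: 0.0, y: 0.0, z: 0.0, w: 1.0)
let quarterTurn = (x: 0.0, y: 0.0, z: sin(Double.pi / 4), w: cos(Double.pi / 4))
let degrees = angleBetween(identity, quarterTurn) * 180.0 / Double.pi
// degrees ≈ 90
```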
In Figures 35 and 36, the angle between the wrists and elbows changes smoothly throughout
the five exercise repetitions.
Figure 35: Angle between left and right elbows during a Flyes exercise.
Figure 36: Angle between left and right wrists during a Flyes exercise.
In Figures 37 and 38, the elbow angle during an overhead press changes smoothly through
the five repetitions.
Figure 37: Left elbow angle during an overhead press exercise.
Figure 38: Right elbow angle during an overhead press exercise.
In Figures 39 and 40, the elbow angle during a row exercise changes for one arm only, as
this is a one-armed exercise. The angle for the left (exercise-performing) arm changes
smoothly through the five repetitions.
Figure 39: Left elbow angle during a one-arm row exercise.
Figure 40: Right elbow angle during a one-arm row exercise.
In Figures 41 and 42, the elbow angle during a triceps extension exercise changes for one
arm only, as this is a one-armed exercise. The angle for the left (exercise-performing)
arm changes smoothly through the five repetitions.
Figure 41: Left elbow angle during a triceps extension exercise.
Figure 42: Right elbow angle during a triceps extension exercise.
B Project Budget Summary
A budget of $477.31 CAD was originally proposed. Below is an updated breakdown of complete
project costs, totaling $319.46 CAD. Note the reduced number of sensors purchased.

Table 4: Project Budget Summary.

| Item | Description | Quantity | Cost (CAD) |
| 1 | Adafruit 9-DOF Absolute Orientation IMU Fusion Breakout - BNO055 | 4 | 198.48 |
| 2 | Adafruit TCA9548A I2C Multiplexer | 1 | 9.87 |
| 3 | DFRobot Bluno - An Arduino Bluetooth 4.0 (BLE) Board | 1 | 58.85 |
| 4 | 7.4 V 2500 mAh Lithium Polymer Battery | 1 | 25.41 |
| 5 | 22 AWG Solid-Core Hookup Wire | 80 ft | 11.47 |
| 6 | Velcro/Adhesive/Fabric | - | 15.38 |
| Total | | | 319.46 |

*All prices include shipping and taxes.
Table 5: Budget from External Funds.

| Item | Description | Quantity | Cost (CAD) |
| 1 | iPhone 6 64 GB | 1 | 628.17 |
| 2 | iPad Mini 2 | 1 | 359.62 |
| 3 | Apple Developer Account | 1 | 128.70 |

*Table 5 represents zero incurred project costs, as the equipment was donated for the
duration of the project.
C Schematics
Figure 43: TCA9548 I2C multiplexer schematic.
Figure 44: BNO055 inertial measurement unit schematic.
Figure 45: Arduino Bluno schematic.
D Software Repository
D.1 Arduino Code
Figure 46: Bluno Arduino code.
D.2 Scene Kit Setup
Listing 1: The Swift code that sets up the rendering environment for the three-dimensional
model.

1  // Create a new scene variable that will hold the model file.
2  let scene = SCNScene(named: "art.scnassets/<INSERT MODEL FILENAME HERE>")!
3
4  class SceneKitViewController: UIViewController, SkeletonDelegate {
5
6      override func viewDidLoad()
7      {
8          // Set up the tab bar delegate to communicate with adjacent tab bar controllers.
9          if let tbs = self.tabBarController as? SessionTabBarController
10         {
11             tbs.userSkeleton.skeletonDelegate = self
12
13             // "Initial Position" of the model is set up here:
14             <INSERT MODEL POSITIONS AND ORIENTATION INFO HERE>
15         }
16
17         super.viewDidLoad() // Calls the standard viewDidLoad() implementation.
18
19         // Camera creation and setup:
20         let cameraNode = SCNNode()
21         cameraNode.camera = SCNCamera()
22         scene.rootNode.addChildNode(cameraNode)
23         cameraNode.position = SCNVector3(x: 0, y: 0, z: 15)
24
25         // Light creation and setup:
26         let lightNode = SCNNode()
27         lightNode.light = SCNLight()
28         lightNode.light!.type = .omni
29         lightNode.position = SCNVector3(x: 0, y: 10, z: 10)
30         scene.rootNode.addChildNode(lightNode)
31         let ambientLightNode = SCNNode()
32         ambientLightNode.light = SCNLight()
33         ambientLightNode.light!.type = .ambient
34         ambientLightNode.light!.color = UIColor.darkGray
35         scene.rootNode.addChildNode(ambientLightNode)
36
37         // Retrieve the SCNView:
38         let scnView = self.view as! SCNView
39
40         // Put the scene into view and set up the scene:
41         scnView.scene = scene
42         scnView.allowsCameraControl = true
43         scnView.showsStatistics = true
44         scnView.backgroundColor = UIColor.black
45
46         // Add a tap gesture recognizer -> this will be used for "zeroing" out the model.
47         let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
48         scnView.addGestureRecognizer(tapGesture)
49     } // <End of viewDidLoad()>
D.3 Updating the Model
Listing 2: The Swift code that integrates the 3D model and manipulates its orientation.

1  // Set up the arm nodes, under the declaration of the scene.
2  let RightBicep = scene.rootNode.childNode(withName: "upperArm.R", recursively: true)!
3  let RightForeArm = scene.rootNode.childNode(withName: "foreArm.R", recursively: true)!
4  let LeftBicep = scene.rootNode.childNode(withName: "upperArm.L", recursively: true)!
5  let LeftForeArm = scene.rootNode.childNode(withName: "foreArm.L", recursively: true)!
6
7  class SceneKitViewController: UIViewController, SkeletonDelegate {
8
9      override func viewDidLoad()
10     {
11         if let tbs = self.tabBarController as? SessionTabBarController
12         {
13             print("setting delegate here")
14             tbs.userSkeleton.skeletonDelegate = self
15
16             // "Initial Position" of the model is set up here:
17             RightBicep.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
18             RightForeArm.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
19             LeftBicep.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
20             LeftForeArm.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
21         }
22
23         super.viewDidLoad()
24
25         // ... The rest of the code is omitted in this section; it is shown in Listing 1.
26
27     } // End of viewDidLoad()
28
29     // Called whenever the application receives a new set of data.
30     public func updateBodyJoints(sensorData: [String: BodyJoint])
31     {
32         // Set up references to the framework dictionary that stores the received values:
33         let rightForearm: BodyJoint = sensorData["rightForearm"]!
34         let rightBicep: BodyJoint = sensorData["rightBicep"]!
35         let leftForearm: BodyJoint = sensorData["leftForearm"]!
36         let leftBicep: BodyJoint = sensorData["leftBicep"]!
37
38         // Change the orientation of the bones.
39
40         // Right Arm
41         RightBicep.orientation = SCNVector4(x: rightBicep.orientation.x_, y: rightBicep.orientation.y_, z: rightBicep.orientation.z_, w: rightBicep.orientation.w_)
42         RightForeArm.orientation = SCNVector4(x: rightForearm.orientation.x_, y: rightForearm.orientation.y_, z: rightForearm.orientation.z_, w: rightForearm.orientation.w_)
43
44         // Left Arm
45         LeftBicep.orientation = SCNVector4(x: leftBicep.orientation.x_, y: leftBicep.orientation.y_, z: leftBicep.orientation.z_, w: leftBicep.orientation.w_)
46         LeftForeArm.orientation = SCNVector4(x: leftForearm.orientation.x_, y: leftForearm.orientation.y_, z: leftForearm.orientation.z_, w: leftForearm.orientation.w_)
47     }
48
49     // Handles a tap gesture. This zeros out the model and returns it to its original position.
50     func handleTap(_ gestureRecognize: UIGestureRecognizer)
51     {
52         // Retrieve the SCNView.
53         let scnView = self.view as! SCNView
54
55         RightBicep.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
56         RightForeArm.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
57         LeftBicep.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
58         LeftForeArm.orientation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
59
60         SCNTransaction.commit()
61     }
D.4 GitHub Repository Links
• STiOS: https://github.com/kagibson/STiOS
• Exercise Motion Tracker: https://github.com/kagibson/exerciseMotionTracker