Advanced information processing of MEMS motion sensors for gesture interaction

Sensor-based gesture interaction technology has been widely adopted in consumer electronics. Nevertheless, the bias, drift, and noise present in sensor signals are difficult to eliminate, and accurate movement trajectory information is still needed to achieve flexible interaction applications. This paper presents micro-electro-mechanical system (MEMS) motion sensor information processing algorithms designed for a gesture interaction system that integrates multiple low-cost MEMS motion sensors with ZigBee wireless technology to support embodied communication with machines. The sensor signal processing addresses noise removal, signal smoothing, separation of the gravity component, coordinate system conversion, and position information retrieval. The attitude information, an important movement parameter required for position estimation, is calculated with a quaternion-based extended Kalman filter (EKF). The effectiveness of the movement information retrieval of this gesture interface is verified by experiments and test analysis in both static and moving cases. Finally, related applications of the described sensor information processing are discussed.


Introduction
Recently, intelligent platforms such as tablets, personal computers, personal digital assistants, gaming systems, and smartphones have become ever more popular consumer electronics. There is no doubt that this trend is partly driven by user interaction and natural interaction technologies, which help machines better understand human expressions and provide machines with more sensing capabilities to support humans' spontaneous ways of exploring the real world (Vali, 2008). Among these senses, involving gesture (Lian et al., 2014), voice (Kim et al., 2013), movement, hearing, vision, and so on, recognition of human gestures has become the focus of new-generation interface technology for improving communication between people and machines and for realizing an intuitive and natural interface (Kong et al., 2013). The enhancement of user experience via a gesture-based interface could facilitate real-time interaction between users and electronic equipment, as users no longer need any physical connection to the facility being controlled (Premaratne et al., 2012).
To capture human gestures, a variety of sensors serving as the medium for measuring the required signals, for instance, webcams (Premaratne and Nguyen, 2007), Kinect (Osunkoya and Chern, 2013), proximity sensors (Um and Hung, 2006), and touch sensors (Zeng et al., 2009; Aguilar and Meijer, 2002; Choi et al., 2007), have been designed and integrated for gesture interaction in consumer electronic devices. Many application paradigms, however, require surfaces, cameras, sensor bars, or tethering, which restricts these applications through additional facility requirements, limited observation and interaction coverage, and more complicated algorithms. Compared with the sensor systems mentioned above, low-cost inertial sensors based on micro-electro-mechanical system (MEMS) technology, fabricated with microfabrication techniques, enhance the motion sensing and tracking abilities of gesture user interfaces thanks to richer real-time motion information, high performance, ruggedness, and low power consumption (Shaeffer, 2013). Nevertheless, directly integrating inertial measurements to acquire motion information is problematic because of the bias, drift, and noise in the signals (Shaeffer, 2013; Yi et al., 2009). Most research on motion-based gesture interaction with MEMS sensors focuses on gesture recognition targets that are less affected by noise, bias, and drift. Among these gesture identification systems, hidden Markov models (HMMs), artificial neural networks (ANNs), and discrete cosine transforms (DCTs), the latter used to reduce the dimension of the input gestures, have been integrated with good recognition results (Lee-Cosio et al., 2012; Zhou et al., 2009). In addition, the Wiimote, which uses accelerometers and Bluetooth technology together with the Wii sensor bar, has generated great interest for gesture interaction applications. The usability of this device, however, is still diminished by the fact that the user has to hold the Wiimote in his or her
hand (Holzinger et al., 2010). Flexible gesture interaction applications and systems with sufficient dexterity still need accurate motion and trajectory information (Muller et al., 2008; Lin and Ding, 2013). Therefore, this research applies multi-axis sensors, with the powerful microcontroller support provided by MEMS and silicon manufacturers, to gesture interaction. The challenging tasks of this research involve accurate motion information extraction and motion state estimation. Our MEMS-based gesture interaction tool provides flexible, hands-free usage.
In this work, the designed and developed wearable gesture interaction system integrates a tri-axis accelerometer, tri-axis gyroscopes, and a microcontroller. ZigBee wireless technology, with its broad coverage range, is included on the board for data transmission between people and machines to realize unrestricted and unobstructed interaction. Moreover, the proposed sensor signal processing removes the gravity influence and decreases drift and noise. With the assistance of coordinate system conversion and integration, the designed attitude estimation with an extended Kalman filter (EKF) algorithm achieves motion trajectory retrieval.
The remainder of this paper is organized as follows. Section 2 reviews the related work of this research. Section 3 describes the MEMS inertial measurement unit (IMU) hardware system design scheme. MEMS motion sensor signal processing technologies and information fusion algorithms for the developed MEMS gesture interaction tool are illustrated in Sects. 4 and 5, respectively. Subsequently, the experiments, results, and analysis are presented in Sect. 6. Finally, the conclusions of this study are summarized in Sect. 7.

Related work
Regarding gesture interaction design and related activity research using sensor technology, there has been some practice with popular commercial MEMS IMUs that combine accelerometers and gyroscopes, and sometimes magnetometers, to report orientation, velocity, and gravitational forces.
Some wearable computer mouse interfaces combining IMUs and electromyography (EMG) sensors, which examine muscle signals, have been proposed (Forbes, 2013; Xiong et al., 2011). The wireless IMU system Xsens MTx and EMG sensors are used to control cursor movement and mouse clicking via angular velocity and muscle signals with linear discriminant classifiers.
Embodied interaction has been studied by providing users with 3-D avatar representation, using Vicon camera tracking and Xsens MVN, a full-body, camera-less inertial motion capture (MoCap) solution (Dodds et al., 2011). Supported by a head-mounted-display immersive virtual environment (VE), the results show that participants moved more and performed better when both avatars were self-animated than when both were static.
Motion analysis and comparison across human-computer interaction activities for the purpose of PRedicting Occupational biomechanics in OFfice workers (PROOF) has been implemented (Bruno Graza et al., 2012). The parameters, covering field-measured forces, muscle efforts, postures, velocities, and accelerations, are measured by a force-sensing device, EMG, and a G-Link from MicroStrain.
These research works indicate that MEMS-based motion sensor toolkits are well suited to embodied interaction applications. Nevertheless, their signal processing techniques concentrate only on preprocessing of the measured sensor data, which is not enough for intuitive, fast, accurate, and convenient gesture interface design and usage.

MEMS gesture interaction tool
The designed and developed human-body movement-tracking IMU for gesture interaction integrates MEMS motion sensors, including a tri-axis accelerometer ADXL335, which measures acceleration in the SI unit of metres per second squared (m s−2); a dual-axis pitch-and-roll MEMS gyroscope LPR530AL, which captures angular rate in degrees per second (dps); and a yaw MEMS gyroscope LY530ALH. To realize wireless data transmission and system control of this board, the XBee series 2 OEM RF module and a 32-bit flash microcontroller, PIC32MX4XXH, are integrated into the system as well.
Analog Devices' ADXL335 measures acceleration with a minimum full-scale range of ±3 g. It can measure the static acceleration of gravity in tilt-sensing applications as well as the dynamic acceleration resulting from motion, shock, or vibration (Analog Devices, 2009). Regarding the gyroscopes, STMicroelectronics (ST)'s LPR530AL provides a full scale of ±300° s−1 and measures the angular rate along the pitch and roll axes. ST's LY530ALH on this board also has a full scale of ±300° s−1, measuring the angular rate along the yaw axis. These two gyroscopes are orthogonally soldered in the sensor system to build a triad supporting a 3 degrees of freedom (DOF) angular rate. The characteristics of these sensors are given in Table 1 (Analog Devices, 2009; STMicroelectronics, 2009a, b). The PIC32MX4XXH is a high-performance 80 MHz MIPS-based 32-bit flash microcontroller, and its operating voltage range is from 2.3 to 3.6 V. All data sensed by the accelerometer and gyroscopes are transmitted wirelessly to the PC by a ZigBee transceiver in the 2.4 GHz band. The system block diagram of this MEMS gesture interaction tool is given in Fig. 1, and schematics are provided in Appendix B.

MEMS IMU information processing
In the IMU, the gyroscopes measure rotation rate. Given the initial orientation and the sensor bias, the true orientation can be calculated. The accelerometer measures the sum of the true acceleration and the gravitational field. This section discusses how signal processing is applied to acquire more accurate signals.

Sensor burst noise reducing
Burst noise and outlier signals often appear in the measurements from the MEMS sensors. In sensor signal processing, it is desirable to perform noise reduction, since the noise would affect attitude state estimation and the integration of the sensor signal.
The median filter is a non-linear digital filtering technique that runs through the signal entry by entry, replacing each entry with the median of neighbouring entries. The mean filter replaces each entry in a signal with the average value of its neighbours, including itself (Ji et al., 2006; Jimenez et al., 2009).
The median filter and the mean filter are combined to reduce burst noise and outlier signals using the formulas below:

x̃_i = median(x_{i−m}, …, x_{i+m}),
y_i = (1 / (2m + 1)) Σ_{j=−m}^{m} x̃_{i+j},

where x and y represent the measured sensor signal and the filtered signal, respectively; i denotes the signal entry; and the data window width 2 × m + 1 is set to 7.
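The combined filter described above can be sketched as follows. This is an illustrative Python sketch, not the authors' Matlab implementation; the window half-width m = 3 matches the paper's window width of 7, and edge padding is a choice made here for the boundary entries.

```python
import numpy as np

def combined_filter(x, m=3):
    """Median pass followed by mean pass, both over a (2*m + 1)-wide window."""
    # median pass: replace each entry with the median of its neighbourhood,
    # which suppresses burst noise and outliers
    xp = np.pad(np.asarray(x, dtype=float), m, mode="edge")
    med = np.array([np.median(xp[i:i + 2 * m + 1]) for i in range(len(x))])
    # mean pass: moving average over the same window width, which smooths
    # the remaining signal
    mp = np.pad(med, m, mode="edge")
    return np.array([mp[i:i + 2 * m + 1].mean() for i in range(len(x))])
```

A single large spike inside an otherwise flat signal is removed entirely by the median pass, since six of the seven window entries are unaffected.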

Gravity influence removal from the accelerometer
The accelerometer always measures the true acceleration of the device (acceleration from its trajectory in space) together with the gravitational acceleration. Without additional information, there is no way of separating these two contributions from the sensor reading alone. Because gravity appears as a common bias in static sensor measurements, a high-pass filter, which passes high-frequency signals but attenuates signals with frequencies below the cut-off frequency, is integrated into the signal processing software to remove the gravity component from the measured static sensor data. The movement sensor signal can then be processed by deducting the gravitational influence measured in the static placement experiment.
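A first-order high-pass filter suffices to illustrate the idea: a constant bias such as gravity is driven to zero while changes in acceleration pass through. This is a minimal sketch; the 0.5 Hz cut-off is an assumed value for illustration, not one stated in the paper.

```python
import numpy as np

def highpass(x, fc=0.5, fs=100.0):
    """First-order high-pass IIR filter: attenuates the constant gravity bias.

    fc: cut-off frequency (Hz, assumed value); fs: sampling rate (Hz).
    """
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * np.pi * fc)
    alpha = rc / (rc + dt)
    y = np.zeros_like(np.asarray(x, dtype=float))
    for i in range(1, len(x)):
        # discrete form of y' = alpha * (y_prev + x_i - x_prev)
        y[i] = alpha * (y[i - 1] + x[i] - x[i - 1])
    return y
```

Feeding the filter a constant 9.81 m s−2 (pure gravity, static placement) yields an all-zero output, which is exactly the gravity-removal behaviour the text describes.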

Coordinate system conversion
To understand why different coordinate systems exist in MEMS sensor signal processing and movement information analysis, the inertial navigation system (INS) and IMU coordinate systems are illustrated below. In the INS coordinate system shown in Fig. 2, the Earth-centred, Earth-fixed (ECEF) inertial frame and the local navigation frame are drawn. The NWU frame is the local navigation coordinate system, whose axes are north (N), west (W), and up (U). The XYZ frame in Fig. 2 is the ECEF inertial frame.
The IMU diagram in Fig. 3 illustrates that the XYZ system is the body frame, which is aligned with the axes of the IMU. The centre of this frame is located at the origin of the navigation frame. For movement information processing and retrieval, the conversion between the body frame and the navigation frame needs to be solved; the method is given in Appendix A.
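The body-to-navigation conversion deferred to Appendix A amounts to rotating body-frame vectors by the attitude quaternion. A minimal illustrative sketch follows; the scalar-first quaternion convention [w, x, y, z] is an assumption made here, not taken from the paper.

```python
import numpy as np

def body_to_nav(v_body, q):
    """Rotate a body-frame vector into the navigation frame using the unit
    quaternion q = [w, x, y, z] (scalar-first convention, assumed here)."""
    w, x, y, z = q
    # rotation matrix equivalent of v_n = q (x) v_b (x) q*
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    return R @ np.asarray(v_body, dtype=float)
```

For example, a 90° rotation about the z axis maps the body x axis onto the navigation y axis.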

Speed and position calculation
To acquire the trajectory information of a movement, an integration algorithm is designed in this system. The position is the result of double integration of the acceleration measured by the accelerometer. To decrease the integration error, the Runge-Kutta algorithm, which judiciously uses slope information at more than one point to extrapolate the solution to the next time step, is implemented for speed and position estimation.
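The double-integration step can be sketched as follows. The trapezoidal rule is used here as a simple stand-in for the higher-order Runge-Kutta scheme mentioned in the text; it likewise uses the slope at two points per step, and zero initial velocity and position are assumed.

```python
import numpy as np

def integrate_motion(acc, dt):
    """Double integration of acceleration samples into velocity and position.

    acc: acceleration samples (m/s^2); dt: sampling interval (s).
    Trapezoidal rule (two-point slope averaging) per step.
    """
    acc = np.asarray(acc, dtype=float)
    # velocity: cumulative trapezoidal integral of acceleration
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) * dt / 2.0)))
    # position: cumulative trapezoidal integral of velocity
    pos = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) * dt / 2.0)))
    return vel, pos
```

A constant 1 m s−2 acceleration over 1 s yields a final velocity of 1 m s−1 and a displacement of 0.5 m, matching the closed-form result.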

MEMS IMU information fusion
As individual sensors, accelerometers, gyroscopes, and magnetometers can be quite useful on their own if the objective is a simple motion feature. However, individual MEMS sensors are often not sufficient when the goal is a more complex motion feature, such as attitude computation or trajectory estimation. State estimation technology involving data fusion of multiple sensors can be used for orientation calibration problems. In our system, information fusion and orientation state estimation are used for attitude acquisition and angle estimation, which are required in the coordinate system conversion for trajectory retrieval. For this orientation state estimation from gyroscope and accelerometer measurements, a quaternion representation is used in combination with the EKF algorithm. Based on the developed IMU hardware, which uses a MEMS tri-axis accelerometer, a dual-axis pitch-and-roll gyroscope, and a yaw gyroscope, this section introduces a simplified EKF design for a 2 DOF tracking application, considering the real-time requirements of the gesture interaction application. A quaternion expression of the orientation estimation problem and the sensor model formulas needed to implement the EKF are explained. The EKF design and formulas are provided, and the acquisition of Euler angles from the quaternion estimated by the EKF is presented.

Quaternion and attitude representation and determination
For the attitude representation problem, the relationship between a column vector x(t), composed of functions of time, expressed in the navigation frame N and in the body frame B can be referenced (Sabatini, 2006):

x^N = q ⊗ x^B ⊗ q*,

where q is the unit attitude quaternion and q* its conjugate. For simplicity of notation, the argument t is omitted.
The evolution of the attitude quaternion over time may be expressed by the vector differential equation below (Sabatini, 2006):

q̇(t) = [ω] q(t).   (5)

In Eq. (5), [ω] = (1/2) Ω(ω), where ω is the projection of the angular velocity of the movement onto the body frame and ω = [p, q, r]^T. Here, p is the body angular velocity around the x axis, q is the body angular velocity around the y axis, and r is the body angular velocity around the z axis. In addition, in Eq. (6),

Ω(ω) =
[ 0   −p  −q  −r ]
[ p    0   r  −q ]
[ q   −r   0   p ]
[ r    q  −p   0 ].   (6)

The related discrete-time model of Eq. (5) is

q_{k+1} = exp([ω] T_s) q_k,   (7)

where T_s is the system sampling interval.
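The discrete-time propagation of the attitude quaternion can be sketched as below. The matrix exponential is truncated to first order, which is the usual approximation for small T_s ω; the scalar-first quaternion layout is an assumption made here for illustration.

```python
import numpy as np

def omega_matrix(w):
    """Builds [w] = (1/2) * Omega(w) from the body rates w = (p, q, r)."""
    p, q, r = w
    return 0.5 * np.array([[0.0, -p,   -q,   -r],
                           [p,    0.0,  r,   -q],
                           [q,   -r,    0.0,  p],
                           [r,    q,   -p,    0.0]])

def propagate(quat, w, Ts):
    """One discrete step q_{k+1} = exp([w] Ts) q_k, truncated to first order."""
    q_next = (np.eye(4) + omega_matrix(w) * Ts) @ quat
    return q_next / np.linalg.norm(q_next)  # renormalize to a unit quaternion
```

Renormalization after each step keeps the quaternion on the unit sphere despite the truncation error.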

Sensor model
Before designing an EKF to estimate the orientation, the sensor model must be developed and constructed. The gyroscope and the accelerometer are sensors with mutually perpendicular sensitivity axes. To simplify the model construction, the sensor models of the gyroscopes and accelerometer, involving the angular velocity ω and the total acceleration a, and ignoring cross-axis sensitivity, cross-coupling, and misalignment, are expressed as

ω_m = K_g ω + b_g + v_g,
a_m = K_a a + b_a + v_a.   (9)

In Eq. (9), K_g and K_a are the scale factor matrices (the identity matrix I in the ideal case); b_g and b_a are the bias vectors (null in the ideal case); and v_g and v_a are assumed to be uncorrelated white Gaussian measurement noise with zero mean and covariance matrices Σ_g = σ_g² I and Σ_a = σ_a² I. The bias and scale factor of gyroscopes are influenced by the environment, such as the ambient temperature, while the temperature coefficients of accelerometers have relatively lower quantitative relevance (Abbott and Powell, 1999). Furthermore, the scale factor drifts of these sensors affect the measurement accuracy less than the bias drifts (Foxlin, 2002). Therefore, the scale factor and bias error of the accelerometer are assumed constant in Eq. (9). Additionally, temperature variation is the most important factor for bias drift. Nevertheless, the bias tends to change very slowly after power is applied to the gyroscopes (Abbott and Powell, 1999) and after thermal stabilization for a few minutes. The scale factor and bias error of the gyroscopes are therefore assumed constant in Eq. (9) as well.
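As an illustration of the gyroscope half of Eq. (9), a synthetic measurement can be generated as follows. This is illustrative Python, not part of the paper's pipeline; the function name and the fixed RNG seed are choices made here.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (a choice here)

def gyro_measurement(omega, bias, sigma_g, K=None):
    """Synthetic gyroscope reading per the model w_m = K_g * w + b_g + v_g.

    omega: true angular rate; bias: b_g; sigma_g: noise std; K: scale factor
    matrix (identity in the ideal case).
    """
    K = np.eye(3) if K is None else K
    noise = rng.normal(0.0, sigma_g, 3)  # white Gaussian v_g, zero mean
    return K @ np.asarray(omega, dtype=float) + np.asarray(bias, dtype=float) + noise
```

Setting the noise standard deviation to zero recovers the deterministic part of the model, K_g ω + b_g.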

EKF for attitude estimation
Considering the non-ideal and non-linear properties of the environment and the system, EKF theory, which solves non-linear problems by linearizing about the estimate of the current mean and covariance, is applied. In the EKF algorithm for estimating the orientation of the movement, the state vector is designed to include the rotation quaternion and the tri-axis accelerometer bias vector, x_k = [q_k^T, b_{a,k}^T]^T. The state equation is

x_{k+1} = Φ_k x_k + w_k,   (10)

where Φ_k = diag(exp([ω_k] T_s), I) and w_k = [w_k^{q T}, w_k^{a T}]^T. In Eq. (10), w_k^a is a zero-mean white noise process with covariance matrix Σ_k^a = T_s σ_w² I. The gyro measurement noise vector v_k^g is assumed small enough that a first-order approximation of the noisy transition matrix is possible. It is assumed that w_k^a and w_k^q are not correlated with each other. The process noise covariance matrix Q_k then follows the expression

Q_k = diag(Q_k^q, Σ_k^a).

The measurement vector is the accelerometer measurement vector:

z_{k+1} = a_{k+1} = h(x_{k+1}) + v_{k+1}^a.

It is assumed that the accelerometer measurement noise v_{k+1}^a is a zero-mean white noise process with covariance matrix R_a(k + 1) = σ_a² I, which then gives the covariance matrix of the measurement model. In an EKF, the state transition and observation models need not be linear functions of the state; they may be replaced by differentiable functions. To implement the EKF, the Jacobian matrix

H_{k+1} = ∂h/∂x evaluated at x = x̂_{k+1|k}

is calculated. The EKF calculation then follows the standard predict-update procedure (Maybeck, 1982).
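A heavily simplified predict-update cycle is sketched below. This is not the paper's implementation: the accelerometer bias state is omitted so the state is the quaternion alone, the measurement Jacobian is taken numerically rather than analytically, and the scalar-first quaternion convention and gravity constant are assumptions made here.

```python
import numpy as np

G = 9.81  # gravity magnitude (m/s^2), assumed

def Omega(w):
    p, q, r = w
    return np.array([[0.0, -p,   -q,   -r],
                     [p,    0.0,  r,   -q],
                     [q,   -r,    0.0,  p],
                     [r,    q,   -p,    0.0]])

def R_bn(q):
    """Body-to-navigation rotation matrix for a scalar-first unit quaternion."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def h(q):
    """Measurement model: gravity vector mapped into the body frame."""
    return R_bn(q).T @ np.array([0.0, 0.0, G])

def ekf_step(q, P, gyro, acc, Ts, Q, R):
    # --- predict: first-order discretisation of q' = 0.5 * Omega(w) * q ---
    F = np.eye(4) + 0.5 * Omega(gyro) * Ts
    q = F @ q
    q /= np.linalg.norm(q)
    P = F @ P @ F.T + Q
    # --- update: numeric Jacobian of the gravity measurement model ---
    H = np.zeros((3, 4))
    eps = 1e-6
    for j in range(4):
        dq = q.copy()
        dq[j] += eps
        H[:, j] = (h(dq / np.linalg.norm(dq)) - h(q)) / eps
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    q = q + K @ (acc - h(q))   # correct with the accelerometer innovation
    q /= np.linalg.norm(q)
    P = (np.eye(4) - K @ H) @ P
    return q, P
```

On static data (zero rate, accelerometer reading equal to gravity) the innovation vanishes and the estimate stays at the initial attitude, mirroring the static experiment in Sect. 6.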

Attitude acquirement
After the quaternion-based EKF estimates the orientation, the quaternion is converted to Euler angles, since the trajectory retrieval calculation requires them. The conversion formulas may be referenced from Diebel (2006); for example, the pitch angle is

Pitch = asin(2 q_1 q_2 + 2 q_3 q_4).   (22)
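The full quaternion-to-Euler conversion can be sketched as below. The standard aerospace (ZYX) formulas with a scalar-first quaternion are used here; the paper's q_1 … q_4 ordering is not stated explicitly, so this convention is an assumption.

```python
import numpy as np

def quat_to_euler(q):
    """Scalar-first [w, x, y, z] unit quaternion to ZYX Euler angles (rad)."""
    w, x, y, z = q
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # clip guards against domain errors from rounding near +/-90 degrees
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

The identity quaternion maps to zero roll, pitch, and yaw, and a 90° rotation about the x axis yields a roll of π/2.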

Experimental results
To effectively analyse and verify the designed IMU information processing and fusion algorithms, experiments and measurements with this tool, including static placement and movement, are carried out. The sampling rate of the sensor system is set to 100 Hz, because the 50 Hz cut-off frequency is sufficient for the resolution required by our gesture interaction application. As the ZigBee receiver is connected to the serial port, the accelerometer and gyroscope data are captured and collected by the developed PC program, which reads the serial port. All sensor data are processed by the noise-reducing filters. Because of the gravity components in the accelerometer data, a high-pass filter is applied for movement acceleration retrieval. Furthermore, the EKF algorithm estimates the attitude information. Finally, coordinate system conversion and integration are calculated to acquire the actual movement trajectory. The data processing, filters, and algorithms are implemented in Matlab.

Static measurement
Static measurement is performed to collect gyroscope and accelerometer data with the gesture interaction tool in static placement. The measurement period of each experiment is 1 min, and each experiment collects 6000 samples. In Fig. 4, the horizontal axis X is time in seconds. The vertical axis Y is the digital output of the gyroscope data from the analogue-to-digital converter (ADC) in Fig. 4a and b, and the angular rate converted from the ADC digital output, in radians, in Fig. 4c. The data indicate that the noise-reducing processing effectively decreases and removes burst noise and outlier signals and keeps the signal within a valid, compressed amplitude range. Quantification with a scale factor converts the vertical axis from the sensor digital output to a real angular rate in dps; the angular rate ranges from 0.005 to 0.045 dps. The attitude state estimated by the quaternion-based EKF algorithm is displayed in Fig. 5.
In this static experiment, the estimated roll fluctuates from −0.0068 to 0.0087 radians and the pitch from −0.0089 to 0.0046 radians.
Moreover, accelerometer data are analysed in another static horizontal placement measurement. In Fig. 6, the horizontal axis X is the sample number, and the vertical axis is the acceleration measured in the gravity direction in m s−2.
As the data in Fig. 6 show, the gravity component is removed from the original measured sensor signal, and burst noise is also decreased.

Movement measurement
In the movement experiment, the sensor is slowly moved along the y axis by hand. The distance between the starting point and the destination is about 65 cm. The movement experiment lasts 48 s, and 4800 samples are collected. Through the sensor information processing for gesture interaction, the trajectory information is acquired as shown in Fig. 7. The horizontal axis is the sample number; the vertical axis is the estimated position in metres.
For this movement measurement and processing result, the position estimation shows deviations of 0.065 and 0.05 m on the x and y axes, respectively. The performance of the movement information retrieval of this tool is acceptable for movement interaction applications. When the tester moves the body part on which this MEMS gesture toolkit is worn, the movement trajectory can be acquired by the information processing, as in Fig. 8. In Fig. 8, the green oblique line is the designed movement trace, and the black curve is the retrieved trajectory. This developed MEMS-based interface is connected to a virtual Earth navigation application so that gesture interaction can be observed and used visually. Applying these IMU information processing methodologies, the virtual Earth can be navigated and controlled by gesture. When the user rotates the tool about the roll angle, the Earth rotates with a latitude variation; pitch rotation induces a longitude change. The speed of movement controls how fast the Earth moves. With the support of the movement trajectory information, the Earth can be brought to the place the user wants to visit.

Conclusion
In this paper, advanced movement information processing and retrieval algorithms, achieved by signal processing and information fusion techniques and applied in a gesture interface with MEMS and IMU technologies, are explained. The MEMS-based IMU supplies a real-time measured movement signal. Burst noise and outliers are reduced by the combined median and mean filter, and the gravity influence on the sensor signal is separated by a high-pass filter. The attitude information, which contributes to both trajectory acquisition and gesture navigation, is estimated by the implemented quaternion-based EKF algorithm.
With the assistance of coordinate system conversion and integration, the position and trajectory of the movement are obtained, and the small-distance experiment achieves acceptable performance. These positive results are applied in the virtual Earth navigation paradigm, which vividly demonstrates the visual effect of gesture interaction. The MEMS IMU and information processing algorithms are not confined to this gesture interaction system and are applicable in other systems, including motion analysis, medical or physical applications, and entertainment products.

Data availability
The dataset of the movement measurements can be accessed in an open repository via the link below: https://1drv.ms/f/s!AsYkWYuw2L6nt3oIVfMCXQa5Oyrb (Ling, 2016).

Figure 1. Block diagram of the designed MEMS gesture interaction tool.

Figure 4. Static placement measurement results of the gyroscope. (a) Digital output of the ADC for the gyroscope. (b) Data processed by the combined median and mean filter. (c) Converted angular rate for the gyroscope measurement.

Figure 5. Orientation estimation results on roll and pitch using an EKF.

Figure 6. Static horizontal placement measurement for the accelerometer in the gravity direction. (a) Original sensor data. (b) Data processed by the noise-reducing filter and high-pass filter.

Figure 8. Trajectory of movement in space.

Figure B3. Schematic of the microcontroller and the ZigBee module.

Table 1. Specifications and characteristics of the MEMS sensors in the system.

Sensor     Full scale    Sensitivity       Bandwidth                                  Noise density
ADXL335    ±3 g          –                 0.5 to 1600 Hz (X, Y); 0.5 to 550 Hz (Z)   150 µg Hz−1/2 rms (X, Y); 300 µg Hz−1/2 rms (Z)
LPR530AL   ±300° s−1     3.33 mV/(° s−1)   −3 dB up to 140 Hz                         0.035° s−1 Hz−1/2
LY530ALH   ±300° s−1     3.33 mV/(° s−1)   −3 dB up to 140 Hz                         0.035° s−1 Hz−1/2