of 3D Graphics
David Humphreys
Acknowledgements
Thanks go to all those people who were involved with this project, particularly
Dr S. Woolley
Mr D. Checkley
Mr W. Hay
Mr A. Zanatni
Mr A. Yates
Mr S. Greep
My thanks also go to my Parents who have provided encouragement and support whenever I
have needed it.
Table of Contents
1 INTRODUCTION...................................................................................................................................1
1.1 DESIGN BRIEF ...................................................................................................................................1
1.2 JUSTIFICATION ..................................................................................................................................2
1.3 BASIC CONCEPTS..............................................................................................................................3
1.4 REPORT APPROACH ..........................................................................................................................4
2 INITIAL SPECIFICATION....................................................................................................................6
2.1 FUNCTIONAL REQUIREMENTS.............................................................................................................6
2.2 ERGONOMICAL REQUIREMENTS .........................................................................................................6
3 PROJECT OVERVIEW..........................................................................................................................8
3.1 SYSTEM DIAGRAM .............................................................................................................................8
3.2 DESIGN PROCESS .............................................................................................................................9
4 PC INTERFACE ...................................................................................................................................12
4.1 AIM................................................................................................................................................ 12
4.2 SELECTING THE PC INTERFACE ...................................................................................................... 12
4.3 IMPLEMENTING PC INTERFACE USING RS232.................................................................................. 15
5 SENSORS .............................................................................................................................................17
5.1 INTRODUCTION TO INERTIAL SENSING.............................................................................................. 17
5.2 AIM................................................................................................................................................ 19
5.3 SENSOR RESEARCH ....................................................................................................................... 20
5.4 METHODS OF USING INERTIAL SENSORS TO SENSE MOVEMENT IN A 3D ENVIRONMENT ..................... 21
5.4.1 Accelerometers........................................................................................................................ 21
5.4.2 Gyroscopes ............................................................................................................................. 22
5.5 IMPLEMENTING SENSORS ............................................................................................................... 23
5.6 PRELIMINARY TESTING OF SENSORS ............................................................................................... 24
5.7 SUMMARY OF IMPLEMENTING INERTIAL SENSORS ............................................................................ 25
6 MICROCONTROLLER .......................................................................................................................26
6.1 AIM................................................................................................................................................ 26
6.2 HARDWARE SELECTION .................................................................................................................. 26
6.3 IMPLEMENTING MICRO-CONTROLLER TEST CIRCUIT ......................................................................... 27
6.4 FLOW DIAGRAM OF PIC FIRMWARE ................................................................................................. 28
6.5 IMPLEMENTING A/D CONVERSION ................................................................................................... 28
6.6 IMPLEMENTING ASYNCHRONOUS TRANSMISSION.............................................................................. 30
6.7 MISCELLANEOUS IMPLEMENTATION DETAILS .................................................................................... 31
6.7.1 Auto configuration.................................................................................................................... 31
6.7.2 Header bytes ........................................................................................................................... 31
6.8 TESTING AND RESULTS................................................................................................................... 32
7 DATA PROCESSING DESIGN...........................................................................................................33
7.1 AIM:............................................................................................................................................... 33
7.2 DATA PROCESSING METHOD SELECTION ......................................................................................... 33
7.2.1 Direct Mapping ........................................................................................................................ 33
7.2.2 Gesture Recognition................................................................................................................ 35
7.3 DESIGN CONSIDERATIONS .............................................................................................................. 36
7.3.1 Selecting Threshold Levels ..................................................................................................... 36
7.3.2 Normal Operation Mode .......................................................................................................... 37
7.3.3 ‘Auto-damping’ Mode............................................................................................................... 40
7.3.4 User Options............................................................................................................................ 41
7.4 USABILITY FLOW CHART ................................................................................................................. 44
11 RADIO LINK........................................................................................................................................63
12 TESTING ..............................................................................................................................................64
12.1 FUNCTIONALITY TESTING: METHOD ................................................................................................. 64
12.2 FUNCTIONALITY TESTING: RESULTS ................................................................................................ 65
12.3 USABILITY TESTING: METHOD ......................................................................................................... 66
12.4 USABILITY TESTING: RESULTS ........................................................................................................ 68
12.5 TESTING CONCLUSION ................................................................................................................... 70
12.5.1 Functionality ............................................................................................................................ 70
12.5.2 Usability (compared with a mouse) ......................................................................................... 70
12.5.3 Optimised Settings .................................................................................................................. 72
13 PROJECT CONCLUSION ...................................................................................................................73
13.1 MEETING THE SPECIFICATION ......................................................................................................... 73
13.2 DISCUSSION................................................................................................................................... 74
13.3 FURTHER WORK ............................................................................................................................ 78
14 REFERENCES......................................................................................................................................79
List of Figures
List of Abbreviations
‘Direct Mapping’ – When the position of an object on the screen is directly mapped to the
position of a human interface device (often referred to as absolute position tracking).
1 Introduction
1.2 Justification
This product is intended to be of an intuitive design that can be used to manoeuvre a 3D
object around a screen and rotate it. It would be useful when viewing 3D objects, such as the
detailed cuneiform tablets created using a 3D scanner in research carried out at the University
of Birmingham. Currently most users use a mouse to manoeuvre an object on a screen.
However, a mouse is not an intuitive device to use for this application as it only provides 2
degrees of translational freedom whereas an object in 3D space has 6 degrees of freedom.
“The user needs to be able to rotate the tablet in all three axes to be able to see all of the
faces of the tablet. This gives six degrees of freedom; but most computers have only two-
dimensional input devices (mouse, trackball or joystick)… An ideal solution would be a form of
data glove that measures the position of the user’s hand (in all six degrees of freedom) and
renders the image to correspond with the position of the hand,” (Woolley, S.I. 2001 [32]).
The solution will not involve designing a data glove, but rather a sensor that the user holds in
their hand. This makes it easier for the user to switch between devices when working at a
desktop PC with a mouse and keyboard. The proposed device will be used alongside the
mouse, which would be used for pointing tasks in the 3D world.
[Figure: Concept diagram. The user moves the sensor like they want the object to move on the screen; the hand-held sensor detects the movement of the hand; the 3D object moves in conjunction with the user's hand.]
An Inertial Measurement Unit for User Interfaces [2]
Sensors used: 3 Murata Gyrostar angular rate sensors and 3 ADXL202 accelerometers
Technical attributes: connects via RS-232 interface; incorporates Kalman filtering and gesture recognition; RF Monolithics radio transmitter
Other comments: prototype project to prove principles of inertial tracking; prototype cost ~US$300

Application of inertial sensing to handheld terminals [4]
Sensors used: ADXL202 connected to Microchip 16C622 development board
Technical attributes: connects to a Linux PC using a parallel interface
Other comments: report into the development of an inertial navigation system for a configurable phone project

Hybrid Inertial and Vision Tracking for Augmented Reality Registration [19]
Sensors used: accelerometers and angular rate sensors
Technical attributes: detailed mathematical analysis; uses Kalman filtering; vision-based sensing to compensate drift
Other comments: project using the Inertia Cube to demonstrate effectiveness of vision-based correction techniques

Intersense Inertia Cube 2 [7]
Sensors used: 9 discrete sensing elements with advanced Kalman filtering
Technical attributes: RS-232 interface; measures roll, pitch and yaw; 0 - 1200 degrees/second; 180Hz refresh rate
Other comments: complete 3-DOF sensor; accurate to 1 degree at 25 degrees C; approximately 1" cube; costs US$1,695

Logitech 3D mouse and Head Tracker [9]
Sensors used: stationary ultrasonic transmitter; receiver relays back to control unit
Technical attributes: incorporates separate transmitter, control unit and power supply; connects via RS-232 interface
Other comments: designed for high-end workstations; can be incorporated into VR headware; costs US$1,999

Movy - a sourceless and wireless input device for real world interaction [6]
Sensors used: accelerometers and angular rate sensors
Technical attributes: connects via RS232 interface; uses radio link; 50Hz refresh rate
Other comments: currently at prototype stage; MOVY ring uses accelerometers to monitor gestures of the finger

Miracle Mouse [11]
Sensors used: senses position using infra-red transmitters and detector
Technical attributes: connects via USB; powered from USB port; IR transmitter fitted to headgear; IR detector sits on monitor
Other comments: device is intended for disabled users; uses gesture recognition for Windows applications

Gyration Ultramouse [5]
Sensors used: MicroGyro dual-axis gyroscope to sense rotation
Technical attributes: connects via USB; refreshes at 80Hz; NiMH or 3 AAA batteries; 25' radio distance
Other comments: commercial product aimed at business; works on or off the desktop; retails for US$179

Figure 3: Summary of 3D input devices already on the market
2 Initial Specification
Ease of Learning
This is the most ergonomically important factor of the project. The device needs to be
intuitive to use so that users can pick it up and immediately start using it.
Speed
It is important that the user is able to achieve their task quickly. The task might be to rotate an
object on the screen to the desired position.
Accuracy
A mouse has to be a very accurate device, as the user has to position the cursor sometimes
on very small targets. This device must be accurate but not to the extent of a mouse. A mouse
has to point to small targets (such as icons) whereas for manipulating 3D objects targets are
not so small.
Co-ordination
The co-ordination of a device is how well it works as a single unit. Because there will only be
one part that interacts with the user, good co-ordination should be achievable. A device with
multiple controls, such as a steering wheel and pedals, is harder to co-ordinate.
Fatigue
The proposed device will undoubtedly cause the user to become fatigued after prolonged use
because the user has to hold the device in the air. To reduce the fatigue the sensor should be
as light as possible.
3 Project Overview
[Figure: System diagram, showing blocks for Packaging (Section 10), Sensors (Section 5), Microcontroller (Section 6) and Radio Link (Section 11).]
[Figure 5: Design Process (i). Decision diagram covering sensors, data processing, transmission and the PC interface. Recoverable points: the Silicon CRS04 gyros offer lower noise and drift than the Murata parts but were beyond budget; the Murata Gyrostar gyros were chosen as medium cost, 5v supply, readily available, compact, low power and a proven design; gyroscopes plus accelerometers were preferred over an accelerometer-only solution, which requires significant processing to remove the gravity component of acceleration and risks being all-or-nothing, whereas translation can be added once rotation has been achieved with gyroscopes; hardware A/D converters are available on PICs and the M16C, making a software A/D unnecessary; the PIC16C774 (high-end: 5v operation, hardware USART, ten 12-bit A/D channels, £20) was chosen over similar PICs, including the PIC16F877 (flash erasable, hardware USART, eight 10-bit A/D channels, £9), primarily because of the author's familiarity with the family plus its capability if extra sensors/features were required; asynchronous transmission over a single line was chosen over synchronous transmission (higher transfer rate but requires a clock line) and can easily be converted to RS232 voltage levels; for the PC interface, RS-232 on a PIC (low cost, familiarity, departmental support) was chosen over USB via a PIC16C745 or Cypress enCoRe controller, whose HID support proved undocumented or unsupported, requiring a driver to be developed and so losing the plug-and-play advantage USB has over RS-232; 3D software (VRML browsers, Java 3D) was found not to support devices built to the HID specification, so software data processing was considered instead. Continued in Figure 6.]
[Figure 6: Design Process (ii). Decision diagram covering packaging shape and material. Recoverable points: magnetic drift-correction sensing (£40 a pair) would add extra sensors and processing; a sphere was chosen for the packaging shape as ergonomically designed and attractive-looking, despite the space wasted fitting around a cube, while a solid block machined into a cubic frame gives high wastage but a rigid frame; aluminium was chosen for the frame because it is lightweight but strong; the casing is a solid plastic juggling ball [24] cut in half and machined, with acrylic or high-impact polystyrene rejected as expensive, heavy and giving a poor finish after machining.]
4 PC Interface
4.1 Aim
The aim was to design an interface between a microcontroller and the PC. The specification
for the link was:
• The interface should be low cost and as user-friendly as possible (i.e. plug and play)
RS232 and USB were short listed as being suitable for this project. They are both standard
PC interfaces that are low cost.
USB has some advantages and some disadvantages over RS232 that are relevant to this
project. Firstly the disadvantages: USB is not as well supported as RS232 in the Department,
is complex to implement and is more expensive. The advantages are the ability to provide a
five-volt power supply at up to 100mA to the device and a truly plug-and-play interface.
Microsoft describe the benefits of USB as “complete support for Plug and Play, power
management, and ‘hot plugging’ to add or change devices without turning off the PC. USB
provides a fast, low-cost solution that is strongly recommended for gaming devices and
other input controls” [22].
used on the Internet and was also the modelling language of choice for the Cuneiform project
carried out in the Department in 1999 [32]. Research found that, in fact, most mainstream 3D
software (including VRML browsers such as Cosmo [9], Cortona [26] and Contact [8]) does
not provide generic support across the range of HID devices that are described in the
specification.
Peter Sheerin is an industry expert on USB devices [28] who was contacted with reference to
the problems of finding support for unusual HID devices. The enquiring email and his full
response are listed in Appendix C. Sheerin commented in his email on the development of a USB HID
device, “If you continue on that path, that will make the list include your device, the USB
model of the SpaceBall, and future controllers from 3DConnexion…Unfortunately, the only
software I've seen that uses that spec is an internal utility from 3Dconnexion. But I wouldn't
give up quite yet on using that spec, since in the long run, it will result in greater compatibility.”
In Sheerin’s article, written in February 2002, it is stated that [28]:
“Unfortunately, CAD and other 3D-software providers have not adopted the
HID/DirectInput device interface at all, with some exceptions. A few, including
thinkdesign from think3, have eliminated the requirement to load an application-
specific plug-in to use a 3D input controller, but these programs still connect to
only one of the proprietary device drivers (the SpaceMouse, in this case). And the
one 3D viewer (the Cortona VRML browser) that I found with DirectInput-support
doesn't allow all six axes of a device to be used at once, forcing you to switch
viewing modes in the application in order to switch from movement along or about
one axis to another”
Parallel Graphics [26] produce the widely used VRML browser Cortona that was mentioned
in the passage above. Cortona is well supported by Parallel Graphics, which is known to
support HID mice and joysticks. They were contacted and asked whether it was possible to
support a multi-axis HID class device using Cortona. The full response is given in Appendix
C. The extract below states that it is not possible to interface a multi-axis HID device using
Cortona unless the SDK was purchased, which was outside the scope and budget of the
project:
“To provide support for any other input devices in Cortona, an application, which
will handle events of the device and control movements in the Cortona 3D window,
should be developed. Such an application can be created with Cortona Software
Development Kit”
Whilst the support for HID devices was being further researched, a low-speed USB
controller was being developed using a Microchip PIC16C745. Initial research had
suggested that support would be available for such a well-documented specification from the
USB Implementers Forum that seemed to have many advantages. Unfortunately this was not
the case.
Appendix A details the selection of the PIC16C745 microcontroller and implementation as a
joystick. Cortona supports input from a joystick that could be used to rotate or translate a 3D
VRML object. Interfacing the device as a joystick would provide a valid way of controlling the
movement of the object, even though the device was not a joystick.
The attempted implementation of a USB controller was unfortunately unsuccessful.
It was possible to enumerate the PIC as a mouse and, using internally generated values on
the PIC, control the cursor on the screen using sample code from Microchip [21]. Modifying
the firmware code to enumerate the device as a joystick and control Cortona had limited
success. It was possible to rotate a VRML object in two axes but not in all three, which was
required.
The unsuccessful attempt at implementing USB on a PIC could be put down to several things:
the lack of support for HID devices by software developers, the lack of a knowledgebase for
the PIC16C745 on the Internet and the complexity of the USB specification. The results from
modifying the USB descriptors did not reflect the time that had been spent changing values
in the descriptors, which seemed to be correct (having been checked using the HID
Descriptor Tool [30] and USB Complete [5]). It was decided, at this stage, that there was no
guarantee that the innovative (in final year projects at least) USB solution would work, and
device-to-PC communications were instead implemented using RS232. RS232 has a
successful record in similar projects, is well supported in the Department, is low cost and,
for this project, has no disadvantages over USB other than the loss of plug-and-play (a
device driver has to be developed) and future compatibility.
• Data rate determined by transmission distance - typically 3m for 9,600 bps (960
bytes/sec)
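The figure in brackets follows from the framing overhead of asynchronous serial transmission: each byte is sent with a start bit and a stop bit, so a 9,600 bps 8N1 link carries 10 bits per byte. A minimal sketch of the arithmetic:

```python
def bytes_per_second(baud: int, data_bits: int = 8, stop_bits: int = 1) -> float:
    """Effective payload rate of an asynchronous serial link.

    Each byte on the wire is framed by one start bit and the stop bit(s),
    so an 8N1 link transmits 10 bits for every data byte.
    """
    bits_per_frame = 1 + data_bits + stop_bits  # start + data + stop
    return baud / bits_per_frame

print(bytes_per_second(9600))  # 960.0 bytes/sec, the figure quoted above
```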
RS232 communication was established using a Maxim [20] Max232 chip. The input to the
Max232 chip was asynchronous data from the microcontroller described in section 6.
Figure 61 in Appendix B shows the schematic for connecting the Max232 chip.
The data transmitted from the microcontroller includes a 2-byte header to synchronise the
data with the software; see sections 6.7.2 and 8.3.
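As an illustration of how a receiver can use such a header to find frame boundaries in the incoming byte stream (the header values and payload length below are hypothetical; the actual bytes are defined in section 6.7.2):

```python
# Hypothetical header values; the bytes actually used by the PIC firmware
# are those defined in section 6.7.2 and are not reproduced here.
HEADER = bytes([0xFF, 0xFE])
FRAME_PAYLOAD = 3  # one byte per gyro axis (an assumption)

def extract_frames(stream: bytes):
    """Scan a raw byte stream and return the payload of each framed packet.

    The receiver can join the stream at any point, so it searches for the
    two header bytes and takes the next FRAME_PAYLOAD bytes as axis data.
    """
    frames = []
    i = 0
    while i + len(HEADER) + FRAME_PAYLOAD <= len(stream):
        if stream[i:i + 2] == HEADER:
            frames.append(stream[i + 2:i + 2 + FRAME_PAYLOAD])
            i += 2 + FRAME_PAYLOAD
        else:
            i += 1  # not aligned yet: slide one byte forward

    return frames

# Joining mid-stream: the leading partial bytes are skipped automatically.
raw = bytes([0x12, 0x34]) + HEADER + bytes([1, 2, 3]) + HEADER + bytes([4, 5, 6])
print(extract_frames(raw))  # [b'\x01\x02\x03', b'\x04\x05\x06']
```

Note that a payload byte pattern equal to the header could cause a false synchronisation; presumably the firmware chooses header values that cannot occur in the data, though that is an assumption here.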
Known data was transmitted from a PIC16C774 microcontroller to the Max232 chip. The data
from the PIC was also output to a set of 8 LEDs (for debugging purposes). MS Windows
HyperTerminal (Figure 9) was used to monitor data that was received on COM1, which could
be compared with the known values sent from the microcontroller and the LEDs. The data
received is represented in ASCII format and can be converted to the hexadecimal values
transmitted from the PIC using an ASCII table [4].
Data bits: 8
Parity: none
Stop bits: 1
5 Sensors
Gyroscopes measure the angular velocity at which they are rotated, so determining their
angular position requires a single integration. For the reasons given in section 5.4, this
project concentrates on the use of gyroscopes.
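As an illustration of that single integration (a sketch only, not the project's actual processing, which is described in sections 6 to 8), angular position can be recovered by trapezoidal integration of the sampled rate:

```python
def integrate_rate(samples_deg_per_s, dt):
    """Single (trapezoidal) integration of gyro angular-rate samples.

    Returns the accumulated angle in degrees. In the real device the
    samples arrive from the A/D converter at a fixed interval dt.
    """
    angle = 0.0
    for a, b in zip(samples_deg_per_s, samples_deg_per_s[1:]):
        angle += 0.5 * (a + b) * dt  # area under the rate curve
    return angle

# A constant 90 deg/s held for one second integrates to a 90 degree turn.
print(integrate_rate([90.0] * 101, dt=0.01))  # 90.0
```

The same arithmetic also shows why inertial drift matters: any constant bias in the rate samples integrates into an angle error that grows linearly with time.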
Gyroscopes traditionally used rotating masses mounted on a set of gimbals to maintain
constant orientation when they were rotated. Mechanical rotating gyroscopes are
expensive, have high power consumption and may suffer wear after prolonged use. Modern
fabrication techniques have meant that alternative (vibrating) gyroscopes have been
designed. They are significantly smaller and lower cost than mechanical versions, although
at the cost of being more prone to inertial drift. Figure 11 summarises how gyroscope
specifications have changed in the last 30 years.
Section 5.4 covers the selection process of the inertial sensors. The outcome was that three
Murata GyroStar gyros were used to sense rotation around 3 axes. The Murata gyros contain
three piezoelectric elements (elements that produce a current when they are subjected to
mechanical pressure) that sense the rotation about a single axis. Figure 12 shows how the
elements are arranged.
[Figure 12: How the Murata Gyrostar gyroscopes sense angular velocity [31]. One element provides the driving oscillations; the Coriolis oscillations are picked up by the sensing elements.]
One of the elements is made to vibrate whilst the other two act as sensors. When the device
is rotated, the vibrating element experiences a Coriolis force, causing a sinusoidally varying
difference between the two sensors with an amplitude proportional to the angular velocity.
The output is the difference between the outputs from the two sensing elements.
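The proportionality between the difference-signal amplitude and the rotation rate can be sketched numerically. The scale factor k and the signal below are synthetic; in the real sensor this demodulation is performed by the gyro's internal circuitry:

```python
import math

def rate_from_difference(diff_samples, k):
    """Estimate angular velocity from samples of the element difference signal.

    The difference between the two sensing elements is a sinusoid whose
    amplitude is k * angular_velocity; for a sinusoid the amplitude equals
    RMS * sqrt(2).
    """
    rms = math.sqrt(sum(s * s for s in diff_samples) / len(diff_samples))
    return rms * math.sqrt(2) / k

# Synthetic signal: k = 0.01 V per (deg/s), rotation at 150 deg/s,
# sampled over exactly one period of the driving oscillation.
k, omega, n = 0.01, 150.0, 1000
diff = [k * omega * math.sin(2 * math.pi * i / n) for i in range(n)]
print(round(rate_from_difference(diff, k), 3))  # 150.0
```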
5.2 Aim
The aim of this section is to select inertial sensors and implement them for use in a handheld
sensing device to provide passive sensing of movement in a 3-Dimensional space. The
following specification was drawn up to help clarify what was required from the sensors:
• The combination of sensors must measure, within appropriate limits, the movement
that the user makes to purposefully manipulate an object on the screen. Factors such
as accuracy (e.g. inertial drift and noise) must be considered.
• The sensors must be bought from within the total project budget of £100
• The device is going to be wireless so the sensors should be low power, preferably
operate at 5v (to complement the other components in the device), and be as small
and robust as possible
For this project there were several options for the gyroscopes, summarised in Figure 13. The
ADXL202 accelerometers have been used before in the Department (so there is already an
advanced knowledgebase), represent good value for money and are readily available. The
ADXL202 accelerometers are highly suitable for this application so no others were
investigated.
Analog Devices ADXL202
Type: accelerometer; Voltage: 2.7v - 5.3v; Current per axis: 0.6mA; Axes: 2; Range: ±2g; Typical noise: 4.3mg; Bandwidth: 5kHz; Drift: N/A; Size: <5mm³; Cost for 3 axes: US$24

Silicon CRS04
Type: gyroscope; Voltage: 4.85v - 5.15v; Current per axis: <35mA; Axes: 1; Range: ±150°/sec; Typical noise: 0.75°/sec; Bandwidth: 85Hz; Drift: 0.55°/sec; Size: 30mm² x 8mm; Cost for 3 axes: £279

Analog Devices ADXRS150
Type: gyroscope; Voltage: 4.75v - 5.25v; Current per axis: 6mA; Axes: 1; Range: ±150°/sec; Typical noise: 0.35°/sec; Bandwidth: 500Hz; Drift: 0.05°/sec; Size: 7mm² x 3mm; Cost for 3 axes: pre-production

Murata Gyrostar
Type: gyroscope; Voltage: 2.7v - 5.5v; Current per axis: 3.2mA; Axes: 1; Range: ±300°/sec; Typical noise: 0.5°/sec; Bandwidth: 50Hz; Drift: 0.5°/sec; Size: <15mm³; Cost for 3 axes: £90

Gyration Microgyro
Type: gyroscope; Voltage: 2.2v - 5.5v; Current per axis: 2.7mA; Axes: 2; Range: ±150°/sec; Typical noise: 0.15°/sec; Bandwidth: 10Hz; Drift: 0.12°/sec; Size: <25mm³; Cost for 3 axes: US$450

Figure 13: Comparison of the candidate inertial sensors
This section will summarise different ways of using a combination of accelerometers and/or
gyroscopes to sense the movement made by the user to manipulate the object on the PC.
5.4.1 Accelerometers
Three dual-axis accelerometers could be used to measure all 6 degrees of freedom. Taking
the difference between the measurements observed by accelerometers placed opposite
each other makes it possible to separate rotation from translation: “rotation can be
measured inertially without gyroscopes, using the differential linear accelerations measured by
two (or more) accelerometers undergoing the same rotational motion but located at different
distances from the center of rotation,” [31]. The configuration of these accelerometers is
described in detail in [11], section 6.1. Using accelerometers in this configuration involves
complex mathematics to track and subtract the effect of gravity. For this reason it was decided
to fragment the project: using gyroscopes to measure rotation and 3 single axis
accelerometers to measure translation. This simplifies the task of removing the component of
gravity because the orientation of the device is known from the gyros (which do not sense
gravity like accelerometers).
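As a toy illustration of the differential principle quoted above, consider steady rotation only, ignoring gravity and tangential acceleration (which is precisely what makes the real mathematics complex): the centripetal acceleration at radius r is ω²r, so two accelerometers at different radii give ω from their difference.

```python
import math

def angular_rate_from_pair(a1, a2, r1, r2):
    """Estimate the magnitude of angular velocity (rad/s) from centripetal
    accelerations a1, a2 (m/s^2) measured by two accelerometers mounted at
    radii r1, r2 (m) from the centre of rotation.

    Since a = omega^2 * r, the difference gives
    omega = sqrt((a1 - a2) / (r1 - r2)).
    """
    return math.sqrt((a1 - a2) / (r1 - r2))

# Body spinning at 2 rad/s: accelerometers at 10cm and 5cm radius see
# 0.4 and 0.2 m/s^2 of centripetal acceleration respectively.
print(angular_rate_from_pair(0.4, 0.2, 0.10, 0.05))  # 2.0
```

This recovers only the rate magnitude under idealised conditions; the full configuration described in [11], section 6.1, must also track and subtract gravity, which is the complexity the project avoids by using gyroscopes for rotation.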
5.4.2 Gyroscopes
Gyroscopes can be used to measure the angular velocity around the three axes x, y and z
(roll, pitch and yaw respectively). Gyroscopes do not suffer the effects of gravity like
accelerometers, so they can be used to sense orientation and track gravity. Three single-axis
accelerometers can then be used in conjunction with the gyroscopes to measure translation.
One major advantage of using gyroscopes to measure roll, pitch and yaw, then adding x, y
and z translation using accelerometers, is that it increases the chance of the project
succeeding. Forming a complex solution using accelerometers alone carries the risk that
either everything or nothing will work.
It was decided that three Murata Gyrostars would be used to sense 3-DOF. The Murata
Gyrostar offers the following features:
• Small (<15mm³)
• Good availability
Due to time limitations, it was decided at this stage to concentrate on building a 3-DOF
device and to make the addition of accelerometers an extension to the project.
The specification for the Murata Gyrostar gyroscope is available from the Murata website
[23]. The output from the sensors is a differential voltage that sits at approximately 1.35v when
stationary however, “depending on physical factors, like temperature, the frequency and
The Murata data sheet specified the circuitry represented in the following schematic to be
connected to the output of the gyroscopes. The same circuit would need to be made for each
axis.
[Figure: Per-axis signal chain: Sensor -> Filter -> Amplifier -> A/D Converter -> Data Processing.]
In his thesis [7], Benbasat used the Murata Gyrostar for similar applications. His
implementation of the gyroscopes and amplifiers was proven when applied to a 6-DOF
gesture-recognition inertial measurement unit. He makes these comments: “The purpose of a low-
pass filter is to reduce the effect on the system of noise in the bandwidth of immediate
interest. The maximum frequency of interest for human arm gestures is considered to be
approximately 10 Hz, though quantitative analysis of the sample data stream suggests that
most of the gestures in which we are interested have a maximum frequency in the 3 - 5 Hz
range. High-pass filtering can be used to remove constant and slowly changing values from
the signals…which can be very useful if thresholding of the signals is desired”.[7]
His design was used, largely unmodified, for this project. It incorporates the following features:
• The differential output signal floats around a central value set by pin Vref
• The inverting amplifier has a gain of between 1.36 and 1.67, adjusted by the value of
VR1. The amplification can be changed so that the output utilises the whole
range of the A/D on the microcontroller.
• Capacitor C2 provides a low-pass filter with a cut-off of 66 Hz (the upper limit of
the range of frequencies that were found to be significant for inertial input device
applications in [7]).
• The idle value can be adjusted precisely by varying the A/D reference voltages on the
micro controller (see section 6.5)
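As a rough check on the quoted 66 Hz cut-off, the first-order RC corner frequency can be computed from f_c = 1/(2πRC). The resistor value below is an assumption chosen only to land near the quoted figure, since the component values are given in the schematic rather than restated here:

```python
import math

def rc_cutoff_hz(r_ohms: float, c_farads: float) -> float:
    """Cut-off frequency of a first-order RC low-pass filter."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Illustrative values only: R is assumed so as to reproduce roughly the
# 66 Hz cut-off quoted for C2 in the text.
print(round(rc_cutoff_hz(24_000, 100e-9), 1))  # ~66.3 Hz
```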
The Maxim Max495 operational amplifier was used to construct the inverting amplifier. The
Max495 has the following features:
• 5v operation
90° turn clockwise followed by a sharp 90° anticlockwise to the original position. More results
can be found in Appendix F. The variable resistor (VR1) that formed part of the amplifier could
be adjusted both to prevent the output saturating and to make use of the full range of the output.
Adjusting the A/D reference voltages could alter the constant level at which the output remained
when the gyro was motionless.
[Figure: 8-Bit Quantised Gyro Output — quantised level (0–250) against time (0–2 s)]
The signal-to-background-noise ratio when the sensor was at rest was calculated to be
approximately 12.5 dB (based on the output ‘wandering’ by a maximum of 15 levels when
stationary).
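One plausible reading of this figure, which is an assumption rather than the author's stated method, is the ratio of the full 8-bit range to the 15-level wander:

```python
import math

# Assumed interpretation: ratio of the full 8-bit output range to the
# 15-level 'wander' observed at rest.
full_scale = 255
noise_levels = 15
snr_db = 10 * math.log10(full_scale / noise_levels)
print(round(snr_db, 1))  # 12.3, close to the quoted 12.5 dB
```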
Implementing this circuit for each axis and fixing the gyroscopes perpendicular to each
other makes it possible to measure the rotation of the body about all three axes. The outputs of
the sensors are connected to an A/D converter, which is covered in section 6.5.
• The sensors measure the velocity of rotations that the user makes in a 3D space. The
output from the sensors has a range that can be adjusted so the maximum expected
rate of rotation occupies the full range of the output.
• The quality of the output from the sensors is poor compared to more expensive
gyroscopes (Figure 13) in terms of drift (0.5°/s[23]) and noise (SNR 12.5dB).
• The cost to build the circuits for the sensors is £105 (based on the three gyroscopes costing
£33 each and the three Max495 operational amplifiers costing £2 each [12])
• The sensors’ operating voltage is 5 V and the combined current consumption is <6.5 mA
6 Microcontroller
6.1 Aim
The following specification was drawn up for the microcontroller that would convert the
outputs from the gyroscopes to an asynchronous serial data stream that could be connected
to the Max232 RS232 interface:
• The micro controller must have at least three analogue to digital converters. Extra
analogue inputs may be required if extra sensors need to be added
• There must be a way of implementing asynchronous serial data transfer using either
hardware or ‘bit banging’. The data rate must be 9600bps or greater. Assuming that
each sample is constructed of 5 bytes, this gives a refresh rate of 192Hz. The extra
baud rate allows for extra information to be transmitted (at a reduced refresh rate).
• The device should operate from a 5v supply and consume low power as it will be
powered off a battery
• There should be a means of testing the microcontroller and debugging any code
written during development
• The device must be low cost, with development facilities available in the Department
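The refresh-rate arithmetic in the serial requirement above can be sketched as follows, assuming each byte costs ten bits on the wire (start bit, eight data bits, stop bit):

```python
def refresh_rate_hz(baud: int, bytes_per_sample: int) -> float:
    """Samples per second on an asynchronous link where each byte costs
    10 bits (start bit + 8 data bits + stop bit)."""
    return baud / (10 * bytes_per_sample)

print(refresh_rate_hz(9600, 5))  # 192.0 Hz for 5-byte samples at 9600 bps
```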
• Hardware USART
[Figure: the microcontroller board, showing the RS232 socket and the variable resistor used to adjust Vref+]
[Flow chart: Initialise Variables → Initialise Ports → … → Transmit Data]
The code was written in Assembly and built into a .hex file using MPAsm [21]. The PIC was
programmed using the Department’s PIC programmer.
• The microcontroller needed to sample the output from the gyroscopes at a frequency
of at least 50Hz [7]
[Flow chart step: wait ~50 µs #]
A/D configuration: result left justified; A/D high reference = external Vref+; A/D low reference = AVss;
AN0–AN3 analogue inputs, AN4–AN7 digital inputs.
ADCON0 (bit 7 → bit 0):
ADCS1 = 0, ADCS0 = 1, CHS2 = 0, CHS1 = 0, CHS0 = 0, Go/Done = 0, CHS3 = 0, ADON = 1 (i.e. ADCON0 = 0x41)
# A delay of greater than 3 TAD is required. TAD for a 4 MHz clock with the conversion clock
select bits set in ADCON0 is 2 µs. A delay of 50 µs is invoked using a nested loop.
*SPBRG
The SPBRG register controls the baud rate at which data is transmitted from the asynchronous
transmitter. The desired baud rate is 9600 bps. With the high baud rate selected (BRGH bit of
TXSTA), the baud rate is given by Fosc/(16×(SPBRG+1)) ([21], DS30275A), so
SPBRG = 4 MHz/(16×9600) − 1 ≈ 25, giving an actual rate of approximately 9615 bps.
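The SPBRG arithmetic can be sketched as below, using the BRGH = 1 formula from the PIC data sheet and the 4 MHz clock used elsewhere in this chapter:

```python
def spbrg_for(baud: int, fosc: int = 4_000_000) -> int:
    """SPBRG for the PIC USART with BRGH = 1: baud = Fosc/(16*(SPBRG+1))."""
    return round(fosc / (16 * baud)) - 1

def actual_baud(spbrg: int, fosc: int = 4_000_000) -> float:
    """Baud rate actually produced by a given SPBRG value."""
    return fosc / (16 * (spbrg + 1))

s = spbrg_for(9600)
print(s, round(actual_baud(s)))  # SPBRG = 25 gives roughly 9615 bps
```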
**TXSTA (bit 7 → bit 0):
CSRC = 0 (don’t care), TX9 = 0 (8-bit transmission), TXEN = 1 (transmit enabled), SYNC = 0
(asynchronous mode), bit 3 unimplemented, BRGH = 1 (high speed), TRMT (read only),
TX9D = 0 (don’t care) — i.e. TXSTA = 0x24
RCSTA has to be given a value although no data is being received. Bit <7> (SPEN) needs to
be set high to enable the serial port. RCSTA is set to 0x90.
A configuration word can be specified to tell the programmer the settings to use. The following
configuration word was used.
__CONFIG _CP_OFF & _WDT_OFF & _BODEN_OFF & _PWRTE_OFF & _HS_OSC
Code protect off; watchdog timer off; brown-out reset disabled; power-up timer disabled; high-speed crystal
Figure 26: Configuration Word
The protocol used to synchronise the asynchronous data stream with the host software
included the use of header bytes. The PC software recognises the header bytes, removes
them and stores the following 3 data bytes. The header bytes also contain the state of the
activate button. See Figure 41 for details of how the header bytes were removed.
The testing proved the design of the microcontroller circuit and code. The code was designed
to re-sample the analogue inputs immediately after the previous samples had been transmitted.
The refresh rate (determined from recorded data) was approximately 180 Hz. The header
bytes had values within the range of the output of the sensors, rather than reserving two of
the 256 values (this gives the output from the gyros the full 8-bit resolution). This meant there was a
chance of the header byte sequence being accidentally detected. The probability of this
happening would be approximately 1.5×10⁻⁵ had the output from the sensors been evenly
spread between 0 and 255, but because the outputs from the sensors usually remain at
approximately 128 the chance of an error was less. No such error was noticed throughout
testing.
7 Data Processing Design
7.1 Aim
The aim was to decide how to translate the data from the gyroscopes into data representing
rotation of an object in 3D space. The design would be a compromise between usability and
complexity. The following sections shall summarise the findings from research into possible
solutions and justify the decision made to implement a form of gesture recognition using signal
thresholding.
‘Direct mapping’ is the term used in this report to refer to the orientation and position of a 3D
object on a screen being directly related to the orientation and position of the input device
(also referred to as absolute tracking). For example, referring to the rotation of the object
(which we shall concentrate on in this project), direct mapping implies that rotating the
handheld sensor through 180° also rotates the object on the screen through 180°.
Direct mapping between input device and 3D object has an obvious advantage in terms of
usability. The user would be able to intuitively rotate an object to the desired position. If the
object on the screen were to rotate in the wrong direction the user would form part of a
feedback loop and be able to correct their hand movements accordingly.
To implement ‘direct mapping’ would require the integration of the data from the three
gyroscopes to give the absolute rotation of the sensor. There are problems that make this
solution non-trivial because the sensors themselves are not perfect at measuring their
rotation. This can lead to them getting increasingly disorientated (for gyroscopes,
proportionally with time).
• Noise: Figure 17 showed a sample of the output from a gyroscope. This sample shows
the noise present in the signal as the output varies when the sensors are stationary.
Integrating the noise will lead to errors in tracking absolute rotation.
• Drift: The Murata gyroscopes have a quoted drift of 0.5°/s. Low cost gyroscopes tend
to have a high drift. Drift is effectively where the gyroscope ‘slips’ and accuracy of the
rotational velocity is lost. If the rotational velocity is not accurate the orientation of the
device cannot be accurately determined
Drift will cause a loss of accuracy of up to 0.5°/s (i.e. 20 s after starting from a known orientation, the measured orientation could be up to 10° in error).
To overcome the effects that cause gyroscopes to become disorientated, devices use different
methods of re-calibration. Recalibration requires the device to periodically sense its actual
rotation/position by interacting with the outside world. There are a number of ways that this
can be done.
These solutions have proven to overcome the inaccuracies of inertial sensors effectively. They
are, however, complex and require extra sensors and processing that place them outside the
scope of this project.
A possible solution that was considered (of which no examples were found in any research)
was periodic ‘user’ recalibration. The idea was based on the user recalibrating the device
once it had become so disorientated as to be unusable (either by placing the device in a cradle
or visually realigning the device with a specified orientation). It is hard to know, without
experimentation, how often user-recalibration would be required.
‘User’ recalibration was considered along with the use of Kalman filtering, which research had
highlighted as a technique used frequently in inertial sensing devices: “Kalman filtering is the
main analysis technique for inertial data and is used almost exclusively for inertial tracking, the
determination of position and orientation from inertial readings and an initial state,” [7]. It was
decided not to track absolute position because of the likelihood that extra sensors would be
required, as was stated in research:
“Inertial systems are not well-suited for absolute position tracking. In such systems, positions
are found by integrating, over time, the signals of the sensors as well as any signal errors. As
a result, position errors accumulate. Inertial systems are most effective in sensing applications
involving relative motion”. [31]
“Inertial sensors are completely passive, requiring no external devices or targets, however, the
drift rates in portable strapdown configurations are too great for practical use.” [29]
And finally, when referring to inertial tracking [14] states: “the frequent recalibration of the
system, i.e. with a compass is necessary.”
Another solution to overcome the imperfections in inertial sensing was gesture recognition
(relative tracking), which did not require re-calibration or extra sensors and fitted better with
the time limits imposed on the project.
Gesture recognition can be used to sense physical intentions made by a user. “Gesture
recognition offers a natural and intuitive way for users to input information” [6]. There are
numerous different ways that sensors can be positioned on the human body to monitor
movements made by the user, particularly their heads [19] and hands [7].
A commercial product that uses gesture recognition and inertial sensing is the Gyromous [13]
which, through specialised drivers, can interact with Windows applications and is marketed to
work especially well with PowerPoint presentations. Benbasat [7] has used a Hidden Markov
Model based approach to perform gesture recognition, which is a recognised technique in
inertial devices. Verplaetse describes one such approach: a “method for estimating motion
and position is to use a Kalman filter state-estimation algorithm. Once the time-dependent
motions and positions of the system are estimated, a pattern recognition scheme such as a
neural network, hidden Markov model, or matched filter may be performed with that motion
data,” [31].
A form of gesture recognition, through thresholding the outputs from the sensors, has been
developed for this project. This solution has been designed to overcome the inertial drift and
noise that causes problems with absolute tracking. The output from the sensors is an angular
velocity that is directly proportional to the speed that the user has rotated the device. Using a
clutching method (for example, an activation button) the outputs from the sensors can be
separated into those made intentionally by the user and those that are general hand
movements. Thresholding is used to overcome gyroscope noise and small movements made
by the user, either from shaking or from rotational components made unintentionally on other axes.
In his thesis, Benbasat [7] states about his project, “a peak size threshold is used, to ignore
gestures that are caused either by the acceleration sensitivity of the gyroscopes, or by
misalignment of the sensors,” and then goes on to say in conclusion, “visual inspection
suggests that it would be possible to collect interesting information from that data stream,
certainly the presence of gestures and the number of peaks. However, it seems there is not
enough entropy in the stream for our current algorithm, which produced meaningless output. A
simpler scheme looking at pair-wise differences between data points could be successful in
this case and is left as possible future work.” It is understood that the phrase used in his
statement, ’looking at pair-wise differences between data-points,’ refers to the same idea
as is being considered in this project. The solution is described graphically in the following
section.
Thresholding is used on the output of each sensor for the following reasons (numbers refer to
Figure 28):
• To eliminate shake (2)
[Figure 28: How Thresholding is used to detect a gesture — X, Y and Z axis output values (0–250) against time (s), with the points referred to in the text marked 1, 2 and 3]
Two modes of operation were implemented to suit the preferences of different users: normal
and auto-damping. Both modes are designed to be intuitive to use. Normal mode should be
considered as the user setting an object in motion in a frictionless environment and then
stopping it by counteracting this rotation. ‘Auto-damping’ mode should be considered as
setting the object in motion in an environment where friction exists and the object brings itself
to a halt. ‘Auto-damping’ will be covered in the following section.
The following explanation of ‘normal mode’ shall consider the input from a single gyroscope.
The same processing is applied to the three axes to give 3D rotation. All axes are processed
in quick succession, appearing simultaneous to the user.
Figure 29 shows how a user makes a gesture in a single axis.
1. The user rotates the device in one direction. The object on the screen rotates with an
angular velocity proportional to the amplitude of the input, up to the peak.
2. The user holds the device still and the 3D object continues to rotate at a velocity
proportional to the peak of the input
3. The user returns the device to the initial position and the 3D object stops rotating
[Figure 29: Sketch of angular velocity against time, t, for a single-axis gesture]
The output from a single gyroscope corresponding to a gesture of this nature is sketched in
Figure 30. This is considered to be a ‘positive gesture’ because phase 1 is positive. If phase
one is negative the input is mirrored and the same processing technique is applied. Figure 30
also explains the different parts of a gesture.
A sample output for a single axis is shown in Figure 31. It can be seen how the magnitude of
the output rises when there is a threshold-breaking peak in the input and how the output
remains constant until the input breaks the other threshold. If two consecutive peaks
break the same threshold, the second is ignored and the user has to rotate the device in the
opposite direction to counteract the first peak. The output falls in proportion to the peak of the
second phase of the gesture; if that peak is not as big as the peak in the first phase,
the output is automatically returned to zero once the second-phase peak is detected.
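A minimal sketch of the normal-mode behaviour for one axis is given below. It simplifies by setting the output at the first threshold crossing rather than at the true peak; the names and state handling are illustrative, not the project's Visual Basic code:

```python
def normal_mode(samples, cal=128, sensitivity=25):
    """One-axis sketch: a sample beyond the upper threshold starts the
    rotation; a later sample beyond the lower threshold stops it."""
    upper, lower = cal + sensitivity, cal - sensitivity
    output, outputs = 0, []
    for s in samples:
        if output == 0 and s > upper:
            output = s - cal      # rotate at a rate set by the excursion
        elif output != 0 and s < lower:
            output = 0            # counter-rotation stops the object
        outputs.append(output)
    return outputs

# A positive gesture, a rest, then a counter-peak: the output holds its
# value between the two peaks and returns to zero after the second.
print(normal_mode([128, 180, 128, 128, 80, 128]))  # [0, 52, 52, 52, 0, 0]
```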
[Figure 30: The parts of a gesture — phases 1 and 2 (regions A–D), peaks P1 and P2, gesture point G1, and the calibration level and upper threshold, plotted as output value (0–250) against time (s)]
[Figure 31: Output in Normal Mode — output value against time (s)]
Auto-damping mode was developed to offer an alternative intuitive mode of use to normal
mode. The idea behind this mode is to rotate the object on the screen by making a
series of ‘nudges’ with the device. The starting angular velocity of the 3D object is proportional
to the amplitude of the gesture made by the user. The rotation is automatically damped so that
it comes to a rest in a specified time, as if the object were in an environment where friction
existed. The damping duration, Dd, is specified by the user and is the period of time in which
the cube comes to rest (Figure 32).
[Figure 32: Sketch of the output decaying from Max to zero over time]
Output(t) = Max − (λ×t)², where Max is proportional to the amplitude of the gesture
and λ = √Max / Dd (so that the output reaches zero at t = Dd)
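The damping curve can be sketched directly from the formula; note that taking λ = √Max / Dd is what makes the output reach zero exactly at t = Dd:

```python
import math

def auto_damped_output(max_out: float, dd_s: float, t_s: float) -> float:
    """Output(t) = Max - (lam*t)^2 with lam = sqrt(Max)/Dd, clamped at zero."""
    lam = math.sqrt(max_out) / dd_s
    return max(0.0, max_out - (lam * t_s) ** 2)

print(auto_damped_output(100, 2.0, 0.0))  # 100.0 at the gesture peak
print(auto_damped_output(100, 2.0, 2.0))  # 0.0 once the damping duration elapses
```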
[Figure: Output in Auto-damping Mode — the output rises to A(max) and decays to zero over the 2000 ms damping duration; other peaks are ignored while there is an output]
As well as the different modes the user can select, there are some settings the user can
change. It was hoped that performing usability tests would find optimised values for these
settings.
Sensitivity: The upper and lower thresholds are set with the calibration level as the reference
point. The calibration level is measured when the device is at rest and the user presses the
calibrate button.
Figure 34 shows an example of different threshold levels being used. If the threshold is ±50
there are 4 peaks, if the threshold is 25 there are 9 peaks and if the threshold is 10 there are
16 peaks. The significance of this is that noise is detected as a peak when the threshold is low
and when the threshold is high only large peaks are detected. In Figure 34 a threshold of 25 is
approximately the correct level to distinguish between noise and the wanted signal.
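The peak counting in the Figure 34 example can be sketched as below; the signal values are made up for illustration, and each continuous excursion beyond the threshold is counted once:

```python
def count_peaks(samples, cal=128, threshold=25):
    """Count excursions beyond +/-threshold about the calibration level,
    counting each continuous excursion once."""
    count, outside = 0, False
    for s in samples:
        beyond = abs(s - cal) > threshold
        if beyond and not outside:
            count += 1
        outside = beyond
    return count

signal = [128, 140, 128, 190, 128, 135, 60, 128, 150, 128]
# A low threshold counts small (noise) excursions as peaks; a high
# threshold keeps only the large, intentional peaks.
print(count_peaks(signal, threshold=10), count_peaks(signal, threshold=50))  # 4 2
```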
[Figure 34: Example input with threshold levels of ±10, ±25 and ±50, plotted as input value (0–250) against time (s)]
Amplification: The amplification is the factor, between 1 and 3, by which the output is
multiplied (illustrated in Figure 35).
[Figure 35: Amplification — input peaks A1 and A2 produce outputs 2·A1 and 2·A2 with an amplification factor of 2]
Maximum Output: The maximum angular velocity at the output, as illustrated in Figure 36. The
maximum output is the magnitude at which the output is ‘clipped’. It allows the user to define the
maximum angular velocity at which they are comfortable for the 3D object to spin.
[Figure 36: Maximum Output — the output is capped at 100 even though the input peaks A1 and A2 would otherwise give larger outputs]
Time Out: This is the minimum time between two gestures, in milliseconds. This setting was
developed to overcome problems in normal mode where the user ‘corrected corrections’,
which led to unintended gestures.
[Figure: Time Out — a minimum time, Tout, must elapse between consecutive gestures]
[Flow chart: software operation — Load Software; read test instructions and press activate button to start test; enter filename and directory and start data logger; View>Settings to load the settings dialogue box. Key: unconditional; Simulate Mode only; Device Connected only; currently data logging; not currently logging data]
The activate button is a toggle switch: pressed once to activate and once to deactivate, rather
than a press-and-hold button. The reason for this is that it may be difficult to hold down
the button whilst using the device. The Ergonomics of Computing Devices (page 53, [10])
quotes Mackenzie (Mackenzie et al, 1991): the trackball (which is similar to this
device) has a ‘drag lock’ feature that means the user does not have to hold the button
down at the same time as dragging. It was found that the trackball, “performed poorly for
dragging because it is awkward to hold the button down and move the ball at the same time,”
[10].
8 Data Acquisition, Storage and Processing Implementation
8.1 Aim
The aim of this part of the project is to develop a piece of software that handles all the
incoming data from the device connected to the serial port and converts it into meaningful
data that can be used to manipulate the object in 3D space. It should also log inputs and
outputs so they can be graphed.
The software was split into several sections. Splitting the project helped set milestones and
allowed individual sections of the software to be tested before moving on to the next,
reducing the chance of encountering bugs.
1. Data Acquisition
2. Data Storage
3. Data Analysis
Microsoft Visual Basic was chosen to implement the software part of the project. Visual Basic
has the following features:
• Advanced debugging (for example, syntax problems are detected after each line of
code is entered)
• Draws a rotatable 3D cube that provides visual representation of output (Section 9.3)
• Graphically represents inputs, outputs, threshold levels and maximum output (Figure 47)
Settings
• User defined settings and calibration (sections 9.4 and 8.5 respectively)
Record
• Allows user to simulate input if device is not physically connected (for testing
purposes)
Test1_dialogue/Test2_dialogue
• Displays the time in which the test was completed (Figure 56)
• It must be able to detect and remove header bytes included in data (Figure 27).
Every byte of data received is stored into an input buffer. The next stage is to remove the
header bytes and store the x, y and z values into separate variables. Header byte number 2
can have one of two values, indicating the state of the activate button. The stages in removing
the headers are shown in Figure 41.
‘Input_buffer’ is an array of nine entries: the smallest size needed to ‘find’ the data, as shown
in Figure 40 below:
H2 | X | Y | Z | H1 | H2 | X | Y | Z
[Figure 41: Data Acquisition from RS232 Serial Port Flow Chart — each byte received on COM1 is converted from ASCII to decimal and stored in Input_buffer[endval]; endval is incremented, wrapping back to 0 after 8. When endval ≥ 4, Input_buffer[endval−4] = H1 and Input_buffer[endval−3] = H2, the values are taken as ValueX = Input_buffer[endval−2], ValueY = Input_buffer[endval−1] and ValueZ = Input_buffer[endval], and Button Activated is set true if the H2 byte is of type 1, false otherwise]
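The header-removal scheme of Figure 41 can be sketched as follows. The actual header byte values are not given in the report, so those used here are purely illustrative:

```python
H1, H2_RELEASED, H2_PRESSED = 0xFF, 0xFE, 0xFD  # illustrative values only

def parse_stream(data):
    """Scan for the two header bytes, then take the next three bytes as
    the x, y and z samples; which H2 variant arrived gives the state of
    the activate button."""
    samples, i = [], 0
    while i + 4 < len(data):
        if data[i] == H1 and data[i + 1] in (H2_RELEASED, H2_PRESSED):
            x, y, z = data[i + 2:i + 5]
            samples.append((x, y, z, data[i + 1] == H2_PRESSED))
            i += 5          # skip the whole 5-byte frame
        else:
            i += 1          # resynchronise byte by byte
    return samples

stream = [0xFF, 0xFE, 120, 131, 128, 0xFF, 0xFD, 200, 128, 90]
print(parse_stream(stream))
```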
• Record all inputs and outputs against a time stamp whenever new data is received
• User must be able to start and stop recording and select location for logged data
It was decided to store the data in a text file. Each time stamp occupies a line and a comma
separates entries. The file can be opened in Excel (as a comma-delimited text file), which can
be used to plot graphs.
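The logging format described above can be sketched as follows; the exact field order in the project's log files is an assumption:

```python
import io

def log_record(f, t_ms, inputs, outputs):
    """One time stamp per line; a comma separates the entries, so the
    file opens directly in Excel as a comma-delimited text file."""
    f.write(",".join(str(v) for v in (t_ms, *inputs, *outputs)) + "\n")

buf = io.StringIO()
log_record(buf, 0, (128, 128, 128), (0, 0, 0))
log_record(buf, 5, (180, 129, 127), (52, 0, 0))
print(buf.getvalue(), end="")
```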
[Figure: data-logging controls — Dir1.Path, Filename.Text and the Start/Stop Logging button, which shows green when logging and red when not]
[Flow chart: calibration — the device powers up uncalibrated and deactivated; while the device remains still, the user calibrates it, setting X_cal = X_val, X_high_thresh = X_cal + sensitivity and X_low_thresh = X_cal − sensitivity (repeated for the Y and Z axes)]
• The time-out feature prevents the user from making two consecutive gestures less than
the specified ‘time-out’ apart. A gesture is not recognised until this timer has elapsed.
[Flow chart: normal mode implementation — data received is stored in ValueX; the sign of the input is tracked by comparing X_val with X_val_previous (Flipped is set for a negative gesture and X_int is inverted); the flags G1, G2, P1 and P2 record the gesture phases and peaks and are reset after a gesture. Key: G1 = gesture phase 1 (refer to Figure 30); ValueX = data received from COM1; X_val = value of input after possible inversion; Max_out = user-defined maximum angular velocity]
[Flow chart: data received is stored in X_val; when the lower threshold ≥ X_val and P1 has not been detected, the input is flipped and X_int inverted; when X_val ≥ the upper threshold and G1 has not been detected, a gesture starts; the auto-damping function then produces an output that decreases to zero once the damping duration has elapsed, after which the flags G1, G2, P1 and P2 are reset. Key: G1 = gesture phase 1 (refer to Figure 30); ValueX = data received from COM1; X_val = value of input after possible inversion; Max_out = user-defined maximum angular velocity]
Figure 45: Auto-damping Mode Implementation Flow Chart
9 Implementing the Graphical User Interface
9.1 Introduction
A graphical user interface was developed to meet the following specification:
• Provide an intuitive way for the user to set their preferences for the device
• Graphically represent the inputs and outputs from the data processing algorithms
• Provide a 3D rotatable object
[Figure 46: The Graphical User Interface, including the logging/settings controls]
[Figure: the input, process and output displays, each drawn over the range 0–255 with Scalewidth = 255]
9.3 3D Cube
It was an important part of the project to provide a way to see the output from the device
visually so that it could be tested. To do this there were two options: link the program to a
third-party piece of software to control the movement of an object, or implement a 3D object in
the VB interface that could be directly controlled.
VRML (virtual reality modelling language) is a language used frequently on the Internet to
display 3D objects and 3D worlds. Popular VRML browsers include Blaxxun Contact [8] and
Cortona by ParallelGraphics [26]. Research was carried out into ways of interfacing either of
these browsers with the VB software by controlling movement programmatically. Messages
were posted on newsgroups including the official Blaxxun Contact site but none of the replies
led to the solution. Research into Cortona by Parallel Graphics led to a VRML browser window
that could be embedded into the VB application and controlled, however, the Cortona SDK
was required at a cost that could not be justified. Since implementing the alternative solution
described below, a paper was found that, “introduced a DeviceSensor for grabbing arbitrary
input and a Camera node to control the scene view and implemented both constructs in
Blaxxun Contact” [3]. This is an area for future development.
It was decided to construct a simple 3D object in Visual Basic from first principles. A cube with
different coloured sides was chosen so that the orientation of the cube could be recognised
from the coloured faces. The flow diagram shown in Figure 50 describes the stages involved
in drawing the cube.
[Figure: the cube, with corners numbered 1–8 and axes x, y and z marked]
Face No  Colour     Corners
1        Red        1, 2, 3, 4
2        Turquoise  2, 3, 7, 6
3        Blue       6, 7, 8, 5
4        Orange     5, 8, 4, 1
5        Yellow     4, 8, 7, 3
6        Purple     5, 1, 2, 6
[Figure 50: Drawing the cube — sample X_out, Y_out and Z_out from the gesture recognition in turn, rotating the Corner Coordinates proportionally for each axis; add perspective and store the result in a new array, Perspective Coordinate; calculate the normal to each face using the points defined in Corner Coordinate; order the faces from 1 to 6, 1 being the face whose normal has the largest Y-component; draw faces 3, 2 then 1 using the polygon function described in section 9.3; pause for 10 ms and repeat]
The next stage is to enter the x and z coordinates of the four corners of the polygon into
PolyPts. The command to draw a four-sided polygon is:
Polygon Cube.hdc, PolyPts(0), 4
This function is repeated for each of the three polygons that are drawn (there are only ever
three polygons visible).
The cube can then be rotated using circle geometry. For example, to rotate by angle θ
(radians) around the x-axis, all corners of the cube are rotated by θ in the Y-Z plane using the
following equations:
y′ = y·cos θ − z·sin θ
z′ = y·sin θ + z·cos θ
The cube can then be drawn with its new coordinates as before, by adding perspective,
performing back-face culling and drawing the polygons.
The output from the gesture recognition algorithm is directly input into the rotation of the cube.
A timer samples the output from the gesture recognition algorithm at an interval of 10ms and
rotates the cube proportionally to give an angular velocity of between 0 and 50°/s.
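The per-axis rotation step can be sketched as below; the equations are the standard 2D rotation in the Y-Z plane, and the corner list is illustrative:

```python
import math

def rotate_x(corners, theta):
    """Rotate each (x, y, z) corner by theta radians about the x-axis,
    turning the point by theta in the Y-Z plane."""
    return [(x,
             y * math.cos(theta) - z * math.sin(theta),
             y * math.sin(theta) + z * math.cos(theta))
            for x, y, z in corners]

# Rotating the corner (0, 1, 0) by 90 degrees about x carries it to (0, 0, 1).
x, y, z = rotate_x([(0.0, 1.0, 0.0)], math.pi / 2)[0]
print(round(x, 6), round(y, 6), round(z, 6))  # 0.0 0.0 1.0
```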
9.4 Settings
Other parts of the GUI include a settings dialogue box, simulated input dialogue box, data
logging dialogue box and functionality to perform some usability tests. The usability tests and
the results from testing are discussed in section 12.3, and the optimised settings in section
12.5.3.
The Settings dialogue box is shown in Figure 51 below. The settings are changed using the
sliders. The value shown above each slider is updated as the slider is moved. The values of
the sliders are stored to a corresponding variable when the Calibrate button is pressed
(section 8.5).
10 Packaging
The packaging for the device was made in conjunction with the Department’s mechanical
workshop. There are two main components that form the packaging for the device: the
aluminium cube that holds the PCBs in position and the MDF (medium density fibreboard)
sphere that provides an ergonomic case that the user holds.
The cube was machined out of aluminium, chosen for its very high strength-to-weight ratio.
The cube was machined out of a solid block to make a rigid structure; rigidity was important to
make sure the gyroscopes remained aligned. The dimensions of the cube were decided by
the size limitations of the PCBs and are given in the technical designs in Figure 70.
The packaging that holds the electronics is sphere-shaped and made out of laminated
MDF (medium density fibreboard) sheets. The MDF is CNC-machined into a sphere and a
cavity is machined to house the aluminium cube that, in turn, holds the electronics. The
sphere is made in two halves that are bolted together, clamping the cube in place. The on/off
and activate buttons are fixed onto the casing. Alternative materials to MDF included acrylic
and high-impact polystyrene; MDF was lighter and cheaper. Another idea was using a 100 mm
diameter rigid plastic juggling ball [24], but it was decided that it would be problematic to hold
the juggling ball while it was machined, so this idea was not used. To get around the problem of
holding a spherical object, the cavity was machined while the MDF was in block form, and the
spherical sides were machined last. The MDF could be finished in a plastic coating to give a
professional finish.
Figure 53 shows how the packaging fits together, including the proposed components to make
the device wireless (see section 11).
[Figure 53: How the packaging fits together — activate/deactivate button, battery and battery holder, radio transmitter, the three gyroscope circuits, the microcontroller PCB and the aluminium cube on which the PCBs are mounted]
11 Radio Link
The initial specification stated that the device should be wireless. The prototype, up until this point, used a wire to provide power to the device. The PIC was mounted on a PCB outside the device: the analogue outputs from the sensors/amplifiers and the button were connected to the PIC via a 6-core wire (supply ×2, sensors ×3 and button ×1). Adding a single channel
radio link would have affected the schematic of the prototype considerably. Data would need
to be transmitted over a single-channel, digital (for reliability) data link. The PIC would need
mounting inside the device along with the transmitter. The device would also need to be
powered by a battery. The packaging had been designed to accommodate these changes
(Figure 53). The radio receiver would need to be mounted with the Max232 PC interface.
There are radio transmitter/receiver pairs available that are compact, have low power consumption, operate at RS-232 data rates and include built-in error correction. Such a device
could be connected to the asynchronous data stream from the PIC (unchanged from current
implementation) and fed straight into the Max232 serial interface. Because the radio link could
be added to the prototype without changing the functionality, an attempt to implement it was
made at this stage, once the rest of the prototype was functioning.
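The byte protocol that the PIC already transmits down the wire (described in Appendix G: a 0xFF header byte, a button byte of 15 or 16, then the X, Y and Z samples) would pass through such a transparent radio link unchanged. As an illustration of the receiving side, the framing could be recovered with logic along these lines — a Python sketch only; the actual PC software was written in Visual Basic, and the function name and resynchronisation strategy are assumptions:

```python
def parse_frames(stream):
    """Recover (button_pressed, x, y, z) tuples from the raw byte
    stream sent by the PIC: a 0xFF header byte, a button byte
    (0x0F = pressed, 0x10 = released), then one sample per axis."""
    frames = []
    i = 0
    while i + 4 < len(stream):
        if stream[i] == 0xFF and stream[i + 1] in (0x0F, 0x10):
            button = stream[i + 1] == 0x0F
            x, y, z = stream[i + 2:i + 5]
            frames.append((button, x, y, z))
            i += 5
        else:
            i += 1  # not a frame boundary: slide along to resynchronise
    return frames
```

Checking the button byte as well as the header makes accidental resynchronisation on a 0xFF data byte less likely, which matters over a noisier radio channel than it does over the wire.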
The LPRS Easy Radio 433 MHz radio transmitter/receiver pair was selected. The specification
for the device and pinout is available at [17] and an antenna design sheet is available from
RadioMetrix [27]. The data sheet was followed and a PCB was designed for the transmitter
and receiver.
The implementation of a radio link had limited success. It was possible to verify that the transmitter was operating, using a spectrum analyser set to 433.92 MHz. The receiver, however, did not work. Testing found that it was drawing less current than the specification stated and, despite the use of a high-powered transmitter, the received signal strength indicator (RSSI) pin produced no output. The aerial was replaced and the power supplies were checked for problems such as excessive voltage ripple. A second receiver was ordered and tested, and shortly after being connected its RSSI pin voltage died (as the first had done).
Due to time limitations and the cost of the malfunctioning receivers, it was decided not to include the wireless link in the prototype. The reason for the malfunctioning is unknown:
perhaps they were damaged by static or were faulty when they were supplied.
12 Testing
Because the 3D cube was implemented from first principles, the software had complete control over the orientation of the cube at all times and could determine the direction each face was pointing using the normals calculated for the back-face culling. It was therefore possible to implement a 3D rotation task, which is described in Figure 21. There are two tests, one more difficult than the other: the harder test requires the user to perform a more complex rotation and to align the cube into position to a higher accuracy.
The test requires the user to align the cube so that the red side faces the user. This is done by
measuring the Y component of the normal to the red face, which will be greatest when the red
face is facing the user. The user is required to make sure the red face is facing forwards within
a certain tolerance. The tolerance is lower for the harder test.
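Assuming a unit normal for the red face expressed in viewer coordinates, the alignment check reduces to comparing the normal's Y component against a threshold derived from the tolerance angle. The following is a minimal illustrative sketch, not the prototype's code; the function name and the degree-based tolerance are assumptions:

```python
import math

def red_face_aligned(normal, tolerance_deg):
    """Return True when the red face points at the user to within the
    given tolerance.  `normal` is the unit normal of the red face in
    viewer coordinates; its Y component is greatest (1.0) when the
    face is square-on, so the test compares that component against
    the cosine of the tolerance angle."""
    return normal[1] >= math.cos(math.radians(tolerance_deg))
```

A smaller tolerance angle raises the cosine threshold, which is how the harder test demands more accurate alignment.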
These tests gave quantitative performance results on speed, accuracy and learning time.
Qualitative results were also important for measuring fatigue and obtaining recommendations.
The testers were asked to comment on how they rated the device against the ergonomical
criteria given at the start of the project (section 2.2) using mean opinion scoring. These results
are given in Figure 59.
The test procedure, as presented to the user, is as follows:
1. The user opts to start the test from the toolbar. Having read the instructions, they click Next.
2. The program deactivates the device and automatically rotates the cube to the start position.
3. The user presses the activate button. A clock automatically starts and is displayed.
4. The user rotates the cube to the position detailed in the instructions and brings the cube to a halt.
5. The software knows the orientation of the cube and when it is lined up accurately enough. The clock is stopped when the position of the cube meets the criteria and the cube is stationary.
6. The time the task was completed in is displayed and the user can either repeat the test (Yes) or continue to use the device (No).
[Chart: task completion time (s) against attempt number (1 to 5) for Test 1 and Test 2 using the prototype device.]
Test 2 was designed to be significantly more difficult than test 1 and the results show this
because it took all of the users longer to complete (in some cases, three times as long).
As a comparison, a similar test was undertaken in a VRML browser. The user was presented with a 3D cube positioned in the same way and, after the same warm-up period of five minutes, was asked to perform the same task as in the device testing. It was not possible to
have the same level of control over the VRML browser, so the tests were not automated and a
stopwatch was used. The results (shown in Figure 59) are discussed in the following section.
[Chart: task completion time (s) against attempt number (1 to 5) for Test 1 and Test 2 using the mouse in the VRML browser.]
The testers were also asked to give a verbal response to the device. Some comments were
favourable and others were constructive criticism. The comments included:
“It’s hard to get used to and at first I got slightly confused... I think there is a certain
knack to it”
“It seemed responsive”
“The button was useful to stop the cube rotating”
“I had to think quite hard at what movement I needed to make although I suppose it was
obvious”
“Perhaps it would be easier with a more descriptive object… part of the problem was
knowing which face was where”
The comments that were made are also made in literature about similar devices (see section
12.5.2).
The following table (Figure 59) shows the MOS (Mean Opinion Score) for the different ergonomical aspects of the project as rated by the test team. Each term was explained and the testers were asked to rate the ergonomical factor from 1 to 5 (a higher score is preferable); the results are the averages of the users’ opinions. The users were also asked to rate how well the mouse performed ergonomically in a 3D environment, to act as a comparison.
12.5.1 Functionality
The functionality tests proved the design of the prototype. Fragmenting the project into sections helped to identify problems: it was possible to test each module and verify correct operation before the system was assembled. The sensors were able to sense all but the slowest of rotations, and the output from the sensors utilised the full range of values available during normal operation. Data was transferred successfully between the sensors and the PC along a wire, and any errors that may have occurred did not affect the operation of the device. A professional-looking graphical user interface was developed to present the processed sensor data to the user, and formed the final part of what turned out to be a very robust design. Two attempts at implementing a wireless interface were unsuccessful because, for unknown reasons, the radio module stopped working, indicated by the sudden disappearance of the received signal strength indication on the receiver. Despite the problems with the wireless link, the prototype was built, including the manufacture of the packaging, and allowed usability tests to be carried out on the design.
A method described as ‘gesture recognition’ was used to turn the output from the sensors into
the rotation of a 3D cube. The gesture recognition had two main objectives: to overcome the
effects of noise and inertial drift that the gyroscopes produced and to make the device perform
well against the ergonomical criteria set at the start of the project. Thresholding was used to
eliminate background noise from the sensors. The use of thresholding in this way had a direct
effect on the usability of the device and results indicate that the device was not as intuitive to
use or as quick to use as a mouse for performing 3D rotation. The tests designed into the graphical user interface produced accurate, quantitative results that could be used to compare the prototype device with a mouse. All ten users took at least approximately twice as long to perform the simple rotation of a 3D object with the device as with a mouse. These findings match those of Zhai [33], who states: “With proper clutching mechanism, it is conceivable to
implement an isometric device in position control mode or an isotonic device in rate control
mode. Interestingly, these two combinations tend to produce poor user performance. The
reason is quite simple: the self-centering mechanism in an isometric device facilitates the
‘start, speed-up, maintain speed, slow-down and stop’ cycle in rate control. The later half of
the cycle is somewhat automatic with the self-centering mechanism in isometric devices. With
a free moving device, one has to deliberately return to the null position.”
It is not felt, however, that this is conclusive evidence that the prototype device is significantly less usable than a mouse, for the simple fact that all testers had experience of using a mouse running into decades. The users only had approximately 15 minutes of contact time with the device and, as Zhai comments, “Rate control is an acquired skill. A user typically takes tens of
minutes, a significant duration for learning computer interaction tasks, to gain controllability of
isometric rate control devices. It may take hours of practice to approach the level of isotonic
position speed.”[33]
To test the device fairly against a mouse, tests would have to be carried out on users who had no experience with either device. Performing such a test would be futile, however, because the overwhelming majority of the people this device would be likely to attract would have experience of using a mouse. For such a 3D object controller to become marketable, the device would need to allow the user to perform a task ‘better’ than a mouse having had a lot less exposure.
The alternative to finding a user with little experience using a mouse is a user with a lot of
experience with the device. The author performed a lot of tests using the device during
development. It can be claimed that the author was significantly better at using the device
than other users in terms of speed and accuracy of performing a task. As Zhai [33]
commented, using such a device is an acquired skill, and it is unclear whether users would be prepared to use such a device for everyday tasks. Testing found that the majority of users were happier using a mouse, finding the prototype more of an interesting toy than a useful tool.
Allowing users to choose the settings they found most appropriate yielded optimised values for the different settings implemented. Adjusting these settings had a heavy impact on the usability of the device.
The sensitivity value was set depending on how still the user could keep their hand, especially when it came to isolating a single axis and not letting components of an intentional rotation appear on the other axes. The sensitivity setting had to be a compromise between having to make unnecessarily large ‘jerks’ of the hand to break the threshold and introducing these unwanted components of rotation. At no point was the preferred threshold level low enough that background noise broke the threshold.
The amplification at the output stage was selected at approximately the same level for all
users. This was a compromise between having to make sharp ‘jerks’ of the wrist to produce a
large output and having an output that saturated even for a gentle rotation.
The maximum output was largely left at the default by the majority of users. The default value is the comfortable maximum angular velocity at which the output is capped. The maximum angular velocity was a setting worth experimenting with, especially with a larger amplification in normal mode. One particular example is setting the amplification high in normal mode so that small gestures are amplified until they saturate the output of the device; the extreme case is where the output is either zero or saturated. None of the users selected these settings, opting instead for settings where the output was analogous to the speed of the gesture. This was a positive outcome for the device because users preferred to use a variable-rate output, as per the main design of the device.
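Taken together, the sensitivity threshold, the amplification and the maximum-output cap form a simple per-sample pipeline. The following Python sketch illustrates the idea; the names and numbers are hypothetical, and the prototype's actual processing differs in detail:

```python
def process_sample(rate, threshold, gain, max_output):
    """Map one zero-centred angular-rate sample to an output value:
    samples inside the threshold band are treated as background noise
    and dropped, the remainder is amplified by `gain`, and the result
    is capped at the comfortable maximum angular velocity."""
    if abs(rate) < threshold:
        return 0.0
    scaled = rate * gain
    return max(-max_output, min(max_output, scaled))
```

The extreme settings described above correspond to a very large gain, which collapses the output to either zero or saturation; the settings users actually chose keep `scaled` inside the cap for typical gestures, giving the variable-rate behaviour.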
Half of the testers opted to use the auto-damped output mode. This mode of operation is simpler to use than normal mode because there is no need to think about how to counteract rotation, so some users found it easier and completed the usability tests more quickly in this mode. That said, users who were comfortable with normal mode generally performed the usability tests more quickly than testers who used the auto-damping modes. Most users left the damping duration at the default value. There is certainly potential to develop this mode of operation, in particular allowing the user to ‘nudge’ the 3D object in a different direction to the one it is moving in (currently the user has to wait for the output to decay to zero before any other direction on that axis is recognised).
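The auto-damped behaviour, including the limitation just noted, might be modelled per update step along the following lines — a hypothetical sketch, not the prototype's implementation:

```python
def damped_step(output, new_input, decay):
    """One update of a hypothetical auto-damped axis.  While the
    previous output is still decaying, input in the opposite
    direction (or no input) is ignored and the output simply moves
    toward zero by `decay`; once the output has reached zero, any
    new input is accepted.  Input in the same direction re-triggers
    the output immediately."""
    if output != 0.0 and new_input * output <= 0:
        if output > 0:
            return max(0.0, output - decay)   # decay toward zero
        return min(0.0, output + decay)
    return new_input
```

The suggested ‘nudge’ improvement would amount to accepting opposite-direction input in the first branch instead of ignoring it.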
The time-out feature used in normal mode was not adjusted by any of the users. This feature was implemented to prevent a problem that was found during development (see Figure 37). The feature should be included, but there is no need for the user to alter the setting from its default.
13 Project Conclusion
13.1 Criteria Rating
It is hard to give an overall rating for the project because the criteria in the specification are ordered by importance. The most important requirement, that the device be capable of manipulating a 3D object, was achieved. The device also scored well on being robust and reliable. The areas where the project was not completed successfully were the implementation of six degrees of freedom (only rotation was implemented) and the wireless link, which was not included.
The project was, on the whole, successful, and some interesting results and conclusions on inertial sensing can be drawn from it. These are discussed in the following section.
13.2 Discussion
Having completed the project, this section reflects on how well the project was executed, summarises the significant findings and outlines further work that is likely to enhance the functionality of the prototype. It is hoped that the findings from this investigation into an inertial device for the control of 3D graphics will prompt further work in this area, particularly in the Department, and add to the existing knowledge base on inertial sensing.
At the start of the project it was important to justify it, so that a clear set of objectives could be defined. It is believed that there is a need for a device to control 3D graphics, a need that is becoming increasingly apparent as 3D graphics become incorporated into web pages. The mouse is the default choice for the control of 3D graphics because it is low cost, robust and has excellent ergonomic attributes. The mouse operates best in a two-dimensional environment: for 2D pointing and dragging tasks the average time is generally found to be less than 1 second (Table 4.9 [10]), whereas for 3D tasks the experiments in this project found that the time taken was, for the majority of users, over 6 seconds (Figure 59). Zhai [33] states: “As
three-dimensional (3D) graphics moves to the core of many mainstream computer systems
and applications, the search for usable input devices for 3D object manipulation becomes both
an academic inquiry and a practical concern. In the case of the 2D Graphical User Interface
(GUI), the computer mouse established itself very early and quickly replaced the light pen as
the de facto standard input. In the case of 3D interfaces, however, there is still not an obvious
winner suitable for all applications,”
There are no mainstream inertial devices available that have a widespread user base. The problem facing inertial devices is the overwhelming domination of the mouse on desktop PCs: any new device would have to be as cheap, robust and usable as a mouse. The cost of inertial sensors and the necessary accompanying circuitry is gradually falling, partly due to new MEMS fabrication techniques: “The use of inertial components to measure gestures has increased due to the availability of cheap micro-electromechanical systems,” [11]. Inertial sensing is a complex subject, owing to the fundamental problems of inertial sensors; this increases the cost and, unless the solution totally overcomes those problems, will mean the device is not as robust or dependable as a mouse. Having to pick up the device is also a disadvantage because it makes switching between the device and the mouse or keyboard
slow. The final problem is the usability of a 3D input device that the user holds in their hand: unless the device were very light, prolonged use would lead to fatigue. These disadvantages might outweigh the intuitiveness and speed with which a 3D inertial device could perform. To compete with a mouse the device would need considerable advantages, especially considering the versatility of the mouse and that the majority of the computing world is experienced at using one. Specialist applications (such as studying cuneiform tablets in museums) might be where there is demand for such a device. Specialist markets might be able to justify the extra expense of purchasing a device, although the demand compared with the desktop PC market would be comparatively low, so the incentives to developers would be small. It is, however, worth developing the idea, to see whether the performance of such a device can be proven; with the cost of inertial sensors dropping, it may become marketable in the future.
A prototype device was developed as part of an investigation into the application of inertial sensors to the control of 3D graphics. Through books and substantial research on the Internet it has been possible to offer many possible solutions to meet the design criteria. An industry expert was also contacted and provided interesting comments on what this project was trying to achieve. Before discussing the solution and further developments, two particular areas of the research shall be discussed because interesting conclusions can be drawn from them: the USB HID specification and the application of inertial sensors to human input devices.
USB connectivity was desirable for the device because of future compatibility with PCs and
because the device could be designed to be automatically supported by Windows without
additional drivers. The HID specification was released by the USB Implementers Forum to
allow hardware developers to build computer input devices that are able to interface directly
with applications that also have HID support. By adopting the specification, hardware
developers can ensure that their device will be compatible with existing and future software
that supports the specification. Research, including contact with an industry expert, found that the specification has not been adopted by much of the industry despite one huge advantage: that application-specific drivers for each device would not have to be developed. For example, 3Dconnexion specify that their top-of-the-range SpaceMouse is supported by “100+ applications Catia, UG, Pro/ENGINEER, SolidWorks, SolidEdge, Inventor and 3Dstudio
among others,” [1]. If 3Dconnexion and the software developers had used the HID specification, listing specific software packages would not be necessary. The problem could be that software developers are reluctant to incorporate HID support properly; after all, it is up
to the device developers to make sure that their device can interface with software, “Since
each company created its own 3D-device interface, each had to work with many vendors to
incorporate support into CAD software, and not all of the vendors were willing to do this two or
three times to support all the devices available,” [28]. Alternatively, it could be the hardware
developers that are not demanding HID support from application developers. Either way, the
HID specification seems to have many advantages that would benefit the developers and
consumers by improving the compatibility between human input devices and applications and
hopefully developers will start to develop human input devices using it.
This project also included a study into the application of accelerometers and gyroscopes to an inertial input device that would sense the movement of the user’s hand in all six degrees of freedom. There are two ways of arranging inertial sensors to measure six degrees of freedom: using three dual-axis accelerometers, or using three gyroscopes and three single-axis accelerometers. Gravity has severe implications for the first method because the accelerometers sense force, so gravity must be cancelled out. The second method (using gyroscopes and accelerometers) does not suffer from this because gyroscopes are not affected by gravity: the orientation of the sensor can be calculated, and the corresponding component of acceleration due to gravity sensed by the accelerometers can be cancelled out. The latter method was chosen as the arrangement of the inertial sensors. Due to time limitations, accelerometers were not used, so the device was only able to sense 3D rotation.
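The gravity cancellation described above amounts to subtracting, from each accelerometer reading, the component of g along that sensor's axis as given by the gyroscope-derived orientation. Reduced to two dimensions for clarity, the idea can be sketched as follows (an illustrative sketch only; this stage was not implemented in the prototype):

```python
import math

def cancel_gravity(accel, pitch, g=9.81):
    """Remove gravity from a two-axis accelerometer reading
    (body x = forward, body z = up), given a pitch angle in radians
    obtained by integrating the gyroscope output.  A stationary
    accelerometer reads +g on its up axis, so with the body pitched
    nose-up by `pitch` gravity contributes g*sin(pitch) on body x
    and g*cos(pitch) on body z; subtracting these components leaves
    the true linear acceleration."""
    ax, az = accel
    return (ax - g * math.sin(pitch), az - g * math.cos(pitch))
```

The sketch also shows why orientation must be known first: an error in the pitch estimate leaks a fraction of g into the computed linear acceleration.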
There are significant problems, other than the effects of gravity, that will be encountered by any device that uses inertial sensors to measure its own orientation and position. These are caused by imperfections in the sensors’ outputs: inertial sensors suffer from noise and from inertial drift, because they do not measure their movement precisely (as if they were slipping). When calculating the absolute position and orientation of an object, the output from the sensors has to be integrated (twice for accelerometers, once for gyroscopes), which amplifies the imperfections in the output from the sensors. The result is that the device becomes increasingly disorientated with time. To overcome this problem the device must re-calibrate
periodically against fixed reference points. Methods include taking a reference from gravity and magnetic north, vision processing to lock onto light sources in the local surroundings, and using two or more artificial sonar sources.
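The growth of error under integration is easy to demonstrate numerically: even a tiny constant bias on a gyroscope output, once integrated, produces an orientation error that grows linearly with time (and a twice-integrated accelerometer bias grows quadratically). An illustrative sketch with invented numbers:

```python
def integrate(samples, dt):
    """Rectangular integration of a rate signal into an angle."""
    angle = 0.0
    for rate in samples:
        angle += rate * dt
    return angle

# A stationary gyroscope with an invented 0.01 deg/s bias, sampled
# at 100 Hz: after one minute the estimated orientation has drifted
# by roughly 0.6 degrees even though the device never moved, and the
# error keeps growing for as long as no re-calibration takes place.
bias_samples = [0.01] * 6000
drift = integrate(bias_samples, 0.01)
```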
Including the extra sensors and the complex data processing required to overcome the problems with inertial sensing described above was outside the time limitations imposed on the project. Instead, a prototype was developed that used ‘gesture recognition’ to overcome these problems. It uses thresholding to overcome noise, and orientation remains constant because the user returns the device to the ‘null’ position after each gesture. The output from the sensors is not integrated, preventing the inertial sensor inaccuracies from being amplified.
The gesture recognition based approach overcame the problems that would have been encountered using direct mapping. The proven downside of this method is that usability suffered as a result: testing showed that the device was slower than a mouse at rotating 3D objects. Particular usability problems included users not returning the device to the null position after making a gesture, then attempting to make another gesture, which was interpreted incorrectly because the device was misaligned. Also, some users did not find the device as intuitive as was hoped and found themselves having to concentrate on what movement to make to achieve the desired rotation, especially when the device was already moving in another axis. The gesture recognition did not perform well against the mouse in its current guise. There is scope for development of the gesture recognition design, and it is possible that gesture recognition could end up being as intuitive to use as direct mapping is expected to be. Improvements could include different operational modes (such as a linearly decreasing output) or matching outputs to complex mathematical models such as Hidden Markov Models, which have “had great success in the area of human gesture recognition.” [7]
Despite the ergonomical problems that have been discussed, the project successfully demonstrated a fully working, robust hardware system. The project proved the use of the MEMS Murata Gyrostar gyroscopes, a robust system that transmitted the output from the gyroscopes to the serial port of a PC using a PIC, and a professional-looking and robust graphical user interface to test the device. The author benefited from the chance to learn Visual Basic, improve PIC programming skills, understand the USB interface, develop 3D graphics from first principles, and research and gain first-hand experience of using inertial sensors.
Further work on the prototype could include:
• Adding accelerometers to the prototype to measure translation. The PIC and the serial interface have the capacity to handle the extra complexity and data rate.
• Solving the problems encountered with the radio transmitter and receiver module to make the device wireless.
• Adding the extra sensors to the prototype that are required for the device to recalibrate electronically for ‘direct mapping’ mode.
• Attempting user recalibration. The hardware and GUI could remain largely unchanged, and the gesture recognition algorithm could be replaced with data processing that would probably require the use of Kalman filtering.
14 References
Appendix A: Implementing USB on a PIC 16C745
Research found that there was limited support available for the 16C745 due to its relatively recent release in 2001. One project was found that used the PIC16C745 to develop a USB HID microphone [18]. That project gave details of the circuit built and the modifications made to the sample code provided by Microchip [21] to construct a microphone, and was a useful reference during this project. The sample code is available from the Microchip website [21]. It is packaged as usb122c.zip and contains 6 assembly files and one MPLAB project file that can be assembled and programmed onto the PIC to demonstrate the chip rotating the mouse cursor in circles on the screen.
The sample code included the following four files, as well as 22 associated files that were not to be modified:
The sample mouse demonstration was built using MPLAB (loading the movecurs.pjt project file) into an Intel Hex file that was then loaded onto the microcontroller using the WPIC software and PIC programmer. The settings for the programmer were:
• Oscillator H4 (HS)*
• Watchdog OFF
Report Descriptor
retlw 0x05
retlw 0x01 ; usage page (generic desktop) See HID Usage Tables [30]
retlw 0x09
retlw 0x04 ; usage (joystick)
retlw 0xA1
retlw 0x01 ; collection (application)
retlw 0x09
retlw 0x01 ; usage (pointer)
retlw 0xA1
retlw 0x00 ; collection (physical)
retlw 0x05
retlw 0x09 ; usage page (buttons)
retlw 0x19
retlw 0x01 ; usage minimum (1) Two-button joystick
retlw 0x29
retlw 0x02 ; usage maximum (2)
retlw 0x15
retlw 0x00 ; logical minimum (0) Logic value for buttons
retlw 0x25
retlw 0x01 ; logical maximum (1)
retlw 0x95
retlw 0x02 ; report count (2) Two bits..
retlw 0x75
retlw 0x01 ; report size (1) ..of single bit size
retlw 0x81
retlw 0x02 ; input (2 button bits)
retlw 0x95
retlw 0x01 ; report count (1)
retlw 0x75
retlw 0x06 ; report size (6) Six bits of padding
retlw 0x81
retlw 0x01 ; input (constant 6-bit padding)
retlw 0x05
retlw 0x01 ; usage page (generic desktop)
retlw 0x09
retlw 0x31 ; usage (Y) Joystick axes
retlw 0x09
retlw 0x30 ; usage (X)
retlw 0x15
retlw 0x81 ; logical minimum (-127) 255 levels
retlw 0x25
retlw 0x7F ; logical maximum (127)
retlw 0x75
retlw 0x08 ; report size (8) Each value is eight bits
retlw 0x95
retlw 0x03 ; report count (3) There are three values
retlw 0x81
retlw 0x02 ; input (data, variable, absolute)
retlw 0xC0 ; end collection
retlw 0xC0 ; end collection
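Reading the report count and report size values from the hexadecimal arguments above, the input report packs two button bits, six padding bits and three eight-bit values into exactly four bytes. A small illustrative check (not part of the project's code):

```python
# (report count, report size in bits) for each input item in the
# descriptor above: the button bits, the padding bits, and the
# eight-bit axis values.
items = [(2, 1), (1, 6), (3, 8)]
total_bits = sum(count * size for count, size in items)
total_bytes = total_bits // 8
```

Byte alignment of the whole report is required for a valid HID input report, which is the reason the padding item follows the button bits.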
Appendix B: Circuit Diagrams
Gain = −(R1 + VR1) / R2
Appendix C: Correspondence
Request sent on 25th November 2002 for advice on HID support to Peter Sheerin
psheerin@cmp.com
Peter,
I am currently trying to develop a 3D input device for a PC using the USB HID specification. I have read
your article on Cadence that you wrote back in February and I hope you don't mind me contacting you.
The device I am making comes under group 8, multi axis controller, of the USB spec. It's an inertial
device that will send Rx, Ry and Rz data to the computer. The scope of my project is to build the device
but I need to interface it to some software to demonstrate it working. All it needs to do is rotate a cube
or something. I emailed Cortona and they say that I need to use their SDK to get my device to work. I
was thinking about grouping my device as a joystick instead, but I read on the MS website that there
might be problems as a joystick uses absolute position and I want to send angular velocity information.
Do you know of any software that might support my device for demonstration purposes or anyone else
who may be able to help me? Any help would be much appreciated.
Many thanks
Appendix D: Hardware CAD designs
Top View
Appendix E: Component List and Budget
The initial budget for this project was £100. Due to the cost of the Murata Gyrostar gyroscopes alone, this budget was exceeded. The approximate expenditure was as follows:
Appendix F: Sample Data
[Chart: example inputs and outputs (value against time, 0 to 20 s) for the X, Y and Z axes.]
Figure 72: Example Inputs and Output for ‘Normal Mode’
[Chart: a second set of example inputs and outputs (value against time, 0 to 20 s) for the X, Y and Z axes.]
Appendix G: PIC code
__CONFIG _CP_OFF & _WDT_OFF & _BODEN_OFF & _PWRTE_OFF & _HS_OSC ;configuration word
;*******************************************************************************************************************************
; Defining variables not declared in p16c774.inc
COUNT equ 0x22
TEMP equ 0x23
VALUEX equ 0x24
VALUEY equ 0x25
VALUEZ equ 0x26
ANALOGR equ 0x27
ADCHAN equ 0x28
BUTTON equ 0x29
;*******************************************************************************************************************************
; Startup instructions. This section comes from 16c745 template
;*******************************************************************************************************************************
; Start of main program
;*******************************************************************************************************************************
Loop
bcf STATUS,RP0 ; select bank 0
movf PORTD,w ; read PORTD into working
movwf TEMP
btfsc TEMP,4 ; Checks activate button..
movlw 0x0f ; ..if pressed, 15 moved to working
btfss TEMP,4
movlw 0x10 ; ..else 16 moved to working
movwf BUTTON ; store the result in BUTTON
movlw 0x41 ; channel 0 selected, fosc/8, a/d on: 01000001
movwf ADCHAN ; puts value into ADCHAN
call Analog
movf ANALOGR,w ; puts analog sub-routine result into working
movwf VALUEX ; stores into VALUEX
;*******************************************************************************************************************************
;This section sets all ports.
Init_ports
bcf STATUS,RP0 ; switch to bank 0
clrf PORTA ; clear ports A - E
clrf PORTB
clrf PORTC
clrf PORTD
clrf PORTE
bsf STATUS,RP0 ; switch to bank 1
movlw 0xff ; sets all of PORTA to inputs
movwf TRISA
movlw 0x00 ; sets PORTB to outputs
movwf TRISB
movlw 0x80 ; PORTC outputs apart from pin 7 which is the UART Rx
movwf TRISC
movlw 0x10 ; pin4 of PORTD is input for activate button
movwf TRISD
movlw 0xff ;PORTE set to inputs (not used)
movwf TRISE
movlw 0x3B ; sets up ADCON1
movwf ADCON1 ;left justified, ext. Vref+/AVss as reference, AN<0, 1, 2> analogue
return
;*******************************************************************************************************************************
; Sets up the UART
Init_serial
bsf STATUS,RP0 ; select bank 1
movlw 0x19 ; high baud rate, 4mhz oscillator, 9600bps
movwf SPBRG
movlw 0x24 ; brgh = 1 (high speed) and transmit = 1 (enabled)
movwf TXSTA ; sets up transmit status
bcf STATUS,RP0 ; goto bank 0
movlw 0x90 ;Receive not used but must be implemented..
movwf RCSTA ;..to enable serial port (bit SPEN set to 1)
return
;*******************************************************************************************************************************
; Transmits data over UART and to LEDs connected to PORTB
Xmit
bcf STATUS,RP0 ; select bank 0
call Transdelay ; waits 1ms for data to be transmitted
movlw 0xff ; moves 255 (header 1)to working
movwf TXREG ; moves to Transmit buffer
movwf PORTB ;moves to PORTB LEDs
call Transdelay
movf BUTTON,w ;second header. Value depends whether button pressed
movwf TXREG
movwf PORTB
call Transdelay
movf VALUEX,w ; Transmit VALUEX and output to LEDs
movwf TXREG
movwf PORTB
call Transdelay
movf VALUEY,w ; Transmit VALUEY and output to LEDs
movwf TXREG
movwf PORTB
call Transdelay
movf VALUEZ,w ; Transmit VALUEZ and output to LEDs
movwf TXREG
movwf PORTB
return
;*******************************************************************************************************************************
; This routine originally polled TXSTA bit 1 to see if the TSR register was empty. However, polling TSR
; did not work, so a timing loop has been implemented instead, based on the time to send 10 bits at 9600bps
Transdelay
movlw 0x54
movwf COUNT
Loop5 decfsz COUNT,f ;250uS
goto Loop5
return
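The comment above notes that the fixed delay is sized from the time to send 10 bits at 9600 bps. As a quick sanity check on that figure (a sketch only, assuming the usual asynchronous frame of 1 start + 8 data + 1 stop bit):

```python
# Sketch: the UART frame time the software delay must cover.
# Assumes a standard asynchronous frame: 1 start + 8 data + 1 stop bit.
BAUD = 9600
BITS_PER_FRAME = 10

frame_time_ms = BITS_PER_FRAME / BAUD * 1000  # milliseconds per byte on the wire

print(round(frame_time_ms, 3))  # 1.042
```

So roughly 1 ms must elapse between loads of TXREG, which matches the "waits 1ms" comment at the call site.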
;*******************************************************************************************************************************
; Routine to read values from a/d converter. p121 16c774 data sheet
Analog
bcf STATUS,RP0 ; select bank 0
movf ADCHAN,w
movwf ADCON0 ;moves values of ADCHAN to ADCON0
movlw 0x0f ;waits ~50us
movwf COUNT
Loop2 decfsz COUNT,f
goto Loop2
bsf ADCON0,2 ; starts A/D conversion
Loop4 btfsc ADCON0,2
goto Loop4 ;loops until ADCON0,GO is clear, i.e. once conversion has taken place
movf ADRESH,w ;moves high byte of conversion to working
movwf ANALOGR ;stores result into ANALOGR
return
END ; directive 'end of program'
;*******************************************************************************************************************************
Appendix H: Visual Basic Code
Main
End Sub
Reset_all_x
Reset_all_y
Reset_all_z
X_val = 0
X_out = 0
Y_val = 0
Y_out = 0
Z_val = 0
Z_out = 0
End Sub
If Simulate_input = False Then 'simulated input has priority over the com port if enabled
Input_buffer(Endval) = Asc(Data) ' places a decimal conversion of the ASCII symbol into Input_buffer at the next slot
Update 'calls update
End If
End If
End Sub
End Sub
End If
Valuex = Input_buffer(Endval - 2) 'takes the last 3 values and stores them into Valuex, Valuey and Valuez
Valuey = Input_buffer(Endval - 1)
Valuez = Input_buffer(Endval)
Store 'calls store
End If
End If
End Sub
Print #1, Mytime; ","; Valuex; ","; Valuey; ","; Valuez; ","; X_out; ","; Y_out; ","; Z_out; ","; X_cal; ","; Flipped_x;_
","; Gesture_phase1_detected_x; ","; Gesture_phase2_detected_x; ","; Peak_phase1_detected_x;_
","; Peak_phase2_detected_x; ","; Currently_auto_damping_x; ","; Valid_after_damp_x;_
","; X_val_previous; ","; Auto_damping; ","; Damping_factor_x; ","; X_int; ","; Max_peak_phase1_x;_
","; Damping_duration ‘Prints headers for inputs and outputs and variables
End If
End Sub
Private Sub Settings_Click() 'replaces the log data form with the settings form
Settings_form.Show
Record_form.Hide
End Sub
X_low_thresh = X_cal - Threshold 'works out upper and lower threshold for x axis
X_high_thresh = X_cal + Threshold
Y_low_thresh = Y_cal - Threshold 'works out upper and lower threshold for y axis
Y_high_thresh = Y_cal + Threshold
Z_low_thresh = Z_cal - Threshold 'works out upper and lower threshold for z axis
Z_high_thresh = Z_cal + Threshold
End Sub
X_val = (2 * X_cal) - Valuex 'if flipped is true, will invert all inputs
Else
End If
Currently_auto_damping_x = True
Damping_factor_x = Sqr(X_int / ((Damping_duration) ^ 2)) 'works out damping factor
Max_peak_phase1_x = X_int 'parameter for damping
Damping_time_x = 0 'this is the current time from peak
End If
End If
X_int = 0 'sets output to zero. this is so user doesn’t have to exactly counter-act phase 1 to make velocity 0
End If
End If
Valid_after_damp_x = True 'uses this variable so a gesture is not recognised half cycle
X_int = X_val - X_high_thresh 'difference between input and upper threshold
Else
Valid_after_damp_x = False 'won't validate damping if damping when Gesture phase 1 detected
End If
End If
X_int = X_val - X_high_thresh 'outputs only if not autodamping and cycle completed
End If
Else
Currently_auto_damping_x = True
Damping_factor_x = Sqr(X_int / ((Damping_duration) ^ 2)) 'works out damping factor
Max_peak_phase1_x = X_int 'parameter for damping
Damping_time_x = 0 'this is the current time from peak
End If
End If
End If
End If
If X_val < X_low_thresh Then 'detects if input goes below lower threshold
If Peak_phase1_detected_x = False Then 'if input is a negative gesture the input will need to be flipped
'flip_function and all inputs will be inverted then call monitor again
If Currently_auto_damping_x = False Then
Flipped_x = True
End If
Exit Sub
Else
X_int = X_int - (X_low_thresh - X_val) 'if auto-damping is off, will subtract input - calibration
End If
X_val_previous = X_val 'stores previous value for comparison
Else
X_int = X_int - (X_val_previous - X_val) 'subtracts difference from previous input from output
End If
X_val_previous = X_val 'stores for comparison
Else
X_int = 0 'sets output to zero - this is so the user does not have to exactly counter-act phase 1 to make velocity 0
End If
End If
End If
End If
End If
End If
If Currently_timed_out_x = False Then 'sets timed out the first time crosses back over lower threshold
Currently_timed_out_x = True
Current_Time_out_time_x = Time_out
End If
End If
If Flipped_x = False Then ' if flipped is true, will invert x_val before output
End If
Else
End If
End If
End Sub
Monitor_X is then repeated for the Y and Z axes (Monitor_Y and Monitor_Z respectively)
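The auto-damping factor computed inside Monitor_X (`Damping_factor_x = Sqr(X_int / ((Damping_duration) ^ 2))` in the listing) can be sketched as a small function; the function name and sample values below are illustrative, not from the report:

```python
import math

def damping_factor(peak, duration):
    # Mirrors the listing's Sqr(X_int / ((Damping_duration) ^ 2)):
    # the larger the phase-1 peak, or the shorter the requested damping
    # duration, the more aggressively the output is pulled back to zero.
    return math.sqrt(peak / duration ** 2)

print(damping_factor(100.0, 50))  # sqrt(100 / 2500) = 0.2
```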
Public Sub Timer2_Timer() 'triggered every 10ms
If Activated_mode = True Then 'if device is activated
Monitor_x 'calls gesture recognition functions in turn
Monitor_y
Monitor_z
Else
End If
End Sub
X_out = 0
X_int = 0
Flipped_x = False
End If
Auto_damping = Settings_form.Auto_damping_box.Value
Damping_duration = Settings_form.Damping_duration_slider.Value
Currently_timed_out_x = False
End Sub
Y_out = 0
Y_int = 0
Flipped_y = False
End If
End Sub
Z_out = 0
Z_int = 0
Flipped_z = False
End If
End Sub
End If
End If
End Sub
Auto_damp_X is then repeated for the Y and Z axes (Auto_damp_Y and Auto_damp_Z)
Cube
Pi = 3.14159265358979
End Sub
Corner_coordinate(1).y_coordinate = Start_length
Corner_coordinate(2).y_coordinate = Start_length
Corner_coordinate(3).y_coordinate = Start_length
Corner_coordinate(4).y_coordinate = Start_length
Corner_coordinate(5).y_coordinate = -Start_length
Corner_coordinate(6).y_coordinate = -Start_length
Corner_coordinate(7).y_coordinate = -Start_length
Corner_coordinate(8).y_coordinate = -Start_length
Corner_coordinate(1).z_coordinate = -Start_length
Corner_coordinate(2).z_coordinate = -Start_length
Corner_coordinate(3).z_coordinate = Start_length
Corner_coordinate(4).z_coordinate = Start_length
Corner_coordinate(5).z_coordinate = -Start_length
Corner_coordinate(6).z_coordinate = -Start_length
Corner_coordinate(7).z_coordinate = Start_length
Corner_coordinate(8).z_coordinate = Start_length
End Sub
End Sub
Dim i As Integer
'Coordinates backed up and now restored. This is done because we don't want to replace them until both of the above calculations are done
Corner_coordinate(i).y_coordinate = New_y_coordinate
Corner_coordinate(i).z_coordinate = New_z_coordinate
Next
End Sub
Rotate_x_axis is then repeated for the Y and Z axes (Rotate_y_axis and Rotate_z_axis)
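Rotate_x_axis is not reproduced in full above, but the New_y_coordinate / New_z_coordinate back-ups suggest the standard rotation about the x axis, computed into temporaries so that the second formula does not read an already-updated y. A hedged sketch of that geometry:

```python
import math

def rotate_x(y, z, angle):
    # Standard rotation about the x axis; both new values are computed
    # from the *old* y and z, which is why the listing stores them in
    # New_y_coordinate / New_z_coordinate before overwriting the corner.
    new_y = y * math.cos(angle) - z * math.sin(angle)
    new_z = y * math.sin(angle) + z * math.cos(angle)
    return new_y, new_z

# A quarter turn carries (y, z) = (1, 0) onto (0, 1), up to rounding.
print(rotate_x(1.0, 0.0, math.pi / 2))
```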
End Sub
For Count = 4 To 6 Step 1 'counts through the 3 faces that are visible
Current_face = Render_order(Count) 'recalls the 3 visible faces (there may only be 1, in which case the other 2 are covered up by the front one)
End Sub
End Sub
Public Sub Calculate_rendering_order() 'sorts the faces by how directly they face the viewer
For Render_position = 1 To 6 Step 1 'cycles through the 6 places in the render_order array
Lowest_y = 1000 'an arbitrary high number
For Count = 1 To 6 Step 1 'cycles through each of the faces
If Sorted(Count) = False Then 'for sides that have not already been sorted
If Normal_coordinate(Count).y_coordinate < Lowest_y Then 'if this side is the lowest of those still unsorted
Lowest_y = Normal_coordinate(Count).y_coordinate ' store the y component of this face's normal
End Sub
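Calculate_rendering_order is a selection sort on the y component of each face normal: the face pointing furthest away is placed first, so faces are drawn back-to-front (the painter's algorithm), which is why the rendering loop only draws positions 4 to 6. A minimal sketch of the same idea (face numbers and normal values below are illustrative):

```python
def render_order(normal_y):
    # Repeatedly pick the unsorted face whose normal has the lowest y
    # component (furthest from the viewer), so the resulting order
    # paints faces back-to-front, as in the painter's algorithm.
    order = []
    remaining = dict(normal_y)  # face number -> y component of its normal
    while remaining:
        lowest = min(remaining, key=remaining.get)
        order.append(lowest)
        del remaining[lowest]
    return order

print(render_order({1: 0.5, 2: -0.8, 3: 0.1}))  # back-to-front: [2, 3, 1]
```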
Face_color(1).R = 255
Face_color(1).G = 0
Face_color(1).B = 0
Face_color(2).R = 122
Face_color(2).G = 238
Face_color(2).B = 248
Face_color(3).R = 0
Face_color(3).G = 0
Face_color(3).B = 255
Face_color(4).R = 255
Face_color(4).G = 179
Face_color(4).B = 17
Face_color(5).R = 226
Face_color(5).G = 0
Face_color(5).B = 246
Face_color(6).R = 247
Face_color(6).G = 225
Face_color(6).B = 9
End Sub
Public Sub Face_corner_numbering() 'an array that stores corner numbers of faces
' face numbering 1: 1234
' 2: 2376
' 3: 6785
' 4: 5841
' 5: 4873
' 6: 5126
Face_corners(1).Corner1 = 1
Face_corners(1).Corner2 = 2
Face_corners(1).Corner3 = 3
Face_corners(1).Corner4 = 4
Face_corners(2).Corner1 = 2
Face_corners(2).Corner2 = 3
Face_corners(2).Corner3 = 7
Face_corners(2).Corner4 = 6
Face_corners(3).Corner1 = 6
Face_corners(3).Corner2 = 7
Face_corners(3).Corner3 = 8
Face_corners(3).Corner4 = 5
Face_corners(4).Corner1 = 5
Face_corners(4).Corner2 = 8
Face_corners(4).Corner3 = 4
Face_corners(4).Corner4 = 1
Face_corners(5).Corner1 = 4
Face_corners(5).Corner2 = 8
Face_corners(5).Corner3 = 7
Face_corners(5).Corner4 = 3
Face_corners(6).Corner1 = 5
Face_corners(6).Corner2 = 1
Face_corners(6).Corner3 = 2
Face_corners(6).Corner4 = 6
End Sub
Bars
Y_cal = 125
Y_low_thresh = 105
Y_high_thresh = 145
Z_cal = 125
Z_low_thresh = 105
Z_high_thresh = 145
End Sub
'calls functions
Process_X_process_bar
Process_Y_process_bar
Process_Z_process_bar
Process_X_input_bar
Process_Y_input_bar
Process_Z_input_bar
Process_X_output_bar
Process_Y_output_bar
Process_Z_output_bar
End Sub
End Sub
End Sub
End Sub
If Valuex > X_high_thresh Then 'draws red up to high thresh and green past that
X_process_bar.Line (X_cal, 0)-(X_high_thresh, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
X_process_bar.Line (X_high_thresh, 0)-(Valuex, X_process_bar.ScaleHeight), RGB(0, 255, 0), BF
End If
If Valuex > X_cal And Valuex <= X_high_thresh Then 'between x-cal and high thresh just red line
X_process_bar.Line (X_cal, 0)-(Valuex, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
End If
If Valuex < X_cal And Valuex >= X_low_thresh Then 'between low threshold and x-cal just red line
End If
If Valuex < X_low_thresh Then 'lower than low thresh then red line up to low thresh and green past that
X_process_bar.Line (X_low_thresh, 0)-(X_cal, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
X_process_bar.Line (Valuex, 0)-(X_low_thresh, X_process_bar.ScaleHeight), RGB(0, 255, 0), BF
End If
End Sub
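The bar-drawing rules above can be summarised as a small pure function; the segment tuples, and the red segment for the between-low-threshold-and-calibration branch (whose Line call is missing from the excerpt), are illustrative reconstructions rather than the report's code:

```python
def bar_segments(value, cal, low, high):
    # Red covers the span between calibration and the threshold the
    # reading is heading towards; green covers whatever lies beyond it.
    if value > high:
        return [("red", cal, high), ("green", high, value)]
    if value > cal:
        return [("red", cal, value)]
    if value >= low:
        return [("red", value, cal)]
    return [("red", low, cal), ("green", value, low)]

print(bar_segments(160, 125, 105, 145))  # [('red', 125, 145), ('green', 145, 160)]
```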
End Sub
End Sub
Record
End Sub
Open Store_filename For Output As #1 'opens file of name typed in filename text box
'prints input, output and variable values
Print #1, "Time"; ","; "Value X"; ","; "Value Y"; ","; "Value Z"; ","; "X_out"; ","; "Y_out"; ","; "Z_out"; ","; "X_cal";_
","; "Flipped_x"; ","; "Gesture_phase1_detected_x"; ","; "Gesture_phase2_detected_x"; ",";_
"Peak_phase1_detected_x"; ","; "Peak_phase2_detected_x"; ","; "Currently_auto_damping_x";_
","; "Valid_after_damp_x"; ","; "X_val_previous"; ","; "Auto_damping"; ","; "Damping_factor_x";_
","; "X_int"; ","; "Max_peak_phase1_x"; ","; "Damping_duration"
Records = 0 'sets records stored to 0
Records_box.Caption = Records 'displays records stored as 0
Filename.Enabled = False
Dir1.Enabled = False
Drive1.Enabled = False
Main_Form.Tests.Enabled = False
Main_Form.View.Enabled = False
Else
Started = False 'if button pressed in start state then will stop
Shape2.FillColor = RGB(255, 0, 0)
StartStop.Caption = "Start Logging" 'displays start once again
Records_box.Caption = " " 'clears records box
Close #1 'closes file
Time_box.Caption = 0 ' time caption set back to zero
Filename.Enabled = True
Dir1.Enabled = True
Drive1.Enabled = True
Main_Form.Tests.Enabled = True
Main_Form.View.Enabled = True
End If
End Sub
Settings
End Sub
End Sub
End Sub
Public Sub Uncalibrated_unlock() 'unlocks menus and controls when device is calibrated
Cube.Uncalibrated_caption.Visible = False
Simulated_input_form.Activate_button.Enabled = True
Main_Form.Tests.Enabled = True
Main_Form.Log_data.Enabled = True
End Sub
Public Sub Test_lock() 'locks controls and menus when test is underway
Main_Form.Tests.Enabled = False
Main_Form.View.Enabled = False
Simulated_input_form.Close_simulated_control.Enabled = False
End Sub
Public Sub Test_unlock() 'unlocks controls and menus after the test is finished
Main_Form.Tests.Enabled = True
Main_Form.View.Enabled = True
Simulated_input_form.Close_simulated_control.Enabled = True
End Sub
Simulated input
End Sub
End Sub
Test 1 dialogue is the same as Test 2 (given here) except for the code in red, which is specific to Test 2
Private Sub Form_Load() 'when form loads displays time test completed in
Test1_time_caption.Caption = Test_timer / 1000
End Sub
Module 1 (Main)
Public X_cal As Integer 'calibration levels
Public Y_cal As Integer
Public Z_cal As Integer
Public Valid_after_damp_x As Boolean ' this variable makes auto-damp wait for a complete cycle before working
Public Valid_after_damp_y As Boolean
Public Valid_after_damp_z As Boolean
Public X_val_previous As Integer 'stores the previous value of X_val for comparison
Public Y_val_previous As Integer
Public Z_val_previous As Integer
Public Damping_duration As Integer 'in milliseconds how long you want it to last
Module 2 (cube)
Public Type Corner_coordinate_type 'data structure for cube coordinates
x_coordinate As Double
y_coordinate As Double
z_coordinate As Double
End Type
Public New_x_coordinate As Double 'allows a backup to be made while the circle geometry is applied, then stored into x_coordinate
Public New_y_coordinate As Double
Public New_z_coordinate As Double
Public Face_corners(6) As Face_corner_type 'array of type above. entry for each face