
CHAPTER TITLE PAGE

ACKNOWLEDGEMENT i
ABSTRACT ii
ABSTRAK iii
TABLE OF CONTENTS iv
LIST OF FIGURES viii
LIST OF TABLES x
LIST OF ABBREVIATIONS xi
LIST OF APPENDICES xii
CHAPTER 1 1
INTRODUCTION 1
1.0 Motivation 1
1.1 Problem Statement 2
1.2 Objective 2
1.3 Scope 2
1.4 Objective 2

CHAPTER 2 3
LITERATURE REVIEW 3
2.1 Introduction 3
2.2 Previous research on human body movement and locomotion analysis 3
2.3 Summary 25
CHAPTER 3 27
METHODOLOGY 27
3.1 Introduction 27
3.2 System Overview 28
3.3 Hardware and Software Selection 29
3.3.1 Inertial Measurement Unit (IMU) 29
3.3.2 Arduino Nano 30
3.3.3 XBee ZigBee Wi-Fi module 30
3.3.4 MATLAB Software 31
3.4 Experimental Setup 31
3.5 Recognition Method 34
3.6 Performance Measure using Receiver Operating Characteristic (ROC) 35
CHAPTER 4 38
PRELIMINARY RESULT 38
4.1 Introduction 38
4.2 Preliminary Data Collected 38
4.3 IMU Data Output 39
CHAPTER 5 44
CONCLUSION 44
5.0 Conclusion 44
REFERENCES 45
APPENDIX A 49
APPENDIX B 50
APPENDIX C 51
APPENDIX D 52
APPENDIX E 57
APPENDIX F 62

LIST OF FIGURES

FIGURE TITLE PAGE

2.1 Modelling of walking gait [4] 4
2.2 Walking angle, Running angle and Standing angle result [4]. 5
2.3 Processing flow [4] 6
2.4 Comparison of the result obtained from IMU and Model for walking, running and standing motion [4] 6
2.5 Overview of motion tracking system [5] 8
2.6 Process flow of orientation estimation [5] 8
2.7 Motion tracking test of vehicle guidance training [5] 9
2.8 Real-time tracking result of pitch angle [5] 9
2.9 Real-time tracking result of roll angle [5] 10
2.10 Real-time tracking result of yaw angle [5] 10
2.11 Difference between Trajectory and Clustered Output [6] 11
2.12 Example run of the algorithm on a repeated triangular trajectory. The input trajectory is partitioned (Step 1) into sub-trajectories (shown in different colors) that are then projected onto the trajectory’s k-lines center (Step 2) for second-stage clustering (Step 3) [6] 11
2.13 Algorithm of k-line approximation [6] 12
2.14 Example run of Algorithm 1 on 5 point sets for k = 2. Each set is shown in a different shape. Assignments to the 2 lines are shown in colours [6] 13
2.15 (a) Original trajectory data; (b) frequency plot for the trajectories as determined by a manual clustering; (c) frequency plot resulting from the algorithm; (d) frequency plot from k-means [6] 13
2.16 (a) Original trajectory data; (b) frequency plot for the trajectories as determined by a manual clustering; (c) frequency plot resulting from our algorithm; (d) frequency plot from using k-means [6] 14
2.17 Conceptual diagram of HMM motion modelling [6] 16
2.18 Result of IMU model for (a) Strongly, (b) Weakly and (c) Sideways, respectively [7] 17
2.19 Result of using Wii models [7] 17
2.20 Testbed with Opti-Track system, PIR and IMU sensors [8] 18
2.21 Overview of the approach [8] 19
2.22 Activity recognition [8] 19
2.23 Heading change estimation [8] 20
2.24 Particle filtering algorithm for initial human localization [8] 21
2.25 Result of localization and tracking. Without activity recognition: (a) real-time estimation; (b) refined trajectories. With activity recognition: (c) real-time estimation; (d) refined trajectories [8] 21
2.26 System overview diagram [9] 22
2.27 The initial, constrained and optimized joint angle trajectories of the left upper arm: (a) left shoulder pitch; (b) left shoulder roll; (c) left elbow pitch; (d) left elbow roll [8] 23
2.28 Trial snapshots of the experiments [9] 24
3.0 Methodology of the project 27
3.1 Overview of the system 28
3.2 Inertial measurement unit MPU-6050 29
3.3 Arduino nano 30
3.4 XBee ZigBee Wi-Fi module 30
3.5 MATLAB Software 31
3.6 Route for climbing staircases 32
3.7 Route for right and left turning walking 33
3.8 Set of designed hardware 34
3.9 Model of Hidden Markov Model 35
3.10 Recognition Algorithm 35
3.11 Confusion matrix and common performance metrics 35
3.12 Mathematical Equation of ROC 36
3.13 Illustrated graph of ROC 37
4.0 Right Turning Walking graph 40
4.1 Left Turning Walking graph 41
4.2 Stair Climbing Walking graph 42
LIST OF TABLES

TABLE TITLE PAGE


2.1 Parameter used in modelling [4] 4
2.2 Recognition result using IMU sensor mixed [7] 16
2.3 The success probability of segmentation [7] 17
2.4 Recognition result using Wii Remote Data Strongly [7] 17
2.5 The squared errors of joint angles between the upper-arm motion of the virtual human mapped to the robot without constraints and the actual robot [9] 24
2.6 Comparison of tracking human body movement and the performance measurement 25
3.1 Category of the subject 30
4.1 Subject categories for the preliminary experiment 39
4.2 Right Turning Walking Data collected from IMU sensor 39
4.3 Left Turning Walking Data collected from IMU sensor 40
4.4 Stair Climbing Walking Data collected from IMU sensor 41
LIST OF ABBREVIATIONS

NO. SHORT FORM STANDS FOR


1. IMU Inertial Measurement Unit
2. HMM Hidden Markov Model
3. DOF Degree of Freedom
4. BSN Body Sensor Network
5. PIR Passive Infrared Sensor
LIST OF APPENDICES

APPENDIX TITLE PAGE

A IMU MPU6050 Calibration Coding 47
B Program for collecting Human Walking Activities Data 51
