ABSTRACT
In the gaming experience, players are used to intuitive interfaces
that allow them to jump straight to the entertainment. Modern
joysticks are physical hardware components with a fixed layout,
serving as the main interface for a large variety of games with
different control methods and needs, played by users who also
have different ergonomic needs and preferences. This work
proposes a new interface based on a touchscreen device. We
present a gamepad concept capable of dynamically adapting itself
to a user according to the user's touch and attention-focus patterns,
aiming to avoid errors and provide a more comfortable experience.
We also present the results of our usability tests, with both objective
and subjective evaluations, and a discussion of our findings.
CCS Concepts
• Human-centered computing → Human computer interaction
(HCI) → Interaction devices → Touch screens
• Human-centered computing → Ubiquitous and mobile
computing → Ubiquitous and mobile devices → Mobile devices
Keywords
Adaptive interfaces; adaptive game control; game input; eye
tracking; focus detection.
1. INTRODUCTION
Video games are one of the main entertainment fields nowadays.
They are composed of many elements, such as gameplay, audio,
graphics and narrative. When these factors are well executed and
combined, the game may produce engagement and immersion,
providing the best experience for the individual. Besides that, the
game itself is not constant: the challenges, level design and even
control options change as the player progresses. To create a
dynamic interface that follows this process, our adaptive
controller constantly improves its interface to better fit both the
player and the current moment in the game. Adaptations may be
triggered by different causes, such as the context of the
interaction, the experience of the user, or user behavior [2]. The
use of a touchscreen device instead of physical hardware, like a
traditional controller, allows a single device not only to provide
custom interfaces for each game, but also to change its own layout
in response to new requirements in the interaction.

Permission to make digital or hard copies of all or part of this work for
personal or classroom use is granted without fee provided that copies are
not made or distributed for profit or commercial advantage and that
copies bear this notice and the full citation on the first page. To copy
otherwise, or republish, to post on servers or to redistribute to lists,
requires prior specific permission and/or a fee.
SAC'16, April 4-8, 2016, Pisa, Italy.
Copyright 2016 ACM 978-1-4503-3739-7/16/04…$15.00.
http://dx.doi.org/xx.xxxx/xxxxxxx.xxxxxxx

2. RELATED WORK
This work presents the use of a mobile device as an adaptive
controller for games. There are currently many kinds of
controllers; to name a few: gamepads, keyboards, steering wheels
and mobile devices. The study in [4] compared the usability, user
experience, functionality and design of several controllers, but did
not consider mobile devices as inputs, nor adaptive devices.
Mobile phones have specific hardware (camera, accelerometer,
GPS, Bluetooth and so on), much of it different from what is
found in traditional game platforms, such as consoles and PCs.
In [ANONYMOUS], the authors developed agents that monitor
the player's usage during the gameplay experience, tracking the
user's touches and button interactions. Based on this data, and
using machine learning approaches, the system dynamically
adapted the buttons to better fit the specific player. In this work
we propose a novel approach that uses eye tracking to perform
better adaptations. In our experiment we focus on changes to
button position and size, but other properties, such as shape and
even the way of interacting with a button, can also be altered,
since our controller allows any kind of interface element,
including dragging across the touch-sensitive screen.
The communication between the mobile app on the smart device
and the PC running the game is performed over the network,
using the TCP protocol. Both devices must be on the same
network, and an application on the device running the game (PC
or console) is responsible for receiving the commands from the
controller and translating them into local keyboard events that
perform actions in the game. In this work we also capture eye
tracking data and send it to the controller.
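The controller-to-PC link described above can be sketched as follows. The paper does not specify a wire format, so this sketch assumes newline-delimited JSON over a TCP stream; the function names `send_event` and `recv_event` are illustrative, not the actual implementation.

```python
# Minimal sketch of the controller-to-receiver protocol, assuming
# one JSON object per line over a TCP stream (format is an assumption).
import json
import socket


def send_event(sock: socket.socket, event: dict) -> None:
    """Serialize a button event and send it, one JSON object per line."""
    sock.sendall((json.dumps(event) + "\n").encode("utf-8"))


def recv_event(reader) -> dict:
    """Read one newline-delimited JSON event from a file-like socket wrapper."""
    return json.loads(reader.readline())


if __name__ == "__main__":
    # Loopback demonstration; in the real system the controller app
    # connects to a receiver on the PC, which maps events to key presses.
    a, b = socket.socketpair()
    send_event(a, {"button": "jump", "action": "down"})
    with b.makefile("r", encoding="utf-8") as reader:
        print(recv_event(reader)["button"])  # jump
```

On the PC side, the receiver would map each decoded event to a synthetic keyboard event for the game process.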
3. PROPOSED INTERFACE
Our adaptive controller consists of a mobile application for a
smart device (in our case, an Android smartphone or tablet) that
presents a customizable graphical interface, built specifically to
meet each game's needs. The controls are presented on the screen
and the user interacts with them by touching buttons to perform
actions in a game running on a regular PC or game console.
Figure 1 shows the prototype controller in action.
However, the interaction with the controller is not a static
experience. Each user has personal ergonomic needs and
preferences, so a generic controller is not capable of serving every
player equally well.
When the user looks constantly at the controller, the speed of
change for the size and position of the buttons is progressively
increased. If the interface stabilizes and the user stops looking at
the controller for more than 10 seconds, the controller slowly
decreases the adaptation speed, stabilizing the interface and
performing far fewer changes to its layout.
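This gaze-driven speed control can be sketched as below. Only the 10-second stabilization window comes from the text; the speed range, step size, and class name are assumptions for illustration.

```python
# Sketch of gaze-driven adaptation speed. The 10 s window is from the
# paper; min/max speed and step size are illustrative assumptions.
class AdaptationSpeed:
    def __init__(self, min_speed=0.1, max_speed=1.0, step=0.05):
        self.min_speed, self.max_speed, self.step = min_speed, max_speed, step
        self.speed = min_speed
        self.last_gaze_on_controller = 0.0

    def update(self, gaze_on_controller: bool, now: float) -> float:
        """Call once per gaze sample; returns the current adaptation speed."""
        if gaze_on_controller:
            # Frequent glances at the controller: adapt faster.
            self.last_gaze_on_controller = now
            self.speed = min(self.max_speed, self.speed + self.step)
        elif now - self.last_gaze_on_controller > 10.0:
            # No glance for over 10 s: slow down and stabilize the layout.
            self.speed = max(self.min_speed, self.speed - self.step)
        return self.speed
```

A glance immediately raises the speed, while the decrease only begins once the 10-second window has elapsed, which matches the asymmetric behavior described above.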
We change the speed of the adaptation using two parameters: the
maximum change of a button's size and position per iteration (the
algorithm is executed twice per second) and the number of points
passed to the K-means clustering algorithm. The first parameter,
when increased, makes the controller apply changes faster,
altering its layout almost immediately, while a lower value results
in slower changes. The second parameter, when decreased, results
in fewer points being passed to the clustering algorithm, producing
an adaptation focused on the most recent interaction patterns and
able to change the layout more dramatically in response to changes
in the user's behavior. When this parameter is increased, the
controller bases its suggestions on long-term characteristics of the
gameplay session and is more conservative when performing
changes.
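The interplay of the two parameters can be sketched as follows: `window` limits how many recent touch points feed the clustering, and `max_delta` caps how far a button may move per iteration. All function names are assumptions; the paper does not publish its implementation, and this plain K-means stands in for whatever variant the controller uses.

```python
# Illustrative sketch of the two adaptation-speed parameters:
# `window` (points fed to K-means) and `max_delta` (per-iteration cap).
import random


def kmeans(points, k, iters=20, seed=0):
    """Plain K-means over 2-D points; returns k centroids."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: (p[0] - centers[c][0]) ** 2
                                            + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # keep old center if a cluster is empty
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers


def adapt_buttons(buttons, touches, window, max_delta):
    """Move each button toward the nearest centroid of the most recent
    touches, never more than max_delta per call (the speed cap)."""
    centers = kmeans(touches[-window:], len(buttons))
    new_positions = []
    for bx, by in buttons:
        cx, cy = min(centers, key=lambda c: (c[0] - bx) ** 2 + (c[1] - by) ** 2)
        dx = max(-max_delta, min(max_delta, cx - bx))
        dy = max(-max_delta, min(max_delta, cy - by))
        new_positions.append((bx + dx, by + dy))
    return new_positions


# Example: one button drifting toward a cluster of touches at (10, 10),
# clamped to 2 units per iteration.
print(adapt_buttons([(0, 0)], [(10.0, 10.0)] * 5, window=10, max_delta=2))  # [(2, 2)]
```

A small `window` makes the centroids track only recent behavior, while a small `max_delta` spreads a large layout change over many iterations, which is the conservative regime described above.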
We expect this approach to avoid cases where the interface keeps
adapting itself after finding an optimal configuration, performing
unnecessary changes that can be detrimental to the user's
experience.
5. USABILITY TESTS
In order to properly evaluate our proposed adaptation and the
interaction with the final user, we conducted a usability test,
observing parameters given by the controller and the eye tracking
algorithm. The tests were divided into two stages: the pilot and
the final user tests. The pilot was a preliminary test used to set the
parameters for the final test, determining the best possible
configuration for the game and the adaptation. As the controller
does not need to be identical to a physical one, the pilot test also
gave insights for defining its design.
Table 1. Precision for all users with the K-means adaptive (KA)
and non-adaptive (NA) controllers.

User        KA (%)   NA (%)
1           89.7     78.8
2           96.0     83.0
3           93.6     81.2
4           92.0     83.9
5           99.1     93.7
6           98.3     86.3
7           87.6     78.6
8           84.0     72.2
Average     92.5     82.2
Std. dev.    5.3      6.3
6. RESULTS
The results were evaluated by analyzing each user's precision,
that is, the percentage of touches on the screen that accurately hit
a button and performed an in-game action. All comparisons were
made between the non-adaptive version (NA) and the K-means
adaptive version (KA), for each user and for the average of the
entire group. The results were compared using the Wilcoxon
signed-rank test, a non-parametric statistical hypothesis test,
which in this case is used to determine whether the difference
between the results for both controllers is significant. To execute
this test, we set the significance level to 0.05 with a two-tailed
hypothesis, producing a p-value. If the p-value is lower than the
significance level, the difference is significant.
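For a sample this small the test can be computed exactly by enumerating all sign assignments of the paired differences. The sketch below is a pure-Python illustration (not the implementation used in the paper), assuming no zero differences and no tied absolute differences, which holds for the Table 1 data.

```python
# Exact two-sided Wilcoxon signed-rank test for small paired samples,
# assuming no zero differences and no tied absolute differences.
from itertools import product


def wilcoxon_signed_rank(x, y):
    d = [a - b for a, b in zip(x, y)]
    # Rank the absolute differences, smallest = 1.
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0] * len(d)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    n = len(d)
    # Null distribution: each difference is equally likely + or -.
    stats = [sum(r for r, s in zip(range(1, n + 1), signs) if s)
             for signs in product([0, 1], repeat=n)]
    p_low = sum(s <= w_plus for s in stats) / len(stats)
    p_high = sum(s >= w_plus for s in stats) / len(stats)
    return w_plus, min(1.0, 2 * min(p_low, p_high))


ka = [89.7, 96.0, 93.6, 92.0, 99.1, 98.3, 87.6, 84.0]
na = [78.8, 83.0, 81.2, 83.9, 93.7, 86.3, 78.6, 72.2]
w, p = wilcoxon_signed_rank(ka, na)
print(p)  # 0.0078125
```

Since every user improved, all eight differences are positive, giving the maximal rank sum and a two-sided p-value of 2/2^8 = 0.0078125, which matches the 0.007813 reported below.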
To validate the improvements of our new adaptation approach,
using K-means and reacting to focus losses in real time, we
compared the precision for KA and NA across all users. Table 1
shows the precision for each user, while figure 2 shows the
average for all players combined in a single plot, displaying its
variation over time. With the Wilcoxon test we found a p-value of
0.007813, much lower than 0.05, which means that the difference
in precision between the two controllers is significant. The KA
controller averaged a precision of 92.5% while the NA controller
achieved 82.2%, an average precision 10.3 percentage points
higher on the adaptive controller, a strong indicator that the
controller improved the users' precision, avoiding errors that
could lead to frustration. The standard deviation is 5.3 for the KA
controller and 6.3 for the NA controller, showing consistent
results, with all users displaying a sizable improvement. In figure
3, we present the initial layout of the adaptive controller, which is
also the standard interface for the non-adaptive version. In the
same figure we can also see the final configuration achieved by
the adaptation based on touch input and focus detection for one of
our volunteers.
After playing the game, each user filled in the SUS questionnaire
[3]; the average score across all users was 84.69, with a standard
deviation of 11.76. According to [1], this SUS score suggests that
our adaptive controller has good to excellent acceptability.
Comparing both versions, 75% of the participants preferred the
adaptive version and believed they played better with it. We also
asked the users which controller version they thought they gazed
at more frequently during gameplay. Four out of eight said the
non-adaptive version, two said the adaptive version, and two
answered that they did not notice any difference. The results in the
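The SUS score above follows Brooke's standard scoring rule [3]: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 onto a 0–100 range. A minimal sketch:

```python
# Standard SUS scoring [3]: ten items, each answered on a 1-5 scale.
def sus_score(responses):
    """Return the 0-100 SUS score for a list of ten 1-5 responses."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # odd items at even indices
                for i, r in enumerate(responses))
    return total * 2.5


print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```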
8. REFERENCES
[1] Bangor, A., Kortum, P., and Miller, J. Determining what
individual SUS scores mean: Adding an adjective rating
scale. Journal of Usability Studies, 4(3), 114-123, 2009.
[2] Bezold, M., and Minker, W. Adaptive Multimodal
Interactive Systems. Springer, Boston, 2011.
[3] Brooke, J. SUS: A quick and dirty usability scale. Usability
Evaluation in Industry, 189(194), 4-7, 1996.
[4] Brown, M., Kehoe, A., Kirakowski, J., and Pitt, I. Beyond
the gamepad: HCI and game controller design and
evaluation. In Evaluating User Experience in Games, pp.
209-219. Springer, London, 2010.
[5] [ANONYMOUS], details omitted due to double-blind
reviewing.
[6] Joselli, M., Silva Junior, J. R. S., Zamith, M., Clua, E., and
Soluri, E. A content adaptation architecture for games. In
SBGames. SBC, 2012.
[7] Koster, R. Theory of Fun for Game Design. O'Reilly Media
Inc., Sebastopol, 2013.
[8] Langley, P. Machine learning for adaptive user interfaces. In
Brewka, G., Habel, C., Nebel, B. (eds.) KI 1997. LNCS, vol.
1303. Springer, Heidelberg, 1997.