
Week 9 Assignment 4: The Human-Computer Interface


Marci Smith
March 8, 2014
Professor Jason Keppler, CIS 106


Human-computer interaction (HCI) is not a new topic of discussion. Research
and practice in HCI began back in 1980, getting its initial start as a specialty area in computer
science related to cognitive science and human factors engineering (Carroll, 2013). Computers
began the interaction with a simple display: the user could see what was going on visually.
Over time, other forms of interaction developed: the keyboard, mouse, pointers, touch screens,
and even more complex devices such as fingerprint scanners and retina scanners that can track
your eye movements. There are many ways in which humans and computers interact, and this
paper will discuss a few of them.
One way we as humans know we are interacting with computers or other electronic
devices is through haptic feedback. Haptic feedback is the use of vibration patterns and
waveforms to relay information to a user or operator (Carroll, 2013). The word haptic comes
from the Greek for touch. A key use for haptic feedback is letting products communicate with
the user; they can communicate with sounds, visual alerts (LEDs), and vibration. Its main
purpose is to alert the user.
Adding haptic feedback has benefits for both manufacturers and operators. It
improves the user experience with everyday products that are built with touch screens, which
are also less expensive to build. Operators can concentrate on the task at hand while
vibrations transmit information to them. Smartphones, game controllers, tablets, and the like
all use this now. In many new markets it is becoming very popular to have devices with several
haptic effects, evolving beyond the simpler vibration models. This allows for greater product
differentiation and can help with a product's competitive advantage (PR, 2013).
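As a rough illustration of the idea, a device's software might map each alert type to a distinct vibration pattern. The names, events, and timings below are purely hypothetical examples, not taken from any real haptics API:

```python
# Hypothetical sketch: mapping alert types to simple vibration patterns.
# Each pattern is a list of (vibrate_ms, pause_ms) pulses; all values
# are illustrative, not from any real device or haptics library.

HAPTIC_PATTERNS = {
    "notification": [(50, 50)],              # one short buzz
    "warning":      [(100, 50), (100, 50)],  # two medium buzzes
    "alarm":        [(300, 100)] * 3,        # three long buzzes
}

def pattern_for(event):
    """Return the pulse sequence for an event, defaulting to a notification."""
    return HAPTIC_PATTERNS.get(event, HAPTIC_PATTERNS["notification"])

def total_duration_ms(pulses):
    """Total time a pattern occupies: vibration plus pauses."""
    return sum(vibrate + pause for vibrate, pause in pulses)
```

Because each alert type feels different, the operator can recognize what happened without looking at the screen, which is exactly the benefit described above.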

Another aspect we need to look at in the human-computer interface is human
memory. Human memory is broken down into two types of storage: long-term memory and
short-term (or working) memory (Carroll, 2013).

[Figure reproduced with permission from Card, Moran, and Newell.]

These types of memory have led researchers and developers to create devices that store and
retrieve information in ways that mirror how our brains work. For example, when the mouse
was first created, it was quite amazing to physically move a device small distances on a
tabletop in order to control a pointer in two dimensions on a computer screen (Tan, 2010). In a
sense, we can view the brain as a complex arrangement of competing sub-systems that perform
highly specialized tasks (Carey, 2002). The study of the brain and of brain injuries is leading to
more developed brain-computer interaction. We now have devices that can respond to brain
activity and eye movement to control different aspects of the computer. This is a great
advancement in our technology.
With new devices able to read such delicate details, there needs to be a solid structure
and pattern, followed consistently across the board. Three factors are taken into consideration
with consistency: physical, communicational, and conceptual (Adamson & Wallace, 1997). We
find that consistency can lead to learned skills transferring to new systems, predictable system
responses, a shortened learning process, increased efficiency, and reduced working-memory
demand (Mendel & Pak, 2009). When we overlook consistency, we lose overall accuracy in the
human-computer interface. Not everyone works, thinks, or even acts the same way, so there is
a real demand for a straightforward baseline for human-computer interaction, so that the way
a computer interacts with its users is the same for each one. Otherwise, a more complex
system would be needed to learn the differences between users. That would be fine only if the
computer had a single dedicated user for its entire life. Some personal computers are already
developed with fingerprint readers for security and usability, so more dedicated systems are
in the works. The use of consistency leads us to a specific design process.

User Centered Design (UCD) is a development process built around activities that
involve the user. It focuses on developing applications, programs, and systems that are easy
to use and a good value to the user. There are four main principles within UCD:

- A clear understanding of user and task requirements
- Incorporating user feedback to refine requirements and design
- Active involvement of users in evaluating designs
- Integrating user centered design with other development activities (UsabilityNet, 2006)

These principles shape the four phases of the development process: planning, analysis
and requirements, design, and finally evaluation. In planning, UCD activities are tailored to
meet the needs of the project. Next, in the analysis stage, it is important to identify which
users will be able to contribute to the success of the project; to identify the time, cost, skills,
and facilities needed to build the project; to determine who the end user will be and how the
product will be used; and to check for any environmental or safety issues that must be
addressed. This is where many questions should be asked to make sure all areas of concern
are covered. The third step is the design stage, where the project or a prototype is created.
Often many different prototypes are started, changed, and redone to find the best fit. Finally
comes the evaluation stage, where feedback from users testing the product or prototype
guides further changes. This stage often runs at the same time as the design stage so that
necessary changes can be made early, keeping redesign costs down.
The last aspect of the human-computer interface to address is the computer's ability to
interact with human movement. Many people today have interacted with a computer or a
video game system through movement. In the past, this was limited to the movement required
to type on a keyboard or move a mouse. More and more tools are becoming available that let
one easily use body movements for interaction. The Nintendo Wii, Microsoft Kinect, Google
Glass, touchscreens, the SixthSense device, Skinput, speech detection, and even real-time
retina tracking are just a few of the recent developments in this area. All of these demonstrate
a fascinating picture of what HCI may look like in the future. Using human gestures as an
input method opens up possibilities for many different applications, and the main advantage
of tools like these is their wide range of uses. SixthSense is still in the prototype stage, with
only a few users getting the chance to interact with it. It is a wearable gesture interface that
lets the user communicate with hand movements, without any handheld tools (Hahn, 2010).
Many new methods with similar characteristics are continually being developed. We as
humans communicate in a variety of ways, and computers will evolve to communicate the way
we do. This will allow a wide variety of people to use a computer. Those who have not been
able to type on a keyboard will be able to move their eyes or speak in order to type and
interact with the computer and similar devices.

References

Carroll, John M. (2013). Human Computer Interaction - brief intro. In: Soegaard, Mads and
Dam, Rikke Friis (eds.), The Encyclopedia of Human-Computer Interaction, 2nd Ed. Aarhus,
Denmark: The Interaction Design Foundation. Available online at
http://www.interaction-design.org/encyclopedia/human_computer_interaction_hci.html
PR, N. (2013, November 11). Study: Next Generation Touchpad with Haptic Feedback Makes
Control Tasks Easier and Safer. PR Newswire US.
Tan, Desney. (2010) Brain-Computer Interfaces and Human-Computer Interaction. Retrieved
from: http://research.microsoft.com/en-us/um/people/desney/publications/bcihci-chapter1.pdf
Carey J (ed) (2002) Brain Facts: A Primer on the Brain and Nervous System, 4th edn. Society for
Neuroscience, Washington DC, USA
Adamson, P.J. and Wallace, F.L. (1997). A comparison between consistent and inconsistent
graphical user interfaces. Pre-publication Report, University of Northern Florida, Department of
Computer and Information Sciences, Jacksonville, Florida.
Mendel, Jeremy and Pak, Richard. (2009) The Effect of Interface Consistency and Cognitive
Load on user Performance in an Information Search Task. Clemson University, Department of
Psychology.
UsabilityNet. (2006). Overview of the user centred design process. Retrieved from:
http://www.usabilitynet.org/management/b_overview.htm.
Hahn, Thomas. (March 26, 2010) Future Human Computer Interaction with special focus on
input and output techniques. University of Reykjavik.
http://www.olafurandri.com/nyti/papers2010/FutureHumanComputerInteraction.pdf
