
HCI Group Project

Carbon Footprint Reduction


Baymax
Part 4
July 31, 2015

Project Overview

The idea behind our design is to create a system that entices individuals to participate in going green. To do this, the design offers achievements and badges based on the activities a user participates in. Users can easily add activities from a wide array of options, all of which reduce their carbon footprint, and each activity is tracked to award the user an achievement or badge. Because the list of activities is so extensive, the app also offers suggestions to guide users in improving their lifestyle. These suggestions encourage users to continue making changes and show them which changes they can make; together, they assist the user in developing a greener lifestyle. This design makes the app usable as a daily tool by many different age groups. Additionally, the app offers a way to challenge friends or share activities, so encouragement and competitiveness also fuel a user's motivation to reduce their carbon footprint. Our group feels that by creating an app for all ages and including social media, the design offers a one-of-a-kind experience that will bring us closer to a common goal: to recycle and better our planet.
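
The activity-tracking and badge-awarding behavior described above could be sketched roughly as follows. All activity names, badge titles, and thresholds here are hypothetical placeholders, not part of the actual design:

```python
# Minimal sketch of activity tracking with badge thresholds.
# Activity names, badge titles, and thresholds are hypothetical examples.
from collections import Counter

BADGE_THRESHOLDS = {
    "Recycling Rookie":   ("recycle", 1),        # first recycling activity
    "Recycling Regular":  ("recycle", 10),       # ten recycling activities
    "Bike Commuter":      ("bike_to_work", 5),   # five bike commutes
}

def badges_earned(activity_log):
    """Return the badges a user has earned from their logged activities."""
    counts = Counter(activity_log)
    return [badge for badge, (activity, needed) in BADGE_THRESHOLDS.items()
            if counts[activity] >= needed]

log = ["recycle", "bike_to_work", "recycle"]
print(badges_earned(log))  # prints ['Recycling Rookie']
```

In a sketch like this, each added activity simply appends to the user's log, and the badge list is recomputed from the running counts.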
To test our design we will be using a PowerPoint prototype. The prototype implements the major features of our design so that we can observe a user's response. These features include adding an activity, challenging a friend to that activity, sharing an activity on Facebook, viewing achievements, and the smaller features and transitions that allow these to take place. Because we want an app that entices individuals to participate in going green, we want our testers to give feedback on engagement and motivation. For these to be effective, we will also test for learnability and familiarity. Incorporating these four usability criteria will aid us in creating a system that entices individuals to participate in going green.
Evaluation Plan

The evaluation plan that the group has chosen rests on four usability criteria: learnability, engagement, motivation, and familiarity. With these four ideas in mind, the goal of the mobile application is to encourage users interested in green activities and to develop that usage into a daily habit.
The design team will conduct an interview with each user after the prototype has been tested. We hope an interview will encourage users to elaborate on their feedback more than a survey would. The design team will record the answers to the interview questions as well as a screen-capture recording of the interaction with the PowerPoint prototype, with the user following a think-aloud method. The evaluation interview will take place in a quiet setting. The design team will write the interview questions without bias or leading tones in order to receive fair and honest answers from the user. The questions will not be too vague, as the feedback would then not be very helpful or constructive; they will also not be too narrow, ensuring the user provides answers that exceed one or two words.
The evaluation plan, as mentioned above, will be an interview. The user will be given a brief description of the prototype and of what the design team is trying to accomplish. The tasks the user will be given include a variety of activities that directly relate to the usability criteria.
The interview protocol is shown below in the Appendix, along with the form the design team completed for each user to gather quantitative data. Links to the forms the design team used are listed there.
Results

Heuristic Evaluation
For the heuristic evaluation conducted in class, the feedback we received helped us understand which elements to refine as we move toward implementing the application. Two of the highest-priority items dealt with the heuristic values of error prevention and user control and freedom. Error prevention was not addressed in the prototype, so one item was to show the status of a selected activity and to allow the user to delete an activity. The second item was that users, as is, were not able to add a friend of their choosing or understand how a friend could be added; this limited their control and freedom within the application. The next item, considered less of a priority, was that our summary page could be seen as too minimalistic, taking away from the importance of that page's role. The remaining items were lower priority and mostly dealt with flexibility and efficiency of use and with the aesthetics of the application. Specifically, the help page was difficult to find, the date of the weekly My Activity was incorrect, and some elements looked clickable but weren't.

Cognitive Walkthrough
For our cognitive walkthrough, we provided a team with step-by-step instructions on how to add an activity. Overall, the feedback indicated the steps were easy to follow and that the application supported its stated purposes. One issue was found that caused hesitation and a misunderstanding of purpose: within the added-activity page, a challenge button sits directly beside the social icons. This layout led the user to believe that the challenge button related to the social icons, which it did not.

Qualitative Data
Overall, our testers' first impression was the appealing appearance of the application. Most complimented its look; some specifically commented on the appeal of the colors. What became apparent quite quickly is that about half the users were confused about the intent of the application as they began exploring. The cause of the confusion varied, but it is clear that certain elements led to it. What helped most in overcoming this confusion were the menu button in the top left and the add button in the top right; both were mentioned as helping users figure out where to go. This indicated that our application implements elements that are familiar to a user, and that this familiarity assists in learning the application. The problem is that the application relies on that familiarity to be learnable. All in all, we understand that we would want to change the summary page, the added-activity page, and the process of challenges to make the application easier for a user to learn.
Our testers found our application engaging and motivating enough to consider using it if it were a real application, or felt that it fulfilled its purpose of encouraging environmental change; since this is the purpose of the application, we found this encouraging. Two of our seven testers believed the application is one they would use or that it would encourage others to use it. Of the remaining five, three believed they would not use it, but only for lack of concern about the environment and recycling. In addition, six out of seven users believed that the application, as is, serves the purpose of encouraging engagement in activities that benefit the environment. All of this helped us understand the engagement and motivation level of the application.
In the final analysis, we can see clearly that there are several elements that need to be
enhanced to make the application adhere better to learnability, engagement, and motivation.
Our application relies heavily on these elements to fulfill its purpose and therefore our research
has assisted us in understanding how we can improve our application.

Quantitative Data
With our quantitative data, we wanted a more tangible idea of how learnable our application was, as well as which aspects of the application were being used. In the following graphs, you can see that we were particularly interested in the usage of the add-activities button in the top right and the challenges button on the main page.

From our data, we are able to see that adding the shortcut add activity button to the top
right corner was an effective decision. This assists the application in adhering to learnability and
familiarity usability criteria. We also are able to discern that the challenges button on the
summary page may not be as effective as we might have hoped. This further supports the idea
that our summary page needs some reworking to assist in most aspects of our usability criteria.
There are various aspects to this, but it is clear that the importance of the page makes it a
priority in our design.

To gather a more thorough understanding of the learnability of our application, we compared the number of steps taken between users. It is easy to see in the graphs how both the first activity and the third have a wide range of steps taken; this indicates that some users took much longer to complete the task and had difficulty completing it. With this in mind, it further helps us understand which aspects of the application to focus on when improving the learnability of the app.
It is important to note that each tester was able to complete all tasks given.
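
The step-count comparison above can be summarized with a short script. The step counts below are made-up placeholders rather than our recorded data; only the analysis (per-task mean and range) reflects the comparison we performed:

```python
# Sketch of summarizing per-task step counts across testers.
# The task names and numbers are illustrative placeholders, not the study's data.

# steps[task] = list of step counts, one per tester
steps = {
    "add first activity": [4, 9, 5, 12, 4, 7, 10],
    "challenge a friend": [6, 6, 7, 6, 8, 7, 6],
    "add third activity": [3, 11, 4, 3, 9, 3, 10],
}

for task, counts in steps.items():
    spread = max(counts) - min(counts)       # wide range = uneven learnability
    mean = sum(counts) / len(counts)
    print(f"{task}: mean={mean:.1f}, range={spread}")
```

A wide range flags tasks where some users struggled while others did not, which is how the first and third activities stood out in our graphs.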

Proposed Changes

When making a prototype of any kind, you need to let the public test the waters and see how successful your prototype is; evaluations from the public can truly make or break any design or product. Having users test the product and complete evaluations shows you which ideas worked well and how they were received. The evaluations also reveal problems that need to be addressed in the final product.
One problem our evaluations led us to was deleting unwanted or unnecessary activities that the user had previously added. A possible solution would be adding a minus icon to the activities the individual has already completed. A minus icon would not only fix the problem of deleting unwanted activities, but could also offer a remedy for accidental button presses; by adding one simple icon, we fix two problems. The results we collected also led us to discover that the challenges text on the summary page tends to draw people to click on the icon to view their challenges. To solve this, we will have to brainstorm another wording so that it does not get accidentally clicked. The biggest issue, based on the results of our evaluations and the users' reactions, is that there was some confusion about the true purpose of our application. The users we evaluated did not truly understand what the application was doing, and we need to solve that problem. One fix might be to add a short description to the home page, giving a brief summary of what the application does and what we hope it will help us do: have others participate in environment-friendly activities.
Another sizable change that could make the application more user friendly would be building out its social aspect. In the prototype's present state, there are no features through which users can add friends or communicate with others. Adding such features would let users send more challenges and be more competitive toward other users. To aid in this change, we could let each user create a profile to display their accomplishments and post on their page. Subsequently, this would allow a news-feed-style page where each user could see their friends' accomplishments and thoughts. Fixing these issues, which were brought to our attention by having other individuals navigate our application, should give us better results in future interactions.
An application set up like ours would require maintenance from time to time. For example, we would have to continually maintain our servers so that communications regarding new challenges and activities are recorded without problems. For the activities page, we would have to update the dates and charts so that the user has accurate statistics on their progress and activities. We could also add a feature for users to share specific accomplishments or challenges to their Facebook page, so that more people see our application and encourage others to use it. Another maintenance task would be updates to the application's accomplishments and badges. We would like to continually add to the types of badges users can earn, so that they are constantly participating in new and fun activities that are environment friendly. Along with updating the badges and accomplishments, we could also create seasonal tasks and challenges that go along with the time of year. For example, we could add a spring challenge of growing a garden, encouraging users to participate in activities that are fun and beneficial to our environment.
Overall, our prototype was received rather well throughout our evaluations. Having individuals of different age ranges gave our results variety and showed how some age groups navigate an application like our prototype more easily than others. We feel that the feedback we received through our evaluations opened our eyes to changes we might not have thought of otherwise. As a group, we feel that once we fix the issues our users noticed while navigating our application, it will be a strong application that aids our overall goal: to spread awareness and encourage others to partake in environment-friendly activities on a daily basis to make the world a better place.
Critique Evaluation

The design team agreed to implement an interview for the evaluation. Through the interview, the team was able to collect detailed qualitative data reflecting the users' opinions about the prototype. Given more resources and time, the team could have gone into further detail with the interview questions and created questions the user could elaborate on further. To collect quantitative data, the design team created a form to enter specific information for each user. The idea was that after the user finished testing the prototype, a design team member would review the video of the session and record detailed information.
In hindsight, the design team would implement a short questionnaire at the end of the prototype testing, offering the user quick, easy-to-understand questions with built-in ranges. This would give the design team significantly more quantitative data than filling out the form after the user had finished. Providing ranges or scales for the user would also yield a consistent measure of data across users. The interview was helpful for receiving feedback on changes and improvements to the prototype, but quantitative data was more difficult to pull from it. The design team gathered data on the number of steps the user took to complete an activity, but that was the only quantitative data the team could gather through the interview.
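
To illustrate why a scale-based questionnaire would yield more consistent quantitative data, here is a minimal sketch; the question wording and the 1-5 responses are hypothetical, not data we collected:

```python
# Sketch of aggregating hypothetical 1-5 Likert-scale questionnaire responses.
# One score per tester per question; questions map onto our usability criteria.
from statistics import mean

responses = {
    "easy to learn":       [4, 5, 3, 4, 4, 5, 3],
    "kept me engaged":     [5, 4, 4, 5, 3, 4, 4],
    "motivated me to act": [3, 4, 2, 4, 3, 5, 3],
}

for question, scores in responses.items():
    print(f"{question}: mean={mean(scores):.2f} (n={len(scores)})")
```

Unlike open-ended interview answers, the same scale across all users gives one directly comparable number per criterion.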
The design team learned that implementing an interview in the evaluation plan at first seemed like an honest evaluation, but given the time available and the detail the interview required, the team could have chosen an evaluation that was less time consuming and more efficient for gathering quantitative data. The interview does engage the user to deepen their thoughts and opinions more than a survey or questionnaire would, but this makes it harder to have a standardized basis to work from. The data gathered from the interview was inconsistent across users, as each user had the opportunity to give as little or as much information as they desired. Overall, the design team learned the positives and negatives of interviews and gathered more knowledge about effective ways of retrieving quantitative and qualitative data.
Conclusion

All things considered, we have gained a much more thorough understanding of human-computer interaction and of how to judge whether its principles are being applied to an interface. To us, understanding how those principles apply to an interface has been the most valuable aspect. This experience has allowed us to consider our designs and the interfaces around us in a new, much more effective way. These principles and concepts will assist us in creating more effective programs and in understanding how best to design interfaces for their intended purpose. One of the most resounding ideas we have taken from the course is, as Dr. Gonzalez puts it, that we can design the user's experience, but we cannot create it. Given this, we have also learned that it is important to understand how users interpret information and how they decide to interact with the environment we provide. As we did with this project, we can have a specific intention, but that does not mean the user will understand that intention; therefore, it is important that we assess the application. Overall, these are the lessons and concepts we have taken to heart.

Appendix

Interview Protocol
https://docs.google.com/forms/d/1PvMdUwWZiSYiGzc_rZ2bkYDNfdqLljg4HC9kUcKK9c/viewform

Quantitative Data Protocol


https://docs.google.com/forms/d/1g2oD8cI6EtVp9Yjm2aTVpLBjf6Stp7UGE4Nl1DSClkQ/viewform
