
Project One

ENGL 4181, Spring 2016


Derek Nelson, Hussain Alabdulmuhsin, Jacob Cole

Table of Contents
Introduction
Executive Summary
Usability Test Results
Usability Test Observations
Key Problems
Recommendations
Conclusion

Graphics
Figures 1 & 2 refer to the results of Section One, Task One
Figures 3 & 4 refer to the results of Section One, Task Two
Figures 5 & 6 refer to the results of Section One, Task Three
Figures 7 & 8 refer to the results of Section One, Task Four
Figures 9 & 10 refer to the results of Section One, Task Five
Figures 11 & 12 refer to the results of Section Two, Task One
Figures 13 & 14 refer to the results of Section Two, Task Two
Figures 15 & 16 refer to the results of Section Two, Task Three
Figures 17 & 18 refer to the results of Section Two, Task Four
Figures 19 & 20 refer to the results of Section Two, Task Five
Figure 21 refers to the failed tasks in Section Two

Introduction
In 1967, the Online Computer Library Center (OCLC) was founded, and with it came WorldCat, a catalog covering over 70,000 libraries throughout the world. WorldCat is the current search engine used at the J. Murray Atkins Library at the University of North Carolina at Charlotte, and the research conducted here exposed a number of flaws within it. The testing was broken into three incremental parts and was given to five different users, ranging from experienced to inexperienced. It contained ten tasks split into two sections, the first being considerably less challenging and tedious than the second. Each participant also took a post-test questionnaire to help explain any difficulties he or she may have had. Throughout the research, the most prevalent problems centered on user experience. Simple things such as autocomplete leading to incoherent results, spelling mistakes not being automatically corrected, and broken links led to many frustrations for the users. Many of the users relayed that their preferred search engine remains something other than WorldCat, even though it is the school's recommendation, especially for academic material. This report will clearly show the flaws in much of WorldCat's behavior and will conclude with long-term and short-term recommendations for the improvement of the software.

Executive Summary
In order to test the usability of WorldCat Discovery, we conducted a series of usability tests between January and February 2016 on the beta version of the search engine. We performed our tests on five UNCC students of varying grade levels and concentrations. The tests were performed and recorded using screen capture software and reviewed to find any trends that occurred across the tests.
In summary, the tests identified many strengths and weaknesses of WorldCat Discovery. Because all of the users were students, they were somewhat familiar with the search engine and expected it to offer a quick and efficient way of looking up documents.
Specific results and feedback, as summarized in the Results section, helped us identify key problems with the usability of WorldCat Discovery and propose both long- and short-term recommendations to enhance the user experience.
There was only one failed task among all users in Section 1.
The average time to complete the tasks in Section 1 ranged from 35 seconds to 2 minutes 6 seconds.
The fact that there were no outliers indicates that WorldCat is more than sufficient for completing tasks like those in Section 1.
Some of the users mentioned that the due date for items was listed on the main page while other items were missing this information.
2 out of 5 users tried to click on a link that led to a missing webpage (404 error).
5 out of 5 users failed at least one task in Section 2.

The time to complete the tasks in Section 2 ranged from 25 seconds to 6 minutes before users gave up on the task.
The wide range of times indicates that WorldCat needs to make changes to the interface and usability of its service.
3 out of 5 users failed to complete Section 2, Tasks 3 and 5.
All 3 of the users who failed Section 2, Task 5 used the auto-complete suggestion showing the exact title of the document but still failed to locate it.
5 out of 5 users had trouble with spelling at some point during the usability tests.
5 out of 5 users indicated that they do not use the library search function as their primary search engine for academic assignments.

Usability Test
We designed the tests by dividing the ten tasks into two sections of five tasks each. The first section's tasks were chosen to boost the users' confidence and get them familiar with the basics of the WorldCat Discovery search engine. The five tasks in the second section were designed to be more challenging, because the users had to apply the skills they gained in the first section, such as filtering, as well as find different formats, works by foreign authors, and a textbook.

First Part of the Test


First Task
The first task was to locate The Hunger Games: Catching Fire on DVD and to see if it was available for checkout. We chose this task because many people are familiar with the movie, so the users knew what they were looking for. All of the users were able to complete the task successfully, and according to Figure 2 below, the average time it took to complete the task was 68 seconds, about 1 minute and 8 seconds.

[Figure 1: Section 1, Task 1 completion time per user, in seconds]
[Figure 2: Section 1 average time per task, in seconds]

Second Task
The second task was to locate the French version of Victor Hugo's Les Misérables. This task increased the difficulty by specifying the French version of the book rather than the English one; the users had to filter by language in order to find the correct book. With the exception of User 5, all of the users were able to complete this task successfully. User 5 was able to find the English version of the book but not the French version. From the post-test questionnaire, we found out that it was User 5's first time ever using the library's search engine; the user therefore did not have enough experience using a search engine and its filters. In contrast, if you look at Figure 3, you will notice that User 4 took 17 seconds longer than User 5 and was able to find the book successfully without using the language filter. Even though User 4 found the English version initially, User 4 went to the second results page and found the correct result. According to Figure 4 below, it took an average of 83.2 seconds to complete the task, about 1 minute and 23 seconds.

[Figure 3: Section 1, Task 2 completion time per user, in seconds]
[Figure 4: Section 1 average time per task, in seconds]

Third Task
The third task was to locate the book Steve Jobs by Walter Isaacson; once the users found it, the facilitator asked them additional questions, such as whether the book was checked out, when it was due back, and what they would do if they needed the item quickly. This made the users scroll through the book's page looking for the requested information, which helped familiarize them with the various information displayed there. With the exception of User 2, all of the users were able to complete this task successfully. The only part User 2 could not complete was finding out what to do if the book was needed quickly, because the user did not click through to the book's page to see the information the other users did. According to Figure 6 below, it took an average of 72 seconds to complete the task, about 1 minute and 12 seconds.

[Figure 5: Section 1, Task 3 completion time per user, in seconds]
[Figure 6: Section 1 average time per task, in seconds]

Fourth Task
The fourth task was to locate a peer-reviewed article about the assassination of President John F. Kennedy. This task was designed to be general, with no specific required result as long as it was a peer-reviewed article. With the exception of User 5, all of the users were able to complete the task successfully. User 5 could not find any peer-reviewed article due to misspelled keywords that the search engine was not able to catch and correct. According to Figure 8 below, it took an average of 60 seconds to complete this task.

[Figure 7: Section 1, Task 4 completion time per user, in seconds]
[Figure 8: Section 1 average time per task, in seconds]

Fifth Task
The fifth task was to find on what floor of the library Lawrence Lessig's book Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity is located, and what its call number is. The only challenging part of this task was the long title of the book; some users typed the whole title while others stopped after a few words. All of the users were able to complete this task successfully. If you look at Figure 9, all of the users except User 5 spent less than a minute on the task. In addition, according to Figure 10, the average time it took the users to complete this task was 42.4 seconds.

[Figure 9: Section 1, Task 5 completion time per user, in seconds]
[Figure 10: Section 1 average time per task, in seconds]

Second Part of the Test


First Task
The first task was to add the eBook version of Kim by Rudyard Kipling to My List. This task proved challenging because none of the users knew what My List was or how to use it. In addition, only User 3 had to log in to add the book to My List, while the other users did not. We also found out that after a few days, the book gets deleted from My List. Nonetheless, all of the users were able to complete the task successfully. If you look at Figure 11, you will notice that User 5 took the longest time, 113 seconds, about 1 minute and 53 seconds. The reasons were a misspelling and not finding the result on the first page when using Kim by Rudyard Kipling as the keywords; User 5 succeeded after removing the word by from the search bar. According to Figure 12, it took an average of 69 seconds to complete the task, about 1 minute and 9 seconds.

[Figure 11: Section 2, Task 1 completion time per user, in seconds]
[Figure 12: Section 2 average time per task, in seconds]

Second Task
The second task was to locate the eBook version of Fyodor Dostoyevsky's Crime and Punishment published by Christian Classics Ethereal Library. In this task, we introduced the users to a new element, the publisher, which is found in the description of each result. All of the users identified this as a difficult task because they could not find a filter to help narrow down the results; User 1 used Advanced Search but did not find a way to search by publisher. Both User 1 and User 2 failed the task. If you look at Figure 13, you will notice that both of them gave up after spending a large amount of time on it. In addition, if you look at Figure 14, you will notice that Task 2 is the second biggest chunk of the graph, averaging 177.4 seconds.

[Figure 13: Section 2, Task 2 completion time per user, in seconds]
[Figure 14: Section 2 average time per task, in seconds]

Third Task
The third task was to look up the following for class: Qureshi, Sajda. "Assessing the Effects of Information and Communication Technologies on Development." Information Technology for Development. 14.4 (2008): 259-261. This task was designed to be similar to the third task in Section One, where the users had to look up specific information about an item. Here, the required information was the volume number, issue number, year released, and pages of the peer-reviewed article. For some of the users, this task proved challenging, but not as challenging as the previous task. Both User 2 and User 5 failed to complete it. In User 2's case, the user looked through everything by the author but still could not find the correct result after 240 seconds. As for User 5, the long title and foreign author name proved difficult, causing the user to misspell them. According to Figure 16, this task took an average of 133.2 seconds to complete, roughly 2 minutes and 13 seconds.

[Figure 15: Section 2, Task 3 completion time per user, in seconds]
[Figure 16: Section 2 average time per task, in seconds]

Fourth Task
The fourth task was to locate a peer-reviewed article about video games written by Joel Cooper and Diane Mackie. Similar to the fourth task in Section One, this task did not have any specific required result. The difference is that Section One, Task Four required only a peer-reviewed article about the assassination of President John F. Kennedy, while Section Two, Task Four required a peer-reviewed article about video games authored by Joel Cooper and Diane Mackie. This task lowered the difficulty a bit to help the users regain their confidence by getting successful results, though some users had a little difficulty finding an article by both authors rather than only one of them. According to Figure 18, this task took an average of 36 seconds to complete. In addition, according to Figure 17, all of the users were able to complete the task in less than a minute.

[Figure 17: Section 2, Task 4 completion time per user, in seconds]
[Figure 18: Section 2 average time per task, in seconds]

Fifth Task
The fifth task was to locate Linguistic Anthropology: A Reader, Second Edition and to see if the book was available in the Atkins Library. All of our users identified this as the hardest task because they had no information about the book besides the title and edition number. According to Figure 19, User 1 and User 2 both spent more than 300 seconds, 5 minutes, trying to find the book and failed because the time was up. User 4 and User 5 both tried to look for an edition filter to apply but did not find one, and gave up after not finding the book on the first or second results page. As for User 3, we initially thought that the user had found the correct book, but the result had no book cover or any information in the description; the user was satisfied with that result and did not look any further. According to Figure 20, it took an average of 152.75 seconds to complete the task, roughly 2 minutes and 33 seconds, making this the hardest task among all of our tasks.

[Figure 19: Section 2, Task 5 completion time per user, in seconds]
[Figure 20: Section 2 average time per task, in seconds]

Failed Tasks Graph


Figure 21 below shows the time each user spent on the tasks they failed. Notice that all of the users failed Task 5. As for User 1 on Task 2, the user did not want to give up even after the time limit was reached.

[Figure 21: Section 2 failed tasks, time spent per user, in seconds]

Post Questionnaire
All of the users could not find the textbook Linguistic Anthropology: A Reader, Second Edition despite using the textbook's exact title as well as different variations of the title, such as linguistic anthropology: a reader 2nd edition, linguistic anthropology: a reader, linguistic anthropology a reader, or linguistic anthropology, and filtering the results to books only. In addition, one of the users used linguistic anthropology 2nd edition as the search keywords, filtered by book, and got some sort of a result: a book without a cover and with no description besides the publication information, number of pages, language, and OCLC number. At the time, we thought that was the book, but it turned out it was not.
We were able to find the book by typing linguistic anthropology second edition and filtering by book. The book was the third result and had all the correct information. We found out that typing 2nd edition versus second edition will alter the results, even though it should not matter. In addition, the book that the user found was titled Linguistic anthropology (2nd edition), while the book we found was titled Linguistic anthropology : a reader (notice the space between anthropology and the colon). To investigate further, we copied and pasted the title and filtered by book. The book was nowhere to be found on the first page, so we went from page to page until we reached the fourth page. There we found a different result from the previous one; it was an older version of the book.
All of the users answered that they do not use the library's search engine often because they prefer Google, which is simple and which they are used to. Other users were not aware that the library had a search engine that searches the library as well as other libraries around the world. A common suggestion the users offered was for teachers to inform students of the library's search engine and its capabilities.

Usability Test Observations


During the first test, the first user had a difficult time correctly spelling all of the words in the material we had them research. We decided that, moving forward, the tasks would be written out for the tester just as they would be in a traditional assignment; that way, spelling on the user's end would not be a variable. This also made clear to the researchers that spelling correction was not strongly implemented within the WorldCat search engine. In the questionnaire given after the test, the first user suggested that the second list of tasks was much more difficult. This was the researchers' intention, as the first section was designed to help the user get a little more comfortable with the capabilities of the search engine.
The second user had difficulties with the second list of tasks similar to those of the first user. Of the ten total tasks, the user completed seven. Though a total of 30 minutes was allowed for the entire test, the user became too frustrated with the tasks after around 20 minutes. On the third question of the second section (Look up the following for class: Qureshi, Sajda. "Assessing the Effects of Information and Communication Technologies on Development." Information Technology for Development. 14.4 (2008): 259-261), the user took around four minutes and searched through all of the results under the author's name but was still unable to find what she was looking for.
The third user provided some insight into how to find a book more quickly. The user entered the entire book title, as opposed to typing just a section of the title and relying on autocomplete, as other users had been doing. At times the user would use the full title and still not be able to find the correct result; at other times it would pop up immediately. Consistency was also an issue in this user's experience. Sometimes a desired results page would state where the item was located in the library, if it was there at all; at other times nothing would be displayed as to where the item was available. The user suggested that there should be a very visible indication of whether or not the item was available in the library. This user, along with previous testers, struggled with spelling, but noted that the autocomplete was at times helpful.
The fourth user discovered that adding a format to a search was not helpful in terms of immediate search results. In the first section, task one (Locate The Hunger Games: Catching Fire on DVD from the Atkins Library. Is it available for checkout?), the user typed hunger games catching fire dvd assuming that the DVD version would be one of the initial results. However, the first four results were not the DVD. This was an occurrence the user had to deal with during various parts of the testing. During Section Two, the user almost had to self-teach certain tasks, such as adding items to their list.
The fifth and final user had never used the WorldCat search engine or any of the other library resources. The facilitator gave a brief introduction to what the search engine was so that the user did not feel completely lost. The user became nervous quite quickly when asked to find the French version of Les Misérables (Section One, Task Two). The user did not complete that task and gave up on some others due to spelling mistakes not caught by the WorldCat software.

Key Problems
While going over the summaries of the different user tests, it is clear that a few key problems were discovered in WorldCat. During the first usability test, the user noted that they were unaware that WorldCat was an interface used throughout the world, and thought it was a search engine specific to the school. This was a concern of the test administrators: students at the University of North Carolina at Charlotte should be fully aware of the capabilities of WorldCat, especially if professors on campus expect their students to use the search engine. Autocomplete and autocorrect on spelling were consistent issues throughout the trials, and autocomplete suggestions caused a few different problems. Initially, a user could be typing in what they are searching for, only to have the autocomplete feature suggest exactly what they are looking for. The user would then expect to use the autocomplete to get to the desired result quickly. However, the autocomplete would lead to results different from what it autocompleted to. This frustrated the users immensely, as they are used to other search engines, such as Google or even Bing, having autocomplete features that lead them directly to the desired result. There were other cases in which the autocomplete would show the tester exactly what they were looking for, yet selecting that suggestion still would not lead them to the result they wanted.
The lack of spelling assistance was discovered early in the research, as the test administrators did not supply the spelling of the search terms in the first test. The administrators' expectation was that WorldCat's software would automatically correct spelling errors. This led the testers to change the test to include the spelling of what the users had to search for.

Other problems included the lack of labeling of certain critical links on the search engine, as well as no obvious indication on the first results screen of the availability of all of the items searched. This should, in the administrators' opinion, be a uniform policy for all search results. Broken links also arose, as some users would click on the availability of an item only to be led to a NOT FOUND page.

Recommendations
We reviewed the problems and the feedback from the users, and from that we have proposed a number of short-term and long-term solutions to enhance the usability of WorldCat Discovery. The proposed short-term solutions should serve as a work-around until the next iteration of WorldCat is released, which would then hopefully include the proposed long-term solutions. We labeled a recommendation short-term if it could be implemented easily without any reworking of the search engine code. The proposed long-term solutions would require much more reworking of the code and are thus not viable in this iteration.

Short-Term
Quick Reference Guide of Index Labels Linked to Library Page
Out of all five users, none used any modifiers or index labels to enhance their searches. When users faced a difficult task, we feel it would have been helpful for them to have a link to a list of the available modifiers. We propose that, next to the search bar, there be a link to a page that lists all of the modifiers and how to use them. With this, users could quickly reference the modifiers without having to open a new tab and navigate the OCLC webpage. The reference guide could include things such as how to search by keyword (kw:), author (au:), genre (ge:), or publisher (pb:). It could also include tips on how to customize searches using quotation marks (" "), the plus operator (+), the tilde operator (~), and the wildcard operator (*).
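As a minimal sketch of how such a guide pays off, the helper below composes a search string from the kw:/au:/ge:/pb: labels named above. The function itself is our own illustration, not part of WorldCat Discovery, and the exact server-side handling of quoted phrases is an assumption.

```python
def build_query(keywords=None, author=None, genre=None, publisher=None):
    """Compose a search string using the index labels from the guide above.

    Hypothetical helper for illustration only; kw:/au:/ge:/pb: label
    names come from the reference guide, everything else is assumed.
    """
    labels = [("kw", keywords), ("au", author), ("ge", genre), ("pb", publisher)]
    parts = []
    for label, value in labels:
        if value:
            # Quote multi-word values so the label applies to the whole phrase.
            if " " in value:
                value = f'"{value}"'
            parts.append(f"{label}:{value}")
    return " ".join(parts)
```

For example, `build_query(author="Dostoyevsky", publisher="Christian Classics Ethereal Library")` produces `au:Dostoyevsky pb:"Christian Classics Ethereal Library"`, the kind of publisher-scoped search the users could not express through the interface in Section 2, Task 2.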

On-Hover Tooltips
Based on the number of features on the interface, it would be useful to provide tooltips for unfamiliar terms when the user hovers over them with the mouse. One example many of our users struggled with was what My List actually does. It would have been beneficial if they could simply hover over it with the mouse and get a brief description of what it does.

Long-Term
Spellcheck feature
During our user tests it was common for the users to have trouble with spelling. They had a general idea of how to spell a name or term but did not know the exact spelling. That being said, we think it would be beneficial to add a spell-check feature similar to Google's "Did you mean" feature. This could be implemented by observing the following pattern:

1. The user misspells a word in the search bar.
2. They do not find what they wanted (they do not click any results).
3. They realize they misspelled the word and rewrite it in the search box.
4. They find what they want (they click one of the first links).

After this pattern occurs numerous times, the system could recognize the most common misspellings and offer the most common corrections instantaneously.
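The learning loop above can be sketched in a few lines. The class name, the session-pairing logic, and the recurrence threshold are all our own assumptions; a production system would also need to decide when two consecutive queries belong to the same session.

```python
from collections import Counter, defaultdict

class SpellSuggester:
    """Learns "did you mean" corrections from observed search sessions.

    A failed query (no results clicked) followed by a rewritten query
    that does get a click is recorded as one (misspelling, correction)
    pair. Illustrative sketch only, not WorldCat's implementation.
    """

    def __init__(self, min_count=3):
        self.min_count = min_count          # recurrences before we trust a pair
        self.pairs = defaultdict(Counter)   # misspelling -> correction counts

    def record_session(self, failed_query, clicked_query):
        # Steps 1-4 of the pattern collapse into one observed pair.
        self.pairs[failed_query.lower()][clicked_query.lower()] += 1

    def suggest(self, query):
        # Offer the most frequent correction once it has recurred enough.
        candidates = self.pairs.get(query.lower())
        if not candidates:
            return None
        best, count = candidates.most_common(1)[0]
        return best if count >= self.min_count else None
```

Once `les miserable` has been corrected to `les miserables` often enough, the suggester can surface that fix instantly instead of waiting for the user to notice the misspelling themselves.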

More accurate autocomplete


On multiple occasions users clicked the autocomplete suggestion of the title they were looking for, assuming it would be the first entry in the results. This was not always the case: sometimes the title was nowhere to be found (Section 2, Task 5), and sometimes it was far down the list. Based on this, we propose a reworking of the algorithm to make the auto-complete function more accurate.
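One way such a reworking could look, sketched under our own assumptions (the catalog, field names, and ranking signal are all illustrative, not WorldCat's actual design), is to re-rank prefix matches by how often a suggestion actually led to a clicked result, so that a selected suggestion reliably leads to its record:

```python
class Autocomplete:
    """Prefix suggestions re-ranked by click-through (illustrative sketch).

    Suggestions that users selected and then found results for are
    promoted, addressing the mismatch between what autocomplete shows
    and what the results page returns.
    """

    def __init__(self, titles):
        # Track how often each title's suggestion led to a clicked result.
        self.clicks = {t.lower(): 0 for t in titles}

    def record_click(self, title):
        self.clicks[title.lower()] += 1

    def suggest(self, prefix, limit=5):
        p = prefix.lower()
        matches = [t for t in self.clicks if t.startswith(p)]
        # Proven suggestions first; ties broken alphabetically.
        matches.sort(key=lambda t: (-self.clicks[t], t))
        return matches[:limit]
```

With this design, a title like Kim that users repeatedly select and find would climb above rarely clicked matches instead of sitting far down the list.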

Conclusion
While there is value in having a uniform search engine throughout an entire university, it is important for that search engine to be not only a tool students use for homework assignments when they are forced to, but also a resource they turn to when they are not. WorldCat has its place in the academic sector and is fully capable of competing with the likes of Google or Bing in helping students find sources for research that are perhaps more appropriate than their traditional outlets, such as Wikipedia. WorldCat, the default search engine of the J. Murray Atkins Library as well as the preferred search engine for academic projects at UNCC, was the subject of our research testing. The testing found that while simple tasks were fairly easy within the browser, the more specific the tasks became, the harder they were to complete.
