Introduction
Instructor’s manuals that accompany nursing education textbooks frequently offer banks
of test questions for use in student examinations. Writing effective multiple-choice questions
can be difficult and time-consuming for instructors, and test banks offer a way to make the
process less laborious. However, test banks should not be considered a panacea for instructors.
Instructor’s manuals with test banks are fairly common adjuncts to nursing textbooks,
and publishers often market the manuals together with the textbooks as a package (Masters et al.,
2001). For example, the Instructor’s Resource that accompanies Potter and Perry’s (2005)
Fundamentals of Nursing textbook comes with a test bank CD-ROM so that instructors can
download the questions directly onto their word processors (Castaldi, 2005). It could be
assumed that questions derived from textbook test banks would accurately reflect the content in
the textbook, but a literature search performed by Masters et al. failed to find any studies that
explored this area. The purpose of this study was to evaluate multiple-choice questions from test banks that accompany undergraduate nursing textbooks.
Background Information
Multiple-choice questions are the most frequently used examination format in nursing
education (Masters et al., 2001). Multiple-choice questions offer several benefits
for educators, including objectivity, efficiency, and ease of grading (Farley, 1989; Hoepfl, 1994).
Multiple choice questions also allow instructors to test a wide variety of topics in a short time
and assess higher level cognitive skills such as problem-solving, prioritizing, and application of
concepts to scenarios (Morrison & Free, 2001; Schick, 1990). However, multiple choice
questions may encourage guessing by students, and are time-consuming to write for educators
(Farley). Test banks that accompany texts are convenient tools for busy educators, as the ready-
Using test bank questions 2
made questions can dramatically reduce test preparation time. Test bank questions can also help
ensure that test questions correlate with material presented in the text (Clute & McGrail, 1989).
However, educators should review questions that are provided in test banks to ensure that they
are of high quality and able to test at higher cognitive levels (Clute & McGrail; Ellsworth, Dunnell, & Duell, 1990).
The cognitive levels within Bloom’s (1984) taxonomy may be utilized by educators to
evaluate the difficulty of multiple choice questions (McDonald, 2002). Bloom’s taxonomy
divides learning into three domains: cognitive, affective, and psychomotor. Each of these
domains is also divided into multiple levels of increasing complexity. The cognitive domain is
split up into six levels: knowledge, comprehension, application, analysis, synthesis, and
evaluation. Knowledge is the simplest level, in which the learner is merely required to recall
previously learned facts and figures. Comprehension requires the learner to demonstrate
understanding of the meaning of the material. Questions that test at the comprehension level
may ask students to interpret, discuss, or paraphrase the information. Application requires the
learner to relate concepts or theories to new situations that have not been previously discussed in
the reading or classroom. Analysis requires the learner to dissect learned material into smaller
elements so that patterns may be identified. Questions that test at the analysis level may ask
students to identify contributory factors within a scenario. Synthesis requires the learner to make generalizations
from given information, or integrate knowledge from different areas of study. Evaluation
requires the learner to determine the value of information, discriminate between several different
concepts, or make choices based on reasoning and logic (Bloom, 1984; Jeffries & Norton, 2005).
Multiple-choice questions are well suited to testing at the knowledge, comprehension,
application, and analysis levels. The multiple choice question format is not recommended for the
synthesis and evaluation levels, as they “require divergent thinking, and lead to unique responses”
that cannot be captured by a fixed set of options. Multiple-choice examinations play a
major role in student grading as well as admittance to professional programs (Hoepfl, 1994).
Test questions must therefore be carefully written to ensure reliability and validity. Numerous guidelines and principles have been
proposed for multiple-choice question development. For example, Dewey (2004) and Farley
(1989) recommend that all of the options should be approximately the same length, as testwise students may otherwise assume that the longest option is the correct answer.
Farley (1989) and McDonald (2002) counsel against the use of specific determiners
(always, never, none, all) in multiple choice questions. Because nursing practice is rarely
absolute, specific determiners often indicate incorrect answers for testwise students. Similarly,
Frary (1995) and Kehoe (1995) advise against the use of “all of the above” and “none of the
above” in multiple choice questions. This is particularly important when students have been
instructed to select the best possible answer from the available options. Testwise students know
that “all of the above” can be ruled out as the correct answer if any of the options are wrong.
“None of the above” should also be avoided if the question requires calculations or estimation
(Kehoe, 1995).
Frary (1995) and Hoepfl (1994) state that multiple choice questions must be written using
correct rules of language, and all of the options must match up to the stem. “Almost always, the
stem and the correct answer are grammatically consistent, but distractors, often produced as
afterthoughts, may not mesh properly with the stem” (Frary, p. 3). Testwise students are quick to
rule out options that are inconsistent with the stem. McDonald (2002) also noted that if only one
option is consistent with the stem, testwise students will select it even if they do not know the
correct answer.
Farley (1989) and Hoepfl (1994) recommend that educators avoid using negative
statements in multiple choice questions. Questions should be phrased whenever possible so that
students will look for the correct answer. By utilizing negative questions, educators may
reinforce incorrect information (Farley, p. 11). Kehoe (1995) noted that bias may be introduced
into the test through the use of negative questions because students routinely look for options
that make the stem true. Farley and Schick (1988) both recommended that if negatives must be
used in questions, they should be emphasized with underlines, bold type, or capital letters.
McDonald (2002) pointed out that questions that contain negatives in both the stem and options can be especially confusing for students.
“Students believe that instructors have an unconscious tendency to make C or D the correct
answer in multiple-choice examinations” (Clute & McGrail, 1989, p. 245). This tendency may
introduce a significant bias into the examination and undermine test reliability and validity.
Kehoe (1995) and McDonald (2002) stated that educators should randomize the correct answers
to questions on multiple choice tests, and ensure that answers are assigned equally to each of the
option choices.
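The randomization guideline above can be mechanized. The sketch below is illustrative only (it is not a procedure from the cited sources); the function name and the four-option assumption are mine:

```python
import random
from collections import Counter

def assign_answer_positions(num_questions, options=("A", "B", "C", "D")):
    """Assign correct-answer positions so each option is used
    (nearly) equally often, then shuffle the resulting order."""
    # Cycle through the options so counts stay balanced.
    positions = [options[i % len(options)] for i in range(num_questions)]
    random.shuffle(positions)
    return positions

positions = assign_answer_positions(40)
print(Counter(positions))  # each of A, B, C, D appears 10 times, in random order
```

An instructor could then place each question's correct answer at the position assigned to it, avoiding the C/D bias that Clute and McGrail describe.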
Farley (1989) and Morrison and Free (2001) stated that the majority of multiple-choice
questions should be written at the application and analysis levels for nursing examinations to
facilitate the development of higher level cognitive skills. The application level goes beyond the
regurgitation of information necessary for the comprehension and knowledge levels and requires
students to apply concepts to new clinical circumstances or patient conditions. The analysis
level requires students to compare and contrast components of nursing theories or scenarios
(Demetrulias & McCubbin, 1982). Ultimately, “when the nurse educator’s goal is to teach a
thinking process or the use of knowledge for nursing intervention, the evaluation instrument
should require the student to use the same process” (Demetrulias and McCubbin, p. 16).
Literature Review
A literature review was done using the CINAHL and WilsonSelectPlus databases with
the following search terms: test banks, instructors’ manuals, multiple choice questions and
nursing education, and multiple choice test development. The search revealed only one review
of nursing or healthcare textbook test banks. Masters et al. (2001) reviewed 2913 multiple-
choice test questions randomly obtained from 17 undergraduate nursing textbook test banks.
The questions “were evaluated on (a) adherence to generally accepted guidelines for writing multiple-choice
questions; (b) cognitive level as defined by Bloom’s taxonomy; and (c) distribution of correct
answers as A,B,C, or D” (Masters et al., p. 25). The 30 guidelines utilized for the research study
included: using proper grammar, including only essential material in the stem or options,
providing only one correct answer, and ensuring that all of the options were plausible.
The findings of the Masters et al. (2001) research study demonstrated 2233 instances
where the questions did not comply with the general guidelines used for the study. While most
of the problems were minor, it was noted that 120 questions had more than one correct answer
and 21 questions listed the wrong correct answer. The most common violations were inadequate
spacing, unequal option lengths, and negative phrasing in the questions. One significant finding
was that more than 70% of the questions were written at the knowledge and comprehension
levels, rather than the application and analysis levels used for the NCLEX-RN examination
(Masters et al., 2001). Masters et al. also noted that several test questions included outdated
material and procedures, suggesting a lack of up-to-date clinical expertise by the author. Based
on these findings, Masters et al. recommended that nursing educators review test bank questions carefully before use.
A broader search of the literature was performed by reviewing the references listed in the
Masters et al. (2001) article. The ERIC database was then searched using the following terms:
instructor’s manuals, test banks, and multiple choice questions. This search revealed a series of
three articles written by Schick (1988, 1989, & 1990) that included broad overviews of textbook
test question bank usage. Six additional studies were also located that analyzed textbook test
banks from disciplines outside of nursing, including auditing, educational psychology, cost accounting, marketing, and management.
Hansen (1997) reviewed 440 auditing textbook test questions from five textbooks to
determine if test bank questions violated 17 standard guidelines for writing multiple-choice
questions. The findings revealed a total of 490 violations, with approximately 75% of the
questions containing at least one guideline violation. The most common violations were failure
to emphasize negatives within the questions, and using options that included all or none of the
above. These results are consistent with those found by Masters et al. (2001), although the
Ellsworth, Dunnell, and Duell (1990) evaluated test banks for guideline violations as well,
using seven educational psychology textbooks. The researchers assessed 32 textbook test banks
using 37 commonly accepted guidelines. The findings revealed that out of 1,080 questions,
approximately 60% contained at least one guideline violation. The most common violations were
grammatical errors and using negative phrasing in the stem. The researchers also examined the
questions to determine that the correct answers were randomly placed within the test. The results
showed that “option C was used as a correct answer more frequently than options A, B, or D, and
option A was used the least” (Ellsworth et al., p. 291). These findings differ from the Masters et
al. (2001) study, in which the correct test answers were found to be evenly distributed.
Clute and McGrail (1989) reviewed cost accounting textbook test banks to determine if
the correct answers were randomly placed. The findings showed that almost all of the test banks
contained significant bias in the placement of the correct answers. Specifically, it was found that
in questions with 5 possible options, answer E was correct only 5% of the time. Again, this study
contrasts with the Masters et al. (2001) study in which the answers were found to be evenly
distributed.
Hampton, Krentler, and Martin (1993) evaluated marketing and management textbook
test banks to determine the cognitive levels of the questions. The findings revealed that 87% of
the management questions tested at the knowledge level of Bloom’s taxonomy. The review of
the marketing textbooks found that 65% of the questions tested at the knowledge level. These
findings differ from the Masters et al. (2001) study, in which 70% of the questions were written
at the knowledge or comprehension level. Hampton et al. also noted that the marketing and
management test bank authors overrated the cognitive level of approximately 18% of the
questions. This is problematic because educators may overestimate the ability of their examinations to assess higher level cognitive skills.
Methodology
The study explored test bank questions from a convenience sample of five undergraduate
nursing textbooks listed in Appendix A. The research questions included: What percentage of
the questions assess above the comprehension level of Bloom’s (1984) taxonomy? Are the
correct answers randomized and evenly distributed between options A, B, C, D and E? How
many guideline violations will be found within the questions? The study examined the questions
to determine if they include the following violations: use of specific determiners or negative
questions, heterogeneity of option lengths, grammatical incorrectness, and use of all or none of
the above.
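The violation criteria listed above lend themselves to a simple automated screen. The following sketch is not the instrument used in this study; the regular expressions and the option-length threshold are my assumptions, and every flag would still need human review:

```python
import re

# Word lists drawn from the guidelines discussed above (assumed, not exhaustive).
DETERMINERS = re.compile(r"\b(always|never|none|all)\b", re.IGNORECASE)
NEGATIVES = re.compile(r"\b(not|except|least|false)\b", re.IGNORECASE)

def screen_question(stem, options):
    """Flag common multiple-choice guideline violations in one question."""
    flags = []
    if DETERMINERS.search(" ".join(options)):
        flags.append("specific determiner in options")
    if NEGATIVES.search(stem):
        flags.append("negative phrasing in stem")
    if any(o.strip().lower() in ("all of the above", "none of the above")
           for o in options):
        flags.append("all/none of the above option")
    lengths = [len(o) for o in options]
    if max(lengths) > 2 * min(lengths):  # crude heterogeneity threshold
        flags.append("markedly unequal option lengths")
    return flags
```

For example, a clean question returns an empty list, while a stem containing "NOT" with an "all of the above" option would return several flags.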
For the study, 10 chapters were chosen from each of the test banks. All of the questions
within each chapter were examined by this author employing the criteria listed above. The
results were then compiled and compared to the findings of the Masters et al. (2001) study.
Limitations
A major limitation of this study was the small number of test banks reviewed. No test
banks were reviewed that covered maternal-child, pediatric, or psychiatric nursing. Another
limitation was the potential for bias when determining the cognitive level of the test bank
questions, as all of the questions were reviewed by this author only. Additionally, this author
experienced difficulty evaluating some of the test bank questions due to unfamiliarity with the
textbook.
Results
Examination of the cognitive levels of the test bank questions revealed that 36% of the
questions tested at the application level or higher (Table 1). This value is skewed by the Leonard
(2003a) medical terminology test bank, however, which contained 500 questions that tested
exclusively at the comprehension and knowledge levels. The low cognitive level of the
questions may be due to the fact that the Leonard (2003b) textbook and instructors’ manual were
developed for use by a variety of health care professional training programs, not just nursing
education.
Table 1
When the Leonard (2003a) test bank was removed from the analysis, it was found that 48% of
the questions tested at or above the application level (Table 2). These findings differ from those
of the Masters et al. (2001) study, in which only 28% of the questions tested above the
comprehension level.
Table 2
A significant finding was that 67% of the questions in the Ignatavicius and Workman
(2003) medical-surgical nursing test bank tested at or above the application level. Additionally,
almost 60% of the questions in the Castaldi (2005) nursing fundamentals test bank tested at
application and analysis levels. An interesting finding was also noted regarding the health
assessment test banks evaluated by Masters et al. (2001) and this author. The test bank for the
2nd edition of the Jarvis (1996) health assessment textbook that was evaluated by Masters et al.
contained 58% application and analysis level questions. This author noted that 59% of the
questions in the Plowden and Hausauer (2000) test bank for the 3rd edition of the Jarvis (2000) textbook tested at the application or analysis level.
While several of the test banks contained large numbers of questions that tested above the
comprehension level, they were not consistent from chapter to chapter. The Plowden and
Hausauer (2000) health assessment test bank chapters ranged from 14% to 84% application and
analysis level questions. Interestingly, in the chapter on critical thinking, it was noted that only
14% of the questions tested above the comprehension level. The Castaldi (2005) nursing
fundamentals test bank ranged from 31% to 77% application and analysis questions.
Interestingly, it was noted that in the chapter on critical thinking in the Castaldi test bank, 77% of the questions tested at the application or analysis level.
The Doig (2004) pathophysiology test bank ranged from 0% to 31% application and
analysis level questions. On the other hand, the Ignatavicius and Workman (2003) medical-
surgical nursing test bank ranged from 42% to 89% application and analysis level questions. The
questions for the chapter on pain included 67% that tested above the comprehension level, in
contrast to the Doig pathophysiology test bank which had no questions that tested at the higher
cognitive levels. Remarkably, the Ignatavicius and Workman chapter on dysrhythmias included
89% application and analysis level questions. The majority of the questions required students to
interpret and decide on a nursing intervention based on EKG tracings. Only a very few questions
required the student to simply recall information about medications or dysrhythmias. The high
percentage of application and analysis questions in the Ignatavicius and Workman medical-
surgical nursing test bank most likely reflects the focus on critical thinking that is a major feature
of their 2002 textbook. Likewise, the Plowden and Hausauer (2000) health assessment test bank
was written to accompany the Jarvis textbook which is used for both graduate and undergraduate
education.
Analysis revealed that the correct answers appeared to be randomized, and were relatively
evenly distributed as options A, B, C and D (Table 3). Option A was the correct answer 23% of
the time, with 30% as option B, 26% as option C, and 21% as option D. These findings
correspond with the Masters et al. (2001) study, in which the correct answers were likewise relatively evenly distributed and option A was used 25% of the time.
While the correct answers were evenly distributed overall, some chapters were quite
skewed. For example, chapter 10 in the Plowden and Hausauer (2000) health assessment test
bank had 47% of the questions (21 out of 45) with the correct answer as option C, while only
11% of the questions (5 out of 45) had the correct answer as option A. Chapter 42 in the
Ignatavicius and Workman (2003) medical-surgical nursing test bank had the correct answer as
option C for 37% of the questions (14 out of 38), while only 13% (5 out of 38) had the correct
answer as option B.
Table 3
Distribution of correct answers by option

Test bank                                        A    B    C    D  Total
Pathophysiology (Doig, 2004)                   101  148  136   84    469
Health assessment (Plowden & Hausauer, 2000)    85  124  122   83    414
Fundamentals of nursing (Castaldi, 2005)        43   52   55   60    210
Medical-surgical nursing
  (Ignatavicius & Workman, 2003)                88  135   99   97    419
Medical terminology (Leonard, 2003a)           137  137  121  104    500
Guideline Violations
Only a small number of negative questions were found in the test bank chapters that were
reviewed. The Plowden and Hausauer (2000) health assessment test bank and the Castaldi
(2005) nursing fundamentals test bank each contained one negative question, and it was noted
that neither test bank highlighted the negative wording. The Doig (2004) pathophysiology test
bank contained three negative questions, but the negative wording was italicized in each. The
Ignatavicius and Workman (2003) medical-surgical nursing test bank contained eight negative
questions, all of which failed to highlight the negative wording. Five other questions in the
Ignatavicius and Workman test bank presented a variation on negative question phrasing,
requiring the student to select patient statements indicating that additional client teaching was
required. These questions also required the student to select the client statement that was
incorrect, although the negative aspect was stated more clearly. No negative questions were found in the Leonard (2003a) medical terminology test bank.
None of the test bank chapters that were reviewed utilized specific determiners, or options
that included all or none of the above. When heterogeneity of option lengths was assessed,
however, it was noted that the Plowden and Hausauer (2000) health assessment test bank
contained three questions in which the option lengths were markedly unequal, and the
Ignatavicius and Workman (2003) medical-surgical nursing test bank contained only one. No
inequality of option lengths was found in the other test banks.
The Doig (2004) pathophysiology test bank, the Plowden and Hausauer (2000) health
assessment test bank, and the Ignatavicius and Workman (2003) medical-surgical nursing test
bank each contained one typographical error. Additionally, one question in the Ignatavicius and
Workman medical-surgical nursing test bank was missing the question mark at the end of the
stem, and two others were poorly worded, making them difficult to read. No typographical
errors were found in the Castaldi (2005) nursing fundamentals test bank or the Leonard (2003a) medical terminology test bank.
Six incorrect answers were found in the Ignatavicius and Workman (2003) medical-
surgical nursing test bank out of the 419 questions that were reviewed. Eight incorrect answers
were found in the Doig (2004) pathophysiology test bank out of the 469 questions that were
reviewed. In each case, the correct answer was included as an option, but the answer key listed
the wrong one. Neither test bank included page references or rationale for the correct answers in
the answer key. No incorrect answers were found in the other test banks.
In addition to the violations listed above, other problems were noted with some of the
questions in the Ignatavicius and Workman (2003) medical-surgical nursing test bank. Some of
the questions had more than one correct or best answer. One important guideline for multiple
choice questions is that each question should have only one correct answer. “Care must be taken
to see that other responses do not confuse the issue from a logical perspective” (Schick, 1988, p.
42). Violation of this guideline can be seen in the following question from the Ignatavicius and Workman (2003) medical-surgical nursing test bank:
The diabetic client has severe peripheral neuropathy resulting in numbness and reduced
sensation. Which intervention should you teach the client to prevent injury as a result of
this complication?
Correct answer: D
Option D is correct, as diabetics should always use a bath thermometer to test water
temperature before bathing. However, diabetics should also examine their feet daily to check for hammer toes or bunions that can lead
to blister formation or sharp toenails that can cause skin irritation (University of Michigan Health
System, 2003). Therefore, both options A and D are correct for this question.
Several other questions asked the student to select the nursing diagnosis with the highest
priority from the option list. However, the limited information presented in the stem and the
option choices made it difficult to determine exactly which diagnosis was the most important, as
seen with this question from the Ignatavicius and Workman (2003) medical-surgical nursing test
bank:
Which nursing diagnosis has the highest priority for the client who is receiving epidural analgesia?
Correct answer: C
While risk for infection is certainly an important concern for patients receiving epidural
analgesia, option A also describes a serious complication that requires close monitoring (Moraca, Sheldon, & Thirlby, 2003). Therefore, both options A
and C would be appropriate answers to this question. Another example of a poorly written
nursing diagnosis question from the Ignatavicius and Workman (2003) medical-surgical nursing test bank:
What is the priority psychosocial nursing diagnosis for a person with moderate COPD who
Correct answer: B
Again, it is difficult to determine which nursing diagnosis is the most important based on
the information provided in the stem. Patients with emphysema have minimal coughing and
little sputum production, as compared to patients with chronic bronchitis. Patients with
chronic bronchitis also do not typically develop barrel chests as there is no hyperinflation of the
lungs (Boyle & Locke, 2004). Therefore, it is not clear whether or not barrel chest or chronic
coughing would be significant symptoms for this patient, making it impossible to tell if
disturbed body image or social isolation are even appropriate diagnoses. Additionally, the
information in the stem does not state whether or not the patient lives alone. Impaired home
maintenance would be much less of a concern if the patient was living with other healthy family
members. With the two questions listed above, the authors did not provide enough information
for the students to select one nursing diagnosis that takes priority over the others. For the “best
answer” format to be effective with multiple choice questions, one answer should be listed that is clearly better than the other options.
One additional problem was also noted with this question from the Doig (2004) pathophysiology test bank:
Vitamin ____ is required for normal clotting factor synthesis by the liver.
A) K
B) D
C) E
D) B12
Correct answer: A
(Doig, 2004, question 65, chapter 20)
The ordering of the options can make it difficult for the student to mark the correct
answer on the test sheet. Multiple choice options should be ordered alphabetically, numerically,
or logically to facilitate the ease of response (Schick, 1988). The options would be more easily transcribed if presented in the following order:
Vitamin ____ is required for normal clotting factor synthesis by the liver.
A) E
B) B12
C) K
D) D
Correct answer: C
This order matches vitamin B12 with option B and vitamin D with option D, making
it easier for the student to transcribe the answer onto the test copy. Options A and C were simply assigned the remaining vitamins.
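Schick's ordering guideline can also be mechanized. The sketch below uses plain alphabetical ordering, which is one admissible scheme rather than the exact ordering shown above, and reports the new letter of the correct answer so the answer key can be updated (the function name is mine):

```python
def reorder_options(options, correct):
    """Sort option texts alphabetically and report the new letter
    of the correct answer, so the answer key can be updated."""
    letters = "ABCD"
    ordered = sorted(options)
    new_letter = letters[ordered.index(correct)]
    return list(zip(letters, ordered)), new_letter

# The vitamin question from the Doig (2004) test bank:
ordered, answer = reorder_options(["K", "D", "E", "B12"], correct="K")
print(ordered)  # [('A', 'B12'), ('B', 'D'), ('C', 'E'), ('D', 'K')]
print(answer)   # D
```

Whatever ordering scheme is chosen, recomputing the correct letter automatically avoids the answer-key error seen in the original question.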
Discussion
Multiple choice test questions are most effective when they are carefully developed by
educators (McDonald, 2002). It is vitally important for educators to review questions taken from
textbook test banks prior to using them in examinations with students. While most of the
questions from the test banks were found to be acceptable, some significant guideline violations
were noted. Most worrisome were the questions with wrong answers or more than one correct
answer, as these questions are frustrating for students and have the potential to unfairly lower their scores.
The instructors’ manuals and test banks were available in a variety of formats, including
CD-ROM, book, and internet website. The Ignatavicius and Workman (2003) medical-surgical
nursing test bank was on CD-ROM, but this author found it very difficult to formulate the
questions into test format and print them out. The Castaldi (2005) nursing fundamentals test bank
was available in paperback as well as CD-ROM format, which allows instructors to read and
highlight questions in the book, and then download selected questions directly to Word from the
CD-ROM.
Several differences were noted when comparing the results of this study with those of
Masters et al. (2001). This study found that 48% of the questions tested at the application level
or higher, while the Masters et al. (2001) study found only 28% of the questions tested above the
comprehension level. This may be due to the fact that the test banks used in the Masters et al.
study were written between 1991 and 1997, while the test banks for this study were written
between 2000 and 2005. It is possible that the more recent test banks reflect the growing emphasis on writing questions at the application and analysis levels used by the NCLEX-RN examination.
While all of the test banks that were reviewed for this study were designed for use with
undergraduate nursing students, some variation may be noted between the textbooks. The Doig
(2004) pathophysiology test bank had the lowest percentage of questions that tested above the
comprehension level by far, only 14%. This is surprising, as the Huether and McCance (2003)
pathophysiology textbook is written for baccalaureate nursing students, with critical thinking
exercises at the end of each chapter. Conversely, the Castaldi (2005) test bank for Potter and
Perry’s (2005) fundamentals of nursing textbook included almost 60% application and analysis
questions, one of the highest percentages of the test banks reviewed in this study. This is very consistent
with the textbook focus on critical thinking and application of the nursing process for patient
care. The Potter and Perry textbook includes critical thinking exercises in each chapter, as well
as critical thinking models and concept maps to help students understand the theoretical basis for nursing care.
Within the Plowden and Hausauer (2000) test bank for the Jarvis (2000) health assessment
textbook, 59% of the questions reviewed for this study tested at the application or analysis level.
It can be expected that this test bank would include a high percentage of questions that utilized
higher cognitive functions, as the Jarvis text was written for use at the undergraduate as well as
the graduate levels. The Plowden and Hausauer test bank was designed for use by both
undergraduate and graduate nurse educators, which raises the question of whether or not the
same questions are appropriate for both levels of study. Conversely, the Leonard (2003a) test
bank for the Leonard (2003b) medical terminology textbook had no questions that tested above
the comprehension level. This may be due to the fact that the textbook is geared for beginning
students to learn basic medical terms, and may be used for any of the allied health care
professions.
Within each of the test banks, wide variations were noted between chapters in the
percentages of questions that tested above the comprehension level. The Plowden and Hausauer
(2000) health assessment test bank was the most significant, ranging from 14% in the critical
thinking chapter to 84% within the chapter on neurological assessment. Reasons for these
variations were not clear. When analysis was done across chapter content, no pattern was readily
apparent. For instance, three of the chapters in the different test banks reviewed covered content
about care and assessment related to the cardiac system. Within the Doig (2004) pathophysiology
test bank, 8% of the questions in the cardiac chapter tested above the comprehension level, as
compared to the overall average of 14%. Within the Plowden and Hausauer health assessment
test bank, 55% of the questions in the cardiac chapter tested at the application or analysis levels,
as compared to the total average of 59%. Within the Ignatavicius and Workman (2003) medical-surgical
nursing test bank, 89% of the questions in the cardiac chapter tested above the
comprehension level, as compared to the overall average of 67%. Similar findings were noted
with other content areas that were tested by multiple test bank chapters. However, Table 4
shows that the test banks with two authors had a larger range of differences (47 percentage points
for the Ignatavicius and Workman medical-surgical nursing test bank and 70 points for the
Plowden and Hausauer health assessment test bank) than the test banks with only one author (31
percentage points for the Doig pathophysiology test bank and 46 points for the Castaldi nursing fundamentals test bank).
Table 4
Range of chapter percentages of application and analysis level questions

                  Doig   Castaldi   Ignatavicius &   Plowden &
                                    Workman          Hausauer
Range (points)     31       46           47              70
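The ranges in Table 4 follow directly from the per-chapter extremes reported earlier in the Results (maximum minus minimum percentage of application and analysis questions per test bank); this is a quick arithmetic check, not part of the original analysis:

```python
# Per-chapter extremes (min %, max %) of application/analysis questions,
# as reported in the Results section for each test bank.
chapter_extremes = {
    "Doig (pathophysiology)": (0, 31),
    "Castaldi (fundamentals)": (31, 77),
    "Ignatavicius & Workman (medical-surgical)": (42, 89),
    "Plowden & Hausauer (health assessment)": (14, 84),
}
ranges = {bank: hi - lo for bank, (lo, hi) in chapter_extremes.items()}
print(ranges)
# {'Doig (pathophysiology)': 31, 'Castaldi (fundamentals)': 46,
#  'Ignatavicius & Workman (medical-surgical)': 47,
#  'Plowden & Hausauer (health assessment)': 70}
```

The computed ranges of 31, 46, 47, and 70 percentage points reproduce the Table 4 values.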
The Doig (2004) pathophysiology test bank was also on CD-ROM, but required Internet
Explorer to access the questions on the internet. Each chapter had to be downloaded separately,
which was time-consuming. However, the test questions were available in RTF or PDF formats,
which allow instructors to copy and paste questions for their examinations. Additionally, the
questions could be accessed and printed either with or without the correct answers visible. This
feature is useful for instructors who wish to put together an answer key to go along with the
examination copy. The Doig test bank also offered true/false, fill in the blank, and multiple
answer questions for instructors, although this study only reviewed the multiple choice questions.
This author found it helpful when test bank authors included rationales and/or textbook
page numbers along with the correct answers. When an incorrect answer is listed, the instructor
can readily locate the correct information and amend the answer key. Page numbers are also
useful when instructors conduct reviews after a test is administered, as students can be referred
directly to the relevant page in their textbooks to clarify any confusion.
Recommendations
Further research is needed into the use of instructors’ test banks in undergraduate nursing
education. Larger studies could compare test banks from several textbooks in the same specialty
(e.g., various medical-surgical nursing test banks). Although the research findings of Masters et
al. (2001) did not correlate exactly with the observations of this author, the implication remains
clear: test banks of multiple-choice questions available to nurse educators are a helpful resource
for examination development, but they should be reviewed carefully before use. Instructors
should evaluate every question to ensure that it follows general guidelines for multiple-choice
question development and accurately reflects the content of the accompanying textbook.
References
Boyle, A., & Locke, D. (2004). Update on chronic obstructive pulmonary disease. MEDSURG
Castaldi, P. (2005). Instructor's resource manual and test bank to accompany Potter and Perry’s
Clute, R., & McGrail, G. (1989). Bias in examination test banks that accompany cost accounting
Demetrulias, D., & McCubbin, L. (1982). Constructing test questions for higher level thinking.
Dewey, R. (2004). Writing multiple choice items which require comprehension. Retrieved
Ellsworth, R., Dunnell, P., & Duell, O. (1990). Multiple-choice test items: What are textbook
Farley, J. K. (1989). The multiple-choice test: Developing the test blueprint. Nurse Educator,
14(5), 3-5.
Frary, R. (1995). More multiple-choice item writing do's and don'ts. Retrieved October 18, 2004,
from http://www.ericfacility.net/ericdigests/ed398238.html
Hampton, D., Krentler, K., & Martin, A. (1993). The use of management and marketing textbook
multiple-choice questions: A case study. Journal of Education for Business, 69(1), 40-43.
analysis of auditing test banks. Journal of Education for Business, 73, 94-97.
Huether, S., & McCance, K. (2003). Understanding pathophysiology (3rd ed.). St. Louis: Mosby.
Hoepfl, M. (1994). Developing and evaluating multiple choice tests. The Technology Teacher,
53(7), 25-26.
Ignatavicius, D., & Workman, L. (2003). Brownstone computerized test bank CD-ROM for
Jarvis, C. (1996). Physical examination and health assessment (2nd ed.). Philadelphia: Saunders.
Jarvis, C. (2000). Physical examination and health assessment (3rd ed.). St. Louis: Saunders.
Jeffries, P., & Norton, D. (2005). Selecting learning experiences to achieve curriculum
outcomes. In D. Billings & J. Halstead (Eds.), Teaching in nursing: A guide for faculty
Kehoe, J. (1995). Writing multiple-choice test items. Retrieved October 18, 2004, from ERIC
http://www.ericfacility.net/ericdigests/ed398236.html
Leonard, P. (2003a). Instructor’s curriculum resource to accompany quick and easy medical
Leonard, P. (2003b). Quick and easy medical terminology (4th ed.). St. Louis: Saunders.
Masters, J., Hulsmeyer, B., Pike, M., Leichty, K., Miller, M., & Verst, A. (2001). Assessment of
multiple-choice questions in selected test banks accompanying textbooks used in nursing
McAfee, D. (1979). College text test validity. Health Education, 10(2), 18-19.
Moraca, R., Sheldon, D., & Thirlby, R. (2003). The role of epidural anesthesia and analgesia in
surgical practice. Annals of Surgery, 242(5). Retrieved December 20, 2005, from the
Morrison, S., & Free, K. (2001). Writing multiple choice test items that promote and measure
Plowden, K., & Hausauer, J. (2000). Instructor’s manual and test questions for Jarvis’ physical
Potter, P., & Perry, A. (2005). Fundamentals of nursing (6th ed.). St. Louis: Elsevier Mosby.
Schick, J. (1988). Those tantalizing textbook tests. Health Education, 18(6), 42-45.
Schick, J. (1989). Tantalizing textbook tests, part II: True-false, matching, completion and essay.
Schick, J. (1990). Textbook tests: The right formula? The Science Teacher, 57(6), 33-39.
University of Michigan Health System. (2003). For diabetics, proper foot care can save life and
http://www.med.umich.edu/opm/newspage/2002/diabeticsfoot.htm