Fourth step:
Giving Weightage to the Type of Items
In this step, a teacher determines the number of items, their types,
and their relative marks. For this, it is convenient to use the
following table:

S. No.   Type of Items       Number of Items   Score   Percentage
1.       Long answer type
2.       Short answer type
3.       Objective type
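As a minimal illustration of how the weightage table can be filled in, the sketch below computes the score and percentage columns from item counts and per-item marks. The counts and marks used here are hypothetical examples, not values prescribed by the text.

```python
# Hypothetical item counts and per-item marks (illustrative only).
item_types = {
    "Long answer type":  {"items": 4,  "marks_each": 10},
    "Short answer type": {"items": 10, "marks_each": 3},
    "Objective type":    {"items": 30, "marks_each": 1},
}

# Total score of the whole test.
total = sum(t["items"] * t["marks_each"] for t in item_types.values())

# Score and percentage for each item type, as in the table above.
for name, t in item_types.items():
    score = t["items"] * t["marks_each"]
    pct = 100 * score / total
    print(f"{name:18s} items={t['items']:2d} score={score:3d} ({pct:.0f}%)")
```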
Fifth step:
Determining Alternatives: At this level, it is determined how many
alternatives or options should be given according to the type of
questions. Giving alternatives influences the reliability and validity of a
test; therefore, it is suggested that alternatives should not be given
in objective type questions, while in essay type questions only internal
choice should be given.
Sixth step:
Division of Sections: If the scope or types of questions are uniform,
then it is not necessary to divide the test into sections. However, if
the test is diverse, different types of questions have been specified, and
the nature of the test seems to be heterogeneous, then a separate
section should be made comprising each type of item.

S. No.   Sections   Type of items       Score   Percentage
1.       A          Objective type
2.       B          Long answer type
3.       C          Short answer type
Seventh step:
Estimation of Time: At this step, the total time the whole test is
likely to take is estimated. Time is estimated on the basis of the
type and number of items. Some time should be reserved for the
distribution and collection of answer sheets. The following table can be
used for convenience.

S. No.   Type of Items       Number of Items   Time (in minutes)
1.       Objective type
2.       Long answer type
3.       Short answer type
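The time estimate described above can be sketched as a simple sum over item types plus a buffer. The minutes-per-item figures, the item counts, and the size of the buffer for distributing and collecting answer sheets are all illustrative assumptions, not values fixed by the text.

```python
# Assumed average time per item, in minutes (illustrative only).
minutes_per_item = {
    "Objective type":    1.0,
    "Long answer type":  15.0,
    "Short answer type": 5.0,
}
# Assumed number of items of each type (illustrative only).
item_counts = {
    "Objective type":    30,
    "Long answer type":  4,
    "Short answer type": 10,
}

# Minutes reserved for distribution and collection of answer sheets.
answer_sheet_buffer = 10

total_minutes = answer_sheet_buffer + sum(
    minutes_per_item[t] * n for t, n in item_counts.items()
)
print(f"Estimated total time: {total_minutes:.0f} minutes")
```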
Eighth step:
Preparation of Blueprint: A blueprint provides a bird's-eye view of
the entire test. In it we can see the topics, teaching objectives,
types of questions, number of items, distribution of scores, and
their mutual relationships. A blueprint is the basis for test
construction. A format is given below:

S.N.   Topics   Teaching Objectives and Types of Question
                Knowledge      Comprehension   Application    Skill          Total
                (L) (S) (O)    (L) (S) (O)     (L) (S) (O)    (L) (S) (O)
1.
2.
3.
4.
5.

L: Long Answer Type   S: Short Answer Type   O: Objective Answer Type
Ninth step:
Preparation of Score Key:
A score key increases the reliability of a test, so the test
constructor should provide the procedure for scoring the answer
script. Directions must be given as to whether the scoring will be done with a
scoring key (when the answer is recorded on the test paper) or with a
scoring stencil (when the answer is recorded on a separate answer
sheet), and how marks will be awarded to the test items.
In the case of essay type items, it should be indicated whether to score
with the 'point method' or with the 'rating method'. In the point
method, each answer is compared with a set of ideal answers in the
scoring key, and then a given number of points is assigned. In the rating
method, the answers are rated on the basis of degrees of quality, which
determines the credit assigned to each answer.
When students do not have sufficient time to answer the test, or are
not ready to take the test at that particular time, they guess the
correct answers. In that case, some measures must be employed to
eliminate the effect of guessing. But there is a lack of agreement
among psychometricians about the value of the correction formula so
far as validity and reliability are concerned. In the words of Ebel,
"neither the instructions nor penalties will remedy the problem of
guessing." Keeping the above opinion in view, and to avoid this
situation, the test constructor should give enough time for answering
the test.
Thus, in order to bring objectivity into a test, it is essential that
the tester be fully clear about the type of answer expected from a
question. If scorers are acquainted with the right answers, then
diversity in scoring can be eradicated.
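The correction formula that psychometricians disagree about is, in its most common form, "rights minus a fraction of wrongs": R - W/(k - 1), where k is the number of choices per item and omitted items are neither rewarded nor penalised. The sketch below shows that standard formula; the text itself (via Ebel) cautions that such corrections do not fully remedy guessing.

```python
def corrected_score(right: int, wrong: int, choices: int) -> float:
    """Standard correction-for-guessing: R - W/(k - 1).

    right   - number of items answered correctly
    wrong   - number of items answered incorrectly (omits excluded)
    choices - number of alternatives per item (k)
    """
    return right - wrong / (choices - 1)

# Example: 40 right and 10 wrong on 4-choice items.
print(corrected_score(40, 10, 4))
```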
1. For Objective Type:
S. No.   Item Serial   Answer   Score

2. For Long Answer and Short Answer Type:
S. No.   Item Serial   Outline of Answer   Score   Remarks
First try-out of the test:
At this stage, the initial format of the test is administered to a small
representative sample. After that, the process of item analysis is
used to calculate difficulty level and discriminative value. There are a
variety of techniques for performing an item analysis, which is often
used, for example, to determine which items will be kept for the final
version of a test. Item analysis helps "build" reliability and
validity "into" the test from the start. Item analysis can be both
qualitative and quantitative. The former focuses on issues related to
the content of the test, e.g., content validity. The latter primarily
includes measurement of item difficulty and item discrimination.
An item's difficulty level is usually measured in terms of the
percentage of examinees who answer the item correctly. This
percentage is referred to as the item difficulty index, or "p".
Item discrimination refers to the degree to which items differentiate
among examinees in terms of the characteristic being measured (e.g.,
between high and low scorers). This can be measured in many ways.
One method is to correlate item responses with the total test score;
items with the highest correlation with the total score are
retained for the final version of the test. This is appropriate
when a test measures only one attribute and internal consistency is
important.
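The two quantitative indices described above can be sketched directly. The 0/1 response matrix below is invented for illustration (rows are examinees, columns are items); the discrimination measure shown is the item-total correlation the text mentions, which for a 0/1 item is the point-biserial correlation.

```python
# Invented try-out data: each row is one examinee, each column one item
# (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 0, 0, 0],
]

n_examinees = len(responses)
totals = [sum(row) for row in responses]  # each examinee's total score

def difficulty(item: int) -> float:
    """Item difficulty index p: proportion of examinees answering correctly."""
    return sum(row[item] for row in responses) / n_examinees

def discrimination(item: int) -> float:
    """Correlation between item score and total test score
    (point-biserial for a 0/1 item). Returns 0.0 for degenerate items
    that everyone or no one answered correctly."""
    xs = [row[item] for row in responses]
    mx = sum(xs) / len(xs)
    my = sum(totals) / len(totals)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, totals))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in totals)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

for i in range(4):
    print(f"item {i}: p={difficulty(i):.2f}  r={discrimination(i):.2f}")
```

Note that item 0 (answered by everyone) and item 3 (answered by no one) get a discrimination of 0.0: items at the extremes of difficulty cannot separate high from low scorers, which is exactly why such items are eliminated later in this section.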
This initial format is administered to a small representative sample
group, and item analysis is then used to calculate the difficulty
level, the discriminative value, and, in multiple choice items, the
functioning of the alternatives.
First of all, in the context of multiple choice items, the appropriate
choices are examined, and any alternative which has been opted for by
the fewest students is rejected.
Generally, a test is constructed for average students; naturally,
division according to ability grouping is essential. Generally, the
ability distribution of the normal probability curve provides the basis
for this distribution.
On the basis of the N.P.C., it is advisable that a test not be
constructed for extreme cases, such as backward or gifted students.
Therefore, the items which have been solved only by the most gifted
students, and the items solved even by the weakest students, must be
eliminated from the test, as they must be treated as too difficult or
too easy test items.
In the context of difficulty level, the following difficulty levels are
suggested for the selection of questions, as per the recommendation of
Katz (1959):

S. No.   Types of items     Difficulty Level %
1.       Long answer type   50%
2.       Alternatives 5     60%
3.       Alternatives 4     62%
4.       Alternatives 3     66%
5.       Alternatives 2     75%
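Applying the Katz (1959) targets above can be sketched as a simple screen: each tried-out item's observed difficulty (percentage answering correctly) is compared with the target for its type. The item data and the +/- 10 point tolerance band are assumptions for illustration; the text only supplies the target levels themselves.

```python
# Target difficulty level per item type, from the Katz (1959) table above.
katz_target = {
    "Long answer":    50,
    "5 alternatives": 60,
    "4 alternatives": 62,
    "3 alternatives": 66,
    "2 alternatives": 75,
}

# Hypothetical tried-out items: (item id, type, % of examinees correct).
items = [
    ("Q1", "4 alternatives", 63),
    ("Q2", "4 alternatives", 95),   # far too easy for its type
    ("Q3", "Long answer",    48),
]

tolerance = 10  # +/- band around the target; an assumption, not from the text
kept = [q for q, t, p in items if abs(p - katz_target[t]) <= tolerance]
print(kept)
```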
In the same way, two items may measure the same content area twice,
as in, e.g., "Who was the founder of the Mughal empire in India?" and
"Which empire was founded by Baber in India?" Both questions refer to
one content area: that Baber established the Mughal empire in India.
Out of these two questions, one should be treated as redundant and
must be excluded from the test.
In the same way, the discriminating power of the items should be
calculated, and the questions with the least discriminating power must
be excluded from the test. Generally, items having a discriminating
value of at least .25 are considered suitable for a test.
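One simple way to compute the discriminating power screened above is the upper/lower-group index D = p_upper - p_lower, the difference between the proportions of the highest- and lowest-scoring groups answering the item correctly. The 27% group size is a common convention, not specified by the text, and the data below are invented; the .25 cut-off is the text's own figure.

```python
def d_index(item_correct, totals, frac=0.27):
    """Upper/lower discrimination index D = p_upper - p_lower.

    item_correct - 0/1 result on this item, one entry per examinee
    totals       - total test score, one entry per examinee
    frac         - fraction of examinees in each group (27% by convention)
    """
    ranked = sorted(range(len(totals)), key=lambda i: totals[i], reverse=True)
    k = max(1, int(len(totals) * frac))
    upper = sum(item_correct[i] for i in ranked[:k]) / k   # top scorers
    lower = sum(item_correct[i] for i in ranked[-k:]) / k  # bottom scorers
    return upper - lower

# Invented data for 10 examinees.
totals       = [19, 18, 16, 15, 12, 11, 10, 8, 6, 4]
item_correct = [1,  1,  1,  0,  1,  0,  0,  0, 0, 0]

D = d_index(item_correct, totals)
keep = D >= 0.25   # the cut-off suggested in the text
print(round(D, 2), keep)
```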
Preparation of final test:
The test will provide useful information about the students'
knowledge of the learning objectives. Considering the questions relating
to the various learning objectives as separate subtests, the evaluator
can develop a profile of each student's knowledge of, or skill in, the
objectives.
The final test is constructed after the above analysis. For this, a
suitable format is prepared and norms are specified. Also, instructions
for examinees should be prepared.
A test constructed in accordance with the above procedure assumes a
purpose, an idea of what is good or desirable from the standpoint of
the individual, society, or both.