
Assessing Vocabulary Recognition

_______________________________________________________________

Traditional Multiple Choice and Alternative Assessment Techniques

Kri Howland
AL 6730
Dr. Hahn Nguyen
6 December 2012

Howland 1

Introduction

The purpose of this paper is to report on a group project completed in AL 6730 Assessment for Second Language Learning, covering various theoretical and practical applications of assessment, and finally reflecting on a specific aspect of the test that could be changed (i.e., from traditional assessment to alternative assessment) while maintaining the test's objectives. The background context of the students and the creation, administration, and analysis of the test, along with possible alternatives for the test, are discussed in detail. A literature review on the advantages and limitations of alternative assessment from reliable sources follows the description of the group's test. Lastly, future inquiries based on the aforementioned categories are mentioned.

Project Description

Background Information

Host class

The host class was Dr. Brian Rugen's International Education class for the Bridge Program. It was hosted at Hawaii Pacific University on Mondays, Wednesdays, and Fridays from 12:55 to 1:40pm. The students were upper-intermediate, college-age learners, meaning their academic performance was not yet high enough to place them in classes with native speakers of English. The students' goal is therefore to receive extra English instruction in order to gain placement in those classes. The host teacher takes an EAP (English for Academic Purposes) approach and requested that the test the group created be recognition-only. After observing the class, the students appeared shy about volunteering answers, but very active in group activities.

Host Institution

The host institution was Hawaii Pacific University; specifically, the International Bridge Program. The goal of this program is to offer international students an opportunity to build English language competency and academic skills and to acquire content-based knowledge in preparation for academic success at Hawaii Pacific University. The university is located in Downtown Honolulu, Hawaii, and is a private institution whose population is roughly one-third international students, one-third mainland Americans, and one-third local Hawaii residents.

Group Members

The group members for this project were Hanne Håkonsen of Norway, Ciwang Cirenwangdui of Tibet, Rahma Kadir of Indonesia, and myself, Kri Howland of Massachusetts, U.S.A. Ciwang has extensive experience teaching EFL in Tibet, with 7 years at the high school level. Rahma has 2 years of experience as an elementary English teacher in Indonesia. Hanne has not taught ESL/EFL, but was briefly a physical education teacher in Norway for 12-13-year-old students at the junior high level. Kri has had no ESL/EFL teaching experience, but favors research in Computer Assisted Language Learning and hopes to pursue an administrative path at study abroad and international service centers, incorporating the theories and materials development of the TESOL program.

Language Assessment Instrument

Administration of assessment

The test was given on Monday, November 5, 2012, to 14 students. We returned the graded tests to the host teacher on Friday, November 9, 2012. The assessment approach used was an achievement test of vocabulary recognition, conducted through multiple choice items with and without context. The teacher-friendly version of the test (with keys) can be found in Appendix A. The student-friendly version (without keys) can be found in Appendix B.

Type of assessment

The purpose of the assessment was to test the students' achievement of learned academic vocabulary through recognition techniques. The Item-Design Approach components were as follows. The test was discrete-point because we were testing only one skill (vocabulary recognition), and the students did not need to utilize any other language skills to perform the task presented to them. The test was also indirect, because there was no productive task that could be measured; we were not testing exactly the skill we wish the students to be proficient in. Finally, the test was criterion-referenced because it was an achievement test; each student's performance was in no way affected by the performance of their classmates. For the scoring procedure, we decided that each item would be worth 1 point. Students received a full point if they chose the correct answer, spelled it correctly, and, if need be, changed the tense correctly. Students received half a point if they chose the correct answer but either spelled it incorrectly or, if applicable, did not change the tense correctly. We chose to score this way because it was a recognition test and our belief is that the students should be able to spell and conjugate correctly when given the words in

Howland 4 the word bank. All items were of the same point value because all were recognition. If the answer was incorrect, students receive no points. Objectives The objectives were given to us by the teacher and modified by the group fit our test specifically.

1. The student will be tested on the ability to choose the correct definition for select vocabulary words from the Academic Word List (AWL).
2. The student will be tested on the ability to identify the meaning of select vocabulary words from the AWL based on their use in context.
3. The student will be tested on the ability to distinguish between multiple meanings of select vocabulary words from the AWL based on their use in context.
4. The student will be tested on the ability to replace words from a short reading with select vocabulary words from the AWL that have a similar meaning.

Specification

1. Specifications of content:
   a. Operations: Recognition of academic words with and without context.
      1. Recognizing word meaning in sentence context
      2. Recognizing definitions
      3. Recognizing synonyms without context
      4. Recognizing word meaning in paragraph context

      5. Spelling answers correctly
   b. Types of text: Authentic, academic
   c. Length of text: 277 words
   d. Addressees of text: Non-native speakers at the undergraduate level in an International Education class
   e. Topics: Single-sex schooling
   f. Readability (Flesch-Kincaid grade level): 7th-8th grade, because the students are at an upper-intermediate level
   g. Structural range: Simple grammar, because the test targets vocabulary
   h. Vocabulary range: Generally academic
   i. Dialect and style: Standard American English
2. Structure, timing, medium, and techniques:
   a. Test structure: 4 sections
      1. Multiple choice with context

      2. Matching
      3. Multiple choice without context
      4. Multiple choice in passage context
   b. Number of items: 20 multiple choice items, 10 matching items (30 total)
   c. Number of passages: 4 sections
      1. Section 1: 5 items
      2. Section 2: 10 items
      3. Section 3: 5 items

      4. Section 4: 10 items
   d. Medium: Paper and pen
   e. Testing techniques: Multiple choice
3. Criterial level of performance: Satisfactory performance is recognizing 80% of the vocabulary in each section; students who reach this level will be considered to have met the course objectives in terms of this quiz.
4. Scoring procedure: There will be objective scoring with four scorers. A correct answer will receive 1 point, an incorrect answer will receive 0 points, and a misspelled answer will receive half a point as long as it is comprehensible to the scorers.
5. Sampling (where the materials for the test were drawn from): Vocabulary was selected from the Academic Word List (AWL), and the passage came from a website (singlesexschools.org/evidence.html) on single-sex schooling, in relation to the class's unit on that topic.

Student Results

The item analysis showed that the majority of the students performed well for an achievement test, with an average of 83%; however, the distractors did not do their job, as they were not strong enough to sway students away from the correct answer. Below is a breakdown of the results.

Table of student results

As previously stated, the students' average was 83%. The highest score was 29 out of 30, the lowest was 19 out of 30, and the most frequent score was 26.5 out of 30, or 88%. The test was designed for 30 minutes (the host teacher was concerned that 30 items would be too long for the students); however, all students completed the test within the 30 minutes. Figure 1 (below) shows each student's score as a percentage. The light shaded areas highlight the top scorers of the group, while the dark shaded areas highlight the low scorers of the group.
 1. Noriyo       96.7
 2. Yurika       93.3
 3. Dominic      88.3
 4. Younji       88.3
 5. Gina         88.3
 6. Tran         88.3
 7. Curtis Ho    83.3
 8. Seung Jean   83.3
 9. Maria        81.7
10. Wendy        81.7
11. Jinny        78.3
12. Dan luo      76.7
13. Lavenda      70.0
14. Yuki         63.3
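The summary statistics reported for this table (average, mode, highest, and lowest score) can be reproduced from the percentage column alone; a minimal Python sketch, assuming the scores are entered as listed:

```python
import statistics

# Percentage scores for the 14 students, as listed in the table of results
scores = [96.7, 93.3, 88.3, 88.3, 88.3, 88.3, 83.3,
          83.3, 81.7, 81.7, 78.3, 76.7, 70.0, 63.3]

mean_pct = statistics.mean(scores)    # class average (about 83%)
mode_pct = statistics.mode(scores)    # most frequent score (88.3%)
high, low = max(scores), min(scores)  # highest and lowest scores

print(f"mean={mean_pct:.1f}% mode={mode_pct}% high={high}% low={low}%")
```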

Figure 1

Histogram

The histogram displayed below (Figure 2) shows that the majority of the students scored in the mid-20s range: a large number of students scored between 25 and 30, and a small number scored in the low 20s or below. The bins ran from 15 to 30 with a width of 5 in order to properly display the distribution of scores.
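The binning behind the histogram can be sketched as follows. The raw scores (out of 30) are recovered from the reported percentages (e.g., 26.5/30 = 88.3%); the half-open bin convention is an assumption, since the original spreadsheet settings are not given:

```python
# Raw scores out of 30, recovered from the reported percentages
raw_scores = [29, 28, 26.5, 26.5, 26.5, 26.5, 25,
              25, 24.5, 24.5, 23.5, 23, 21, 19]

def bin_counts(scores, edges):
    """Count scores per bin; each bin runs from its edge up to the next edge,
    and any score at or above the last edge falls in the final bin."""
    counts = {edge: 0 for edge in edges}
    for s in scores:
        for edge in reversed(edges):
            if s >= edge:
                counts[edge] += 1
                break
    return counts

print(bin_counts(raw_scores, edges=[15, 20, 25]))
```

With edges at 15, 20, and 25, one score falls in the 15-20 bin, five in the 20-25 bin, and eight at 25 or above, consistent with the distribution described above.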


Histogram
[Figure 2: histogram of raw scores; bins at 15, 20, 25, 30, and "More"; frequency axis 0-8]

Figure 2

There were four sections to the test, and the results for frequency distribution, central tendency and dispersion, item facility, and item discrimination are reported by section as well, as seen in Appendices A and B.

Part I: Multiple Choice

Distractor Analysis

Item    A           B           C           D      Unchosen distractors
1       1           12 (key)    1           0      D
2       0           1           13 (key)    0      A, D
3       14 (key)    0           0           0      B, C, D
4       10 (key)    0           3           1      B
5       14 (key)    0           0           0      B, C, D

Nobody selected the unchosen distractors, so they did not fulfill their purpose of distracting; they may be too easy and should be considered for revision. The distractor that worked best was 4C, chosen by three of the fourteen students. Distractors 1C, 2B, and 4D were each chosen by one student.
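This kind of distractor audit is mechanical enough to automate. A sketch below, assuming the Part I option counts and keys as tabulated above; any non-key option that nobody chose is flagged for revision:

```python
# Option counts per Part I item (n = 14), paired with the keyed option
responses = {
    1: ({"A": 1, "B": 12, "C": 1, "D": 0}, "B"),
    2: ({"A": 0, "B": 1, "C": 13, "D": 0}, "C"),
    3: ({"A": 14, "B": 0, "C": 0, "D": 0}, "A"),
    4: ({"A": 10, "B": 0, "C": 3, "D": 1}, "A"),
    5: ({"A": 14, "B": 0, "C": 0, "D": 0}, "A"),
}

def dead_distractors(item_responses):
    """Return, per item, the non-key options nobody selected
    (distractors that failed to distract and are candidates for revision)."""
    return {
        item: [opt for opt, n in counts.items() if opt != key and n == 0]
        for item, (counts, key) in item_responses.items()
    }

print(dead_distractors(responses))
```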

Item Facility (n=14)

Item    Answered correctly    I.F.
1       12                    0.8
2       13                    0.9
3       14                    1
4       10                    0.7
5       14                    1

Everyone answered items 3 and 5 correctly. This is acceptable because the test was criterion-referenced. The most challenging item was number 4; ten out of fourteen students answered it correctly. Items 1 and 2 were also easy items, because their I.F. values were very close to 1.
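Item facility is simply the proportion of test-takers who answered an item correctly. A sketch for the Part I counts; note that rounding to two decimals gives slightly different figures from the one-decimal values reported above (e.g., 12/14 = 0.86 rather than 0.8):

```python
def item_facility(correct_counts, n):
    """I.F. = number of students answering correctly / number of test-takers."""
    return [round(c / n, 2) for c in correct_counts]

# Part I: how many of the 14 students answered each item correctly
part1_correct = [12, 13, 14, 10, 14]
print(item_facility(part1_correct, 14))  # [0.86, 0.93, 1.0, 0.71, 1.0]
```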

Item Discrimination (n=14); the high and low groups are the top and bottom 25% of students (4 each).

Item    High scorers (top 4) correct    Low scorers (bottom 4) correct    I.D.
1       4                               3                                 0.2
2       4                               3                                 0.2
3       4                               4                                 0
4       2                               3                                 -0.2
5       4                               4                                 0

Items 3 and 5 showed no difference between high and low scorers. On item 4, the low scorers performed better than the high scorers. Items 1 and 2 are not within the acceptable range of 0.35-1. All the items therefore did a poor job of discriminating between high and low scorers, which would need to be improved upon in future tests.
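One common formula for item discrimination subtracts the low group's correct count from the high group's and divides by the group size (here, the top and bottom 25%, i.e. 4 students each). A sketch follows; the one-decimal values reported above (0.2, -0.2) suggest a different rounding or divisor convention, so treat the exact formula as an assumption:

```python
def item_discrimination(high_correct, low_correct, group_size):
    """I.D. = (correct in high group - correct in low group) / group size, per item."""
    return [(h - l) / group_size for h, l in zip(high_correct, low_correct)]

# Part I: top-4 and bottom-4 scorers answering each item correctly
high = [4, 4, 4, 2, 4]
low = [3, 3, 4, 3, 4]
print(item_discrimination(high, low, 4))  # [0.25, 0.25, 0.0, -0.25, 0.0]
```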

Part II: Multiple Choice (Matching)

Distractor Analysis

Distractors selected: I (3 times), C (1 time), J (1 time)

Only these three distractors were selected, meaning the remaining distractors were poor. The best distractor was letter I, chosen by three students; C and J were each chosen once.

Item Facility (n=14)

Item    Answered correctly    I.F.
1       13                    0.93
2       13                    0.93
3       12                    0.86
4       14                    1
5       14                    1
6       14                    1
7       14                    1
8       13                    0.93
9       14                    1
10      14                    1

The most challenging item for the students was #3. Items 4-7, 9, and 10 were answered correctly by all of the students. From a teacher's standpoint this is good, because it was a criterion-referenced test and the students learned the vocabulary well.

Item Discrimination (n=14); the high and low groups are the top and bottom 25% of students (4 each).

Item    High scorers (top 4) correct    Low scorers (bottom 4) correct    I.D.
1       4                               4                                 0
2       4                               3                                 0.29
3       3                               4                                 -0.29
4       4                               4                                 0
5       4                               4                                 0
6       4                               2                                 0.57
7       4                               4                                 0
8       4                               3                                 0.29
9       4                               4                                 0
10      4                               4                                 0

Items 1, 4, 5, 7, 9, and 10 do not show any difference between high scorers and low scorers. Item 6 showed the biggest distinction between the two groups.

Item Analysis, Part III
Item Facility

Item    Answered correctly    I.F.
1       14                    1
2       14                    1
3       6                     0.43
4       14                    1
5       9                     0.64

Items 1, 2, and 4 were very easy for the students; all of the students answered them correctly. Items 3 and 5 were of moderate difficulty.

Distractor Analysis

Item    A          B           C           D
1       14 (key)   0           0           0
2       0          0           14 (key)    0
3       6 (key)    4           0           4
4       0          14 (key)    0           0
5       2          2           1           9 (key)

Distractors not chosen: item 1 (b, c, d); item 2 (a, b, d); item 3 (c); item 4 (a, c, d).

All of the distractors in items 1, 2, and 4, along with distractor (c) in item 3, are not working, because nobody chose them. They need to be rejected and changed.

Item Discrimination

Item    High scorers (top 4) correct    Low scorers (bottom 4) correct    I.D.
1       4                               4                                 0
2       4                               4                                 0
3       4                               0                                 1
4       4                               4                                 0
5       3                               3                                 0

Items 1, 2, 4, and 5 are very easy and cannot discriminate between high-scoring and low-scoring students.

Part IV: Passage

Item Facility (n=14)

Item    Answered correctly    I.F.
1       13                    0.93
2       13                    0.93
3       10                    0.71
4       13                    0.93
5       9                     0.64
6       2                     0.14
7       6                     0.43
8       8                     0.57
9       4                     0.29
10      12                    0.86

The most challenging items for the students were #6, #7, and #9. Only 2 students answered item #6 correctly, only 4 answered item #9 correctly, and 6 answered item #7 correctly, so the other options served as very good distractors. Items #1, #2, and #4 were answered correctly by almost everyone (all but one student for each item) and may need to be revised. For item #3, 10 students gave correct answers, so the other options are acceptable distractors. For item #5, 9 students gave correct answers, so the other options are good distractors. For items #8 and #10, 8 and 12 students respectively gave correct answers, so the other options also worked well as distractors.

Item Discrimination (I.D.)

Item    High scorers (top 4) correct    Low scorers (bottom 4) correct    I.D.
1       4                               3                                 0.29
2       4                               3                                 0.29
3       4                               1                                 0.86
4       4                               3                                 0.29
5       4                               1                                 0.86
6       2                               0                                 0.57
7       2                               3                                 -0.29
8       4                               1                                 0.86
9       1                               0                                 0.29
10      4                               3                                 0.29
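The interpretation applied in this section, together with the 0.35-1 acceptable range cited for Part I, can be folded into a small helper. The verbal labels below are illustrative assumptions, not part of the original report:

```python
def id_quality(id_value, acceptable_min=0.35):
    """Classify an item discrimination value against the report's 0.35 cut-off."""
    if id_value < 0:
        return "negative: low scorers outperformed high scorers; revise item"
    if id_value < acceptable_min:
        return "weak: little separation between groups"
    return "acceptable"

# Part IV I.D. values as reported
part4_ids = [0.29, 0.29, 0.86, 0.29, 0.86, 0.57, -0.29, 0.86, 0.29, 0.29]
for item, value in enumerate(part4_ids, start=1):
    print(item, value, id_quality(value))
```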

Because items #1, #2, #4, and #10 got an I.D. of 0.29, these items distinguish only slightly between the high scorers and the low scorers. Items #3, #5, and #8 got an I.D. of 0.86, which indicates that they distinguish between the high scorers and the low scorers well.

Reflection and Discussion

Traditional assessment was used in the above-mentioned group project; however, after designing, administering, and evaluating the test, one could easily think of different ways to assess vocabulary recognition and production that are more student-friendly. This section will focus on a different style of evaluation, alternative assessment, and how it can be used in a content-based course aiming to test vocabulary. Each technique must be specified for the EAP (English for Academic Purposes) aspect of the course, as mentioned in the teacher's overall teaching approach in the background context. This section will also look at how the test shown in Appendices A and B could have been converted to an alternative assessment while addressing the same objectives.

There are various types of alternative assessment. For instance, the test could have been approached as a performance test, in which the students would be expected to perform a particular job or set of situated functions (Bailey, 1998). The main difference between a traditional multiple-choice technique and a performance test is the positive backwash students receive. So, in the given test of this paper, the students could be asked to pretend they were in a

committee to determine whether a school district should adopt single-sex schooling and told to correct an informal proposal from a fellow committee member using the given vocabulary words. This would ensure they understood the vocabulary words well enough to know when they should be used. A potential issue with this approach would be ensuring the students did not hear one another's answers; therefore, the assessment would most likely have to take place in a one-on-one conference setup, which detracts from the practicality of the test.

Genesee and Upshur (1996) suggest using portfolios as an alternative assessment. They claim that through portfolios, the development of student language can be showcased not only to the teacher but also to classmates and parents, which improves the students' overall involvement and achievement. Because this is a highly student-centered assessment, conferences with the students are also recommended so that the teacher can take notes on their progress. Duong, Nguyen, and Griffin (2011) suggest that portfolios are popular because their components display both the process and the product. Here, the test featured in Appendices A and B will be discussed in terms of how it could be changed into a portfolio, culminating in a teacher conference, in order to see whether the students have reached the same objectives. A full portfolio would include various components as a summative assessment at the end of the term. However, because the students were only about halfway through their term, a mid-semester portfolio would be prudent as a formative assessment.

To use a portfolio as a formative assessment of vocabulary, one must first examine the benefits of portfolios as formative assessment in general. In a recent study by Lam and Lee (2010), portfolio-based formative assessment was conducted with

students in Hong Kong. Several formative strategies were adopted in the study: ongoing teacher feedback, peer review, and conferencing. In a questionnaire, many students wrote that they liked the portfolio process because they had more autonomy to choose their best work to be graded (Lam & Lee, 2010). Others mentioned the positive environment and the improvement they noticed in their work over time because of the teacher conferences.

When considering adapting the project test detailed in this paper into a portfolio assessment, specifying the process that the students would go through is imperative; establishing a clearly defined purpose for the portfolio is the most problematic aspect of implementation (Johnson, 1996). The purpose presented to the students would be a formative assessment meant to display their progress in acquiring academic vocabulary based on the AWL. The suggested order is as follows: provide specific guidelines to students on what is expected of them in the portfolio, collect two drafts and return them graded to the students before the final submission, and finally provide the rubric for the graded portfolio. Modifying the test to include production could be one way of proceeding; however, to keep all variables but the testing technique the same, the portfolio should include only vocabulary recognition as well. To do so, the portfolio would be sectioned into the 20 words given in the project test: criteria, derive, dimension, initiate, integral, orientation, reside, site, sole, unique, alma mater, elect, blossom (in terms of a person maturing), out of reach, shortchange, exuberant, fragile (in terms of a person), self-esteem, warrant (the verb), and endorsement. Each vocabulary word would have its own page in the portfolio, and the students would need four sections under each vocabulary word.

The first section corresponds to the first objective (the student will be able to choose the correct definition for select vocabulary words from the Academic Word List (AWL)). In class, the teacher would provide the same matching section as in the project test shown in Appendices A and B, but cut into strips. There would still be more definitions than words, to ensure the students could not use process of elimination and guess correctly. The students would match each definition to the appropriate word and turn it in to the teacher. During the individual teacher conference, either during allotted class time or office hours, the teacher would explain any mistakes the student made and why the correct definition is correct. If more than one student made the same mistake, the teacher could address the situation to the group as a whole. The students would then fix their mistakes and include the correctly matched definitions as the first section under the word. Although this would detract from the practicality of the assessment, it would satisfy the ongoing teacher feedback and conferencing strategies for formative assessment mentioned by Lam and Lee (2010).

The second section, corresponding to the second objective (the student will be able to identify the meaning of select vocabulary words from the AWL based on their use in context), would be a newspaper or magazine clipping, found by the students to ensure the task is not overwhelming for the teacher, with the vocabulary word used in the correct context. This ensures that the students are able to recognize the word in context using authentic material, which follows our group's philosophy that authentic materials should be used as much as possible in order for communicative learning to succeed.
The third objective (the student will be able to distinguish between multiple meanings of select vocabulary words from the AWL based on their use in context) would be satisfied by the third section: a video or audio clip of an English user using a synonym of the vocabulary word in

context. This could be found on the internet. The student would then be tasked with transcribing the sentence (or sentences, if necessary) and providing the link for the teacher to check afterward. This would ensure that, just like Part III of the project test, students are able to recognize which words are similar enough to the original vocabulary word to show that they know it. The only potential problem with this part of the test is ensuring academic honesty: that the students will not ask native-English-speaking friends to help them find a clip using the vocabulary word in the appropriate context.

For the last section that the students must write under the word, in concurrence with the fourth objective (the student will be able to replace words from a short reading with select vocabulary words from the AWL that have a similar meaning), students would find an article on single-sex schooling and replace similar words with words from the vocabulary list provided in the guidelines. Students must show the before sentence and the after sentence. This would still be considered recognition because the students are provided with the vocabulary words and do not have to produce them from memory; however, further objectives may have to be developed. During the teacher conference, any grammatical inconsistencies can be discussed. The students would then edit their answers and place them as the fourth section under each word.

Future Inquiries

Below are inquiries raised as a result of this project and reflection:

How valid are self-assessment and peer-assessment portfolios, as described by Sharifi and Hassaskhah (2011)?

What is the best method for trialing a test before administering it to students?

What is the best criterion for deciding which vocabulary words are tested by which techniques (e.g., multiple choice without context, multiple choice with context, gap filling)?

References

Bailey, K. M. (1998). Learning about language assessment: Dilemmas, decisions, and directions. Heinle & Heinle.

Duong, M. T., Nguyen, T. K. C., & Griffin, P. (2011). Developing a framework to measure process-oriented writing competence: A case of Vietnamese EFL students' formal portfolio assessment. RELC Journal, 42(2), 167-185. Retrieved from http://ezproxy.hpu.edu/login?url=http://search.proquest.com/docview/896183826?accountid=2514

Genesee, F., & Upshur, J. A. (1996). Classroom-based evaluation in second language education. Cambridge, United Kingdom: Cambridge University Press.

Johnson, K. E. (1996). Portfolio assessment in second language teacher education. TESOL Journal, 6(2), 11-14. Retrieved from http://ezproxy.hpu.edu/login?url=http://search.proquest.com/docview/85653491?accountid=2514

Lam, R., & Lee, I. (2010). Balancing the dual functions of portfolio assessment. ELT Journal, 64(1), 54-64. Retrieved from http://ezproxy.hpu.edu/login?url=http://search.proquest.com/docview/744444171?accountid=2514

Sharifi, A., & Hassaskhah, J. (2011). The role of portfolio assessment and reflection on process writing. Asian EFL Journal, 13(1), 192-225. Retrieved from http://ezproxy.hpu.edu/login?url=http://search.proquest.com/docview/1010694326?accountid=2514

Appendix A

Vocabulary Quiz
International Education
(This quiz will take approximately 30 minutes. Each item is worth 1 point.)

Part I
Please choose the answer (a, b, c, or d) closest in meaning to the underlined word.
1) Jack thought the girl was very exuberant.
a) exciting
b) energetic (key)
c) intelligent
d) interesting

2) The fact that he is an honor roll student doesn't warrant his arrogant nature.
a) create
b) excel
c) justify (key)
d) manage
3) Studying was a(n) integral part of Kate's life as a graduate student.
a) necessary (key)
b) productive
c) annoying
d) frustrating
4) Anna felt she had sole responsibility with the group project. Anna had:
a) all the responsibility (key)
b) a lot of the responsibility
c) little responsibility
d) no responsibility
5) Luke's parents thought he had blossomed during his senior year in college.
a) developed (key)
b) changed
c) failed
d) fought
Part II
Please match each item with its corresponding definition. There will be more definitions than there are words, so choose carefully.

Vocabulary
1. Criteria _____ (key: l)
2. Derive _____ (key: a)
3. Dimension _____ (key: d)
4. Initiate _____ (key: e)
5. Integral _____ (key: f)
6. Orientation _____ (key: i)
7. Reside _____ (key: h)
8. Site _____ (key: m)
9. Sole _____ (key: b)
10. Unique _____ (key: k)

Definitions
a. To get something such as happiness or strength from someone or something
b. Only
c. To be a part of something bigger than yourself
d. A particular part of a situation
e. To arrange for something important to start
f. Necessary
g. Average or usual
h. To live in a place
i. Beliefs or interests that a person or group has
j. To rent out a space
k. Being the only one of its kind
l. Facts or standards used to help in deciding something
m. A place where something happened or where something is being built

Part III
Choose the alternative (a, b, c, or d) which is closest in meaning to the word on the left of the page.
1. Unique: a. special  b. multiple  c. usual  d. general  (Key: a)
2. Confidence: a. self-assured  b. self-doubt  c. self-esteem  d. self-distrust  (Key: c)
3. Criteria: a. benchmark  b. testing  c. possibility  d. conjecture  (Key: a)
4. Authorization: a. breach  b. warrant  c. rejection  d. dissent  (Key: b)
5. Fragile: a. durable  b. tough  c. firm  d. brittle  (Key: d)

Part IV
Read this article about single-sex schooling in the US. Complete it with the words and expressions from the box. There are more words than needed. Change the form

to fit the gap. Copy the words into the gap. Each word should be used only once in the passage

criteria, derive, unique, self-esteem, shortchange, alma mater, sole, blossom, orientation, exuberant, out of reach, reside, fragile, initiate
Single-sex Education

Advocates of single-sex education do not believe that "all girls learn one way and all boys learn another way." We cherish and celebrate the diversity among girls and among boys, but we also notice that each individual is 1_____. We understand that some boys would rather read a poem than play football. We understand that some girls would rather play football than play with Barbies. Educators who understand these differences have developed different ways to facilitate every child to learn to the best of her or his ability and help them to build their 2_____. Besides that, single-sex schools have students with different political 3______(s), so it is the school's job to lessen the reinforcement of gender stereotypes. However, because of the high cost, many single-sex schools are 4_______ for most American families. And high costs are not the 5______ challenge that single-sex education is facing. The good news is that the gender-separate form can boost grades and test scores for both boys and girls. That is one of the reasons why people 6______ single-sex schools in the U.S. in the first place. In fact, some educators and parents recognize that all too often, girls or boys are still being 7_____ in coeducational settings. They believe that boys and girls would clearly 8______ some benefit from living and studying in same-sex groups. However, the opponents believe that single-sex education reduces boys' and girls' opportunities to work together, and actually reinforces gender stereotypes. They also believe that the better educational outcome does not 9_____ in gender or gender separation. Therefore, the question is what 10_____ should we base single-sex schooling on?

Keys: 1. unique  2. self-esteem  3. orientations  4. out of reach  5. sole  6. initiated  7. shortchanged  8. derive  9. reside  10. criteria

Appendix B

Name: ________________________________        Date: ________________

Vocabulary Quiz
International Education
(This quiz will take approximately 30 minutes. Each item is worth 1 point.)

Part I
Please choose the answer (a, b, c, or d) closest in meaning to the underlined word.
1) Jack thought the girl was very exuberant.
a) exciting
b) energetic
c) intelligent
d) interesting
2) The fact that he is an honor roll student doesn't warrant his arrogant nature.
a) create
b) excel
c) justify
d) manage
3) Studying was a(n) integral part of Kate's life as a graduate student.
a) necessary
b) productive
c) annoying
d) frustrating
4) Anna felt she had sole responsibility with the group project. Anna had:
a) all the responsibility
b) a lot of the responsibility
c) little responsibility
d) no responsibility
5) Luke's parents thought he had blossomed during his senior year in college.
a) developed
b) changed
c) failed
d) fought
Part II
Please match each item with its corresponding definition. There will be more definitions than there are words, so choose carefully.

Vocabulary
1. Criteria _____
2. Derive _____
3. Dimension _____
4. Initiate _____
5. Integral _____
6. Orientation _____
7. Reside _____
8. Site _____
9. Sole _____
10. Unique _____

Definitions
a. To get something such as happiness or strength from someone or something
b. Only
c. To be a part of something bigger than yourself
d. A particular part of a situation
e. To arrange for something important to start
f. Necessary
g. Average or usual
h. To live in a place
i. Beliefs or interests that a person or group has
j. To rent out a space
k. Being the only one of its kind
l. Facts or standards used to help in deciding something
m. A place where something happened or where something is being built

Part III
Choose the alternative (a, b, c, or d) which is closest in meaning to the word on the left of the page.
1. Unique: a. special  b. multiple  c. usual  d. general
2. Confidence: a. self-assured  b. self-doubt  c. self-esteem  d. self-distrust
3. Criteria: a. benchmark  b. testing  c. possibility  d. conjecture
4. Authorization: a. breach  b. warrant  c. rejection  d. dissent
5. Fragile: a. durable  b. tough  c. firm  d. brittle

Part IV
Read this article about single-sex schooling in the US. Complete it with the words and expressions from the box. There are more words than needed. Change the form to fit the gap. Copy the words into the gap. Each word should be used only once in the passage.

Single-sex Education

criteria, derive, unique, self-esteem, shortchange, alma mater, sole, blossom, orientation, exuberant, out of reach, reside, fragile, initiate
Advocates of single-sex education do not believe that "all girls learn one way and all boys learn another way." We cherish and celebrate the diversity among girls and among boys, but we also notice that each individual is 1_____. We understand that some boys would rather read a poem than play football. We understand that some girls would rather play football than play with Barbies. Educators who understand these differences have developed different ways to facilitate every child to learn to the best of her or his ability and help them to build their 2_____. Besides that, single-sex schools have students with different political 3______(s), so it is the school's job to lessen the reinforcement of gender stereotypes. However, because of the high cost, many single-sex schools are 4_______ for most American families. And high costs are not the 5______ challenge that single-sex education is facing. The good news is that the gender-separate form can boost grades and test scores for both boys and girls. That is one of the reasons why people 6______ single-sex schools in the U.S. in the first place. In fact, some educators and parents recognize that all too often, girls or boys are still being 7_____ in coeducational settings. They believe that boys and girls would clearly 8______ some benefit from living and studying in same-sex groups. However, the opponents believe that single-sex education reduces boys' and girls' opportunities to work together, and actually reinforces gender stereotypes. They also believe that the better educational outcome does not 9_____ in gender or gender separation. Therefore, the question is what 10_____ should we base single-sex schooling on?
