Assessment is…

Assessment is no longer, if it ever truly was, paper-and-pencil tests focused on language knowledge. With oral interviews, open-ended questions, portfolio work, and student-generated exams, assessment has grown to better meet the needs of individual language learners in specific contexts. And yet, assessment is still filled with stories of heartbreak and missed opportunities. In this issue, iTDi members carry on the discussion of what assessment often is, what it can be, and where it might be moving.

The ‘Why’ of Testing by Kevin Stein
Assessment is Sometimes Heartbreak by Naomi Epstein
Reassessing Assessment by Dave Dodgson

The ‘Why’ of Testing

March is almost over. When it ends, so officially ends the school year here in Japan, which means that tests have been marked, desks wiped down, and grades entered into each student’s report card. Again this year I am left with the nagging feeling that those grades, those indelible marks of ‘achievement’, fail to capture the important story of what learning is all about. But at least this year there were moments when I did feel that assessment was doing its job, that students were taking the kinds of tests which highlighted their own development and which fostered, rather than inhibited, learning. I would like to share two of the alternative assessment techniques I used in my classroom this year. While neither was perfect, both helped my learners, and me, come to a better understanding of how and why we take tests at my school.

Student Designed Testing

This year, I taught one ‘standard’ English class built around a lexis- and grammar-focused syllabus. Each unit consisted of 3 target grammar points and a 1500-word article containing a high number of set phrases and multi-word verbs. Instead of focusing on memorisation of vocabulary and grammar rules, I spent the majority of class time helping students develop the underlying skills they need to become more effective language learners. We worked on summarising skills, on creating comprehension questions as a means of checking understanding against a classmate’s, and on techniques for identifying multi-word verbs and phrases. At the end of each unit, one group of four students was designated to design the unit quiz, which was then administered to the other students. While some of the skills students learned in class made it onto each quiz, most of the questions required nothing more than memorisation of large chunks of text and the application of discrete grammar rules. When I followed up with students, one group explained that for the standard tests they would be taking in the future, whether a university entrance exam or the TOEIC test, knowing lots of vocabulary and grammar rules would help them get the best possible grade. It was a classic example of negative washback: the tendency of a test to dictate what gets taught. Only in this case, it was the students themselves who had chosen to learn specifically for a high-stakes test they would be taking sometime in the future.

Collaborative Learning Assessment & Social Testing

In a recent article on ‘social testing’, Tim Murphey (2013, p. 30) points out that, “our minds are no longer, if ever they were, isolated, independent, and individual entities, but rather our minds and our brains are interconnected and networked and work best with other minds in collaboration.” The way we test students on their learning is becoming more and more disconnected from the way students use that learning in the real world. By expecting students merely to study in order to fill in blanks on a piece of paper, or to answer questions put to them by a teacher during an oral interview, we arbitrarily cut them off from the collaborative process of meaning-making that is crucial to most learning outside the classroom. With that in mind, the final exam for my intermediate four-skills English class this year consisted of the following stages:

  1. Students, in small self-selected study groups of 3 or 4, had two 50-minute class periods (as well as any time they wanted to use after school) to work through their class notes and create a 3×5 inch study-card containing all the information they thought might be useful when taking the final exam.
  2. Students took the test with the aid of their study-card.
  3. At the beginning of the next class, I gave the tests back to the students unmarked. They were then given 15 minutes to make changes or additions to their exam using a red pen. They were not allowed to use their study-cards at this point.
  4. Students were then given 10 more minutes to consult with members of their own study group and make further changes.
  5. Students were given 10 minutes to consult with members of other groups and make any final changes or additions to their answers.
  6. I collected and graded the exams giving an original score, a score after revision, and a final score which was an average of the two.
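The scoring in the final step is simple arithmetic. As a hypothetical sketch (the function name and the 50/50 weighting are my own framing of “an average of the two”, not a marking scheme the post spells out), it might look like:

```python
def final_score(original: float, revised: float) -> float:
    """Average the original exam score with the post-revision score,
    so collaborative corrections count, but only for half the grade."""
    return (original + revised) / 2

# A student who scored 62 working alone and 88 after group revision
# ends up with (62 + 88) / 2 = 75.
print(final_score(62, 88))
```

The averaging keeps an incentive to prepare individually while still rewarding what is learned during the social revision stages.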

I realize that, on the face of it, this certainly doesn’t seem like much of an evaluation. Students who had not, in any traditional sense, ‘learned’ the class content still had the potential to receive a passing grade. But in the end, the students’ grades on this test, which was extremely long and covered almost all the material from 8 months of coursework, did not differ significantly from their in-class scores up to that point. I would also argue that, through this process, students gained something vital to their future as language learners. They learned to see studying for exams as a dynamic process and to recognise the importance of social capital. As Murphey points out, there is real value in helping students develop “their ability to learn more socially rather than testing an isolated brain unconnected to others” (ibid., p. 29). In feedback interviews at the end of the school year, many students highlighted how much more information they retained from this final exam than from other tests. One student in particular said, “I always study just so I can pass the test. Then I forget everything as fast as I can because I really hate studying. But this time, I remember most of what was on the test. Especially the things I learned from the other students.”

A final word (which is hardly the final word)

Just as our students rarely focus on and learn what we think we are teaching during regular class time, my students’ ideas of the value and importance of a test are often very different from my own. Some take tests purely to get a good grade. Some to prepare themselves for a future, even more difficult test. Some for the pleasure of learning. Perhaps an important side-effect of thoughtful assessment is to allow students to recognise the various and sometimes conflicting reasons for taking a test. Before students put pencil to paper or begin to review their notes, they should be encouraged to consciously evaluate for themselves why they are taking an exam, to ask, “Why and how am I preparing for this test?” If they can find a meaningful answer to that question, they might see the evaluation process as a valuable opportunity to grow as a language learner, regardless of the grade handed back to them at the end of the year.

References:

Murphey, T. (2013). Social testing: Turning testing into healthy helping and the creation of social capital. PeerSpectives, 10, 27–31.

 

Assessment is Sometimes Heartbreak

It’s that time of year again at the high school. The twelfth graders are about to take a series of final exams before graduating. Every year there are a few students who break my heart, but this year one student stands out in particular.


We’ll call him P. Just like the other heartbreaking students before him, he “bought” the school system’s slogan “hard work = success”: he worked hard, did his homework, missed very few classes, and reviewed the material. Unlike those other students, he remembers vocabulary items better than most of the students in all my classes. He’s curious about words and brings in words he encounters online. Even more remarkably, he demonstrates more extensive world knowledge than many of the other Deaf and hard-of-hearing students I teach.
This week we had another mock exam in preparation for the finals. Unseen reading passages are the most important section of our final exams. Students at his level are required to answer questions that summarize the main idea of a paragraph and demonstrate a command of vocabulary and grammar.


The topic of the unseen reading passage was changes at NASA. P. knew what NASA was — not something to take for granted in my classes. He remembered to use the highlighter the way we practiced, and he told me proudly that he had remembered some of the words without using the dictionary.
Once again P. got the lowest grade in his class: a barely passing grade, lower than students who, to put it politely, are not model students at all.
Despite all the ways we work on reading comprehension, he can’t seem to integrate the information in a text well. Some things (on every test) baffle him even after we discuss them in his mother tongue. In this text there was a paragraph explaining how, in the past, only NASA employees could work on space projects (today the situation is different; that is one of the changes the passage presents). P. simply could not understand the answer to the question about who used to pay the people who worked on space projects, even though he was able to translate the word “employee” correctly into his mother tongue. We discussed it for 10 minutes afterwards and he still did not see how the word “employee” implied the source of the payment. I tried to give examples closer to his reality (in his mother tongue!), such as the fact that I teach him at school (I’m not his employee) vs. a private tutor who could come to his home (he would then be the employer). P. still didn’t understand it. Other students had no problem with this question!


After every test, P. looks so disappointed to see his peers get higher grades while he barely passes. He knows he works harder than they do. He looks at me, and what can I say?!
We just continue practicing…

Reassessing Assessment

Think of assessment and many teachers think of tests: that fixed moment in time when all the learning of the past weeks, months, or years comes down to a series of multiple-choice questions about grammar, vocabulary, and course content, with perhaps a writing task or reading comprehension exercise thrown in.

Tests are an enduring feature of education. In every language learning setting I have worked in, they have been a regular and dominant presence, used to determine success or failure, to report grades, and to judge how ‘good’ someone is at English.

There are alternatives of course: project work, portfolios, oral assessments, self-assessment and more, some of which I have discussed in a previous iTDi post. I previously worked with young learners and tried to give them as much of a say in their assessment as possible, even getting them to compose their own exam questions, but this had limited success. I did it for my classes in the year group I was teaching, but it was not adopted elsewhere in the school. Prescribed questions on discrete language items still held sway, and I expect my old area of influence has now reverted to the same format.

However, my new job has offered the chance to start afresh in many ways. I am now the coordinator of a new language school for adults in Gabon. I have set up the school, designed the learning programme, and implemented an assessment programme from scratch. This post will detail how I have approached this task.

In essence, we have three kinds of ‘tests’: ‘placement tests’ to place new students at the right level; an ‘exit test’ at the end of each 6-week course; and external exams such as TOEIC and TOEFL. The external exams have been included because of market demand. Even here in central Africa, such exams are much sought after by employers and universities, so they are a necessary component of our programme. Like them or not, they are a demand we have to cater to, and there is little we can do to change the style or content of those tests.

However, I have had the chance to affect the style and content of those other tests. The quick turnaround of courses means there is little time to get students to produce their own questions but there are other ways I can get these tests to be more meaningful, purposeful, and therefore more indicative of the students’ language level.

The placement tests

  • They begin with the speaking component. There are a few ‘starting point’ questions but our teachers are trained to let the conversation flow depending on the input of the student.
  • Next come some short reading and writing tasks (graded according to performance in the speaking test), but with no multiple-choice questions or strict instructions. The tasks are designed to provoke thought, and the questions are open-ended to give students a chance to show what they can produce rather than what grammar they know (or can guess!).
  • This gives a much more accurate impression of where a student will fit into our learning programme than a multiple-choice test would. Students can freeze when faced with a grammar-based test; conversely, they can get lucky with their choices on an A/B/C/D test. Our speaking test gives them the chance to loosen up, and our written test gives them the chance to express themselves, which fits much better with the philosophy of our school.

End of course assessment

  • It actually begins at the start of the course! In the first week of a new 6-week cycle, the students are given a written task to complete and a speaking activity to engage in. The writing task provides a wealth of information about where the student is in terms of their English ability and is used to inform the focus of the course (see this recent post by Willy Cardoso for another explanation of this).
  • The same tasks are then repeated in the final week of the course. The students and teacher compare the earlier effort with the more recent one to identify areas of progress and things that still need work. This is great for showing students how far they have come and for making learning plans for future courses.
  • The final week also includes a self-assessment task asking the student to identify their progress and what they are struggling with. This is then discussed with the teacher in an end-of-course feedback meeting.
  • There is ‘pen and paper’ assessment as well but it contains no grammar or vocabulary questions. Instead, it features an ‘input text’ (a reading passage, an audio recording, or a video). Students are then asked to respond to it through a series of open-ended questions or a short written response.
  • All of this tests not what points of grammar they have memorised but what language they can use and how they can express themselves. Surely that is what language learning is all about!

I have been lucky in this job. I was asked to design a learning and assessment programme and was trusted to implement my own ideas. Not everyone is in this position. You may work in a school where there is a designated ‘test writer’, or where the tests that come with the course book are used as standard.

But what I would say to that is ask ‘why?’ Why is a test prepared and imposed by someone from outside the class? Why is an isolated test of the content of recently taught units used? Ask these questions. Suggest alternatives. Start a conversation about assessment. Decide with your colleagues and with your students what approach to testing will work best to measure language ability and to inform future learning.

That is where the importance of testing lies, not in the summative or the formative but in the informative. Start the conversation, get the information, and use it to move forward.