Mention assessment and many teachers think of tests – that fixed moment in time when all the learning of the past few weeks, months or years comes down to a series of multiple-choice questions about grammar, vocabulary and course content, with perhaps a writing task or reading comprehension thrown in.
Tests are an enduring feature of education. In every language learning setting I have worked in, they have been a regular and dominant presence. They are used to determine success or failure, report grades, and judge how ‘good’ someone is at English.
There are alternatives of course: project work, portfolios, oral assessments, self-assessment and more, some of which I have discussed in a previous iTDi post. I previously worked with young learners and tried to give them as much of a say in their assessment as possible, even getting them to compose their own exam questions, but this had limited success. I did this for my classes in the year group I was teaching, but it was not adopted elsewhere in the school. Prescribed questions on discrete language items still held sway, and I expect my old area of influence has now reverted to the same format.
However, my new job has offered the chance to start afresh in many ways. I am now the coordinator of a new language school for adults in Gabon. I have set up the school, designed the learning programme, and implemented an assessment programme from scratch. This post will detail how I have approached this task.
In essence, we have three kinds of ‘tests’: ‘placement tests’ to put new students in the right level; an ‘exit test’ at the end of each 6-week course; and external exams such as TOEIC and TOEFL. The external exams have been included because of market demand. Even here in central Africa, such exams are much sought after by employers and universities, so they are a necessary component of our programme. Like them or not, they are a demand we have to cater to, and there is little we can do to change the style or content of those tests.
However, I have been able to shape the style and content of our other tests. The quick turnaround of courses means there is little time to get students to produce their own questions, but there are other ways I can make these tests more meaningful, purposeful, and therefore more indicative of the students’ language level.
The placement tests
- They begin with the speaking component. There are a few ‘starting point’ questions, but our teachers are trained to let the conversation flow depending on the student’s responses.
- There then follow some short reading and writing tasks (graded according to the student’s performance in the speaking component), but not with multiple-choice questions or strict instructions. The tasks are designed to provoke thought, and the questions are open-ended to give the student a chance to show what they can produce rather than what grammar they know (or can guess!).
- This gives a much more accurate impression of where the student will fit into our learning programme than a multiple-choice test would. Students can freeze when faced with a grammar-based test; conversely, they can get lucky with their choices in an A/B/C/D format. Our speaking test gives them the chance to loosen up, and our written test gives them the chance to express themselves, an approach that fits much better with the philosophy of our school.
End-of-course assessment
- It actually begins at the start of the course! In the first week of a new 6-week cycle, the students are given a written task to complete and a speaking activity to engage in. The writing task provides a wealth of information about where the student is in terms of their English ability and is used to inform the focus of the course (see this recent post by Willy Cardoso for another explanation of this).
- The same tasks are then repeated in the final week of the course. The students and teacher compare the earlier effort with the more recent one to identify areas of progress and things that still need work. This is great for showing students how far they have come and for making learning plans for future courses.
- The final week also includes a self-assessment task asking the student to identify their progress and what they are struggling with. This is then discussed with the teacher in an end-of-course feedback meeting.
- There is ‘pen and paper’ assessment as well but it contains no grammar or vocabulary questions. Instead, it features an ‘input text’ (a reading passage, an audio recording, or a video). Students are then asked to respond to it through a series of open-ended questions or a short written response.
- All of this tests not what points of grammar they have memorised but what language they can use and how they can express themselves. Surely that is what language learning is all about!
I have been lucky in this job. I was asked to design a learning and assessment programme and I was trusted to implement my own ideas. Not everyone is in this position. You may work in a school where there is a designated ‘test writer’, or where the tests that come with the course book are used as standard.
But what I would say to that is: ask ‘why?’ Why is a test prepared and imposed by someone from outside the class? Why is an isolated test of the content of recently taught units used? Ask these questions. Suggest alternatives. Start a conversation about assessment. Decide with your colleagues and with your students what approach to testing will work best to measure language ability and to inform future learning.
That is where the importance of testing lies, not in the summative or the formative but in the informative. Start the conversation, get the information, and use it to move forward.