10 Feb Helping Learners with Exam Writing
Author: David Petrie. Article originally published in issue no. 39 of the IH Journal.
Generally speaking, writing isn’t precisely the students’ cup of tea… And when they have to practise writing for exams (IELTS, TOEFL, the Cambridge suite), they might be extra reluctant. This article offers a few practical suggestions on how to engage learners in completing writing tasks and better prepare them for exams.
In this first column, we’ll be looking at an area that learners always say they want to improve, know they need to improve, and go to any lengths to avoid actually doing: Writing.
When it comes to writing, learners can be overly task-oriented. They see their writing as something they are doing for you, the teacher, in order to prove their mastery over the language, which they see (or have been taught to see) as mostly consisting of bits of vocabulary held together by strands of grammar. Or possibly the other way round. In this view, it is enough to produce just over 140 vaguely topic-related words (depending on level), all of which have been carefully counted so as to ensure that none are missing. We are, after all, only assessing their English, so the content itself doesn’t actually matter.
And a lot of the time, this approach works. Until the students hit an exam class where things like answering the question become important…
Here, then, are a few ideas to help learners move past their preconceptions and towards a more successful piece of exam writing. These suggestions are aimed more at teachers of IELTS, TOEFL, and the Cambridge English main suite exams, though teachers of other exams will also hopefully find them useful.
Brainstorming and Ranking
Most exam tasks are based around a core general topic with about three key sub-topics. For example, the Cambridge English: First handbook (page 31) gives an essay topic of pollution and environmental damage, with the sub-topics of “transport”, “rivers and seas” and “your own idea”.
For this activity you need a range of board pens. Before the learners see the essay task, write up the topic – in our example this is “Pollution and Environmental Damage” – and ask learners to write up as many sub-topics or ideas on the board as possible. For added spice, this could also be a board race. When done, ask learners to work in pairs or small groups to create categories for these ideas, bearing in mind that some ideas could go in more than one category, and to make a record in their notebooks. This can be done as a table or as a spidergram / mind map type diagram. The purpose here is for learners to think about how the ideas fit together.
At this stage, you can reveal the writing task. It should be immediately obvious that some of their ideas are more relevant than others, so ask the learners (still in their pairs or small groups) to rank the categories and / or the ideas they wrote down in order of importance and relevance to the task. This could lead to a useful discussion on what to include and why, and, perhaps more importantly, what to leave out! As a further flexi-stage, you could run the ranking exercise as a pyramid discussion, where learners initially make choices on their own, then agree a set of choices with a partner, then in a group of four, a group of eight, and so on.
This thinking about which ideas are important, relevant and useful should help inform any further planning that takes place, perhaps in relation to the task type and structure – for example, which ideas could go in which paragraphs and why.
Reader / Writer Role Plays
It is perhaps unfair to expect a 16- or 17-year-old to write meaningfully on problems with integrating new staff into a company, or even to have an idea of what editors consider important to include in theatre reviews, as it will probably be outside the realm of their experience; yet this is what some tasks ask learners to do. Most learners will only have produced longer texts for teachers assessing their content knowledge (e.g. in a history class) and so familiarising them with the concept of a reader that exists beyond the teacher becomes vitally important. Very often I find students understand the idea in the abstract, but don’t understand the concrete implications of the situation.
Fortunately, almost all exam tasks contain a mythical “target reader”: often the principal of a college, the editor of a magazine or the English-speaking friend who is shortly to visit their country. The contexts created by the writing task then set up the dynamic of the reader / writer role play.
Here is a sample Cambridge English: Preliminary writing task:
Your friend, Chris, has invited you to a special party which he is organising for your college teacher.
Write an email to Chris. In your email, you should
- accept the invitation
- suggest how you could help Chris prepare for the party
- ask Chris for some ideas for a present for your teacher.
And here are two roles that you can derive from this task:
You are organising a surprise party for your teacher. Think about all the things that need doing. Think about all the things that you need help with. Now talk to your friend and invite them to the party. Try and get them to help you as much as possible!
Your friend Chris is organising a party for your teacher. You want to help. You don’t want to do a lot! You need some help to think of a good gift for the teacher.
Asking the students to act out the scenario lifts the task off the page and makes it more meaningful and more realistic. The roles themselves are quite straightforward, though to make the activity more engaging it helps to add a degree of contrariness! It also helps learners react to the readers in the tasks, rather than solely to the tasks themselves.
Using Peer Feedback
The main purpose of this activity is to familiarise learners with the criteria that are used in assessing their work, which are freely available in most exam handbooks or online.
Learners are often reluctant to offer feedback on each other’s writing: partly because they don’t want to seem to criticise each other unfairly, partly because they aren’t sure what to say, and partly because they don’t actually know how to assess their own, or anybody else’s, work. Very often when you ask learners to look at their own or each other’s work, they tend to focus only on linguistic errors, as these are the more apparent, surface-level issues and the areas where they feel more confident in pointing out mistakes. Getting learners to move past the linguistic and to assess, and be more aware of, deeper structural or task-related issues is more difficult.
Fortunately, descriptions of what the examiners expect to find in a good piece of writing are readily available from relevant websites:
- IELTS: http://takeielts.britishcouncil.org/find-out-about-results/ielts-assessment-criteria
- TOEFL: https://www.ets.org/toefl/teachers_advisors/scores/guides/
- Cambridge Exams: http://www.cambridgeenglish.org/exams/
These descriptions are available as PDF downloads (for Cambridge English, you will need to select your relevant exam and possibly download the handbook), but in many cases they are difficult for learners to understand. What, for example, would your learners understand by “may demonstrate inconsistent facility in sentence formation and word choice” (TOEFL)?
With higher levels, e.g. C1 or C2, it might be possible to simply download the assessment criteria or grid, give it to the learners to look at, and discuss the criteria and what it all means before they go on to look at some sample writing. If you photocopy all the written work before you mark it, the learners will have a copy they can scrawl all over to their hearts’ content. You may also want to anonymise the written work by covering up learner names, and then make sure you don’t give learners back their own work! If you have more than one class at the same level, you can use written work from different classes. You could also arrange swaps with your colleagues.
With lower levels, you may need to produce a simplified version of the assessment criteria. For example, the Cambridge English: First writing criteria for language deals with lexical range, grammatical range and accuracy, from the very good to the very poor. This might be difficult for learners to access and so might be better broken down as a range of simpler questions like: Does the writer use lots of different words or do they repeat the same words a lot? Do they use words correctly? Do they use different grammatical structures or do they use the same structures and tenses a lot? How many mistakes do they make? Can you understand the text with the mistakes?
The aim of the process is for learners to develop awareness of what constitutes “good writing” in the exam context; it is not so much about learners providing useful feedback to each other. That is perhaps a by-product of the process, and an indicator of how well they understand the criteria (the more useful the advice, the better the understanding).
All of these ideas try to help learners see writing as more than “getting the grammar right”, and to focus learners on the purpose and goals of a piece of writing, rather than just the words they put on the page. I hope these ideas prove useful for any exam teachers out there. I’m always happy to get feedback, and if there is anything you would like me to address in a future column, please don’t hesitate to let me know.
Author’s Bio: David Petrie teaches in Coimbra, Portugal and is also a teacher trainer with IH OTTI. He is DELTA qualified and has an MA in Applied Linguistics & TESOL. Professionally, his interests lie in the area of teaching exam classes and ways of using technology in teaching. He blogs about this and everything else ELT-related at www.teflgeek.net and can also be found on Twitter as @teflgeek.
International House Bucharest runs regular CELTA, DELTA and IHCYLT courses for teachers of English at its Teacher Training Centre, as well as training events online, onsite or at partner locations. To sign up for a course or event, contact firstname.lastname@example.org.