A Little Knowledge is a Dangerous Thing
by Magnus Coney
I’ve always been a fan of research. Primarily as a consumer, and occasionally as an uninformed critic, culminating in a presentation at the IH AMT conference where I cautioned against overreliance on the work of John Hattie himself, a giant among education researchers. My nonexistent track record of actually doing any research nagged at me, so at the next year’s conference I was thrilled to hear about the new IH Action Research Course (ARC), run by David Petrie.
Putting the ARC into practice
The final task on the course was to plan our own action research project. I decided to base mine on one of my earliest ELT-related interests: the Dogme philosophy. This was, in fact, my first encounter with the idea of thinking about what effective teaching is according to research, rather than just “You should do this, my students loved it!”. At this stage, however, I must admit that the main appeal was the possibility of reduced planning and preparation time, rather than any careful consideration of the evidence.
The project itself was fairly simple, using the following process:
- During each lesson, keep a record of planned and emergent language items that were encountered by students.
- Create a test where Ss would have to produce these items.
- Give the test the following week.
- Repeat.
In the end we went through this process three times, for various reasons including absences, public holidays and lessons which were skewed too heavily towards one type of language or the other. I then put all the tests from all the students together and counted the number of planned items and the number of emergent items that had been retained correctly. The results were as follows:
| Item type | Retained correctly |
| --- | --- |
| Planned items | 10 out of 38 (26%) |
| Emergent items | 41 out of 58 (70%) |
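For anyone who wants to check the arithmetic, or rerun it with data from their own classes, the calculation is simple enough to script. The short Python sketch below is purely illustrative; the function name and layout are my own, and the counts are simply the totals reported above.

```python
# Minimal sketch: retention rate = items produced correctly / items tested.
# The counts are the totals reported in the table above.

def retention_rate(correct, total):
    """Percentage of tested items that were produced correctly."""
    return 100 * correct / total

planned = retention_rate(10, 38)   # planned items: 10 of 38 correct
emergent = retention_rate(41, 58)  # emergent items: 41 of 58 correct

print(f"Planned:  {planned:.1f}%")             # 26.3%
print(f"Emergent: {emergent:.1f}%")            # 70.7%
print(f"Ratio:    {emergent / planned:.1f}x")  # roughly 2.7x
```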
It would have been easy at this point to write some kind of sensationalist article along the lines of “Focus on form emerges victorious over focus on forms: nearly three times more effective”, but, as with many headlines, this runs the risk of making a claim that is not fully backed up in the article. So, I sat down for an exciting couple of hours with my results and a calculator, and set about trying to see what I’d missed.
Digging Deeper
Was it one particularly good or bad week?
| Test | Planned items correct | Emergent items correct |
| --- | --- | --- |
| Test 1 | 17% | 67% |
| Test 2 | 50% | 90% |
| Test 3 (2 weeks after, not 1) | 13% | 55% |
There is some variability here which would be worth further investigation, but the overall trend of emergent items being more memorable than planned items is maintained.
Was it one particularly strong or weak student?
| Student | Planned items correct | Emergent items correct |
| --- | --- | --- |
| S | 36% | 69% |
| G1 | 13% | 50% |
| C1 | 18% | 72% |
| C2 | 22% | 90% |
| G2 | 33% | 80% |
Some more variability here, but again the overall trend is reflected.
Were the items all equally difficult?
This is harder to tell with a mix of structures (e.g. How long have you…), chunks (e.g. Think outside the box), single words (e.g. survey) and so on, so I judged it by the number of words in each item. Here there was a clear difference: the planned items were on average 2.73 words long, while the emergent items were on average only 1.63 words long. So, I eliminated all the longer items and compared only those of 1 or 2 words in length.
| Item type | Correct (1- and 2-word items only) |
| --- | --- |
| Planned items | 31% |
| Emergent items | 77% |
Both percentages increase, but the difference is maintained.
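As a rough illustration of that filtering step, here is a hypothetical sketch: the records are invented placeholders standing in for my real test data (which I have not reproduced here), but the logic of keeping only items of one or two words and recalculating the rates is the same.

```python
# Hypothetical placeholder records standing in for the real test data;
# one entry per item per student, marking whether it was produced correctly.
records = [
    {"item": "survey", "type": "emergent", "correct": True},
    {"item": "think outside the box", "type": "planned", "correct": False},
    # ... and so on for every tested item
]

def rate(rows):
    """Percentage of the given records answered correctly."""
    return 100 * sum(r["correct"] for r in rows) / len(rows)

# Keep only items of one or two words, then recalculate per item type.
short = [r for r in records if len(r["item"].split()) <= 2]
for kind in ("planned", "emergent"):
    subset = [r for r in short if r["type"] == kind]
    if subset:
        print(f"{kind}: {rate(subset):.0f}% correct")
```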
Were all the items tested in the same way?
This is important – different ways of testing pose different levels of challenge, depending on whether the task is receptive or productive, the level of control and ambiguity in the answers, the clarity of any prompts, et cetera:
| Item type | Task types |
| --- | --- |
| Planned items | 2 translations, 6 synonyms, 1 prompt to elicit item, 1 gapfill, 1 definition to elicit item |
| Emergent items | 6 translations, 3 “what’s the difference between item X and item Y”, 3 gapfills, 2 sentence completions, 2 definitions to elicit item |
So here there is certainly an issue with consistency, and future replications of this research would need to be more consistent than I was.
Conclusions
My original hypothesis was that emergent language would be more memorable than planned language, and this was apparently borne out. In my opinion, however, several limitations remain:
- The small sample size, in terms of the number of students, the number of items and the number of tests.
- My enthusiasm for emergent language may have been infectious.
- Some of the emergent items may have been of less value to these Business English students than the pre-planned material, which came out of a needs analysis at the start of the course.
- The difficulty of measuring the complexity of different items.
- The different ways in which I dealt with both types of items.
- Only 26% of planned items remembered after a week? Is my teaching that bad?
Despite these limitations, I think the important elements are that this was classroom-based research, with real lessons and real students, that it focused on longer-term retention, and that a focus on form and a focus on forms were directly compared – a combination I have not been able to find in the existing literature (although my literature review was admittedly rather minimal). Like any true scientist, I’d be very happy to be proved wrong (anyone with time and access to research papers could look into Doughty and Williams’ work on focus on form, for example).
A more general, philosophical conclusion also comes to mind here. What I’ve tried to do in this article is to think like a real scientist; that is, make a hypothesis, test it, look for evidence, and be ready to change your mind in the light of new information. Indeed, a habit of actively seeking to disprove yourself is something we could all develop, scientists or not.
So, what’s next?
A key issue for much research is a lack of replication. Scientific journals often prefer to publish new findings, especially those with interesting results, and as a result many studies are only ever conducted once. What is needed is for more people to try to replicate and improve on existing studies like this one, to see whether the reported effects are consistent. If everyone changed their approach to teaching purely on the basis of this slightly amateurish small-scale study, I’d be perturbed, to put it mildly. As the last lines of most research papers put it, “More research is needed”!
Author’s Bio: Magnus Coney is the teacher training coordinator at IH Milan, running the Cambridge CELTA and other courses, as well as teaching a range of ages and levels. As he gets older he is increasingly open to getting things wrong and finding out why.
References and further reading
Doughty, C. and Williams, J. (eds.) 1998. Focus on Form in Classroom Second Language Acquisition. Cambridge: Cambridge University Press.
https://criticalelt.wordpress.com/ (A blog by Geoff Jordan, who is a passionate advocate of a focus-on-form approach. Read the comments sections for spirited counter-arguments!)
https://puravidadogme.wordpress.com/ (More action research on the Dogme philosophy)