Assessing spoken language through the oral proficiency interview
Date
2011
Remarks
See the Related Links field for a link to the full Proceedings
Abstract
In response to the identification of the English language proficiency of students in Australia's universities as an issue of concern, many universities have introduced post-entry language assessments (PELAs), either at an institutional level or within a specific disciplinary area. Most of these instruments take the form of a writing assignment, but there is growing recognition that many students may require further development of their oral language, and that this, too, should be assessed. This paper presents the findings of a small-scale study which sought to explore the differences in language produced by the same candidates in different types of oral proficiency interview. This form of assessment was selected not only because it is commonly used in a variety of pre-tertiary and post-entry contexts to assess oral language proficiency, but also because academic course and unit coordinators often participate in one-to-one interviews or meetings with students, and may make judgements about their oral language capabilities on the basis of those encounters. The study compared candidates' oral language use in three different interactive formats: a scripted interview with a live interlocutor, an unscripted interview with a live interlocutor, and an 'interview' comprising responses to a set of pre-recorded prompts. The study was conducted with twelve participants from a range of language and cultural backgrounds, all of whom spoke English as an additional language (EAL). The results indicated that while some significant differences were observed according to which of the three formats the candidates had undertaken, the influence of the live interlocutor on candidates' language output also appeared to extend beyond the format of the test to differences in the interlocutors' personal styles.
The paper concludes that the identification of differences, even in the brief extracts of language produced within the study, reinforces the need to exercise caution when designing and conducting an oral PELA, so that candidates are not disadvantaged by the format of the assessment.
Related items
Showing items related by title, author, creator and subject.
-
Dunworth, Catherine M. (2001) This study was initiated as a result of the appearance of a number of articles and commentaries in the academic press which intimate that the English language levels of many overseas students studying in Australia are not ...
-
Hasegawa, Hiroshi; Chen, Julian; Collopy, Teagan (2020) This chapter explores the effectiveness of computerised oral testing on Japanese learners' test experiences and associated affective factors in a Japanese program at the Australian tertiary level. The study investigates ...
-
Morris, Judith (2006) The growing diversity of school populations around the world means that for many students the language of instruction in mainstream classrooms is not their first language. Content-based second language learning in a context ...