


Table 5.11 Top English activities by growers and sustainers that helped improve their English

1. Reading in English (fiction, non-fiction, magazines)
2. Using self-access centre (Speaking and/or Writing Assistance Programme)ᵃ
3. Listening to lecturesᵃ
4. Watching TV shows or programmes
5. Text messaging
6. Talking to exchange students (inside or outside the classroom)
7. Academic reading (journal articles, textbooks)ᵃ
8. Using dictionary to look for unknown words
9. Listening to and/or watching TED talks
10. Doing grammar exercises
11. Listening to music
12. Test preparation
13. Watching YouTube clips
14. Watching movies
15. Doing online activities
16. Attending formal LCE classesᵃ
17. Memorising vocabulary
18. Joining clubs and societies
19. Reading and writing emails
20. Exposure to English environment

ᵃ Study-related activities



Sustainers were similar to growers in that they did not find the independent learning links provided in the DELTA report useful for their learning. However, they required further guidance from teachers to improve their English. They felt that the DELTA report was useful and accurately reflected their strengths and weaknesses, but they attributed their lack of development to not having support from teachers to show them what the next step in their language learning should be. This confirms the survey finding that lack of confidence in their ability to learn English was a hindrance to further development, and it supports Alderson et al.'s (2015) second principle for diagnostic assessment, that teacher involvement is key.

The participants were also asked to describe the top activities that they thought helped them improve various aspects of their English. Table 5.11 lists the top 20 activities that growers and sustainers specifically thought were useful in their English language growth.

Surprisingly, only three of the top ten activities are study-related (listening to lectures, using the self-access centre and academic reading); the rest are all non-study-related activities. Reading in English was the most popular activity, followed by the use of services offered by the self-access centre, and then listening to lectures and watching TV shows or programmes in English. These results suggest that if students want to improve their English, they have to find activities that suit their learning styles, and that this in turn will motivate them to learn. As Tony said,

So I think it’s very important when you think about your proficiency - if you’re a highly motivated person then you will really work hard and find resources to improve your English. But if you’re like my roommate, you don’t really work hard in improving English, then his English proficiency skills will be just like a secondary school student. Seriously. (Tony, grower)



Clearly then, as concluded by Alderson (2005), it is the (informed) intervention in the learning process that is the most essential contribution that diagnostic testing can make. The developers of DELTA have worked hard to provide the support that students need during their development, including links to learning websites and online resources, teacher mentoring programmes and extracurricular activities, to help motivate students to continue to engage in the language learning process.



5 Discussion



The first of our research questions asked whether the diagnostic testing instrument used, DELTA, can reliably measure differences in students’ English language proficiency over a 1-year period. Overall, the results of the psychometric analysis provided fairly strong support for the quality of the four component tests (listening, reading, grammar and vocabulary). In addition, the bootstrapped paired-sample t-test results indicated that there was a statistically significant difference between students’ performance across time. In other words, DELTA can be used to measure differences in English language proficiency over a 1-year period.
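To make the statistical step concrete, here is a minimal sketch of a percentile-bootstrap test of the mean paired difference between two administrations, in the spirit of the bootstrapped paired-sample t-test reported above (see DiCiccio and Efron 1996 for background on bootstrap intervals). The sample size, score scale and data below are invented for illustration; they are not the actual DELTA results.

```python
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_paired_ci(time1, time2, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap CI for the mean paired difference (time2 - time1)."""
    diffs = np.asarray(time2, dtype=float) - np.asarray(time1, dtype=float)
    n = diffs.size
    boot_means = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(diffs, size=n, replace=True)  # resample paired differences
        boot_means[b] = sample.mean()
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return diffs.mean(), (lo, hi)

# Illustrative (invented) measures for the same students at Time 1 and Time 2
time1 = rng.normal(50, 10, size=200)
time2 = time1 + rng.normal(1.5, 8, size=200)   # modest average gain

mean_diff, (lo, hi) = bootstrap_paired_ci(time1, time2)
print(f"Mean gain: {mean_diff:.2f}, 95% bootstrap CI: [{lo:.2f}, {hi:.2f}]")
# If the interval excludes zero, the change across the year is treated as
# statistically significant in this illustrative setup.
```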

Secondly, there was a difference shown in some students’ proficiency, i.e. their DELTA Measures, between their first and second attempts at the DELTA. Inevitably, some students improved or grew while others showed regression or decline. In most cases, though, there was no difference measured. The results seemed to indicate an overall increase in the proficiency of the group, in that the number of growers was greater than the number of decliners, which would no doubt please university administrators and programme planners. More specifically, there were more growers than decliners in listening and vocabulary, while reading and grammar saw no discernible change. Such information is again useful for programme planners and teachers in that they can see which aspects of their English language provision need more attention.
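As a rough illustration of how individual students might be sorted into growers, decliners and sustainers, the sketch below treats a change in DELTA Measures as real only when it exceeds 1.96 times the combined standard error of the two measurements. Both the criterion and the toy numbers are assumptions for illustration, not the exact decision rule used in the study.

```python
import numpy as np

def classify_change(measure_t1, measure_t2, se_t1, se_t2, z=1.96):
    """Label each student's change as 'grower', 'decliner', or 'sustainer'.

    A change counts as real only when it exceeds z times the combined
    standard error of the two measures (an assumed criterion, not
    necessarily the one used for DELTA).
    """
    diff = np.asarray(measure_t2) - np.asarray(measure_t1)
    combined_se = np.sqrt(np.asarray(se_t1) ** 2 + np.asarray(se_t2) ** 2)
    labels = np.where(diff > z * combined_se, "grower",
             np.where(diff < -z * combined_se, "decliner", "sustainer"))
    return labels

# Toy example: three students' Rasch-style measures and standard errors
t1 = [48.0, 55.0, 60.0]
t2 = [56.0, 54.0, 52.0]
se1 = [2.0, 2.0, 2.0]
se2 = [2.0, 2.0, 2.0]
print(classify_change(t1, t2, se1, se2))  # ['grower' 'sustainer' 'decliner']
```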

In seeking what might account for this difference in DELTA Measures, we have looked at students’ reported English-use activities. Time spent in lectures, seminars and tutorials requiring them to listen in English seems to have impacted their proficiency in this skill, while their self-reported attention to academic reading seems to have improved their academic vocabulary to a greater extent than their reading skills. The indications are that students who do show growth are those who adopt their own strategies for improvement to supplement the use of the language they make in their studies.

The qualitative results suggest that DELTA has impact, as students report that it is valuable as a tool to inform them of their strengths and weaknesses in their English proficiency. For those required to create independent learning plans, DELTA reports are the first source of information students rely on. The real value of DELTA, however, is the tracking function it provides. Interviews with growers and sustainers suggest that those students who want to improve their proficiency do more than the average student; these students are fully aware of their learning styles, seek out their own learning resources and make the most of them. Thus, DELTA’s tracking function serves to validate the perception that their efforts have not been in vain. This suggests that DELTA should perhaps be part of a more organised programme which helps students identify learning resources that suit their learning styles and needs and involves the intervention of advisors or mentors. An example of this is the Excel@English Scheme (EES) at Hong Kong Polytechnic University mentioned previously. This scheme integrates DELTA with existing language learning activities as well as custom-made learning resources and teacher mentoring. It allows for student autonomy while providing the support that is clearly needed.



6 Conclusion



This chapter has described how a diagnostic assessment can be used to inform and encourage ESL students’ development in English language proficiency as support for them as they progress through English-medium university studies. The assessment in question, the Diagnostic English Language Tracking Assessment (DELTA), has been shown to provide reliable measures of student growth in proficiency, while the diagnostic reports have proved to be a useful starting point for students in their pursuit of language development. What has become clear, though, is that the diagnostic report alone, even with its integrated language learning links, is not enough: students need the support of teachers to help them understand the results of the diagnostic assessment and to point them to the resources and materials that are most appropriate for them, given their needs and learning styles. Clearly, a bigger picture needs to be drawn to learn more about how a diagnostic assessment like DELTA can impact language development, and this will be possible as more students take the assessment for a second, third or even fourth time. Language proficiency development is a process, and it is to be hoped that for university students it is one that is sustained throughout their time at university.






7 Appendix






References

Alderson, J. C. (2005). Diagnosing foreign language proficiency: The interface between learning and assessment. London: Continuum.

Alderson, J. C., Brunfaut, T., & Harding, L. (2015). Towards a theory of diagnosis in second and foreign language assessment: Insights from professional practice across diverse fields. Applied Linguistics, 36(2), 236–260.

Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford: Oxford University Press.

Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford: Oxford University Press.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–71.

Buck, G. (2001). Assessing listening. New York: Cambridge University Press.

DiCiccio, T. J., & Efron, B. (1996). Bootstrap confidence intervals. Statistical Science, 11, 189–228.

Engelhard, G. (2012). Invariant measurement: Using Rasch models in the social, behavioral, and health sciences. New York: Routledge.

Evans, S., & Green, C. (2007). Why EAP is necessary: A survey of Hong Kong tertiary students. Journal of English for Academic Purposes, 6(1), 3–17.

Evans, S., & Morrison, B. (2011). Meeting the challenges of English-medium higher education: The first-year experience in Hong Kong. English for Specific Purposes, 30(3), 198–208.

Field, A. (2005). Discovering statistics using SPSS (2nd ed.). London: Sage.

Lee, I., & Coniam, D. (2013). Introducing assessment for learning for EFL writing in an assessment of learning examination-driven system in Hong Kong. Journal of Second Language Writing, 22(1), 34–50.

Linacre, J. M. (2014). Winsteps: Rasch model computer program (version 3.81). Chicago: www.winsteps.com

Miles, M. B., & Huberman, M. A. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks: Sage.

Miller, L., & Gardner, D. (2014). Managing self-access language learning. Hong Kong: City University Press.

OECD. (2014). PISA 2012 results in focus: What 15-year-olds know and what they can do with what they know. The Organisation for Economic Co-operation and Development (OECD). Retrieved from http://www.oecd.org/pisa/keyfindings/pisa-2012-results-overview.pdf

Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand Oaks: Sage.

Sadler, R. (1998). Formative assessment: Revisiting the territory. Assessment in Education, 5(1), 77–84.

Taras, M. (2005). Assessment – summative and formative – some theoretical considerations. British Journal of Educational Studies, 53, 466–478.

Urmston, A., Raquel, M., & Tsang, C. (2013). Diagnostic testing of Hong Kong tertiary students’ English language proficiency: The development and validation of DELTA. Hong Kong Journal of Applied Linguistics, 14(2), 60–82.

Yorke, M. (2003). Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education, 45(4), 477–501.



Part III
Addressing the Needs of Doctoral Students



Chapter 6
What Do Test-Takers Say? Test-Taker Feedback as Input for Quality Management of a Local Oral English Proficiency Test

Xun Yan, Suthathip Ploy Thirakunkovit, Nancy L. Kauper, and April Ginther



Abstract The Oral English Proficiency Test (OEPT) is a computer-administered, semi-direct test of oral English proficiency used to screen prospective international teaching assistants (ITAs) at Purdue University. This paper reports on information gathered from the post-test questionnaire (PTQ), which is completed by all examinees who take the OEPT. PTQ data are used to monitor access to the OEPT orientation video and practice test, to evaluate examinee perceptions of OEPT characteristics and administration, and to identify any problems examinees may encounter during test administration. Responses to the PTQ are examined after each test administration (1) to ensure that no undue or unexpected difficulties are encountered by examinees and (2) to provide a basis for modifications to our administrative procedures when necessary. In this study, we analyzed 1440 responses to both closed-ended and open-ended questions of the PTQ from 1342 test-takers who took the OEPT between August 2009 and July 2012. Responses to these open-ended questions on the OEPT PTQ provided an opportunity to examine an unexpectedly wide variety of response categories. The analysis of the 3-year data set of open-ended items allowed us to better identify and evaluate the effectiveness of changes we had introduced to the test administration process during that same period of time. Carefully considering these responses has contributed substantially to our quality control processes.



X. Yan (*)
Department of Linguistics, University of Illinois at Urbana-Champaign, Urbana-Champaign, IL, USA
e-mail: xunyan@illinois.edu

S.P. Thirakunkovit
English Department, Mahidol University, Bangkok, Thailand
e-mail: suthathip.thi@mahidol.ac.th

N.L. Kauper
Oral English Proficiency Program, Purdue University, West Lafayette, IN, USA
e-mail: nkauper@purdue.edu

A. Ginther
Department of English, Purdue University, West Lafayette, IN, USA
e-mail: aginther@purdue.edu

© Springer International Publishing Switzerland 2016
J. Read (ed.), Post-admission Language Assessment of University Students, English Language Education 6, DOI 10.1007/978-3-319-39192-2_6




Keywords Test-taker feedback • Quality management • Speaking assessment • International teaching assistants • Semi-direct tests • Ethics and responsibility • Test administration



1 Introduction



Purdue University is a large, research-intensive US university located in Indiana, specializing in science and engineering, with a large and growing number of international students. Purdue’s Oral English Proficiency Program (OEPP) was established in 1987 in response to the perceived crisis in higher education associated with the presence of, and dependence on, a growing number of international graduate students to teach undergraduate introductory courses. One reaction to the “foreign TA problem” (Bailey 1984) was to establish English language screening and training programs for prospective international teaching assistants (ITAs). At the time, many state governments were mandating screening and training programs (Oppenheim 1997), parents and students were bringing lawsuits against universities, and Purdue’s program was established to protect the university from both. Today, ITA programs are well established, and ITA screening has become one of the most widely practiced forms of post-entry testing at large research universities in the United States.

From 1987 to 2001, to screen prospective ITAs the OEPP used the Speaking Proficiency English Assessment Kit (SPEAK), which is an institutional version of the Test of Spoken English developed by Educational Testing Service (ETS) (1985). In 2001, the program replaced the SPEAK with a locally developed test, the Oral English Proficiency Test (OEPT), which is semi-direct (using pre-recorded questions and no interlocutor) and computer-administered. The introduction of the OEPT resulted in a number of improvements in terms of construct representation and test administration, not the least of which was a reduction of two-thirds in the time required for test-taking and rating. Although computer-administered, semi-direct oral English testing is now widespread (notably in the internet-based TOEFL (iBT)), when we introduced the OEPT in 2001, test-taker preparation for and comfort with this testing format was less than assured.

In a situation in which an institution requires its students to take a test, every effort must be made to ensure that prospective examinees understand the motivation for the testing and have access to appropriate test preparation materials. The OEPP is primarily responsible for the provision of both test justification and preparation materials, but we share this responsibility with Purdue’s Graduate School and with each of the more than 70 graduate departments and programs that require oral English screening. The post-test questionnaire (PTQ) was introduced with the OEPT in 2001 in order to (1) track student access to and use of test preparation materials and (2) understand and monitor examinee perception of general OEPT characteristics. Section III of the PTQ, consisting of two open-ended questions, was added in 2009 in order to identify any problems that may have been missed in test-taker responses to the fixed-response items in Sections I and II. Monitoring examinee feedback through the PTQ has become a central component of our quality management process.



2 Literature Review

2.1 Test-Taker Feedback About Semi-direct Testing



Of particular interest in our context are studies examining test-taker feedback about semi-direct testing formats for oral proficiency testing. Given that the Speaking subsection of the TOEFL iBT is semi-direct and is taken by the majority of international applicants for North American universities to demonstrate required language proficiency, semi-direct oral proficiency testing can now be assumed largely familiar to prospective international examinees; however, familiarity does not ensure comfort with, or acceptance of, the procedures associated with the semi-direct format.

The benefits of semi-direct testing are largely associated with cost effectiveness and efficiency, in that interviewers are not required and recorded performances can be captured, stored, and rated remotely after the real-time administration of the test. However, cost efficiency alone cannot justify the use of semi-direct formats, and researchers have considered the comparability of semi-direct and direct formats to determine whether examinees are ranked similarly across formats. In a comparison of the ACTFL Oral Proficiency Interview (OPI) to its semi-direct counterpart (the ACTFL SOPI), Stansfield and Kenyon (1992) reported a high degree of concurrent validity based on strong positive correlations (0.89–0.92) across direct and semi-direct formats. Shohamy (1994) also found strong positive correlations across a Hebrew OPI and SOPI but cautioned against assuming total fidelity of the formats, as language samples produced in response to the direct OPI tended to be more informal and conversational in nature, while those produced in response to the SOPI displayed more formality and greater cohesion.
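A minimal sketch of the kind of concurrent-validity check described above simply correlates the scores the same examinees receive under the two formats. The scores below are hypothetical; the 0.89–0.92 values reported by Stansfield and Kenyon come from their own data, not from this toy example.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical ratings for the same examinees under a direct (OPI-style)
# and a semi-direct (SOPI-style) format
direct = np.array([2.0, 2.5, 3.0, 3.0, 3.5, 4.0, 4.0, 4.5, 5.0, 5.0])
semi_direct = np.array([2.0, 2.5, 2.5, 3.0, 3.5, 3.5, 4.0, 4.5, 4.5, 5.0])

r, _ = pearsonr(direct, semi_direct)
rho, _ = spearmanr(direct, semi_direct)
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")
# Strong positive values suggest examinees are ranked similarly across formats,
# the kind of concurrent-validity evidence reported for the OPI/SOPI comparison.
```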

The absence of an interviewer can be seen as either a negative or a positive attribute of the semi-direct format. The most obvious drawback in the use of semi-direct formats lies in the omission of responses to questions and in the lack of opportunity for responses to be extended through the use of interviewer-provided probes; that is, the apparent limitations to the validity of the format are due to the absence of interactivity. On the other hand, standardization of the test administration removes the variability associated with the skill and style of individual interviewers, resulting in an increase in reliability and fairness, in addition to cost effectiveness and efficiency.


