The Construction and Validation of a Test in English for Tertiary Education with Reference to Addis Ababa University

Date

1995-06

Publisher

Addis Ababa University

Abstract

This study explores the validity of a battery of tests of English for academic purposes. A goal-oriented, skills- and task-based test of English is developed and validated as a measure of the English language disposition of freshman students at Addis Ababa University. The test is based on an analysis of the communicative language needs of the students (Morris, 1982; Haile Michael, 1993): the receptive skills of reading and listening rank higher than the productive skills of writing and speaking. The reliability of the test is quite satisfactory: the coefficients are 0.94 (KR20) for the written test, 0.88 (KR20) for the listening test and 0.73 (KR21) for the oral test. The level of difficulty and discrimination of the test is reasonable. The mean facility value of the written test is 0.640 with a mean discrimination index of 0.314, and the mean facility value of the listening test is 0.612 with a mean discrimination index of 0.526. All the sections of the tests contribute properly to the total tests: the average item-test correlations of the sections range from 0.242 to 0.515 for both tests. The validity of the test is also satisfactory. Evidence from the comments of students and language teachers suggests that the tests have good face and content validity. There is evidence in support of the construct and concurrent validity of the tests. Both the intercorrelations of sections and the correlations between tests show a degree of common variance as well as some unshared variance, i.e., with overlaps not exceeding 44% for the former and 62% for the latter. In other words, they are all testing English, but different aspects or skills are also being tapped. The greatest agreement (r = 0.81) is observed between the new written test and the criterion (written) test.
In addition, there is evidence of a clear relationship between the test scores and the University Semester Grade Point Average (SGPA): both the written test and the listening test correlate with SGPA at r = 0.6. Finally, the study concludes with some observations on the testing of English at this University based on these findings.
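The KR-20 reliability coefficient and the facility values reported above follow standard formulas for dichotomously scored items. As a minimal illustrative sketch (the function names and the item-response data below are hypothetical, not taken from the study), they can be computed as:

```python
# Sketch of the KR-20 reliability coefficient and item facility value
# for dichotomous items (1 = correct, 0 = incorrect).
# The response matrix below is illustrative only, not data from the study.

def kr20(responses):
    """Kuder-Richardson Formula 20.

    responses: list of per-examinee lists of 0/1 item scores.
    KR20 = (k / (k - 1)) * (1 - sum(p*q) / variance of total scores),
    where p is the proportion correct per item and q = 1 - p.
    """
    k = len(responses[0])                 # number of items
    n = len(responses)                    # number of examinees
    totals = [sum(person) for person in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n  # population variance
    pq_sum = 0.0
    for i in range(k):
        p = sum(person[i] for person in responses) / n
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)

def facility(responses, item):
    """Facility value: proportion of examinees answering the item correctly."""
    return sum(person[item] for person in responses) / len(responses)

# Illustrative data: 5 examinees x 4 items.
data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]

print(round(kr20(data), 3))    # reliability estimate for this toy matrix
print(facility(data, 0))       # item 1 answered correctly by 4 of 5 examinees
```

The abstract's coefficients of 0.94 and 0.88 would come from the same computation applied to the actual item-response matrices of the written and listening tests.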

Keywords

test in English for Tertiary Education
