The Repository @ St. Cloud State

Open Access Knowledge and Scholarship

Date of Award

12-2018

Culminating Project Type

Thesis

Degree Name

English: Teaching English as a Second Language: M.A.

Department

English

College

College of Liberal Arts

First Advisor

Choonkyong Kim

Second Advisor

Edward Sadrai

Third Advisor

Tim Fountaine

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Keywords and Subject Headings

pedagogical grammar, error analysis

Abstract

This study consists of a computer-aided error analysis of grammar errors in 70 university placement essays, whose scores placed students in EAP (English for Academic Purposes) Level 1 or Level 2 or exempted them from the EAP program. Essays were scored prior to the study using the department process, in which each essay was rated by at least two raters with an analytic rubric. An error taxonomy of 16 categories based on Lane and Lange (1999) was used to code the essay data. The data were assembled into a corpus and tagged using the text analysis program UAM (Universidad Autónoma de Madrid) CorpusTool; results were exported and analyzed with statistical tests. The results of the study validate the EAP placement process: scores in the language use section of the rubric were highly correlated with total scores, and inter-rater reliability was also established. Error rates were likewise found to correlate with language use scores, suggesting that raters were responding to grammatical errors in making their assessments. Comparisons among the three placement groups revealed significant differences in error rates between Level 2 and Exempt. Based on the correlations, between-group comparisons, and overall frequency of errors, six error categories were chosen for closer analysis: sentence structure, articles, prepositions, singular/plural, subordinate clauses, and other. The findings suggest that local errors, though often given low priority in textbooks, do significantly affect rater assessment. Results also suggest that error rates do not necessarily decrease with advancing level; some error rates may increase. Though this finding was surprising, it may be attributed in part to the fact that some errors are evidence of interlanguage development as new forms are acquired. The study concludes with suggestions for teaching and future research.
