Sunday, April 27, 2008

Assessing Written Language Proficiency of ESL Students

Emily Watkins

In recent years, as the validity and integrity of standardized testing have been called into question, more research has investigated the validity of existing forms of second language proficiency assessment. Effectively assessing the written proficiency of ESL students has proven particularly difficult, because language is not only the object of assessment but also the medium through which the assessment must be completed. Teachers of English as a Second Language and researchers will take particular interest in the literature reviewed, as it analyzes whether the written proficiency of ESL students is best assessed through pencil and paper tests, computerized tests, or a student’s written work. The literature reviewed also addresses how best to evaluate the written work of ESL students.

Existing Forms of Assessment
Traditionally, written language proficiency has been assessed through pencil and paper tests, computerized tests, and standardized tests. The validity of such forms of assessment has been challenged (Genesee, McNamara, Upshur). These assessments often attempt to measure the writing process and various components of “writing ability,” such as vocabulary and syntax, and have proven inadequate (McNamara). Simply grading the written work of students has also proven an inadequate means of assessing language proficiency, as writers can avoid using vocabulary and grammatical structures they are unsure of (Perkins).

The popularity of standardized tests as a means of assessment has grown in recent years, both because of federal legislation such as No Child Left Behind and because they can be scored more fairly and objectively. Standardized tests may typically be more reliable than teacher-made assessments, but they often carry cultural, racial, and gender bias, a serious disadvantage when assessing ESL proficiency.

The most recent suggestion for remedying the invalidity and bias of existing forms of assessment is the use of computerized assessments such as DIALANG. DIALANG, a computer program consisting of diagnostic foreign language tests, provides a detailed analysis of strengths and weaknesses and offers immediate feedback (Alderson). Despite this support and its stronger statistical track record, many schools may not have the resources to fund this kind of computer technology for students.

Alternate Forms of Assessment
Because of the shortcomings of existing forms of assessment, researchers and teachers have sought other means of assessment in hopes of providing a “fairer” evaluation. No single test can produce entirely reliable and valid results, nor is any one type of assessment transferable to all situations (Genesee, Perkins, Upshur). Rather than relying on an actual “test,” it has been suggested that evaluating the written work of students may provide a more accurate assessment of written language proficiency.

When assessing the written work of students, performance-based assessment is the most popular alternative to traditional testing (Cruickshank, Bainer Jenkins, Metcalf). Portfolios, or collections of student work, are beneficial because they provide evidence of student progress and of the developing knowledge and craft of writers (Bainer Jenkins, Cruickshank, Metcalf, Hurley, Villamil Tinajero). Because portfolios can include a variety of works, such as papers, journals, projects, and homework, they can be used to make judgments about a range of skills.

Journals have also proven helpful as a means of evaluation, serving as interactive conversations that require students to take ownership of their learning. Like the written work in a portfolio, journals monitor proficiency and can be used to keep a record of student progress. Journaling also provides insight that tests cannot, such as insight into a student’s linguistic, cultural, and educational background and experiences, as well as attitudes and goals (Genesee, Hurley, Upshur, Villamil Tinajero).

Evaluating Written Work
When evaluating the written work of ESL students, careful consideration should go into the grading process. To chart progress rather than evaluate a single written product, it may be helpful for students to complete rough drafts and turn them in with the final copy (Hurley, Villamil Tinajero). This way, teachers can monitor the corrections and changes students make and base their evaluation on more than one piece of work.

Essays and other written work produced by ESL students are often evaluated holistically. Holistic assessment, or simply assigning a single number or letter grade to student writing, is often insufficient for assessing the many components of student writing (Hamp-Lyons, Hurley, Perkins, Villamil Tinajero). Holistic assessment makes diagnostic feedback impossible and thus provides no help to students hoping to understand the grade they were assigned or wondering how to improve their writing (Hurley, Villamil Tinajero).

Rather than holistic assessment, the use of multiple trait assessment, or rubrics, is much more appropriate when evaluating the written work of ESL students. Multiple trait assessment takes into account the differences of ESL learners, whereas holistic scoring and essay tests do not. Multiple trait assessments also help teachers assign grades more consistently and give students feedback on particular aspects of their writing (Cruickshank, Bainer Jenkins, Metcalf, Hurley, Villamil Tinajero).

Conclusion
Research on the validity of existing forms of assessment is lacking. What research exists shows that traditional methods of evaluating written language proficiency are insufficient and that no type of assessment, whether a pencil and paper test, a computerized test, or a standardized test, is completely reliable. Although computerized diagnostic tests such as DIALANG may assess language proficiency more validly, this method may not be feasible for all school districts. Alternate assessments, such as performance-based assessments, can chart student progress and offer insight into student writing that other means of assessment cannot, but research on the validity of this means of evaluation is also lacking.

To close this gap, future research can be devoted to investigating valid, reliable, and unbiased means of assessment that can be easily implemented in schools. Although some research has shown the benefits of multiple trait assessment over holistic assessment, further work is needed on how these measurements can be modified to suit the needs of ESL writers.

For works cited, see the annotated bibliography.
