Automated Essay Scoring
Revision as of 02:25, 24 January 2022
Bridgeman, Trapani, and Attali (2009) [pdf]
- The E-Rater system automatically grades a student's essay.
- Essays written by Hispanic and Asian-American students were over-graded relative to those by White and African American peers.
- Inaccurately gave Chinese and Korean students significantly higher scores than human essay raters on a test of foreign-language proficiency.
- Correlated more poorly with human ratings and was biased upward on GRE essay scores for Chinese students.
Bridgeman, Trapani, and Attali (2012) [https://www.tandfonline.com/doi/pdf/10.1080/08957347.2012.635502?needAccess=true pdf]
- A later version of the E-Rater system for automatic grading of GRE essays.
- The model gave lower scores to African American students than human raters did.
- Gave Chinese students higher scores than human essay raters did.
- Speakers of Arabic and Hindi were given lower scores.
Ramineni & Williamson (2018) [pdf]
- A later version of the E-Rater system for automatic grading of GRE essays.
- For some types of essays, E-Rater gave African American students substantially lower scores than human raters did.
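The findings above all rest on comparing automated scores against human-rater scores by student group. As a minimal sketch (not the studies' actual methodology, and with invented toy data), the per-group machine-minus-human score gap and the machine-human correlation can be computed like this:

```python
# Hypothetical sketch: quantify scoring bias as the per-group mean
# difference (machine score minus human score) and measure agreement
# as the Pearson correlation. All names and data are illustrative only.
from statistics import mean

def score_gap_by_group(records):
    """records: list of (group, machine_score, human_score) tuples.
    Returns {group: mean machine-minus-human score difference}."""
    gaps = {}
    for group, machine, human in records:
        gaps.setdefault(group, []).append(machine - human)
    return {g: mean(diffs) for g, diffs in gaps.items()}

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented toy data: (group, machine score, human score).
data = [
    ("A", 4.5, 4.0), ("A", 5.0, 4.5),   # machine scores run higher
    ("B", 3.0, 3.5), ("B", 3.5, 4.0),   # machine scores run lower
]
print(score_gap_by_group(data))  # positive gap = machine over-grades
```

A positive gap for one group alongside a negative gap for another is the pattern the studies describe: the automated system over-grades some groups and under-grades others relative to human raters.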