Socioeconomic Status
Yudelson et al. (2014) pdf
- Models discovering generalizable sub-populations of students across different schools to predict students' learning with Carnegie Learning’s Cognitive Tutor (CLCT)
- Models trained on schools with a high proportion of low-SES students performed worse than those trained on schools with a medium or low proportion
- Models trained on schools with a low or medium proportion of low-SES students performed similarly well for schools with a high proportion of low-SES students (the cross-population evaluation is sketched below)
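A minimal sketch, in Python on synthetic data, of the kind of cross-population check this entry describes: train a performance predictor on one pool of schools and evaluate it on schools with a high proportion of low-SES students. The features, outcome, and pool labels are illustrative assumptions, not the authors' Cognitive Tutor models or data.

 import numpy as np
 from sklearn.linear_model import LogisticRegression
 from sklearn.metrics import roc_auc_score

 rng = np.random.default_rng(0)

 def make_school_pool(n, shift):
     """Synthetic (features, correct/incorrect) data standing in for one pool of schools."""
     X = rng.normal(shift, 1.0, size=(n, 4))   # e.g. prior practice, hint use (placeholders)
     y = (X @ np.array([0.8, -0.5, 0.3, 0.1]) + rng.normal(0, 1, n) > 0).astype(int)
     return X, y

 # Three training pools, labelled by their (hypothetical) proportion of low-SES students.
 pools = {"low": make_school_pool(2000, 0.3),
          "medium": make_school_pool(2000, 0.0),
          "high": make_school_pool(2000, -0.3)}

 # Held-out evaluation set standing in for schools with a high proportion of low-SES students.
 X_test, y_test = make_school_pool(1000, -0.3)

 for name, (X_train, y_train) in pools.items():
     model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
     auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
     print(f"trained on {name}-proportion pool -> AUC on high-proportion schools: {auc:.3f}")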
Yu et al. (2020) pdf
- Models predicting undergraduate course grades and average GPA
- Students from low-income households were inaccurately predicted to perform worse on both the short-term outcome (final course grade) and the long-term outcome (GPA)
- Model fairness improved when only clickstream and survey data were included as features (a groupwise error check is sketched below)
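A minimal sketch, on synthetic placeholder data, of how such a group-level bias check can be run: fit a GPA predictor, compute the mean signed error per income group, and repeat with a restricted feature set. The feature names and data below are assumptions, not Yu et al.'s dataset or models.

 import numpy as np
 import pandas as pd
 from sklearn.linear_model import LinearRegression

 rng = np.random.default_rng(1)
 n = 4000
 low_income = rng.integers(0, 2, n)                   # 1 = low-income household (placeholder)
 clickstream = rng.normal(0, 1, n)                    # LMS activity feature (placeholder)
 survey = rng.normal(0, 1, n)                         # self-reported survey feature (placeholder)
 institutional = rng.normal(-0.5 * low_income, 1, n)  # institutional record, correlated with income
 gpa = 3.0 + 0.3 * clickstream + 0.2 * survey + rng.normal(0, 0.3, n)

 def signed_error_by_group(features):
     """Mean (predicted - actual) GPA per income group; negative values mean under-prediction."""
     pred = LinearRegression().fit(features, gpa).predict(features)
     return pd.Series(pred - gpa).groupby(low_income).mean()

 print("all features:\n", signed_error_by_group(np.column_stack([clickstream, survey, institutional])))
 print("clickstream + survey only:\n", signed_error_by_group(np.column_stack([clickstream, survey])))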
Yu et al. (2021) pdf
- Models predicting college dropout for students in residential and fully online programs
- Whether or not socio-demographic information was included, the model showed worse accuracy and true negative rates for residential students with greater financial needs
- The model showed better recall for students with greater financial needs, especially for those studying in person (a per-group metric comparison is sketched below)
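A minimal sketch of the per-group metric comparison behind findings like these: split students by a financial-need indicator and compare accuracy, true negative rate, and recall of a dropout classifier across the two groups. The labels and predictions are random placeholders, not Yu et al.'s model output.

 import numpy as np
 from sklearn.metrics import confusion_matrix

 rng = np.random.default_rng(2)
 n = 3000
 high_need = rng.integers(0, 2, n)            # 1 = greater financial need (placeholder)
 y_true = rng.integers(0, 2, n)               # 1 = student dropped out (placeholder)
 y_pred = rng.integers(0, 2, n)               # classifier output (placeholder)

 for g in (0, 1):
     mask = high_need == g
     tn, fp, fn, tp = confusion_matrix(y_true[mask], y_pred[mask]).ravel()
     accuracy = (tp + tn) / (tp + tn + fp + fn)
     tnr = tn / (tn + fp)                      # true negative rate
     recall = tp / (tp + fn)                   # true positive rate
     print(f"high_need={g}: accuracy={accuracy:.3f}, TNR={tnr:.3f}, recall={recall:.3f}")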
Kung & Yu (2020) pdf
- Predicting course grades and later GPA at a public U.S. university
- Equal performance for low-income and upper-income students in course grade prediction for several algorithms and metrics
- Worse performance on independence for low-income students than high-income students in later GPA prediction for four of five algorithms; one algorithm had worse separation and two had worse sufficiency (the three criteria are illustrated below)
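A minimal sketch of the three group-fairness criteria named in this entry, checked on a binary predictor with placeholder data: independence compares positive-prediction rates across groups, separation compares prediction rates conditional on the true outcome, and sufficiency compares outcome rates conditional on the prediction.

 import numpy as np

 rng = np.random.default_rng(3)
 n = 5000
 A = rng.integers(0, 2, n)        # group: 1 = low-income (placeholder)
 Y = rng.integers(0, 2, n)        # true outcome, e.g. above-median later GPA (placeholder)
 Yhat = rng.integers(0, 2, n)     # model prediction (placeholder)

 def by_group(values, condition):
     """Mean of `values` within each group, restricted to rows where `condition` holds."""
     return [round(values[(A == g) & condition].mean(), 3) for g in (0, 1)]

 everyone = np.ones(n, dtype=bool)

 # Independence: P(Yhat=1 | A) should match across groups.
 print("independence P(Yhat=1 | A):", by_group(Yhat, everyone))

 # Separation: P(Yhat=1 | A, Y) should match across groups for each true label.
 for y in (0, 1):
     print(f"separation   P(Yhat=1 | A, Y={y}):", by_group(Yhat, Y == y))

 # Sufficiency: P(Y=1 | A, Yhat) should match across groups for each predicted label.
 for yhat in (0, 1):
     print(f"sufficiency  P(Y=1 | A, Yhat={yhat}):", by_group(Y, Yhat == yhat))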
Litman et al. (2021) html (https://link.springer.com/chapter/10.1007/978-3-030-78292-4_21)
- Automated essay scoring models inferring text evidence usage
- All algorithms studied had less than 1% of their error explained by whether a student receives free/reduced-price lunch (one way to compute such a share is sketched below)
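A minimal sketch of one way to compute such a share, on synthetic numbers: regress the scorer's error on the free/reduced-price lunch indicator and report the resulting R^2. Litman et al. may operationalize "error explained" differently; this is only an illustrative reading.

 import numpy as np
 from sklearn.linear_model import LinearRegression

 rng = np.random.default_rng(4)
 n = 2000
 frl = rng.integers(0, 2, n)                         # 1 = receives free/reduced-price lunch
 human_score = rng.integers(0, 5, n)                 # rubric score from a human rater (placeholder)
 model_score = human_score + rng.normal(0, 0.5, n)   # placeholder automated essay score
 error = model_score - human_score

 X = frl.reshape(-1, 1)
 r2 = LinearRegression().fit(X, error).score(X, error)
 print(f"share of scoring-error variance explained by FRL status: {r2:.4f}")  # ~0 for this toy data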