Research Article

Detecting Rater Bias in Mixed-Format Assessments


ABSTRACT

Mixed-format assessments, which combine multiple-choice (MC) items with constructed-response (CR) items scored using rater judgments, pose unique psychometric considerations. When these item types are combined to estimate examinee achievement, information about the psychometric quality of each component can depend on that of the other. For example, the presence of differential item functioning (DIF) in MC items could compromise the sensitivity of rater effect indicators, such as differential rater functioning (DRF) indices. Likewise, the presence of DRF could compromise the sensitivity of DIF indices. We used real data and a simulation study to examine the impact of DIF on DRF indices and the impact of DRF on DIF indices. Results indicate some interaction between DIF and DRF that varies across data collection designs. We consider the implications of these results for research and practice.
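To make the notion of a DIF index concrete, the sketch below computes one widely used indicator for a single dichotomous MC item: the Mantel-Haenszel common odds ratio across ability strata, converted to the ETS delta scale. The counts are invented for illustration and do not come from the study's data; the article's own analyses may use different DIF indices.

```python
import math

# Hypothetical example data: examinees stratified by total score.
# Each stratum records (A, B, C, D) =
#   (reference correct, reference incorrect, focal correct, focal incorrect)
# for one multiple-choice item.
strata = [
    (40, 10, 30, 20),
    (60, 15, 45, 25),
    (80, 10, 70, 15),
]

# Mantel-Haenszel common odds ratio: sum(A*D/N) / sum(B*C/N),
# where N is the stratum total. A value of 1.0 indicates no DIF.
num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
alpha_mh = num / den

# ETS delta transformation; |delta| >= 1.5 is conventionally flagged as large DIF.
delta_mh = -2.35 * math.log(alpha_mh)

print(round(alpha_mh, 2), round(delta_mh, 2))
```

Here the odds ratio exceeds 1 in every stratum, so the item appears easier for the reference group even after conditioning on total score, and the delta value falls in the conventional "large DIF" range. DRF indices extend the same conditioning logic from items to raters.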

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1. For data security purposes, we rounded the sample sizes to the nearest 10.

Additional information

Funding

This work was supported by the Spencer Foundation [202000164].
