Abstract
This research investigated whether features of examination questions influence students with dyslexia differently from other students, potentially affecting whether they have a fair opportunity to show their knowledge, understanding and skills. A set of science examination questions was selected, and for some questions two slightly different versions were created. A total of 54 students considered by their teachers to have dyslexia and a matched control group of 51 students took the test under exam conditions. A dyslexia screening assessment was administered where possible, and some students were interviewed. Facility values and Rasch analysis were used to compare performance between the versions of the same question and between students with and without dyslexia. Chi-square tests revealed no statistically significant differences in performance between groups or between question versions. However, some tentative implications for good practice can be inferred (e.g. avoiding ambiguous pronouns, using bullet points).
Acknowledgements
We would like to thank the teachers, students and examiners who assisted with the research. We would also like to thank Viv Kilburn at OCR and Jennifer Owen-Adams and Sue Flohr at the British Dyslexia Association for their input into the planning of this research.
Notes
1. Unidimensionality is an assumption of Rasch analysis, and this has led to some criticism of the use of the Rasch model on the grounds that many assessments do not measure just one trait (e.g. Goldstein, 1979). However, this criticism has since been countered with the argument that unidimensionality does not necessarily imply only one trait or dimension, but rather one dominant dimension alongside other minor dimensions (e.g. Hambleton et al., 1991). It has been suggested that multidimensionality is only a significant concern for Rasch modelling if there are two or more disparate dimensions (Linacre, 1998). See Panayides et al. (2010) for a full discussion.
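For readers unfamiliar with the model, the unidimensionality assumption can be seen in the standard dichotomous Rasch model (not reproduced in the original note), which is commonly written as:

$$P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}$$

where $X_{ni}$ is person $n$'s response to item $i$, $b_i$ is the difficulty of item $i$, and $\theta_n$ is a single scalar ability parameter for person $n$. Because each person's responses are modelled through one latent parameter $\theta_n$ only, the model presumes one dominant dimension underlying performance; this is the assumption at issue in the criticism discussed above.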