Research Article

How Did Spain Perform In PISA 2018? New Estimates Of Children’s PISA Reading Scores*

Pages 177-198 | Published online: 28 Sep 2023
 

ABSTRACT

International large-scale assessments have gained much attention since the beginning of the twenty-first century, influencing education legislation in many countries. This includes Spain, where they have been used by successive governments to justify education policy change. Unfortunately, there was a problem with the PISA 2018 reading scores for this country, meaning the OECD initially refused to release the results. Therefore, in this paper we attempt to estimate the likely PISA 2018 reading scores for Spain, and for each of its regions. The figure finally published by the OECD for Spain – in terms of reading scores – was 476.5 points, which falls between the lower and upper bounds of the interval we find (475 to 483 test points in 2018). Additionally, we report some robustness checks for the OECD countries participating in PISA 2018, which show that the differences between the actual scores and those obtained with our imputation methods are small.

7. Disclosure Statement

No potential conflict of interest was reported by the author(s).

8. Supplementary Data

Supplemental data for this article can be accessed online at https://doi.org/10.1080/00071005.2023.2258184

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Notes

1 There are many other examples of problems with PISA data in specific countries; for example, in PISA 2012 Albania presented serious irregularities (OECD, Citation2014a; Annex A4), in PISA 2015 Albania, Argentina, Kazakhstan and Malaysia did so (OECD, Citation2016; Annex A4) and, in PISA 2018, Viet Nam and Spain (OECD, Citation2019c; Annex A4).

2 The list of countries participating in the paper-based assessment in PISA 2018 can be found in OECD (Citation2019c; Annex A5).

3 Many more competences (such as financial literacy, problem-solving skills or global competence) are assessed by PISA, together with other background questionnaires (parental, teacher, ICT, well-being and educational career questionnaires). However, these have been administered irregularly across PISA cycles and not all countries took them, so we focus on the competences and student information that remain fixed across PISA cycles.

4 Official information on other previous PISA subjects such as sample design and weighting can be found at OECD (Citation2009, Citation2012, Citation2014b, Citation2017, Citation2020a). A summary of this topic can be found in Jerrim et al. (Citation2017).

5 Due to the change from a paper- to a computer-based assessment since PISA 2015, some of these PISA procedures changed from one cycle to the next; hence, we focus here on the last cycle (2018), but more information on this subject for PISA 2009, 2012 and 2015 can be found in OECD (Citation2012, Citation2014b, Citation2017).

6 This global competence was new in PISA 2018 and it ‘examines students’ ability to consider local, global and intercultural issues, understand and appreciate different perspectives and world views, interact respectfully with others, and take responsible action towards sustainability and collective well-being’ (OECD, Citation2019c, p. 29).

7 More information on this procedure can be found in OECD (Citation2019a).

8 The software employed by the OECD to perform these IRT models is mdltm (von Davier, Citation2005).

9 This background information was incorporated by, first, coding variables so that refused responses could be included (i.e., contrast coding); then, a principal component analysis was performed, so that the background information could be summarised and information from students with missing values could be kept, satisfying the linearity assumption for the model (OECD, Citation2020a).
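The two-step procedure described in this note can be sketched as follows. This is a minimal illustration, not the OECD's actual code: the variable names and categories are invented, and we use pandas dummy coding plus scikit-learn's PCA as stand-ins for the contrast coding and principal component analysis the technical report describes.

```python
# Hypothetical sketch of the contrast-coding + PCA step from note 9
# (variables and categories are illustrative, not the OECD's).
import pandas as pd
from sklearn.decomposition import PCA

# Toy background data in which "refused" is kept as its own category
df = pd.DataFrame({
    "parental_education": ["tertiary", "secondary", "refused", "tertiary"],
    "books_at_home": ["0-10", "101-200", "26-100", "refused"],
})

# Step 1: contrast (dummy) coding -- "refused" becomes an explicit column,
# so students with missing responses are retained rather than dropped
coded = pd.get_dummies(df, columns=list(df.columns))

# Step 2: summarise the coded background information with principal
# components, which also satisfies the model's linearity assumption
pca = PCA(n_components=2)
components = pca.fit_transform(coded.astype(float))
print(components.shape)  # (4, 2): one row per student, two summary components
```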

10 The OECD employed the software DGROUP (Rogers et al., Citation2006) to estimate the multivariate latent regression model and obtain the plausible values, fixing the parameters of the cognitive items obtained from the multi-group IRT models.

11 PISA technical reports have widely shown these high correlations between the three domains (e.g., OECD, Citation2020a). Although these reports do not analyse in depth the underlying mechanisms behind these high correlations (Ding and Homer, Citation2020), some authors such as Ashkenazi et al. (Citation2017) indicated that there might be shared cognitive processes (e.g., memory) or a general ability (e.g., intelligence) which may contribute to the three of them simultaneously. Additional explanations might be that reading ability acts as a proxy for other constructs that influence mathematics and science performance (Grimm, Citation2008), or the relatively high reading demands in PISA’s cognitive tests for all domains (Wu, Citation2010).

12 These correlations remain similar once demographic characteristics and school composition have been controlled for.

13 The ESCS index was created by the OECD from the highest level of education of parents, the highest parental occupation, and home possessions, using principal component analysis (OECD, Citation2020a).

14 These variables have been consistently found in the literature to be very relevant in the definition of the education production function (Hanushek, Citation1979; Hyde et al., Citation1990; Karadag, Citation2017; Reilly et al., Citation2015; Sirin, Citation2005; Wößmann, Citation2005).

15 These results make sense. For PISA 2018, in Ireland reading scores are high compared to mathematics and science scores (518, 500, 496, respectively; OECD, Citation2019c), meaning we get the largest imputed value for Spain when using this nation as the donor country. On the other hand, in Japan reading scores are lower than mathematics and science scores (504, 527, 529, respectively; OECD, Citation2019c), meaning that our predicted score for Spain is very low.

16 In order to check the capacity of our model to predict gender differences in scores for PISA 2018, we have run a similar specification for Spain in mathematics, considering this domain as missing completely at random for PISA 2009 to 2018. This model accurately estimated PISA 2009, 2012 and 2015 mathematics scores by gender and predicted a mathematics score of 489 for boys (the actual score being 485) and 479 for girls (the actual score being 478), so we can be quite confident in the results of our multiple imputation model by gender as well. Results for mathematics scores in previous PISA cycles will be provided upon request to the authors.

17 These differences in terms of standard deviations have been obtained by calculating the absolute difference between the previous and predicted reading scores for that region and dividing the result by 100 (the standard deviation of the plausible values).
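The calculation in this note is a simple effect-size conversion and can be illustrated as below. The two scores are made-up values for a hypothetical region, not figures from the paper; only the divisor of 100 comes from the note.

```python
# Minimal illustration of the effect-size calculation in note 17,
# using made-up regional scores (not the paper's actual figures).
previous_score = 490.0   # hypothetical earlier reading score for a region
predicted_score = 478.0  # hypothetical imputed 2018 reading score
PV_SD = 100.0            # plausible values' standard deviation in PISA

effect_size_sd = abs(previous_score - predicted_score) / PV_SD
print(effect_size_sd)  # 0.12 standard deviations
```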

Additional information

Funding

This work has been partly supported by FEDER funding [under Research Project PY20-00228]; Ministerio de Ciencia e Innovación [under Research Project PID2020-119471RB-I00]; by the Andalusian Regional Government (SEJ-645) and the Universidad de Málaga under Research Project B1-2022_23.
