Research in K-12 Ed

Lessons to Demonstrate Statistical Literacy Skills: A Case Study of Japanese High School Students on Reading Statistical Reports

Abstract

Statistical literacy, encompassing the interpretation and evaluation of statistical reports, is a skill developed in schools. However, the pedagogical approaches to statistical literacy skills have not been sufficiently investigated. This study investigated whether statistical literacy skills could be demonstrated through high school lessons. These lessons include a statistical literacy process, grounded in critical thinking principles, to facilitate students’ interpretation and evaluation of statistical reports. Additionally, a corresponding worksheet, aligning with this process, is included in the lessons to help students comprehend statistical reports. Seven 50-min lessons were conducted with 34 high school students, aged between 17 and 18 years, using this worksheet. Additionally, a statistical literacy skills test, designed to assess various statistical literacy skills within a short timeframe, was developed and administered at the beginning of the first lesson and at the end of the last lesson. The outcomes revealed that students were able to demonstrate some skills and effectively evaluate statistical reports. This study makes notable contributions by introducing a structured statistical literacy process and an associated skills assessment test. These tools can serve as valuable resources for students seeking to critically evaluate statistical reports and enable educators to gauge the extent of students’ acquisition. Supplementary materials for this article are available online.

1 Introduction

Many organizations produce statistical reports that include information such as the population studied, the method of data collection, and the results of the analysis. These reports, often disseminated via the media, must be critically evaluated because they feed into decision making. The ability to evaluate such reports is referred to as statistical literacy, and while multiple definitions of statistical literacy exist, this study adheres to the following definition (Gal 2002):

(a) people’s ability to interpret and critically evaluate statistical information, data-related arguments, or stochastic phenomena, which they may encounter in diverse contexts, and when relevant (b) their ability to discuss or communicate their reactions to such statistical information, such as their understanding of the meaning of the information, their opinions about the implications of this information, or their concerns regarding the acceptability of given conclusions (pp. 2–3).

Gal (2002) elucidates that statistical literacy comprises a variety of knowledge, skills, and attitudes. He recommends that people evaluate statistical reports by employing a set of “worry questions” (p. 16), which consists of various questions about statistical information (e.g., How reliable or accurate were the instruments or measures used to generate the reported data? Overall, are the claims made here sensible and supported by the data?) and assists in the critical evaluation of the contents of statistical information.

This study focuses on using lessons to demonstrate statistical literacy skills, introducing a structured process (see Section 2.2) for interpreting and communicating statistical information. The study unfolds by presenting a series of lessons designed to instill multiple statistical literacy skills and culminates in an assessment tool to gauge students’ proficiency in these skills. Moreover, the study endeavors to evaluate the effectiveness of the devised lessons through practical implementation with high school students. The remainder of this article is structured as follows. Section 2 delves into the background of statistical literacy education, describes the statistical literacy process, and introduces the research question. Section 3 elaborates on the methodology, outlining the lesson design for reading statistical reports while demonstrating statistical literacy skills. Section 4 offers an in-depth analysis of the results, and Section 5 concludes this study, encapsulating its findings, delineating its limitations, and charting potential avenues for future research.

2 Background

2.1 Forms of Statistical Literacy Education

The Pre-K-12 Guidelines for Assessment and Instruction in Statistics Education II (GAISE II) Report (Bargagliotti et al. 2020) states that our lives are heavily influenced by data and that statistical reasoning is crucial to enable evidence-based decision making. Furthermore, the Pre-K-12 GAISE II Report points out the need for statistical literacy to achieve success at work, stay informed about current events, and lead healthy, happy, and productive lives. Schools have attempted to teach statistical literacy, and a review of previous studies reveals the following three points regarding its teaching. First, the most important teaching material is current media reports, as statistical literacy instruction is developed around reports from the media (Bargagliotti et al. 2020). For example, Merriman (2006) and Budgett and Pfannkuch (2010) developed activities that involve reading the content of published statistical reports. Additionally, Budgett and Rose (2017) introduced reports on content that interests students and consequently motivates them to develop statistical literacy skills; their study used Gal’s (2002) list of “worry questions” to encourage students to question and discuss the report. Second, it is important to include activities wherein learners discuss the content of the reports with each other; this has been developed in lessons by, for example, Merriman (2006). Gal’s (2002) definition of statistical literacy includes the “ability [for people] to discuss or communicate their reactions to such statistical information” (p. 3), suggesting that discussing the content of statistical reports is an essential activity in statistical literacy education. Parke (2008) conducted statistics courses for graduate students in which group discussions were introduced, during which students communicated about data analysis or commented on paragraphs written by their peers. Parke’s study suggested that collaboration with peers can help students develop a deeper understanding of statistical concepts and raise their level of comfort in talking about statistics. Third, it is important for teachers to encourage students to question the content of a statistical report. Consumers of statistical information are expected to develop habits such as skepticism, considering variability, and relating data to the context of the problem (Chance 2002). These three points need to be addressed when teaching statistical literacy in schools.

2.2 Statistical Literacy Process

Given the multifaceted nature of statistical literacy, as elucidated by Gal (2002), it becomes evident that people who possess this literacy exhibit diverse statistical literacy skills. They do not merely rely on statistical knowledge in reading statistical reports. Statistical literacy is at the core of interpreting and evaluating statistical reports appropriately. In the process of interpreting and evaluating statistical information, a variety of skills are required. Therefore, if a statistical literacy process is developed and learners are aware of this flow, they may be able to demonstrate a variety of skills in reading statistical reports. Accordingly, this study developed a process that helped students demonstrate various statistical literacy skills.

Organizing the concept of statistical literacy, Sharma (2017) stated “it is evident that statistical literacy is a complex construct that requires not only a range of basic skills (reading, comprehension and communication) but also higher-order cognitive skills of interpretation, prediction and critical thinking” (p. 129). Similarly, Ziegler and Garfield (2018) stated that “definitions of statistical literacy range from the context of basic skills (Garfield, delMas, and Chance 2002) to critical thinking (Gal 2002)” (p. 161). Watson (1997), cognizant of the pivotal role of context and critical thinking, examined evaluation tasks predicated on dubious statistical reports disseminated by the media. Tishkovskaya and Lancaster (2010) introduced an instructional unit designed to foster statistical literacy, cultivating students into astute consumers of statistical information by encouraging critical thinking. Weiland (2017) stated that there are two ways of using the term “critical” in statistical literacy: the first is critical thinking, and the second is “using statistics as a lens and using a broader reflective lens of the world in dialog, enabling a new view of the world” (Weiland 2017, p. 39).

Koga (2022) provided a framework for statistical literacy skills built on critical thinking and comprising eight different skills from A to H: Identifying, Questioning, Judging, Clarifying, Assessing, Conjecturing, Considering, and Concluding (). These skills were developed based on the critical thinking skills proposed by Facione (1990), Ennis (1996), and others, and are demonstrated when interpreting and evaluating statistical reports. Since critical thinking can be viewed as a series of cognitive processes (Dwyer, Hogan, and Stewart 2014), reading reports while demonstrating these skills may help readers evaluate them appropriately. Additionally, critical thinking functions as the foundation of statistical literacy, which includes elements such as understanding arguments and critiquing others’ inferences.

In this study, a statistical literacy process () that systematically arranged Koga’s (2022) eight statistical literacy skills was employed. The lettering of the eight skills is the same as that shown in Koga (2022). The process was developed based on Ennis’s (1996) FRISCO approach, which involves six critical thinking elements: Focus, Reasons, Inference, Situation, Clarity, and Overview. Ennis (1996) described how, when thinking critically, people sometimes go through these elements in order; often, however, the order is changed or elements are skipped. This study depicted what these six elements mean in the context of statistical literacy, specifically the evaluation of statistical information. Notably, this study did not include Situation among the other FRISCO elements, primarily because the situation was already fixed by the context of reading in school lessons.

Fig. 1 Statistical literacy process.


To interpret and evaluate reports, it is important first to understand their content properly. Statistical information transmitted by the media often contains particular claims. To accept these claims, sufficient information is needed to evaluate the quality of the data. Nolan and Stoudt (2021) stated that it is important to know about the type of data collected and the conditions under which they were collected. This information can help evaluate the appropriateness of the data analysis methods, determine whether and how valid the authors’ conclusions are, and evaluate the limitations, generalizability, and impact of the study. To understand a report, it is crucial to correctly interpret the meaning of statistical terms (e.g., average and percentage) and ambiguous expressions (e.g., “concerns” and “relates”). These correct interpretations correspond to the concept of Clarity within the FRISCO framework; consequently, Skill D (Clarifying Skill) comes into play at this stage.

Once the content of the report is understood, the reader must determine what claims are being made within the report. The concept of Focus within the FRISCO framework is about identifying the report’s claims. After identifying the claims, the evidence supporting the claims must be identified and evaluated for reliability. That is, when evaluating a statistical report, one must clarify the claims and evidence as well as determine whether the evidence is reliable. Therefore, it is necessary to demonstrate Skill A (Identifying Skill), clarify the claims and evidence within the statistical information, and assess the reliability of the evidence through Skill B (Questioning Skill).

The claims cannot be considered reliable simply because their evidence appears reliable, especially when the strength of the evidence supporting them remains unknown. The concept of Inference within the FRISCO framework judges whether the evidence is sufficient to support the claims. Thus, the validity of the claims can be judged by the reliability and relevance of the evidence. This requires Skill C (Judging Skill), with which the reader assesses whether a claim was generalized from limited statistical evidence, or whether a claim of causality was based only on evidence of correlation. Additionally, Skill F (Conjecturing Skill) is exercised to ascertain whether the claims are reliable by finding other pieces of evidence within the statistical information being read or by looking for relevant information from other sources. Based on the available information obtained by demonstrating these skills, the acceptability of the claims is assessed. Therefore, at this stage, Skill E (Assessing Skill) comes into play.

So far, the reader has understood the content of the report, identified the claims and evidence, and judged their validity. Subsequently, it is important for them to summarize the report and evaluate whether it is reliable. The concept of Overview within the FRISCO framework calls for a check on what has been discovered, decided, considered, learned, and inferred. In other words, it summarizes previous checks to see if they make sense. Once Skills A (Identifying Skill), B (Questioning Skill), C (Judging Skill), D (Clarifying Skill), F (Conjecturing Skill), and E (Assessing Skill) are considered, learners demonstrate Skill H (Concluding Skill) to draw their own conclusions from an examination of a particular situation or context. In evaluating this reliability, it is necessary to provide adequate logical justification for the evaluation. For instance, if the claims are judged to be unreliable, Skill G (Considering Skill) would suggest how the investigation in the report could be improved; namely, how the sampling method and research design should be improved. This skill is demonstrated when considering other possible conclusions and forming alternative interpretations of statistical information. The framework of Dwyer, Hogan, and Stewart (2014) suggests that monitoring the elements involved in critical thinking is an important part of critical thinking itself. For example, critical thinkers should summarize their thoughts and check whether they are coherent. Furthermore, to identify areas for improvement, they should reflect on how they evaluated the information and decide which aspects were problematic.

2.3 Research Question

If students learn statistical literacy skills through lessons, they may be able to interpret and evaluate statistical reports appropriately. However, much about the teaching of report interpretation and evaluation remains unclear: namely, what steps are used to interpret reports, and what the focal points are in evaluating content. Statistical information, especially reports in the media, contains a certain number of complex sentences along with figures and tables. Therefore, critical reading of this content is required. It is believed that when students read statistical reports with the statistical literacy process () in mind, they can demonstrate a variety of statistical literacy skills and also properly interpret and evaluate the reports. Therefore, this study set the following research question: What statistical literacy skills will students be able to demonstrate in lessons that focus on activities involving an awareness of the statistical literacy process?

This study developed lessons for high school students to support their interpretation and evaluation of statistical reports by demonstrating statistical literacy skills and examined the effectiveness of the lessons. The statistical reports evaluated by the students were based on sample surveys and observational studies.

3 Methodology

3.1 Participants

In Japan, school-based statistics education generally involves learning to analyze data and to calculate statistics using software. In the Japanese high school curriculum, students (aged 15–18 years) learn how to calculate summary statistics and correlation coefficients, and how to use spreadsheet software in the required subjects “mathematics” and “information.” However, the Japanese curriculum does not deal with how to conduct surveys (e.g., experimental or observational studies) or how to identify causal relationships. Until recently, the Japanese curriculum did not focus on statistical literacy. Accordingly, most Japanese high school students had not received sufficient statistical literacy education.

This study involved 34 high school students: a class of students aged between 17 and 18 years at a private high school in Japan who were enrolled in the general diploma course (to which more than 70% of Japanese high school students belong). While these students had covered statistical topics such as averages and scatterplots in previous grades, they had not received instruction on conducting surveys or identifying causal relationships. Importantly, this study marked their first encounter with statistical literacy skills, specifically the task of reading and evaluating statistical reports. The ratio of males to females in the class was approximately 1:1, and most of the students go on to higher education after graduation. The class was taught by a teacher who had been instructed by the author on how to teach these lessons. Over the course of the study, seven 50-min lessons, specifically designed for this research, were delivered by the teacher from June to July 2021. The lessons were recorded with a video camera, and the author observed the lessons from the back of the classroom.

3.2 Lesson Worksheet Creation

A worksheet () consisting of five items was developed to reflect the statistical literacy process so that the students would be aware of the flow of this process through the lessons we designed. The worksheet was created in Google Spreadsheets, and students filled it out while reading the statistical reports.

Fig. 2 Worksheet used in the lessons (translated from Japanese).


When students complete the worksheet while reading reports, they are expected to demonstrate statistical literacy skills. First, students read the report, find terms or ambiguous expressions, and enter them in “1: Checking terminology and expressions.” This item relates to Skill D (Clarifying Skill), the first step in the statistical literacy process. Next, in “2: Examining the content of this report,” students identify the claims and the evidence that supports those claims in the report and record them in the “Claims” and “Evidence” sections. Then, they report the “Level of confidence in the Evidence and reasons,” where, in the first line of this section, they can choose from four levels of confidence ranging from “very confident” to “not confident at all.” In the second line, they can enter the reasons for their choice. After checking the confidence level of the evidence, they enter in the “Judgment” and “Reasons on Judgment” sections whether the evidence sufficiently supports the claims. In “Judgment,” they can choose between two levels: “The evidence supports the claims” or “does not support the claims.” In the “Reasons on Judgment” section, they can state the reasons for their choice. Therefore, in Item 2, the students were expected to demonstrate Skills A (Identifying Skill), B (Questioning Skill), and C (Judging Skill). In “3: Examining the information and hidden agendas,” students infer and enter the information and intentions of the authors that were not stated in the report and should be considered when evaluating the report content. Item 3 is related to Skill F (Conjecturing Skill).

In “4: Evaluation,” students evaluate whether the report’s content is reliable based on their examination of Items 1–3 on the worksheet. First, under “The content in this report is…,” students select one of four options, ranging from “Reliable” to “Unreliable.” Then, on the line below, they enter why they made that choice. If they did not choose “Reliable,” the claims in the report may be false, and the students must thus state how the statistical evidence could have been collected to support the claims in the report. In “5: Alternatives to the survey plan,” students enter suggestions for improvements to the survey methodology in the report. They state their own suggestions for collecting the sample and the kinds of survey methods and experiments that could alternatively be used. Items 4 and 5 are related to Skills E (Assessing Skill), G (Considering Skill), and H (Concluding Skill), which are expected to be demonstrated at the end of the statistical literacy process.
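The mapping between worksheet items and skills described above can be summarized as a small lookup table. This is only an illustrative sketch; the item labels are paraphrased from the translated worksheet, and the dictionary structure is not part of the study's materials:

```python
# Sketch: which statistical literacy skills each worksheet item targets,
# per the description in Section 3.2 (item labels paraphrased).
WORKSHEET_SKILLS = {
    "1: Checking terminology and expressions": ["D (Clarifying)"],
    "2: Examining the content of this report": [
        "A (Identifying)", "B (Questioning)", "C (Judging)"
    ],
    "3: Examining the information and hidden agendas": ["F (Conjecturing)"],
    "4: Evaluation": [
        "E (Assessing)", "G (Considering)", "H (Concluding)"
    ],
    "5: Alternatives to the survey plan": [
        "E (Assessing)", "G (Considering)", "H (Concluding)"
    ],
}

def skills_for(item: str) -> list[str]:
    """Return the skills a given worksheet item is designed to elicit."""
    return WORKSHEET_SKILLS[item]
```

Laid out this way, the table makes visible that the five items together traverse the whole statistical literacy process, from Clarifying (D) at the start to Concluding (H) at the end.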

3.3 Lessons

This study encompassed a series of seven 50-min lessons. In Lesson 1, students completed a pretest on statistical literacy skills. In Lessons 2–5, students evaluated pre-selected statistical reports. In Lessons 6–7, students self-selected statistical reports to read and evaluate and also completed a posttest on statistical literacy. The core objective of the lessons was to facilitate the demonstration of statistical literacy skills by engaging students in the reading and evaluation of statistical reports published by the media. All reports used in the lessons were drawn from the Evaluate Statistically Based Reports (ESBR) unit’s external tests in New Zealand. The ESBR unit encompasses numerous lessons tailored for Year 13 students (ages 17–18) and covers topics such as survey design, non-sampling errors, causal relationships, and possible sources of bias (New Zealand Qualifications Authority [NZQA] 2019). The external test, administered by the NZQA at the end of the year, forms part of the National Certificates of Educational Achievement, New Zealand’s national qualification for high school students. In the ESBR unit’s external tests, one or two statistical media reports and some questions about their content are presented. The author selected media reports from the external tests that were appropriate for the content of each lesson in this study, translated them into Japanese, and slightly adapted the content to the Japanese context.

The outline of the lessons is shown in . The lessons dealt with identifying claims and evidence in the reports, examining the connections between evidence and claims, and examining the reliability of the statistical reports’ claims. Specifically, they focused on examinations of the population and sample representativeness, the validity of the interpretation of results given the difference between causality and correlation, and measurement errors caused by the questionnaire and researcher. Three reports suitable for the lessons’ content were selected from the ESBR unit’s external tests. These reports were evaluated by the students in Lessons 2–5 by completing the worksheet.

After evaluating the reports, at the end of each lesson, students completed the worksheet and summarized what they learned in the checklist shown in . Based on what they learned in the lesson, students were asked to fill in what they would look for in their daily lives when reading reports. Similar to Gal’s (2002) list of “worry questions,” this checklist presents a list of things to look for when reading reports.

Fig. 3 Checklist used in the lessons (translated from Japanese).


In Lessons 6 and 7, the students searched the Internet for statistical reports that interested them and examined these reports’ reliability. This activity was designed to summarize the previous lessons and to help students realize the importance of demonstrating statistical literacy skills. First, the teacher distributed the assignment file () to students and explained its content. Working in groups of two or three, the students identified reports they deemed less credible and proceeded to complete the assignment file with their evaluations. The teacher also encouraged the students to reflect on their learning experiences. At the end of Lesson 6, the students turned in the assignments; at the beginning of Lesson 7, the teacher returned the files with comments to encourage students to explore statistical information from diverse perspectives.

Fig. 4 Assignment for students used in Lessons 6 and 7.


3.4 Statistical Literacy Skills Test

An assessment was conducted to examine whether the students demonstrated statistical literacy skills. The statistical literacy skills test had two sections (), both of which contained statistical reports created based on those given in the ESBR unit’s external tests. The author selected media reports from the external tests that were suitable for the content of the lessons in this study, translated them into Japanese, and slightly modified them to fit the Japanese context. For the statistical literacy skills test, students were required to read the reports and judge the quality of their content on a four-point scale ranging from “good” to “not good.” After this, they stated in detail the reasons for their judgment. Students took this test twice: before Lesson 1 (pretest) and after Lesson 7 (posttest). The test requires demonstrating statistical literacy skills in interpreting and evaluating the quality of the statistical reports. Both sections aimed at measuring Skills B (Questioning Skill), C (Judging Skill), F (Conjecturing Skill), and G (Considering Skill), as described in .

Fig. 5 Statistical literacy skills test.


3.5 Data Analysis

The data analyzed in this study were the outcomes of the assignments that students worked on in Lessons 6–7 and their responses to the statistical literacy skills test. Based on the outcomes of the assignments in the lessons, the author and teacher verified whether the students were able to demonstrate statistical literacy skills in evaluating their reports. For the statistical literacy skills test, the distribution of students’ responses on the four-point scale regarding report quality was tabulated. We also summarized the pretest and posttest results of the statistical literacy skills test and used the McNemar test to analyze differences between the pretest and posttest regarding students’ statistical literacy skills.
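The McNemar test compares paired binary outcomes, here whether each student's description showed a given skill on the pretest versus the posttest, using only the two discordant counts. The article does not state which variant was computed; the sketch below shows the exact (binomial) version, and the counts in the final line are illustrative, not the study's data:

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact McNemar p-value from the discordant counts of a paired 2x2 table.

    b: students who demonstrated the skill on the pretest only
    c: students who demonstrated the skill on the posttest only
    Concordant students (same result on both tests) do not enter the statistic.
    """
    n = b + c
    if n == 0:
        return 1.0  # no discordant pairs: no evidence of change
    # Two-sided exact binomial test of max(b, c) under Binomial(n, 0.5)
    tail = sum(comb(n, i) for i in range(max(b, c), n + 1)) * 0.5 ** n
    return min(1.0, 2 * tail)

# Hypothetical counts: 1 student showed the skill only on the pretest,
# 10 only on the posttest.
p = mcnemar_exact(1, 10)  # ≈ 0.012, below the 0.05 threshold used in the study
```

Because concordant pairs drop out, the test directly targets the question posed in the article: whether descriptions absent in the pretest appeared in the posttest more often than the reverse.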

4 Results

4.1 Statistical Literacy Skills Test (Pretest)

This section presents the findings from the pretest phase. Data from 34 students who participated in all lessons and worked on all tasks were included in the analysis. provides a concise summary of students’ pretest evaluations regarding the quality of each report. In the pretest, approximately 40% of all students (13 out of 34 in both sections) answered “good” or “somewhat good” to describe the quality of each report (). When the teacher showed these results to the students, the author observed the students wondering why their evaluations were so different when they had read the same report. The teacher then explained the importance of properly evaluating statistical reports, motivating the students to develop statistical literacy skills. The significance of motivating students to embrace statistics in their learning has been emphasized by Tishkovskaya and Lancaster (2010). It is plausible that Lesson 1 may have contributed to enhancing student motivation at the outset of the study.

Table 1 Distribution of student responses about the report’s quality in Sections A and B of the statistical literacy skills test (pretest).

4.2 Students’ Activities in Lessons 6 and 7

In Lessons 6 and 7, the students were required to search for reports they were interested in or reports on current events. shows the reports that students searched for as part of the task and the details of their evaluations. They tried to interpret the content from different perspectives. At the beginning of Lesson 7, the teacher and author returned the assignments to students with comments and encouraged them to reinterpret the content in light of those comments. With this kind of support from the teacher, students appropriately evaluated the reports. Many students referred to the reliability of statistical evidence, citing topics covered in Lessons 2 through 5, such as the “difference between population and sample,” “survey methods,” and “question sentence.” These results indicate that the students were able to find statistical evidence in the reports and judge the reliability of that evidence, which may have encouraged the demonstration of statistical literacy Skills A (Identifying Skill) and B (Questioning Skill). However, not many students mentioned the “connection between evidence and the claims” or the “creator’s intentions.” Although the worksheet provided a framework for students to mention “information and hidden agendas” and to judge the connection between the evidence and the claims, this task may not have encouraged them to fully demonstrate Skills C (Judging Skill) and F (Conjecturing Skill).

Table 2 Themes of students’ reports and topics their report evaluations addressed in Lessons 6 and 7.

4.3 Posttest of Statistical Literacy Skills

provides a concise summary of students’ posttest evaluations regarding the quality of each report. shows that most students answered “somewhat not good” or “not good” on the posttest, suggesting that the lessons cultivated a sense of skepticism among the students, leading them to question the validity of claims and evidence within the reports.

Table 3 Distribution of student responses about report quality in Sections A and B of the statistical literacy skills test (posttest).

4.4 Comparison of Pretest and Posttest of Statistical Literacy Skills

The author and the teacher examined the students’ pretest and posttest descriptions to verify whether students were able to demonstrate each skill listed in . If a student’s description was similar to that listed in for each skill, it was assumed that the skill was demonstrated. The author and teacher independently coded whether each of the four skills (Questioning, Judging, Conjecturing, and Considering) appeared in the students’ descriptions and checked how well their codes matched. Across the 34 students’ pretest and posttest descriptions, codes matched for 23 (Section A in pretest), 26 (Section B in pretest), 20 (Section A in posttest), and 22 (Section B in posttest) students, respectively. Codes that did not match were discussed by the author and teacher until they reached agreement. After counting the number of student descriptions that addressed each skill, the McNemar test was used to determine whether there was a change between the pretest and posttest (i.e., whether descriptions that were not found in the pretest were found in the posttest). The results ( and ) show that some skills changed between the two tests (p-values less than 0.05 are highlighted in gray in the tables).

Table 4 Pretest and posttest results of the statistical literacy skills test in Section A.

Table 5 Pretest and posttest results of the statistical literacy skills test in Section B.
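The coding-agreement figures reported above are raw match counts (e.g., 23 of 34 students). A short sketch of how percent agreement, and the chance-corrected Cohen's kappa that the article does not report, could be computed from two raters' binary codes; the example codes in the test are hypothetical, not the study's data:

```python
from collections import Counter

def agreement_and_kappa(r1: list[int], r2: list[int]) -> tuple[float, float]:
    """Percent agreement and Cohen's kappa for two raters' codes.

    Each list holds one code per student (e.g., 1 = skill demonstrated,
    0 = not demonstrated).
    """
    assert len(r1) == len(r2) and r1, "raters must code the same students"
    n = len(r1)
    # Observed agreement: proportion of students coded identically
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: product of each rater's marginal code frequencies
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
    kappa = 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)
    return p_o, kappa
```

Reporting kappa alongside the raw match counts would show how much of the observed agreement exceeds what two raters coding at random from the same marginal frequencies would achieve.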

Some changes were observed for Skills B (Questioning Skill) and C (Judging Skill). In the lessons, students spent considerable time on activities such as assessing the reliability of evidence and exploring the connections between claims and evidence using the worksheets, so it is plausible that these skills were promoted. However, it is worth noting that no significant changes were observed for Skills B2 (Questioning Skill) and C2 (Judging Skill) in Section B. Skill B2 in Section B involved checking the timing of the survey against the graphs in the report and examining the validity of the survey method. In the lessons, students focused mainly on the text of the reports and did not engage sufficiently with the presentation of the graphs, which could explain why no change was observed for this skill or for Skill C2 in Section B, which involved checking the validity of the questionnaire text against the diagrams in the reports.

Further, for Skill F (Conjecturing Skill), there was a change in Section A but not in Section B. This skill involves inferring information not stated in the report and checking the validity of the evidence or claims on that basis. Statistical literacy requires contextual knowledge (e.g., Gal Citation2002). In the lessons, the students identified claims and evidence in the text and examined them; however, they also needed to evaluate the text in various contexts. Additionally, there was no change in Skill G (Considering Skill) in either section. In the lessons, students practiced suggesting alternatives or revisions to the claims, but they did not do so in either section of the test. This suggests that students’ perception of the term “evaluation” may have been limited to judging the contents of the reports as good or bad; the response options students were given about the quality of the reports may have encouraged this interpretation. Additionally, students acquired knowledge of statistical terms such as scatterplots, correlation, explanatory and response variables, and confounding variables during the lessons. However, the statistical literacy skills test did not assess their comprehension of these terms, so the extent of the students’ understanding remains uncertain and could act as a confounding variable in their responses.

5 Discussion and Conclusions

5.1 Summary

This section summarizes the study and outlines future work. This study primarily focused on the demonstration of statistical literacy skills through high school lessons. A statistical literacy process and a corresponding worksheet for evaluating statistical reports were developed, and lessons were created with this process and worksheet in mind. The author and the teacher investigated whether the lessons would help the students demonstrate statistical literacy skills. The results showed that students were able to demonstrate some skills and appropriately evaluate statistical reports. In particular, students demonstrated Skills B (Questioning Skill) and C (Judging Skill). This suggests that lessons on reading statistical reports with the statistical literacy process may enable students to question statistical evidence from a variety of perspectives and judge whether the evidence adequately supports the claims of the reports.

This study contributes to the field of statistical literacy education in three key ways: (a) the development of the statistical literacy process for evaluating statistical information from reports, encompassing a range of skills, (b) the creation of lesson plans specifically designed to demonstrate statistical literacy skills at the high school level, and (c) the introduction of a novel test aimed at assessing various statistical literacy skills. Previous research, such as the work by Budgett and Rose (Citation2017) and Parke (Citation2008), also devised activities centered around reading statistical information, employing real media reports and fostering discussions among learners. In alignment with these prior efforts, the lessons in this study likewise aimed to demonstrate statistical literacy skills by analyzing media reports. However, this study stands out by proposing a reading strategy that actively engages learners in demonstrating a spectrum of statistical literacy skills. This approach enables readers not only to comprehend statistical information but also to evaluate it effectively by applying the acquired skills.

It is important to highlight two key distinctions between the statistical literacy skills test developed in this study and previous tests measuring statistical literacy, such as that of Ziegler and Garfield (Citation2018). First, the test introduced in this study can assess a broader spectrum of statistical literacy skills, whereas previous tests primarily focus on knowledge and attitudes. Second, this test is designed for teachers seeking a practical and time-efficient assessment method, particularly when faced with limited assessment time during their classes.

5.2 Limitations of This Study and Future Directions

This study suggests that students can demonstrate some statistical literacy skills and evaluate statistical reports by following a statistical literacy process. The students in this study participated in statistical literacy lessons for the first time, so this process may be useful for implementing first-time statistical literacy education aimed at demonstrating such skills. The challenges encountered in this study, such as how to foster the skills that did not improve, will be investigated further, and additional lessons should be designed to address them. For example, future lessons should incorporate activities in which students actually conduct a statistical survey, produce statistical reports summarizing the results, and communicate them to others. Writing reports is likely to support the learning of statistical terminology and the logical description of statistical evidence and claims. Weiland (Citation2017) notes that “being literate in the reading context can only go so far, as reading often operates in dialog with writing, and some experience in writing is important to be able to make sense of and evaluate statistical arguments” (p. 38). The students who participated in this study had no experience conducting a survey or analyzing data. By immersing themselves in surveying and report composition, students can develop a better understanding of the information essential for inclusion in statistical reports. Additionally, by preparing reports with the expectation that others will read them, students may learn to present their arguments in a clear, understandable manner. Through these activities, it is expected that Skills F (Conjecturing Skill) and G (Considering Skill), which were not fully demonstrated in this study, will be developed.

Furthermore, this study has some limitations. First, it used an observational design with no control group. Second, the participants were Japanese high school students, many of whom will attend university; this limits the generalizability of the results to the entire high school population. To improve generalizability, the results should be examined in other grade levels and in populations with different academic abilities. Moreover, in addition to the non-sampling error addressed here, the margin of error and experimental research methodology are also important topics in statistics education (e.g., Budgett and Rose Citation2017). Thus, future studies should investigate whether these skills can be similarly demonstrated when other statistics topics are addressed. Lastly, this study introduced the statistical literacy skills test for the first time; consequently, continued research is essential to gather additional evidence concerning the test’s reliability and validity.

Note

This article is a heavily revised version of the following study:

Koga, S. (2022b), “Lessons Aimed at Demonstrating Statistical Literacy Skills: A Case Study of Japanese High School Lessons on Reading Statistical Reports,” in Proceedings of the 11th International Conference on Teaching Statistics.

In addition, the research in this article formed part of the author’s doctoral dissertation:

Koga, S. (2023), “Teaching method of statistical literacy based on critical thinking: Focusing on activities to interpret and evaluate statistical information at the high school level.” [Unpublished doctoral dissertation]. Japan: University of Tsukuba.

The pretest and posttest data in this study can be accessed at: https://osf.io/7fuwa/?view_only=52a6c12e2fba467caf4f40081678c4a3 (accessed on September 30, 2023)

Supplementary Materials

The supplementary materials include (1) Framework for statistical literacy skills from the perspective of critical thinking (Appendix A), (2) Outline of the lessons (Appendix B), and (3) Test evaluation perspectives (Appendix C).

Acknowledgments

All procedures were approved by the ethics committee of the Faculty of Human Sciences, University of Tsukuba (Approval No: 2021-25 A). I extend my sincere gratitude to Dr. Naohiro Higuchi of the University of Tsukuba for engaging in insightful discussions. Additionally, I express my appreciation to all the students who participated in the lessons conducted for this study. Lastly, I would like to thank Editage (www.editage.com) for their English language editing service.

Disclosure Statement

The author reports there are no competing interests to declare.

Additional information

Funding

This work was supported by JSPS KAKENHI (Grant-in-Aid for JSPS Fellows) Grant Number JP19J20055.

References

  • Bargagliotti, A., Franklin, C., Arnold, P., Gould, R., Johnson, S., Perez, L., and Spangler, D. A. (2020), “Pre-K–12 Guidelines for Assessment and Instruction in Statistics Education II (GAISE II): A Framework for Statistics and Data Science Education,” American Statistical Association. Available at https://www.amstat.org/asa/files/pdfs/GAISE/GAISEIIPreK-12_Full.pdf
  • Budgett, S., and Pfannkuch, M. (2010), “Using Media Reports to Promote Statistical Literacy for Non-Quantitative Majors,” in Data and Context in Statistics Education: Towards an Evidence-Based Society, Proceedings of the Eighth International Conference on Teaching Statistics. Available at http://iase-web.org/documents/papers/icots8/ICOTS8_7G1_BUDGETT.pdf
  • Budgett, S., and Rose, D. (2017), “Developing Statistical Literacy in the Final School Year,” Statistics Education Research Journal, 16, 139–162. DOI: 10.52041/serj.v16i1.221.
  • Chance, B. L. (2002), “Components of Statistical Thinking and Implications for Instruction and Assessment,” Journal of Statistics Education, 10, 1–14. DOI: 10.1080/10691898.2002.11910677.
  • Dwyer, A. P., Hogan, M. J., and Stewart, I. (2014), “An Integrated Critical Thinking Framework for the 21st Century,” Thinking Skills and Creativity, 12, 43–52. DOI: 10.1016/j.tsc.2013.12.004.
  • Ennis, R. H. (1996), Critical Thinking, Upper Saddle River, NJ: Prentice Hall.
  • Facione, P. A. (1990), “Critical Thinking: A Statement of Expert Consensus for Purpose of Educational Assessment and Instruction–Research Findings and Recommendations,” American Philosophical Association. Available at https://eric.ed.gov/?id=ED315423
  • Gal, I. (2002), “Adults’ Statistical Literacy: Meanings, Components, Responsibilities,” International Statistical Review, 70, 1–25. DOI: 10.1111/j.1751-5823.2002.tb00336.x.
  • Garfield, J., delMas, R., and Chance, B. (2002), “The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Project,” available at https://app.gen.umn.edu/artist/
  • Koga, S. (2022a), “Characteristics of Statistical Literacy Skills from the Perspective of Critical Thinking,” Teaching Statistics, 44, 59–67. DOI: 10.1111/test.12302.
  • Merriman, L. (2006), “Using Media Reports to Develop Statistical Literacy in Year 10 Students,” in Proceedings of the 7th International Conference on Teaching Statistics.
  • Nolan, D., and Stoudt, S. (2021), “Reading Science Articles,” in Communicating with Data: The Art of Writing for Data Science, eds. D. Nolan and S. Stoudt, Oxford: Oxford University Press.
  • NZQA (New Zealand Qualification Authority). (2019), “Achievement Standard (91584 Evaluate Statistically Based Reports).”
  • Parke, C. S. (2008), “Reasoning and Communicating in the Language of Statistics,” Journal of Statistics Education, 16, 1–24. DOI: 10.1080/10691898.2008.11889555.
  • Sharma, S. (2017), “Definitions and Models of Statistical Literacy: A Literature Review,” Open Review of Educational Research, 4, 118–133. DOI: 10.1080/23265507.2017.1354313.
  • Tishkovskaya, S., and Lancaster, G. (2010), “Teaching Strategies to Promote Statistical Literacy: Review and Implementation,” in Proceedings of the 8th International Conference on Teaching Statistics.
  • Watson, J. M. (1997), “Assessing Statistical Literacy Using the Media,” in The Assessment Challenge in Statistics Education, eds. I. Gal and J. B. Garfield, pp. 107–121, Amsterdam: IOS Press & The International Statistical Institute.
  • Weiland, T. (2017), “Problematizing Statistical Literacy: An Intersection of Critical and Statistical Literacies,” Educational Studies in Mathematics, 96, 33–47. DOI: 10.1007/s10649-017-9764-5.
  • Ziegler, L., and Garfield, J. (2018), “Developing a Statistical Literacy Assessment for the Modern Introductory Statistics Course,” Statistics Education Research Journal, 17, 161–178. DOI: 10.52041/serj.v17i2.164.

Appendix A

Table A1 Framework for Statistical Literacy Skills from the Perspective of Critical Thinking (Koga Citation2022a, p. 65)

Appendix B

Table B1 Outline of the Lessons

Appendix C

Table C1 Test Evaluation Perspectives