Intervention, Evaluation, and Policy Studies

The Effect of Active Learning Professional Development Training on College Students’ Academic Outcomes

Pages 43-64 | Received 09 Aug 2021, Accepted 11 Nov 2022, Published online: 20 Dec 2022

Abstract

Growing literature documents the promise of active learning instruction in engaging students in college classrooms. Accordingly, faculty professional development (PD) programs on active learning have become increasingly popular in postsecondary institutions; yet, quantitative evidence on the effectiveness of these programs is limited. Using administrative data and an instructor fixed effects approach, we estimate the effect of an active learning PD program on student performance and persistence at a large public institution. Findings indicate that the training improved subsequent persistence in the same field. Using a subset of instructors whose instruction was observed by independent observers, we identify a positive association between training and implementation of active learning teaching practices. These findings provide suggestive evidence that active learning PD has the potential to improve student outcomes.

Acknowledgements

We thank Brian Sato, Andrea Aebersold, and Mayank Verma for helpful conversations and feedback. Ashley Harlow and Sabrina Solanki also provided helpful suggestions on the manuscript. All mistakes are our own.

Notes

1 As of Spring 2021, there were 278 instructors listed as either “in progress” or “completed.” When we limit the sample to instructors with training dates and certification information, the sample drops to 105.

2 Although the COPUS protocol was initially developed to observe STEM classroom instruction, it has also been used in observing non-STEM classrooms (Denaro et al., Citation2021).

3 For example, if the observer tallies 13 intervals in which the instructor lectured during a 50-minute course, we would say that the instructor lectured 52% of class time (13/25), since COPUS codes behaviors in 2-minute intervals and a 50-minute class therefore contains 25 observation intervals.
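The interval-share arithmetic in this note can be sketched as follows; the helper function and its parameter names are illustrative, assuming the standard 2-minute COPUS coding interval:

```python
# Minimal sketch of the COPUS proportion calculation described above.
# Assumption: behaviors are tallied once per 2-minute interval, so a
# 50-minute class yields 25 observation intervals.

def copus_proportion(tally_count, class_minutes, interval_minutes=2):
    """Share of class time during which a coded behavior was observed."""
    n_intervals = class_minutes // interval_minutes
    return tally_count / n_intervals

share = copus_proportion(13, 50)  # 13 of 25 intervals -> 0.52
```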

4 Because of the way the data were obtained, a relatively large proportion of courses in our sample were never taught by ALPD instructors. We conducted analyses limiting the sample to courses that were taught by both ALPD participants and non-participants and found that our results were similar regardless of this restriction.

5 A total of 250 classes between fall 2018 and winter 2020 were observed twice within the same term by independent observers affiliated with the Teaching and Learning Center and an additional 142 classes were observed once during this timeframe for a total of 392 classes. For classes that were observed twice, we averaged the classroom observation records.

6 To construct subsequent course achievement measures, we first looked at the entire course-taking record of each student and identified the next course within the same field for every course taken between fall 2016 and spring 2020, excluding summer terms. Repeat courses were excluded from next-course persistence and performance.
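The construction described in this note can be sketched in pure Python; the record layout, field labels, and course names below are illustrative assumptions, not the authors' actual schema:

```python
# Hypothetical sketch of the "next course in the same field" construction:
# within each student-field sequence (ordered by term, repeats excluded),
# the next course is simply the following entry.
from collections import defaultdict

def next_courses(records):
    """records: list of (student, term, field, course) enrollment rows.
    Returns {(student, course): next course in same field, or None},
    skipping repeat enrollments in a course already taken."""
    by_key = defaultdict(list)
    for student, term, field, course in sorted(records, key=lambda r: r[1]):
        seq = by_key[(student, field)]
        if course not in (c for _, c in seq):   # exclude repeat courses
            seq.append((term, course))
    out = {}
    for (student, _field), seq in by_key.items():
        for i, (_, course) in enumerate(seq):
            out[(student, course)] = seq[i + 1][1] if i + 1 < len(seq) else None
    return out

demo = [(1, 1, "MATH", "MATH1"), (1, 2, "MATH", "MATH2"),
        (1, 3, "MATH", "MATH1"),  # repeat enrollment, excluded
        (2, 1, "CHEM", "CHEM1"), (2, 2, "CHEM", "CHEM2")]
```

For the `demo` rows, student 1's next course after MATH1 is MATH2, and the term-3 repeat of MATH1 does not count as persistence.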

7 On average, the ALPD participants taught four credits per term whereas all instructors at this institution taught about two credits on average per term. In addition, ALPD participants taught fewer graduate courses and were more likely to teach undergraduate courses compared to the population of instructors.

8 We also conducted heterogeneity analyses to see whether the effect of the training differs depending on the infrastructure (i.e., whether the course is offered in an active learning classroom), the size of the class, whether the course is an upper or lower division course and whether the course is a STEM/non-STEM course. Classroom layouts, for example, can allow for easier adoption of active learning techniques such as in-class group activities (Beichner & Saul, Citation2003; Dori & Belcher, Citation2005). As such, if the class is offered in a classroom designed to facilitate active learning, instructors may be more effective in raising student performance. Similarly, class size is an important consideration that determines whether active learning is adopted, with smaller class sizes being more conducive to active learning implementation (Carbone & Greenberg, Citation1998; Freeman et al., Citation2014; Heim & Holt, Citation2018). We further explore whether the impact is concentrated among lower division courses given that a majority of studies examining teaching effectiveness in higher education have focused on introductory/lower division courses (Figlio et al., Citation2015; Xu & Solanki, Citation2020). In all instances, we did not find that the impact of ALPD training is moderated by the classroom infrastructure, class size, whether the course is an upper division course or lower division course, and whether the course is a STEM course. In particular, the standard errors of the tests for interaction are large given our sample size such that we cannot reject the null hypothesis of no interaction effect.

9 Supplemental Appendix Table 2 shows a breakdown of the next course that students in our sample took. About half of the courses are lower division courses and the other half are upper division courses. In addition, 56% of the courses students took in the next term are non-STEM while 44% are STEM courses. Eighty-four percent of the courses that students took as their next course were taught in small class settings (under 100 student seats per class), indicating that persistence effects are concentrated in small classes.

10 We may be concerned that ALPD participants and non-participants differ in their instructional approaches even in the absence of the training. Accordingly, we further conduct a pre- versus post-training comparison among ALPD participants only and control for all available covariates. The estimated coefficient shown in Supplemental Appendix Table 3 is positive (coefficient = 0.13, p = 0.187) and fairly comparable to the estimate shown in Table 3, column 2, which is based on the cross-sectional comparison. However, since only 71 ALPD participants have pre- and post-training classroom observation data, the sample size is too small to yield a precise estimate.

Additional information

Funding

This work was supported by the National Science Foundation under grant numbers 1612258 and 1565073.
