
Validating a modified instrument for measuring Demand-Control-Support among students at a large university in southern Sweden

Article: 2226913 | Received 07 Mar 2023, Accepted 14 Jun 2023, Published online: 26 Jun 2023

ABSTRACT

Background

University students experience a distinct working environment in the context of completing their studies. In line with existing research into the connection between workplace environment and stress, it is rational to believe that such study environments can affect the level of stress that students experience. However, few instruments have been developed for measuring this.

Objective

The aim of this study was to validate a modified instrument based on the Demand-Control-Support (DCS) model among students at a large university in southern Sweden to determine its utility for assessing the psychosocial properties of the study environment.

Methods

Data from a survey performed at a Swedish university in 2019, which generated 8960 valid cases, were used. Of these cases, 5410 were studying a course or programme at bachelor’s level, 3170 at master’s level, and 366 a combination of courses and programmes at the two levels (14 missing). A 22-item DCS-instrument for students was used, comprising four scales: psychological workload (demand) with nine items, decision latitude (control) with eight items, supervisor/lecturer support with four items, and colleague/student support with three items. Construct validity was examined using exploratory factor analysis (EFA) and internal consistency using Cronbach’s alpha.

Results

The results of the exploratory factor analysis of the Demand-Control components support a 3-dimension solution with dimensions corresponding to psychological demands, skill discretion, and decision authority in the original DCS model. Cronbach’s alpha coefficients were acceptable for Control (0.60) and Student Support (0.72) and very good for the Demand and Supervisor Support scales (0.81 and 0.84, respectively).

Conclusions

The results suggest that the validated 22-item DCS-instrument is a reliable and valid tool for assessing Demand, Control, and Support elements of the psychosocial study environment among student populations. Further research is necessary to examine the predictive validity of this modified instrument.

Responsible Editor

Stig Wall

Introduction

Universities can be understood as workplaces for students, as well as complex sites for study and social interaction. Students can therefore have specific expectations regarding the psychosocial workplace environment and its protections. In this context, the psychosocial workplace environment can be understood as the combination of social and psychological job characteristics [Citation1]. It is rational to believe that such study environments can affect the level of stress that students experience, and existing research has shown that stress is a common occurrence among this population and is associated with a number of negative outcomes for students’ academic performance [Citation2]. Before action can be taken to address these high levels of stress, appropriate tools must exist to measure the psychosocial conditions of the study environment. To date, however, few instruments have been developed and validated for this purpose.

In traditional workplaces, the Job Demand-Control model (or job strain model) is one of the most influential and widely used models of occupational stress, appearing in numerous publications examining connections to, among other outcomes, psychological wellbeing [Citation3], cardiovascular disease and blood pressure [Citation4], and musculoskeletal disorders [Citation5]. First developed by Karasek in 1979 using data from Sweden and the USA, the model postulates that work stress primarily arises from the interaction between the psychological demands of work and a lack of decision latitude, that is, the freedom that allows employees to make their own decisions and enhance job satisfaction, commonly labelled ‘control’ [Citation6]. In 1988, the model was expanded to acknowledge the importance of social support as a potential buffer or, in its absence, as an additional stressor [Citation7]. This expanded model is referred to as the Demand-Control-Support (DCS) model in this article.
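To make the Demand-Control interaction concrete, the sketch below classifies a respondent into the four quadrants commonly derived from the model (high strain, active, low strain, passive). It is an illustration only, written in Python; the function name, the cut-off values, and the use of simple median-style splits are assumptions rather than part of the original model or of this study.

```python
def job_strain_quadrant(demand: float, control: float,
                        demand_cutoff: float, control_cutoff: float) -> str:
    """Classify a respondent into the four Demand-Control quadrants.

    The cut-offs are hypothetical; median splits of the scale scores are
    one common convention in the job-strain literature.
    """
    high_demand = demand >= demand_cutoff
    high_control = control >= control_cutoff
    if high_demand and not high_control:
        return "high strain"   # the combination the model links most strongly to stress
    if high_demand and high_control:
        return "active"
    if high_control:
        return "low strain"
    return "passive"           # low demands, low control


# Example with made-up scale means on a 1-4 response format
print(job_strain_quadrant(demand=3.2, control=2.1,
                          demand_cutoff=2.5, control_cutoff=2.5))  # -> high strain
```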

The DCS model has been examined in both cross-sectional and longitudinal research [Citation3]. Results of these studies have been mixed, supporting both the additive [Citation3] and buffer hypotheses [Citation8]. In a large systematic review and meta-analysis of work environment and depressive symptoms research, Theorell et al. concluded that there is ‘substantial empirical evidence between lack of decision latitude, job-strain, and bullying and depressive symptoms’ [Citation9]. The findings indicated moderate evidence (the highest possible level for research in this study according to the GRADE criteria [Citation10] as no randomised trials were included) for control/decision latitude as protective factors against depressive symptoms, as well as for job strain (high demands and low control) as a harmful factor associated with depressive symptoms. Limited evidence was found for the relationship between psychological demands, passive jobs (low control, low demands), high-pressure jobs, and low support at the workplace, in relation to depressive symptoms [Citation9].

The DCS model is often operationalised by the Job Content Questionnaire (JCQ) and the shorter Demand Control Support questionnaire (DCSQ), both self-administered instruments.

The JCQ has been validated in, among other languages, Swedish, Norwegian, German, and English [Citation11–13]. The JCQ and the shorter DCSQ are general instruments used across a variety of occupations, including different groups of white-collar employees [Citation13], teachers [Citation14], firefighters [Citation15], and nurses and other health care workers [Citation16], to examine the psychosocial work environment and associated outcomes in stress, psychological health, and burnout. Although sometimes used for specific work categories, the strength of the JCQ instrument is its ability to be applied across occupations and heterogeneous populations [Citation1].

Less research has been conducted utilising the Demand-Control-Support (DCS) model to examine the psychosocial study environment of university students’ ‘workplace’. Existing studies focus on internships and work placements that resemble traditional workplaces [Citation17] or are smaller experimental studies that do not address the study environment as a whole [Citation18]. Previous studies share a common result: a negative correlation between decision latitude and perceived demands, which is unexpected when compared to the original model assumptions [Citation18,Citation19]. Where studies have used the DCS model in student settings, they use a variety of different scales drawn from different instruments [Citation20]. In some studies this includes using a version of the JCQ (23 items) that may have ‘left out some specific features related to the academic context’ as no students were involved in defining the most important areas [Citation21] or instruments based on the DCS model but with only 2–3 items per variable [Citation19].

Two studies have gone further in adapting versions of the JCQ for the student context. In a study by Schéle and colleagues among 322 dental students undergoing clinical training at four universities in Sweden [Citation22], a version of the JCQ adapted for dentistry students was used to examine environmental and individual characteristics related to stress. The study concluded that the psychosocial work environment of the students included in the study produced high levels of perceived stress [Citation23]. This instrument had been adapted for a specific set of students and had not been validated. A study conducted among 146 psychology students at two universities in Germany also utilised an adapted version of the JCQ [Citation24]. This version had been pretested among students and its internal consistency had been measured, but no factor analysis was conducted.

In 2018, Lund University initiated the Tellus Project concerning sexual harassment among students and employees of the university. The project used a cross-sectional study design and an online questionnaire for data collection. This questionnaire included questions about study environment demand, control and support. The instrument used is a version of the instrument developed by Schéle and colleagues [Citation22] that had been modified by researchers at Lund University in collaboration with students and student organisations. In this article, this instrument is named the DCS-instrument.

The aim of the current study was to examine the construct validity and internal consistency of the Demand-Control-Support (DCS)-instrument among students at Lund University, Sweden, to determine whether it is an appropriate tool for measuring the psychosocial study environment in this setting.

Methods

Study design and data collection

A population-based cross-sectional study was conducted among students at Lund University as part of the Tellus project. The Tellus project is a three-year, research-based project concerning sexual harassment among students and employees at Lund University, a public university in southern Sweden with eight faculties and approximately 40,000 students.

All undergraduate and graduate students registered for studies during the autumn term of 2019 were invited to participate in an online self-administered questionnaire via their registered email addresses. The email text contained information about the study and contact details for those responsible. It also contained a link to the web-based questionnaire available in Swedish and English. Prior to answering the questionnaire, participants were asked to provide consent.

The survey instrument was developed based on a literature search and information gathered through a series of seven focus group discussions and 20 individual interviews with students at the university. The instrument had 117 questions divided into eight sections. These sections included sexual harassment, study environment, health, trust, and confidence as well as experiences of other types of harassment and derogatory treatment. Ethical approval was received from the Swedish Ethical Review Authority (number 2018/350).

Study measures

Demand-Control-Support instrument

The instrument used in this study is a modified version of the Job Content Questionnaire (JCQ) that was adapted for assessing dentistry students’ study environment by Swedish researchers [Citation22]. This instrument is named the DCS-instrument.

As our study included students from all faculties of Lund University, the instrument adapted by Schéle and colleagues was deemed to be too narrow in its formulation. Therefore, the authors tested the questions with students and representatives of student organisations to optimise question interpretation. Priority was given to the original formulation in the JCQ wherever differences occurred.

Based on this ‘face validity’ testing, some modifications were made. First, the instrument was shortened to maintain the balance among decision authority, skill discretion, psychological demands, and supervisor and co-worker support found in the original scale. Based on feedback from the students, two items were removed from the scales: one from psychological demands (My studies require me to learn new things), as this was taken to be a general expectation in a university setting, and one from student support (My fellow students are friendly), as there was a lack of consensus and conceptual clarity on this question. Other questions were reworded to bring them in line with the academic environment and student interpretation. A flowchart detailing this process is found in Figure 1. The final DCS-instrument tested in this paper comprised nine items to measure demands, eight items for control, four items for supervisor/lecturer support, and three items for colleague/student support. Each item was answered on a 4-point Likert-type scale. The final modified version of the instrument is shown in Figure 2.
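As an illustration of how responses to the final 22-item instrument could be aggregated into scale scores, the following Python sketch sums each respondent’s 4-point item responses per scale. The column names are hypothetical placeholders (the actual item wordings appear in Figure 2), and the paper does not prescribe this particular scoring code.

```python
import pandas as pd

# Hypothetical column names standing in for the 22 items in Figure 2.
SCALES = {
    "demand": [f"demand_{i}" for i in range(1, 10)],          # 9 items
    "control": [f"control_{i}" for i in range(1, 9)],          # 8 items
    "supervisor_support": [f"sup_{i}" for i in range(1, 5)],   # 4 items
    "student_support": [f"stud_{i}" for i in range(1, 4)],     # 3 items
}


def scale_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Sum each respondent's 4-point item responses into the four scale scores."""
    return pd.DataFrame({name: responses[items].sum(axis=1)
                         for name, items in SCALES.items()})
```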

Figure 1. Flowchart of changes made to the instrument during the study.


Figure 2. Modified 22-item Demand-Control-Support instrument (English version) for measuring psychosocial study environment and shortened question forms used in this article.


Background variables

In this study, Gender Identity was assessed with two questions: ‘What gender were you assigned at birth?’ and ‘What is your current gender identity?’. The second question had three options: female, male, and I do not identify as female or male. The answer to ‘current gender identity’ was used where provided, with ‘gender assigned at birth’ used for those cases without an answer for current gender identity. Respondent’s Age was recorded as ‘18–25’, ‘26–30’, ‘31–40’ and ‘41 years or older’, and their Country of birth was assessed as ‘Sweden’, ‘In a Nordic country (not Sweden)’, ‘Europe (not a Nordic country)’ or ‘Outside of Europe’. The number of Semesters studied at Lund University by the respondent was recorded using the question ‘How many semesters have you studied at Lund University in total? (Including the current semester)’, with the options ‘0–1’, ‘2–3’, ‘4–5’, ‘6–7’, ‘8–9’, ’10–11’ and ‘More than 11’.
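A minimal Python sketch of the gender-recoding rule described above is shown below. The column names are assumptions, and the actual data preparation for the study may have been carried out differently (for example in Stata).

```python
import pandas as pd


def derive_gender(df: pd.DataFrame) -> pd.Series:
    """Use 'current gender identity' where answered; otherwise fall back to
    'gender assigned at birth'. Column names are hypothetical."""
    return df["current_gender_identity"].fillna(df["gender_assigned_at_birth"])
```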

Statistical analysis

Two parameters for validity were tested in this study: construct validity and internal consistency.

Construct validity was assessed to examine dimensionality, that is, the extent to which the items measured the intended sub-scales of the construct. With reference to the underlying constructs of the DCS model, and in recognition that significant changes had been made to the instrument warranting a more unrestricted analysis, an exploratory factor analysis (EFA) using principal factor extraction and varimax rotation was selected. Internal consistency was evaluated using Cronbach’s alpha and item-total correlations to measure reliability. Statistical analyses were conducted using Stata 16 [Citation25].

Results

Socio-demographic characteristics

A total response rate of 32% was achieved. Respondents with data missing for sex and gender (N = 69), age (N = 46), questions on sexual harassment (N = 74), or one or more items of the DCS-instrument (N = 707) were excluded from the analyses. Data collection was part of a broader project on sexual harassment at Lund University, and the questionnaire was developed with this focus; in this context, having answered these questions was considered important for inclusion in the data set. This resulted in a study population of 8960 respondents. A simple non-response analysis showed strong similarities between the study participants and the total population in a number of key characteristics [Citation26]. Characteristics of the study sample are provided in Table 1.

Table 1. Prevalence of socio-demographic factors among study sample of Lund University students who had responded to all items in the 22-item DCS-instrument (N = 8960).

Construct validity

Prior to conducting the exploratory factor analysis, a series of methodological decisions were made following the steps outlined by Watkins [Citation27], to ensure the correct application of methods and increase interpretability.

Various authors recommend including a minimum of 3–6 variables for each common factor [Citation28]. With nine and eight variables for demand and control, respectively, all items were included in the initial analysis. When considering sample size, allowance should be made for the absolute number of participants as well as the ratio of participants to measured variables [Citation27]. Following Comrey and Lee [Citation28], an absolute number of 1000 and a ratio of 20:1 or higher are considered excellent. The current study had 8960 participants and a ratio of 373:1. As EFA assumes an underlying normal distribution, univariate skew and kurtosis were measured. A normal distribution has skew = 0 and kurtosis = 0, and in this data set the ranges were low (−0.70 to 0.71 and −1.1 to 0.1, respectively). Using the ‘standard’ cut-offs of |skew| > 2 and kurtosis > 7 as indicators of univariate non-normality [Citation29], the results suggest that these data exhibit acceptable univariate normality. Multivariate normality was tested with Mardia’s skew. The test produced a multivariate skewness of 5.69, suggesting some departure from normality (reference value 0).
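For readers wishing to apply the same screening to their own data, the Python sketch below computes per-item skew and excess kurtosis and flags items against the cut-offs cited above. It runs on simulated Likert-type data rather than the study data, and the function is an illustrative assumption, not the routine used in the paper.

```python
import numpy as np
from scipy import stats


def univariate_normality_check(X: np.ndarray) -> dict:
    """Per-item skew and excess kurtosis (both 0 under normality), with items
    flagged using the |skew| > 2 and kurtosis > 7 cut-offs cited in the text."""
    skew = stats.skew(X, axis=0)
    kurt = stats.kurtosis(X, axis=0)  # Fisher definition: normal distribution -> 0
    return {
        "skew_range": (float(skew.min()), float(skew.max())),
        "kurtosis_range": (float(kurt.min()), float(kurt.max())),
        "flagged_items": np.where((np.abs(skew) > 2) | (kurt > 7))[0].tolist(),
    }


# Simulated 4-point responses as a stand-in for the 8960 x 17 item matrix
rng = np.random.default_rng(0)
simulated = rng.integers(1, 5, size=(8960, 17)).astype(float)
print(univariate_normality_check(simulated))
```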

Bartlett’s test of sphericity and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy were used to ascertain whether the dataset was appropriate for data reduction techniques such as EFA. As the p-value of Bartlett’s test of sphericity (p < 0.001) was below the 0.05 significance level used in this paper, the dataset was considered suitable for data reduction techniques such as EFA [Citation30]. In addition, the KMO measure of sampling adequacy (0.851) was above the commonly recommended value of 0.7 [Citation31], and thus factor analysis was considered suitable for these items. Due to the presence of low-level multivariate non-normality (as measured by Mardia’s kurtosis of 341.73 compared with a reference value of 323), principal factor extraction was chosen as the extraction method [Citation32]. This choice is supported by findings in previous research, for example de Winter and Dodou [Citation33].
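These suitability checks can be approximated outside Stata. The sketch below uses the Python factor_analyzer package (an assumption made for illustration, not the software used in the study) to compute Bartlett’s test of sphericity and the KMO measure on simulated stand-in data; with such random data the statistics will of course not reproduce the values reported above.

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Simulated stand-in for the 17 Demand/Control items (the study data are not public)
rng = np.random.default_rng(1)
items = pd.DataFrame(rng.integers(1, 5, size=(8960, 17)).astype(float),
                     columns=[f"item_{i}" for i in range(1, 18)])

chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)

# Decision rules used in the paper: Bartlett p < 0.05 and KMO > 0.7
print(f"Bartlett p = {p_value:.4f}, KMO = {kmo_total:.3f}")
suitable_for_efa = (p_value < 0.05) and (kmo_total > 0.7)
```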

Although several methods exist to guide selection of the number of factors to retain in the analysis, no ideal method has been identified [Citation34]. Thus, parallel analysis (four factors), minimum average partial correlation (four factors), and scree plot examination (four factors) were conducted to aid the selection of the number of factors to retain [Citation34]. Based on these findings and the theoretical assumptions of the demand-control model, two- and three-factor models were selected for theoretical interpretability and best fit. Previous studies have shown that Control can sometimes be separated into skill discretion and decision authority, which is the foundation of the 3-factor model.
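As an illustration of one of these retention criteria, the Python sketch below implements the widely used PCA-eigenvalue variant of Horn’s parallel analysis: factors are retained while the observed eigenvalues exceed the mean eigenvalues obtained from random data of the same shape. This is a generic sketch under those assumptions, not the exact routine used in the study.

```python
import numpy as np


def parallel_analysis(X: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Return the number of factors whose observed eigenvalues exceed the mean
    eigenvalues from random normal data of the same dimensions (Horn's method)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    observed = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]  # descending
    random_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        R = rng.standard_normal((n, p))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    threshold = random_eigs.mean(axis=0)
    return int(np.sum(observed > threshold))
```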

Both varimax (orthogonal) and promax (oblique) rotations were tested, and the results were compared. Varimax was included because of its common use in similar research [Citation35], and promax because it is an oblique modification of the varimax procedure [Citation36]. Based on the similarity of the results, the theoretical assumption of a lack of correlation between the factors, and the aim of improving ease of interpretation [Citation37], the decision was made to apply varimax rotation.

An initial EFA using principal factor extraction and varimax rotation was conducted for the 17 Demand and Control items in two- and three-factor models consecutively. Items with factor loadings of 0.3 or higher were considered to load adequately, indicating moderate correlation [Citation38].
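A hedged Python approximation of this step is sketched below, using the factor_analyzer package with principal-factor extraction and varimax rotation, and blanking loadings below the 0.3 threshold. The study itself ran the analysis in Stata, and the data frame of Demand/Control items passed in here is assumed rather than taken from the paper.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer


def run_efa(items: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Exploratory factor analysis with principal-factor extraction and varimax
    rotation; loadings with absolute value below 0.3 are blanked for readability."""
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="varimax")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                            columns=[f"factor_{k + 1}" for k in range(n_factors)])
    return loadings.where(loadings.abs() >= 0.3)


# e.g. run_efa(demand_control_items, n_factors=2) and n_factors=3 for the two models
```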

In the two-factor model, the analysis grouped items connected with ‘Control’ under factor 2, except for ‘Requires skills’, which showed adequate factor loading (0.6) on factor 1. Factor 1 grouped items connected with ‘Demand’, except for ‘Clear expectations’ and ‘Complete assignments first’. ‘Clear expectations’ had adequate loading (0.5) on factor 2, and ‘Complete assignments first’ did not have adequate factor loading on either factor.

The 3-factor solution showed a clear meaning for the third factor within the ‘Decision authority’ area of ‘Control’. Two of the three items showed strong loading, while the third, ‘Lots to say’, did not load on this factor. As with the two-factor model, the items grouped under factor 2 connected strongly with skill discretion, except for ‘Requires skills’, which loaded onto factor 1 (factor loading 0.6). Among the items expected to form factor 1, ‘Clear expectations’ and ‘Complete assignments first’ continued to load onto other factors or showed insufficient loading.

Removal of two items

After reviewing the preliminary results of the factor analysis and Cronbach’s alpha, items 13 (‘The expectations my education places on me are clear regardless of where the expectations come from’) and 17 (‘I often complete my assignments before my classmates’) were removed. Although some items were associated with low Cronbach’s alpha values, the combination of this analysis and the factor analysis led to the decision to retain all other items. Only the 3-factor model was retained, shown in Table 2.

Table 2. Exploratory factor analysis using varimax rotation for 15 items of the DCS-instrument adapted for the study environment at Lund University (n = 8960). Three-factor solution shown. Only loadings > 0.3 are shown.

The results of the revised scales show modest improvements in the factor loadings on factor 3 when compared with the results of the 17-item instrument. The item ‘Requires knowledge/skills’ loaded onto factor 1 rather than the expected factor 2, while ‘Opportunities for opinion’ showed loading on factor 2 instead of the expected loading on factor 3. All items in the Demand scale showed adequate loading on the expected factor.

Internal consistency

Cronbach’s alpha coefficients were calculated to assess internal consistency for the three scales (Demand, Control, and Support). Internal consistency describes the extent to which all test items measure the same concept or construct and is based on the inter-relatedness of the items within the test [Citation39]. The item-test correlation shows the extent to which each item is correlated with the overall scale, and the item-rest correlation shows how correlated an item is with the scale computed from all other items. Table 3 shows the results of this analysis.

Table 3. Item-test and item rest correlations and Cronbach’s alpha coefficients for items in the modified 22-item DCS-instrument among students at Lund University (N = 8960).

Item-test, item-rest and Cronbach’s alpha coefficients for the 22-item DCS-instrument are given in Table 3. Values for item-rest correlations for the psychological demand scale were all above 0.5 (range 0.50–0.72). This indicates a good correlation with the other items comprising the overall scale score when compared with the rule of thumb that item-rest correlations should be 0.4 or higher [Citation34]. Items in the ‘Control’ scale showed weaker correlations, while the scales for supervisor and student support showed high correlations (ranges 0.46–0.62 and 0.60–0.73, respectively).

Following the general rule that a Cronbach’s alpha of 0.6–0.7 is acceptable and 0.8 is very good [Citation40], the Cronbach’s alpha coefficients were acceptable for control (0.60) and student support (0.72) and very good for demand (0.81) and supervisor support (0.84).
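For completeness, the Python sketch below shows how Cronbach’s alpha and item-rest correlations of the kind reported in Table 3 can be computed. It is a generic illustration; the item data frame and column names are assumptions, not the study’s actual code.

```python
import pandas as pd


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


def item_rest_correlations(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the scale score computed from all other items."""
    return pd.Series({col: items[col].corr(items.drop(columns=col).sum(axis=1))
                      for col in items.columns})


# e.g. cronbach_alpha(demand_items) and item_rest_correlations(demand_items),
# where demand_items holds the nine Demand items (hypothetical variable name).
```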

Discussion

The purpose of this study was to validate the 22-item DCS-instrument through construct validity and internal consistency to examine whether it could be used to assess the psychosocial study environment of university students. To our knowledge, no validated instrument exists to assess strain related to demand, control, and support in university students’ study environment. Since the Demand-Control-Support model is a well-established international theory utilised in many studies, validation of a version for university students would allow comparisons across universities and countries.

After removing two variables from the DCS-instrument used in the Tellus survey, resulting in the 22-item version, the results of the exploratory factor analysis support a 3-dimension solution to the instrument with dimensions corresponding to the psychological demands, skill discretion, and decision authority aspects of the original instrument, although the decision authority solution was supported by only two items. All items in the psychological demand scale loaded adequately onto the same factor, while one item each in the skill discretion and decision authority sub-scales showed insufficient loading on the expected factor. Previous research has shown higher correlations between skill discretion and decision authority, and more homogeneity in these concepts [Citation41]. Having items that load between these two factors, as is the case with ‘Opportunities for opinion’, is therefore in line with the theoretical underpinnings of this model.

Cronbach’s alpha coefficients were acceptable for Control (0.60) and Student support (0.72) and very good for the Demand and Supervisor support scales (0.81 and 0.84, respectively). The results of the validation suggest that the modified 22-item instrument is reliable and valid for assessing Demand, Control, and Support dimensions of the study environment in student populations at universities in Sweden.

Two additional items could have been removed: ‘Requires knowledge/skills’ and ‘Opportunities for opinion’, as these did not load adequately onto the expected factors. Considering the minimal improvement in Cronbach’s alpha that their removal would have offered, and the tradition of keeping the instrument as close as possible to the original JCQ to allow for comparisons, the decision was made to retain them.

Few comparable instruments for measuring Demand-Control-Support have been used to examine psychosocial study characteristics among university students. In one study among dental students in Sweden, Schéle et al. utilised a modified version of the JCQ instrument, which is the basis for the instrument used in the present study [Citation22]. Although that instrument was not validated, in later research its results compared favourably with other scales measuring environmental stress [Citation23]. Schmidt and co-authors conducted research at two German universities on Demand-Control, stress, and neuroticism using an adapted variant of the JCQ [Citation24]. Their instrument was pre-tested with a group of students, and internal consistency was established with Cronbach’s alpha. In their study, the Control and Demand dimensions had alpha scores of 0.80 and 0.77, respectively, which is similar to our result for Psychological demands but substantially higher than our result for Control [Citation24]. No evidence could be found of construct validity having been tested.

Existing research has been conducted using either general instruments designed for traditional workplaces [Citation17] or modified instruments designed for a relatively narrow group of students (Dentistry students during their clinical work [Citation22] and psychology students [Citation24]). This study validates a more universally applicable instrument that has removed specialised items related to clinical work compared to the instrument used for dentistry students [Citation22].

Strengths and limitations

A strength of this study is the large sample from a public university covering the full range of university studies, which supports the generalisability of the results, together with the implementation of data collection in close cooperation with student representatives and the university management. The large sample also made it possible to apply the evaluation tools in a robust manner. Four key limitations were identified in this study.

The first limitation is the possibility of selection bias. This can take the form of self-selection, whereby individuals select themselves for the survey on the basis of factors correlated with the measures of interest that in turn bias the results [Citation42]. As this instrument was part of a questionnaire on experiences of sexual harassment, it is possible that those with experiences of sexual harassment could be overrepresented, especially considering the timing of the survey soon after the #metoo movement. However, with regard to questions on study-related stress, there is little reason to believe that this would have a systematic effect on the results.

The second limitation relates to the inability to differentiate between responses to the English and the Swedish versions of the survey. As the instrument was new in both languages, there could be differences in how the questions were interpreted. To minimise this, the translation was carried out by a native speaker, and back translation was used to examine differences in interpretation. Despite this, there could still be differences in how respondents interpret the items in the two versions.

The third limitation is related to the relatively low Cronbach’s alpha coefficient for the control scale. Although this figure falls within the acceptable range [Citation40], it is relatively low compared to the other scales and thus the internal consistency could be questioned.

The fourth limitation is the low number of items that load onto factor three (i.e. the decision authority dimension). Although two items can sometimes be used to identify a factor [Citation27], three are generally needed for statistical identification [Citation35].

Having validated this instrument among a student population will allow future research into the study environment and its associations with other outcomes to be conducted, and comparisons made across settings. One area for future research would be an examination of the predictive validity of this DCS-instrument.

Conclusions

In conclusion, the findings of this study indicate that psychological demands, decision latitude, and support from fellow students or teachers can be described in a valid and reliable manner in a university study setting by a modified version of an instrument based on the DCS model, which has been extensively used in research on psychosocial factors and health in the general workforce.

Author contributions

JP, AA, and P-OÖ conceptualised the study and the data collection instrument. JP and AA performed data collection. JP and P-OÖ conducted preliminary analyses. All authors participated in writing the manuscript and approved the final version.

Ethics and consent

Ethical approval was received from the Swedish Ethical Review Authority (number 2018/350). Consent was provided by study participants through completion of the survey instrument.

Paper context

Universities are frequently treated as workplaces for students, although they also represent a unique context. Existing research shows the importance of occupational stress for a range of physical and psychological outcomes. No validated instrument exists to measure this phenomenon among university students. This article validates a modified Demand-Control-Support instrument for student populations, examining internal consistency and construct validity. The resulting instrument can be used to examine the psychosocial study environment of university students in different contexts.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This study was funded by the Swedish Research Council Grant number 2018-02457.

References

  • Karasek R, Brisson C, Kawakami N, Houtman I, Bongers P, Amick B. The Job Content Questionnaire (JCQ): an instrument for internationally comparative assessments of psychosocial job characteristics. J Occup Health Psychol. 1998;3:322–9. doi: 10.1037/1076-8998.3.4.322
  • Wunsch K, Fiedler J, Bachert P, Woll A. The tridirectional relationship among physical activity, stress, and academic performance in university students: a systematic review and meta-analysis. Int J Environ Res Public Health. 2021;18:739. doi: 10.3390/ijerph18020739
  • van der Doef M, Maes S. The job demand-control (-support) model and psychological well-being: a review of 20 years of empirical research. Work Stress. 1999;13:87–114. doi: 10.1080/026783799296084
  • Landsbergis PA, Dobson M, Koutsouras G, Schnall P. Job strain and ambulatory blood pressure: a meta-analysis and systematic review. Am J Public Health. 2013;103:e61–e71. Epub 2013/01/17. doi: 10.2105/AJPH.2012.301153.
  • Bongers PM, Kremer AM, Ter Laak J. Are psychosocial factors, risk factors for symptoms and signs of the shoulder, elbow, or hand/wrist?: a review of the epidemiological literature. Am J Ind Med. 2002;41:315–342. doi: 10.1002/ajim.10050
  • Karasek RA. Job demands, job decision latitude, and mental strain: implications for job redesign. Administrative Sci Q. 1979;24:285–308. doi: 10.2307/2392498
  • Johnson JV, Hall EM. Job strain, work place social support, and cardiovascular disease: a cross-sectional study of a random sample of the Swedish working population. Am J Public Health. 1988;78:1336–1342. doi: 10.2105/AJPH.78.10.1336
  • Fox ML, Dwyer DJ, Ganster DC. Effects of stressful job demands and control on physiological and attitudinal outcomes in a hospital setting. Acad Manag J. 1993;36:289–318. PMID: 10125121. doi: 10.2307/256524
  • Theorell T, Hammarstrom A, Aronsson G, Traskman Bendz L, Grape T, Hogstedt C, et al. A systematic review including meta-analysis of work environment and depressive symptoms. BMC Public Health. 2015;15:738. Epub 2015/08/02. doi: 10.1186/s12889-015-1954-4
  • Guyatt GH, Oxman AD, Vist G, Kunz R, Brozek J, Alonso-Coello P, et al. GRADE guidelines: 4. Rating the quality of evidence–study limitations (risk of bias). J Clin Epidemiol. 2011;64:407–415. Epub 20110119. doi: 10.1016/j.jclinepi.2010.07.017
  • Chungkham HS, Ingre M, Karasek R, Westerlund H, Theorell T, Xia Y. Factor structure and longitudinal measurement invariance of the demand control support model: an evidence from the Swedish Longitudinal Occupational Survey of Health (SLOSH). PLoS One. 2013;8:e70541. Epub 20130812. doi: 10.1371/journal.pone.0070541
  • Sanne B, Torp S, Mykletun A, Dahl AA. The Swedish Demand-Control-Support Questionnaire (DCSQ): factor structure, item analyses, and internal consistency in a large population. Scand J Public Health. 2005;33:166–174. doi: 10.1080/14034940410019217
  • Mauss D, Herr RM, Theorell T, Angerer P, Li J. Validating the demand control support questionnaire among white-collar employees in Switzerland and the United States. J Occup Med Toxicol. 2018;13:7. doi: 10.1186/s12995-018-0188-7
  • Brouwers A, Tomic W, Boluijt H. Job demands, job control, social support and self-efficacy beliefs as determinants of burnout among physical education teachers. Eur J Psychol. 2011;7:17–39. doi: 10.5964/ejop.v7i1.103
  • Lourel M, Abdellaoui S, Chevaleyre S, Paltrier M, Gana K. Relationships between psychological job demands, job control and burnout among firefighters. North Am J Psychol. 2008;10:489–495.
  • Roelen CAM, van Hoffen MFA, Twisk JWR, Waage S, Bjorvatn B, Pallesen S, et al. Psychosocial work environment and mental health-related long-term sickness absence among nurses. Int Arch Occup Environ Health. 2018;91:195–203. doi: 10.1007/s00420-017-1268-1
  • Bakker EJM, Roelofs PDDM, Kox JHAM, Miedema HS, Francke AL, van der Beek AJ, et al. Psychosocial work characteristics associated with distress and intention to leave nursing education among students; a one-year follow-up study. Nurse Educ Today. 2021;101:104853. doi: 10.1016/j.nedt.2021.104853
  • Flynn N, James JE. Relative effects of demand and control on task-related cardiovascular reactivity, task perceptions, performance accuracy, and mood. Int J Psychophysiol. 2009;72:217–227. doi: 10.1016/j.ijpsycho.2008.12.006
  • Tuomi J, Aimala A-M, Plazar N, Starčič AI, Žvanut B. Students’ well-being in nursing undergraduate education. Nurse Educ Today. 2013;33:692–697. doi: 10.1016/j.nedt.2013.02.013
  • Kim S, Kim H, Park EH, Kim B, Lee SM, Kim B. Applying the demand–control–support model on burnout in students: a meta-analysis. Psychol Schools. 2021;58:2130–2147. doi: 10.1002/pits.22581
  • Chambel MJ, Curral L. Stress in academic life: work characteristics as predictors of student well-being and performance. Appl Psychol. 2005;54:135–147. doi: 10.1111/j.1464-0597.2005.00200.x
  • Schéle I. Gendered experiences of work environment: a study of stress and ambiguity among dental students in Sweden [doctoral thesis, comprehensive summary]. Umeå: Umeå universitet; 2011.
  • Schéle IA, Hedman LR, Hammarström A. A model of psychosocial work environment, stress, and satisfaction among dental students in Sweden. J Dent Educ. 2012;76:1206–1217. doi: 10.1002/j.0022-0337.2012.76.9.tb05376.x
  • Schmidt LI, Sieverding M, Scheiter F, Obergfell J. Predicting and explaining students’ stress with the demand–control model: does neuroticism also matter? Educ Psychol. 2015;35:449–465. doi: 10.1080/01443410.2013.857010
  • StataCorp. Stata statistical software: release 16. College Station (TX): StataCorp LP; 2019.
  • Agardh A, Priebe G, Emmelin M, Palmieri J, Andersson U, Östergren PO. Sexual harassment among employees and students at a large Swedish university: who are exposed, to what, by whom and where – a cross-sectional prevalence study. BMC Public Health. 2022;22:2240. doi: 10.1186/s12889-022-14502-0
  • Watkins MW. A step-by-step guide to exploratory factor analysis with Stata. 1st ed. Routledge; 2021. doi: 10.4324/9781003149286
  • Comrey AL, Lee HB. A first course in factor analysis. 2nd ed. Erlbaum; 1992. doi: 10.4324/9781315827506
  • Curran PJ, West SG, Finch JF. The robustness of test statistics to nonnormality and specification error in confirmatory factor analysis. Psychol Methods. 1996;1:16–29. doi: 10.1037/1082-989X.1.1.16
  • Tobias S, Carlson JE. Brief report: bartlett’s test of sphericity and chance of findings in factor analysis. Multivar Behav Res. 1969;4:375–377. doi: 10.1207/s15327906mbr0403_8
  • James BH, Gregory JM. Exploratory factor analysis: basics and beyond. Hoboken, NJ, USA: John Wiley & Sons, Inc.; 2012. doi: 10.1002/9781118133880.hop202006
  • Tabachnick BG, Fidell LS. Using multivariate statistics. 7th ed. NY: Pearson; 2019.
  • de Winter JCF, Dodou D. Factor recovery by principal axis factoring and maximum likelihood factor analysis as a function of factor pattern and sample size. J Appl Stat. 2012;39:695–710. doi: 10.1080/02664763.2011.610445
  • Bandalos DL. Measurement theory and applications for the social sciences. New York: The Guilford Press; 2018.
  • Child D. The essentials of factor analysis. 3rd ed. London: Continuum; 2006.
  • Gorsuch RL. Factor analysis. 2nd ed. Hillsdale (NJ): Erlbaum; 1983.
  • Mertler CA, Vannatta RA. Advanced and multivariate statistical methods: practical application and interpretation. Los Angeles: Pyrczak Publishing; 2001.
  • Tavakol M, Wetzel A. Factor analysis: a means for theory and instrument development in support of construct validity. Int J Med Educ. 2020;11:245–247. Epub 20201106. doi: 10.5116/ijme.5f96.0f4a
  • Tavakol M, Dennick R. Making sense of Cronbach’s alpha. Int J Med Educ. 2011;2:53–55. doi: 10.5116/ijme.4dfb.8dfd
  • Ursachi G, Horodnic IA, Zait A. How reliable are measurement scales? External factors with indirect influence on reliability estimators. Procedia Econ Finance. 2015;20:679–686. doi: 10.1016/S2212-5671(15)00123-9
  • Karasek R, Choi B, Ostergren P-O, Ferrario M, Smet PD. Testing two methods to create comparable scale scores between the job content questionnaire (JCQ) and JCQ-like questionnaires in the European JACE study. Int J Behav Med. 2007;14:189–201. doi: 10.1007/BF03002993
  • Freyd JJ. A plea to university researchers. J Trauma Dissoc. 2012;13:497–508. doi: 10.1080/15299732.2012.708613