Research Article

Continuity and Change in Undergraduate Assessment in Sociology in the UK: A Response to Harrison and Mears

Abstract

Harrison and Mears draw a fundamental conclusion from their work: that there has been more continuity than change in undergraduate assessment in Sociology in the UK. That there has been some change is clear. However, the sense that change has been slower than hoped for, that it has been greater in the ‘post-1992s’ than the ‘pre-1992s’, and that the external examining system (one of the potential mechanisms for change) is to some extent bifurcated, raises questions about the nature and scale of the innovation which has occurred.

Harrison & Mears (2013) draw a fundamental conclusion from their work. While small in scale, and subject to the methodological challenges of engaging with busy university departments in fluid and complex environments, they conclude that there has been more continuity than change in undergraduate assessment in Sociology in the UK. That there has been some change is a clear finding, with 60% agreeing or strongly agreeing in the ‘pre-1992s’ and 73% in the ‘post-1992s’. However, the sense that change has been slower than hoped for, that it has been greater in the ‘post-1992s’ than the ‘pre-1992s’, and that the external examining system (one of the potential mechanisms for change) is to some extent bifurcated, raises questions about the nature and scale of the innovation which has occurred.

This response reflects upon these findings, considering, first, the forces both for and against innovation and, second, the shift to innovation managed by corporate policies and projects (for example the rise of e-learning and the employability agenda), which are one of the sources of change in the curriculum and its delivery. These provide a shaping context for the pedagogical practices of assessment and feedback, and are becoming widespread if not ubiquitous, spanning a wide range of institutional categories.

The lack of recognition of this may be widely shared, including by the student body itself, who may not successfully recount the skills developed and the learning journey undertaken. Much recent innovation sits within those agendas, yet there may be a lack of visibility for sociology. This lack of visibility may be as much of a problem as any lack of innovation itself, and this response concludes by asking how successful sociology departments are at communicating their innovation to the student body and at engaging with the rest of the sector for funding, dissemination and contribution. The successes of this, and the journey overall, notwithstanding, this invisibility is concluded to be a great pity, given the contribution which sociologists can uniquely make to higher education in leading innovative forms of assessment and feedback.

The force-field of innovation in assessment

While the sector is more diverse than it first seems, and the distribution, quality and quantity of change are more uneven than suggested by the shorthand used to construct binary divides (such as the pre- and post-1992 labels, or the use of the mission groups to describe the subsectors of higher education), the trail blazers are found scattered around the sector, both over time and in the present day. Both the previous Centres for Excellence in Teaching and Learning (CETL) and funded projects today (for example those of the Joint Information Systems Committee; JISC) were and are as likely to be found in Exeter as in Edinburgh or East London. However, institutions do differ in mission, and there are hypotheses that might set out some reasons for the findings. One of these relates to the much better documented and well understood problem of the parity of research and teaching, and the overall institutional priorities of the two sectors identified by Harrison & Mears (2013), the pre-1992s and the post-1992s. It can be no accident here that this shorthand may describe not only a binary divide in external examiners but also an assumed, i.e. socially constructed, binary divide in the prioritisation of research: its place in the material and symbolic culture of institutions, and its contribution to brand and reputation as much as to the student journey. On the ground, at departmental level, there are material as well as cultural consequences, such as prioritising the continual and aggressive search for research grants needed to offset the real costs of running research-active departments where, in truth, research intensity is variable in quality and the research excellence framework results will not pay for it.
The perhaps startling finding that external examiners are drawn largely from pools limited by the binary divide of pre- and post-1992s suggests that some progress could be made in terms of checking the comparability of standards and the spread of best practice, if this specific practice changed.

Harrison & Mears (2013) are right to look for institutional assessment policies and strategies, and right to see their uneven presence and application around the sector. However, the pre-1992s have their share, including some long-standing models of best practice. The pre-1992s and post-1992s have historically faced different levels and kinds of risk in terms of retention and poorer final degree outcomes, and managing risk (rather than responding to students or the impulse toward the democratisation of learning) has contributed to the assessment agenda. The pre-1992s may also bear risk, in terms of the difficulties of managing to improve research performance while simultaneously managing students and, in some cases, fluctuations in student demand. Even so, that there was more to do has been quite a long-standing theme, and the project Assessing Sociologists in Higher Education (Harrison & Mears 2001) identified good practice in innovation that could be put to use to stimulate change in both attitudes to risk-taking and substantive practice in sociology, with examples and findings which could be made available more widely. The consequences of expanding student numbers have been the wave of ‘initiatives … to improve the ways in which students are taught and assessed’ (Harrison & Mears 2013, p3). Harrison & Mears (2013) attribute the rising visibility of assessment in universities to a number of issues, two of which are seen as key: the Parliamentary enquiry into university standards in 2009, which can be captured by the standards debate; and the student satisfaction challenge, exemplified by the institutional scramble over the National Student Survey (NSS). To these matters I now turn.

The standards debate: the bona fide graduate and employability

As Harrison & Mears (2013) rightly suggest, the framing of higher education by the standards debate, and the paradigms which have informed the reviewing and attempted reform of qualifications (for example at A-level), have in part played a significant role in driving forward views and expectations of assessment and feedback in higher education, for example the focus on traditional examinations as a specific form of assessment. The 2009 Parliamentary enquiry into standards in universities looked at modes of assessment, grade inflation and the comparability of standards.

To this, I suggest, we must now connect the change in funding regimes and the potentially rising tide of expectations among all parties, everywhere (parents, students, sixth forms, colleges, employers, government and so on), as to the quality of higher education, which must be guaranteed in terms of both the student experience and its outcomes in equal measure if we are to place students at the heart of the system (see Business Innovation Skills [BIS] 2011). This means enjoying and benefiting from the degree experience while it is occurring, hence the renewed interest in student services and facilities, and in the academic student experience being enjoyable and desired in the context of universities competing for students. Further to this is the need to secure the outcome, which is the need to transition successfully out of higher education carrying a passport to what is generally positioned as the next phase in life. This is conventionally defined, at its narrowest, as a successful working life as a graduate. To be effective, the passport must clearly indicate the identity and status of its owner as a bona fide graduate carrying an award which endows the carrier with a bundle of skills and attributes. In sociological terms, this ensures that border crossings may be smoothly effected, such that class reproduction or mobility is assured for the individual (and their community) and the needs of the economy are fulfilled. However, ‘needs’ may come in different forms, and could include, for example, the expectations of employers and providers of higher professional training and development regarding a graduate's previous experience of, and competencies in, assessment tasks.

There is diversity in employment outcomes and this, for all stakeholders, is a matter of some interest. While this has long been the case, the rise of student loans and graduate unemployment has focused the sector somewhat. There is a considerable focus in both the pre-1992s and the post-1992s on the employability agenda, and there have been substantial innovations in the curriculum and pedagogical practice of sociology across the sector. These commonly include courses and extra-curricular activities aimed at building awareness of, and developing, employability-related cultural, social and economic capital: for example, undertaking placements, the assessment of which takes the form of reflective logs; providing ‘sociological’ services to local agencies, such as investigations and report writing; and increased participation in the life of the university, for example through student representation systems. Today this includes major contributions to the Quality Assurance Agency (QAA) institutional review processes as well as to social and pastoral service provision, including, for example, contributions to academic societies and participation in think tanks and research events, which may be pitched at different levels.

Sociologists are well placed to provide excellent employability-related opportunities to undergraduates, and many do. These employability initiatives are often framed by specific policies and carry a resourced infrastructure for delivery (anecdotally, there appears in some cases to be a tension between central organisation and the resources made available at departmental level). Rather as in-house teaching and learning development units once blossomed, we now see a plethora of employment-related services and opportunities in train. There is undoubtedly some interesting learning going on in the adventures of operationalising related frameworks in modules and programmes. Students taking placements are writing up some interesting reports, and learning to communicate these to the organisation in which they have been placed. These opportunities are not limited to any particular subsector of higher education, and it is in connection with some of these initiatives that innovation in assessment and feedback is occurring. This does not mean that more could not and should not be done. However, again, some of these activities are either ubiquitous or are presented to, or received by, the student audience in terms which may lead to a misrecognition of the task and skill, such that students may be diversely skilled in their narrativisation of the curriculum and pedagogical practices, perhaps not always connecting the learning done to the terms used by professionals to describe it.

The NSS

Harrison & Mears (2013) are correct to point out that the scoring in the NSS shows persistent patterns of poorer scores awarded for assessment and feedback over time, although there are some differences between subjects in the portfolio and between the demographic characteristics of students. Within institutions, the accompanying context includes the on-going challenges of managing plagiarism on the one hand and negative feedback from students, including that which becomes actual student complaints, on the other.

NSS results show the persistence of a risk often realised: it is the assessment and feedback questions which can elicit relatively negative and dissatisfied feedback from the undergraduate student body, compared with all other categories with the exception of the Student Union. The NSS scores of 2013 continue the general underlying trend (two years are given for interest).

Table 1 NSS scores 2012–13.

A particular issue arises around the ‘timeliness’ of feedback, which brings together the anecdotally reported difficulty of turning around large amounts of assessment with feedback of good quality in terms of depth and aid to further work (Nicol 2010). The view that the function of feedback is in part what Duncan (2007) calls feed forward lends weight to the need to improve the quality of feedback, given its importance in pedagogical practice. A further challenge lies with student expectations of what counts as a period of time deemed short enough, given the wider context of social media and instant messaging.

This is echoed in module-level evaluations, which in turn find their way into programme monitoring and evaluation reporting, and is sometimes commented upon in external examiner reports published on the internet. At this level there appears to be much greater diversity with individual module feedback reflecting the differing experiences and preferences of students for different kinds of work, standards of course organisation and the approaches of tutors.

Is innovation sought and recognised?

Having noted the dissatisfaction with assessment, it is unclear from Harrison and Mears' account to what extent students whole-heartedly care for, or demand, a specifically innovative approach. Student comments, both in this report and in many others, are not filled with demands for innovative and exciting opportunities per se (even if the local leadership of the students' union will support such moves). The barriers include a sometimes instrumental approach taken by the student body towards assessments, linked to the time and resources invested in learning, as well as risk aversion to deviating from the tried and tested and, in some cases, from years of preparation, training if not coaching, in how to address certain kinds of assessment tasks. This may be particularly so in the pre-1992s, given a student intake which is more closely aligned to one traditional model of engagement with higher education: a transition made into university during the late teens or early twenties.

The post-1992s may in some cases have a more diverse student body, whose diversity is defined not only by demographic characteristics but also by previous experience of assessment and, in parallel with their pre-1992 cousins, a history of learning in preparing for assessments. While this leads towards a hypothesis which this paper cannot explore further, the prevalence of a continuing assessment model, broadly described as exam plus coursework (perhaps with a journal article review exercise at one end and a research preparation module leading to a dissertation at the other), which is found to be more common in the pre-1992s (see Table 2 in Harrison & Mears 2013), is not so surprising when this is taken into account. It is further interesting to note that new entrants to higher education, such as private providers, have not sought specifically innovative practices. Rather, in contrast to their innovation in entering what, for them, may be new markets, their practices of academic work and assessment tend to be traditional. Here, then, the task may be about preparation for innovative practices as much as the actual practice itself. Previous experience may be a powerful driver in shaping student expectations, and confidence in approaching different kinds of tasks.

However, this is not the whole story. One change which has occurred relates to the use of new technologies and software. This is not merely a matter of developing digital literacy (widely accepted as a core attribute of graduateness) but also refers to the shifts towards digital expression and scientific exploration, and the integral nature of new technologies to learning and teaching in a myriad of ways. The resource used to be the tutor in the classroom plus the lab or library shelves. Today, a democratisation of learning has led to a view which positions resources to include the student body itself, in terms of both the experience brought to the classroom and the power of its collective and dialogic assembly (see Burnett & Frame 2008). Resources today include the myriad of online resources, software facilities and services, and the creative possibilities of technological interaction. It is here, I would like to suggest, that a major area of innovation in assessment and feedback has occurred. Perhaps the e-learning revolution has become such a ubiquitous aspect of curriculum and pedagogical practices that students have not commented at length upon it and we do not see it for what it is. The digital literacy challenge is sometimes overstated, but we might ask who is most challenged: staff, student or institution? Some students at least may arrive with extensive experience to bring to projects, and to share with others. As one of the comments recorded by Harrison and Mears suggests, it could be that the students are far ahead of the staff, arriving with their mobiles, apps and socially influenced expectations of time and social interaction, only to be nonplussed by institutional settings which still run old-world routines and which cannot easily provide personalised learning experiences, let alone personalised services and facilities.
It is here perhaps that we do see a major pressure from the student body for continued change and it is this demand which institutions are finding difficult to cope with in some regards.

Further drivers for and against changes in assessments and feedback

However, I would like to suggest that some other drivers do, and could yet, push assessment even further up the agenda. These include changes which may act as powerful structural drivers in the policy and practice of academic work, such as the diversification of suppliers of higher education in the form of private providers; the development of hybrid higher and further education colleges; and the establishment of the most recent wave of universities created by the award of Degree Awarding Powers (DAPs) to colleges of various kinds. These moves have in turn been prompted by many drivers, including changes in government policy regarding, for example, the funding of higher education in the UK, which has been felt by the student body, incurring both loans and unemployment (or unappealing employment), and by institutions as corporate entities, which see both risk and opportunity in the new environment. Overall, we might suggest that this has led to an increase in engagement with the quality of the experience of the student aka customer aka citizen (as the case may variously be argued to be), and with the kind of dialogic citizenship that students can expect to enjoy on a basis which must be made explicit (for example one of partnership and co-production, which suggests a particular kind and mode of mutual engagement). This is one of the factors which drives a concern about what is taught, how and why, and how this can be assessed, with what impact and with what outcomes. However, this has not yet produced a groundswell of blue-skies thinking in assessment and feedback, to date at least.

Other sections of the institutional and ‘civil society’ of higher education have also been potential forces for change. The National Union of Students and regulatory bodies are both interested in the quality, value for money and social value that higher education provides, and systems such as student representation and an adult, dialogic approach to higher education would suggest that there are plenty of opportunities for students to demand better quality education, at least in terms of provision, irrespective of the success of the engagement policies that go with it.

Meanwhile, the learned societies and the British Academy have also been active. For example, the British Sociological Association (the BSA), as Harrison & Mears (2013) report, has established a Teaching Group which begins, in its own small way, to join up a fragmented sector in which a student's journey through sociology is made by navigating institutions of wildly different ethos, expectations and organisation, supported in different ways. The BSA's role in embarking on the task of joining up the dots is not without its challenges, given the reality of the decline of school teachers in sociology and the pressures on schools: teaching staff may be only one deep and not necessarily sociologists by birth, trade or choice. Likewise, the BSA has been active in developing its links with further education providers, and is working on the Sociology in the Community initiative to reach out to, and draw in, a wider audience composed of many different kinds of publics. Bringing teachers, colleges and university staff together through forums such as the BSA at least opens the possibility of sharing not only best practice but, at a more basic level, of developing more clearly, in all parties, an understanding of what the experience of assessment and feedback is at different parts of the journey: an essential precursor to any alignment at grassroots level.

A further powerful set of agencies includes the Leadership Foundation for Higher Education and the subject group system of the Higher Education Academy, whose steady streams of projects, monies, reports and events (for example Hounsell & Rigby 2013) sit alongside the academic research and debate found in learning enhancement units, the books and articles produced, the conferences put on, and that which is represented in journals such as this one, setting out the rich world of investigation, analysis and critical reflection from which all of us may learn.

Given this, the pattern identified in the literature, that innovation walks out of the door with the individual who developed it, remains a cause for concern. What is perhaps most alarming about the Harrison and Mears paper, in some regards, is what it does not say about the spread of good practice by professional frameworks, agencies and the literature, possibly because it cannot be said.

Conclusion

This response has reflected upon the paper, acknowledging that the findings from Harrison and Mears' research suggest less innovation has occurred than might have been expected given all of the drivers at play. However, some good progress has been made, including major streams of innovation which go unrecognised. The problem of recognition, moreover, may apply to students as well as staff.

The wider context can be read as a change in government policy as well as a myriad of pressures for enhancement from within the sector. While we cannot say that enhancement has failed, we should note that inertia, vested interests and the play of the timescapes of higher education have all contributed to Harrison and Mears' findings.

In conclusion, we see that the lack of visibility for the part which sociology has played in innovation raises questions about communication and dissemination, and that greater engagement with developing innovation where appropriate, seeking support from funding and professional bodies, undertaking projects, and investing in profile building, media engagement and information sharing more generally might be helpful. Diversifying the pool of external examiners would also support both the comparability of standards and the spread of innovation.

Introducing new ways of doing things is a challenge, but it might be an essential one to grasp if sociology is to remain a vibrant subject of relevance and curiosity to undergraduate students across the sector. Sociologists have a special contribution to make to learners' journeys and should not fight shy of making it. One way of changing a trend is by taking charge of it, and setting the agenda.

References

  • Burnett, J. and Frame, P. (2008) Using auto/biography in teaching and learning (Paper 120). London, UK: Staff Educational Development Association [SEDA].
  • Business Innovation Skills [BIS] (2011) Higher education: students at the heart of the system (Department of Business, Innovation and Skills, CM1822). London, UK: HMSO.
  • Duncan, N. (2007) Feedforward: improving students' use of tutors' comments. Assessment and Evaluation in Higher Education 32 (3), 271–283.
  • Harrison, E. and Mears, R. (2001) Assessing sociologists in higher education. Aldershot, UK: Ashgate.
  • Harrison, E. and Mears, R. (2013) The changing face of undergraduate assessment in UK sociology. Enhancing Learning in the Social Sciences 5 (3), 15–29.
  • Hounsell, D. and Rigby, S. (2013) Leading change in assessment and feedback: case examples and a guide to action. London, UK: Leadership Foundation for Higher Education.
  • National Student Survey [NSS] (2013) Student satisfaction at a nine year high. Higher Education Funding Council for England [HEFCE]. Available at http://www.hefce.ac.uk/news/newsarchive/2013/news82928.html.
  • Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment and Evaluation in Higher Education 35 (5), 501–517.
