
Geographic visualisation: lessons for learning and teaching

Pages 6-13 | Published online: 15 Dec 2015

Abstract

This paper outlines a pedagogic project funded by the former GEES Learning and Teaching Development Fund, exploring students’ attitudes to, and learning through, visualisation as a method of assessment in a core undergraduate geography module. Student expectations and experiences of this assessment, together with reflections on learning and teaching methods more widely, were investigated using participatory appraisal and follow-up face-to-face feedback. Student perceptions of visualisation as assessment mixed uncertainty about what was expected with a sense that visual work might be comparatively ‘easier’. Responses after the assessment recognised the difficulty of the method, the focus on data and the ability to address complex topics. Students also compared their experiences with visualisation to other assessment methods, with many finding the visual approach stimulating and effective for their learning, and module marks were higher than in previous years. We have retained the assessment in the module and extended some of the lessons, especially the use of show-and-tell critique sessions for formative feedback, to other modules.

Introduction

Geography has always been a visual discipline, unique in “the way it has relied and continues to rely on certain kinds of visualities and visual images to construct its knowledges” (Rose 2003, p212). This project grew out of an interest in how we represent the world visually, as well as exploring the effectiveness of visualisation in enabling students to investigate and understand complex geographical ideas. ‘Visualisation’ refers to a variety of practices, from the production of landscapes, for example, through the languages of cartography, to cultures of media representations. Moreover, it is inherently political, complex and contested, and geographers need to be attuned to the ‘comparative power of vision’ (Matless 2003, p222; see also Ryan 2003). In this project, we specifically refer to visualisation as an assessment method relying primarily on graphics, images, diagrams and/or 3D constructions, rather than text on a page or the spoken word. Any visual representation must help convey the ideas and data related to the topic under investigation, rather than simply being a passive platform.

Each of the authors comes to this from a different perspective. Jon Swords’ research utilises visualisation of socio-economic data (Swords 2011), Kye Askins works with participatory diagramming in research and teaching (Askins 2008), Mike Jeffries draws on photography and comics in a photography-based module, and Catherine Butcher uses a wide range of visual techniques in participatory appraisal (see www.peanutplus.org, and later herein). We have experimented with visual methods in a third-year undergraduate module, ‘Geophotography’, and, less often, students have drawn upon photography and/or psychogeography mapping in Geography and Environmental Management dissertations (see Jenson et al. 2010). Our informal observations suggested that students responded positively to these research and learning methods, revealing reflective creativity, engagement and learning. However, we had never formally explored students’ learning through visual methods, and this project was intended to evidence whether such deeper learning actually happens.

The project introduced visual assessment into the long-standing ‘Geographies of Global Change’ module in the BA (Hons) Geography programme at Northumbria University, in 2010–11. Students were tasked to work in small groups, decide on and research a relevant topic, and represent the data and concepts visually. Meanwhile, the pedagogic research element investigated their understanding of, and attitudes to, the use of visual methods before the assessment, and the extent to which subject and skills learning occurred post-assessment. This was led by colleagues in Peanut (Participatory Evaluation and Assessment Newcastle upon Tyne), using participatory appraisal in workshops and focus groups, which we outline in more detail below.

Our objectives were to:

  1. evaluate undergraduate students’ understanding of visual methods both before and after engaging with such approaches;

  2. develop and use a visually based assessment within a conventional geography module; and

  3. evaluate the usefulness of visual methods against other more familiar teaching and learning techniques.

The project proved a stimulating and informative exercise. Results from the visualisation assessment, and the general impact of the changed module, revealed powerful and creative learning by the majority of students. In addition, the evaluation of visual methods provoked insights from students about other learning and teaching approaches we use. As a result, we have retained the visual assessment in the module, while associated activities, e.g. the use of a show-and-tell critique as formative assessment, have already been introduced in a final year ecology module.

Visualisation in learning and teaching

GE0133 Geographies of Global Change is a year-long, 20 credit, core second year module within our BA (Hons) Geography degree programme, averaging 60 students. The module had previously been assessed summatively by a 2000 word essay and a two hour end-of-year exam, weighted at 50% each. We decided to introduce visualisation as an element of assessment, allowing us to evaluate its utility in deeper learning, before making wholesale changes to the module. It is crucial to consider assessment as central to learning, in line with Biggs’ (2003) concept of ‘alignment’, to ensure that assessment drives learning and enables a deeper engagement with the subject. Thus, the essay was retained but reduced to 30%, and the exam replaced by an assessment requiring students to work in small groups to visually represent an aspect of globalisation and write a supporting 2000 word essay on their chosen topic (worth 70% combined).

The concept of visualisation was introduced to students in a lecture, and a series of three workshops was run to help students engage with a range of visualisation techniques, such as charts, graphs, maps, pictures, photographs, diagrams and 3D models. The workshops ensured that visualisation was not parachuted in as a distinct activity, but grounded in this conventional subject-based module: each of the three workshops was replicated three times, to run with approximately 20 students at a time, such that students could discuss and engage in ‘hands on’ activities that directly linked module topics to the visualisation of data and ideas.

Information about the assessment was given verbally and as a visualisation (see Figure 1). The practicalities of visualisation were supported through the workshops, run fortnightly in the second semester of the year, along with a show-and-tell critique (outlined below) at which groups could show draft versions of work and receive formative feedback. As independent learners, students were encouraged to further explore techniques and graphics in their own time, in line with Simm’s (2005, p16) argument that ‘reflective observation’ and ‘active experimentation’ are central in “deep-seated learning and a sound foundation for independent research work at higher levels”. To aid this, a resources compendium was produced, including many websites devoted to visualisation. Students had access to graphics packages on University IT systems, such as Adobe Illustrator and Photoshop. Moreover, they were reminded to think about the ways such packages themselves, and the visualisations produced through them, are caught up in complex (geo)politics of visual cultures (Rose 2007).

Figure 1 Assessment hand-out for the visual representation of an aspect of globalisation.

Groups were formed after the introduction to visualisation. Group work can be a contentious issue for students (Brown et al. 2000). In Geography and Environmental Management at Northumbria, students have formally raised concerns via student representatives at staff–student liaison committees, particularly about group members who do not pull their weight. Informally, students have also told us that it can be logistically difficult to arrange meetings for group work on other modules (first and second year), given commitments outside of university. For this module, we consequently allowed self-selected groups and included group learning agreements that provided space for personal reflection on contribution.

Evaluating student perceptions and experiences

Peanut were commissioned to explore students’ expectations of, attitudes to and learning experiences of the visualisation assessment, and wider approaches to learning. Although Peanut were based at Northumbria University, they were independent researchers outside the Geography and Environment teaching team. Sessions were held in January 2011, before the introduction of the visualisation assessment (‘benchmark’), and then in May 2011, just before the end-of-module exhibition (‘evaluation’). Peanut work with small groups, using diverse methods, including visual techniques, that help participants express, depict and record their responses and opinions. Students were split into three groups of twenty, with each group participating in a two hour benchmark workshop and a two hour evaluation workshop. These before and after workshops involved activities such as:

  • ‘lines of preference’, where students were asked to position themselves along imaginary lines in response to statements about learning styles, expressing themselves in words versus visuals, and working in groups;

  • ‘graffiti walls’, on which students posted their comments on ‘what do you understand by the term “visualisation”’ and ‘what are your expectations of the assessment’;

  • ‘comparison charts’, in which students individually assessed (a) which learning and teaching approaches they perceived to help them learn and (b) which they preferred; and

  • ‘H-forms’, wherein students considered the positives, negatives and potential improvements of both how they are taught on the degree programme broadly, and their learning and the visualisation assessment on the Geographies of Global Change module.

In addition, the evaluation workshops gathered personal reflection via ‘people maps’. More detail on these methodologies is given in Peanut (2011). Attendance at the workshops was patchy: 35 students attended the benchmark sessions but only 11 attended the evaluations. Thus, follow-up face-to-face feedback was gathered from another 20 students at the end of the module, in short (5–10 minute) ‘mini-interviews’ conducted by one of the authors.

Formative assessment: show-and-tell critique

A ‘show-and-tell’ critique (crit) was designed into the teaching schedule from the outset, to provide students with the opportunity to receive verbal, formative feedback on their progress from staff and from each other. Show-and-tell critiques are routinely used across design and architecture (sub)disciplines and practices, particularly for visual work, to enable students and practitioners to garner feedback in an iterative process of design development (Doidge et al. 2000). Mroz (2009, p25) argues that:

engaging with students formatively can be stimulating [with] rewards for both students and staff in setting tasks that move beyond the standard pedagogy of transmitting knowledge and then testing students’ acquisition of it.

In addition, Wheater et al. (2005, p13) state that “[a] growing number of pedagogical and practical arguments support the use of peer-assessment in higher education”, with key benefits including students thinking critically about their own work (see also Hughes 2001).

The critique session took place six weeks after the visualisation assessment was introduced, and two weeks before the final exhibition and hand-in. The use of show-and-tell was a novel experience for all of the students. We provided guidance on what to expect, and held the event in a large University hall that allowed work to be displayed on boards, the floor or podia, whichever suited the designs. Groups were encouraged to circulate, asking questions of each other about their work, as well as getting feedback from staff, including staff not immediately involved in the module or visualisation project (see Figure 2). The topics that groups addressed proved diverse, including the global arms trade, the coffee industry, retailing, the Olympics, environmental issues, and the globalisation of football.

Figure 2 Show-and-tell critique.

Students were nervous but also excited about the show-and-tell. The face-to-face feedback later revealed that this was partly due to unfamiliarity with the format and partly due to discomfort regarding assessed work being made public:

The visualisation really stretched me … I really didn’t know what it would involve and I was worried about the crit because I thought our first go was a bit rubbish and people would laugh

Our group was […] apprehensive at first, but we worked through that … the more you talked to the others about your own work and then theirs, you got used to it

We had loads of ideas … without the crit it would have still been a bit of a mess at the end I think. But it was nervous, showing it to everyone before we’d got it further

Nevertheless, many of the works in progress were impressive, showing imagination and skill in developing a visual method, as well as thoughtful links between format and underpinning data and concepts. A couple of groups had made limited progress, and for these a beneficial effect was the stimulus of peers’ comments: comparing final work with that presented at the crit, it was evident that these groups had put in extra work to catch up.

Summative assessment: the exhibition

The exhibition of student work (see an example in Figure 3) was held in the School of Built and Natural Environment Hub, a shared open-access space/IT cluster used by students for group working and the collection of marked work. This space was chosen specifically because it has a constant flow of staff and students from across the whole School, making the exhibition an ‘event’ beyond those involved with the module, or even Geography and Environmental Management. In particular, first year students were encouraged to attend, especially those going on to take Geographies of Global Change in their second year. Students displaying their visualisations were thus constantly engaged in conversation about their work, providing the kind of feedback not available through traditional modes of assessment. From our observations, these second years were clearly comfortable discussing their work, having experienced a level of critique at the show-and-tell (Figure 4).

Figure 3 Final student work by Hannah Matterson, Dwyer Ogbonson, Chris Gath and Theo Fitzharris (for more examples see http://www.flickr.com/photos/gemsnorthumbria/).

Figure 4 Student interaction during the show-and-tell critique.

Lessons learned: visualisation as method

A key theme that emerged from the Peanut evaluation and face-to-face feedback was that the assessment had enabled creative learning by the majority of students. In particular, after the final exhibition most students mentioned the ‘freedom’ to experiment with topics and with different visual techniques and types:

The freedom was great, we really learned from trying out loads of different stuff

I really liked the creative aspect of it, it made me think much more about the data, getting data and showing data

Certainly, the outcomes of the benchmark and evaluation workshops are data-rich, and their detail is beyond the scope of this paper (see Peanut 2011). While we recognise that analysis of such discursive methods is necessarily tentative, broad trends are nonetheless evident.

Benchmark sessions revealed:

  • perceptions that the visualisation assessment would be ‘easy’ or ‘easier’ than text-based assignments;

  • diverse expectations combining interest in the novelty but also uncertainty about what was required;

  • wariness of group work;

  • understandings of visualisation as 2-D formats (images, graphs, posters, video);

  • some sense of the effectiveness of visual methods as ‘more exciting’; and

  • a predominant focus on learning as ‘passive’, with perceived content of visualisation limited to taught topics such as migration, health and retail.

Evaluation sessions clearly showed a significant shift among the cohort:

  • visualisation was ‘difficult’ and ‘challenging’ - not so easy after all;

  • the group work was largely enjoyable and effective, especially regarding the sharing of skills and knowledge;

  • a broader appreciation of the range of visualisation formats, including, for example, 3-D models and sculpture;

  • greater awareness of the power of good visualisation to convey complex ideas;

  • a predominant focus on learning as ‘active’, with recognition of the importance of an iterative process, and a strong sense of data as central to content.

Furthermore, findings regarding teaching methods more widely revealed a strong liking for interactive-discursive activities, such as fieldwork and practical exercises, and a dislike of lectures where students are ‘talked at’. Overall, a key positive of the assessment was the freedom for students to explore and develop ideas through a formative process. Negatives included the need for more help with design software, limited contact time/workshops, and the stress of combining familiar geographies with unfamiliar visualisation. Despite these concerns, the average mark for the module improved over the two years the new assessment has run: 58% in 2010–11 and 2011–12, versus 53.5% in 2009–10, 55% in 2007–08 and 51% in 2006–07 (data for 2008–09 are unavailable). While not statistically significant, we see these marks as an indication of the effectiveness of the assessment.

Conclusion

The project found that prior to the visual work, students had an uncertain but interested attitude to the new assessment, dominated by 2-D visualisation and basic geographical topics. By the end, students were more tuned in to the process and power of visuals, and had become aware of how challenging visual methods are. In general, they reported increased engagement and effective group working, and said that they had gained new skills, including critical analysis of complex data sets and creative use of IT software and/or design of 3-D visualisations to represent these, all useful skills for geographers in a range of future employment. We believe that the visualisation added substantially to the students’ learning and teaching experience, and this project evidences that students can use visual methods as an illustrative tool to represent complex geographical ideas. These positive responses have encouraged us to retain the visualisation assessment in the Geographies of Global Change module.

Further, the show-and-tell crit as a means of formative feedback was successful in changing the processes students go through in preparation for assessment: it emphasised the need for drafting and redrafting work, which was revelatory to many students, who acknowledged that it was a useful process to go through. This was a key point for staff frustrated by the piecemeal and last-minute practices that students can easily slip into. The show-and-tell has thus been extended to improve coursework in a final year ecology module, GE0195 Applied Ecology and Conservation.

References

  • Askins, K. (2008) In and beyond the classroom: research ethics and participatory pedagogies. Area 40 (4), 500–509.
  • Biggs, J.B. (2003) Teaching for quality learning at university: what the student does. Maidenhead: Society for Research into Higher Education and the Open University Press.
  • Brown, S., Race, P. and Rust, C. (2000) Using and Experiencing Assessment. In Assessment for Learning in Higher Education (ed. Knight, P.), pp75–85. London: Kogan Page.
  • Doidge, C., Parnell, R. and Sara, R. (2000) The Crit: an architecture student’s handbook. Oxford: Architectural Press.
  • Hughes, I. (2001) But isn’t this what you’re paid for? The pros and cons of peer- and self-assessment. Planet 2, 20–23.
  • Jenson, A., Swords, J. and Jeffries, M. (2010) Geographies of Skateboarding - Newcastle upon Tyne and Gateshead, UK. Human Geography 3 (3), 147–150.
  • Matless, D. (2003) Gestures around the Visual. Antipode 35 (2), 222–226.
  • Mroz, A. (2009) Update the time-tested models. Times Higher Education, 29 January 2009.
  • Peanut (2011) Teaching Geographic Visualisation. Evaluating student understandings of visualising geographic knowledge, Northumbria University: http://nrl.northumbria.ac.uk/13616/1/Visualisation_Report_(Peanut_Final).pdf (accessed 15 June 2012).
  • Rose, G. (2003) On the Need to Ask How, Exactly, is Geography “Visual”? Antipode 35 (2), 212–221.
  • Rose, G. (2007) Visual Methodologies: An Introduction to the Interpretation of Visual Materials. London: Sage.
  • Ryan, J.R. (2003) Who’s Afraid of Visual Culture? Antipode 35 (2), 232–237.
  • Simm, D. (2005) Experiential learning: assessing process and product. Planet 15, 16–20.
  • Swords, J. (2011) Featured graphic: Labour's three-term spending record, 1997–2010. Environment and Planning A 43 (2), 255–257.
  • Wheater, C.P., Langan, A.M. and Dunleavy, P.J. (2005) Students assessing students: case studies on peer assessment. Planet 15, 13–16.
