Giving Students, Parents and Employers Confidence: Geography’s Experiences of Accreditation

Dr Richard Hodgkins, Senior Lecturer in Physical Geography, has recently received a Vice-Chancellor’s Award from Loughborough University for his contribution to Learning and Teaching. In this post, Dr Hodgkins details the recent experience of gaining accreditation for some rather different programmes offered by the Department of Geography at Loughborough.

On the face of it, some academic disciplines, with more obvious career pathways, lend themselves naturally to accreditation, and others less so. However, all degree programmes benefit from being able to display some kind of quality stamp.

These programmes are the MSci (Hons) Geography and the BA (Hons) Geography, both also available as sandwich programmes leading to the additional qualification of Diploma in Professional Studies (DPS) for those undertaking an industrial placement, or Diploma in International Studies (DIntS) for those undertaking study abroad. The main goal of each is to offer the most appropriate curriculum and outcome for somewhat different communities of potential geography students. The MSci takes the route of specialisation, being a four-year integrated Masters programme with a strong focus on physical and environmental geography. The BA takes the route of generalisation, stemming from the nature of geography as a diverse discipline spanning the sciences and humanities, and offers those favouring its social and cultural aspects the opportunity to graduate with a qualification which, more closely than the current BSc, reflects the content they have pursued.

What are the challenges?

It’s difficult to persuade an accreditor to look favourably on your programmes if you don’t have a clear sense of their strengths that can be articulated cogently. So for each programme, it’s been important to step back and see the wood for the trees. Why offer it? What are the real benefits for students: are they being offered a distinctive curriculum with a clear sense of purpose and outcome, rather than a mash-up of pre-existing modules? The MSci is therefore specified to provide a pathway to environmental employment through a focused, practically-orientated and progressive menu of physical geography modules, which engage extensively by design both with contemporary research and with environmental monitoring for the purpose of effective management. The BA, on the other hand, is specified to provide the widest coherent menu of options possible, given that a significant proportion of geography students (particularly those aspiring to become teachers) prefer to study both human and physical aspects of the discipline. The latter is consistent with the unique nature of geography as the integrated study of landscapes, peoples, places and environments, and is a view of geography that is strongly favoured by the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG), of which more below.

What are the benefits of offering a diverse range of programmes? 

From a departmental perspective, these recently-approved programmes have manageably diversified our offering, which contributes to admissions robustness. From a student perspective, enhanced satisfaction is the aim, through offering more tailored outcomes with specific awards. From the personal perspective of a departmental Director of Studies, there is a lot to be learned about matters that can get taken for granted, such as understanding how curricula should be consistently mapped to appropriate intended learning outcomes (ILOs) for different communities of students, and how Subject Benchmarking informs this process. I’m not under the illusion that ILO mapping is the stuff of dreams, but it’s vital that we retain the coherence of our programmes in the face of change and churn, so that students actually get what they believe they’ve signed up for, and so that accreditors can express their confidence in what they see.

We obtained accreditation for the MSci from the Institution of Environmental Sciences (Committee of Heads of Environmental Sciences, CHES) in May 2016. The key to the case was demonstrating, with evidence, how the modules aligned clearly with Subject Benchmarks, and with the specific expectations of the accrediting body; for instance, CHES places a particularly strong emphasis on environmental career development and links with professional practice, so it was important to establish in some depth that our modules did in fact do this in a substantive way that was both assessed and credited. In September 2016, we similarly obtained accreditation for the MSci and three of our other programmes – BA/BSc (Hons) Geography and BSc (Hons) Geography with Economics, including their DPS/DIntS versions – from the newly-established scheme of the Research and Higher Education Division of the RGS-IBG, now the key accreditor for the discipline. All four programmes were among the very first to be accredited: only 20 departments nationally achieved this distinction. In its evaluation, the RGS-IBG noted that the case contained “Clear and detailed description of aims achieved through core and optional modules… cross-referencing to the benchmark statement is evident”, underlining the value of all that ILO mapping, and the fact that it is an ongoing process shared by all teaching staff. This is a significant accomplishment in a discipline with a very wide range of alternative career pathways, in which accreditation has not traditionally played an important role.

In our efforts to build our profile, Loughborough Geography can now justifiably claim a quality assurance “Kitemark” from the UK’s flagship accreditor. By the same token, our graduates – our ambassadors! – can be confident that their degrees are well-regarded when they pursue further study or enter the jobs market.

Degree Attainment Gaps and New Research at Loughborough University

In this blog post for the Centre for Academic Practice, Nuzhat Fatima, LSU Welfare and Diversity Executive Officer, discusses the Black and Minority Ethnic student attainment gap in UK higher education institutions, and introduces a new research project at Loughborough entitled ‘Experiences in the Classroom and Beyond: The Role of Race and Ethnicity’.

What is the ‘degree attainment gap’?

The ‘degree attainment gap’ is often described as a national crisis within the education system. The Equality Challenge Unit (ECU) describes the degree attainment gap as “the difference in ‘top degrees’ – a First or 2:1 classification – awarded to different groups of students”. The largest divergence is found between BME (Black and Minority Ethnic) students and White British students. Leaving an educational institution with lower grades has lifetime effects: it limits BME students’ ability to pursue postgraduate education, where the entry requirement generally tends to be a 2:1 or above, and most graduate employers also require a 2:1 or above.

The problem is that many BME students enter university with the same grades as their white counterparts, yet leave with significantly lower degree classifications than their white peers.

“In 2012/13, 57.1% of UK-domiciled BME students received a top degree, compared with 73.2% of White British students – an overall gap of 16.1%” (ECU).

Homogenising all minority students is unhelpful, as they are a diverse group with differing outcomes. The national breakdown of the BME category (2012/13) shows that Black and Caribbean students are the worst affected ethnic group, while students from Pakistani, Chinese and Indian backgrounds are also affected:

  • 64.4% of Indian students were awarded a top degree (a degree attainment gap of 8.8%)
  • 63.9% of Chinese students (a gap of 9.3%)
  • 54.2% of Pakistani students (a gap of 19.0%)
  • 43.8% of Black Other students (a gap of 29.4%) (ECU).

A reliance on a meritocratic model to understand academic achievement has meant that the BME attainment gap was, and sometimes still is, framed as a problem caused by a limitation in the students themselves. This is also known as a deficit model. However, the attainment gap would not be a national problem if it were a meritocratic issue only. This raises the question of whether there are conditions within our educational institutions that negatively impact BME students both culturally and academically, and which contribute to the existence of the attainment gap.

Potential contributors

There is no sole contributor to the attainment gap. Multiple factors contribute to students being unable to reach their potential and attain a top degree: geographical location, institutional insensitivity towards culture, a Eurocentric curriculum, methods of assessment, and experiences of racism which go beyond the classroom and have a lasting impact on student life. Additionally, social interactions within clubs and societies can also affect academic performance. These points are often dismissed as generalisations that potentially impact all students; however, to tackle the BME attainment gap one must consider how these factors work together in a negative way to disproportionately affect BME students.

What can be done? One way of tackling this is institution-specific research, which does not homogenise institutions and lived experiences. Such research can become a catalyst for tackling the BME attainment gap at a structural and institutional level.

What is Loughborough proposing to do?

Loughborough prides itself on being an inclusive university and is aiming to tackle this national problem at an institutional level! Together with brilliant academics such as Dr Line Nyhagen (Reader in Sociology & School Champion Athena SWAN) and Dr James Esson (Lecturer in Human Geography), I have contributed to the proposal for a newly funded, student-led pedagogical research project. This research will be carried out so that we as an institution can further our progress towards making education inclusive by raising the standards and aspirations of all!

The project will examine BME and other students’ own learning experiences at Loughborough University in relation to the curriculum content and more broadly, including their take-up of individual consultations with lecturers, relationships with peers, and take-up of opportunities that can enhance their learning experience (e.g., student rep positions; student ambassador jobs).

I want to congratulate Loughborough University for putting diversity on the agenda and I am thrilled to have support from the University and the above academics who are committed to learning from the experiences of students in order to deliver the best education possible.

Information taken from the ECU: http://www.ecu.ac.uk/guidance-resources/student-recruitment-retention-attainment/student-attainment/degree-attainment-gaps/


Nuzhat Fatima has been the Welfare and Diversity Executive Officer at Loughborough Students’ Union for 2016/17.

CAP Forum: Embedding Research in Teaching

This year’s first CAP Forum focused on the topic of embedding your research in your teaching. Accordingly, we invited one of this year’s Research-informed Teaching Award winners to present on how and why she embeds her research in her teaching, and what her research is about. In 2002, Dr Cheryl Travers set up a module to fill what she perceived, from her experience as an academic occupational psychologist, as a gap in Learning and Teaching: the extent to which School of Business and Economics (SBE) finalists have developed their ‘soft’ skills in their final year, after their placement.

Her research concerns her ‘reflective goal setting’ model, and the module puts this into practice, asking students to reflect on themselves, set goals, and use the ‘power of written reflection’ to measure the impact of those goals. She asks the students to keep a diary which, for the first time this year, will take the form of an electronic portfolio, thanks to her new system for students to log their thoughts.

The discussion that followed focused mostly on her actual pedagogic research, and how other disciplines can apply her reflective goal setting model, from Arts students to STEM students, and even students wishing to learn a language while at University.

Overall, it was an enjoyable afternoon with lively discussion, an abundance of food, and a wonderful talk by Dr Cheryl Travers. The session was recorded using lecture capture, which you can find here, and below you can also find Cheryl’s papers on her goal-setting research, as well as the recent TEDx talk she delivered at Loughborough Students’ Union.

Dr Travers’ papers – 

Self reflection, growth goals and academic outcomes: A qualitative study

Unveiling a reflective diary methodology for exploring the lived experiences of stress and coping

Technologies for delivering feedback

I was invited to the University Learning and Teaching Committee yesterday to give a short presentation on how learning technologies can be used to deliver feedback to students. This is an area in which students themselves (and their Union representatives) can see opportunities for much more use of technology.

In my presentation I covered a range of tools, many of which are already being used effectively by academic colleagues here at Loughborough. The tools include Camtasia and ReVIEW (Echo 360) Personal Capture, which allow tutors to make screen recordings in which they use highlighting tools to pick out features of assignments, with an audio narrative added on top. Grademark, part of the same suite of tools as Turnitin, also allows you to make audio comments on submitted assignments, and adds online marking functionality and the facility to create your own custom text comment banks.

 

For more details on the full range of tools, and links to case studies, see the new Feedback page in the Tools for Teaching section of this blog.

Peer assessment of conceptual understanding of mathematics

Ian Jones and Lara Alcock in the Mathematics Education Centre recently undertook an innovation designed to assess undergraduates’ conceptual understanding of mathematics. Undergraduates first sat a written test in which they had to explain their understanding of key concepts from the module. They then assessed their peers’ work online using an approach called Adaptive Comparative Judgement (ACJ). The ACJ system presented undergraduates with pairs of their peers’ work and they had to decide which of the two answers demonstrated the better conceptual understanding of the test question. Each student completed 20 such paired judgements and the outcomes were used to construct a rank order of students.
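To make the mechanics concrete, the sketch below shows one standard way of turning many pairwise “which is better?” judgements into a rank order, by fitting a simple Bradley-Terry model. This is an illustration only, not the algorithm used by the ACJ system described here; the script labels and judgement data are hypothetical.

```python
# Illustrative sketch only (not the ACJ/e-scape implementation): turning many
# pairwise "which script is better?" judgements into a rank order by fitting a
# simple Bradley-Terry model with the standard iterative (MM) update.
from collections import defaultdict

def bradley_terry_rank(judgements, n_iter=100):
    """judgements: list of (winner, loser) tuples from paired comparisons."""
    scripts = {s for pair in judgements for s in pair}
    wins = defaultdict(int)         # total wins per script
    n_compared = defaultdict(int)   # how often each unordered pair was compared
    for winner, loser in judgements:
        wins[winner] += 1
        n_compared[frozenset((winner, loser))] += 1

    strength = {s: 1.0 for s in scripts}   # initial ability estimates
    for _ in range(n_iter):
        updated = {}
        for s in scripts:
            denom = 0.0
            for pair, n in n_compared.items():
                if s in pair:
                    other = next(o for o in pair if o != s)
                    denom += n / (strength[s] + strength[other])
            updated[s] = wins[s] / denom if denom else strength[s]
        total = sum(updated.values())
        strength = {s: v / total for s, v in updated.items()}  # normalise

    # Higher estimated strength = judged better more consistently.
    return sorted(scripts, key=lambda s: strength[s], reverse=True)

# Hypothetical judgements: each tuple is (preferred script, other script).
example = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("B", "D"), ("A", "D")]
print(bradley_terry_rank(example))   # -> ['A', 'B', 'C', 'D']
```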

The researchers wanted to evaluate how well the students did at assessing their peers’ work. Mathematics experts were also asked to rank the scripts using ACJ, and the outcomes of the peers and the experts were compared. Statistical analyses revealed that the rank orders produced by both the peers and the experts were reliable and internally consistent. Moreover, the peers performed similarly to the experts, showing that they were able to assess one another’s work to an acceptable extent. Because the innovation is so new, grades were awarded to the students on the basis of the experts’ rank order, but in the future it should be possible to use the peers’ own ranking to assign grades.
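One simple way to quantify how closely two such rank orders agree (for example, the peers’ order against the experts’) is a Spearman rank correlation. The snippet below is a hypothetical illustration of that idea and is not the statistical analysis actually used in the study.

```python
# Hypothetical illustration: comparing two rank orders (e.g. peers vs experts)
# with a Spearman rank correlation. The orderings below are made up.
from scipy.stats import spearmanr

peer_order   = ["A", "B", "C", "D", "E"]   # rank order produced by peers
expert_order = ["A", "C", "B", "D", "E"]   # rank order produced by experts

scripts = sorted(peer_order)
peer_ranks   = [peer_order.index(s) for s in scripts]
expert_ranks = [expert_order.index(s) for s in scripts]

rho, p_value = spearmanr(peer_ranks, expert_ranks)
print(f"Spearman rho = {rho:.2f}")   # 1.0 would mean identical orderings
```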

To further evaluate the approach, non-experts also produced a rank order using ACJ. The non-experts were non-maths PhD students who had never studied maths beyond GCSE level or equivalent. Surprisingly, even these non-experts performed quite well, matching the rank order produced by the experts to some extent. This suggests that generic academic skills such as clear presentation and structure are quite informative about students’ conceptual understanding of mathematics. Nevertheless, the non-experts did not perform as well as the undergraduates or the experts, and, unsurprisingly, an understanding of maths is important for properly assessing conceptual understanding. This finding was backed up by follow-up interviews with participants: experts typically talked about mathematical content when asked to explain how they came to their judgement decisions, whereas non-experts typically talked about surface features.

At this early stage the evaluation focused on technical implementation, as well as reliability and validity. However, a key attraction of using ACJ for peer assessment is the potential learning benefit of students comparing pairs of their peers’ work, and this is something we intend to evaluate in future work. As one undergraduate said in a follow-up interview: “It is hard to judge other people’s work … Sometimes we think we understand, but we have to make sure that if someone else reads who has no clue what the concept is, by looking at the question they should be convinced it answers the question. So it is important to write in a good way. It is an improvement for me for my future writing.”

See previous blog post on ACJ.

[Footnote: The ACJ system used in the study is “e-scape”, owned and managed by TAG Developments.]

Learning about Learn online quizzes

Felipe Iza uses Learn Quiz for formative assessment with Loughborough University first-year Electronic, Electrical and Systems Engineering students. He and his colleagues introduced a series of formative assessments throughout the course of their module, designed to motivate and provide feedback to students, and to keep the module tutors informed of progress. Using Learn, the University VLE, as the delivery mechanism, the tutors were able to automate the process to accommodate a large cohort of students.
Year of students: First year (Part A)
Number of students in cohort: 150

Felipe discusses the reasons for introducing the quizzes, the positives and negatives, and the response to the quizzes from both students and staff, which will be of interest to anyone considering using an online quiz. He has also used a template to generate multiple randomized questions within his subject area, of the kind sketched below.
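As a rough illustration of what a question-generating template can look like (a hypothetical sketch, not Felipe’s actual template, which is covered in the presentation linked below), the Python snippet writes randomized numerical questions in GIFT format, which Moodle-based VLE question banks can typically import.

```python
# Hypothetical sketch of a question-generating template (not Felipe's actual
# template). It produces randomized numerical questions in GIFT format, which
# Moodle-based VLE quiz question banks can typically import.
import random

def ohms_law_question(qnum):
    """Generate one randomized 'find the current' question with its answer."""
    voltage = random.choice([3, 5, 9, 12, 24])           # volts
    resistance = random.choice([10, 22, 47, 100, 220])   # ohms
    current = voltage / resistance                        # amps
    return (f"::Q{qnum}::A {voltage} V supply is connected across a "
            f"{resistance} ohm resistor. What is the current in amps? "
            f"{{#{current:.4f}:0.001}}")                  # numeric answer + tolerance

if __name__ == "__main__":
    random.seed(42)  # reproducible question set
    questions = [ohms_law_question(i + 1) for i in range(5)]
    print("\n\n".join(questions))  # paste/import the output into the VLE
```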

For further information please view a presentation doc from Felipe at: Felipe_Iza_Learn_presentation
Or you are welcome to contact Felipe directly on: F.Iza@lboro.ac.uk

Assessing Creativity and High Order Thinking Skills with High Reliability

Screenshot from TAG Developments ACJ system

Traditional marking of students’ work struggles to assess creativity and high-order thinking skills reliably. A recent technological innovation, called Adaptive Comparative Judgement (ACJ), potentially offers a solution to this.

ACJ is based not on atomistic mark schemes but on holistic expert judgements of pairs of students’ work. The outcomes of many such pairings are used to construct a rank order of students.

Previous studies using ACJ  have shown the final rank orders to have a high reliability (typically >.95) and to correlate strongly with other measures of student achievement.

On Wednesday 28th Sept, Dr Ian Jones from the Mathematics Education Centre here, and Matt Wingfield from TAG Developments, will be giving a presentation on ACJ in the Library. E-mail me if you’d like to attend: c.f.g.shields [at] lboro.ac.uk .

Addendum: the session was captured using the ReVIEW (Echo 360) lecture capture system and can be viewed at http://tinyurl.com/lboroACJpresentation .

OMR – the Cinderella service

It’s easy to dismiss OMR (or Optical Mark Recognition) as technologically outdated – surely everything can be done online now?

But don’t dismiss OMR as it can still be very useful. In the case of CAA (Computer Assisted Assessment), it’s a relatively easy, low-cost approach to automating the process of marking large numbers of exam papers. For this reason, it continues to be a popular approach here at Loughborough.

My colleague Tim Baseley who supports OMR on campus has just passed on the following information about use of OMR in Semester 2:

  • 51 exams were delivered via OMR.
  • This represents a 2% increase over Semester 2 2010.
  • The total number of individual candidate papers printed was 6,954.

From an ‘efficiency and effectiveness’ perspective, the saving in academic marking workload this makes possible is tremendous. For more on this aspect of OMR see my previous post on this subject.

Tim and colleagues are currently planning to move Module feedback forms to Remark OMR. This should improve scalability, flexibility and reporting whilst reducing overall costs and administration/processing time.

TurnItIn UK User Group

TurnItIn UK User Group, 3/2/11, Aston University

Bryan Dawson

 This is a six-monthly event for TurnItIn (TII) Administrators and power users.  The system continues to grow, with over ½ million submissions per month to the system in the last quarter of 2010.  The UK branch of iParadigms (the U.S. company that built TurnItIn) now has responsibility for all of the world except the Americas, and will probably change its name from ‘iParadigms Europe’.

 New developments for TurnItIn Originality Checking include:

  • Large documents no longer cause the system to seize up. This has never been a problem for us, even with dissertations.
  • We were promised better Quality Assurance for new releases – the revised user interface introduced in the autumn is only now fully functional.
  • A Moodle 2.0-compatible plugin is promised for the middle of 2011.
  • Work is under way on allowing multiple markers of an assignment. This would allow for double-marking of coursework, which explains why double-blind anonymous marking is not currently under development. There were many requests for this feature.

It was confirmed that TII submissions will still be visible to submitting institutions even after 5 years (we started using TII in 2005).

 The GradeMark online marking system received much emphasis at the meeting.  It is only just starting to be used by Lboro tutors. 

  • It is now possible to import and export rubrics (marking criteria), so rubrics can be deployed over several assignments, and the same rubric can be used by several tutors.
  • An ‘e-rater’ will be added to provide spelling and grammar checking for online submissions.  However, TurnItIn is an American product, so it is not clear whether UK English spelling, punctuation, grammar and usage will be checked.
  • A translation facility can be introduced to check for the possibility that a foreign-language source has been translated into English and used in a submission.
  • Feedback files can be uploaded to GradeMark for retrieval by the student.  These could be audio or video files e.g. a lecture-captured demo of a worked answer.

 A Plagiarism Reference Tariff was introduced by Jo Badge from Leicester Uni.  This aims to provide consistency across Academic Misconduct cases by applying a formula to establish the ‘severity’ of the case.  Details are available from http://www.plagiarismadvice.org/

 An online marking Case Study was presented by Cath Ellis of Huddersfield University.  She noted that the OU uses GradeMark and in the NSS it gets 100% satisfaction scores for feedback and marking.

At Huddersfield there was a strong steer from HoDs to use online marking.  The option for paper marking has always been kept open, but its use is now in the minority.  Online marking is the default, and tutors have to opt IN to paper marking. 

After a quite short learning curve (hours, not days), TurnItIn and GradeMark had been integrated into the workflow of processing submitted student coursework. Because marking is a fairly frequent activity, the new skills were kept refreshed and didn’t need to be re-learnt every semester. Using the GradeMark online marking tool was found to be:

  • Quicker, with marking throughput anything up to twice as fast as paper methods;
  • Better, because by using stock comments for common errors, the tutor was concentrating on what was being said, not how it was being said;
  • Easier, because you didn’t have to keep track of lots of pieces of paper; and
  • Safer because the marked-up coursework was automatically archived on a server.

Students liked the fact that they could submit from home, and didn’t have to make the trip to the campus just to hand in a piece of paper.  They appreciated the private and unhurried view of the feedback that had been provided, and felt they got more detailed feedback than before.

Admin staff set up duplicate copies of coursework to allow double-marking, but anonymous marking is not used.