Developing consistent marking and feedback in Learn

Background

A growing number of Schools within Loughborough University are looking at ways to develop consistency in marking and feedback, and many are moving towards online submission to support this. As a result, colleagues are exploring how rubrics and grid marking schemes can be used to give feedback electronically in an efficient and timely manner.

Phillip Dawson (2017) reported that:

“Rubrics can support development of consistency in marking and feedback; speed up giving feedback to save time for feed-forward to students; and can additionally be used pre-assessment with students to enable a framework for self-assessment prior to submission.” (pp. 347-360)

There are several types of rubrics and marking guides available within Learn and these take on different forms within different activities. Each has different requirements and results. This can make the process of transitioning to online marking a daunting process and, as we found recently, requires a carefully thought out approach.

Loughborough Design School recently made the move to online submission and online marking using the Learn Assignment Activity. Following this decision, we ran several workshops to assist staff with the transition, including a dedicated rubric workshop. This blog post explores the issues we encountered in the School, and that we are facing more widely across the University, and offers some options for addressing them.

What is the challenge?

Staff are already using hard-copy versions of feedback sheets that replicate the aims of having a rubric (i.e. consistency of marking and feedback), but many of these existing rubrics do not neatly transition into the Learn Assignment Activity and require a blend of features.

For example, a common feature of rubrics is that, as well as providing a set of levels for each criterion, they often have a space to record a specific mark, e.g. 9 out of 10 for a particular criterion. This level of granularity can be the difference between a 1st class honours degree and a 2:1 and, crucially, it shows students where they can gain marks. Rubrics in the Learn Assignment Activity do not allow this granularity – you can assign a range to a level, e.g. 60-70%, but not a specific mark within that range.
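The granularity gap can be illustrated with a short sketch (the classification bands here are hypothetical examples, not Learn defaults): once a mark is reduced to a band, two quite different marks become indistinguishable.

```python
def band_for_mark(mark):
    """Map a specific percentage mark to a rubric-style band.
    Bands are illustrative degree classifications, not Learn's own."""
    bands = [(70, "1st"), (60, "2:1"), (50, "2:2"), (40, "3rd"), (0, "Fail")]
    for threshold, label in bands:
        if mark >= threshold:
            return label

# A band-only rubric cannot distinguish 62% from 68%:
print(band_for_mark(62), band_for_mark(68))  # both fall in the 2:1 band
```

This is exactly the information a hard-copy feedback sheet with a mark box preserves and a band-only rubric loses.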

What’s the difference between the Learn Assignment Activity and Turnitin Feedback Studio rubrics?

What’s the difference between a rubric and a marking guide?

A rubric aligns marking criteria with fixed levels of attainment. For instance, a rubric may feature several criteria with attainment levels stretching from Fail and Poor through Average and Good to Excellent, and within these levels a description will inform the student (and tutor) of where they have gained and lost marks:

A marking guide is more flexible and simpler in what it offers. You still have criteria, but instead of levels the tutor gives a qualitative summary of how they feel the student performed, together with a mark for each criterion:

 

For both the rubric and the marking guide, the criteria can be weighted to reflect each component's importance in the overall mark.
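As a sketch of how weighting works (the criteria names, marks and weights below are invented for illustration), the overall mark is simply a weighted average of the per-criterion marks:

```python
def overall_mark(criteria):
    """Combine per-criterion marks using their weightings.
    criteria: list of (mark, weight) pairs; weights need not sum to 100,
    as the total weight is used as the divisor."""
    total_weight = sum(weight for _, weight in criteria)
    return sum(mark * weight for mark, weight in criteria) / total_weight

# e.g. Analysis 68% (weight 50), Presentation 55% (weight 30),
# Referencing 72% (weight 20)
print(overall_mark([(68, 50), (55, 30), (72, 20)]))  # prints 64.9
```

Both the rubric and the marking guide in Learn perform this calculation automatically once weights are set.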

Moving forward

The Centre for Academic, Professional and Organisational Development plans to offer a new rubric workshop in Semester 2 of the 2018/19 academic year. The aim of this workshop will be to provide clear guidance on the benefits, use and technical considerations behind rubrics and marking guides. Existing workshops can be found on the following page: https://www.lboro.ac.uk/services/cap/courses-workshops/

We’ll continue to work with Schools and support academics on a one-to-one basis where requested. We recognise that every case is different and recommend getting in touch with the Technology Enhanced Learning Officer and Academic Practice Developer within your School for further support.

Discussions will also continue with Turnitin.co.uk and the Moodle (the system behind Learn) community so we can stay ahead of changes and new rubric features as they arrive.

References

Dawson, P. (2017) Assessment rubrics: towards clearer and more replicable design, research and practice, Assessment & Evaluation in Higher Education, 42:3, pp. 347-360.


Supporting students’ learning using STACK: Application on a Finance module

Kai-Hong Tee (School of Business and Economics) has been using the STACK question type in Learn to generate mathematical questions. In the post below, Kai explains how it works and why it benefits students. 

Background
I currently teach “Financial Management”, a core module for our second-year students on the “Accounting and Financial Management” (AFM) and “Banking and Financial Management” (BFM) degree programmes, which are currently among the best in the UK – the latest Guardian ranking places them second after the University of Bath. As the AFM degree is accredited by a professional accounting body, we follow strict criteria when teaching and assessing students. However, as this module is also open to students whose degree programmes are less mathematical in nature, such as Marketing and International Business, it is important that “Financial Management” provides sufficient support for a wide range of students of differing abilities. This prompted me to re-think and re-design an existing self-assessment exercise already available to students on Learn. Figure 1 shows the questions and answers I have been using to help students on Learn. This assessment is based on one topic (equity valuation) in which most students don't feel very confident, based on my observation and years of experience teaching “Financial Management”.

Figure 1. 

 

As you can see from Figure 1, this self-assessment exercise consists of a limited number of questions. The aim, however, is to encourage students to gain a better understanding of the topic by practising the problems. For weaker students, though, having just 4 or 5 questions may not be enough to develop the skills required for this topic.

Why use the STACK question type?
To supply a larger number of questions efficiently, giving students more chances to practise and acquire the skills, I adopted the STACK question type in the Learn Quiz activity, which enables mathematical questions to be generated automatically. STACK is already used in several Schools, including in the subject areas of engineering and mathematics. Figure 2 shows a snapshot of a question set up using STACK on Learn. Figure 3 shows that, using STACK, a similar question can be generated with different information so that it can be attempted again. This gives students additional practice.

Figure 2.

Figure 3. 

I’ve applied this idea to “Financial Management”. Figure 4 is an example of a question. Figure 5 shows that students receive feedback on the answers they provide.

Figure 4.

Figure 5. 

New questions are generated automatically and can be viewed by clicking on “start a new preview” (see Figure 3); implementing STACK therefore involves some programming. Figure 6 gives a snapshot of the coding screen. Here, every question is treated as a “model”, and the “numbers” in the question as “variables”, since they will differ each time a student starts a new preview. Figure 6 shows the range of variables (inputs) you set for a question, which together define the “new” question produced at each preview.

Figure 6.
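STACK questions are authored using Maxima syntax, but the underlying idea – randomised variables, a computed answer and automatic marking – can be sketched in Python. The question text, value ranges and marking tolerance below are hypothetical, chosen to mirror the equity valuation (dividend growth) topic of the module:

```python
import random

def generate_question(seed=None):
    """Generate a randomised equity-valuation question using the
    constant dividend growth model: price = D1 / (r - g).
    Each seed produces a different 'new preview' of the same model."""
    rng = random.Random(seed)
    dividend = round(rng.uniform(1.0, 5.0), 2)              # next year's dividend D1
    growth = round(rng.uniform(0.01, 0.05), 3)              # constant growth rate g
    required = round(growth + rng.uniform(0.03, 0.08), 3)   # required return r, kept above g
    answer = dividend / (required - growth)
    text = (f"A share will pay a dividend of £{dividend:.2f} next year, "
            f"growing at {growth:.1%} forever. If the required return is "
            f"{required:.1%}, what is the share worth today?")
    return text, answer

def check_answer(student, correct, tolerance=0.01):
    """Mark the student's answer, allowing a small rounding tolerance."""
    return abs(student - correct) <= tolerance * correct
```

In STACK itself the variable ranges are declared in the question variables field and the marking tolerance in the answer test, but the structure of model, variables and automated feedback is the same.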

What’s next?
One issue I face here is that a “new” question changes only the numbers, not the text, so students may become familiar with the questions after a few rounds of practice if there are too few of them. To increase effectiveness, more questions can be included to reduce the bias of “getting right answers because of familiarity”. Questions of different levels of difficulty can then be re-shuffled for each new attempt the students make.

An area worth developing further is to incorporate adaptive learning pathways. I would only need, for example, 5 questions of different levels of difficulty, and could then create pathways based on the answers students give: a student's responses would indicate which level of difficulty it is reasonable to assess them at next. It may be possible to develop STACK so that students are led to attempt questions at an appropriate level, so that their progress is monitored and matched, and the right skills are developed through sufficient practice of suitable questions. This will be an area for future investigation. If workable, it should better support weaker students: they could practise more questions and become more skilful at their own level of understanding, rather than simply attempting whatever questions are available without an adequate sense of where they stand.
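One simple way to realise such a pathway, sketched here purely as an illustration rather than as an existing STACK feature, is a staircase rule: step the difficulty up after a correct answer and down after an incorrect one. The level bounds are assumptions:

```python
def next_difficulty(current, correct, min_level=1, max_level=5):
    """Staircase rule for adaptive question selection: move up one
    difficulty level after a correct answer, down one after an
    incorrect answer, clamped to the available range of levels."""
    step = 1 if correct else -1
    return max(min_level, min(max_level, current + step))
```

Starting every student at a middle level, a few attempts are enough for this rule to settle near the difficulty that matches a student's current understanding.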

Further information about STACK can be found on the following page:
 http://www.stack.ed.ac.uk 

Tutorials on STACK are available on:  https://stack2.maths.ed.ac.uk/demo/question/type/stack/doc/doc.php/Authoring/Authoring_quick_start.md 

The STACK project team also recently won a HEA CATE award to disseminate STACK across other institutions: https://www.heacademy.ac.uk/person/open-university-stack 

EAT – it’s good for you!

Loughborough University and the Higher Education Academy Community of Practice on Assessment and Feedback are pleased to offer a one-day event focusing on developing and implementing a self-regulatory approach to assessment. The event is taking place on Wednesday 20th September 2017 in the Stuart Mason Building and is being facilitated by the Centre for Academic Practice.

The day will be split into two parts:

Developing a Self-Regulatory Approach to Assessment: The EAT Framework (10.30 – 12.30, SMB 0.14)
Professor Carol Evans, University of Southampton

Assessment practice is a key driver in promoting high impact pedagogies and student engagement in learning. A step change is needed to advance how higher education institutions implement assessment practice to enhance student engagement and to maximise student learning outcomes. The session will describe how the EAT self-regulatory framework, a holistic inclusive assessment feedback framework, has evolved and how it can be used to support student and staff development of assessment literacy, feedback and design in partnership with each other as part of sustainable assessment feedback practice. Core to the development of this approach is an understanding of cognitive, metacognitive, and emotional regulation of learning informed by the Personal Learning Styles Pedagogy Framework (Waring & Evans, 2015).

Lunch will be provided from 12.30 – 1.30

Implementing EAT: Key lessons in scaling-up (1.30 – 3.30, SMB 0.02)
Professor Carol Evans, University of Southampton

This session is designed for Associate Deans and all those responsible for leading assessment and feedback practice. The session will explore key considerations in scaling-up assessment and feedback practices, mindful of institutional and faculty priorities and specific disciplinary needs, with the intention of identifying strategies to support key priorities as an integral part of ‘the fabric of things’ within the university. The potential of being a core member of the HEA Assessment and Feedback online Community of Practice will also be highlighted.

You can book onto this event on My.HR by following this link: https://myhr.lboro.ac.uk/tlive_ess/ess/index.html#/summary/careerdev/scheduledlearningactivity/474418AXK5

CAP Forum: Research-informed curriculum design: successes and challenges

Our most recent CAP Forum focused on research-informed curriculum design. As a recent Research-informed Teaching Award winner, Dr Line Nyhagen took us through some of her wonderful successes and some of the challenges she has faced in four specific innovative teaching practices which were designed to enhance student engagement.

  • The first was a field visit to a local mosque, designed to allow her students to understand ‘lived religion’; she emphasised the importance of making the pedagogic intention of any field visit clear. There had previously been no field visits in the Social Sciences Department, so she sought advice from the Geography Department on the basics and, after the first year of running the trip, reflected on what went well and what she could improve. The trip was very successful; feedback from participating students was overwhelmingly positive, and a post on the department newsfeed spoke of its success. The main challenge she faced, however, was that attendance on the trip was quite low. The following year, Line took on board feedback on that issue, added organised transport and included an assessment element related to the trip worth 10%, which dramatically increased attendance.
  • The second example discussed was a ‘Coursework Topic Approval Forum’, used instead of giving students a list of essays to select from. Students used a forum on Learn to get approval and feedback for their coursework title, which could be about any topic on the module that interested them. This fostered the sharing of ideas and allowed transparent formative feedback to be given to all students. Although it had many successes, it generated quite a lot of additional work for Line and made a small proportion of students uncomfortable. Upon reflection, this year Line has chosen both to produce a list of essay titles and to allow students to choose their own titles if they wish; nonetheless, they must use the new general coursework forum for any questions related to coursework, so that formative feedback continues to be shared among all students. Much of the discussion afterwards focused on this area, suggesting ideas such as having the group as a whole come up with the list of questions, and querying why the forum was online rather than run in person in a session – an approach it was agreed would also work.
  • Line also spoke about ‘Memory Work’ as a method to teach gender and other identities, a method she has used in her own research. It encouraged students to see themselves as both the researcher and the research subject, and allowed them to feel ownership of the teaching material because they had generated it themselves. This in turn increased student engagement. The topic also generated lots of questions and discussion about how the technique could be applied to teaching in other areas, for example as an aid to reflecting on group assignments.
  • The final topic discussed was her ‘In-class Policy Awareness Event’, a new technique she used this year to increase student engagement. She did this by finding topics directly relevant to her students, and this year chose sexual harassment policy, given the recent focus of the NUS on the topic and the fact that it was one of her students’ dissertation topics last year. She took the students through the University’s Zero Tolerance policy and conducted in-class research using a quick SurveyMonkey questionnaire, with results immediately available in the classroom. She also asked her students to come up with campaign ideas and proposals for increasing awareness, which had been identified as a problem. As an unintended consequence of this session, Line was able to take these suggestions to the Athena SWAN Team in her School, which she leads. She has also shared the class findings and policy proposals with the Director of Student Services.

If you have any questions for Line about her experiences please feel free to contact her at l.nyhagen@lboro.ac.uk or take a look at her twitter at @Line_Nyhagen. Alternatively, if you have any ideas of topics you would like to deliver on or hear about for future CAP Forums, please let us know by emailing Dr Glynis Perkin at G.Perkin@lboro.ac.uk or take a look at our Twitter at @LboroCAP.

 

Further Information:

The department’s newsfeed about the mosque visit:

http://www.lboro.ac.uk/departments/socialsciences/news-events/2017/leicester-central-mosque-march-2017.html

A blog post related to Dr Line Nyhagen’s research:

http://www.lboro.ac.uk/departments/socialsciences/news-events/2017/leicester-central-mosque-march-2017.html

Dr Line Nyhagen’s staff webpage:

http://www.lboro.ac.uk/departments/socialsciences/staff/line-nyhagen/

Bridging the Feedback Gap

It is a common occurrence to hear staff express concerns about how feedback is used, but it’s often unclear what the expectations around feedback are for both students and staff.

Simon Martin, Department of Materials (AACME), recently conducted a survey aimed at establishing just how much student and staff attitudes to feedback differ, and how these gaps might be bridged. With help from the Materials Programme President, Alex Marrs, a short on-line survey was sent out to students within the Department. Materials staff were invited to take part in an identical survey.
Concerns and issues experienced by staff and students surrounding assessment feedback showed many similarities and a few differences, giving potential clues to ways of improving the effectiveness of feedback.

The results of the survey were shared with School staff at a recent lunchtime Learning and Teaching workshop aimed at finding ways to make feedback more relevant, effective and meaningful for students whilst also making it manageable and sustainable for academics to deliver.

AACME’s regular L&T workshops focus on considering, challenging and developing practice.

If you wish to know more about the survey results, methodology and indicated outcomes Simon Martin is happy to be contacted directly (s.j.martin@lboro.ac.uk) for further information.

Feedback practice was also the focus of a staff/student Teaching Innovation Award last year in SSEHS. The final report of Harry Lane, Emma Giles, Dr Emma Haycraft and Dr Hilary McDermott’s project ‘Developing a common language: Enhancing communication and feedback’ is available on the 2015 awards section of the CAP website (http://www.lboro.ac.uk/services/cap/procedures-schemes/teaching-awards/teaching-innovation-awards/)

Quick reference guide for staff on feedback

Focus on feedback

In this last blog post in ‘Focus On… Feedback’ month, Dr Valerie Pinfield, who is a lecturer in Chemical Engineering, shares her thinking on giving feedback to students.

I wanted to produce a quick-reference guide for staff to check the feedback we are giving to students and ensure that it has the effect of improving performance. I based the mnemonic on articles by Bright (2010) and Nicol and Macfarlane-Dick (2006) and on a resource by the University of Reading (2015). These articles do not provide a simple, easily accessed summary of what feedback should look like, so I compiled a list of keywords to construct the mnemonic below. Bear in mind that the resource was intended for staff in chemical engineering, so one of the elements is “technical”, which may not apply in other subject areas.

ASPECTS reference card

 

Perhaps this easy guide to feedback could be of use in your own department? Any feedback on it will be welcome.

Bright, K (2010) Providing individual written feedback on formative and summative assessments. Higher Education Academy, UK Centre for Legal Education. Available at http://78.158.56.101/archive/law/resources/assessment-and-feedback/effectivefeedback/index.html Accessed on 20/02/2015.

Nicol, D and Macfarlane-Dick, D (2006) Rethinking formative assessment in HE: a theoretical model and seven principles of good feedback practice, Higher Education Academy. Available at: http://www-new1.heacademy.ac.uk/assets/documents/assessment/web0015_rethinking_formative_assessment_in_he.pdf Accessed on 20/02/2015.

University of Reading (2015) Engage in Feedback, including Feedback Audit Tool. Available at http://www.reading.ac.uk/internal/engageinfeedback/efb-Home.aspx Accessed on 20/02/2015.

QuickMark(ing)

There is accumulating evidence that on-line marking and feedback can be more effective and efficient than traditional paper-based methods. Using technology to deliver feedback can have several benefits:

  • It allows us to provide more feedback in less time through the use of repeated comments;
  • It helps provide more detailed feedback through the use of in-text annotations and general comments;
  • It can help develop a stronger link between marking criteria, feedback and the grade through the additional use of a rubric.

What is GradeMark?

GradeMark is an on-line marking system which is part of Turnitin (often used to check originality), and can be accessed via Learn.

Helping students engage with feedback

A common concern expressed by staff is that students do not make full use of the feedback provided.

Given the huge investment of time and energy in the assessment process by staff and students, this does seem to be an area worthy of further consideration.

Research indicates that staff and student views about feedback do not always coincide: staff believe that the feedback they provide will aid student learning, yet students state that they neither understand the feedback nor know how to seek support to improve. This situation is compounded by students being unsure of their role within the assessment process and being unprepared to receive and act on feedback.

So what can be done?

Feedback Research – CEDE

Three years ago, a number of Engineering Schools approached the Centre for Engineering and Design Education (CEDE), requesting the Centre’s help to unpick, from the students’ perspective, the assessment and feedback issues being highlighted by the National Student Survey (NSS). Their goal was to identify what the Schools could do strategically to further enhance the quality of the feedback given to their students. This research has led to two follow-on studies. Together, the three studies explored a number of areas, including but not limited to:

  1. Undergraduate and taught post graduate student expectations of assessment and feedback and how these expectations may differ
  2. How students use the feedback they receive
  3. The factors that impede student use of feedback

This work, although based on research undertaken in the Engineering Schools at Loughborough University, has findings and outputs that may be of relevance to staff in Schools across campus.

A feedback digest was produced from the findings of the research and circulated to teaching staff in the participating Schools. This resource provides advice on creating effective feedback and contains annotated examples of feedback that, from the student perspective, meet, exceed or fall below their expectations. The resource may be found at http://eden-share.lboro.ac.uk/id/item/59/; please log in first with your University Login at http://eden-share.lboro.ac.uk/ before clicking the preceding link.

Findings from the research have been presented at the HEA STEM Annual Learning and Teaching Conference 2013: Where practice and pedagogy meet which was hosted by the University of Birmingham from 17th to 18th April 2013. The paper is available here.  The most recent findings will be presented at the INTED2015 9th International Technology, Education and Development Conference which will take place in Madrid from the 2nd – 4th March 2015.

Key findings from the studies include:

  1. The NSS appears to be the first real opportunity for students to consider their overall feedback experience.
  2. Students recognise a variety of forms of feedback, not just individual written feedback.
  3. The provision of feedback in more than one form helps to raise the perceived standard of the feedback quality and has the potential to delight students.
  4. Postgraduate students look for the same things in feedback as undergraduates; however, they expect more from their feedback, with detail being of key importance.
  5. Students do not always know how to use their feedback to feed forward effectively.
  6. The environment within which students receive their feedback is critical. In cultivating a positive feedback environment, consideration needs to be given to how (setting, timing, individual copies) feedback is returned to students.