Developing consistent marking and feedback in Learn

Background

More and more Schools within Loughborough University are looking at ways to develop consistency in marking and feedback, and they are moving towards online submission to support this. As a result, colleagues are looking at ways to use rubrics or grid marking schemes to give feedback electronically in an efficient and timely manner.

Phillip Dawson (2017) reported that:

“Rubrics can support development of consistency in marking and feedback; speed up giving feedback to save time for feed-forward to students; and can additionally be used pre-assessment with students to enable a framework for self-assessment prior to submission.” (pp. 347-360)

There are several types of rubrics and marking guides available within Learn, and these take on different forms within different activities. Each has different requirements and results. This can make transitioning to online marking daunting and, as we found recently, it requires a carefully thought-out approach.

Loughborough Design School recently made the move to online submission and online marking using the Learn Assignment Activity. Following this decision, we ran several workshops to assist staff with making the transition, including a dedicated rubric workshop. This blog post explores and explains the issues we encountered in the School – and that we are facing more widely across the University – and offers some options for addressing them.

What is the challenge?

Staff are already using hard-copy feedback sheets that serve the same aims as a rubric (i.e. consistency of marking and feedback), but many of these existing rubrics do not transition neatly into the Learn Assignment Activity and require a blend of features.

For example, a common feature of rubrics is that, as well as providing a set of levels for each criterion, they often have a space to record a specific mark, e.g. 9 out of 10 for a particular criterion. This level of granularity can be the difference between a first-class honours degree and a 2:1 and, crucially, it allows students to see where they can gain marks. Rubrics in the Learn Assignment Activity do not allow for this type of granularity – you can assign a range to a level, e.g. 60-70%, but not a specific mark within that range.

What’s the difference between the Learn Assignment Activity and Turnitin Feedback Studio rubrics?

What’s the difference between a rubric and a marking guide?

A rubric aligns marking criteria with fixed levels of attainment. For instance, a rubric may feature several criteria with attainment levels stretching from Fail, Poor and Average through to Good and Excellent, and within these levels a description informs the student (and tutor) of where they have been awarded and lost marks.

A marking guide is more flexible and simpler in what it offers. You still have criteria, but instead of levels, the tutor is expected to give a qualitative summary of how they feel the student performed and a mark for each criterion.

 

For both the rubric and the marking guide, the criteria can be weighted to reflect each component's importance in the overall mark.
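As a rough illustration of how weighted criteria combine into an overall mark, here is a minimal sketch in Python (the criterion names, weights and marks are made up, and this is not how Learn calculates grades internally):

    # Minimal sketch: combining weighted rubric/marking-guide criteria into one mark.
    # Criterion names, weights and marks are illustrative only.
    criteria = {
        # name: (weight, mark awarded out of 100 for that criterion)
        "Analysis":    (0.40, 65),
        "Structure":   (0.30, 72),
        "Referencing": (0.30, 58),
    }

    overall = sum(weight * mark for weight, mark in criteria.values())
    print(f"Overall mark: {overall:.1f}%")  # 0.4*65 + 0.3*72 + 0.3*58 = 65.0%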

Moving forward

The Centre for Academic, Professional and Organisational Development plan to offer a new Rubric workshop in Semester 2 of the 2018/19 academic year. The aim of this workshop will be to provide clear guidance on the benefits, use and technical considerations behind rubrics and marking guides. Existing workshops can be found on the following page: https://www.lboro.ac.uk/services/cap/courses-workshops/

We’ll continue to work with Schools and support academics on a one-to-one basis where requested. We recognise that every case is different and recommend getting in touch with the Technology Enhanced Learning Officer and Academic Practice Developer within your School for further support.

Discussions will also continue with Turnitin.co.uk and the Moodle (the system behind Learn) community so we can stay ahead of changes and new rubric features as they arrive.

References

Dawson, P. (2017) ‘Assessment rubrics: towards clearer and more replicable design, research and practice’, Assessment & Evaluation in Higher Education, 42(3), pp. 347-360.

 

 

Your new and improved Co-Tutor is coming – September 2014.

From 2014/15 a new and improved Co-Tutor will be rolled out across Loughborough. CEDE would like to celebrate and thank the many colleagues that have been involved along the way and will be hosting a launch event on the 24th September 2014 at 12 noon.

Register for the event here: http://www.eventbrite.co.uk/e/co-tutor-launch-event-tickets-5255206464

For the past year Co-Tutor has been undergoing development as part of a HEIF-funded commercialisation project. Phase I of the project has been such a success that JISC and HEIF are continuing to fund Phase II of further developments.

Co-Tutor’s new additional features include:

  • A refreshed, responsive new interface, optimised for tablets.
  • You can cc co-tutorrecord@lboro.ac.uk to add comments to Co-Tutor straight from Outlook.
  • Access more information within a student’s record, e.g. timetables and Learn activity.

The Co-Tutor team would like to welcome Charles Shields, Jenny Narborough and Sasha Dosanjh (E-Learning team) to Loughborough’s support team.  They will be the first point of contact for enquiries and training.  Allison Dunbobbin (Careers and Employability Centre) will also be on hand to support the placement management features within Co-Tutor.

For more information on new features and upcoming events and further announcements see http://co-tutor.lboro.ac.uk/new/

Hanging Out

A colleague and I took part in a Google Hangout session yesterday with six other people from other institutions. Google Hangout is a video chat tool. I was a bit wary about using this at first because a) I’ve never used it before, and b) there were 10 people expected at this virtual meeting, so I wasn’t sure how this would be controlled.

In the past, when I have taken part in virtual gatherings, it has gone one of two ways: either one or two members predominantly speak for the duration of the session, or everyone talks over one another and it turns into a bit of a mess. Fortunately, this session worked really well. The sound quality was excellent, as was the video. We didn’t experience any latency issues with sound (despite the fact we were using WiFi), whereas this has been an issue with specific web conferencing tools we have used in the past, such as Blackboard Collaborate.

Everyone got an opportunity to speak, and there was a common understanding of virtual meeting protocols which kept things in order (this may be because we are all in a Learning Technologist or similar kind of role). We quickly realised the importance of muting your microphone when you are not talking (and unmuting when you are); otherwise you are distracted by the video constantly switching whenever sound is picked up from another mic.

Aside from conducting virtual meetings via Google Hangouts, and using it privately with friends and family, you can join other publicly accessible Hangouts ranging from Language Practice Hangouts to paid-for hangouts offering live cooking classes!

Which video chat facility do you like to use with your colleagues and/or students? Please let me know below.


Potential problem using TurnItIn's iPad App

Rob Howe from Northampton University reports a problem with the TurnItIn iPad app which results in loss of data. The full description is in this blog posting.

Essentially, if the iPad user changes their iPad profile during a marking session, they will lose the data already marked, because the iPad thinks it belongs to somebody else.  The data cannot be recovered.

Rob’s advice – to make sure you sync the data often, particularly at the start of a session – seems sound.

 

Miracasting from Android devices

In parallel with the Tablets in Teaching project, we have also been evaluating various Android devices as alternatives to Apple iPads.

With the advent of Android 4.2, it has been possible to wirelessly project an Android screen onto an HDMI display.  By this we mean that anything on the mobile device’s display is replicated on the remote display.

Whereas the Apple AirPlay solution requires access to an established network to function – with all the procedural problems that can entail – Miracasting sets up its own mini-network between the mobile device and the receiving dongle plugged into the remote display. This is less hassle than AirPlay and is independent of the podium PC in a lecture theatre – modern podiums have an HDMI input socket which can be used if you don’t have direct access to the display’s connections.

We tested two source devices and two receivers.  Both sources worked well with both displays and were able to display wirelessly whilst running live BBC iPlayer over WiFi.  However, the phone’s SIMs had to be turned off to force it to use WiFi, otherwise iPlayer slowed down to a crawl.

We used the second source device to attempt to hijack the remote display. In no case did this succeed, so even if students have Miracast-equipped devices in the lecture, they may be able to see the receiver but not connect to it.

We used a tablet and a smartphone, with each of the two receivers.

                     Google Nexus 7          Butterfly 920
Android version      4.3                     4.2.1
Connectivity         WiFi, Bluetooth         WiFi, Bluetooth, 2 SIMs
Native display       7”, 1920 x 1080 HD      5”, 1280 x 720

Remote display 1     22” Iiyama HD widescreen monitor
Remote display 2     Podium HDMI connector to HD data projector
Remote display 3     42” Brockington study pod HDTV

 

The two receivers were a Phone2tv dongle (eBay, £32) and a Netgear Push2tv dongle (Amazon, £60).  In both cases a USB connection is needed to provide power to the receiver and both had extension cables so that the dongle did not need to be physically attached to the display.

Of the two, the Phone2tv receiver was slightly quicker to set up but the Netgear was slightly better at buffering the incoming data stream, so played with fewer jerky interruptions.

Both devices carry audio as well as video.

[Image: podium_1]

 

In use, the HDMI and USB connections are made and the Wireless Display settings on the source device are used to initiate the connection. In the example shown, the Aux HDMI input has been selected to feed the graphics through to the data projector. The other sources – PC, Laptop, Visualiser and Blu-Ray player – are still available and can be selected as usual.

The wireless display link works up to at least 5 metres from the receiver, giving the presenter the freedom to move around and interact with the class.

 

TurnItIn shifts focus

The September TurnItIn UK User Group meeting was hosted by Leeds Uni this year.
It followed the usual format of a position statement by the host, an overview of the development roadmap and case studies from users.

There are big changes under way at TurnItIn, in response to a massive increase in the demand for their services, which has exposed weaknesses in their 10-year-old systems. Such is the pace of change that a rolling upgrade programme will see all aspects of the service improved by Spring 2014. Amongst other things, the performance issues that affected some of our users in the summer should be fixed.

The expectation is that by 2016 GradeMark will have taken over from Originality Checking as the main part of the TurnItIn package, so the focus of the software is shifting from Originality Checking towards online marking, without losing any text-matching functionality.

Leeds

It was good to note that Leeds’ policies on plagiarism are closely aligned with our new Code of Practice (and with what happens elsewhere, e.g. Bristol and Cranfield).

For example, their objectives are:
1) To ensure equal vigour in the detection and treatment of plagiarism across all subjects.
2) To provide equal support for students in referencing study skills across all subjects.

They have a standard Plagiarism Study Unit which all freshers take in their first semester as part of their tutorial activities.

Development Roadmap

As part of their upgrading effort, TurnItIn are recruiting development programmers in the UK, and we were asked to pass this on to anyone who may be interested in database development work in Newcastle.

  • The TurnItIn iPad app has been well-received and has been updated several times since the initial release so if you are using it, please check for updates.

By Christmas 2013:

  • Colour printing will be added to the Document Viewer
  • PowerPoint files will be accepted for Originality Checking
  • Submission to TurnItIn from Google Docs or Dropbox will be possible

Early 2014:

  • GradeMark gets criterion-based marking without the complexity of a full rubric

Spring 2014:

  • GradeMark will use overlays, which could be used for marking themes (e.g. ‘marks for methodology’, ‘marks for analysis’) or for individual tutors. Visibility of each layer can be controlled, so double-blind marking will be possible for the first time in any online marking tool.

 

The dates above carry the usual health warnings, of course!

The TurnItIn App for iPad

You may have heard that there is a new iPad app for using TurnItIn – including the GradeMark paperless marking tool – available for free from the App Store. This may be of interest to tutors who already have an iPad, and already use GradeMark. It probably isn’t a ‘killer App’ that by itself makes it worth rushing out and buying an iPad.
A key advantage of the app is that you can download the whole class’ assignments to the iPad, and mark them offline, re-syncing when you are back in WiFi range, whereas with the PC version you need to be online all the time.
Almost all of the functionality of the desktop version is available, and some iPad users may find this to be a convenient and quick way of getting marking done in circumstances where it may not otherwise be possible.
The TurnItIn app can be added to all of the Tablets in Teaching iPads, but it will need personal credentials setting up before it can be used.
There are no plans for an Android version.
Setting up such an app with the proper security is always going to be complex, but once set up, the app works very slickly.
Our early experiences indicate:

• The app is set up by default for the US version of TurnItIn, and the iPad setting for the app needs to be changed to TurnItInUK before doing anything else.
• Changes you make can be manually uploaded by re-syncing the iPad, or will automatically be sent if a WiFi connection is available.
• I preferred using the iPad to the iPad Mini because my touches were more accurate and the text was larger and easier to read at the default scaling (you wouldn’t want to have to adjust the display for each assignment you mark, so it’s important that the defaults work well)
• It took 6 ¾ minutes to download 39 essays onto the iPad, so with large cohorts, allow plenty of time for the download.
• If you select the ‘Unlink iPad from TurnItIn’ option, you not only log out of the system, but also delete all of the downloaded assignments. Useful if you are sharing an iPad (does anybody?) but a disaster otherwise.
• Screen rotation (portrait/landscape) works in the normal way. Many screens will re-size using stretch/pinch, but some don’t.

Accessing the submissions
There are two methods of accessing your class’ assignments:

Either

If you are already registered as a TurnItInUK user, you can log in with your email address and TurnItIn password. (Not sure if you’ve been registered already? Use the Retrieve Password link at http://www.submit.ac.uk. If you had a password, you are registered and can use the tool to set a new password. If you are not registered, it will tell you it has failed to find your details.)
Once logged in, you will be presented with a list of all your modules, from which you can pick the one with the assignment to be marked.

OR


If you have never been registered as a TurnItInUK user (and most tutors haven’t), you need to:
1. Log into Learn
2. Go to the module
3. Enter the Assignment activity
4. View any one of the submissions by clicking on its Originality Score
5. Once in the Document Viewer, look for a rectangular icon in the bottom left corner. Click the icon to get a 16-character access code for all of the student submissions for this assignment.
6. On the iPad, use this access code to display the submissions for this particular assignment.
7. Because the access code only works once, you’ll need to Sync all the submissions i.e. download them to the iPad, otherwise you’ll need a new access code when you resume marking. If an assignment has multiple markers, each marker will need to get their own access code, and sync the assignments that they have to mark.

Bryan Dawson and Farzana Khandia

Free tools for teaching – Doodle scheduling

(This post follows on from the Free tools for Teaching – name randomiser post.)

Here Radmehr Monfared talks about how he uses the free Doodle scheduling tool to organise lab sessions:

Have you ever set up lab sessions for students when there are many sessions but each student has to attend only one? You usually end up with one very busy session and a number of quiet ones.

Some lecturers force the numbers into balanced groups and deal with any complaints afterwards. However, I realised that if you give students a choice, they are usually free for more than one session, and I can then balance the lab load based on their availability.

Doodle Scheduling (http://www.doodle.com/) is exactly the tool for this sort of case, and it is free. I have been using it for many years for arranging meetings and scheduling personal events; however, last term I used it for balancing my lab sessions. Most students are familiar with this website and I had no problem collecting data and compiling my lab timetable.

Before using it, I checked with IT to see if there is any equivalent tool in the university, but there was none at the time.

This is how it works: you arrange your available lab times in the columns of a table on the web and email the link to students. Students add their name and tick the time slots they can attend. Then, based on availability, you distribute students equally across the lab sessions. It worked well for me last term.
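Once the Doodle responses are collected, the balancing step itself can even be automated. Here is a minimal sketch in Python of one way to do it (the student names, session labels and the simple greedy strategy are illustrative assumptions on my part, not part of Doodle):

    # Minimal sketch: balance lab groups from Doodle-style availability data.
    # Students with the fewest options are placed first, each into the
    # least-loaded session they can attend.
    availability = {
        "Alice": ["Mon 10:00", "Wed 14:00"],
        "Bola":  ["Mon 10:00"],
        "Chen":  ["Wed 14:00", "Fri 09:00"],
        "Dana":  ["Mon 10:00", "Fri 09:00"],
    }

    sessions = {"Mon 10:00": [], "Wed 14:00": [], "Fri 09:00": []}

    for student, slots in sorted(availability.items(), key=lambda kv: len(kv[1])):
        chosen = min(slots, key=lambda s: len(sessions[s]))
        sessions[chosen].append(student)

    for slot, group in sessions.items():
        print(slot, group)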

There are various other mechanisms that you could use for this purpose (including the Face-to-face activity in Learn / Moodle) but Doodle has the benefits of being simple, effective and familiar to many students.

Free tools for teaching – name randomiser

[Image: Name Randomiser screenshot]

Dr Radmehr Monfared is a Lecturer in Intelligent Automation within the Wolfson School. In conversation with a Teaching Centre colleague, Radmehr mentioned a couple of free online tools he has been using to support his teaching. Here is Radmehr describing the first scenario:

It is always a dilemma how to choose a student to answer a question while maintaining fairness and equal opportunity for everyone, and without making the student nervous.

I came across the “name randomiser” idea many years ago. The idea is to rotate through the students’ names on the screen and randomly stop at one name.

This has proved to be an ice-breaker and a fun activity, and the chosen person has to answer the question. Students certainly like it.

Back in the days when our lecture rooms didn’t have internet access, I used to use a simple VB program that did the job for me, but filling in the student list was a problem.
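The core of such a program is tiny. As a rough sketch of the same idea in Python (the class list below is made up; in practice you would paste in your own names):

    # Minimal sketch of the "name randomiser" idea: pick one student at random.
    import random

    students = ["Aisha", "Ben", "Carlos", "Deepa", "Emma"]  # illustrative list
    print("Answering the next question:", random.choice(students))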

But these days, I use the following website, which is fun (with lots of interesting noises) and takes the pressure off the students.

http://www.classtools.net/education-games-php/fruit_machine – The advantage is that I can copy and paste the student list from my Excel sheet very easily.

Another one that I particularly like and have used is http://primaryschoolict.com/random-name-selector/. This one also allows you to run a timer for students to answer the question.

[The latter is the tool shown in the screenshot above and it’s interesting to note that it was intended for primary school use but can be useful even in HE!]

Evaluating the quality of web resources

TurnItIn have released two new tools to support the evaluation of web-based resources: a review of the sources actually used, and an interactive tool for the evaluation of resources. These should be useful as tutorial-level discussion pieces and lend some objectivity to assessing the worth of the Web.

“Open access to this new interactive rubric helps educators teach students proper research and source evaluation.

Turnitin worked with educators to develop The Source Educational Evaluation Rubric (SEER), an interactive rubric to analyze and grade the academic quality of Internet sources used by students in their writing. Instructors and students who use SEER can quickly evaluate a website and arrive at a single score based on five criteria scaled to credibility: Authority, Educational Value, Intent, Originality, and Quality.

“Recent research shows that students rely heavily on websites of questionable academic value,” said Jason Chu, senior education manager for Turnitin. “We believe that widespread usage of SEER will help educators teach students the importance of using quality resources in their research.”

This interactive rubric, when opened in Adobe Reader, allows you to adjust criteria weight and simply click to score each criterion with a rubric score and percentage automatically calculated.”
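As an illustration of the kind of calculation such a weighted rubric performs, here is a minimal sketch in Python (the weights and the 0-5 scores below are invented; the real SEER document defines its own scales and weighting):

    # Minimal sketch: weighted percentage across the five SEER criteria.
    # Weights and 0-5 scores are illustrative only.
    weights = {"Authority": 1, "Educational Value": 1, "Intent": 1,
               "Originality": 1, "Quality": 1}
    scores  = {"Authority": 4, "Educational Value": 3, "Intent": 5,
               "Originality": 2, "Quality": 4}

    max_score = 5
    total_weight = sum(weights.values())
    percentage = sum(weights[c] * scores[c] for c in weights) / (total_weight * max_score) * 100
    print(f"Source evaluation score: {percentage:.0f}%")  # 72% with the numbers above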

1) What’s wrong with Wikipedia?

2) The Source Educational Evaluation Rubric (SEER tool)