SALT Case Study: Self-Paced Polling

Page history last edited by Matthew Allen 9 years, 2 months ago

Academic: Hayley Harris - School of Management
Librarians:
SALT Team: Stuart Henderson
 
The Context

This exercise is in the Financial & Management Accounting module for MSc Management students. Due to the varying levels of English language skills within the cohort, some students have difficulty reading and processing language at the pace of a lecture. This was an important consideration when choosing how to run the mid-term assessment.

Whilst this module is designed as a basic introduction, students' previous experience ranges from none whatsoever through to those who already hold an Accounting & Finance degree, leading to a great disparity in abilities. There was a desire to offset some of the linguistic disadvantages faced by overseas students, as well as to speed up marking and results analysis for such a large group.

The assessment had always been paper-based, taking the format of 15 questions with a blank space for students to enter calculations and answers. These had been completed in invigilated sessions held under exam conditions.

 

The New Approach

The module had been assessed through a mid-term in-class test (30%) and an unseen exam in January (70%). This exercise replaced the previous version of the in-class test.

The original idea was to use TurningPoint to write one set of questions and randomise them so that students could sit the exam in class. This developed into using clickers: the questions would be shown via PowerPoint, and students would input their answers as each slide was displayed. The Business School has the advantage of being able to use advanced clickers that allow short-answer and numerical input where needed.

Two main issues were identified with this concept:

  1. Some students would not be able to get through all 15 questions in the time allowed (3.33 minutes per question) as the slides advanced automatically.
  2. This methodology offers only a right answer (2 marks) or a wrong answer (0 marks), whereas the old system allowed students to earn a single mark for demonstrating the correct methodology in their workings even when the final answer was wrong.

The self-paced polling assessment was adapted to take account of these factors. Two differently ordered versions of the paper were produced and distributed in an alternating pattern, in order to make copying difficult. The students were given 50 minutes to complete the test on paper, showing their calculations. The TurningPoint receivers were already on at this point, so students could, if they wanted, input their answers to the system as they went along. At the end of the 50 minutes they were told to stop writing and were given a further 10 minutes to input any answers they had not already entered into TurningPoint using the clickers.

Whilst using a paper copy in tandem with TurningPoint would seem to negate some of the advantages sought from the clicker technology, there is still a time gain for the markers: they only have to consult the paper version for questions answered incorrectly, to ascertain whether the student can be awarded a mark for correct methodology. These are easily identified from the individual TurningPoint reports, where correct answers show up green and incorrect answers red.

 

Outcomes for Learners

Use of the paper copy ...

This gave us positive outcomes. Students liked the idea of being able to gain marks for methodology even where their calculations had gone awry, and for those nervous about using the technology, having the paper as a back-up was reassuring. Whilst some had reservations, most students found the clickers easy enough to use, although some took time over their data input.

 

Tutor observations and reflections ...

If the PowerPoint had been used with normal timed polling, the time pressure would have particularly affected students with language issues. Using self-paced polling, with the added time at the end for data input, helped with this.

Marking paper copies to check for partially correct methodology reduces the advantage gained from automating the system. Lecturers adopting TurningPoint in other exercises or disciplines would have to decide whether this aspect is necessary for their purposes.

To minimise variation in the answers input, only questions with numerical answers were asked, which meant fewer theory questions (these would have required word-based answers). As theory was the part of the test that students had traditionally found more difficult, students obtained higher marks than in previous years, which needs to be considered for future assessments.

 

What level of support was required ?

Due to the type and size of the rooms available, invigilators were required to ensure students were not looking at neighbours' papers (even though the question order was randomised). Also, some students needed considerable help using the clickers, so more invigilators with clicker knowledge would have been useful.

 

Would I recommend it ?

Yes, particularly for larger classes and where students may work at considerably different paces.

 

Main strengths:

  1. The TurningPoint system marks the papers, which makes marking quicker. However, time still needs to be taken going through the physical papers where answers are incorrect, to check whether any marks can be awarded for method.
  2. The analysis of the answers is excellent. The frequency of each right and wrong answer can be seen, giving statistics that let lecturers know which mistakes are most common, so those specific areas can be covered in future lectures.
  3. When going through papers to give feedback to individual students, the reports that TurningPoint produces are excellent: they record the student's own answer to each question and mark it green for correct and red for wrong. This allows the lecturer to move quickly through the paper with the student, straight to the questions that need to be discussed, rather than hunting for incorrect answers.

 

Main weaknesses:

  1. Students need to be in a larger room so that they can be spaced out more, to avoid the temptation to look at each other's papers, even though time was short and the papers were differently ordered.
  2. Students were not used to inputting anything other than multiple-choice letters or single numbers into the clickers, which made this part of the test slow.

 

Future changes:

  1. Two or three separate versions with completely different questions.
  2. Students would transfer just their answers to a separate back sheet, and the main paper containing their calculations would then be removed. The separate sheet could then be used for inputting into the clickers. This would avoid the temptation to continue working or adjust answers when students should purely have been transferring paper answers into the clickers.
  3. Although a practice test was done, students need to spend more time inputting longer numerical answers into the clickers during the preceding lectures. Some students were exceptionally slow entering their answers, which caused an issue in a room of 200 people who wanted to leave. This should become less of an issue as students become more accustomed to the devices.
  4. More theory questions should be used, perhaps as multiple-choice word-based questions. Again, manually input short answers should become more viable as students gain experience with the clickers.
  5. Students were confused about how to enter certain answers. For one question the answer was a percentage: students were asked to input to one decimal place without the % sign, so the expected input was 40.0, but many entered 0.4, which TurningPoint then marked as incorrect. (Input to one decimal place was requested in case an answer was, say, 36.5% rather than 40.0%.) This could be eliminated by running more questions of this type in practice clicker sessions.
