SALT Case Study: Using Clickers in Anatomy


Dr Sam Webster

Academic: Dr Sam Webster - College of Medicine
Librarians: Dr Sam Oakley, Dr Ian Glen and Susan Glen
SALT Team: Chris Hall

Context

We teach a series of 10 embryology lectures to a group of 72 students in their first year of studying medicine. We also teach anatomy in a small group environment and link embryology to their anatomical and paediatric studies.

 

Many methods can be used to engage students when teaching in a large lecture theatre, but it is difficult to ensure that the whole audience is involved. More gregarious students (particularly in medicine) tend to stand out when questions are thrown to the group or when short tasks are set, while quieter students are often less involved, and it can be awkward to draw them in without picking on them specifically. Embryology is a complex subject because we work both temporally and spatially, jumping from system to system from week to week, and it is difficult to gauge how much information students take from a lecture before their summative exams.

 

To help us with both of these issues, we tried an interactive voting system ('clickers') from Turning Technologies.

Our approach

 

At the start of the series we outlined our plan to the students: a quiz at the end of each embryology lecture. Each quiz would usually include 4 questions based on material from the lecture and 1 question from outside it, testing their existing knowledge of the field. We introduced the TurningPoint handsets, and students took them at random to preserve anonymity and encourage full participation. We asked the students if they would be happy to compete as 2 teams: boys versus girls. The challenge was accepted.

 

We used the Turning Technologies software to produce quick multiple-choice questions easily, to split the students into 2 teams, and to show the score as it developed. After each question we brought up slides from the lecture to show the students why they were right or wrong and to reinforce their understanding. The result of each quiz was to be posted to my blog, where a running total would keep the teams up to date.

The outcomes

 

Students became very animated and noisy when we switched over from lecture to quiz mode. Their postures changed and almost all of them appeared to become very focussed on the task. With each question and answer they became more involved and noisier as they discovered whether they were right or wrong. As the voting was anonymous, no student knew whether their neighbours had given the right or wrong answer unless they discussed it with them.

The tougher questions produced discussion within the teams about which answer was correct, prompting students to recall, think and explain their reasoning to each other.

 

After each question we reviewed slides from the lecture that explained the right answer, enabling me to reinforce core or difficult concepts for the whole group. This step often prompted questions from the group.

 

After 4 questions the scores revealed which team was in the lead, provoking more noise and friendly accusations of cheating. The fifth question rarely changed the overall result, but students appeared to work exceptionally hard on this final question, particularly as it usually required knowledge from outside the lecture.

 

As the lecturer I received immediate feedback from the results charts and could see what proportion of the students in the lecture chose the right or wrong options. I was often surprised by how well they had picked up tricky ideas or remembered detailed descriptions, particularly when many of the students had appeared half-asleep during those sections of the lecture. If a sizeable proportion of the group (never more than 25%) had failed to grasp an area of learning, I was able to cover that material again immediately in a different way. If the competition between the boys' and girls' teams had not been so intense, I could have asked the question again to test their understanding.

Did it work?

 

The use of an interactive voting system has been very helpful in getting all students engaged with a lecture, in motivating them to listen (and, to a small extent, to prepare for a lecture), and in indicating to the lecturer how well the students have understood the topic. It has also helped to highlight and reinforce important or difficult areas of study.

 

The students have enjoyed the series of lectures and the quizzes. Long days of consecutive lectures are common in undergraduate medicine, and breaking up these lectures with exercises like this has been helpful and fun. We intend to continue with this method and hope that the feedback from students will encourage other lecturers in the School of Medicine to use this technology for similar or quite different tasks within large group lectures.

 

At the time of writing, the boys are trailing the girls 3:5 with 2 lectures to play.