Constructing effective online assessment

I have worked alongside my academic colleagues Isabelle Marcoul and Svenja Erich of the Centre for Language Studies at City University London for the last two years to help develop effective online assessment. This project has now been written up for the recently published Learning at City Journal Vol 2 (2). You can download a full copy of our article here for free.

I’m providing a summary of the article here, focusing on the way the technology was used and how we measured the effectiveness of a multiple choice Moodle quiz.

Background

City University London runs a programme of language modules, some for course credit and some extra-curricular. The languages taught are French, Spanish, Mandarin, Arabic and German. Before they can join a class, students need to be assessed and assigned to the language course appropriate to their level of linguistic competence, ranging from beginner to advanced. In 2011 more than 1,000 students took a diagnostic test.

Prior to 2011, the language tests were handed out in printed format and marked by language lecturers. The administrative burden was heavy: very tight marking deadlines, a lot of work to assign students to the correct course, communicate this to them, and so on. It was concluded that an online system would automate much of this, give students immediate feedback about which level and class was appropriate for them, and speed up the administrative process.

Practicalities

Each year the university runs a Language Fair during Freshers' Week. Traditionally this was when students took the written test and completed a questionnaire (to gather basic information, e.g. degree course). In September 2011 the assessment was done via a multiple choice quiz in Moodle, and the questionnaire was also moved online, to a Google Form. This meant that:

  • a computer room was needed for the language fair
  • an audio/visual component was deemed too difficult to manage, as a large number of headphones would have been required, so listening was not part of the test

Design of the test

The languages team wanted to assess different types of language ability while being restricted to a multiple choice online system. Each language had a test comprising 100 questions. Please see the article for a full description of the question types chosen and what was assessed.

As a learning technologist I was very interested in how the languages department wrote their multiple choice questions to assess different types of language ability. For example, students were asked to read a generic text in the source language and answer comprehension questions to see how much they had understood. Some questions also required students to understand not only the words but also the cultural context and concept in order to get the answer right.

For example:

What would you like as a main course?

  • A sorbet with strawberries
  • Six oysters
  • Steak and kidney pie with chips

To answer this question, students needed to understand both the question and the options, and then draw on their own knowledge to pick the correct answer.
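For anyone curious how such a question translates into Moodle, here is a sketch in GIFT, the plain-text question format the Moodle quiz tool can import. This is illustrative only: the keyed answer below is my assumption, as the article does not state it.

```
// Hypothetical GIFT version of the question above.
// '=' marks the key; '~' marks the distracters.
::Main course:: What would you like as a main course? {
    =Steak and kidney pie with chips
    ~A sorbet with strawberries
    ~Six oysters
}
```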

In the article Isabelle writes about how we construct language and how higher order thinking skills can be assessed using online assessment methods, so please do access the article if you are interested in this.

Use of Moodle and Google Forms

City University London uses Moodle as its virtual learning environment. This was seen as the perfect platform for the language testing. I met with the lecturers who would be preparing the questions for the test and explained how the Moodle quiz tool worked. This helped them understand the types of question that would and would not be appropriate.

Once the questions had been written, we ran a two-hour hands-on session in which staff were trained to use the Moodle quiz tool and then, with my support, used it to add their questions. I would recommend this approach: it meant I could troubleshoot any problems immediately, and the staff involved have been using the Moodle quiz successfully ever since.

We also needed to collect some personal data from the students, e.g. name and degree course. We used a Google Form for this, as they are very easy to set up and the data can be exported in Excel format, which the administrator requested.
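As an illustration of how the two exports might be combined afterwards, here is a minimal Python sketch. This is not the team's actual process; the file names and join column are hypothetical.

```python
import pandas as pd

# Google Forms responses and Moodle quiz grades can both be exported as CSV.
responses = pd.read_csv("language_fair_questionnaire.csv")  # name, email, degree course...
grades = pd.read_csv("moodle_quiz_grades.csv")              # email, score out of 100

# Join the two exports on the student's email address.
merged = responses.merge(grades, on="Email address", how="inner")

# Produce a single Excel file for the administrator (requires openpyxl).
merged.to_excel("language_fair_results.xlsx", index=False)
```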

Effectiveness of the language diagnostic multiple choice test

The effectiveness of the test was measured by the number of students who stayed in the group/level they were assigned during testing, i.e. whether the language level of the course matched the level at which the student tested. We were very pleased to see that the test proved very accurate in determining level for French, German and Spanish (small numbers of students took Mandarin and Arabic, so the data for those was not conclusive).

This shows that an online test can effectively measure language ability in the majority of cases with very little movement of students between levels.
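To make the measure concrete, here is a toy calculation of that effectiveness figure on made-up data; the real analysis is described in the article.

```python
# Each record: (language, level assigned by the test, level the student ended up in).
placements = [
    ("French",  "B1", "B1"),
    ("French",  "A2", "A2"),
    ("German",  "B2", "B1"),  # this student moved down a level
    ("Spanish", "A1", "A1"),
]

stayed = sum(1 for _, assigned, final in placements if assigned == final)
print(f"{stayed / len(placements):.0%} of students stayed in their assigned level")  # 75%
```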

You can download a copy of the full article here.

A case study: Moodle quizzes

Moodle's quiz function allows you to create a range of questions, including multiple choice, matching, true/false, short answer and essay formats. Quizzes can be an effective tool for formative or summative assessment, as questions are saved in a question bank that allows you to reuse them, and feedback can be given for each answer.
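To give a flavour of these question types, here is a hedged sketch of how several of them can be written in GIFT, Moodle's plain-text question import format:

```
// Multiple choice
What is the capital of France? {=Paris ~Lyon ~Marseille}

// True/false
Moodle is open-source software. {T}

// Short answer (any listed answer is accepted)
Who wrote "Hamlet"? {=Shakespeare =William Shakespeare}

// Matching
Match the country to its capital. {
    =Italy -> Rome
    =Spain -> Madrid
    =Germany -> Berlin
}

// Essay (marked by hand)
Discuss the advantages of online assessment. {}
```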

Dr Victoria Serra-Sastre from City University’s Department of Economics has been using Moodle quizzes with her Masters students and second year undergraduates. In this interview she shares her experience of using this tool.

How often do you set quizzes for students?

I have set quizzes once or twice per term, depending on the module taught. Quizzes on Moodle have been used in addition to other assessment tools like mid-term tests and take-home exercises. I have been using quizzes as the first assessment exercise that students had to face.

How have you done this in the past?

We piloted the use of Moodle for a UG course in the second term of the academic year 2009/2010. We received the support of the Educational Support Team, who helped us through the process of setting up the first quiz. Once the first quiz is designed and uploaded to the system, it proves a very easy tool to use.

Why did you decide to use the Moodle quiz tool?

When I first learned about the online quizzes there were practical reasons to try them. Although uploading the questions is time-consuming, once the quiz is set up Moodle marks it automatically, so the time saved on marking outweighs the time taken to put the quiz online.

What feedback have you had from students?

The feedback from students has been positive. They have access to the questions and answers from the quizzes, which helps them prepare for the mid-term test and final exams. It is also a very helpful tool, as students are required to keep up to date with the material covered in the lectures.

Will you continue to use Moodle quizzes in the future?

Yes, I will use them as part of assessed coursework. They are extremely useful and simple tools for assessing students' progress. Since I first used a quiz, the functionality has been improved and adapted to the needs of students to facilitate their learning.

What advice would you give to someone who was considering setting a quiz in Moodle?

At first you may need some time to learn how to use quizzes, but once learned they are very helpful for students and lecturers alike. If there are questions or problems in setting up a quiz, lecturers always have the support of the Educational Support Team. Moodle quizzes are also quite flexible, so they can accommodate the needs of different module types.

Top Ten Tips for developing MCQs


Bull and McKenna (1999) describe a Multiple Choice Question (MCQ) as a question where the answer is chosen from a list of options. Moodle and Clickers provide opportunities to develop, deliver, mark and give feedback on formative exercises for the consolidation of knowledge, as well as on summative assessments.

We have produced these top ten tips to help you create more effective and challenging Multiple Choice Questions.

MCQ Terms

Before we start, here is a guide to the terminology used in developing an MCQ.

Stem: the text of the question
Key: the right answer
Distracter: an incorrect answer
Options: the list of answers, which includes the key and the distracters
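Putting the four terms together on a made-up question, again using Moodle's GIFT format:

```
// Stem: "Which gas makes up most of the Earth's atmosphere?"
// Key (=): Nitrogen
// Distracters (~): Oxygen, Carbon dioxide, Argon
// Options: all four answers below
Which gas makes up most of the Earth's atmosphere? {
    =Nitrogen
    ~Oxygen
    ~Carbon dioxide
    ~Argon
}
```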

Top Ten Tips

  1. The text of each question (stem) should be presented as a clear statement or question that does not give any clue to the answer (e.g. do not use 'an' at the end of the stem if only one of your options begins with a vowel) (Bull and McKenna, 1999)
  2. The stem should be presented in a positive form. Use negatives sparingly, and if you must use them, ensure they are highlighted (bold and CAPITALISED) (Bull and McKenna, 1999; UKCLE, 2010)
  3. The incorrect answers (distracters) must be plausible; implausible distracters can ruin a good question. Higgins and Tatham (2003) use the following example to highlight this point:

Which US state was the third state to accede to the Union in 1787?

  • New Cardy
  • New Woolly
  • New Jersey
  • New Jumper
  4. Avoid the choices "All of the above" and "None of the above" in your options. If you do need to use them, make sure they appear as right answers some of the time (Bull and McKenna, 1999). Be extra careful with these options if you are randomising answer order in Moodle, as they may appear at the top of the list and confuse students (see the sketch after this list).
  5. Effective distracters are options that address common misconceptions or statements which are only partially correct. Don't confuse students who know the right answer by creating a distracter that is too close to the correct answer (CAA Centre, 2002).
  6. Extend the MCQ to test application of knowledge by creating a scenario which is new to the students and develops over a series of questions. A great example is provided by UKCLE (2010).
  7. Extend the MCQ to test students' analysis and application of knowledge through interpretive exercises which begin with a picture, a passage of text or a series of figures, followed by a series of questions that test students' analysis of the data provided.
  8. Extend the MCQ by designing an assertion reason question. This is a "question [which] consists of two statements, an assertion and a reason. The student must first determine whether each statement is true. If both are true, the student must next determine whether the reason correctly explains the assertion. There is one option for each possible outcome." (CAA Centre, 2002) Assertion reason questions are commonly used in PRINCE2 project management qualifications and you can view examples of these on PPC's PRINCE2 training website.
  9. Use Clickers to increase interaction in class by posing MCQs. Have a look at the YouTube video from Professor Eric Mazur of Harvard University on how he uses Clickers to facilitate peer instruction and promote understanding of key concepts.
  10. Online MCQs can help you provide effective feedback to your students quickly. You can use your feedback as an opportunity to provide links to additional resources that correct student understanding (UKCLE, 2010).
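As promised in tip 4, here is a small Python sketch of the idea behind pinning "All/None of the above" when randomising: shuffle the ordinary options but keep those two at the bottom. Moodle handles option shuffling itself; this standalone snippet just illustrates the principle.

```python
import random

def shuffle_options(options, pinned=("All of the above", "None of the above")):
    """Shuffle answer options, keeping any 'pinned' options at the end."""
    floating = [o for o in options if o not in pinned]
    anchored = [o for o in options if o in pinned]
    random.shuffle(floating)
    return floating + anchored

print(shuffle_options(["Paris", "Lyon", "Marseille", "None of the above"]))
```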

References:

Bull, J. and McKenna, C. (1999) Designing effective objective test questions: an introductory workshop [online] Available from: http://caacentre.lboro.ac.uk/dldocs/otghdout.pdf (Accessed: 17.3.11)

CAA Centre (2002) CAA Centre website [online] Available from: http://www.caacentre.ac.uk/index.shtml (Accessed: 19.4.11)

Higgins, E. and Tatham, L. (2003) Exploring the potential of Multiple-Choice Questions in Assessment [online] Available from: http://www.celt.mmu.ac.uk/ltia/issue4/higginstatham.shtml (Accessed: 17.3.11)

PPC (2010) PRINCE2 Assertion-Reasoning Questions [online] Available from: http://www.prince2training.net/component/option,com_madblanks/Itemid,516/mbcsr197configid,3/mid,197/task,showmbmod/ (Accessed: 29.3.11)

UKCLE (2010) How can I write effective MCQs? [online] Available from: http://www.ukcle.ac.uk/resources/assessment-and-feedback/mcqs/ten/ (Accessed: 17.3.11)