Language diagnostic testing moved to Moodle

Background
The Centre for Language Studies here at City University London runs a large-scale language diagnostics programme every September to assess the levels of students wishing to study a language. The languages we offer are Arabic, French, German, Mandarin and Spanish. Students can take a language as an optional module or, in some cases, as part of their course. Previously, students took the test on paper; the papers were marked and passed on to administration, who assigned each student their level and sent out their timetable. This was a time-consuming process for academics and administrators alike and caused delays in assigning students to the correct language level.

Isabelle Marcoul, Head of the Centre for Language Studies at City University London, contacted me to see how this process could be enhanced through the use of Moodle. After numerous meetings and discussions with those involved, it was agreed that the current tests would be put online as a Moodle quiz. This meant that students could get their result immediately and know which level they were at and which class they could attend. Students also completed an online form (set up using Google Forms) to collect the information needed by administration.
This process was implemented for the first time in September 2011. Below are the thoughts of the Exams and Assessment Coordinator, the School Registrar and me, from Educational Technology, on how the project went and the improvements we will make for next year.

Svenja Erich, Exams and Assessment Coordinator for the Centre for Language Studies, City University

This year you used Moodle’s quiz function to run your language diagnostic tests. Why did you decide to do this?
We decided to use it because the previous system seemed outdated and inefficient, and required a lot of resources and storage space. It was also hard work in terms of marking, administration and organisation.

How did you administer the tests in previous years?

We had an A3 (folded into A4, so there were four pages) paper-based test with multiple choice questions (two pages) and a writing section (one page). Students' details (name, email address, studies etc.) had to be filled in on the front page of the test.
Tests were given out to students in Freshers' Week – at the annual Languages Fair for the Languages for General Purposes (LGP) students and at orientation meetings at CASS and SoA/SoSS for the Language for Specific Purposes (LSP) students. The multiple choice section could be marked using transparencies, whereas the writing section had to be marked by a lecturer of the target language.
The marked tests were passed on to the administration and the students’ details were entered manually onto the registers. Once this had been done, students were sent an email with the course information.

What are the advantages of running the tests in Moodle?

Example of questions from the French diagnostic test

We don't have to mark the test anymore; the mark is calculated automatically. This saves a lot of time. Staff at the Languages Fair can concentrate on giving advice about the courses rather than marking piles of tests. The result of the test comes up immediately once it has been completed. If students are briefed properly, they know straight after taking the test whether they have been admitted to a course and which class they should go to.

Screenshot of the Google Form students complete when taking the diagnostic test

From an administrative point of view, running the test on Moodle is a lot more effective and less time-consuming. Students enter their details onto Moodle, which can be easily exported, so there is no need to enter data manually. Also, the data are accessible to anyone in the languages team, and test results and students' details can be looked up easily without having to go through the administrator.

In principle, students can take the assessment test at home or anywhere at the university, without having to pick up a copy from, or drop it off at, the CfLS. Students who were not able to attend the Languages Fair are now sent the details on Moodle along with an advice sheet. This makes it a lot easier for latecomers to take the test and enrol on a course, which has had a positive impact on enrolment numbers.

Overall, the new system is a lot more reliable and effective than what we had before.

Are there any disadvantages? How did you solve these?

The only real downside is that we need rooms equipped with computers at the Languages Fair, and we depend on IT Services doing their job properly. For peak times at the Languages Fair, we had booked several computer labs across the university. We had to make sure that one support person was present in each of these rooms and that plenty of advice sheets on how to access the test were available. This was important because students are not necessarily as confident with the technology as one assumes.

Have you had any feedback from staff or students?

Our staff were extremely happy with the change. Apart from all the positive factors mentioned above, there is also the feeling that we have arrived in the 21st century with our assessment methods; the paper-based copy had become a little embarrassing. Students appreciated the flexibility of the test, as you can start completing it and resume later. The only problem we had had nothing to do with the technical side; it concerned the content of the French test and can easily be corrected.

Will you use the Moodle quiz again next year? If so, what changes would you make?

Yes, we certainly will use it again. It would actually be a nightmare to go back to the paper-based copy. The only change we will make is to ask LGP students to self-enrol (through Moodle) on the course. This will simplify the administrative process even further.

I am extremely grateful to Anna Campbell, who made the project possible. Anna understood all our needs and made sure that they were met in the best possible fashion. Moreover, she had extremely useful ideas and suggestions we had not thought of before. She made the technology accessible to us and offered excellent training to the lecturers designing the tests. All those who were less confident with the technology received individual support from her by email or over the phone after the training. It was very easy to communicate with her. She also offered hands-on support at the Languages Fair, which was very reassuring for me on the day.

Simon Barton, School Registrar, School of Arts

How did the change in process impact on administration?
Moodle revolutionised the way we administered the language placement tests. It turned what was a 3-4 day job (over the weekend!) into a single day's work (not on a weekend!). No more data entry or marking tests by hand: Moodle sorted all that out for us and left me the much more manageable task of filtering the spreadsheets and emailing students their group allocations.
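As an illustration of that filtering step, here is a minimal Python sketch that assigns each student a class from their quiz score. The file name, column headings and cut-off marks are all invented for the example; the real values would come from the Moodle export and the language levels on offer.

    import csv

    # Hypothetical cut-off marks; the real boundaries depend on the levels offered.
    LEVELS = [(80, "Advanced"), (55, "Intermediate"), (30, "Post-beginner"), (0, "Beginner")]

    def level_for(score):
        """Return the first level whose cut-off mark the score meets."""
        for cutoff, level in LEVELS:
            if score >= cutoff:
                return level

    # "grades.csv" stands in for the spreadsheet exported from Moodle's gradebook;
    # the column headings below are assumptions, not Moodle's exact export format.
    with open("grades.csv", newline="") as f:
        for row in csv.DictReader(f):
            print(row["Email address"], level_for(float(row["Grade"])))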

What will you do differently next year?
Next year, I would want to move the parts that currently sit in Google Docs into Excel spreadsheets downloaded from Moodle, as not all of the team working on the data had access to this information or necessarily felt comfortable with Google Docs. And where we currently have two documents collecting the information, I would aim to make it a single downloadable spreadsheet. In addition to these small changes, I would also introduce passwords on the free LGP courses so that tutors can give these out in class and students are able to register themselves.

Anna Campbell, Educational Technologist for the Schools of Arts and Social Sciences

How do you feel the project went?
I'm really pleased with the way the project went this September. It was great to see a tangible benefit to using Moodle in this way for all involved. I am also pleased that the staff teaching the various languages really got to grips with using the Moodle quiz tool and so started to see the further benefits of using Moodle.

Any technical Moodle issues?
We had to set up the Moodle module containing the tests as self-enrol. We don't tend to do that at City; we normally enrol students onto modules via SITS, but in this case students and staff from across the whole university could take the tests, so that was not feasible.

What improvements will you make?
I think that Simon and Svenja have covered them. There is the issue of matching the Moodle quiz result in the gradebook with the personal information on the form. I haven't figured out an easy way to match those two up (apart from getting students to enter the grade they got in their test onto the Google Form, which is not foolproof). I'm still thinking about that!
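One possible approach, sketched below, is to export both the gradebook and the Google Form responses as CSV files and join them on the student's email address, which appears in both. This is a sketch in Python with the pandas library; the file and column names are assumptions for illustration, and it relies on students entering the same email address on the form that they use in Moodle.

    import pandas as pd

    # Hypothetical file and column names: both exports need a shared
    # email-address column for the join to work.
    grades = pd.read_csv("moodle_gradebook.csv")   # quiz results exported from Moodle
    details = pd.read_csv("form_responses.csv")    # personal details from the Google Form

    # Normalise the addresses so case or stray whitespace doesn't prevent a match.
    grades["Email address"] = grades["Email address"].str.strip().str.lower()
    details["Email"] = details["Email"].str.strip().str.lower()

    # Join the two exports into the single spreadsheet administration needs.
    merged = grades.merge(details, left_on="Email address", right_on="Email", how="left")
    merged.to_csv("combined.csv", index=False)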

A case study: Moodle quizzes

Multiple choice question in Moodle quiz

Moodle's quiz function allows you to create a range of question types including multiple choice, matching, true/false, short answer and essay formats. Quizzes can be an effective tool for formative or summative assessment: questions are saved in a question bank so you can reuse them, and feedback can be given for each answer.

Dr Victoria Serra-Sastre from City University’s Department of Economics has been using Moodle quizzes with her Masters students and second year undergraduates. In this interview she shares her experience of using this tool.

How often do you set quizzes for students?

I have set quizzes once or twice per term, depending on the module taught. Quizzes on Moodle have been used in addition to other assessment tools like mid-term tests and take-home exercises. I have been using quizzes as the first assessment exercise that students had to face.

How have you done this in the past?

We piloted the use of Moodle for an undergraduate course in the second term of the academic year 2009/2010. We received the support of the Educational Support Team, who helped us through the process of setting up the first quiz. Once the first quiz has been designed and uploaded to the system, it proves a very easy tool to use.

Why did you decide to use the Moodle quiz tool?

When I first learned about the online quizzes, there were practical reasons to try them. Although uploading the questions is time-consuming, once the quiz is set up Moodle will mark it automatically, and the time saved on marking therefore outweighs the time taken to put the quiz online.

What feedback have you had from students?

The feedback from students has been positive. They have access to the questions and answers from the quizzes, which helps them prepare for the mid-term test and the final exams. It is also a very helpful tool, as students are required to keep up to date with the material covered in the lectures.

Will you continue to use Moodle quizzes in the future?

Yes, I will use them as part of assessed coursework. They are extremely useful and simple-to-use tools for assessing students' progress. Since I first used a quiz, the quiz functionality has been improved and adapted to the needs of students to facilitate their learning.

What advice would you give to someone who was considering setting a quiz in Moodle?

At first you may need some time to learn how to use quizzes, but once learned they are very helpful for students and lecturers alike. If you have questions or problems setting up a quiz, lecturers always have the support of the Educational Support Team. Also, Moodle quizzes are flexible enough to accommodate the needs of different module types.

Top Ten Tips for developing MCQs


Bull and McKenna (1999) describe a Multiple Choice Question (MCQ) as a question where the answer is chosen from a list of options. Moodle and Clickers provide opportunities to develop, deliver, mark and give feedback on formative exercises for consolidating knowledge, and on summative assessments.

We have produced these top ten tips to help you create more effective and challenging Multiple Choice Questions.

MCQ Terms

Before we start, here is a guide to the terminology used in developing an MCQ.

Stem – the text of the question
Key – the right answer
Distracters – the incorrect answers
Options – the full list of answers, which includes the key and the distracters

Top Ten Tips

  1. The text of each question (the stem) should be presented as a clear statement or question that does not give any clue to the answer (e.g. do not use ‘an’ at the end of the stem if only one of your options begins with a vowel). (Bull and McKenna, 1999)
  2. The stem should be presented in a positive form. Use negatives sparingly, and if you do need to use them, make sure they are highlighted (bold and CAPITALISED). (Bull and McKenna, 1999; UKCLE, 2010)
  3. The incorrect answers (distracters) must be plausible. Implausible distracters can ruin a good question. Higgins and Tatham (2003) use the following example to highlight this point.

Which US state was the third state to accede to the Union in 1787?

  • New Cardy
  • New Woolly
  • New Jersey
  • New Jumper
  4. Avoid the choices “All of the above” and “None of the above” in your options. If you need to use them, make sure that they appear as right answers some of the time (Bull and McKenna, 1999). Be extra careful with these options if you are randomising answer order in Moodle, as they may appear at the top of the list and confuse students.
  5. Effective distracters are options that address common misconceptions, or statements which are only partially correct. Don’t confuse students who know the right answer by creating a distracter that is too close to the correct answer. (CAA Centre, 2002)
  6. Extend the MCQ to test application of knowledge by creating a scenario which is new to the students and develops over a series of questions. A great example is provided by UKCLE (2010).
  7. Extend the MCQ to test students’ analysis and application of knowledge through interpretive exercises, which begin with a picture, a passage of text or a series of figures, followed by a series of questions that test students’ analysis of the data provided.
  8. Extend the MCQ by designing an assertion-reason question. This is a “question [which] consists of two statements, an assertion and a reason. The student must first determine whether each statement is true. If both are true, the student must next determine whether the reason correctly explains the assertion. There is one option for each possible outcome.” (CAA Centre, 2002) Assertion-reason questions are commonly used in PRINCE2 project management qualifications, and you can view examples of these on PPC’s PRINCE2 training website.
  9. Use Clickers to increase interaction in class by posing MCQs. Have a look at the YouTube video from Professor Eric Mazur, Harvard University, on how he uses Clickers to facilitate peer instruction and promote understanding of key concepts.
  10. Online MCQs can help you provide effective feedback to your students quickly. You can use your feedback as an opportunity to provide links to additional resources to correct student misconceptions. (UKCLE, 2010)

References:

Bull, J. and McKenna, C. (1999) Designing effective objective test questions: an introductory workshop [online] Available from: http://caacentre.lboro.ac.uk/dldocs/otghdout.pdf (Accessed: 17.3.11)

CAA Centre (2002) CAA Centre Website. [online] Available from: http://www.caacentre.ac.uk/index.shtml (Accessed: 19.4.11)

Higgins, E. and Tatham, L. (2003) Exploring the potential of Multiple-Choice Questions in Assessment [online] Available from: http://www.celt.mmu.ac.uk/ltia/issue4/higginstatham.shtml (Accessed: 17.3.11)

PPC (2010) PRINCE2 Assertion-Reasoning Questions. [online] Available from: http://www.prince2training.net/component/option,com_madblanks/Itemid,516/mbcsr197configid,3/mid,197/task,showmbmod/ (Accessed: 29.3.11)

UKCLE (2010) How can I write effective MCQs? [online] Available from: http://www.ukcle.ac.uk/resources/assessment-and-feedback/mcqs/ten/ (Accessed: 17.3.11)