Constructing effective online assessment

I have worked alongside my academic colleagues Isabelle Marcoul and Svenja Erich of the Centre for Language Studies at City University London for the last two years to help develop effective online assessment. This project has now been written up for the recently published Learning at City Journal Vol 2 (2). You can download a full copy of our article here for free.

I’m providing a summary of the article here, focusing on the way the technology was used and how we measured the effectiveness of the multiple-choice Moodle quiz.

Background

City University London runs a programme of language modules, some taken for course credit and some as extracurricular options. The languages taught are French, Spanish, Mandarin, Arabic and German. Before they can join a class, students need to be assessed and assigned to the language course appropriate to their level of linguistic competence, ranging from beginner to advanced. In 2011 more than 1000 students took a diagnostic test.

Prior to 2011, the language tests were handed out in a printed format and marked by language lecturers. The administrative burden was heavy: marking deadlines were very tight, and a lot of work was needed to assign students to the correct course, communicate this to them and so on. It was concluded that an online system would help automate this, ensure students received immediate feedback about which level and class was appropriate to them, and speed up the administrative process.

Practicalities

Each year the university runs a Language Fair during Freshers’ Week. Traditionally this was when students took the written test and completed a questionnaire (to gather basic information, e.g. degree course). In September 2011 the assessment was done via a multiple-choice quiz on Moodle, and the questionnaire was also moved online as a googleform. This meant that:

  • a computer room was needed for the Language Fair
  • an audio/visual component was deemed too difficult to manage, as a large number of headphones would have been required, so listening was not part of the test

Design of the test

The languages team wanted to assess different types of language ability while being restricted to a multiple-choice online system. Each language had a test comprising 100 questions. Please see the article for a full description of the choice of question types and what was assessed.

As a learning technologist I was very interested in how the languages department wrote their multiple-choice questions in order to assess different types of language ability. For example, students were asked to read a generic text in the target language and answer comprehension questions to see how much they had understood. Some of the questions also required students to understand not only the words but also the cultural context and concepts in order to get the answer right.

e.g.

What would you like as a main course?
  • A sorbet with strawberries
  • Six oysters
  • Steak and kidney pie with chips

To answer this question students needed to understand both the question and the options, and then draw on their own knowledge to pick the correct answer.
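For illustration, a question like this can also be expressed in Moodle’s GIFT import format. We added our questions through the quiz interface rather than by importing files, and the answer key below is only assumed for the sake of the example:

```
// Hypothetical GIFT-format version of the example question.
// "=" marks the option assumed correct here; "~" marks the distractors.
::Main course:: What would you like as a main course? {
    ~A sorbet with strawberries
    ~Six oysters
    =Steak and kidney pie with chips
}
```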

In the article Isabelle writes about how we construct language and how higher-order thinking skills can be assessed using online assessment methods, so please do access the article if you are interested in this.

Use of Moodle and googleforms

City University London uses Moodle as its virtual learning environment, and this was seen as the perfect platform for the language testing. I met with the lecturers who would be preparing the questions for the test and explained how the Moodle quiz tool worked. This was to help them understand the types of question that would and would not be appropriate.

Once the questions had been written we held a two-hour hands-on training session in which staff learned to use the Moodle quiz tool and then added their own questions with my support. I would recommend this approach: it meant that I could troubleshoot any problems immediately, and the staff involved have been successfully using the Moodle quiz tool ever since.

We also needed to collect some personal data from the students, e.g. name and degree course. We used a googleform for this as it is very easy to set up and the data can be exported in Excel format, which the administrator had requested.

Effectiveness of the language diagnostic multiple choice test

Effectiveness of the test was measured by the proportion of students who stayed in the group/level they were assigned to during testing, i.e. the language level of the course they attended matched the level at which they tested. We were very pleased to see that the test proved very accurate in determining level for French, German and Spanish (only small numbers of students took Mandarin and Arabic, so that data was not conclusive).

This shows that an online test can effectively measure language ability in the majority of cases with very little movement of students between levels.
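As an illustration of how that measure can be calculated, here is a minimal sketch (the file and column names are hypothetical, not our actual data) that computes, per language, the proportion of students who stayed in the level the test assigned them:

```python
import pandas as pd

# Hypothetical export: one row per student, with the level the test assigned
# and the level of the class they actually stayed in.
df = pd.read_csv("placement_outcomes.csv")  # columns: language, assigned_level, final_level

# A student counts as a match if they stayed in the level the test identified.
df["stayed"] = df["assigned_level"] == df["final_level"]

# Proportion of matches per language, i.e. the effectiveness measure described above.
effectiveness = df.groupby("language")["stayed"].mean()
print(effectiveness.round(2))
```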

You can download a copy of the full article here.

Language diagnostic testing moved to Moodle

Background
The Centre for Language Studies here at City University London runs a large-scale language diagnostics programme every September in order to assess the levels of students wishing to study a language. The languages that we offer are Arabic, French, German, Mandarin and Spanish. Students can take a language as an optional module or, in some cases, as part of their course. Previously, students took the test on paper; the papers were marked and passed on to administration, who assigned each student a level and sent out their timetable. This was a time-consuming process for academics and administrators alike and caused delays in assigning students to the correct language level.

Isabelle Marcoul, Head of the Centre for Language Studies at City University London, contacted me to see how this process could be enhanced through the use of Moodle. After numerous meetings and discussions with those involved it was agreed that the existing tests would be put online as a Moodle quiz. This meant that students could get their result immediately and know which level they were at and which class they could attend. Students also completed an online form (set up using googleforms) to collect the information needed by administration.
This process was implemented for the first time in September 2011. Below are the thoughts of the Exams and Assessment Coordinator, the School Registrar and me, from Educational Technology, about how the project went and the improvements we will make for next year.

Svenja Erich, Exams and Assessment Coordinator for the Centre for Language Studies, City University London

This year you used Moodle’s quiz function to run your language diagnostic tests. Why did you decide to do this?
We decided to use it because the previous system seemed outdated and ineffective, and required a lot of resources and storage space. It was also hard work in terms of marking, administration and organisation.

How did you administer the tests in previous years?

We had an A3 paper-based test (folded into A4, so there were four pages) with multiple-choice questions (two pages) and a writing section (one page). Students’ details (name, email address, studies etc.) had to be filled in on the front page of the test.
Tests were given out to students in Freshers’ Week – at the annual Languages Fair for the Languages for General Purposes (LGP) students and at orientation meetings at CASS and SoA/SoSS for the Language for Specific Purposes (LSP) students. The multiple-choice section could be marked using transparencies, whereas the writing section had to be looked at by a lecturer of the target language.
The marked tests were passed on to the administration and the students’ details were entered manually onto the registers. Once this had been done, students were sent an email with the course information.

What are the advantages of running the tests in Moodle?

[Image: example of questions from the French diagnostic test]

We don’t have to mark the test anymore; the mark is calculated automatically. This saves a lot of time. Staff at the Languages Fair can concentrate on giving advice about the courses rather than marking piles of tests. The result of the test comes up immediately once it has been completed. If students are briefed properly, they know straight after taking the test whether they have been admitted to a course and which class they should go to.

[Image: screenshot of the googleform students complete when taking the diagnostic test]

From an administrative point of view, running the test on Moodle is a lot more effective and less time-consuming. Students enter their details online and these can be easily exported, so there is no need to enter data manually. The data are also accessible to anyone in the languages team, and test results and students’ details can be looked up easily without having to go through the administrator.

In principle students can do the assessment test at home or anywhere at the university, without having to pick up a copy from, or drop it off at, the CfLS. Students who were not able to attend the Language Fair are now sent the details on Moodle along with an advice sheet. This makes it a lot easier for latecomers to take the test and enrol on the course, which has had a positive impact on enrolment numbers.

Overall, the new system is a lot more reliable and effective than what we had before.

Are there any disadvantages? How did you solve these?

The only downside really is that we need rooms equipped with computers at the Language Fair and we depend on IT services doing their job properly. For peak times at the Language Fair we had booked several computer labs across the university. We had to make sure that one support person was present in each of these rooms and that plenty of advice sheets on how to access the test were available. This was important because students are not necessarily as confident with the technology as one might assume.

Have you had any feedback from staff or students?

Our staff were extremely happy with the change. Apart from all the positive factors mentioned above, there is also the feeling that we have arrived in the 21st century with our assessment methods; the paper-based copy had become a little embarrassing. Students appreciated the flexibility of the test, as you can start completing it and resume later. The only problem we had had nothing to do with the technical side: it concerned the content of the French test and can easily be corrected.

Will you use the Moodle quiz again next year? If so, what changes would you make?

Yes, we certainly will use it again. It would actually be a nightmare to go back to the paper-based copy. The only change we will make is to ask LGP students to self-enrol (through Moodle) on the course. This will simplify the administrative process even further.

I am extremely grateful to Anna Campbell, who made the project possible. Anna understood all our needs and made sure that they were met in the best possible way. Moreover, she had extremely useful ideas and suggestions we had not thought of before. She made the technology accessible to us and offered excellent training to the lecturers designing the tests. All those who were less confident with the technology received individual support from her by email or over the phone after the training, and it was very easy to communicate with her. She also offered hands-on support at the Language Fair, which was very reassuring for me on the day.

Simon Barton, School Registrar, School of Arts

How did the change in process impact on administration?
Moodle revolutionised the way we administered the language placement tests. It made what was a 3-4 day job (over the weekend!) a single day’s work (not on a weekend!). No more data entry or marking tests by hand: Moodle sorted all that out for us and left me the much more manageable task of filtering the spreadsheets and emailing students with their group allocations.

What will you do differently next year?
Next year, I would want to move the parts that go on Google docs to Excel spreadsheets downloaded from Moodle, as not all of the team working on the data had access to this information or necessarily felt comfortable with Google docs. And where we currently have two documents collecting the information, I would aim to make it a single downloadable spreadsheet. In addition to these small changes, I would also introduce passwords on the free LGP courses so that tutors can give these out in class and students can register themselves.

Anna Campbell, Educational Technologist for the Schools of Arts and Social Sciences

How do you feel the project went?
I’m really pleased with the way the project went this September. It was great to see a tangible benefit to using Moodle in this way for all involved. I am also pleased that the staff teaching the various languages really got to grips with the Moodle quiz tool and so started to see the further benefits of using Moodle.

Any technical Moodle issues?
We had to set up the Moodle module containing the tests to allow self-enrolment. We don’t tend to do that at City; we normally enrol students onto modules via SITS, but in this case students and staff from across the whole university could take the tests, so enrolling them individually was not feasible.

What improvements will you make?
I think that Simon and Svenja have covered them. There is an issue with matching the Moodle quiz result in the gradebook to the personal information collected on the form. I haven’t figured out an easy way to match those two up (apart from getting students to enter the grade they got in their test onto the googleform, which is not foolproof). I’m still thinking about that!
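One possible way round this, sketched below on the assumption that both exports contain the student’s email address (the file and column names are hypothetical), would be to download the gradebook and the googleform responses and join them offline on that shared column:

```python
import pandas as pd

# Hypothetical exports: the Moodle gradebook (quiz grades) and the googleform
# responses (personal details), both assumed to include an email column.
grades = pd.read_csv("moodle_gradebook_export.csv")   # e.g. email, quiz_grade
details = pd.read_csv("googleform_responses.csv")     # e.g. email, name, degree_course

# Normalise the join key so small typing differences don't break the match.
for frame in (grades, details):
    frame["email"] = frame["email"].str.strip().str.lower()

# One combined sheet: each student's personal details alongside their quiz result.
combined = details.merge(grades[["email", "quiz_grade"]], on="email", how="left")
combined.to_excel("diagnostic_results_combined.xlsx", index=False)
```

Any students whose addresses don’t match would still need checking by hand, so this is only a partial answer.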