Adobe Connect for Videoconferencing and Recording Teaching Sessions

I have recently used Adobe Connect in a variety of ways to help lecturers record teaching sessions and give their students opportunities to meet and talk to key figures in the industries they are training for. Connect is a videoconferencing platform which allows people to communicate online by watching and listening to each other via webcam, and by sharing documents or their computer screens. Participants can also use the chat function to send messages and answer questions in polls. No software installation is needed, as everything is done via a web page. At City, our licence for Connect means that any member of staff can log on and set up an online meeting room.

My first use of Connect to support teaching and learning was last summer, before the exam period, when a key revision session for first year undergraduate Sociology students needed to be recorded. A number of students could not attend because they were out of the country. Instead of simply recording the session and giving students access to it via Moodle, we live streamed the class using Connect, and over 20 students joined in from several different countries (around 50 students were present in the “real” class). While the lecturer took questions from the students who attended in person, I hosted the online meeting room and relayed participants’ questions from the chat to the lecturer (although it’s simple enough for one person to host an online meeting and lead the session at the same time). We got some really great feedback from students during this session: those watching from home were impressed that we’d gone to the trouble of letting them join the session live rather than having to watch a recording.


Last term, a number of lecturers, particularly in Journalism, organised press conferences during which their students had an opportunity to talk to and question key figures from industry. While a web-based Voice over IP (VoIP) service like Skype could be used for this, Connect also allows computer screen and document sharing. Sessions can be easily recorded and stored on the Adobe Connect server, so that students can watch them again later, and access to recordings can be controlled quite closely.

We’ve also used Adobe Connect’s screen-sharing function, combined with its recording function, as an alternative to lecture capture in rooms which aren’t currently equipped with recording hardware, but where a need for specialist software means we can’t use our Personal Capture laptop kits. Connect doesn’t do a perfect job of lecture capture: the online meeting room and recording have to be set up each time, and recordings must be retrieved from the system and posted on Moodle manually. Further, the recordings are not perfect quality, and since they are Flash they won’t play back on all devices. However, by setting up a meeting room, connecting a microphone (and/or webcam) and recording the meeting, we can record a teaching session anywhere in the university. For this purpose no-one else joins the meeting room; we simply record the session and share the desktop of the computer, so that whatever the lecturer shows on screen is recorded along with their voice. The recording can either be downloaded as a stand-alone Flash video file or linked to on Moodle.

ECEL Conference and Poster: A learner-centred induction to Moodle


The “teaching pod” in the University of Groningen’s main hall

Back in October 2012 I attended the 11th European Conference on E-Learning (ECEL) held in Groningen, the Netherlands. This was the first time I’d attended this conference. I presented a poster on our work with the Department of Psychology on their induction programme for the BSc course in September.

The conference was interesting, mostly because it gave some insight into how City’s educational technology perspective is much more teaching-and-learning focused than many other universities’. Many of the presentations I attended were very technology-orientated, and I heard people commenting that they would have preferred them to be even more so!

I attended some useful sessions: methods for sending students notifications on Moodle updates via Facebook and Twitter; the changing role of the academic in the Web 2.0 world; and an interesting case study on use of blogging for portfolio development.

My poster summarised the work I had done with the BSc Psychology Director of Undergraduate Studies, Marie Poirier, to redesign elements of the induction programme for undergraduate Psychology students. There was agreement between the department and our team that induction would be improved by being more learner-centred and less of an “information overload”, and by giving students more opportunities to get to know staff and each other. Accordingly, we redesigned many of the activities the students take part in during the week. My involvement was mainly in the first day “orientation activities” and in the Moodle induction on the fourth and final day.

Throughout the whole induction week the main principles for the activities were:

  1. To reduce information overload
  2. To manage students’ expectations and help them understand what is expected of them
  3. To start building a sense of cohort community
  4. To build a sense of subject-specific identity

The students’ first task on their first day at university in September was to get into their tutorial groups and meet the other students who had been allocated the same personal tutor. After a welcome and a brief introductory talk from Marie they were divided into small groups of four or five. Each group was loaned an iPad which they used to go off and make short videos about each other and about their personal tutor. Many of the students were quite excited and impressed to be given these devices to work with on their first day. However, the main reason we used the iPads was that they allow quick and easy shooting, editing and uploading of video (via iMovie and pre-created private Vimeo accounts), and that students can also use them to research their personal tutor. We had run a similar activity the previous year with Flip cameras and laptops: the iPads made the whole process quicker and easier.

The approach we took with the Moodle induction was to redesign it as a task-based fact-finding activity requiring students to work in the same groups as on the first day. The groups were given access to a tailor-made induction module containing activities such as quizzes, choices, questionnaires and practice assignment submission points. (They could also watch the videos they had shot on Monday.) The idea behind this approach was to have students simultaneously learn about and use Moodle: to learn its tools and functionality by using them to find things out. We also wanted to address many of the concerns and questions students have about their new course by giving them the chance to find the answers to some common questions (How do I find my timetable? How do I submit my assignments? How do I connect to the wifi?). Finally, we wanted the induction activity to be clearly and explicitly tailored for Psychology students: we included links to commonly used Psychology resources and contact details for key members of staff in the department. Without too much work, similar approaches could be taken for other departments in the School.

For details of how the project was evaluated, click on the link below to view a copy of the poster. If, as a member of staff in the School of Arts and Social Sciences, you’d like to try something similar for your induction, please get in touch.

Poster Final

Constructing effective online assessment

I have worked alongside my academic colleagues Isabelle Marcoul and Svenja Erich of the Centre for Language Studies at City University London for the last two years to help develop effective online assessment. This project has now been written up for the recently published Learning at City Journal Vol 2 (2). You can download a full copy of our article here for free.

I’m providing a summary of the article here, focusing on the way the technology was used and how we measured the effectiveness of a multiple choice Moodle quiz.


City University London runs a programme of language modules, some for course credit and some extra-curricular. The languages taught are French, Spanish, Mandarin, Arabic and German. Before they can join a class, students need to be assessed and assigned to the language course appropriate to their level of linguistic competence, ranging from beginner to advanced. In 2011 more than 1,000 students took a diagnostic test.

Prior to 2011, the language tests were handed out in printed format and marked by language lecturers. The administrative burden was heavy: very tight marking deadlines, and a lot of work to assign students to the correct course, communicate this to them, and so on. It was concluded that an online system would automate much of this, give students immediate feedback about which level and class was appropriate for them, and speed up the administrative process.


Each year the university runs a Language Fair during Freshers’ Week. Traditionally this was when students took the written test and completed a questionnaire (to gather basic information, e.g. degree course). In September 2011 the assessment was done via a multiple choice quiz on Moodle, and the questionnaire was also moved online as a Google Form. This meant that:

  • a computer room was needed for the Language Fair
  • an audio/visual component was deemed too difficult to manage, as a large number of headphones would have been required, so listening was not part of the test

Design of the test

The languages team wanted to assess different types of language ability while being restricted to a multiple choice online system. Each language had a test comprising 100 questions. Please see the article for a full description of the choice of question types and what was assessed.

As a learning technologist I was very interested in how the languages department wrote their multiple choice questions to assess different types of language ability. For example, students were asked to read a generic text in the source language and answer comprehension questions to see how much they had understood. Some questions required students to understand not only the words but also the cultural context and concepts in order to get the answer right.


For example:

What would you like as a main course?
  • A sorbet with strawberries
  • Six oysters
  • Steak and kidney pie with chips

To answer this, students needed to understand both the question and the options, and to draw on their own knowledge to pick the correct answer.
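As an aside, multiple choice questions like the one above can be written offline in plain text and imported into the Moodle question bank in bulk using Moodle’s GIFT format. A sketch of how the menu question might look in GIFT (the question title, and my assumption about which option is the intended answer, are illustrative only):

```text
// GIFT format: "=" marks the correct answer, "~" marks distractors.
::Main course:: What would you like as a main course? {
    ~A sorbet with strawberries
    ~Six oysters
    =Steak and kidney pie with chips
}
```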

In the article Isabelle writes about how we construct language and how higher order thinking skills can be assessed using online methods, so please do read the article if you are interested in this.

Use of Moodle and Google Forms

City University London uses Moodle as its virtual learning environment, and this was seen as the perfect platform for the language testing. I met with the lecturers who would be preparing the questions for the test and explained how the Moodle quiz tool worked, to help them understand the types of question that would and would not be appropriate.

Once the questions had been written we had a two hour hands-on training session where the staff were trained in using Moodle quiz and then used it to add their questions with my support. I would recommend this approach. It meant that I could immediately troubleshoot any problems and the staff involved have been successfully using Moodle quiz ever since.

We also needed to collect some personal data from the students, e.g. name and degree course. We used a Google Form for this, as they are very easy to set up and the data can be exported in Excel format, which the administrator requested.

Effectiveness of the language diagnostic multiple choice test

Effectiveness of the test was measured by the number of students who stayed in the group/level they were assigned during testing, i.e. the language level of the course matched the level at which the student tested. We were very pleased to see that the test proved very accurate in determining level for French, German and Spanish (only small numbers of students took Mandarin and Arabic, so that data was not conclusive).

This shows that an online test can effectively measure language ability in the majority of cases with very little movement of students between levels.
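The “stayed in level” measure is straightforward to compute once the test assignments and final class lists are side by side. A minimal sketch in Python, using invented records (the data and field layout here are illustrative only, not the study’s figures):

```python
# Sketch: placement accuracy = share of students who stayed in the level
# the diagnostic test assigned them. All records below are invented.
placements = [
    # (language, level assigned by test, level the student ended up in)
    ("French",  "B1", "B1"),
    ("French",  "A2", "A2"),
    ("French",  "B2", "B1"),   # moved down a level: counts against accuracy
    ("German",  "A1", "A1"),
    ("Spanish", "B1", "B1"),
]

def accuracy_by_language(records):
    totals, stayed = {}, {}
    for language, assigned, final in records:
        totals[language] = totals.get(language, 0) + 1
        if assigned == final:
            stayed[language] = stayed.get(language, 0) + 1
    return {lang: stayed.get(lang, 0) / totals[lang] for lang in totals}

print(accuracy_by_language(placements))
# French: 2 of 3 stayed; German and Spanish: all stayed
```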

You can download a copy of the full article here.

EDEN conference poster presentation

We have a poster presentation at the EDEN conference in Porto, Portugal next week. The conference theme is ‘Closing the gap from Generation Y to the mature lifelong learner’. Our poster is a case study of a distance learning course, the PGCert in the Principles and Practices of Translation. The vast majority of courses at City University are blended learning courses (using a mix of face-to-face and online resources), so we were interested to see how students on a distance learning course utilised the tools made available to them in Moodle. The poster outlines how mature students from two cohorts on this course, with different levels of technical experience, have utilised the online resources, focussing on the use of discussion forums and Adobe Connect.

Discussion forums

Evelyn Reisinger, Course Director, set up a news forum and discussion forums in Moodle. These were designed to encourage the students to raise and discuss their own issues as they felt appropriate with minimal interference from university staff. I was interested in whether the students utilised these discussion forums to create a community of practice (or communities of practice as there are a number of language combinations available within the programme). This draws from the work of Etienne Wenger (2006).

I completed a content analysis of the discussion forum postings for each year group. I did this by reading through each post and categorising it in terms of its content.

The contributions on the discussion forums specifically relate to some of the criteria for communities of practice as outlined by Wenger (2006). These include problem solving, requests for information, coordination and synergy and discussing developments (see the percentage interaction for each cohort below).
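For anyone repeating this kind of content analysis, the tallying step is easy to script once each post has been read and hand-coded. A minimal sketch in Python (the coded posts below are invented examples, not the actual forum data):

```python
# Sketch: turn hand-coded forum posts into per-cohort category percentages.
# The category labels follow Wenger-style communities-of-practice criteria;
# the post codes themselves are invented for illustration.
from collections import Counter

coded_posts = {
    "2009/10": ["problem solving", "requests for information",
                "problem solving", "discussing developments"],
    "2011/12": ["requests for information", "coordination and synergy",
                "personal interaction", "problem solving"],
}

def category_percentages(posts_by_cohort):
    result = {}
    for cohort, labels in posts_by_cohort.items():
        counts = Counter(labels)
        total = len(labels)
        result[cohort] = {cat: 100 * n / total for cat, n in counts.items()}
    return result

for cohort, pcts in category_percentages(coded_posts).items():
    print(cohort, pcts)
```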

[Chart: percentage of forum interactions in each category for the two cohorts]
Adobe Connect

The academic staff were keen to have some face to face tutorial time with the students. This was done using Adobe Connect (AC) web conference software already in use at the university. The sessions were specifically designed to answer student questions just after they had received assessment feedback. The students sent in their questions before the session and the course lecturers answered them in text form on Moodle and through AC.


  • AC live sessions were not well attended
  • Some students suggested that the time of the session wasn’t convenient
  • Some students are studying from different countries so the time zone may have been a factor
  • The sessions were held in the afternoons when many of the distance learning students may have been at work or had childcare issues
  • Students who didn’t attend did access and view the tutorial recordings, so the sessions were perceived as worthwhile


From analysis of the usage of the tools, feedback from students, lecturers and the administrator we concluded that
  • Students did use the discussion forums to communicate on many levels and did create communities of practice. In 2009/10 these were focused on the course, but in 2011/12 a German-to-English mothers’ forum was set up which includes personal interactions about their lives and similarities. The staff on the course are actively encouraging use of the discussion forums in this way
  • Students were willing to use the online resources made available to them. They had signed up for a distance learning course and were made aware that resources were shared online, so this could have led to a self-selected, IT-confident group
  • Students used discussion forums for a number of different interactions, mostly related to the course but including some personal interaction
  • Age was no predictor of a student’s use of the technology
  • Adobe Connect tutorial recordings were accessed by students who could not attend the live session, so they proved a valuable resource

Interview with Course Director Evelyn Reisinger on using Moodle (recorded during the first year of the programme)

Language diagnostic testing moved to Moodle

The Centre for Languages here at City University London runs a large scale language diagnostics programme every September to assess the levels of students wishing to study a language. The languages we offer are Arabic, French, German, Mandarin and Spanish. Students can take a language as an optional module or, in some cases, as part of their course. Previously students took the test on paper; the papers were marked and passed on to administration, who assigned each student a level and sent out their timetable. This was a time consuming process for academics and administrators alike and caused delays in assigning students to the correct language level. Isabelle Marcoul, Head of the Centre for Language Studies at City University London, contacted me to see how this process could be enhanced through the use of Moodle. After numerous meetings and discussions with those involved it was agreed that the current tests would be put online as a Moodle quiz. This meant that students could get their result immediately and know which level they were at and which class they could attend. Students also completed an online form (set up using Google Forms) to collect the information needed by administration.
This process was implemented for the first time in September 2011. Below are the thoughts of the Exams and Assessment Coordinator, the School Registrar and myself, from Educational Technology, about how the project went and the improvements we will make for next year.

Svenja Erich, Exams and Assessment Coordinator for the Centre for Language Studies, City University

This year you used Moodle’s quiz function to run your language diagnostic tests. Why did you decide to do this?
We decided to use it because the previous system seemed outdated and ineffective, and required a lot of resources and storage space. It was also hard work in terms of marking, administration and organisation.

How did you administer the tests in previous years?

We had an A3 (folded into A4, so there were four pages) paper-based test with multiple choice questions (two pages) and a writing section (one page). Students’ details (name, email address, studies etc.) had to be filled in on the front page of the test.
Tests were given out to students in Freshers’ Week – at the annual Languages Fair for the Languages for General Purposes (LGP) students, and at orientation meetings at CASS and SoA/SoSS for the Language for Specific Purposes (LSP) students. The multiple choice section could be marked using transparencies, whereas the writing section had to be looked at by a lecturer of the target language.
The marked tests were passed on to the administration and the students’ details were entered manually onto the registers. Once this had been done, students were sent an email with the course information.

What are the advantages of running the tests in Moodle?

Example of questions from French diagnostic test

We don’t have to mark the test anymore; the mark is calculated automatically, which saves a lot of time. Staff at the Languages Fair can concentrate on giving advice about the courses rather than marking piles of tests. The result comes up immediately once the test has been completed, so if students are briefed properly they know straight away whether they have been admitted to a course and which class they should go to.

Screenshot of the Google Form students complete when taking the diagnostic test

From an administrative point of view, running the test on Moodle is a lot more effective and less time-consuming. Students enter their details onto Moodle, from where they can be easily exported; there is no need to enter data manually. Also, the data are accessible to anyone in the languages team, and test results and students’ details can be looked up easily without having to go through the administrator.

In principle students can take the assessment test at home or anywhere at the university, without having to pick up a copy from, or drop one off at, the CfLS. Students who were not able to attend the Language Fair are now sent the details on Moodle along with an advice sheet. This makes it a lot easier for latecomers to take the test and enrol on the course, which has had a positive impact on enrolment numbers.

Overall, the new system is a lot more reliable and effective than what we had before.

Are there any disadvantages? How did you solve these?

The only real downside is that we need rooms equipped with computers at the Language Fair, and we depend on IT services doing their job properly. For peak times at the Language Fair we booked several computer labs across the university, made sure that one support person was present in each of these rooms, and had plenty of advice sheets on how to access the test available. This was important because students are not necessarily as confident with the technology as one assumes.

Have you had any feedback from staff or students?

Our staff were extremely happy with the change. Apart from all the positive factors mentioned above, there is also the feeling that we have arrived in the 21st century with our assessment methods; the paper-based copy had become a little embarrassing. Students appreciated the flexibility of the test, as you can start completing it and resume later. The only problem we had was nothing to do with the technical side: it concerned the content of the French test and can easily be corrected.

Will you use the Moodle quiz again next year? If so, what changes would you make?

Yes, we certainly will use it again. It would actually be a nightmare to go back to the paper-based copy. The only change we will make is to ask LGP students to self-enrol (through Moodle) on the course, which will simplify the administrative process even further.

I am extremely grateful to Anna Campbell, who made the project possible. Anna understood all our needs and made sure that they were met in the best possible fashion. Moreover, she had extremely useful ideas and suggestions we had not thought of before. She made the technology accessible to us and offered excellent training to the lecturers designing the tests. All those who were less confident with the technology received individual support from her by email or over the phone after the training, and it was very easy to communicate with her. She also offered hands-on support at the Language Fair, which was very reassuring for me on the day.

Simon Barton, School Registrar, School of Arts

How did the change in process impact on administration?
Moodle revolutionised the way we administered the language placement tests. It made what was a 3-4 day job (over the weekend!) a single day’s work (not on a weekend!). No more data entry or marking tests by hand: Moodle sorted all that out for us and left me the much more manageable task of filtering the spreadsheets and emailing students their group allocations.

What will you do differently next year?
Next year, I would want to move the parts that go on Google Docs to Excel spreadsheets downloaded from Moodle, as not all of the team working on the data had access to this information or necessarily felt comfortable with Google Docs. And where we’ve got two documents that collect the information, I would aim to make it a single downloadable spreadsheet. In addition to these small changes, I would also introduce passwords on the free LGP courses so that tutors can give these out in class and students are able to register themselves.

Anna Campbell, Educational Technologist for the Schools of Arts and Social Sciences

How do you feel the project went?
I’m really pleased with the way the project went this September. It was great to see a tangible benefit to using Moodle in this way for all involved. I’m also pleased that the staff teaching the various languages really got to grips with the Moodle quiz tool and so started to see the further benefits of using Moodle.

Any technical Moodle issues?
We had to set the Moodle module containing the tests to self-enrolment. We don’t tend to do that at City (we normally enrol students onto modules via SITS), but in this case any student or member of staff across the university could take the tests, so enrolling them all via SITS was not feasible.

What improvements will you make?
I think that Simon and Svenja have covered them. There is still an issue with matching the Moodle quiz result in the gradebook to the personal information on the form. I haven’t figured out an easy way to match the two up (apart from getting students to enter the grade they got in the test onto the Google Form, which is not foolproof). I’m still thinking about that!
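One possible approach, which is my own suggestion rather than anything we have implemented, is to have students enter their university email address in both places and then join the two exported spreadsheets on that field. A rough Python sketch, with invented column names standing in for whatever the real exports contain:

```python
# Sketch: join a Moodle gradebook CSV export with a Google Form CSV export
# on the student's email address. Column names here are invented examples.
import csv

def load_by_email(path, email_column):
    """Index a CSV file's rows by a lower-cased, trimmed email column."""
    with open(path, newline="") as f:
        return {row[email_column].strip().lower(): row
                for row in csv.DictReader(f)}

def match(gradebook_csv, form_csv):
    grades = load_by_email(gradebook_csv, "Email address")
    forms = load_by_email(form_csv, "University email")
    matched, unmatched = [], []
    for email, form_row in forms.items():
        if email in grades:
            matched.append({**form_row, "Quiz grade": grades[email]["Grade"]})
        else:
            unmatched.append(email)  # follow these up by hand
    return matched, unmatched
```

Emails that appear on the form but not in the gradebook end up in the unmatched list for manual follow-up, so typos in either export are at least visible rather than silently dropped.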

Top tips – designing out plagiarism

Plagiarism is a hot issue in education. Rather than just detecting it, there are ways to design your assessment so that it is difficult for students to plagiarise. Here are some ideas (mostly from other people, so I hope I reference them correctly!).

1. Consider a change to the format of your assessment. Dr Liza Schuster from City University London has experimented with using individual wikis (using OU wiki on Moodle) for each student. Liza asked each student to write 300 words each week on the topic for that week under the following headings:

Screen shot from the wiki on the Global Migration course

Liza then went into the wiki weekly to look at and comment on a selection of student work. She reported that this meant she was quick to identify any problems with student understanding, poor referencing and plagiarism.
The students then had to put together a 3,000 word essay from the weekly work in the wiki. Liza reported that the referencing and writing in the final submissions were of a better standard than when the assignment had been purely a 3,000 word essay at the end of the course. The students said they preferred this approach, as they were more confident they were on the right track and didn’t have a single deadline for all their submissions.

2. Avoid using the same assignment title each year. I know, I know. As a teacher myself I know how much easier it is to mark the second and third year of using the same essay title. If you are loath to make a big change, perhaps change the focus: ask students to use the theories to explain a recent case study, for example. That way the structure of the marking remains the same, but it is much harder for students to plagiarise.
(adapted from Culwin and Lancaster, 2001)

3. Ask students to make a brief presentation to the class based on their written assignments. This doesn’t have to be assessed, but would help identify those who don’t understand what they’ve written or, worse, those who have bought their assignment from an essay bank
(adapted from Gibelman, Gelman and Fast, 1999)

4. The best way to design plagiarism out of a course is to teach students about good referencing in their first term of study. Don’t presume that students know how to do this effectively, even at Masters level. If you use a text matching tool, e.g. Turnitin, consider showing an example of a plagiarised script at the beginning of the course and demonstrating how Turnitin picks it up. It may scare them into referencing properly if nothing else!

5. Come along to our designing out plagiarism workshop (if you work in the Schools of Arts and Social Sciences at City University London, that is!). Click here to find our current workshop dates.


Culwin, F. & Lancaster, T. (2001). Plagiarism, Prevention, Deterrence & Detection. Institute for Learning and Teaching in Higher Education, South Bank University, U.K.

Gibelman, M., Gelman, S. R., and Fast, J. (1999). The downside of cyberspace: Cheating made easy. Journal of Social Work Education 35 (3).

The Moodle discussion forum

Debbie Dickinson has many years’ experience in the creative industries sector, and is the director of the Creative Industries degree in the Centre for Cultural Policy and Management at City University London. She uses her background in events promotion and music management to run the Foundation Degree and BA, which culminate in a series of events at Camden’s Roundhouse every year.

In this case study she tells us about her use of the discussion forums in Moodle, which she has used extensively, and which won her an award at City’s Moodle awards for 2010-11. She finds that the discussion forums offer a way for students to extend their discussions and meetings outside the classroom, essential when promoting events such as music gigs. Moreover, this is a way to engage students with Moodle early on, helping to ensure they come to see the VLE as an essential and central part of their studies.