Why ChatGPT has Educators Concerned

(I didn’t ask ChatGPT to write this for me.)

Generative AI has educators worried. I am assuming you have heard enough to know what AI-based tools like ChatGPT can offer and, like me, are thinking about the impact on learning. After a disappointing webinar from Microsoft, where we were effectively told to get used to the idea, I’ve been considering what was unsatisfactory about that suggestion and why educators are concerned about generative AI, which brought to mind Bloom’s Taxonomy.

The following diagram represents Bloom’s Taxonomy (1956 version) with a few extras. I showed Bloom’s Taxonomy to a gathering of school IT officers and none recognised it, but I’m certain teachers would, as they are trained using this theoretical framework. There are multiple versions, but they all represent levels of understanding that can be measured through assessment.

Bloom’s Taxonomy against technological aids

ChatGPT, and tools like it, have been compared to past technologies that challenged educators at their inception. Calculators were, at first, resisted in schools as they could answer mathematical questions that students needed to answer. The Internet, with its search engines and sites like Wikipedia, has also challenged the way students are assessed.

Generative AI has concerned educators to the same degree as these earlier technologies, if not more. Considering Bloom’s Taxonomy gives insight into what may be going on in the back of teachers’ minds.

  • A quick web search can substitute for a student’s recall of knowledge.
  • Sites like Wikipedia can provide information showing a level of comprehension.
  • A calculator allows a student to solve a mathematical equation without demonstrating the application of their understanding.

Generative AI, in comparison, can:

  • take a sophisticated prompt,
  • gather information that demonstrates knowledge and comprehension,
  • contrast views in a way that shows application and analysis, and
  • synthesise these ideas into a cohesive written piece.

As suggested in the diagram above, artificial intelligence can now be used to substitute for higher levels of understanding that were previously reserved for the human student. Together with the relatively sudden release of generative AI, I believe this is what has teachers worried.

At this stage, the quality of output from generative AI is not perfect, containing factual errors and bias, but it is improving. It is already at a stage where it could be equated to responses expected from post-graduate students, such as medical and legal students. The output is also relatively unique, so existing plagiarism detection is less effective against it.

Going back to my original question, educators are concerned because ChatGPT threatens their methods of assessing student understanding in a new way. This threatens academic integrity and institutional reputation, and it also means they have work to do to revise their assessment approaches.

What can be done?

It seems that generative AI is not going to disappear any time soon; in fact, it will probably have deeper impacts on education than we can currently imagine. From what I can see, there are three possible responses for educators adapting their assessment to this new paradigm: block, control or embrace.

Block

Blocking generative AI tools, such as ChatGPT and QuillBot, could discourage students from attempting to use them. This would be a similar response to the introduction of earlier technological tools, and it has been the response from ACT Education (where I work). However, in the case of ChatGPT, the number of places where it can be accessed beyond its main website (such as Bing and various chat platforms) is proliferating, so blocking may not be effective.

Control

If generative AI tools are not blocked, controls can be put in place to prevent cheating and, perhaps more importantly, ensure students are able to learn and demonstrate their understanding appropriately.

  • Education
    As with other tools, students need to be equipped to understand the ethical standards expected. Plagiarism and other forms of academic misconduct are frowned upon and do not lead to successful learning outcomes for students. Information literacy training around content from online sources is already needed to help students grow into citizens who can identify trustworthy content, and generative AI output is an extension of this.
  • Task Controls
    In mathematical tasks, we ask students to show their working; in essay tasks, we ask students to note their sources. Similarly, with tasks that could make use of generative AI, students can be asked to give more than the answer to demonstrate how they came to their understanding. Assessment design can be improved to deter (or at least complement) use of generative AI by adding specificity to the task or by asking students to deliver artefacts in multiple modalities (other than written text). Ultimately, the best way to avoid cheating is to make tasks seem achievable by providing clear instruction and appropriately weighting tasks.
  • Technological Controls
    Plagiarism detection seems to have been diminished now that generative AI can synthesise a novel text presentation, with appropriate citations. So what can be done technologically to control the use of generative AI?
    • OpenAI, the makers of ChatGPT, have released a tool that can help detect AI-generated text, which may be useful, but for now its reliability is hard to judge.
    • It’s possible to ask students to present their work using a shared Word document, Google Doc or OneNote file, which shows a history of how the document was constructed, allowing teachers to scrutinise where content may have been copied and pasted. This is not foolproof, but a useful check for teachers.
    • Quizzes have been shown to allow demonstration of understanding as good as or better than a written essay. A quiz can elicit responses to various question types, which may be useful to redirect students away from generative AI. Quizzes can also be partially or fully automatically marked, which is always an incentive for time-poor teachers. Adding time constraints and a lock-down browser to a quiz should give confidence for most assessment.
  • Physical Controls
    When authenticity of assessment really counts, it still means asking students to undertake tasks in person, away from computers. That could mean a paper exam or an in-person question and answer session. The utopia of online exams, temptingly close after COVID-19 remote learning, will be challenged by generative AI when institutional reputation is at stake.

Embrace

Educators have the opportunity to employ generative AI as a tool for learning and assessment. Like plagiarism detection (which began as a tool for educators to spot cheating, but became a tool shared with students to learn appropriate citation), generative AI in the hands of students can have learning benefits. The possibilities are already being enumerated and I anticipate we’ll see many pedagogical applications of generative AI over the coming years. Here are some.

  • Providing summaries of written works to gain a better understanding
  • Generating exercises to self-assess understanding
  • Supporting text revision where non-native language skills are lacking
  • Providing a starting point for in-depth research

Evaluation is a level of Bloom’s Taxonomy that I don’t think AI has yet conquered and that leaves room for higher-order thinking to be assessed. A colleague pointed out an article from the Sydney Morning Herald describing tasks being prescribed to medical students, who were instructed to prompt ChatGPT for information and then critique it.

Conclusions

The benefits of generative AI could lead to better student outcomes, if educators allow learning to make use of them. Already, significant positive innovation is emerging to match the reactions of those wishing to block generative AI.

I don’t expect efforts to block generative AI to last long, especially while they are less than fully effective. Ultimately a point of balance between control and embrace needs to be established, where assessment of understanding can happen and the learning benefits of AI-based tools can be achieved.

Limitations

I need to say that these views are my own and not an official policy or stance of ACT Education.

I haven’t completed a comprehensive survey of opinions from “the coalface”, however I’ve been communicating with people who have been gathering reactions to generative AI and writing briefs and presentations for Education decision-makers, which led me to this understanding.

Real Analytics in a K-12 School

Previously I described a number of myths about analytics I saw circulating when I started working in the school sector after years in the tertiary education sector and the education-related IT industry. Some schools are cobbling together reports on a semi-regular basis and, while useful for a small group of people for a short period of time, these are not analytics. Other schools have access to marks from an LMS, but not a complete picture sufficient to make predictions. Analytics should be:

  • available on demand,
  • predictive and
  • proactive.

This means that data should be collected regularly, processed in the background and ready for use, including calculations that lead to simple results as well as complex trends. When a result or trend meets a threshold condition suggesting intervention is warranted, relevant staff members should be notified to act.

Data available to schools

In tertiary education, mandates for online assessment are strong and the use of online systems for remote and blended learning is almost universal. This leads to rich data on assessment and other participation that allows tertiary institutions to conduct interventions, with the goal of retaining students (and their fees).

The conditions of data in schools differ from tertiary, with less in some areas and more in others.

  • (less) Schools have less granular online assessment and participation data, with even less in the early years.
  • (more) Student effort is often recorded and can be tracked over time.
  • (more) School students sit external assessments; in Australia, students participate in NAPLAN testing in years 3, 5, 7 and 9.
  • (more) Schools are required to collect attendance information, which can be a useful proxy for participation and, when combined with timetable information, can reveal rich, useful patterns.
  • (more) Schools collect pastoral information, which reflects behaviour that can impact student outcomes.
  • (more) Some schools check student attitudes or mood on a regular basis.

The goal of analytics in schools also differs: improving student outcomes in grades and other holistic measures, rather than just retention.

Data warehouse

Collecting the data described above is a start, but having the data doesn’t mean it is useful. A data warehouse:

  • collects together data from disparate systems (LMS, SIS, forms,…),
  • conducts calculations on a regular basis,
  • sends alerts when threshold conditions are met and
  • provides views that can be queried quickly when people are drawn in to act.

The creation and maintenance of a data warehouse implies the need for data specialist staff. At CGS we are fortunate to have a database administrator, who is primarily responsible for our analytics, as well as two web developers, a forms developer and a data systems manager, who work together integrating various systems, including the main analytics system referred to as the “Dashboard”.

Analytics possible in schools

Canberra Grammar School had been wanting analytics for a number of years before I joined in 2016, but was unsure how to achieve them. In a project with pastoral and academic leaders, we have been able to develop the Dashboard, which has been in use since 2018 and continues to grow in capability and use.

The development of the Dashboard followed the PMBOK-based project management process that I have written about previously. The need for analytics will differ from school to school and will be driven by the questions that the school needs to answer. This project involved consultation with various pastoral and academic leaders. We captured and expressed questions as use cases, such as “As a teacher/HoSH/Tutor I want an alert for student birthdays”. The list of use cases was quite long and we are still delivering some as more data becomes available.

The analytics could be implemented on a number of platforms. At CGS we use SQL Server for the data warehouse (as well as a data store for most of our other systems), SQL Server scheduled tasks for alerts and background processing, and a collection of SSRS Reports to form the Dashboard interface. We investigated PowerBI as an alternative platform but found it cost prohibitive when putting the results in the hands of all staff.

Since its inception, the Dashboard has undergone a number of revisions in response to user feedback. The initial focus was on delivering information about individual students. We have since added views to allow discovery of students needing intervention within cohorts.

Proactive alerts

Alerts sent directly to users prompt their action, but must be sent sparingly to avoid creating noise that people will habituate to. Here are some examples of alerts sent by email.

  • At risk students who have not been marked off in a class
  • Students with approved leave needing follow-up
  • Unexplained absences over a period of days
  • Students who report their mood as low
  • Pastoral events (positive and negative), including detentions
  • Reminders to mark assignments
  • Reminders to staff who have not taken class rolls before a certain time in scheduled periods

Highlighted information

Some lower-priority alerts are shown to staff on entering the Dashboard. These alerts relate to students whose pastoral care they are responsible for.

  • Students with low attendance or lateness
  • Students who make numerous visits to the health clinic within a period of days
  • Students with co-curricular activities today
  • Student birthdays
Staff landing page

On-demand information

Information displayed on the Dashboard includes:

  • Academic results and effort marks and trends over the student’s enrolment
  • Timetable information and attendance for academic and co-curricular activities
  • External assessments results
  • Co-curricular participation including weekly activities and carnivals
  • Student mood history, pastoral incidents and learning needs
  • Cross-reference information for students at risk by matching flags for learning support, medical conditions, etc

View of students in house by year

Looking at an individual student, a staff member can find information quickly and see highlighted information about the student.

Student landing page

Staff can drill down to specifics.

Comparative information and trends

Individual students are presented in the context of their cohort using z-scores, and there is also the capacity to look across a cohort to identify changes in student performance within it.
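As an aside, a z-score simply expresses a result’s distance from the cohort mean in units of standard deviation. A minimal PHP sketch of the idea (illustrative only; this is not Dashboard code, which lives in SQL Server):

// z-score: how far a result sits from the cohort mean, in standard deviations.
function z_score(float $result, array $cohort): float {
    $mean = array_sum($cohort) / count($cohort);
    $variance = array_sum(array_map(
        fn($x) => ($x - $mean) ** 2,
        $cohort
    )) / count($cohort);
    return ($result - $mean) / sqrt($variance);
}

echo z_score(82.0, [60.0, 70.0, 75.0, 82.0, 90.0]); // ≈ 0.65 (a little above the cohort mean)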

Usefulness

The success of any system is measured by its usefulness. Analytics at CGS have proven useful for more than strategic decision making, and seem to be having an impact on student care and on improving student outcomes. The Dashboard is reported to be used in the following situations.

  • Teachers’ understanding of who students are at the start of term
    • Differentiating students based on:
      • ways they learn,
      • skill sets and
      • past experiences.
    • Determining students who will work well together. (One staff member said, “We could spend a whole term getting to know a kid. Now we know them when we walk into the class on day one.”)
  • Learning Development support
  • Report writing
  • Pastoral/disciplinary issues
    • Following up on students-at-risk
    • Following student mood and acting
  • Subject choice advice
  • Career counselling, uni offers
  • Meeting mandatory requirements (eg attendance tracking)
  • For administrators to find staff responsible for students

Teachers appreciate the immediacy of having analytics available on-demand. One staff member said “It’s all about efficiency. When you’re reacting, having accurate data presented in an instant means you can assess a situation and make judgements rapidly.”

The use of analytics in the School has emphasised the need for accuracy and consistency in data collection. It is obvious when there are holes in the data, which clouds the picture of a student. This has led to drives for better collection of information and management of staff who fall short in their recording duties.

Since the system was introduced, there has been a steady year-on-year rise in its utilisation. While many staff may previously have searched the systems that feed data into the Dashboard, it is now clear the Dashboard has become the first interface they go to, particularly for new staff. According to the Director of Student Care, this indicates “staff are using student data in a more holistic way”. Projections for the current year show over 100,000 views.

Directions forward

We are still developing more predictive analytics. We are working on micro-patterns of attendance, such as a student missing classes in a particular subject. A drive to bring most assessment into our Learning Management System across all parts of the School will give more granular data and, hopefully, the ability to reach the holy grail of analytics: predicting student success within a current course.

With greater access to data, staff can feel they might be missing out on information, particularly as the system evolves. Specific training, and the encouragement of practitioner sharing, is increasingly needed to develop data-driven teachers and pastoral leaders.

We are currently working on parent and student views of analytics as a form of continuous assessment information. This informational display will be presented through the School’s LMS.

Continuous assessment

Phishing Training for School Staff

Phishing attacks are a risk to our School and schools like it. They are difficult to mitigate using automated means, so user training is needed.

There are two main types of phishing attacks:

  • an email that leads a user to a page where they provide credentials or other information or
  • an email that attempts to establish a trusted conversation with a user in order to ask them for information or to purchase items that can be transferred to cash (spearphishing).

While financial loss needs to be avoided, the extraction of user credentials has the potential to be most damaging as this can lead to loss of secure information and ultimately data breaches.

At Canberra Grammar School, we’d been wanting to run phishing training for a number of years. We had seen quotes for external vendors to support such training, but to save $12,000 to $15,000 we were motivated to try this ourselves. In this post I will outline how we went about running phishing training at CGS and the outcomes we experienced.

Messaging Around Phishing Training

In order to train users about phishing, you have to think like a phishing attacker. However, it’s important not to take that too far and think of the training as tricking users. Ideally users should become aware of phishing, know how to identify phishing emails and know how to respond when they receive such emails. This all has to be done while maintaining the good relationship between users and IT staff.

Before embarking on this training we created a number of educational and informative communications.

  • A guide within our Knowledge Base describing phishing emails and how to act
  • An email to various School leaders informing them about the program of training and letting them know it was OK to discuss it
  • A series of educational messages sent to staff through our daily announcements
An announcement about the phishing exercise
  • A quiz in our LMS that offered optional training to those wanting it before our fake phishing exercise and mandatory training for those who responded badly to the exercise

Within our School, staff are encouraged to forward phishing emails to our Service Desk who assist with identifying emails as phishing attacks and blocking senders as well as identifying other recipients and warning them.

Setting Up a Phishing Server

We started creating our own solution using a basic web server and email server, but then came across GoPhish, an open source platform for running phishing training.

To run GoPhish, you need a server that is web-accessible. We made sure the server was accessible outside the School as we imagined some staff would receive and respond to fake phishing emails from home and this needed to work as well as it did within the School. This could be a VM hosted locally or one from an external hosting provider.

We also took the step of purchasing a domain name close to our School’s own domain name (registrar link). We had seen such a domain used in a very sophisticated phishing attack on School community members and we wanted to use a similar domain for some of our fake phishing emails and landing pages, targeted to staff with riskier roles.

GoPhish is simple to install and get running. Download the zip file, unzip it and run the executable. We used a self-hosted Windows VM and when GoPhish first ran, a firewall warning popped up, asking to allow the software through.

Command prompt output from GoPhish on an initial launch

When the server first runs, it generates a password that is displayed in the command prompt window; I grabbed a copy of that. The IP and port of the server’s web interface are also given, which I copied into my browser’s URL bar. My browser warned me the site was insecure; clicking Advanced and then Proceed brought up the admin interface. I used the default username “admin” and the generated password to log in, then set a new password when prompted.

The GoPhish system with facilities to build campaigns

The VM and the server needed to keep running while configuring the system, sending phishing emails and afterwards to track and redirect users.

Campaigns

On the left-hand menu there are items for setting up components of a campaign. The nice part about setting these up separately is that this allows you to mix-and-match different combinations of components when you put together a campaign.

I don’t think it would be wise to share our templates here, but if you are from a school, feel free to call or email me and I will share mine with you.

Sending Profiles

Each campaign needs a sender.

  • For credential campaigns that lead a user to a page, replying is not important, so we used a sender that looked real according to where the email was supposed to be coming from (eg no-reply@ato.gov.au).
  • For spearphishing campaigns, the sender profile should represent a person of authority in your organisation, leading replies to a fake email account outside your organisation that looks like it could be that person’s personal email address. We set up a Gmail account for this. For added realism, we used the real email address of the impersonated sender in the From field, but added a Custom Header named Reply-To directing replies to the fake email account. (We only used this level of sophistication on our most at-risk staff. Normally such an attack shouldn’t be possible, as the School blocks domain emails coming from non-domain accounts, but staff may experience this in personal email.)

Email Templates

We set up a number of email templates representing credible communications from authorities who might contact a person out of the blue. The key message is a request for urgent action. We added content in the template editor’s HTML tab (you can also click the Source button to toggle WYSIWYG mode). We embedded images as base64 encoded content so they are sent within the email, rather than as links or attachments (see tools below). The email template can embed recipient names if you want it to be more sophisticated; these names are uploaded with email addresses. We ticked the box to add a tracking image to each email, which allowed us to know if a user had viewed a message (assuming they downloaded the linked tracking image, which is not entirely reliable). A simplified example follows.
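To illustrate the shape of such a template (a contrived sketch, not one of our real templates; the base64 data is truncated), GoPhish lets you mix HTML with per-recipient placeholders such as {{.FirstName}} and {{.URL}}:

<p>Dear {{.FirstName}},</p>
<p>Your mailbox is almost full. Please <a href="{{.URL}}">sign in</a>
within 24 hours to avoid interruption to your email.</p>
<img src="data:image/png;base64,iVBORw0KGg..." alt="logo">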

We created templates suggesting contact from the Tax Office, a School request for urgent login, and a warning that the user’s inbox was full and needed action.

An example phishing email with problematic elements highlighted

For spearphishing, you are sending emails to tempt recipients into replying. We used a sender profile for a person who could be identified as an authority from the School’s website. Our email template suggested urgency with some unstated action needed. The email included a signature, including an embedded base64 image, but it was deliberately different to the School’s defined email signature format. We spoke to the person we were impersonating to get their support and cooperation and people did contact that person directly.

A spearphishing email, pretending to be a School Director, but including spelling errors as well as signature and image inconsistencies

Landing Pages

For capturing credentials, users will be led from a link in an email to a landing page. Landing pages should include inputs for users to enter their credentials. To create a fake landing page we went to the real page, copied its HTML, brought linked CSS into the document and converted linked images into embedded base64 images so the page is entirely self contained. GoPhish can help you create such landing pages if you provide it the URL of a real landing page, but we didn’t try this. Some skill with HTML will help you make it work. Don’t worry about where the form is set to send its information (the <form> action attribute); the phishing system overrides this to bring the traffic back into itself before redirecting users. We ticked the box to capture submitted data, but left the “Capture Passwords” checkbox unticked.

A fake landing page, requesting credentials, with problematic elements highlighted

Redirection

In a phishing attack that leads a user to a page to provide their credentials, there is an option to redirect the user to a page afterwards. Rather than hosting this within the phishing site, whose domain users should treat as suspect, we redirected users to a page hosted on the server of our School LMS, so the domain would be familiar and acceptable security was in place. The page could be accessed without authentication so as not to disrupt the effect of the exercise. The page could be passed URL parameters so that information about the specific attack could be recalled, illustrating what the user missed. At the bottom of the page, users were directed to the phishing training quiz.

Page users were redirected to after providing their credentials.

Uploading Users

Groups need to be created in a spreadsheet. We used full-time staff only and divided them up into sensitive and non-sensitive groups according to financial capabilities and levels of access to sensitive information. For each group we created a CSV file with the following columns, including a header row. All fields except Email can be blank, but you can draw on these other fields in templates, if you wish.

First Name,Last Name,Email,Position
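A hypothetical row under that header might look like this (details invented):

Jane,Citizen,jane.citizen@school.example,Teacher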

During our initial setup, we had groups that included EdTech department staff for testing.

Launching a Campaign

We started by warning our Service Desk that we were about to start a campaign and how to respond to users reporting phishing emails. We provided text to make responding to these emails easy and consistent.

For our spearphishing campaign, we warned the person we were impersonating that we were about to start. We discussed how to respond to queries that came to them.

We created a new campaign from the Campaigns page.

  • We picked an appropriate combination of:
    • email template + sender profile,
    • landing page (irrelevant if spearphishing) and
    • user group.
  • For the URL, we mixed the look-alike of the School’s actual domain (trickier) with the raw IP address of the phishing server (less tricky).
  • We set an end time about an hour later using the “Send Emails By (Optional)” setting and the system spaced out sending the emails.
Bringing together the elements of a campaign

Once launched, we kept an eye on various campaigns through the Dashboard page. Campaigns were conducted with groups over a period of three days using different email+landing page combinations.

Replies to the spearphishing emails went to a fake email account. We informed each person who replied that they had been sent a fake phishing email, what they should have done, and that they should now undertake training. Again, a pre-determined text response was used.

See Also

Image conversion tools…

Docs…

Our Experience

On the whole, the experience of running phishing training was positive. Most staff expressed a keenness to be tested when the exercise was discussed publicly, and only a couple of staff said they felt uncomfortable after failing the exercise. Responses from staff allowed us to improve our advice during training and for a future repetition.

It’s hard to know exactly how many emails were received, as some would have been filtered by users’ email clients. It’s also hard to know how many people looked at the emails, as the tracking images would have been hidden by most users’ email clients unless they chose to reveal them. We did get some numbers, which rounded out as follows.

262 Staff involved
524 Fake phishing emails sent
42 Voluntarily took practice quiz
142 Reports to Service Desk
21 Replies to phishing emails
1 Gave away credentials

While not all staff responded in an ideal fashion, awareness was raised and people who failed the exercise were directed to remedial training. Ultimately a positive education exercise was achieved.

The testing also shows our email system responds appropriately to most phishing emails and can be trained when users identify emails as Junk. This was also good training for Service Desk staff.

The phishing system can be reused in future if this exercise needs to be repeated.

Plugins we have Developed for our School in Moodle

If we had used an off-the-shelf LMS, our potential to customise the system would have been limited. Part of the reason our School chose Moodle was to allow us to make it our own. This potential is a double-edged sword: it takes time and effort to customise a system, especially once School leaders get into the habit of asking, but the results allow us to achieve a system we could not have achieved otherwise.

One trap to avoid is modifying core Moodle code as doing so will affect your future upgrade prospects. So far we have managed to avoid this and have been able to make all our modifications by creating plugins.

An Announcements System

Moodle allows you to use the Forum module as a mechanism for sending messages to users at course and site levels. However, when planning for an LMS change, stakeholders were asking for particular features the Forum module doesn’t deliver, so we started developing our own Announcements system. It has been the largest single solution we have developed.

Our announcement system allows us to:

  • target single users, groups, courses, campuses, year levels and more;
  • combine audiences in union (eg all Year 7 and 8 students) or intersection (Football players in Year 10);
  • send messages to parents (mentors) relative to students;
  • moderate messages sent to large audiences;
  • impersonate senders (eg the PA to the Head sending on behalf of);
  • brand single message and digest emails;
  • see an infinitely scrolling list of past announcements;
  • see context specific announcements (eg course-related) in that context;

…and many more subtle tweaks.

Repos: Local plugin, Block

Student Timetables

Students (and their parents) need ready access to their timetables for the day. We provide that with links to courses for each period. The display changes over the course of the day as time passes. It’s possible for users to skip forward by days, according to timetable information provided by our SIS.

In the Primary school, timetables are simpler. We show a link that takes students straight to their year-level course. We also show unusual activities such as specialist classes (art, PE, music, etc) and individual tutoring sessions.

We add these blocks in a region we’ve added at the top of the Home page.

Repos: Timetable block, My day block

Mood Survey

Particularly during remote learning, we needed a way of gauging staff and student mood, to assist pastoral care staff and counsellors. Our solution was to create an overlay on the front page that asked users how they were feeling.

Responses are channelled into a table that allows us to generate alerts and reports.

Repo: Block

Mentees+ for Parents

The core Mentees block shows only student names to a parent, with links to the students’ profiles. We have created an enhanced version with photos and direct links to each student’s courses (which we allow parents to access). This provides parents with quick access to all their children’s involvement in an obvious fashion.

The block is placed in an added region at the top of the Home page.

When a parent has many children, the block can be collapsed down. There’s also an option to hide the content of the block for teachers who are parents and don’t want students in their classes to see details.

Repo: Block

Targeted Quick Links

For our School, the LMS also acts as a portal to other systems. We therefore created a quick links block that allows buttons and text links to be added for specific audiences that can be targeted by combinations of site role, campus and year level, giving a personalised experience.

Repo: Block

Past Courses in Boost

The Boost theme is more responsive than previous themes and we’ve embraced it. One downside to the simplified navigation is that only current courses are listed. To allow access to past courses we created a plugin that adds them in an expanding menu below the normal list of courses.

Repo: Local plugin

Transcoding for Apple Devices

The RecordRTC features of Moodle provide convenient audio and video recording, but the open formats used are not supported on Apple devices. We looked at the Poodll plugins, but the downside was that our recordings would need to sit on servers overseas. Instead we created a tool that runs in the background, transcodes the audio and video files to formats compatible with Apple devices, and embeds the compatible versions as additional source files within the original links for compatibility.

Repo: Admin tool

Syncing more User Data

Most of our user data comes from our SIS. Automating this means the LMS is easily managed. As well as syncing user details and course enrolments, we’ve created a number of admin tools so we can also sync:

  • class groups,
  • parent (mentor) relationships and
  • category and site enrolments.

Repos: Group sync admin tool, Mentor sync admin tool, Category role sync admin tool, Site role sync admin tool

Our Course Roll-over Process

When setting up your Moodle site, you may want to consider how long courses will live and how you will “roll” them over, which means making them ready for a new cohort.

Edit: We created a plugin to assist with setting enrolments to manual for archived courses. I’ve updated that part of our process below.

It is possible to leave courses in place and simply reuse them, but I wouldn’t recommend this as courses can accumulate a lot of mess over years, both in content and settings. Moodle’s backup, restore and import processes are set up to easily copy courses, adjusting to new timelines. Our approach is to create a new copy of courses for each teaching period. This also affects how we name and organise our courses.

Keeping a version of each past course, as it was taught, means teachers and students can remain enrolled in that course for years, referring back to it over time. Recreating courses means the teacher only has to focus on the current instance and not worry about maintaining historical activities and resources. It also doesn’t greatly increase storage as Moodle’s file management transparently keeps a single copy of files that might be used in multiple courses.

We roll-over our academic and sports courses every six, 12 or 24 months, depending on the length of the courses. Our roll-over process relies on feeding a number of spreadsheets into Moodle admin tools. One optional added tool is the Upload Course Categories tool.

Our roll-over process therefore has three stages: preparation of spreadsheets (CSV files); using these to execute course changes and make copies (along with a few manual changes); then a clean-up of the results while the system continues to be used.

Preparation

Once you have set up the following spreadsheets, reusing them with new details each roll-over is very easy. We use a collaborative Google Sheet and then export sheets as CSV files when we need them. (Illustrative rows for each file follow the list below.)

  1. Prepare a CSV file containing a list of categories that will contain courses that will be rolled over. Include the following fields: idnumber, name, description
    • The new categories will reflect the structure of the current categories, but will be rooted at a category for the past year (eg 2019).
    • The idnumber and description fields can be empty, but must be present as columns
    • The name field includes the path with forward slashes (/) between levels, eg “2019/Senior School/Senior Academic/IBDP”
    • See this repo for more field info
  2. Prepare a CSV file containing a list of current courses that will be rolled over with the following fields: shortname, enddate
    • The shortname is used as the identifier for the course to be updated
    • The enddate will be in the format DD.MM.YYYY eg 31.12.2019
    • See this doc for more field info.
  3. Prepare a CSV file containing a list of new courses with the following fields: shortname, fullname, idnumber, category_idnumber, format, showgrades, templatecourse
    • The shortname and fullname should follow the format described using the naming convention (see ours).
    • The idnumber will match the course in the SIS.
    • The category_idnumber is the target category ID (we use short words for these codes).
    • The format is the course format. We use tiles for academic courses.
    • The showgrades controls whether grades are shown; the value will be 1 for academic courses and 0 for other courses
    • The template course will be the idnumber of the previous instance of the course you will be rolling over (copying).
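To make the three files concrete, here are illustrative rows for each (names, idnumbers and category codes are invented; adapt them to your own conventions):

idnumber,name,description
,2019/Senior School/Senior Academic/IBDP,

shortname,enddate
ENG10-2019,31.12.2019

shortname,fullname,idnumber,category_idnumber,format,showgrades,templatecourse
ENG10-2020,English Year 10 2020,ENG10-2020,senacad,tiles,1,ENG10-2019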

Execution

As this affects users’ experience, it needs to be done out-of-hours and relatively quickly, so be prepared. You may want to try this on a test system first to ensure you get it right. The critical bits are done in maintenance mode.

  1. Create the roll-over categories by uploading the first CSV created earlier (the list of categories) to Site admin > Courses > Upload course categories (or create the categories manually if this plugin is not installed).
  2. Manually move each course that is being rolled over into its corresponding roll-over category (under the year category). You can do whole categories of courses at a time.
  3. Put the site into Maintenance mode in Site admin > Server > Maintenance mode.
  4. Wait for a minute to ensure any cron jobs are completed.
  5. Create new courses, copying old courses, by uploading the third CSV you created earlier (the list of new courses) to Site admin > Courses > Upload courses (this copying may take a while). If you have a large number of courses, you may want to do this in batches.
  6. Set end dates for rolled-over courses by uploading the second CSV you created earlier (formerly the current courses) to Site admin > Courses > Upload courses.
  7. Check that courses are in place and set up.
  8. Freeze the year level category (assumes freezing is enabled at Site admin > Development > Experimental settings)
  9. Take the site out of Maintenance mode.

Clean up

The following can be done while the system is in use, but shouldn’t be delayed.

  1. Make the teacher and student roles in archived courses fixed by setting them to manual (if you used DB syncing). We created an admin tool plugin to allow us to change/fix roles.
  2. Sort the courses in each category by short name ascending. Check this makes sense and possibly re-order courses into a more logical order (eg, Pre-S, Pre-K, K, 1…) if necessary.
  3. Copy any course images to the new courses. These don’t come across in the course copy.

Implementing Moodle in a School

TL;DR – School specific bits

Overview

This post describes the technical details of the setup of Moodle as an LMS, announcements system and portal in a School. For details of the greater change management project, please see my previous post: An LMS Transition Project.

After Moodle was decided on as the preferred system, there were a number of implementation decisions that needed to be made. Over time we have adjusted and improved upon our installation and I hope to share advice with other schools, particularly as a lot of Moodle advice is given in the context of tertiary education.

We have two systems running: one for testing and one as our production server. The configuration for the test server is mostly the same as the production server, except for redundancy and backups. In this document I will focus on the production instance.

Database

Moodle works best with PostgreSQL and if you have no DBMS preference, I suggest you go with that. At our School we had been running MS SQL Server for a number of systems, so it made sense to stick with that. Speed is about as good as PostgreSQL, but some additional settings are needed to accommodate Unicode characters (guide). Staying with a single DBMS has also made cross-system reporting, backups and the focusing of expertise simpler.

The database for Moodle is hosted on two VMs with automatic fail-over. Each has 8 CPU cores @ 2.3GHz, 160GB storage (all flash) and 16GB RAM, which seems to be more than enough for our application. The storage is split across a few partitions to allow resizing for different DB tables (eg logs) as needed.

Server

For the webserver, we are running IIS under Windows. Again this is not the best option for Moodle (most use Linux with Apache), but it is simpler for our system administration and backups. Running a relatively large instance of Moodle on Windows has been more challenging than I thought it would be, but it can work.

The VM for our web server has 16 CPU cores @2.3GHz, 550GB storage (all flash) and 16GB RAM. The Moodle data directory sits on a separate 500GB partition to allow resizing when needed.

We’re running a recent, but not bleeding-edge, PHP version. We did have to pick a version that would work successfully with the PHP drivers for our DB, our Redis cache and Solr. Within php.ini, increase max_execution_time (3600) and max_input_vars (20000). Turn on the various extensions Moodle suggests (though I suggest not enabling opcache on your development or test servers, as that allows code tweaks to take effect without being cached). To allow Curl to work with SSL, you need to download the CA certificate bundle from the cURL website into a location defined by curl.cainfo in php.ini. The relevant entries are sketched below.
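For reference, those php.ini entries look like this (the cacert.pem path is an example; use whatever location you set):

max_execution_time = 3600
max_input_vars = 20000
; CA bundle downloaded from the cURL website
curl.cainfo = "C:\php\cacert.pem"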

Windows has an arbitrary limit of around 4000 TCP sockets. With a web connection, DB connection and cache connections, each user can be utilising three or four sockets, and we were hitting that socket limit and creating contention. A few Registry changes are needed to overcome the limit; the values commonly adjusted are sketched below.
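As a sketch only (these are the registry values commonly cited for this limit; confirm the exact keys and behaviour for your Windows version before applying):

Windows Registry Editor Version 5.00

; Raise the ephemeral port ceiling to 65534 and release closed sockets after 30 seconds
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"MaxUserPort"=dword:0000fffe
"TcpTimedWaitDelay"=dword:0000001e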

We also increased the allowed filesize for IIS, mostly to be able to handle large backups (guide). In IIS > Request Filtering > Edit Feature Settings > Set Maximum allowed content length to 2147483648 for 2GB. We also made a change to Windows to work better with UTF8 characters in filenames (guide).

Git

Even though you won’t develop code on your production instance, the easiest way to fetch the latest versions of Moodle, and plugins developed by yourself and others, is to use Git. See this guide for details if you are not familiar with Git. I recommend starting all installs with Git as starting with a downloaded zip version makes overlaying Git harder.

Caching

For caching, Redis is the new preferred solution for Moodle (guide) (session handling settings) (helpful discussion). Under Windows, you have to settle for a code port hosted by Microsoft from a few years ago, but it works (download). For our purposes, we set up two Redis stores so we could separate session caching from application caching for monitoring purposes. You can create two Redis instances with the following command line commands…

redis-server --service-install --service-name Redis --port 6379
redis-server --service-install --service-name Redis2 --port 6378

…and then enable them as services using the Windows Services admin app. They will start automatically when the machine restarts and are really just black boxes that Moodle throws cache information into.

You then need to download and install the PHP driver (matching your web server and thread-safety status) and add an entry to your php.ini file, as sketched below.
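The php.ini entry will look something like this (the exact DLL name varies with your PHP version and thread safety):

extension = php_redis.dll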

To monitor the cache instances, we use a script (download), with a change to the array at the start of the file to cover our two caches.

If Redis is working, it should show up in Site admin > Caching > Configuration. Add an instance of Redis named Redis1 with IP 127.0.0.1:6379 and another named Redis2 with IP 127.0.0.1:6378.
Click Edit mappings at the bottom of the page and set Redis1 as the Application cache and Redis2 as the Session cache.

For the session caching to work, you also need to add the following lines to your config.php file. Be sure that these are late in the file, but before the require_once() call for setup.php.

$CFG->session_handler_class = '\core\session\redis';
$CFG->session_redis_host = '127.0.0.1';
$CFG->session_redis_port = 6378;
$CFG->session_redis_database = 1;
$CFG->session_redis_auth = '';

$CFG->session_redis_prefix = '';
$CFG->session_redis_acquire_lock_timeout = 120;
$CFG->session_redis_lock_expire = 7200;

Cron

On anything more than a trivial site, most of Moodle’s work is handled in the background, and this is also where Moodle fails most often. You therefore have to set up a mechanism to execute scripts and log their output. In the Unix world this is cron; in Windows it is Scheduled Tasks. For our instance, a scheduled task runs every minute, triggering a batch file that runs the Moodle PHP script admin\cli\cron.php. The batch file creates a timestamp and uses it to name a new file that the cron script’s output is piped into. Another scheduled task cleans up cron output files after five days. We also keep a summary log of the returned status and run time of each cron run, which is a helpful overview for spotting when tasks run long; we truncate this file to 30,000 lines to keep a few days of history. A minimal sketch of such a runner follows.
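A hypothetical sketch of such a batch file (paths are invented, and the %date%/%time% slicing is locale-dependent, so adjust to your system’s date format):

@echo off
rem Run Moodle cron and pipe output to a timestamped log file.
set TS=%date:~-4%%date:~-7,2%%date:~-10,2%_%time:~0,2%%time:~3,2%
"C:\php\php.exe" "C:\moodle\admin\cli\cron.php" > "D:\cronlogs\cron_%TS%.log" 2>&1
rem Append the exit status and finish time to a summary log.
echo %date% %time% status=%errorlevel% >> "D:\cronlogs\cron_summary.log"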

Moodle takes care of what tasks it will complete during a cron run. It understands overlapping tasks and schedules itself, so you don’t have to. It maintains locks for overall cron running, ad hoc tasks and individual tasks. Occasionally, a task can fail and its locks are not cleared. By default, locks are cleared after 24 hours, but this does not always work, and a lot can happen in 24 hours. We have made a few changes to get more reliable results. First, some changes to the config.php file to use the DB instead of files for various kinds of locking…

$CFG->preventfilelocking = true; 
$CFG->lock_factory = '\\core\\lock\\db_record_lock_factory';

Be sure these lines occur before the require_once() for setup.php.

We have allowed scheduled tasks to run in parallel in Windows. This means that you can have up to three scheduled task runners and three ad hoc task runners running at the same time, controlled by the limits in Moodle in Site admin > Server > Task processing. If there are long-running, multi-minute tasks (like search indexing, forum notifications, etc), other shorter tasks are not affected as much. Also, if one of the task runners locks up completely, others will still be able to run.

We’ve also put DB alerts in place to monitor the locks. When tasks have not run for an hour or when a lock has not been cleared for an hour, it sends out an alert. This doesn’t occur often, but is good to know and check on when it does.

Integrations

SSO (Single Sign-on)

Our default login is through SSO using the SAML2 Single Sign on plugin. When users hit the site they are redirected to sign in through SSO, if they haven’t already. Our SSO sessions are handled by an external provider and work across most of our web-based systems. The only manual authentication to Moodle is for the main admin account, which is accessed by an SSO bypass URL.

Google Drive

To access Google services, you need to register for an OAuth2 client ID (guide). We do not use the authentication side of this, as we use SSO, but we do use it for Google Drive repository access.

One Drive

Like the Google API, you need to register an OAuth2 client for Microsoft to be able to access One Drive (guide). There is a more extensive plugin set to access more MS API services, like OneNote, but we were not able to get that working.

Google Analytics

One way to get stats about users passing through your site, including their locations and device details, is with Google Analytics. You have to set up an account on the Google Analytics site and get a tracking ID. I recommend the Local Analytics plugin, which makes setting up the link to Google Analytics easy and provides more meaningful information when you are analysing traffic later.

Solr (Search)

Moodle has some basic search functionality baked in, which is easy to use, but it does not index PDFs and other files for search terms. We set up the Solr search engine, which runs in the background and is accessed by Moodle’s cron to index new and modified content hourly. Setting this up can be achieved by following the Moodle Solr guide and this relevant Moodle forum discussion.

The Solr port for Windows uses Java (unfortunately), so you have to install a JRE. You can then install Solr 5.5.5 from the Solr Downloads page (see also this Solr Guide).

You need to download the PHP extension DLL from this Forum page or PECL page, depending on your version of PHP.

There are some tricks to get Solr to work with Moodle. In the file server\solr\moodle\conf\managed-schema under the Solr install folder, you have to comment out the following lines (using XML comments, which work like HTML comments).

<field name="_text_" type="text_general" indexed="true" stored="false" multiValued="true"/>
<copyField source="*" dest="_text_"/>

We also had to increase the number of items that could be searched, otherwise admins and users with broad access to lots of content will face errors when searching. In the file server\solr\moodle\conf\solrconfig.xml we changed the maxBooleanClauses value to 524,288 (32,000 wasn’t enough).

<maxBooleanClauses>524288</maxBooleanClauses>

The Solr engine doesn’t run as a service, so in Windows we added a scheduled task to start the program (bin\solr.cmd with arguments start -m 2g) at startup and keep it running (checking hourly). It seems to run happily without our intervention.

A search result based on PDF content

Unoconv (PDF rendering)

One of Moodle’s nicest features is the ability to render documents so teachers can annotate them during marking. We tried GhostScript, which has worked in the past, but it produced errors for us. One alternative is Google’s Document converter, but this is slow when large files have to be sent for rendering and returned. Another alternative is Unoconv, which works with LibreOffice (guide).

  1. Download and install LibreOffice.
  2. Download the Unoconv source code zip from Github, extract the unoconv script, rename it to unoconv.py and store it in C:\unoconv\.
  3. Create a unoconv.bat file in C:\unoconv\.
  4. Add paths to LibreOffice’s python.exe and the unoconv.bat file in Moodle’s config.php file (see config.dist for examples and the sketch below).
  5. In Site admin > Plugins > Document converters > Manage document converters, unhide the Unoconv document converter and promote it to the top.

A rendered PDF assignment, ready for annotation
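The config.php entries for step 4 look something like this (illustrative paths matching the folders above; adjust to your install):

$CFG->pathtopython = 'C:\\Program Files\\LibreOffice\\program\\python.exe';
$CFG->pathtounoconv = 'C:\\unoconv\\unoconv.bat';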

Moodle Setup

Advanced Features, etc

The following table shows where we have deviated from defaults and why.

Feature | On/Off | Why
enableblogs | unchecked | Not needed in School at this stage
enableplagiarism | checked | Used with TurnItIn
enablebadges | unchecked | Possibly useful later, but a big step initially
enableglobalsearch | checked | A requirement identified by stakeholders
core_competency | unchecked | Possibly useful later, but a big step initially
contextlocking | checked | Allows historical courses to be kept in read-only mode (frozen)

Security and Privacy

A number of measures can be taken to secure the Moodle setup as advised by the Security overview report (Site admin > Reports), which includes links to guides for each security check.

A security measure you will want to undertake is to fix the paths to system files that can be viewed in Site admin > Server > System paths; this can be done by adding these settings in your config.php file (sketched below). Fixing these prevents someone who gains admin access on the front end from modifying them to gain access to back-end processes.
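As a sketch, the config.php entries look like this (paths are placeholders; preventexecpath stops the front end overriding whatever is set here):

$CFG->preventexecpath = true;
$CFG->pathtophp = 'C:\\php\\php.exe';
$CFG->pathtodu = '';
$CFG->pathtogs = '';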

To secure the site, we’ve reduced the capabilities of the Authenticated User role, which is the base role for most other roles. A good way to secure your permissions is to edit the Authenticated user permissions (Site admin > Users > Permissions > Define roles), looking under the Risks column and changing anything with an XSS or Configuration risk icon to Prevent or Prohibit.

Prevent means it can be overridden by a more specific role, like a Teacher, while Prohibit essentially means only administrators can use that; be liberal with Prevent, but consider using Prohibit carefully as it can break the experience for users unintentionally.

Being outside Europe, we’re not subject to strict rules for privacy. We’ve therefore turned off the tool that allows users to automatically delete their personal information (automaticdeletionrequests) and the display of the data retention summary (showdataretentionsummary).

User Roles

In order to get Moodle to work the way you need it to in your School, you will need to make changes to all roles and set up some new ones.

General Changes

We made a number of changes to the Authenticated user role to control the user experience. This is partly because we have a student information system (SIS) that is the source-of-truth for identity information and because enrolments are defined by timetables.

Capability | Permission | Why
Prevent users from viewing courses without participation | moodle/course:view → prevent | Users in the School should only see courses they are enrolled in through the timetable
Prevent users from browsing courses unless explicitly given that capability | category:viewcourselist → prevent | Only staff can browse. Parents and students should only see what they’re enrolled in
Prevent users from seeing the participants list unless explicitly given that capability | moodle/site:viewparticipants, moodle/course:viewparticipants → prevent | Parents and students should not be able to see other student details
Prevent suggesting courses | moodle/course:request → prohibit | Only admins can create courses based on the timetable
Prevent sending messages to any user | moodle/site:sendmessage → prohibit | Another channel for messages was not wanted
Prevent password changes | moodle/user:changeownpassword → prohibit | Users log in through SSO using their School password
Prevent users from editing own profile | moodle/user:editownprofile → prohibit | User info is synced from the SIS
Prevent adding blocks on own profile page | moodle/user:manageownblocks → prohibit | The user profile page is used to show parents custom user information
Prevent course category changes | moodle/course:changecategory → prohibit | Organisation of courses is set according to School departments
Prevent course renaming | moodle/course:changefullname, moodle/course:changeidnumber, moodle/course:changeshortname → prohibit | Course naming follows set patterns and is needed for syncing
Prevent direct grade editing | moodle/grade:edit → prohibit | Grades are synced, only allow grading through activities
Turn off Private files | moodle/user:manageownfiles, repository/user:view → prohibit | Users have cloud storage accounts

Changes to other standard roles

Role | Change | Capability → Permission | Why
Manager | Prevent login as | moodle/user:loginas → Prevent | Only admins should be able to do this
Manager, Teacher | Allow editing of student submissions | mod/assign:editothersubmission → Allow | Teachers do this for students in a school
Non-editing teacher | Allow teachers to import from a course they can see (assumes they are a teacher in the destination course) | moodle/backup:backuptargetimport → Allow | Allows collaboration
Non-editing teacher | Allow non-editing teachers to see hidden course content | moodle/course:viewhiddensections → Allow | Teachers can see hidden content in another teacher’s course
Student | Prevent students seeing the participants list | moodle/site:viewparticipants, moodle/course:viewparticipants → Prevent | Only teachers should see who is enrolled

Browsers

As a means of simplifying navigation, we limit course enrolments to users directly involved in courses, so users do not see other courses in their Boost navigation bar. However, teachers want to be able to browse within their own parts of the School, and some staff, such as learning support staff, need to browse to a student’s course in order to support them. We created a “Browser” role that is equivalent to a non-editing teacher but is assigned at category level. We automatically sync teachers as Browsers in their parts of the School, providing access without affecting course enrolments.
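
The sync itself is custom, but each assignment is a single core API call; a sketch (the role shortname and category idnumber are our own hypothetical conventions):

    // Assign our custom Browser role to a teacher at a category context.
    // $teacher is a user record from the sync source (assumed).
    $browserid  = $DB->get_field('role', 'id', ['shortname' => 'browser'], MUST_EXIST);
    $categoryid = $DB->get_field('course_categories', 'id',
        ['idnumber' => 'senior-maths'], MUST_EXIST); // hypothetical idnumber
    $context = context_coursecat::instance($categoryid);
    role_assign($browserid, $teacher->id, $context->id);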

Parents

Moodle was first created to support education at the tertiary level and, although it is used in other sectors, that origin shows through in how parents are handled: normally parents can only see their children’s profiles and are not allowed into courses. We don’t want guests roaming freely around the system, and we don’t want to enrol parents in courses as that pollutes the participants list and confuses marking lists. We have created a Parent role in the normal way, but we allow it to be assigned at category level. We then sync parents to the categories containing courses their children are enrolled in. We control their access to specific courses within these categories using a customised version of the Mentees block on the site Home and student profile pages, which shows the courses their children are enrolled in and allows direct access. This is not a perfect solution, but it will work until Moodle understands parents better.

Parents are themselves enrolled in a number of “Brochure” courses depending on the part of the School their children are enrolled in. These courses allow posting of general academic and co-curricular information and also act as a means of sending targeted announcements.

User Photos

Getting user photos into the system is relatively simple. We have a script that exports staff and student photos from our SIS into a folder, each with a filename based on the user’s ID. We zip all the photos and drop the zip file into the form at Site admin > Users > Upload user pictures, repeating this annually after photo day.
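
The export script is specific to our SIS, but the zipping step can be as simple as this sketch (folder locations are examples):

    <?php
    // Zip a folder of photos named <userid>.jpg for upload at
    // Site admin > Users > Upload user pictures.
    $source  = '/data/sis-photos';
    $zipfile = '/data/user-photos.zip';

    $zip = new ZipArchive();
    if ($zip->open($zipfile, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        exit("Cannot create $zipfile\n");
    }
    foreach (glob($source . '/*.jpg') as $path) {
        $zip->addFile($path, basename($path)); // keep the flat structure: 1234.jpg
    }
    $zip->close();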

Plugins

Moodle has thousands of contributed plugins that you can add to your site; this is one of its strengths. Be cautious about adding plugins, however, as developers are volunteers, and if they stop developing your favourite plugin, you may be burdened with the responsibility of maintaining it through future Moodle upgrades. Look for plugins created by developers who are active and respond to user questions. I co-wrote a book with Gavin Henrick about choosing plugins and, although the list of plugins is aging, the first few chapters about evaluating plugins are still relevant.

The following is the list of plugins we use (with configuration changes we made). This does not include plugins we have developed ourselves.

  • Accessibility Block
  • Atto Fullscreen
    • editor_atto | toolbar → other = html, fullscreen
  • Checklist advanced grading
  • Clickview (editor plugins and activity module)
    • editor_atto | toolbar + clickview (Add Clickview button to Atto editor toolbar in files section)
    • replaced both icons in /lib/editor/atto/plugins/clickview/pix with B&W version 
  • Group self select
  • Media Gallery Set
  • MailTest (admin tool)
  • Moodle Benchmark (admin tool)
  • Portfolio (GI Portfolio tool)
  • Sharing Cart
    • block_sharing_cart | userdata_copyable_modtypes | all → checked
    • block_sharing_cart | gapselect | all → unchecked
  • SolutionSheet (Assign feedback)
  • OUWiki (Used in place of the default wiki activity)
  • Tiles course format (our default course format)
    • format_tiles | followthemecolour → checked
    • format_tiles | phototilesaltstyle → checked
    • format_tiles | showprogresssphototiles → unchecked
    • format_tiles | assumedatastoreconsent → checked
    • format_tiles | showoverallprogress → unchecked
    • format_tiles | phototiletitletransarency → 30%
    • format_tiles | customcss → 
      div#abovetiles { display: block; float: right; width: auto !important; }
      ul.section.img-text.nosubtiles { margin: 0; padding: 0; }
      .format-tiles .modal-backdrop.fade.in { display: none !important; }
      .format-tiles .embed_cm_modal a {color: #005b94;}

We have turned off the following standard activities.

  • Chat
  • IMS Content Package
  • External Tool
  • SCORM Package
  • Survey
  • Wiki (using OU Wiki instead)

The following blocks are disabled.

  • Blog Menu
  • Blog Tags
  • Community Finder
  • Courses
  • Flickr
  • Global search (in theme at top of page)
  • Latest Announcements (we use our own)
  • Latest Badges
  • Login
  • Navigation
  • Private files
  • Recent Blog Entries

For plagiarism detection, we use TurnItIn. There are alternatives, but TurnItIn was a pre-existing system used by the School, so it was easy to transition it over. TurnItIn controls distribution of its plugin in a deliberately confusing way; you can try this guide and seek further support from TurnItIn if you want to go that way.

In terms of text editors, we have disabled the TinyMCE editor and rely on the more accessibility-friendly Atto editor. For the RecordRTC plugin, we’ve increased the time limit to 300 seconds (5 minutes). For the Table plugin, we’ve allowed border styling.

Our Front Page

The default landing page for Moodle is the Dashboard page. This makes sense when students are the main audience, but in a School the landing page is used by a wider audience, including parents. In our School, the landing page also acts as a portal to other systems and a channel for communications, so we needed it to be consistent. For this reason, we set the Default home page setting to the Site home, and we’ve actually hidden the Dashboard using a CSS tweak in our theme. It is also just simpler to have one landing page.

Talking about themes, we use a child theme of the default Boost theme (guide). This means we benefit from improvements to Boost while allowing us to make customisations as needed. As well as customising colours, we are able to add additional elements, such as block regions, and hide elements that are difficult to disable through admin settings (like messaging controls). The result is a very clean interface.

Course creation and organisation

Our organisation of categories and courses was set up to reflect the organisation of the School itself, giving a natural way of browsing to courses (most users only see courses they are directly enrolled in).

Our School caters for students from early learning to year 12. The School is divided into two main parts: Primary and Senior (High School). Within each, there are Academic and Co-curricular courses. The Senior School uses Houses to organise students for pastoral care. There is also an overarching Community category and a category for Staff. The categories are therefore organised as follows, each with courses inside.

  • Community
  • Primary School
    • Primary Academic
    • Primary Co-curricular
  • Senior School
    • Senior Academic
      • (Department eg Mathematics, Science, etc)
    • Senior Co-Curricular
      • Activities
      • Arts
      • Outdoor Ed
      • Sports
      • Trips
    • Senior Houses
  • Staff

We also have year categories (2019, 2020, …) that allow us to archive courses when they end. The structure of courses within these year categories matches the categories listed above, mostly for administrative convenience, as users can’t browse to these courses and can only get to them if they are enrolled.

We create new courses each teaching period. For some courses this is six months, for most it is a year, and for some year 11 and 12 courses it is two years. In order to uniquely identify each course, a naming convention is used.

Fullname template: <Subject> <Year/Award> [<Level>] <End Year> [<Semester>]
Fullname examples:
  • English Year 9 2019
  • English IB HL 2020
  • English HSC Extension 1 2021
  • Geography Year 10 2020 Sem 1
Shortname template: <Subject abbrev.> <Year number/Award> [<Level abbrev.>] <End Year> [<Semester number>]
Shortname examples:
  • Maths 8 2019
  • Ancient Hist HSC Ext1 2020
  • Geography 10 2020 1

Data Syncing

To be more than a trivial, stand-alone system, Moodle needs to be integrated with systems that can provide user information. We rely on database table views as the interface for communicating this information, populating the views using scripts that draw on our SIS data. Moodle provides some syncing tools out of the box; others we have created ourselves.

  • External Database Authentication (Site Admin > Plugins > Authentication > Manage authentication)
    • We sync most user fields this way and have added campus and year values for students as custom profile fields
  • External Database Enrolments (Site Admin > Plugins > Enrolments > Manage enrol plugins)
    • We sync student, teacher and parent enrolments in actual courses. For students and teachers, this is based on the timetable. For all users, enrolment in brochure courses is based on the part of the School in which students are enrolled or staff are employed.
    • We turned off automatic course creation to ensure control over new courses, just in case.
  • Group Syncing (custom)
    • Because of our course organisation, students enrolled in a course may be in different classes with different teachers. To allow teachers to distinguish their own class for assessment and communication, groups are automatically set up for each class. To avoid interfering with manually created groups, the automatically created ones are associated with a specific grouping (see the sketch after this list).
  • Mentors (custom)
    • The parent-child relationship uses the generic mentor association in Moodle. We have created a plugin that populates these relationships automatically.
  • Category and System roles (custom)
    • To assign Parent and Browser roles at category and system levels, we have set up a sync that populates these. This allows parents to see their children’s courses, teachers to browse courses within their department, learning support staff to browse into students’ courses and School leaders to be given Manager roles depending on their job position.
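
For the group sync mentioned above, the heavy lifting is done by Moodle’s core groups API; here is a simplified sketch of the per-class logic (the function shape and the “timetable” idnumber convention are assumptions, not our actual plugin code):

    <?php
    // Ensure a group exists for a timetabled class, kept inside a dedicated
    // grouping so that manually created groups are left untouched.
    require_once($CFG->dirroot . '/group/lib.php');

    function ensure_class_group(int $courseid, string $classname, array $userids) {
        global $DB;
        $grouping = $DB->get_record('groupings',
            ['courseid' => $courseid, 'idnumber' => 'timetable']);
        $groupingid = $grouping ? $grouping->id : groups_create_grouping((object)[
            'courseid' => $courseid, 'name' => 'Timetable classes',
            'idnumber' => 'timetable']);

        $group = $DB->get_record('groups',
            ['courseid' => $courseid, 'name' => $classname]);
        $groupid = $group ? $group->id : groups_create_group((object)[
            'courseid' => $courseid, 'name' => $classname]);

        groups_assign_grouping($groupingid, $groupid); // no-op if already assigned
        foreach ($userids as $userid) {
            groups_add_member($groupid, $userid);      // no-op if already a member
        }
    }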

We are also working on grade syncing between Moodle’s gradebook and our SIS to streamline the flow of grade information.

Lessons Learned

Keep things together on your network

When we started setting up our VMs, we had them in different parts of our network. Our webserver VM was in the DMZ, but the database server was inside our network. Traffic between the VMs had to traverse our firewall, which created sync issues and a lot of errors. Bringing them together on the network solved these problems.

Don’t move your server

As we migrated systems from our old LMS (Blackboard) to Moodle, the old system was still in use. Our intention was to set up the new system and then simply move it to the same URL as the old system, so users would arrive at the new system automatically. The new system was called next.cgs.act.edu.au, but we then needed to shift it to connect.cgs.act.edu.au. This caused some big problems, both within the system and with our DNS registration, leading to an unwanted outage just when we wanted people to start using the new system. A better approach would have been to set up the system on a new URL and redirect traffic from the old.

Migration takes a lot of time

Moodle does have some tools that let you import content from other systems. We thought that, since so many people had shifted from Bb to Moodle, the process would surely be simple. Our initial experiments with Common Cartridge showed that what Bb delivered was a large pile of mess. Cleaning this up took more time than manually importing content, and that manual touch led to better courses in the end. Translating Bb’s many-layered courses into Moodle’s flat course structure was also tricky.

Our School had determined that the transition between systems should be done quickly, without bothering teachers, and with all historical content migrated for future use. With the help of a few good recruits who were familiar with Moodle, we managed to deliver a migrated set of courses; however, it was late, which negatively affected the change experience for many users.

Be sure to carefully measure how much content you need to migrate, give yourself plenty of time to migrate content and be transparent about migration progress with users.

An LMS Transition Project

Since I arrived at Canberra Grammar School, an LMS transition seemed to be on the cards. Engagement with Blackboard, the incumbent system, was low and anecdotal reports suggested dissatisfaction.

A project with process

I had worked on a number of large projects built around an EdTech Project Management Process derived from PMBOK, and in 2018 the go-ahead was given for an LMS project. It was estimated that a proper transition would take around two years, an unprecedented length for an IT project in the School, but we were going to do it right.

Determining the need for change

The first step was determining the need for change. Before we could commit to an expensive change, we wanted to know objectively that it was necessary. At CGS, the LMS is used for three main purposes: as a learning environment, as a means of making announcements and as a portal to access other systems. A survey was created to gauge whether users felt confident, enabled and satisfied with the incumbent system across these purposes. We also asked users to identify how they were enabled and blocked in their use. The survey was voluntary but yielded a 61% response rate, which allowed confidence in the results. In relation to use as a learning environment, the findings showed users were confident, but most did not feel enabled and half were not satisfied. Similar results were found for the other purposes of the system.

The initial attitudinal survey suggested that a change was warranted and identified a number of deficiencies to overcome in a replacement, such as the interface and ease of use.

Planning

Driving Questions

With the project determined as necessary, planning began. Driving questions had to be answered to identify the (mostly pedagogical) needs, which included an improved online teaching environment, refined flows of assessment and reporting information, the potential to collect data for analytics, a focused communication system and a portal to link to other systems.

Coordination

Some history of the previous system was compiled for context and a rough schedule was started.

Consultation

The people who were going to be involved in the project were identified including:

  • A Project Leader and Data Owner (an executive responsible for strategic change and respected teacher),
  • A Consultative Committee (the standing EdTech Committee),
  • A Change Manager (myself) and
  • A broad range of Stakeholder Groups.

A RACI Matrix was drawn up to ensure the project members (and, implicitly, others involved) knew what level of responsibility they had in the project.

Because the project was going to affect many users, the stakeholder groups consisted of a broad range of staff (Primary and High School teachers, pastoral leaders, executives, communications and support staff), students and parents, forming nine groups in all.

Scope

In the case of this project, it was important to identify which parts of the greater system were being reviewed. The branding we use for the system (“CGS Connect”), the consistent theming across systems and the transparent linking between systems meant that the boundaries between systems were not obvious to all users, including executive staff. Some areas outside the system were thought by some to be in need of change, so limiting the scope to the LMS, announcements and portal allowed the project to focus on a single system change.

With the scope set, it was possible to define the deliverables and objectives of the change and to describe exclusions from the change.

Time Management

The earlier rough schedule was further developed with dependencies added. Stages like Migration and the creation of new courses were added. A period of longevity of 3 to 5 years was suggested before a subsequent review.

Cost Management

Before starting to look at alternatives, it was worth defining a budget for the project. The main differentiation was going to be whether we paid for an externally hosted system or hosted our own, with customisations and development.

  • Staff time costs were going to be significant for training. If we were going to make customisations, then development staff time would also be needed.
  • Migration would be a large up-front cost and some budget was set aside for consultation.
  • Depending on the system chosen, ongoing costs could vary, but an estimated budget was set so that system costs could be compared against it. If the system was self-hosted, those ongoing costs would be incurred locally, as opposed to being paid to an organisation hosting the system externally.

Quality Management

Before starting the change, a number of quality measures were set. The reason for doing this up front was to allow benchmark measurements to be made of the incumbent system. Some of the measures related to system use that could be counted or timed; some related to users’ attitudes, which would come through user testing and surveys.

A list of risks was drawn up, each with an estimate of probability and means of mitigation.

  • Development overrun
  • Migration overrun
  • Resistance to adoption
  • Focus on other projects

Considering these risks early was useful as each of them became relevant at stages of the project. It was good to have them stated in the plan and known to executives, so the project could be backed and prioritised when needed.

Communications Plan

Being a significant change, there was a long list of messages that needed to be communicated. Each message was defined with rough content, who would be responsible for sending it, who the recipients would be and when it would be sent. The key goal of communications was to preempt the change in the minds of users and develop a sense of ownership of the new system.

Messages included notifications of coming change, invitations to be involved, demonstrations of functionality and project status reports. Recipients included various groups including staff, students and parents at various stages of the project. Messages were delivered by announcement email, at meetings and through written reports.

Procurement

With initial planning out of the way, the bulk of the remaining planning time was spent comparing alternatives. The goal was to allow stakeholders to provide input from their perspective, to feel they had contributed to the choosing of the new system and, ultimately, to feel ownership of that system.

Sessions were held with each of the stakeholder groups. Based on an extensive list of possible LMS features, stakeholder groups collectively identified and prioritised requirements, and each group’s requirements were amalgamated. We used an electronic survey that allowed people to designate each of 89 possible requirements as “Unnecessary”, “Useful sometimes”, “Often useful” or “Essential”. Comments and general feedback were also collected for later consideration.

Each response was then given a weighted value with values averaged within groups and then across groups, giving each stakeholder group equal representation. The top 30 requirements became the criteria for comparing alternative systems.
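
As a toy illustration of that weighting (the 0–3 values per response level are an assumption, not our exact figures):

    <?php
    // Average weighted responses within each group, then across groups,
    // so each stakeholder group carries equal weight regardless of size.
    $values = ['Unnecessary' => 0, 'Useful sometimes' => 1,
               'Often useful' => 2, 'Essential' => 3];

    function requirement_priority(array $responsesbygroup, array $values): float {
        $groupmeans = [];
        foreach ($responsesbygroup as $responses) {
            $total = 0;
            foreach ($responses as $response) {
                $total += $values[$response];
            }
            $groupmeans[] = $total / count($responses);
        }
        return array_sum($groupmeans) / count($groupmeans);
    }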

Top 30 Requirements (by importance)

  1. Obvious, consistent navigation
  2. Search functionality
  3. Single sign-on integration
  4. Usable on small screens/touch screens
  5. Ability to attach or link files to announcements
  6. Mobile notifications for announcements
  7. Linking to Google Drive files
  8. A tool for promoting School events
  9. Ability to use system when Internet is unavailable
  10. Up-to-date class/group enrolment information
  11. Greater control over announcement formatting
  12. Context specific landing page for different users
  13. Sharing sound/photos/videos
  14. A means of communicating with staff
  15. A means of parent communication
  16. Drag-and-drop resource sharing
  17. Linking to/embedding web pages
  18. Ability to select announcement audience in a granular way
  19. Accessibility aids for visually impaired
  20. Online archive of announcements
  21. Dedicated mobile app for activities
  22. Scheduling of future announcements
  23. A means of communicating with groups of students
  24. Understanding of timetables
  25. Multi-language support
  26. Surveys/forms to gather student feedback
  27. Integration with School calendars
  28. Ability to re-use content
  29. Linking to Google Docs, Sheets, etc
  30. Guest access for parents and other staff

There were numerous alternatives available. Armed with users’ criteria, a number of systems could be eliminated because they lacked numerous necessary features. The remaining shortlisted systems were then scored numerically against the criteria, a process that required lengthy investigation and consultation with vendors.

For each criterion, a value was given as follows.

  • 3 for a mature feature meeting the criterion
  • 2 for partly meeting the criterion
  • 1 for hard-to-apply or unproven against the criterion
  • 0 for features absent or not advertised

Values were then weighted against the inverse value of the criterion (30 points for the highest priority down to 1 for the lowest) and then summed, leading to the following scores.

  • ManageBac – 1891
  • (unnamed system) – 2192
  • SEQTA – 2198
  • (unnamed system) – 2401
  • Canvas – 2930
  • Schoolbox – 3160
  • Moodle – 3161
System scores based on weighted criteria
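
To make the weighting arithmetic concrete, here is a toy sketch (the fit values are invented, and the published totals evidently aggregate further inputs):

    <?php
    // The criterion ranked 1 (top) carries weight 30; rank 30 carries weight 1.
    // Each system receives a fit value of 0–3 per criterion, as listed above.
    $fits = [3, 2, 3, 1, 0]; // one value per ranked criterion, top first

    $score = 0;
    foreach ($fits as $index => $fit) { // $index 0 is the top criterion
        $weight = 30 - $index;          // 30 down to 1
        $score += $weight * $fit;
    }
    echo $score;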

The three highest-scoring systems, Canvas, Schoolbox and Moodle, were selected for trialling. The challenge was to create an objective trial that would allow users to select a system to implement.

Technical criteria were also defined, including integration with our pre-existing systems and the ability to theme and organise spaces in the system. Each of the trialled systems was able to accommodate these criteria to varying degrees; some needed to be judged against the technical criteria through trialling.

Trialling – A blind taste test

An instance of each of the three systems was established and populated with representative course content. Each system was themed with School branding and obvious system naming was avoided to attempt blind evaluations.

Test scripts were created that covered most of the criteria, with some tests combining multiple criteria. Test steps were defined for each of the three systems in an electronic form that led the user through the test. Staff and parent representatives were invited to complete a relevant test across each system and to rate their experience as “Worked”, “Worked, but was cumbersome” or “Didn’t work”, together with free-form comments. Over 100 tests were completed by users, leading to a solid set of opinions, with Moodle proving most functional, followed by Schoolbox and Canvas.

Students were involved in the trials by another method. A group of Primary students and a group of High School students, each actively involved in computer-related activities, were given the opportunity to experience each system and run through some tests. A selection of the volunteers was interviewed through a talk-aloud process while we observed their actions in each system. Students seemed happy to explore the systems freely. Colour and imagery proved attractive to students and influenced their opinions about the functionality of the systems, sometimes in a way that contradicted the difficulties they actually experienced. This was noted for the later implementation.

Technical criteria were also applied to each system and Moodle was seen to be the best technical fit for the School. The School’s executive also appreciated the potential to customise the LMS in a way that would set the School apart, while understanding the cost and risks associated with hosting an open source system.

Based on scoring against users’ criteria, blind testing, technical criteria and the blessing of executives, Moodle was chosen as the system to implement.

Implementation

With a system chosen, implementation began, involving setup, configuration and customisation. A test and production instance were established so changes could be tested before being deployed. I will create another post or two describing how we have set up our Moodle instance in detail, so others can benefit from that experience.

In summary, choosing to implement and host an LMS meant that more work would need to happen locally, rather than relying on outside help. It was worth doing, but it was a challenge.

Roadshows and Piloting

With the transition set for the turn of the year, and our customisations still to build, we had six months to mentally prepare users for the change. A roadshow of demonstrations was established at weekly staff meetings to show functionality and develop enthusiasm. Volunteers from different parts of the School were given spaces for teaching, which also helped refine the organisation and configuration of the system.

Communications were sent out through various means to inform the community about the system and the coming change.

Migration

In order to minimise disruption to teaching, content from the previous system needed to be migrated. Automated methods proved flawed and resulted in courses that did not resemble the source, so it was determined that manual migration work would be needed. Assistance was sought from outside organisations. One organisation pulled out at the last minute, and additional assistance was found in private individuals. It was later found that the second outside organisation was delegating migration work to inexperienced users, creating work that had to be repeated. The aim of completing the migration before the first training sessions was not met, and migration activities continued over the end-of-year break.

Training

Fighting for time with teachers proved difficult. Before the end-of-year break a training session was conducted with the entire staff, which was counter-productive as users had many perspectives and some did not have migrated content to work with. With lessons learned, training after the break, before the return of students, was conducted with smaller, focused groups and proved reasonably successful. A series of subsequent voluntary CPL sessions was interrupted by the advent of COVID-19.

End Results

We’re now winding down the project, but planning to make continual improvements. When our School closed due to the COVID-19 pandemic and regular learning became Remote Learning, the recent training for teachers meant they were more prepared than they would have been if we hadn’t recently transitioned. As the system became the primary modality for teaching, engagement in the system increased dramatically.

We have achieved almost all of the distinguishing criteria in the new system, which will hopefully deliver more for the School than the other systems would have.

An EdTech Project Management Process for Schools

When I started as a leader responsible for Educational Technology projects in a school, I lacked a framework to work within. Having come from the software development world, I understood how to develop systems, however, selecting and implementing systems is quite different.

I looked into a few frameworks for system change management and found that PMBOK (the Project Management Body of Knowledge) could be adapted to EdTech projects in schools. PMBOK is a general project framework, so I set about writing a process guide and template based on it, incorporating a number of software and educational models to give it specific relevance to schools.

In 2018 I ran a workshop at the AIS ICT Leadership conference to share my version of this process and it was well received. In summary, the process involves a number of work areas for planning, execution and change control as shown in the following diagram.

When working with a project team, a template can be used as the basis of a collaborative document to work through the planning work areas.

The most involved area is the Procurement area, which involves consultation to determine requirements, transforming these into prioritised criteria then setting a cut-off for essential criteria.

The documents below describe the process in detail including a comprehensive guide, a project plan template and the slides from my workshop (with examples of a software and hardware project in a school).

I’ve since heard back from other schools who have applied the process successfully.

I’m sharing it here now so that I can refer back to this process as I describe an LMS transition project we have undertaken over the last two years in subsequent posts.

Myths about Analytics in Schools

At recent school-level (K-12) ed-tech conferences I’ve witnessed a larger than expected amount of fear-mongering, prognostication and exaggeration. There have also been a great number of presentations about analytics, pronouncing many data-related technologies as “here now” or impending when they are arguably not yet achieved. I thought it was worthwhile scrutinising some of these claims.

My critique is likely to become outdated in the near future (at least I hope it will) but is intended to be a general reflection of the state of analytics in schools in 2017.

Myth 1: “We have analytics”

I have seen a number of people claiming student-data-related reports are analytics. What defines analytics is the analysis of trends, usually relating to behaviours, to allow prediction. I would also add that the point of analytics is to promote proactive responses. Anything less than this is simply a report, regardless of how many graphs are included.

Education Queensland’s Oneschool Dashboard

Myth 2: “Build it and they will come”

Another claim I have noted is the prediction that, with “analytics” in hand (or more accurately reports as I have seen), teachers will transform education. Simply providing more information to time-poor educators is unlikely to encourage change.

From the movie Field of Dreams

Where analytics has the potential to encourage positive change in education is in highlighting where action is needed and prompting teachers to undertake that action. Analytics tools need to follow trends silently in the background, incorporating new information as it becomes available, making predictions and proactively prompting action when thresholds are passed.

Myth 3: “We have too much data”

As the technology of analytics filters down from the Web to higher education and towards schools, some of the rhetoric about “big data” is naturally transmitted along with those ideas. However, in schools, there is not really a large number of rich data streams to be compared.

Student data

In higher education, analytics is employed to track participation and submissions, primarily to identify “students at risk”, as this relates to drop-outs and placement funding. Student activity in higher education is focused on the LMS, where most document sharing and assessment takes place. It is a focused, rich source of behavioural data.

In schools, blended learning will remain a focus for the foreseeable future. Also, the purpose of analytics in schools is more about improving student outcomes. The set of data streams is quite different at these earlier years of education. Attendance is the richest source of data, but even that is prone to errors and anomalies. Some schools have LMSs, but utilisation varies, making it difficult to compare students or even focus on a single student across courses. Common assessment information tends to be summative and describes learning across periods such as terms or semesters, not days or weeks. In order for analytics to be feasible, schools need to mandate more frequent points of electronic assessment and additional streams of information need to be added, such as pastoral and attitudinal information.

Ultimately, I think we still have a way to go.

Organising Selection for an IT Position

Over my career I’ve been involved in interviewing and selecting new staff for IT positions on numerous occasions. I’ve learned a few tricks along the way and thought I should share them. A lot of these techniques generalise to positions peripheral to IT and elsewhere.

Choosing a candidate

The Panel

It helps to have more than one person doing the interviewing; two is OK, three is ideal, four can be intimidating. If the position is really serious and more people need to be involved, create two panels with different foci.

Aim for diverse perspectives among the panel members. If you’re a manager, involve a technical staff member and a support staff member, such as someone from HR.

Preparing

  • Short-listing
    Without going into too much detail, the panel that will interview should be the ones selecting who is interviewed. Start by separately and blindly reviewing all the CVs, then bring your opinions together in a collaborative space, such as a shared spreadsheet.
  • Discuss candidates openly
    After each panel member has rated candidates, come together to decide who to interview. Be open to disagreement as others may have spotted potential that you have not seen. Consider rounds of interviews with the most likely candidates first.
  • Invitations
    When you have a list of candidates, you need to invite them in.

    • Negotiating a time is best achieved over the phone. Offer the candidate opportunities within a specific window, but be accommodating.
    • Once a time is set, send a formal invitation that introduces the panel and their positions; this establishes perspectives for the candidate. Set expectations for where to go, when to arrive, what to wear and how long the interview will take. You may want to prompt the candidate to undertake some research into your organisation by directing them to online resources and work spaces.
  • The script
    It’s good to have a set series of questions going into the interview. All panel members should agree on the script before interviews start. Use a common document with names beside each question (rather than each panel member having their own script); this allows you to pass the flow of questions between panel members. If you have a script from a previous position, review the questions and ensure they are relevant to the current position. The script can be duplicated for each candidate so that notes can be inserted during the interview by someone not asking the current questions.
  • Quick recap before interview
    Before an interview, all panel members should take a few minutes to review the candidate’s CV. Discuss their strengths and peculiarities so that you can focus questions during the interview.
  • Hospitality
    In a sense, your organisation is being interviewed as well as the candidate. You want the best candidate, who could possibly go elsewhere, to choose you. Simple things will help, like:

    • tidying the space where the interview will take place,
    • ensuring the temperature is comfortable and
    • having glasses and water poured for the candidate and the panel.
  • Everyone shakes hands
    Allow the opportunity for each member of the panel to shake hands with the candidate. That first physical contact is disarming and will establish what could be a future working relationship.
  • Seating
    Don’t arrange seating in a way that is confrontational, such as sitting on the opposite side of a big table from the candidate; a small table is better, with the candidate as part of a formation that is inclusive, like a circle.

The Interview

Repeat Introductions

Start by reintroducing the panel and what they do. This can be quick, but is important to preface the questions you will ask later. The panel leader can do this or each panel member can quickly say who they are and what their role is.

Questions

The candidate’s CV will tell you about their education, their experience and their skills, but it won’t tell you what kind of person they are, how well they will work with you or how they can apply their skills. You want a good script of questions that tease these important aspects from the candidate’s brain.


  • Icebreakers
    Candidates will be mentally prepared to convince you about their professional worth, but don’t jump straight into serious questions. Start by allowing the candidate to settle in and feel comfortable. A good way to achieve this is to ask the candidate to talk about their personal life; if they start drifting into work and skills, redirect them by saying you will get to that soon.

    • How did you come to be here in ___ ?
    • Tell us a bit about yourself as a person. What do you do in your spare time?
    • Tell us about your study. What inspired you to get into IT?
  • Focused career questions
    You want the candidate to tell you about their experience, but you don’t want a litany that will take up all your interview time. Ask questions that allow the candidate to showcase themselves while highlighting aspects you are keen to hear about.

    • Without going into too much detail, tell us the places you have worked and your roles there.
    • (If applicable) Why are you leaving your current position?
    • What has motivated you to choose your career path?
    • What are some of the tasks you really enjoy doing?
  • Tell questions
    It’s hard to tell when people are being honest. One technique for eliciting humility and honesty is to ask the candidate to admit where they have failed. This may be counter to what the candidate is prepared for and it may be affected by cultural background, but it can give you a good idea of whether you want to work with that person. It’s a good way to distinguish potential assholes.

    • Can you think of a time when things did NOT work out the way you expected them to?
    • Can you tell me about a time you had a conflict with a colleague? How did you deal with it?
  • Focused skill questions
    You should be able to tell what skills a candidate has from their CV, but you want to know whether they have real experience or whether it was something they observed someone else doing.

    • Tell us about your experience with Active Directory.
    • Have you ever written documentation in a wiki? If not, what did you use?
    • Have you ever worked with an issue tracking system? How was it used?
  • Don’t forget the soft skills
    It’s easy to get stuck on technical skills for an IT job, but non-technical skills are really just as important in the day-to-day working of a successful team.

    • Have you worked as part of a team? What was your role?
    • What techniques do you use to manage your time?
    • How do you handle conflicting priorities?
  • A conundrum
    You want someone who can ‘think on their feet’ and consider alternative solutions. Posing a scenario that seems unsolvable at face value will prompt candidates to demonstrate their ability to think ‘outside the square’. The following example is for a service desk position in a school.

It’s been a busy day; you are feeling under pressure and a teacher calls you demanding that you set up an email account for a person who is not an employee but has come into their class to present.  This would be against the school’s policy, but you understand the teacher needs to make the class work. How would you deal with this situation?

  • Most candidates will start by stating that they cannot break policy, because they want to give you the impression they are honest workers, ready to follow the rules. Some might say they will seek permission from a manager to break the rules. A good candidate will recognise that problems are often not what they are first reported to be, and that probing into the client’s needs will allow them to reconsider the problem and create a solution or a workaround.
  • Questions about your organisation
    You want to know if the candidate is actually interested in and enthusiastic about working in your organisation. Give them the opportunity to share their research and how they have envisaged themselves in your organisation.

    • What do you know about ___ ?
    • What do you think it will be like working in a ___ ?
  • Prompt for their questions
    Allowing candidates to ask you questions is more than a courtesy; it allows the candidate to take control of the interview and demonstrate their strengths and knowledge by probing you about what you do, what technologies you use and how the organisation works. A good candidate will come with prepared questions.

    • Do you have any questions for us?
  • Formalities
    Don’t leave yourself open to surprises.

    • What are your obligations and availability?

If you are leading the panel, avoid keeping all the curly questions to yourself. Farming some complex questions out to another panel member allows you to focus on how the candidate is answering the question, following their body language and ‘reading between the lines’.

Flow

A smooth interview is not rushed, nor is it slow. With good flow, the interview can be comfortable and friendly and elicit the honest answers you are seeking.


  • Build up with some easy questions first.
  • Hand over between panel members when asking questions.
  • Ask questions from the panel member who has the perspective from which you want questions answered (personal from the manager, technical from the technician, organisational from HR).
  • Be adaptive.
    • Don’t stick to the script when you want to clarify or probe deeper.
    • Don’t ask questions that have already been answered.
    • Make questions specific on-the-fly.

After asking all of your questions, lead into a task…

Evidence

A candidate may say they have the skills you require, but it’s hard to judge to what degree that is true. Their CV may have been developed over time, with outside help. Every candidate will say they have good communication and problem solving skills; we all have a self-optimistic bias. Don’t be afraid to take some time to get the candidate to demonstrate their skills.


  • A role-play
    Pretend to be a client with a predetermined problem. Ask the candidate to put themselves into a support role and attempt to unravel the problem. Getting the answer is not as important as how they approach the problem.
  • A quick quiz
    Run the candidate through a quick quiz. You might throw together some basic questions in a Google form or online survey and ask them to provide their answers.
  • A writing task
    Being able to write clearly is an important skill for all IT workers. Set up a scenario and ask the candidate to respond to a pretend client. Writing a pretend email or ticket-update on a machine you provide is an easy way to run this task.
  • A dev task
    If the candidate is applying for a technical role, ask them to resolve a bug or simple problem. This may be something they have to do after they leave you and later submit the response back to you. Be sure the problem requires them to establish a dev environment close to your work environment.
  • A presentation
    If the candidate is applying for a role that involves training, ask them to run a quick training session on a simple technology. If you’re considering this task, you will need to give the candidate notice before the interview so they can realistically prepare.

Debrief

Assessment biases can creep in over time. You can glorify earlier candidates or favour candidates you have seen more recently. Reflecting immediately after each interview is recommended, even if this means delaying the next interview by a few minutes.

When you’ve seen all candidates, hopefully you’re in a good position to choose. If none of the candidates is suitable, consider re-advertising. If there is a candidate who is suitable but you’re not completely confident, remember that you can rely on a probation period if things don’t work out.

Don’t forget the unsuccessful candidates. Failing to respond respectfully to unsuccessful candidates puts the reputation of your organisation in danger, whereas an honest response with feedback that will help the candidate in future will be welcomed.