What do teachers want from IT Support?

Do we know what teachers want and does it really matter?

Among school IT managers, we often talk about providing secure and stable systems. Good IT service can be objectively assessed through ITIL and other approaches. But in an educational setting, understanding the IT needs and desires of users, who are predominantly teachers, is essential. Relationships between IT support staff and teachers can become siloed and confrontational without a strong understanding of what teachers expect from IT. I proposed a model of what teachers want from IT, based on my tacit understanding, and refined it with teachers at various levels of responsibility. This post is the result.

What teachers want doesn’t constitute all that IT professionals need to do in educational institutions. Teachers are not usually aware of planning, testing or maintenance. They do not worry about technical policies, vendor engagement or documentation. Yet all these things are needed and indirectly affect teachers. The reason IT professionals need to understand what teachers want is so that we can work together in a way that makes sense.

The relationship between teaching staff and an IT Team matures in stages. For example, IT staff may have an ingenious plan to improve student outcomes, but if teachers don’t trust IT systems, like AV in classrooms, they will not cooperate in such projects. Conversely, as IT teams deliver more trustworthy systems and successfully deliver change, teachers will be more likely to seek collaboration with IT staff on projects targeting student success. The following diagram depicts what teachers want in a graduated hierarchical sense. I don’t think teachers necessarily have this model in their mind, but I know that they won’t engage IT staff at higher levels if they are not satisfied with the lower levels first.

Trustworthy Systems

“Trust is earned in drops and lost in buckets.” (Attributed to multiple people)

When IT systems don’t work and this disrupts a teacher’s lesson plans, teachers lose trust in those systems. They will avoid using them or, when they can’t avoid them, they will become frustrated. It doesn’t matter to teachers whether the cause of an IT issue is infrastructure, the failing of the IT Team, an external provider or even the teaching staff themselves. Their focus is that they can’t do their job. If there is a series of issues over a period, not only will teachers distrust IT systems, they will also distrust IT staff to fix issues. They will also be less inclined to report issues, which compounds the problem.

Teaching staff want to walk into a classroom assuming their computer, the network, software and AV facilities will all work so they can deliver teaching. They are not interested in security, backups, system efficiency or the latest update; the focus of teachers is student learning.

To improve trust, IT teams can do the following.

  • Work on general system stability more than system expansion, and encourage third-party providers to do the same.
  • Proactively monitor systems using dashboards and automation to detect problems before they are reported (a minimal sketch of this kind of check follows this list).
  • Encourage a culture of reporting among staff by verbally asking them to let you know when they spot issues. Make it easy to make reports. Repeat this regularly at meetings. Thank reporters for taking the time to report (especially less confident staff, who may be unsure whether the issue is their fault).
  • Create a collaboration culture in the IT Team. I use the phrase “working alongside teachers” and encourage IT staff to think of themselves, indirectly, as educators. I regularly discourage defensiveness about system issues.
  • Get out of the IT Office and into staffrooms, even when there are no problems. This gives staff opportunities to ask for assistance with unreported problems and increases visibility of the IT Team. I attend staff meetings to present IT matters, but also to put a face to the IT Team.
  • Work with school leaders to set realistic teacher expectations: IT systems are not going to work 100% of the time. Teachers need to be able to pivot their lesson plans when systems don’t work. Of course, any banked trust very quickly degrades with repeated or continuous outages.
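
To make the monitoring point concrete, here is a minimal sketch of the kind of automated check that can spot a problem before a teacher reports it. The device names, addresses and ports are placeholders, and in practice the result would feed whatever dashboard or ticketing system you already run.

```python
# Minimal availability check for classroom devices (hypothetical names and IPs).
# Run it from cron or Task Scheduler every few minutes; anything unreachable
# gets flagged before a teacher has to raise a ticket.
import socket
from datetime import datetime

DEVICES = {
    "screen-room-101": ("10.0.10.101", 80),   # assumed addresses and ports
    "screen-room-102": ("10.0.10.102", 80),
    "print-server":    ("10.0.1.20", 9100),
}

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in DEVICES.items():
        if not is_reachable(host, port):
            # Replace print with a call into your ticketing or chat system.
            print(f"{datetime.now():%Y-%m-%d %H:%M} ALERT: {name} ({host}:{port}) unreachable")
```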

Technological Change

Without trust in systems, teachers (particularly execs) will often try to drive technology change on their own. They may be motivated by a single third-party provider, rather than broad consultation. They may not consider scope, time, quality or communication, which leads to systems that are difficult to implement and integrate. The resulting systems are often ones that other teachers don’t feel they own and may resent using.

When the IT Team lead change without trustworthy systems, they will not have positive relationships they can draw on. Key stakeholders will not volunteer their time and system champions will not emerge.

Once there is trust in systems, teachers are more likely to see the IT Team as partners in change. I’ve written about project management in schools previously, so I won’t repeat that here. To summarise, good EdTech project management requires planning, change control and execution. This approach can be scaled down for smaller projects, but for significant user-facing systems, like an LMS or SIS. Following such an approach greatly increases the chance of success.

From a teacher’s perspective, an unsuccessful technological change is one that causes them to change their teaching practices. If teachers need to recreate resources and lessons, resentment towards the system will build. If a new system doesn’t support a particular function efficiently, it will be seen as unsuccessful, even with gains elsewhere.

Ultimately, the target is to bring about suitable systems that teachers feel they own. Setting realistic expectations before teachers use a new system is important. Successful IT change occurs when the following are in place.

  • A scalable, easy-to-follow change process with tangible instruments, such as planning templates, gives teachers confidence to be involved.
  • Consultation with key stakeholders to develop appropriate requirements, which leads to a sense of user ownership.
  • Encouragement of system owners to become system champions, keen to promote systems that benefit teaching staff.
  • Communication that sets expectations of benefits for staff, so when change comes they will pay attention.

Improved Student Outcomes

With trusted systems in place and a number of successful change projects completed, the relationship between teachers and the IT Team can develop into one that allows collaboration on teachers’ primary goal: improved student outcomes.

Outcomes include student academic grades, but are not limited to those. Student mood and other pastoral measures are also outcomes teachers work with students to improve. Attendance is the best predictor of success for most students and is something that can be improved.

Often the impact from IT comes in the form of improvements to workflows, processes and interfaces for accessing information. Teachers can use these to affect student outcomes by seeing students both within their cohort and individually. Differentiated teaching for personalised learning happens when it is well informed by data. Sometimes, teachers are just trying to get to know students more objectively around the start of the teaching period or when writing reports.

The relationship between the IT Team and teachers may start with the provision of data for analysis, then pre-defined reports and eventually analytics. The distinction of analytics over lesser presentations of data is that they are predictive, proactive and available on demand. I’ve written about analytics in schools previously.

To achieve successful collaboration on improved student outcomes, concentrate on the following.

  • Seek questions that teachers need answered, and an understanding of what data is available to answer them.
  • Develop integrations and data warehouses that bring data together, so it can be presented efficiently, but also to enable active alerting when threshold values are crossed (a minimal sketch follows this list).
  • Understand how data can be used to benefit student outcomes; providing masses of reports will only overwhelm users. Linking proactive alerts to well-presented information draws teachers in to use systems, applying the information to practical improvements for students.
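
As an example of the active-alerting idea in its simplest form, here is a sketch that flags students whose attendance rate drops below a threshold. The column names, threshold and figures are all hypothetical; in practice the data would come from the warehouse rather than being typed in.

```python
# Hypothetical attendance-threshold alert; the data model and threshold are
# illustrative only, not a real school's.
import pandas as pd

# In practice this frame would be loaded from the data warehouse (SQL, export, API).
attendance = pd.DataFrame({
    "student":       ["A. Student", "B. Learner", "C. Pupil"],
    "days_present":  [38, 45, 29],
    "days_possible": [50, 50, 50],
})

THRESHOLD = 0.85  # attendance rate below this triggers an alert

attendance["rate"] = attendance["days_present"] / attendance["days_possible"]
flagged = attendance[attendance["rate"] < THRESHOLD]

for row in flagged.itertuples():
    # Replace print with an email, Teams message or dashboard flag.
    print(f"ALERT: {row.student} attendance {row.rate:.0%} (below {THRESHOLD:.0%})")
```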

Beyond?

To be honest, I’m not sure if teachers have expectations beyond the levels described above. I have ideas for evolving the nature of teaching through student models, knowledge profiling and automated individualised learning schedules. But that requires a relatively radical departure from the current classroom situation, so I’m not sure teachers want that.

I hesitate to mention the notion of AI as an agent that will satisfy teachers’ desires beyond what I’ve described. I am seeing examples of Generative AI being used to take raw lesson plans and create more personalised versions based on student interests, but time will tell if that is useful for more than adding engaging variety.

I may have to revisit this post again if I reach that top and can see beyond.

Simpler Classroom AV

Is it better to buy an interactive screen with or without smarts?

Background

I’ve been dealing with interactive screens in classrooms for several years now. When I started in schools, I inherited responsibility for a legacy of screens from various manufacturers. Some were still within support, but most were living on like a crazy naked man: unsupported and likely to attack. Some staff had invested in the associated software, allowing them to present lessons as the manufacturer intended, but when the software license expired, they would be left with a disappointing investment of time. Consequently, most staff relied on teaching from a connected laptop.

My first task when I started at my new school was to assess various systems, including the classroom AV situation. At St Mary MacKillop College, Canberra, there is a 50/50 mix of projectors and interactive screens. I started working on plans to transition the remaining projectors to screens, but I needed a classroom AV standard to drive this. After talking to teachers about their use, I needed to assess the market.

Teacher using an interactive screen connected to a laptop.

What I found was a growing number of manufacturers keen to get their devices into schools, including traditional screen-makers like Promethean, Smart and CommBox; a range of known electronics manufacturers moving into the market, like LG, Samsung and Philips; and quite a few Chinese brands I wasn’t familiar with. I also attended EduTech (a significant Australian commercial education technology conference) to see the offerings from these brands in real life.

I discovered each manufacturer attempting to differentiate themselves, not on their hardware, but through their software offerings. Each sales pitch highlighted the various Android-based tools you could use on their device. The walled gardens of the past seemed to be growing.

Students using the functions built into an interactive screen. Source: Promethean website.

Teachers had told me they were not using the built-in interactive screen tools they already had access to and weren’t interested in using more than their laptops. For some teachers, I could imagine some of these tools having use-cases some of the time, but I couldn’t help considering the support, training and maintenance needed to provide them, as well as the disappointment for teachers if they were later taken away. Some manufacturers have considered breaking down the walls slightly by allowing staff to log into Google Drive or OneDrive and deliver their content from those cloud sources, but this also left me a bit concerned: it would need more connections and more accounts, and would create a possible data breach vector that would be hard to control if staff walked away while logged in. Teachers are entering class with a tool that has the smarts they need, so why try to replicate that on a device with another operating system?

Can it be simpler? Yes, it can.

I started looking for devices without smarts. This is something I had achieved at a previous school, so I knew there were possibilities. Sending touches back to a connected laptop is still important, as teachers like to write on the board as they would on a whiteboard. I turned to Dell, from whom I had sourced non-smart touchscreens before, but their AU$4,800 price for a 75″ screen was higher than most of the smart interactive screens available. I started speaking to the makers of the smart interactive screens to see if their devices could be modified or reconfigured to disable the smarts, but their solution still required access through the Android OS to connect a laptop, and that OS would run out of support well before the screen’s expected end of life. Fortunately, I stumbled upon a new model from Promethean that allowed you to purchase the screen and smarts separately, and the cost of the screen without smarts was very attractive at around AU$2,200.

Promethean LX connected to a laptop. Source: Promethean website.

I was a bit worried that this might be a lower-grade device with laggy touch and tinny speakers, but our testing showed touch and pen input to be responsive and the sound sufficient to fill a classroom. One nice bonus is the USB-C port that allows a single-cable connection for video, sound and return touch, while charging the device. Another benefit is that the device turns on quickly, so you don’t have to wait for an OS to load.

We set about making this the basis of our new standard AV classroom setup.

Setting Things Up – Stands

One other costly aspect of classroom AV can be the installation. In my previous school, all screens were wall-mounted with a cupboard containing devices and a controller to allow you to switch sources. This Rolls-Royce solution suited the school, but a simpler solution is available. Movable stands are widely used, but it’s worth comparing to mounting. Putting an interactive screen on a mobile trolley means it can be set up by IT staff, without any need for electrical or network contractors.

Trolley Pros
  • Easy, cheap setup
  • Only connection needed: power
  • Replacing a busted screen is easy: wheel a spare one in
  • Teachers can place them where they need them (no more fighting for whiteboard wall space)

Trolley Cons
  • Trolleys take up more space
  • Trolleys can be moved where you don’t want them (like out the door)
Rear view of an interactive screen on a trolley, with cabling.

Is Wireless Projection Worthwhile?

Wireless adapters give teachers the freedom they want to move around the classroom while presenting. It’s not perfect, but it can achieve what teachers want most of the time.

Using a wireless adapter has some advantages and disadvantages.

Wireless Display Adapter Pros
  • Teachers can move around the classroom while projecting
  • Cables don’t go missing
  • Ports don’t wear out
  • Adapters can be remotely managed

Wireless Display Adapter Cons
  • High-res videos show compression artifacts
  • Introduces lag on touch and pen input
  • Although relatively simple, some training is required (press Win+K)
  • Users need their own charger
  • Adds another layer of maintenance

We’re using ScreenBeams, but there are other wireless adapters available. I recommend adapters that use the native screen mirroring capability of users’ laptops.

  • I have found requiring software installation to use a wireless display adapter is a step too far for most users.
  • The native wireless solution uses a direct radio connection rather than clogging up the network to communicate. Guests not on the WiFi can still use them.

Some effort is required to set up the global monitoring software that allows you to see the status of ScreenBeams and change their configuration. Some network changes and testing went into getting that working at our school; kudos to Corey and Roman for that work.

A Simpler Classroom AV Spec

Our classroom AV setup now consists of the following components, with rough prices, incl. GST (as at 10 November 2023).

  • Promethean LX: $2,200
  • Moving trolley (eg Gilkon), height adjustable (could be cheaper otherwise): $980
  • ScreenBeam 960: $600
  • Extension cord, power board, cable ties: $30
  • Total: $3,810

Why ChatGPT has Educators Concerned

(I didn’t ask ChatGPT to write this for me.)

Generative AI has educators worried. I am assuming you have heard enough to know what AI-based tools like ChatGPT can offer and, like me, are thinking about the impact on learning. After a disappointing webinar from Microsoft, where we were effectively told to get used to the idea, I’ve been considering what was unsatisfactory about that suggestion and why educators are concerned about generative AI, which brought to mind Bloom’s Taxonomy.

The following diagram represents Bloom’s taxonomy (1956 version) with a few extras. I showed Bloom’s Taxonomy to a gathering of school IT officers and none recognised it. But I’m certain teachers would as they are trained using this theoretical framework. There are multiple versions, but they all represent levels of understanding that can be measured through assessment.

Bloom’s Taxonomy against technological aids

ChatGPT, and tools like it, have been compared to past technologies that challenged educators at their inception. Calculators were, at first, resisted in schools as they could answer mathematical questions that students needed to answer. The Internet, with its search engines and sites like Wikipedia, has also challenged the way students are assessed.

Generative AI has concerned educators to the same degree as these earlier technologies, if not more. Considering Bloom’s Taxonomy gives insight into what may be going on in the back of teachers’ minds.

  • A quick web search can substitute for a student’s recall of knowledge.
  • Sites like Wikipedia can provide information showing a level of comprehension.
  • A calculator allows a student to solve a mathematical equation without demonstrating the application of their understanding.

Generative AI, in comparison, can:

  • take a sophisticated prompt,
  • gather information that demonstrates knowledge and comprehension,
  • contrast views in a way that shows application and analysis, and
  • synthesise these ideas into a cohesive written piece.

As suggested in the diagram above, artificial intelligence can now be used to substitute higher levels of understanding that were previously reserved for the human student. Together with the relatively sudden release of generative AI, I believe this is what has teachers worried.

At this stage, the quality of output from generative AI is not perfect, containing factual errors and bias, but it is improving. It is already at a stage where it could be equated to responses expected from post-graduate students, such as medical and legal students. The output is also relatively unique, so existing plagiarism detection is less effective against it.

Going back to my original question, educators are concerned because ChatGPT threatens their methods of assessing student understanding in a new way, which puts academic integrity and institutional reputation at risk, but it also means they have to work to revise their assessment approach.

What can be done?

It seems that generative AI is not going to disappear any time soon; in fact, it will probably have deeper impacts on education than we can currently imagine. From what I can see, there are three possible responses for educators adapting their assessment to this new paradigm: block, control and embrace.

Block

Blocking generative AI tools, such as ChatGPT and QuillBot, could discourage students from attempting to use them. This would be similar to the response when earlier technological tools were introduced, and it has been the response from ACT Education (where I work). However, in the case of ChatGPT, the number of places where it can be accessed beyond its main website (such as Bing and various chat platforms) is proliferating, so blocking may not be effective.

Control

If generative AI tools are not blocked, controls can be put in place to prevent cheating and, perhaps more importantly, ensure students are able to learn and demonstrate their understanding appropriately.

  • Education
    As with other tools, students need to be equipped to understand the associated ethical standards expected. Plagiarism and other forms of academic misconduct are frowned upon and do not lead to successful learning outcomes for students. There already needs to be information literacy training around content from online sources, to allow students to grow into citizens who can identify trustworthy content, and generative AI output is an extension of this.
  • Task Controls
    In mathematical tasks, we ask students to show their working; in essay tasks, we ask students to note their sources. Similarly, with tasks that could make use of generative AI, students can be asked to give more than the answer to demonstrate how they came to their understanding. Assessment design can be improved to deter (or at least complement) use of generative AI by adding specificity to the task or by asking students to deliver artefacts in multiple modalities (other than written text). Ultimately, the best way to avoid cheating is to make tasks seem achievable by providing clear instruction and appropriately weighting tasks.
  • Technological Controls
    Plagiarism detection seems to have been diminished now that generative AI can synthesise a novel text presentation, with appropriate citations. So what can be done technologically to control the use of generative AI?
    • OpenAI, the makers of ChatGPT, have released a tool that can help detect AI-generated text, which may be useful, but for now it is hard to tell.
    • It’s possible to ask students to present their work using a shared Word document, Google Doc or OneNote file, which shows a history of how the document was constructed, allowing teachers to scrutinise where content may have been copied and pasted. This is not foolproof, but a useful check for teachers.
    • Quizzes have been shown to allow demonstration of understanding that is as good or better than a written essay. A quiz can elicit responses to various question types, which may be useful to redirect students away from generative AI. Quizzes can also be partially or fully automatically marked, which is always an incentive for time-poor teachers. Adding time constraints and a lock-down browser to a quiz should give confidence for most assessment.
  • Physical Controls
    When authenticity of assessment really counts, it still means asking students to undertake tasks in person, away from computers. That could mean a paper exam or an in-person question and answer session. The utopia of online exams, temptingly close after COVID-19 remote learning, will be challenged by generative AI when institutional reputation is at stake.

Embrace

Educators have the opportunity to employ generative AI as a tool for learning and assessment. Like plagiarism detection (which began as a tool for educators to spot cheating, but became a tool shared with students to learn appropriate citation), generative AI in the hands of students can have learning benefits. The possibilities are already being enumerated and I anticipate we’ll see many pedagogical applications of generative AI over the coming years. Here are some.

  • Providing summaries of written works to gain a better understanding
  • Generating exercises to self-assess understanding
  • Supporting text revision where non-native language skills are lacking
  • Providing a starting point for in-depth research

Evaluation is a level of Bloom’s Taxonomy that I don’t think AI has yet conquered and that leaves room for higher-order thinking to be assessed. A colleague pointed out an article from the Sydney Morning Herald describing tasks being prescribed to medical students, who were instructed to prompt ChatGPT for information and then critique it.

Conclusions

The benefits of generative AI could lead to better student outcomes if educators allow learning to make use of them. At present, significant positive innovation is emerging to match the reactions of those wishing to block generative AI.

I don’t expect efforts to block generative AI to last long, especially while they are less than fully effective. Ultimately a point of balance between control and embrace needs to be established, where assessment of understanding can happen and the learning benefits of AI-based tools can be achieved.

Limitations

I need to say that these views are my own and not an official policy or stance of ACT Education.

I haven’t completed a comprehensive survey of opinions from “the coalface”; however, I’ve been communicating with people who have been gathering reactions to generative AI and writing briefs and presentations for Education decision-makers, which led me to this understanding.

An EdTech Project Management Process for Schools

When I started as a leader responsible for Educational Technology projects in a school, I lacked a framework to work within. Having come from the software development world, I understood how to develop systems; however, selecting and implementing systems is quite different.

I looked into a few frameworks for system change management and found that PMBOK (the Project Management Body of Knowledge) was adaptable to EdTech projects in schools. PMBOK is a general project framework, so I set about writing a process guide and template based on it, incorporating a number of software and educational models to give it specific relevance to schools.

In 2018 I ran a workshop at the AIS ICT Leadership conference to share my version of this process and it was well received. In summary, the process involves a number of work areas for planning, execution and change control as shown in the following diagram.

When working with a project team, a template can be used as the basis of a collaborative document to work through the planning work areas.

The most involved area is the Procurement area, which involves consultation to determine requirements, transforming these into prioritised criteria then setting a cut-off for essential criteria.
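
As a rough illustration of that prioritisation step, the sketch below scores vendors against weighted criteria, with essential criteria acting as a pass/fail gate. The criteria, weights and scores are invented for the example, not taken from the actual guide.

```python
# Hypothetical weighted-criteria scoring with an essential-criteria cut-off.
criteria = [
    # (name, weight, essential)
    ("Single sign-on with existing accounts", 5, True),
    ("Timetable integration",                 4, True),
    ("Parent portal",                         3, False),
    ("Mobile app",                            2, False),
]

# Panel scores per vendor, 0-5 against each criterion (same order as above);
# a score of 0 means the criterion is not met at all.
vendors = {
    "Vendor A": [5, 4, 3, 2],
    "Vendor B": [0, 5, 5, 5],   # misses an essential criterion
}

for vendor, scores in vendors.items():
    meets_essentials = all(
        score > 0
        for (_, _, essential), score in zip(criteria, scores)
        if essential
    )
    weighted = sum(weight * score for (_, weight, _), score in zip(criteria, scores))
    status = "eligible" if meets_essentials else "excluded (missed an essential criterion)"
    print(f"{vendor}: weighted score {weighted}, {status}")
```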

The documents below describe the process in detail including a comprehensive guide, a project plan template and the slides from my workshop (with examples of a software and hardware project in a school).

I’ve since heard back from other schools who have applied the process successfully.

I’m sharing it here now so that I can refer back to this process in subsequent posts as I describe an LMS transition project we have undertaken over the last two years.

Organising Selection for an IT Position

Over my career I’ve been involved in interviewing and selecting new staff for IT positions on numerous occasions. I’ve learned a few tricks along the way and I thought I should share those.  A lot of these techniques are generalisable to positions peripheral to IT and elsewhere.

Choosing a candidate

The Panel

It helps to have more than one person doing the interviewing; two is OK; three is ideal; four can be intimidating. If the position is really serious and more people need to be involved, create two panels with different foci.

Aim for diverse perspectives among the panel members. If you’re a manager, involve a technical staff member and a support staff member, such as someone from HR.

Preparing

  • Short-listing
    Without going into too much detail, the panel members who will interview should be the ones selecting who is interviewed. Start by separately and blindly reviewing all the CVs, then bring opinions together in a collaborative space, such as a shared spreadsheet.
  • Discuss candidates openly
    After each panel member has rated candidates, come together to decide who to interview. Be open to disagreement as others may have spotted potential that you have not seen. Consider rounds of interviews with the most likely candidates first.
  • Invitations
    When you have a list of candidates, you need to invite them in.

    • Negotiating a time is best achieved over the phone. Offer the candidate opportunities within a specific window, but be accommodating.
    • Once a time is set, send a formal invitation that introduces the panel and their positions; this establishes perspectives for the candidate. Set expectations for where to go, when to arrive, what to wear and how long the interview will take. You may want to prompt the candidate to undertake some research into your organisation by directing them to online resources and work spaces.
  • The script
    It’s good to have a set series of questions going into the interview. All panel members should agree on the script before interviews start. Use a common document with names beside each question (rather than each panel member having their own script); this allows you to pass the flow of questions between panel members. If you have a script from a previous position, review the questions and ensure they are relevant to the current position. The script can be duplicated for each candidate so that notes can be inserted during the interview by a panel member not currently asking questions.

    Interview script
  • Quick recap before interview
    Before an interview, all panel members should take a few minutes to review the candidate’s CV. Discuss their strengths and peculiarities so that you can focus questions during the interview.
  • Hospitality
    In a sense, your organisation is being interviewed as well as the candidate. You want the best candidate, who could possibly go elsewhere, to choose you. Simple things will help, like:

    • tidying the space where the interview will take place,
    • ensuring the temperature is comfortable and
    • having glasses and water poured for the candidate and the panel.
  • Everyone shakes hands
    Allow the opportunity for each member of the panel to shake hands with the candidate. That first physical contact is disarming and will establish what could be a future working relationship.
  • Seating
    Don’t arrange seating in a way that is confrontational, such as sitting on the opposite side of a big table from the candidate; a small table is better, with the candidate as part of a formation that is inclusive, like a circle.

The Interview

Repeat Introductions

Start by reintroducing the panel and what they do. This can be quick, but is important to preface the questions you will ask later. The panel leader can do this or each panel member can quickly say who they are and what their role is.

Questions

The candidate’s CV will tell you about their education, their experience and their skills, but it won’t tell you what kind of person they are, how well they will work with you and how they can apply their skills. You want a good script of questions that tease these important aspects from the candidate’s brain.


  • Icebreakers
    Candidates will be mentally prepared to convince you about their professional worth, but don’t jump straight into serious questions. Start by allowing the candidate to settle in and feel comfortable. A good way to achieve this is to ask the candidate to talk about their personal life; if they start drifting into work and skills, redirect them by saying you will get to that soon.

    • How did you come to be here in ___ ?
    • Tell us a bit about yourself as a person. What do you do in your spare time?
    • Tell us about your study. What inspired you to get into IT?
  • Focused career questions
    You want the candidate to tell you about their experience, but you don’t want a litany that will take up all your interview time. Ask questions that will allow the candidate to showcase themselves, while highlighting aspects you are keen to hear about.

    • Without going into too much detail, tell us the places you have worked and your roles there.
    • (If applicable) Why are you leaving your current position?
    • What has motivated you to choose your career path?
    • What are some of the tasks you really enjoy doing?
  • Tell questions
    It’s hard to tell when people are being honest. One technique for eliciting humility and honesty is to ask the candidate to admit where they have failed. This may be counter to what the candidate is prepared for and it may be affected by cultural background, but it can give you a good idea of whether you want to work with that person. It’s a good way to distinguish potential assholes.

    • Can you think of a time when things did NOT work out the way you expected them to?
    • Can you tell me about a time you had a conflict with a colleague? How did you deal with it?
  • Focused skill questions
    You should be able to tell what skills a candidate has from their CV, but you want to know if they have real experience or whether it was something they observed someone else doing.

    • Tell us about your experience with Active Directory?
    • Have you ever written documentation in a wiki? If not, what did you use?
    • Have you ever worked with an issue tracking system? How was that used?
  • Don’t forget the soft skills
    It’s easy to get stuck on technical skills for an IT job, but non-technical skills are really just as important in the day-to-day working of a successful team.

    • Have you worked as part of a team? What was your role?
    • What techniques do you use to manage your time?
    • How do you handle conflicting priorities?
  • A conundrum
    You want someone who can ‘think on their feet’ and consider alternative solutions. Posing a scenario that seems unsolvable at face value will prompt candidates to demonstrate their ability to think ‘outside the square’. The following example is for a service desk position in a school.

It’s been a busy day; you are feeling under pressure and a teacher calls you demanding that you set up an email account for a person who is not an employee but has come into their class to present.  This would be against the school’s policy, but you understand the teacher needs to make the class work. How would you deal with this situation?

  • Most candidates will start by stating that they cannot break policy because they want to give you the impression they are honest workers, ready to follow the rules. Some might say they will seek permission from a manager to break the rules. A good candidate will recognise that problems are often not what they are first reported to be and that probing into the client’s needs will allow them to reconsider the problem, then create a solution or a workaround.
  • Questions about your organisation
    You want to know if the candidate is actually interested and enthusiastic about working in your organisation. Give them the opportunity to share their research and how they have envisaged themselves in your organisation.

    • What do you know about ?
    • What do you think it will be like working in a ?
  • Prompt for their questions
    Allowing candidates to ask you questions is more than a courtesy, it allows the candidate to take control of the interview and demonstrate their strengths and knowledge by probing you about what you do, what technologies you use and how the organisation works. A good candidate will come with prepared questions.

    • Do you have any questions for us?
  • Formalities
    Don’t leave yourself open to surprises.

    • What are your obligations and availability?

If you are leading the panel, avoid keeping all the curly questions to yourself. Farming some complex questions out to another panel member allows you to focus on how the candidate is answering the question, following their body language and ‘reading between the lines’.

Flow

A smooth interview is not rushed, nor is it slow. With good flow, the interview can be comfortable and friendly and elicit the honest answers you are seeking.


  • Build up with some easy questions first.
  • Hand over between panel members when asking questions.
  • Ask questions from the panel member who has the perspective from which you want questions answered (personal from the manager, technical from the technician, organisational from HR).
  • Be adaptive.
    • Don’t stick to the script when you want to clarify or probe deeper.
    • Don’t ask questions that have already been answered.
    • Make questions specific on-the-fly.

After asking all of your questions, lead into a task…

Evidence

A candidate may say they have the skills you require, but it’s hard to judge to what degree that is true. Their CV may have been developed over time, with outside help. Every candidate will say they have good communication and problem solving skills; we all have a self-optimistic bias. Don’t be afraid to take some time to get the candidate to demonstrate their skills.


  • A role-play
    Pretend to be a client with a predetermined problem. Ask the candidate to put themselves into a support role and attempt to unravel the problem. Getting the answer is not as important as how they approach the problem.
  • A quick quiz
    Allow the candidate to answer questions in a quick quiz. You might throw together some basic questions in a Google form or online survey and ask them to provide their answers.
  • A writing task
    Being able to write clearly is an important skill for all IT workers. Set up a scenario and ask the candidate to respond to a pretend client. Writing a pretend email or ticket-update on a machine you provide is an easy way to run this task.
  • A dev task
    If the candidate is applying for a technical role, ask them to resolve a bug or simple problem (a hypothetical example of the scale that works well follows this list). This may be something they have to do after they leave you and later submit the response back to you. Be sure the problem requires them to establish a dev environment close to your work environment.
  • A presentation
    If the candidate is applying for a role that involves training, ask them to run a quick training session on a simple technology. If you’re considering this task, you will need to give the candidate notice before the interview so they can realistically prepare.
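
To give a sense of the scale that suits a dev task, the snippet below is the kind of small, self-contained exercise that works well. The scenario, function and bug are hypothetical, invented for illustration; the candidate would receive the buggy comparison shown in the comment and be asked to find and fix it.

```python
# Hypothetical bug-fix exercise: the candidate is told the overdue-device report
# shows one fewer student than expected and is asked to work out why.

def overdue_count(days_overdue: list[int], grace_days: int = 7) -> int:
    """Count device loans that are past the grace period."""
    count = 0
    for days in days_overdue:
        # Buggy comparison handed to the candidate:  if days > grace_days + 1:
        if days > grace_days:  # corrected off-by-one
            count += 1
    return count

# 8 and 10 days are past the 7-day grace period, so the expected count is 2.
assert overdue_count([3, 8, 10, 7]) == 2
```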

Debrief

Assessment biases can creep in over time. You can glorify earlier candidates or favour candidates you have seen more recently. Reflecting immediately after each interview is recommended, even if this means delaying the next interview by a few minutes.

When you’ve seen all candidates, hopefully you’re in a good position to choose. If none of the candidates are suitable, consider re-advertising. If there is a candidate that is suitable, but you’re not completely confident, remember that you can rely on a probation period if things don’t work out.

Don’t forget the unsuccessful candidates. Failing to respond respectfully to unsuccessful candidates puts the reputation of your organisation in danger, whereas an honest response with feedback that will help the candidate in future will be welcomed.