Source: Deep Learning on Medium
New Prototype for Building Interview Chatbots with Active Listening Skills
Technology is evolving faster than we can keep up. Today, a myriad of technologies is disrupting almost every industry in ways we never imagined. Recruiting is one of them: the way we look for jobs, and the way recruiters find us, is changing.
Chatbots are one of those technologies, adopted across several industries, and they have become a big deal in the recruitment world. Interview chatbots in particular are being used to source job applicants, elicit information from them, schedule interviews, make and receive job offers, and more, saving recruitment teams effort, time, and money. However, state-of-the-art interview chatbots are far from perfect, and it remains a grand challenge to build chatbots that can handle free-text responses to open-ended questions while delivering an engaging user experience.
Building and Evaluating Interview Chatbots with Active Listening Skills
There has been a lot of effort toward building effective chatbots. Meena, one of the latest from Google, can conduct conversations that are more sensible and specific than those of existing state-of-the-art chatbots.
Researchers at the University of Illinois have also been working toward building effective interview chatbots with active listening skills. To do so, they first investigated the feasibility and effectiveness of using publicly available, practical AI technologies.
They looked into existing chatbot platforms and selected Juji because it is publicly available, rule-based, lets a chatbot designer bootstrap a chatbot without training data, and has already demonstrated success in supporting interview chatbots. On top of Juji, they developed a prototype system for building chatbots with active listening skills.
They then used the prototype to create two chatbots, one with and one without active listening skills. The two chatbots were evaluated live with 206 participants from Amazon Mechanical Turk, comparing their performance on a set of metrics, including the quality of user responses and user perception. Results show that the interview chatbot with active listening skills was better at engaging users and eliciting quality user input.
Potential Uses and Effects
This work contributes the following:
- Practical approaches to building effective interview chatbots.
- A hybrid framework for developing progressive chatbot platforms.
- Design implications for building empathetic chatbots beyond interview tasks.
By combining a rule-based chatbot builder with data-driven models, interview chatbots with active listening skills can better handle complex and diverse user responses to open-ended interview questions. In effect, these chatbots can deliver more engaging user experiences and elicit higher-quality user responses.
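The hybrid idea can be sketched in a few lines: a rule-based script supplies the next interview question, while a separate response-understanding step generates an active-listening acknowledgment (a paraphrase or empathetic reflection) before moving on. The sketch below is purely illustrative and not Juji's actual API; the keyword table stands in for what would, in practice, be a trained data-driven model.

```python
# Illustrative hybrid interview turn: rule-based question flow plus a
# "data-driven" active-listening step. All names and rules here are
# hypothetical; a real system would use a learned model, not keywords.

SCRIPT = [
    "What attracted you to this role?",
    "Tell me about a project you are proud of.",
]

# Toy stand-in for a trained response-understanding model.
THEMES = {
    "team": "Working with a good team clearly matters to you.",
    "learn": "It sounds like growth and learning are important to you.",
    "remote": "I hear that flexibility is a priority for you.",
}

def active_listening_reply(user_text: str) -> str:
    """Return a paraphrase-style acknowledgment if a theme is detected."""
    lowered = user_text.lower()
    for keyword, paraphrase in THEMES.items():
        if keyword in lowered:
            return paraphrase
    return "Thanks for sharing that."  # neutral fallback

def next_turn(user_text: str, question_index: int) -> str:
    """Acknowledge the free-text answer, then ask the next scripted question."""
    ack = active_listening_reply(user_text)
    if question_index < len(SCRIPT):
        return f"{ack} {SCRIPT[question_index]}"
    return f"{ack} That is all my questions, thank you!"
```

The design point is the separation of concerns: the script guarantees interview coverage, while the acknowledgment layer can be swapped out for progressively stronger models without touching the rules.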
While the implementation is built on top of Juji, the methodology can be extended to other chatbot platforms.