
The paper explores the implementation and evaluation of the Video Interviewing and Digital Scoring System (VIDS) for Multiple Mini Interviews (MMIs) by NHS England/Medical Dental Recruitment and Selection (NHSE/MDRS). The study primarily focuses on the support provided during the pilot stage in late 2022 and on interviews held from January to May 2023. Despite initial technical challenges and the need for ample support, user satisfaction exceeded expectations at 86.7%. The system allowed participants to join their interviews remotely, significantly reducing physical resources, travel, and related expenses. However, it necessitated additional staffing and technical support for first-time users. Feedback collected over the pilot period indicated areas for further improvement. The paper concludes that, overall, one administrator can manage 57 interviews per day for each Medical or Dental specialty, with one technical support person per specialty providing 24/7 support.

Introduction

The Medical Dental Recruitment and Selection (MDRS) unit is responsible for recruiting all health professionals for the National Health Service (NHS) across England, Scotland, Wales, and Northern Ireland (NHSE, n.d.). In 2012, the MDRS introduced Multiple Mini Interviews (MMIs) into its selection process. This approach, originally pioneered by Eva et al. (2004), employs a sequence of stations to assess the non-cognitive skills of applicants, similar to entry-level interviews for medical students. Following the structure of the Objective Structured Clinical Examination (Harden & Gleeson, 1979), these time-constrained, station-based evaluations aim to emulate real-life medical scenarios. Over the years, the MMI system has increasingly been used to select students for health professions programmes (Pau et al., 2013).

Multiple Mini Interviews (MMIs) are deemed reliable, feasible, and well accepted by both candidates and evaluators. O’Brien et al. (2011) suggest that future longitudinal research could provide further insights into the validity of the MMI as a tool for assessing applicants’ potential to become successful medical professionals. In addition, a 2019 systematic review by Yusoff confirmed the satisfactory predictive validity of the MMI, initially proposed by Eva and colleagues (Yusoff, 2019). Despite its merits, the MMI process does face constraints, notably its resource-intensive nature, which demands significant effort in planning, development, implementation, and evaluation. Factors such as station structure, context, and the training of raters play crucial roles in determining the effectiveness of MMIs. Additionally, the examination venue significantly impacts the raters’ evaluations; for an efficient MMI process, a comfortable, sound-proof venue is recommended (O’Brien et al., 2011).

In 2014, Kelly and her team noted the increasing recognition of the Multiple Mini Interview (MMI) as a valuable addition to evaluative processes, mainly due to its acceptability among stakeholders. It was recognised as helpful in understanding the various factors behind performance discrepancies among international candidates. Nonetheless, Kelly et al. emphasised the importance of aligning MMI procedures with high-quality assessment standards and ensuring fairness based on distributive and procedural justice principles for every candidate, regardless of cultural background or nationality (Kelly et al., 2014). As higher education institutions progressively focus on enhancing students’ transferable skills to meet labour market needs, the importance of appropriate measures to evaluate students’ preparedness for the job market has been underscored. Within this context, the MMI was reiterated as a superior option to traditional one-on-one interviews for graduate students (Santos et al., 2020).

The primary objective of Freire and Barbosa’s 2023 study was to evaluate and contrast the scoring patterns of graduates at two separate multiple mini-interview (MMI) stations specifically designed for varied academic disciplines (Freire & Barbosa, 2023). With the Medical Dental Recruitment and Selection (MDRS) strategy, it is evident that MMIs are gaining traction for interviews in the recruitment process, primarily in Medicine, Dentistry, and Nursing. However, the use of MMIs in other industries remains limited. As MMI implementation requires substantial resources and deliberate planning, development, and evaluation efforts, gauging how to enhance its efficiency and reduce costs is crucial. During the COVID-19 pandemic, face-to-face interviews and clinical examinations were suspended due to health safety measures. A 2020 cross-sectional survey of UK medical candidates revealed that the pandemic would impose significant long-term effects on the selection, training, and performance of the medical workforce. Hence, it is essential to consider the perspectives and experiences of candidates from diverse backgrounds in the decision-making processes that shape their career trajectories and the profession’s future.

In 2008, the University of Galway initiated the automation of time-constrained, station-based assessments to streamline Objective Structured Clinical Examinations (OSCEs) and Multiple Mini Interviews (MMIs). The primary goal was to reduce errors typically associated with manual, paper-based evaluations, which often led to inaccurate final results and incomplete scoresheets. The promising results of this innovation were first documented in 2011 (Kropmans et al., 2011). Amid the COVID-19 pandemic, video integration was introduced to these assessments, leading to the development of the Video-Embedded Multiple Mini Interview and Digital Scoring System (VIDS). Introduced in 2020 but fully operational by 2023, VIDS was utilised by the National Health Service England (NHSE) for health professional recruitment across England, Wales, Scotland, and Northern Ireland (Qpercom Ltd, 2020). More recently, Callwood and colleagues conducted a feasibility study confirming that candidates perceived remote video MMIs as equally effective as traditional face-to-face interviews (Callwood et al., 2022).

Little is known about the level of (technical) support required for remote (multiple mini) interviews (Green & Lindley, 2022). Green and Lindley used a Research through Design approach to examine how interactive systems can improve remote, video-recorded interviews. Amid logistical, technical, and procedural challenges, they combined established and innovative techniques to create their Interview Box solution. The research focused on how dedicated features could enhance the standard functionality of video conferencing platforms such as MS Teams, Zoom, Google Meet, and Blackboard. They found that supplementing the basic video call format with features inspired by traditional interview practices improved interview quality. The rapid shift towards remote interviews during the COVID-19 pandemic normalised the use of lower-quality media, which the study suggests may detrimentally impact the effectiveness of interviews due to the loss of critical verbal and non-verbal signals. High-quality media is critical to effectively representing the interaction between interviewer and candidate; this benefits all parties involved, including viewers, interviewees, and interviewers, as it enhances the audience’s understanding of the content (Green & Lindley, 2022).

The purpose of this paper is to review the (technical) support provided to NHS England/Medical Dental Recruitment and Selection (NHSE/MDRS) in rolling out the Video Interviewing and Digital Scoring System (VIDS) for Multiple Mini Interviews (MMIs). This study primarily focuses on the support provided during the pilot stage in late 2022 and the interviews held between January and May 2023. The report demonstrates how the introduction of VIDS for MMIs reduces the need for physical resources and travel, while necessitating additional staffing and technical support for participants using the system for the first time.

Methods

Between January and May 2023, a total of 38 medical specialities and nine dental specialities participated in the interview process. Each applicant underwent multiple mini-interviews (MMIs) with two to four stations. Each station’s content was varied and specifically tailored to job requirements, competency frameworks, and the score sheets relevant to each speciality. The process involved applicants from participating specialities across England, Wales, Scotland, and Northern Ireland, with 69 interview events conducted over 13 weeks using the Video Interviewing and Digital Scoring System (VIDS) (as shown in Fig. 1). The timeframe for the MMIs and accompanying online support spanned six months, with applicant feedback gathered immediately upon completion of the online MMI interview. The collected feedback was analysed throughout June 2023.

Fig. 1. Total interview statistics in England, Wales, Scotland, and Northern Ireland regions (Source: Chad Atkinson, NHSE).

Considerations

Before initiating the online support, several factors were considered:

  1. It was anticipated that applicants may need to access the platform beyond the traditional 9-5 office hours, necessitating out-of-hours online support.
  2. Applicants were given at least five working days’ notice before their interview slot, allowing ample time to test their technical setup using the diagnostic tool.
  3. The support teams needed to know the interview slot details to effectively prioritise incoming queries.
  4. The cost, workload, and staffing requirements were determined by balancing immediate response times with the projection that 10% of the applicants (based on previous data from 69 interview events with 7225 scheduled applicants) would require assistance to get online successfully.
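The 10% projection in point 4 can be sketched as a one-line calculation. A minimal illustration: the support rate and applicant counts come from the text, while the function name is ours.

```python
def expected_support_load(scheduled_applicants: int, support_rate: float = 0.10) -> int:
    """Estimate how many applicants are likely to need help getting online,
    given a projected support rate (10% here, per the pilot planning data)."""
    return round(scheduled_applicants * support_rate)

# On a median interview day of 556 scheduled applicants, roughly
# 56 applicants would be expected to need support.
print(expected_support_load(556))
```

Plugging in the largest daily interview counts reproduces the upper end of the 20 to 165 range of expected support requests reported in the Results.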

Implemented Support Process

The six-month support process (see Fig. 2) that was put in place involved:

  1. Deployment of a diagnostic web tool for applicants, prompting them to verify their technical setup before interviews (as shown in Fig. 3).
  2. Integration of the Pubble-ASK (customer support web chat) tool into both the applicant web pages and the diagnostic web tool.
  3. Activation of Pubble-ASK only when an applicant fails a component of the diagnostic test.
  4. Provision of live, dedicated agents to respond to Pubble-ASK queries.
  5. Introduction of a ‘bypass’ option allowing applicants with poor connections to proceed while making them aware of potential connection problems (such as loss of video/audio or complete disconnection). However, applicants with audio issues were prohibited from proceeding to the interview.
  6. Creation of a Slack command allowing dedicated agents to quickly bypass applicants requiring immediate support.

Fig. 2. Project flowchart (Source: Kylee Fort).

Fig. 3. Support process flowchart.
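The bypass rule in steps 5 and 6 of the support process amounts to a simple gate: video or connection problems can be bypassed with a warning, but a failed audio check blocks entry. A minimal sketch of that logic; the function and field names are illustrative, not the actual VIDS API.

```python
def may_proceed(audio_ok: bool, video_ok: bool, connection_ok: bool) -> tuple[bool, str]:
    """Apply the diagnostic gate described in the support process.

    Applicants with audio failures may not enter the interview;
    video or connection problems can be bypassed with a warning.
    """
    if not audio_ok:
        return False, "Audio check failed: please resolve this before your interview."
    if not (video_ok and connection_ok):
        return True, "Warning: you may lose video or disconnect during the interview."
    return True, "All checks passed."

# An applicant with a weak connection is bypassed with a warning:
print(may_proceed(audio_ok=True, video_ok=True, connection_ok=False)[1])
```

In the live process, an agent's Slack command would flip the equivalent of this bypass flag for an individual applicant.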

Before initiating the pilot programme, all participating NHS staff (administrators, troubleshooters, and panel members) were offered optional training designed to acquaint them with the system. Subsequently, a single-question survey embedded in the web application, titled “How would you rate your experience of the Multiple-Station Video System?”, was disseminated to all applicants immediately after the MMI to ascertain their overall assessment of the Video Interviewing and Digital Scoring (VIDS) system.

Results

Over 13 interview weeks, 64 professional groups conducted 69 interview events during a two-round process, interviewing 7225 applicants via the Video Interviewing and Digital Scoring (VIDS) platform. The median was 105 applicants per event, or 556 applicants per interview day, recorded between the fourth and sixteenth weeks. The estimated number of applicants requiring support fluctuated between 20 and 165, depending on the daily interview count, based on an anticipated 10% need for support. The number of questions asked ranged from 15 to 106, with some applicants posing multiple inquiries. On an average day, based on the number of interviews scheduled, approximately 60 applicants required support, and an average of 55 queries were fielded, often concerning technical aspects of accessing VIDS. Out of 7225 interviewees, 775 were anticipated to require support; however, only 9.3% utilised this assistance, falling below the projected 10% threshold. Each speciality had assigned troubleshooters, with a total workforce of 66 NHS troubleshooters handling queries immediately, averaging around 1.03 staff members per interviewing speciality, or roughly 4.4 per interview day.

Additionally, nine administrators or assessment coordinators were available to manage and coordinate online interviews. On average, an administrator managed approximately 858 interviews over 15 days, corresponding to 57 daily interviews. Qpercom, the VIDS provider, had eight online support agents available round the clock, who oversaw 142 technical bypasses. Furthermore, 22 FAQs were added to the Pubble-ASK FAQ database for user guidance. Based on the single evaluation question, “How would you rate your experience on the Multiple-Station Video System?”, the overall average satisfaction rating was 4.33 out of 5 (SD = 0.08), corresponding to a satisfaction rate of 86.7% (SD = 1.7%).
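The headline staffing ratios follow directly from the raw counts reported above; a quick arithmetic check, using only figures taken from the text:

```python
# Raw counts from the Results section.
troubleshooters = 66       # NHS troubleshooters across all specialities
specialities = 64          # professional groups that interviewed
interview_days = 15        # interview days per administrator
interviews_per_admin = 858 # interviews managed by one administrator

print(round(troubleshooters / specialities, 2))      # ~1.03 troubleshooters per speciality
print(round(troubleshooters / interview_days, 1))    # ~4.4 troubleshooters per interview day
print(round(interviews_per_admin / interview_days))  # ~57 interviews per administrator per day
```

The last figure is the basis for the paper's conclusion that one administrator can manage 57 interviews per day per speciality.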

Discussion

Multiple Mini Interviews (MMIs) provide several advantages for non-medical selection interviews:

  1. Reliability: MMIs are known to be reliable methods of assessing applicants’ non-cognitive skills. Each applicant is tested in different categories by different interviewers, which reduces bias and gives a broad perspective on the candidate’s abilities.
  2. Diversity: MMIs allow assessing a wide range of skills and attributes within the same interview process and provide a more holistic evaluation of an applicant than traditional interviews.
  3. Feasibility: MMIs are feasible and acceptable to both applicants and interviewers. Despite being resource-intensive, their efficiency can lead to good outcomes.
  4. Predictive Validity: MMIs have shown satisfactory predictive validity, meaning they can effectively predict an applicant’s potential to perform in their role.
  5. Broader Application: While MMIs are extensively used in medical fields, they also show promise in other fields. They can assess a wide array of skills and competencies, making MMIs useful for non-medical fields too. They offer a potential alternative to traditional interview methods in diverse areas of job recruitment and selection.
  6. Adaptability: MMIs have proven adaptable in the face of unprecedented circumstances, such as during the pandemic, when they were transformed into remote video MMIs without major disruption.
  7. Efficiency: Use of digital scoring systems like VIDS for MMIs can be more efficient and reduce assessment errors compared to paper-based assessments (Cameron & MacKeigan, 2012; Kelly et al., 2014). They also significantly reduce cost by eliminating the need for physical meeting spaces and travel for interviews.
  8. Beneficial for Applicants: MMIs also have benefits for applicants, providing them with a fair and thorough evaluation system in which they can demonstrate a range of skills and attributes beyond those typically assessed in traditional interviews. It is important to note, though, that while MMIs offer several potential advantages, they also require significant resources for planning, development, implementation, scoring, and evaluation. The success of MMIs also heavily depends on the structure and context of the stations and the quality of the examiners’ training (Callwood et al., 2022; Pau et al., 2013).

Digital scoring and statistical analysis can greatly enhance the Objective Structured Clinical Examination (OSCE) and the Multiple Mini Interview (MMI), as well as the quality of their examiners, in the following ways:

  1. Standardisation: Digital scoring ensures the standardisation of evaluations, reducing discrepancies that may arise from different examiners’ subjective judgments.
  2. Accuracy: Digital scoring eliminates the possibility of human error that might occur with manual calculations, providing more accurate and reliable results.
  3. Clear Metrics: Digital scoring tools offer clear scoring metrics, which guide examiners in their ratings and ensure that each applicant is assessed using the same criteria.
  4. Instant Analysis: Statistical analysis can be conducted immediately after the MMI, offering insightful and real-time feedback. This can reveal areas of the MMI process that may require improvement.
  5. Quality Assurance: Training trainers through online modules can improve their understanding and implementation of MMIs. Statistical feedback on their scoring behaviour can improve reliability over time, contributing to ongoing quality assurance.
  6. Examiner Calibration: With digital scoring, it’s easier to track examiners’ scoring trends and identify discrepancies among raters. This allows for appropriate recalibration and training, improving examiner consistency and reliability.
  7. Transparency and Fairness: Digital scoring and analysis offer more transparency, as the scores can be reviewed and justified easily and quickly. Since it reduces subjective bias, it contributes to a fairer and more equitable selection process.
  8. Efficiency: Using digital scoring systems and immediate statistical analysis significantly reduces the time spent on post-assessment administration and speeds up the decision-making process, making the selection process more efficient. By ensuring that the evaluation process is transparent, standardised, and supported by real-time data, digital scoring and statistical analysis add significant value to the MMI process and its examiners (Callwood et al., 2022; Van Der Want et al., 2021).

Support Required for VIDS Rollout in Recruitment Teams

The support provided to NHSE/MDRS during the rollout of VIDS for recruitment teams between January and May 2023 was substantial. From a staffing perspective, at least one assessment coordinator/administrator and one troubleshooter were required per assessment event during the pilot stage. This level of support was necessary because all users were becoming familiar with VIDS for the first time. During the pilot stage, user satisfaction of 86.7% exceeded expectations (see Fig. 4). Additionally, VIDS allowed interviewees to join their interviews from the comfort of their homes or office spaces, eliminating the need for additional office space in assessment centres and reducing travel expenses. Furthermore, using VIDS also potentially reduced the carbon footprint, although data on this is currently unavailable.

Fig. 4. System ratings among different participants (Source: Chad Atkinson, NHSE).

Adopting VIDS in recruitment processes is proving to be beneficial in terms of resource allocation and convenience for both applicants and NHSE. Using multiple mini-interviews (2–4 stations) significantly reduces the need for assessment centres and travel costs for participants. However, it is important to note that implementing VIDS also requires additional staffing and technical support, especially for NHSE applicants using the system for the first time. The findings of this study suggest that one administrator can effectively manage 57 interviews per day per speciality, while one technical support person per speciality is required to provide 24/7 support before and on the day(s) of the interviewing event (Qpercom, n.d.). These findings provide valuable insights for planning remote multiple mini-interviewing events within the NHS.

Feedback and Recommendations for Improvement

Throughout the pilot stage, valuable feedback was gathered from applicants and recruitment staff on the VIDS process and remote multiple mini-interviews. It was found that 25% of medical applicants and 17% of dental applicants reported experiencing issues during the process (see Fig. 5). These issues varied depending on the interviewing date, with a peak occurring around three-quarters of the way through the process, likely due to the high number of applicants and the resulting increase in technical difficulties. To address these issues, prompt messages were sent to all users, informing them of stage changes or the reasons for any delays. These prompts were well received, with 87.12% of dental applicants and 96.12% of medical applicants finding them helpful.

Fig. 5. Number of issues reported among medical and dental applicants. (Note: The timeframe between the two groups of applicants differs. Source: Chad Atkinson, NHSE).

The training provided prior to the start of the pilot was also deemed helpful by those who participated. However, there is room for improvement in its delivery: offering both “quick guide training” and “in-depth training” after the first recruitment event(s) may enhance the learning experience for administrators. Participants reported that the VIDS system was easy to use and ran smoothly overall. The technical team has considered the suggestions for improvement and implemented them in VIDS. One area that still requires attention is resolving outstanding reflection points before the next official recruitment rounds in 2024.

It is also important to track user feedback and promptly address any issues. Furthermore, there were a few issues related to communication between agents and applicants regarding access to Pubble-ASK and the FAQs. To address this, it is recommended to provide clear instructions and examples of what applicants will see and need to do to resolve their problem, in a blog post or other easily accessible resource. In general, the pilot phase of the VIDS system and remote multiple mini-interviews had several advantages. VIDS allowed almost all interviewees to participate in their interviews from the comfort of their homes or offices. This not only provided convenience for the participants but also eliminated the need for renting additional office space in assessment centres and reduced travel expenses for participants who did not need to travel. Moreover, these changes likely reduced the carbon footprint associated with the recruitment process.

The implementation of the VIDS system during the pilot phase of remote multiple mini-interviews for NHSE/MDRS recruitment teams was met with substantial support and a high level of user satisfaction. The support provided between January and May 2023 was extensive and went beyond the use of the video multiple mini-interviewing digital scoring system itself: it included at least one assessment coordinator/administrator and one troubleshooter per assessment event, highlighting the importance of adequate staffing for a successful implementation and smooth operation of the VIDS system.

The user satisfaction of 86.7% exceeded expectations, considering that this was the first time all users were becoming familiar with VIDS (Qpercom, n.d.). The ability of almost all interviewees to attend their interviews from the comfort of their homes or office spaces added to their overall satisfaction (Qpercom, n.d.). Furthermore, eliminating the need for additional office space in assessment centres and reducing travel expenses for participants who did not need to travel were significant advantages of implementing remote multiple mini-interviews using VIDS, alongside the associated cost savings and the potential reduction in the carbon footprint. The convenience for participants was considerable: joining interviews from home or the office removed the cost and inconvenience of travelling to assessment centres, while the NHSE avoided the expense of renting additional office space.

In conclusion, valuable insights were gained through participant feedback and observations during the pilot phase and rollout of the support system to all four nations. As a result, several improvements have been implemented. These include scaling cloud resources to handle increased web traffic, improving performance significantly. Moreover, administrators can now bypass applicants facing connection issues on demand, simplifying the process and enhancing user experience while ensuring WCAG accessibility compliance. Other enhancements include:

  • An extended audit trail for capturing more detailed information,
  • Video recording for post-assessment analysis in case of technical problems,
  • Shared screen assistance for troubleshooting issues.

Moving forward, our goal is to continue gathering end-user feedback and applying design thinking principles to further enhance efficiency in remote multiple-mini-interviews.

References

  1. Callwood, A., Gillam, L., Christidis, A., Doulton, J., Harris, J., Piano, M. (2022). Feasibility of an automated interview grounded in multiple mini interview (MMI) methodology for selection into the health professions: An international multimethod evaluation. BMJ Open, 12(2), e050394. https://doi.org/10.1136/bmjopen-2021-050394.
  2. Cameron, A. J., & MacKeigan, L. D. (2012). Development and pilot testing of a multiple mini-interview for admission to a pharmacy degree program. American Journal of Pharmaceutical Education, 76(1), 10. https://doi.org/10.5688/ajpe76110.
  3. Eva, K. W., Rosenfeld, J., Reiter, H. I., & Norman, G. R. (2004). An admissions OSCE: The multiple mini-interview. Medical Education, 38(3), 314–326. https://doi.org/10.1046/j.1365-2923.2004.01776.x.
  4. Freire, C., & Barbosa, I. (2023). Assessing graduates’ transversal competences through an adapted MMI model: Confidant interview vs. stress interview. Education + Training, 65(1), 146–162. https://doi.org/10.1108/ET-05-2022-0195.
  5. Green, D. P., & Lindley, J. (2022). The interview box: Notes on a prototype system for video-recording remote interviews. 2022 ACM Designing Interactive Systems Conference: Digital Wellbeing, DIS 2022, 1044–1057. https://doi.org/10.1145/3532106.3533504.
  6. Harden, R. M., & Gleeson, F. A. (1979). Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13(1), 39–54. https://doi.org/10.1111/j.1365-2923.1979.tb00918.x.
  7. Kelly, M. E., Dowell, J., Husbands, A., Newell, J., O’Flynn, S., Kropmans, T. (2014). The fairness, predictive validity and acceptability of multiple mini interview in an internationally diverse student population—A mixed methods study. BMC Medical Education, 14(1), 267. https://doi.org/10.1186/s12909-014-0267-0.
  8. Kropmans, T. J., O’Donovan, B. G., Cunningham, D., Murphy, A. W., Flaherty, G., Nestel, D. (2011). An online management information system for objective structured clinical examinations. Computer and Information Science, 5(1), 38. https://doi.org/10.5539/cis.v5n1p38.
  9. NHSE. (n.d.). Qpercom Recruit: Qpercom Recruit will be used by a range of specialties to conduct virtual interviews. https://medical.hee.nhs.uk/medical-training-recruitment/medical-specialty-training/overview-of-specialty-training/qpercom-recruit.
  10. O’Brien, A., Harvey, J., Shannon, M., Lewis, K., & Valencia, O. (2011). A comparison of multiple mini-interviews and structured interviews in a UK setting. Medical Teacher, 33(5), 397–402. https://doi.org/10.3109/0142159X.2010.541532.
  11. Pau, A., Jeevaratnam, K., Chen, Y. S., Fall, A. A., Khoo, C., & Nadarajah, V. D. (2013). The multiple mini-interview (MMI) for student selection in health professions training—A systematic review. Medical Teacher, 35(12), 1027–1041. https://doi.org/10.3109/0142159X.2013.829912.
  12. Qpercom Ltd (2020, December). Time constrained examinations: Using video to assess students remotely. https://www.qpercom.com/time-constrained-examinations-using-video-to-assess-students-remotely.
  13. Qpercom. (n.d.). Review NHSE 2023 Support Plan & Feedback (unpublished internal report).
  14. Santos, S., Barbosa, I., Freire, C., Figueiredo, H., & Costa, M. J. (2020). The multiple mini-interviews as a method to assess transversal competencies for the graduate job market: A pilot study. pp. 4144–4153. https://doi.org/10.21125/iceri.2020.0929.
  15. Van Der Want, A. C., Bloemendaal, P. M., & Van Der Hage, J. A. (2021). Examiners’ perceptions in surgical education: The blind spot in the assessment of OSCEs. Journal of Surgical Education, 78(2), 590–596. https://doi.org/10.1016/j.jsurg.2020.07.024.
  16. Yusoff, M. S. B. (2019). Multiple mini interview as an admission tool in higher education: Insights from a systematic review. Journal of Taibah University Medical Sciences, 14(3), 203–240. https://doi.org/10.1016/j.jtumed.2019.03.006.

