Journal of Clinical and Diagnostic Research, ISSN - 0973 - 709X
Education Section DOI: 10.7860/JCDR/2021/46288.14593
Year: 2021 | Month: Mar | Volume: 15 | Issue: 03 | Page: JI01-JI04

Evolution of Objective Structured Clinical Examination- Actual to Virtual

Tripti K Srivastava Waghmare1, Lalitbhushan S Waghmare2

1 Professor, Department of Physiology, Datta Meghe Institute of Medical Sciences, Wardha, Maharashtra, India.
2 Professor, Department of Physiology, Datta Meghe Institute of Medical Sciences, Wardha, Maharashtra, India.


NAME, ADDRESS, E-MAIL ID OF THE CORRESPONDING AUTHOR: Tripti K Srivastava Waghmare, Jawaharlal Nehru Medical College, Wardha, Maharashtra, India.
E-mail: drtriptisrivastava@gmail.com
Abstract

Objective Structured Clinical/Practical Examination (OSCE/OSPE) has come a long way since its inception by Harden RM in 1975. Literature offers many studies and reviews about its applicability in assessment and its impact on learning outcomes. The present review traces the variants of OSCE as it has evolved over time with changing needs and contexts, while still being sought after as one of the most valid tools for assessment of clinical and professional skills. The article reviews the various forms, viz., Group OSCE, Team OSCE, Objective Structured Assessment of Technical Skills, Shadow OSCE, Inter-professional OSCE, Inter-professional OSCE with Allied Embedded Actors, Reverse OSCE, Culture OSCE, e-OSCE, Tele OSCE and Virtual OSCE, in terms of method and utility. In spite of the numerous variations, the principle of OSCE remains intact, i.e., an effective tool to foster learning and attainment of clinical/practical competencies through direct observation of skills and timely, developmental feedback. The manuscript also gives an example of a blueprint for assessment of skills wherein such OSCE variations can be planned depending on the nature and objective of assessment.

Keywords

Introduction

The OSCE/OSPE was first described by Harden RM et al., in 1975 to overcome the limitations of the traditional clinical examination (long and short cases) and to improve its validity and reliability [1]. The traditional examination suffered from poor reproducibility and validity, inter-case variance, non-standardised scoring and no direct assessment of history taking, physical examination, laboratory procedures etc. To address these limitations, the OSCE, as envisaged by Harden RM, comprised a series of stations with tasks of equal duration, each encompassing one or more domains of learning, wherein students could be assessed on a competency against a standardised checklist [1]. Khan KZ et al., described the OSCE as “An assessment tool based on the principles of objectivity and standardisation, in which the candidates move through a series of time-limited stations in a circuit for assessment of professional performance in a simulated environment. At each station, candidates are assessed and marked against standardised scoring rubrics by trained assessors” [2].

OSCE targets the ‘Shows How’ level of Miller’s pyramid of assessment in a simulated environment and has, since its inception, gained credence because of its objectivity and scope for multiple learning opportunities [3-5]. The past decade has revolutionised its role, particularly as a formative assessment tool for identifying learning gaps, tailoring feedback on student performance and making need-based instructional adjustments [6-10]. Since its origin in 1975, the medical world has remodelled the OSCE to suit contextual needs. The aim of this review was to explore the variations that have been coined and piloted with significant observations. The manuscript critically reviews these variations in terms of their applicability, utility and reproducibility in different settings.

A conventional OSCE: Typically, an OSCE/OSPE comprises 10-20 individual stations that sample across a wide range of clinical or practical competencies, as depicted in [Table/Fig-1] [11,12].

Range of Skills assessed in OSCE/OSPE [11,12].

Communication and professionalism skills

History taking skills

Physical examination skills

Practical/technical skills

Clinical-reasoning skills

Clinical decision making including differential diagnosis

Interpretation of clinical findings and investigations

Management of a clinical situation including treatment and referral

Patient education

Health promotion

Clinical problem solving skills

Acting safely and appropriately in an emergency clinical situation

Critical thinking in therapeutic management

Team skills

Interdisciplinary health care management


Journey of OSCE from Actual to Virtual

The exploration of this valid and reliable assessment tool has been substantial despite its limitation of being resource and time intensive. In view of its strengths and feasibility, most of the variants have evolved to address a wide range of competencies in health care and to adapt its utility more as a learning tool. Some of the trialled variants are listed below.

Shadow OSCE [13]

In the Shadow OSCE, the examiner (observer) moves along with the learner through all stations and completes the checklists for all stations. Thus, the observer acts as a shadow of the examinee and has first-hand information about the performance of a particular examinee across all stations of the OSCE. The essential structure of the OSCE is maintained in this strategy. The observer is trained to observe the selected tasks, give feedback and elicit reflection. Structured feedback is given to the examinee at the end of the OSCE for all the stations. The feedback is specifically targeted to reinforce correct approaches, identify gaps and offer suggestions for improvement in performance.

Utility: The shadow examiner enables optimised, targeted feedback, which is the hallmark of the formative nature of the OSCE. These modifications do not lead to any important bias in students’ scores. This strategy may provide important insights for formative assessment of clinical performance.

Challenge/shortfall: Observer (examiner) fatigue and the reliance on a single observer compromise the reliability of this method to some extent.

Group Objective Structured Clinical/Practical Examination/Group Observed Procedural Examination (GOSCE) [10,14,15]

In GOSCE, examinees are assigned to groups of four to five rather than appearing individually as in the traditional OSCE. The groups rotate around the OSCE stations and members take turns performing the assigned tasks at subsequent stations. The examinee who has performed the assigned task at a given station narrates his/her findings to the observer, whilst the rest of the group members observe. Later, the observer can ask other group members to contribute. GOSCE offers the advantage of optimal time and resource utilisation and also provides an opportunity for examinees to learn by observing each other.

Utility: The GOSCE is an efficient, learner centred, experiential learning approach for assessing communication skills, clinical reasoning, self-assessment and giving feedback in a low-resource setting.

Challenge/shortfall: The duration at each station may need to be increased in GOSCE, and preparation of the checklist requires meticulous planning.

Team OSCE [16]

The Team OSCE provides an opportunity to assess skills necessary for working in a health care team: understanding the role of other team members, shared decision-making, paraphrasing, maintaining confidentiality and empathy. It involves role play in a multidisciplinary team by trainees, who take on the role of a health care worker other than their own specialty; for example, a psychiatry trainee plays the role of a psychiatric social worker.

Utility: The approach can be of significance in any specialty where team approaches are needed (palliative care, chronic illnesses, road traffic accidents). It can help promote understanding of the roles of other team members, shared decision-making, problem-solving, handling unexpected events, giving feedback and closure.

Challenge/shortfall: A few practice sessions are required before the actual conduct of a Team OSCE to avoid confusion amongst examinees. Preparation of the checklist requires meticulous planning in this method.

Objective Structured Assessments of Technical Skills (OSATS) [17,18]

OSATS is a structured assessment of surgical skills in the operating room or laboratory. It was first used by the University of Toronto in the 1990s as a multi-station, performance-based examination of surgical skills. Conventionally, it consisted of eight stations of 15 minutes each (total two hours), with bench-model simulation of excision of a skin lesion, insertion of a T-tube, abdominal wall closure, hand-sewn bowel anastomosis, stapled bowel anastomosis, control of Inferior Vena Cava (IVC) haemorrhage, pyloroplasty, and tracheostomy.

Typically, the skills are evaluated on two components: an operation-specific checklist and a global rating scale. The global rating scale consists of seven evaluation items scored on a 5-point scale: 1) Respect for tissue; 2) Time and motion; 3) Instrument handling; 4) Knowledge of instruments; 5) Flow of operation; 6) Use of assistant; and 7) Knowledge of the specific procedure [19].
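As a purely illustrative sketch (not drawn from the OSATS instrument itself), the arithmetic of the global rating component can be thought of as a simple tally of the seven 5-point items; the function and variable names below are hypothetical:

```python
# Hypothetical sketch of tallying an OSATS-style global rating.
# The seven items follow the list above; each is scored 1-5, so the
# possible total ranges from 7 to 35.
GLOBAL_RATING_ITEMS = [
    "Respect for tissue",
    "Time and motion",
    "Instrument handling",
    "Knowledge of instruments",
    "Flow of operation",
    "Use of assistant",
    "Knowledge of the specific procedure",
]

def global_rating_total(scores):
    """Return the summed global rating for one trainee."""
    missing = set(GLOBAL_RATING_ITEMS) - set(scores)
    if missing or not all(1 <= s <= 5 for s in scores.values()):
        raise ValueError("every item must be scored once, on a 1-5 scale")
    return sum(scores[item] for item in GLOBAL_RATING_ITEMS)

# Example: a trainee scoring 4 on every item totals 28 out of 35.
print(global_rating_total({item: 4 for item in GLOBAL_RATING_ITEMS}))
```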

Utility: OSATS can be used to evaluate and teach both basic and complex skills to residents. The global rating scale can be applied to any skill assessment since it is generic in nature.

Challenge/shortfall: It is limited to a few chosen skills within a single discipline.

Inter-professional Objective Structured Clinical Examination (IP-OSCE) [20]

The IP-OSCE is informative in assessing interprofessional skills that involve teamwork and communication. Typically, in an IP-OSCE, a nurse and a medical student collaborate in caring for a standardised patient in an ambulatory setting.

Utility: It can assess the interprofessional knowledge, communication, teamwork and problem-solving ability of learners in ambulatory settings. The particular value of the IP-OSCE is that it can be placed within a conventional OSCE to add a dimension of collaborative practice.

Challenge/shortfall: A few practice sessions are required before the actual conduct of an IP-OSCE to avoid confusion amongst examinees. Preparation of the checklist requires meticulous planning in this method.

Inter-professional Objective Structured Clinical Examination with Allied Embedded Actors (AEA-IP OSCE) [21]

This is a variant of the IP-OSCE wherein Allied Embedded Actors (AEAs) are individuals who simulate the role of a pharmacist, social worker, technician etc. These role actors are trained to depict real-life scenarios, complex situations, specific challenges, interprofessional teamwork issues, or any additional information necessary in patient care [22,23]. Guided feedback can be obtained using the Individual Teamwork Observation and Feedback Tool (iTOFT), a checklist designed for live observation of teamwork behaviours and skills in simulated situations.

Utility: This can serve as a useful and effective method for incorporating Inter-professional Education (IPE) into the medical curriculum. It provides a real-life, immersive experience of working in an interprofessional team, understanding a complex clinical situation from multiple perspectives and acting efficiently and collaboratively for better health care outcomes.

Challenge/shortfall: It is a resource-intensive method, and training of allied embedded actors requires considerable effort.

Reverse OSCE (ROSCE) [24]

In this method, the roles of tutor and student are reversed. The tutor demonstrates a given task at a particular station, whereas a group of students act as observers. The tutor purposely misses a step and/or embeds an erroneous step while performing the assigned task at the respective station. These fallacies are pre-decided. The student has to record any deviation from the standard sequence of steps or identify the erroneous step. Apart from identifying the fallacies, the student has to reflect upon their impact and also mention the correct approach. A significant attribute of the Reverse OSCE is the inclusion of a final “Reflect station”, wherein the student has to reflect upon and record the details of their observations. Observation of communication, counselling and attitudinal skills can also be included in the Reverse OSCE.

Utility: The Reverse OSCE can be used as an assessment tool in situations where the student is to be assessed on thoroughness regarding a standard exercise through observational and reflective skills. It can assess 8-10 students at a time at one station, thus reducing the time of assessment. Since the students are observers in the Reverse OSCE, technicians, residents and junior staff can take the role of demonstrators.

Challenge/shortfall: Embedding erroneous steps or planning a missed step in ROSCE requires considerable thought and training of demonstrators.

Peer-led Multi Role Practice OSCE (PrOSCE) [25]

It is a series of entirely peer-led, multi-role practice OSCEs (PrOSCEs) offered as an alternative to large-scale mock OSCEs. The PrOSCEs are designed to closely replicate real OSCEs. In practice OSCEs, the students take the roles of examinee, observer and patient in turn, entirely for practice, throughout the curriculum. This ensures that students get sufficient opportunities to practise the necessary skills in a safe environment without resource constraints.

Utility: The primary objective of PrOSCEs is to provide a low-cost, low-administrative-burden format for OSCE practice. It is resource efficient and easy to replicate. It is envisaged that the multi-role aspect of this approach can enable the learner to understand a clinical condition from the patient’s perspective and nurture empathy.

Challenge/shortfall: Practice sessions in PrOSCE are time intensive.

Culture OSCE [6,26]

The Culture OSCE was proposed on the principle that health care training involves dealing with different cultural and ethnic groups, and that inculcation of culturally relevant competencies is a must for a health professional. In this method, standardised patients from the relevant cultural groups, or those trained to understand specific cultural issues, are recruited, and relevant clinical conditions are set at the OSCE stations. Costumes and props are also utilised to enhance the authenticity of the encounter. Observers are specifically trained to observe such encounters, using a rating scale that includes communication and cultural skills, and to give constructive feedback.

Utility: The objective of the Culture OSCE is to provide an opportunity to develop cultural competency skills, which are an integral part of training in health care from a global perspective.

Challenge/shortfall: Practice sessions in Culture OSCE are time intensive.

e-OSCE [27-29]

e-OSCE addresses one of the biggest challenges of the OSCE: manual entry and analysis of scores at each station. The student’s record, scoring, analysis and feedback are entered into a tablet (e.g., iPad) connected to the main server. A considerable number of person-hours is saved with e-OSCE, and a customised score sheet with feedback can be shared with the learner in digital form.
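As a minimal, hypothetical sketch of what such a digitally captured station record might look like (the field names and structure below are illustrative assumptions, not taken from any particular e-OSCE software):

```python
# Hypothetical sketch of a single e-OSCE station record as it might be
# captured on a tablet and sent to a central server; all field names are
# illustrative assumptions.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class StationRecord:
    student_id: str
    station: str
    checklist: dict            # checklist item -> marks awarded
    examiner_feedback: str = ""
    total: float = field(init=False)

    def __post_init__(self):
        # Compiled automatically, removing post-examination manual tallying.
        self.total = sum(self.checklist.values())

record = StationRecord(
    student_id="2021/015",
    station="Blood pressure measurement",
    checklist={"Explains procedure": 1, "Correct cuff placement": 2,
               "Reads systolic/diastolic correctly": 2},
    examiner_feedback="Re-check cuff size selection.",
)
# The serialised record could be pushed to the main server and later
# merged into the student's customised digital score sheet.
print(json.dumps(asdict(record), indent=2))
```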

Utility: e-OSCE reduces post-examination manual compilation of data, minimises potential errors in managing scores and enables a quick and efficient feedback mechanism.

Challenge/shortfall: e-OSCE requires extensive pre-examination data entry into the software; observers need to be trained in handling the technology and in customising digital feedback for every learner.

Tele OSCE [30]

In the Tele OSCE, the examinee, observer and standardised patient participate from different locations through a digital platform such as Webex, Zoom, Blackboard, LearningSpace etc. In one variant, the student and observer are at the same location and the standardised patient joins from a different location via a computer screen.

Utility: Students learn core concepts of practising patient-centred medicine using technology while also gaining the novel experience of a telemedicine encounter. Appropriate communication and clinical reasoning skills in a virtual encounter are emphasised.

Challenge/shortfall: The standardised patient, observer and student need to be trained for effective implementation of this tool on a digital platform.

Virtual OSCE (VOSCE) [31]

Virtual OSCEs offer a range of variations for designing OSCE stations. A 3-D interactive virtual OSCE station allows the user to “live in” the environment and interact via his or her graphic representation, or “avatar”, utilising virtual reality for an immersive experience. Virtual worlds are engaging, media-rich simulation environments that promote “experiential learning” and have been used for training and skills assessment. In virtual OSCEs, case-scenario-based stations can be gamified and accessed remotely by students [32-35].

Utility: The Virtual OSCE is especially useful for assessing skills that require specific situational contexts, such as emergency management, team-based skills, disaster management and cultural competencies. Avatars can serve as an alternative to standardised patients; they can be programmed in multiple ways and are consistent in presenting a range of clinical conditions.

Challenge/shortfall: The Virtual OSCE requires prior training and sophisticated equipment for learners to immerse themselves in the 3-D virtual world.

An OSCE for All Seasons!

With increasing evidence of high validity, OSCEs have become more sophisticated and portray more realistic clinical scenarios. It is a tool that has evolved with changing needs and contextual variations. A look at the body of evidence exploring its utility for learning and assessment confirms that it possesses the attributes of a good and useful tool as described by Van der Vleuten CP, viz., the five criteria of reliability, validity, educational impact, cost efficiency and acceptability [36].

OSCE variations, as described above, can be considered while planning an OSCE for assessment of learning. It is recommended that a judicious admixture of the various forms may ensure a more holistic assessment of the competencies of a clinician, leader and member of the health care team, communicator and professional [37]. [Table/Fig-2] depicts an OSCE blueprint for planning an OSCE/OSPE on the cardiovascular system in Physiology, depending upon the skill that needs to be assessed. It gives a range of options that can be considered for assessing a chosen skill at the student’s level of learning.

Blueprint for choosing OSCE variations for assessing different skills (* denotes that the OSCE variant is suitable for assessing that skill).

Skills to be assessed vs. Type of OSCE
Types of OSCE: Conventional (with SP/response station/question station); GOSCE/Team OSCE; IP-OSCE; AEA-IP-OSCE; PrOSCE; Shadow OSCE; ROSCE; Culture OSCE; OSATS; e-OSCE; Tele/Virtual OSCE
History taking: ******
Psychomotor skill/Physical examination/specific surgical skill: *******
Communication/Attitudinal/Consultation: *******
Problem solving: *****
Cognitive skills: **
Clinical decision making: *****
Critical thinking/problem solving: ******
Interprofessional skills: **
Practice sessions: *

Professional year: 1st MBBS; Subject: Physiology; Theme/System: Cardiovascular system. OSCE: Objective structured clinical examination; GOSCE: Group objective structured clinical examination; IP-OSCE: Interprofessional objective structured clinical examination; AEA: Allied embedded actors; PrOSCE: Peer-led multi-role practice objective structured clinical examination; ROSCE: Reverse objective structured clinical examination; OSATS: Objective structured assessment of technical skills; e-OSCE: Electronic objective structured clinical examination


Conclusion(s)

Irrespective of the design of the OSCE, its basic intent remains the same: an assessment tool to foster learning. Depending upon the intent and objective of the OSCE, the different forms can be mixed and matched or used in isolation. Whether it is adopted in its conventional form or in any variation, the chronology of impactful implementation remains consistent, i.e., diligent planning, adequate preparation and smooth implementation that involves direct observation of skills, viz., psychomotor, affective, communication, attitudinal, problem solving, clinical reasoning, critical reasoning, inter-professional etc., and timely, specific and developmental feedback. Further studies should focus more on the impact of these different forms of OSCE, weighed against their feasibility, in achieving intended outcomes.

References

[1] Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. BMJ. 1975;1(5955):447-51. DOI: 10.1136/bmj.1.5955.447. PMID: 1115966.

[2] Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: An historical and theoretical perspective. Med Teach. 2013;35(9):e1437-46. DOI: 10.3109/0142159X.2013.818634. PMID: 23968323.

[3] Miller G. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65(9):S63-67.

[4] Norcini JJ. ABC of learning and teaching in medicine: Work based assessment. BMJ. 2003;326(7392):753-55. DOI: 10.1136/bmj.326.7392.753. PMID: 12676847.

[5] Brazeau C, Boyd L, Crosson JC. Changing an existing OSCE to a teaching tool: The making of a teaching OSCE. Academic Medicine. 2002;77(9):932. DOI: 10.1097/00001888-200209000-00036. PMID: 12228103.

[6] Aeder L, Altshuler L, Kachur E, Barrett S, Hilfer A, Koepfer S. The “Culture OSCE” - Introducing a formative assessment into a postgraduate program. Educ Health. 2007;20(1):11.

[7] Lele SM. A mini-OSCE for formative assessment of diagnostic and radiographic skills at a dental college in India. Journal of Dental Education. 2011;75(12):1583-89. DOI: 10.1002/j.0022-0337.2011.75.12.tb05218.x. PMID: 22184597.

[8] Browne G, Bjelogrlic P, Issberner J, Jackson C. Undergraduate student assessors in a formative OSCE station. Medical Teacher. 2013;35(2):170-71. DOI: 10.3109/0142159X.2012.737060. PMID: 23137245.

[9] Pugh D, Desjardins I, Eva K. How do formative objective structured clinical examinations drive learning? Analysis of residents’ perceptions. Medical Teacher. 2017;40(1):45-52. DOI: 10.1080/0142159X.2017.1388502. PMID: 29037098.

[10] Sulaiman ND, Shorbagi SI, Abdalla NY, Daghistani MT, Mahmoud IE, Al-Moslih AM. Group OSCE (GOSCE) as a formative clinical assessment tool for pre-clerkship medical students at the University of Sharjah. J Taibah Univ Med Sc. 2018;13(5):e409-14. DOI: 10.1016/j.jtumed.2018.06.003. PMID: 31435356.

[11] Gormley G. Summative OSCEs in undergraduate medical education. Ulster Med J. 2011;80(3):127-32.

[12] Harden RM. What is an OSCE? Med Teach. 1988;10(1):19-22. DOI: 10.3109/01421598809019321. PMID: 3221760.

[13] Rodrigues MAV, Olmos RD, Kira CM, Lotufo PA, Santos IS, Tibério IFLC. “Shadow” OSCE examiner. A cross-sectional study comparing the “shadow” examiner with the original OSCE examiner format. Clinics (Sao Paulo). 2019;74:e1502. DOI: 10.6061/clinics/2019/e1502. PMID: 31721909.

[14] Ludwig AB, Raff AC, Lin J, Schoenbaum E. Group observed structured encounter (GOSCE) for third-year medical students improves self-assessment of clinical communication. Med Teach. 2017;39(9):931-35. DOI: 10.1080/0142159X.2017.1332361. PMID: 28553735.

[15] Konopasek L, Kelly KV, Bylund CL, Wenderoth S, Storey-Johnson C. The Group Objective Structured Clinical Experience: Building communication skills in the clinical reasoning context. Patient Educ Couns. 2014;96(1):79-85. DOI: 10.1016/j.pec.2014.04.003. PMID: 24882085.

[16] Sharma MK, Chandra PS, Chaturvedi SK. Team OSCE: A teaching modality for promotion of multidisciplinary work in mental health settings. Indian J Psychol Med. 2015;37(3):327-29. DOI: 10.4103/0253-7176.162954. PMID: 26664082.

[17] Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(3):226-30. DOI: 10.1016/S0002-9610(97)89597-9.

[18] Martin JA, Regehr G, Reznick R. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-78. DOI: 10.1046/j.1365-2168.1997.02502.x. PMID: 9052454.

[19] Niitsu H, Hirabayashi N, Yoshimitsu M, Mimura T, Taomoto J, Sugiyama Y. Using the Objective Structured Assessment of Technical Skills (OSATS) global rating scale to evaluate the skills of surgical trainees in the operating room. Surg Today. 2013;43(3):271-75. DOI: 10.1007/s00595-012-0313-7. PMID: 22941345.

[20] Shinnick M, O’Gara E. Interprofessional Objective Structured Clinical Examination (IPOSCE). 2015. https://apps.medsch.ucla.edu/ipe/docs/9A_ip_osce_background_guide_FINAL.pdf. Accessed July 27, 2020.

[21] Cyr PR, Schirmer JM, Hayes V, Martineau C, Keane M. Integrating interprofessional case scenarios, allied embedded actors, and teaching into formative observed structured clinical exams. Fam Med. 2020;52(3):209-12. DOI: 10.22454/FamMed.2020.760357. PMID: 32159833.

[22] Nestel D, Mobley BL, Hunt EA, Eppich WJ. Confederates in health care simulations: Not as simple as it seems. Clinical Simulation in Nursing. 2014;10(12):611-16. DOI: 10.1016/j.ecns.2014.09.007.

[23] Sanko JS, Shekhter I, Kyle RR Jr, Di Benedetto S, Birnbach DJ. Establishing a convention for acting in healthcare simulation: Merging art and science. Simul Healthc. 2013;8(4):215-20. DOI: 10.1097/SIH.0b013e318293b814. PMID: 23884448.

[24] Srivastava T, Waghmare L. Reverse Objective Structured Clinical Examination (ROSCE). Can Med Educ J. 2018;9(4):e138-41. DOI: 10.36834/cmej.43220.

[25] Bevan J, Russell B, Marshall B. A new approach to OSCE preparation - PrOSCEs. BMC Med Educ. 2019;19(1):126. DOI: 10.1186/s12909-019-1571-5. PMID: 31046773.

[26] Lisa A, Elizabeth K. A Culture OSCE: Teaching residents to bridge different worlds. Academic Medicine. 2001;76(5):514. DOI: 10.1097/00001888-200105000-00045. PMID: 11346552.

[27] Snodgrass SJ, Ashby SE, Rivett DA, Russell T. Implementation of an electronic Objective Structured Clinical Exam for assessing practical skills in pre-professional physiotherapy and occupational therapy programs: Examiner and course coordinator perspectives. Australasian Journal of Educational Technology. 2014;30(2):152-66. DOI: 10.14742/ajet.348.

[28] Schmitz FM, Zimmermann PG, Gaunt K, Stolze Sissel M, Schär G. Electronic rating of objective structured clinical examinations: Mobile digital forms beat paper and pencil checklists in a comparative study. In: Holzinger A, Simonic KM (eds). Information Quality in e-Health. USAB. Lecture Notes in Computer Science. 2011;7058:501-12. DOI: 10.1007/978-3-642-25364-5_35.

[29] Palarm T, Griffiths M, Phillips R. The design, implementation and evaluation of electronic objective structured clinical examinations in diagnostic imaging: An ‘action research’ strategy. Journal of Diagnostic Radiography and Imaging. 2004;5(2):79-87. DOI: 10.1017/S1460472804000045.

[30] Cantone RE, Palmer R, Dodson LG, Biagioli FE. Insomnia Telemedicine OSCE (TeleOSCE): A simulated standardised patient video-visit case for clerkship students. MedEdPORTAL. 2019;15:10867. DOI: 10.15766/mep_2374-8265.10867. PMID: 32051850.

[31] Andrade AD, Cifuentes P, Oliveira MC, Anam R, Roos BA, Ruiz JG. Avatar-mediated home safety assessments: Piloting a virtual objective structured clinical examination station. J Grad Med Educ. 2011;3(4):541-45. DOI: 10.4300/JGME-D-11-00236.1. PMID: 23205205.

[32] Boulos MNK, Hetherington L, Wheeler S. Second Life: An overview of the potential of 3-D virtual worlds in medical and health education. Health Info Libr J. 2007;24(4):233-45. DOI: 10.1111/j.1471-1842.2007.00733.x. PMID: 18005298.

[33] Mantovani F, Castelnuovo G, Gaggioli A, Riva G. Virtual reality training for health-care professionals. Cyberpsychol Behav. 2003;6(4):389-95. DOI: 10.1089/109493103322278772. PMID: 14511451.

[34] Creutzfeldt J, Hedman L, Medin C, Wallin CJ, Felländer-Tsai L. Effects of repeated CPR training in virtual worlds on medical students’ performance. Stud Health Technol Inform. 2008;132:89-94. PMID: 18391263.

[35] Heinrichs WLR, Youngblood P, Harter PM, Dev P. Simulation for team training and assessment: Case studies of online training with virtual worlds. World J Surg. 2008;32(2):161-70. DOI: 10.1007/s00268-007-9354-2. PMID: 18188640.

[36] Van der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41-67. DOI: 10.1007/BF00596229. PMID: 24178994.

[37] https://www.mciindia.org/CMS/e-gazette (last accessed on 23rd July 2020).