Journal of Clinical and Diagnostic Research, ISSN: 0973-709X
Community Medicine Section | DOI: 10.7860/JCDR/2017/24884.10607
Year: 2017 | Month: Sep | Volume: 11 | Issue: 9 | Page: LC10-LC13

Use of Multi-Response Format Test in the Assessment of Medical Students’ Critical Thinking Ability

Mahboobeh Khabaz Mafinejad1, Seyyed Kamran Soltani Arabshahi2, Alireza Monajemi3, Mohammad Jalili4, Akbar Soltani5, Javad Rasouli6

1 PhD Candidate, Department of Medical Education, Faculty of Medicine, Tehran University of Medical Sciences, Tehran, Iran.
2 Professor, Department of Medical Education, Center for Educational Research in Medical Sciences (CERMS), Iran University of Medical Sciences (IUMS), Tehran, Iran.
3 Assistant Professor, Philosophy of Science Department, Institute for Humanities and Cultural Studies, Tehran, Iran.
4 Associate Professor, Department of Emergency Medicine, Department of Medical Education, Faculty of Medicine, Tehran University of Medical Sciences, Tehran, Iran.
5 Professor, Center for Critical Thinking and Evidence Based Medicine, Institute of Endocrinology and Metabolism, Tehran University of Medical Sciences, Tehran, Iran.
6 Assistant Professor, Department of Epidemiology and Biostatistics, Urmia University of Medical Sciences, Urmia, Iran.


NAME, ADDRESS, E-MAIL ID OF THE CORRESPONDING AUTHOR: Dr. Seyyed Kamran Soltani Arabshahi, Professor, Department of Medical Education, Center for Educational Research in Medical Sciences (CERMS), Tehran-1417613151, Iran.
E-mail: soltarab34@gmail.com
Abstract

Introduction

To evaluate students’ critical thinking skills effectively, a change in assessment practices is needed. The assessment of a student’s ability to think critically is a constant challenge, and there is still considerable debate on the best assessment method. There is evidence that open- and closed-ended response questions, by their intrinsic nature, measure separate cognitive abilities.

Aim

To assess the critical thinking ability of medical students using a multi-response format of assessment.

Materials and Methods

A cross-sectional study was conducted on a group of 159 undergraduate third-year medical students. All participants completed the California Critical Thinking Skills Test (CCTST), consisting of 34 multiple-choice questions, to measure general critical thinking skills, and a researcher-developed test that combines open- and closed-ended questions. The researcher-developed 48-question exam, consisting of eight short-answer and five essay questions, 19 Multiple-Choice Questions (MCQ), and 16 True-False (TF) questions, was used to measure critical thinking skills. Correlation analyses were performed using Pearson’s coefficient to explore the association between the total scores of the tests and subtests.

Results

One hundred and fifty-nine students participated in this study. The sample comprised 81 females (51%) and 78 males (49%) with an age range of 20±2.8 years (mean 21.2 years). The response rate was 64.1%. Significant positive correlations were found between the question formats and total critical thinking scores, with the MCQ (r=0.82) and essay-question (r=0.77) correlations being the strongest. Significant positive correlations between the multi-response format test and the CCTST subscales were seen for analysis, evaluation, inference, and inductive reasoning. In contrast to the subscales, the multi-response format test had only a weak correlation with the CCTST total score (r=0.45, p=0.06).

Conclusion

This study highlights the value of a multi-response format test, combining open- and closed-ended response questions, in the assessment of the critical thinking abilities of medical students.

Keywords

Introduction

In the 21st century, the main goal of education is not the acquisition of more information, but rather the strengthening of students’ critical thinking skills, which enable them to analyse and then apply existing information [1]. The importance of this issue grows as the ability to think critically has been identified as a considerable factor in the professional success of medical students [2]. Many university faculty members believe that critical thinking should be the main purpose of college education [3], and several organisations have called for critical thinking development in medical education [4]. As listed in reports released by medical universities such as Aberdeen, Dundee and McGill, critical thinking has been viewed as a key competency to be cultivated and assessed in medical students [5,6]. In addition to Western universities, recent reforms in the undergraduate medical curriculum in Iran have also advocated the improvement of critical thinking skills in medical students [7]. Despite the vested interest among medical colleges in improving critical thinking as a core competency, evidence indicates that assessing critical thinking entails many difficulties and challenges [8].

The difficulties associated with critical thinking assessment are diverse. One obstacle is the lack of consensus on a clear, operational definition of critical thinking. Notwithstanding this, most researchers concur on the importance of critical thinking competency and on its being nurtured and honed among medical students [9-11]. Yet, with the varied definitions of critical thinking presented so far, its assessment remains ill-defined [12-14]. In other words, there is no consensus on an assessment that objectively and accurately measures medical students’ critical thinking. Another stumbling block is that critical thinking is a complex, multidimensional concept that contains both cognitive and dispositional components [15]. Consequently, a test should be designed in a way that measures it as validly as possible.

The assessment of a student’s ability to think critically is a constant challenge, and there is still considerable debate on the best assessment methods [16,17]. There is evidence that closed- and open-ended response questions measure separate cognitive capabilities, each with its own constraints [18,19]. Cox M et al., noted that no single method of assessment is better than the others and that a valid test probably requires a combination of different assessment methods [20]. Another study recommends that multiple test measures be used to assess changes in students’ critical thinking skills [21].

The aim of the present study was to address methodological gaps in the assessment of critical thinking, a major outcome of medical education, by utilising a multi-response format of assessment. The study is notable in that it provides a multi-response format test for the assessment of critical thinking using surrogate measures that, to our knowledge, had not been used before at any medical school.

Materials and Methods

This cross-sectional study was conducted at Tehran University of Medical Sciences (TUMS), School of Medicine, one of Iran’s largest and oldest medical schools, in 2013-2014. The undergraduate medical curriculum at TUMS is divided into three phases: two and a half years of basic sciences, one year of pathophysiology, and three and a half years of clinical training. In 2006, TUMS began to develop and implement a newly revised curriculum for delivering undergraduate medical education. One main feature of the revised curriculum is greater attention to the integration of critical thinking as a cross-cutting theme for training and assessing medical students [22].

In this study, data were collected from a group of medical students who completed the California Critical Thinking Skills Test (CCTST) [23], which measures general critical thinking ability, and a researcher-developed test consisting of both open- and closed-ended questions. The initial convenience sample comprised 159 third-year students enrolled in the undergraduate medical curriculum at TUMS. Third-year medical students were chosen because they had completed the mandatory critical thinking course. Of these 159 students, 102 completed both tests. Data were incomplete because participation was voluntary and students who did not complete all tests were excluded.

Instruments

Multi-response format test: A researcher-developed 48-question exam (Persian version), consisting of short-answer and essay questions, Multiple-Choice Questions (MCQ), and True-False (TF) questions, was used to measure the critical thinking skills of third-year medical students. While the content of the questions focused on medical subject matter, they were developed to measure a range of critical thinking abilities (identifying assumptions, analysis, inference, evaluation, recognising cognitive biases, etc.,). Once drafted, the research team reviewed the questions and then sent them to medical education specialists and critical thinking experts for content review. Internal consistency, examined using Cronbach’s alpha computed for the total score, was 0.69. Respondents answered a test consisting of open-ended questions (8 short-answer questions and 5 essays) and closed-ended questions (19 MCQs with 4 or 5 options each and 16 TF questions), with the aim of eliciting evidence of critical thinking ability in a medical context. To motivate students to participate, a set of questions related to the students’ background was included, and students were promised feedback on their results in due course.
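For reference, the internal consistency coefficient reported above is Cronbach’s alpha; in its standard form for a test of k items it is computed as

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),
\]

where \(\sigma^{2}_{Y_i}\) is the variance of the scores on item i and \(\sigma^{2}_{X}\) is the variance of the total test scores.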

California Critical Thinking Skills Test [24]: The CCTST questionnaire (Persian version of CCTST-Form B) was used to evaluate the general critical thinking skills of medical students. The CCTST contains 34 multiple-choice questions covering five subscales (analysis, inference, evaluation, deductive reasoning, and inductive reasoning), each item scored as correct or incorrect (0-1), targeting general critical thinking skills considered essential in higher education [Table/Fig-1]. The reliability and validity of the CCTST have been reported in previous publications [23]. Each correct response was assigned one point, and the total CCTST score ranged from a minimum of 0 to a maximum of 34.

[Table/Fig-1]: Possible ranges of instrument total scores and subscales.

Instrument and Subscale                    | Range of Possible Scores
CCTST total score                          | 0-34
Multi-response format test total score     | 0-58
Essay questions (each question scored 0-3) | 0-15
Short answer questions                     | 0-8
Multiple choice questions                  | 0-19
True-false questions                       | 0-16

Procedure

The multi-response format test was completed in about 75 minutes. Medical and health problems were chosen because of their multifaceted nature and their relevance to the students’ background. Students took the test in an exam hall, and each wrote down his/her own responses. Invigilation rules were strict, and communication and collusion between candidates (copying, whispering or signalling of any kind, or exchange of papers or objects) were not allowed. To establish the degree of content-independent critical thinking ability, the CCTST (Form B) was administered two weeks later; it took students about 45 minutes to complete.

Ethical Consideration

This paper is part of a thesis submitted for the degree of PhD in Medical Education. The Research Ethics Committee of Tehran University of Medical Sciences approved the study. Students received a full explanation prior to participation, which was voluntary, and anonymity was maintained and guaranteed.

Statistical Analysis

All statistical analyses were performed with SPSS 22.0. Descriptive statistics (means, standard deviations, and frequencies) were used to describe the sample. The reliability of the researcher-developed test was examined using Cronbach’s alpha. Pearson’s correlation coefficient was employed to explore the associations between the total scores of the tests and subtests.
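As an illustration only, the same reliability and correlation analyses can be reproduced outside SPSS. The following is a minimal sketch in Python; the file name and column names (item_*, essay, mcq, short_answer, true_false, multi_total, cctst_total) are hypothetical and not taken from the paper.

# Minimal re-implementation sketch of the analyses described above
# (the study itself used SPSS 22.0). File and column names are assumed.
import pandas as pd
from scipy.stats import pearsonr

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha for a DataFrame whose columns are individual item scores.
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

scores = pd.read_csv("critical_thinking_scores.csv")  # one row per student (assumed)

# Internal consistency of the researcher-developed 48-item test.
item_cols = [c for c in scores.columns if c.startswith("item_")]
print("Cronbach's alpha:", round(cronbach_alpha(scores[item_cols]), 2))

# Pearson correlations of each question format with the two total scores,
# mirroring the layout of [Table/Fig-2].
for subtest in ["essay", "mcq", "short_answer", "true_false"]:
    r_multi, p_multi = pearsonr(scores[subtest], scores["multi_total"])
    r_cctst, p_cctst = pearsonr(scores[subtest], scores["cctst_total"])
    print(f"{subtest}: r(multi)={r_multi:.2f} (p={p_multi:.3f}), "
          f"r(CCTST)={r_cctst:.2f} (p={p_cctst:.3f})")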

Results

One hundred and fifty-nine students were included in the study, comprising 81 females (51%) and 78 males (49%) with an age range of 20±2.8 years (mean 21.2 years). One hundred and two medical students completed both the CCTST and the multi-response format test, giving a response rate of 64.1%; data from these matched pairs were analysed. The possible range of instrument scores is shown in [Table/Fig-1].

Pearson correlations were computed to explore the associations between the CCTST and the multi-response format test subscales. [Table/Fig-2,3] present the correlations between the multi-response format test and CCTST total and subtest scores. While all multi-response format test subtests correlated positively with the total score, the correlations for MCQ (r=0.82) and essay questions (r=0.77) were the strongest. The CCTST total score correlated best with the MCQ score (r=0.25) [Table/Fig-2].

[Table/Fig-2]: Correlation of total scores on the multi-response format test and the CCTST with the different question formats. a and b indicate statistical significance at p<0.05 and p<0.01, respectively.

Question Format        | Multi-Response Format Test Total Score | CCTST Total Score
Essay questions        | 0.77b                                  | 0.18
MCQ                    | 0.82b                                  | 0.25a
Short answer questions | 0.42b                                  | 0.05
TF                     | 0.30a                                  | 0.06

California Critical Thinking Skills Test (CCTST); Multiple-Choice Question (MCQ); True-False (TF)


[Table/Fig-3]: Correlation of scores on the CCTST subtests with the multi-response format test. a and b indicate statistical significance at p<0.05 and p<0.01, respectively.

CCTST Subscale      | Multi-Response Format Test Total Score | CCTST Total Score
Analysis            | 0.25a                                  | 0.56b
Evaluation          | 0.31a                                  | 0.64b
Inference           | 0.24a                                  | 0.73b
Deductive Reasoning | 0.17                                   | 0.70b
Inductive Reasoning | 0.25a                                  | 0.74b

California Critical Thinking Skills Test (CCTST)


Significant positive correlations between the multi-response format test and the CCTST subscales were seen for analysis, evaluation, inference, and inductive reasoning. Significant correlations were also found between the CCTST total score and the scores for nearly all of its subscales; these correlations ranged from 0.56 to 0.74 [Table/Fig-3].

Descriptive statistics for the CCTST and multi-response format test total scores are presented in [Table/Fig-4]. There was a weak positive correlation between the CCTST and multi-response format test total scores, which was not significant (r=0.22, p=0.06).

[Table/Fig-4]: Medical students’ critical thinking total scores.

Instrument                 | Mean  | SD   | Min | Max
Multi-Response Format Test | 35.21 | 5.71 | 8.5 | 49
CCTST Total Score          | 18.25 | 6.05 | 11  | 27

Discussion

The combination of two response formats (open- and closed-ended questions) in one test is viewed as the current trend in the assessment of critical thinking. Ku KY has argued that any measurement of critical thinking that uses a single response format is neither sufficient to reflect students’ true critical thinking ability nor compatible with the conceptualisation of critical thinking [25]. In this study, closed- and open-ended questions were used to gain a better understanding of medical students’ critical thinking abilities when facing problems in the field of medicine.

The results support a change in the assessment of medical students’ critical thinking towards a multi-response format test consisting of MCQ, essay, TF, and short-answer questions. Students’ scores on MCQ and essay questions correlated significantly with their total critical thinking scores. This confirms that well-constructed MCQs can also assess higher-order cognitive skills in medical students [26]. The finding that MCQ and essay questions can be used to assess critical thinking is consistent with previous studies. The Halpern Critical Thinking Assessment Using Everyday Situations (HCTAES) is a general test that incorporates both multiple-choice and essay questions into a single test [19]. Stein and Haynes developed a measurement approach that includes standardised multiple-choice tests, essay tests, and faculty-developed rubrics for evaluating student work [27].

In the current study, the mean CCTST score for medical students was 18.25. Athari Z et al., and Haghani F et al., have reported marginally lower mean CCTST scores in medical students [28,29]. Although the mean critical thinking score attained by the medical students in our study was not high, it was higher than the national scores reported in comparable studies. This may reflect the effect of the critical thinking training programme introduced in the renewal of the medical curriculum.

In addition, a positive correlation was observed between the total scores of the CCTST and the multi-response format test, though it was quite small. This finding indicates that performance on the multi-response format test in medicine is only weakly related to general measures of critical thinking skills. In our study, the CCTST was used to measure critical thinking as a general competency [30], while the multi-response format test was used to measure critical thinking within the discipline of medicine. This may indicate that medical students’ multi-response test scores are not explained by such general critical thinking tests, which rely on multiple-choice questions [27].

Performance on the multi-response format test generally correlated with scores on the analysis, evaluation, inference, and inductive reasoning subscales of the CCTST. However, the relatively low correlations between the multi-response format test and the CCTST subtest scores might suggest that the two tests assess different components of critical thinking via different test contents.

Limitation

Several limitations of this study warrant caution in transferring the findings to other contexts. The time needed to complete the two tests was long, which may have made some medical students unwilling to answer the tests properly. Future research is necessary to validate the findings of this work.

Conclusion

This study highlights the importance of choosing the right response format in critical thinking assessment. An important feature of the multi-response format test is that it relies on different types of methods to assess critical thinking, unlike many standardised tests that rely solely on multiple-choice questions. It consists of MCQ, essay, short-answer, and true-false questions for evaluating medical students’ critical thinking. The results showed that medical students’ responses to essay questions and well-constructed MCQs reveal their critical thinking when they encounter medical problems.


References

[1]Gelder TV, Teaching critical thinking: Some lessons from cognitive science College teaching 2005 53(1):41-48.  [Google Scholar]

[2]Gambrill E, Critical thinking in clinical practice: Improving the quality of judgments and decisions 2006 John Wiley & Sons  [Google Scholar]

[3]Yuretich RF, Encouraging critical thinking Journal of College Science Teaching 2003 33(3):40  [Google Scholar]

[4]Committee C, Global minimum essential requirements in medical education Medical Teacher 2002 24(2):130  [Google Scholar]

[5]Simpson J, Furnace J, Crosby J, Cumming A, Evans P, David MFB, The Scottish doctor—learning outcomes for the medical undergraduate in Scotland: a foundation for competent and reflective practitioners Medical teacher 2002 24(2):136-43.  [Google Scholar]

[6]Fuks A, Boudreau JD, Cassell EJ, Teaching clinical thinking to first-year medical students Medical teacher 2009 31(2):105-11.  [Google Scholar]

[7]Mirzazadeh A, Hejri SM, Jalili M, Asghari F, Labaf A, Siyahkal MS, Defining a Competency Framework: The First Step toward Competency-Based Medical Education Acta Medica Iranica 2014 52(9):710-16.  [Google Scholar]

[8]Rane-Szostak D, Robertson JF, Issues in measuring critical thinking: meeting the challenge The Journal Of Nursing Education 1996 35(1):5-11.  [Google Scholar]

[9]Simpson E, Courtney MD, Critical thinking in nursing education: Literature review International Journal of Nursing Practice 2002 8(April):89-98.  [Google Scholar]

[10]Allen GD, Rubenfeld MG, Scheffer BK, Reliability of assessment of critical thinking Journal of Professional Nursing 2004 20(1):15-22.  [Google Scholar]

[11]Daly WM, The development of an alternative method in the assessment of critical thinking as an outcome of nursing education Journal of Advanced Nursing 2001 36(1):120-30.  [Google Scholar]

[12]Bissell AN, Lemons PP, A new method for assessing critical thinking in the classroom BioScience 2006 56(1):66-72.  [Google Scholar]

[13]Sternod L, French B, Test Review: Watson-Glaser II Critical Thinking Appraisal Journal of Psychoeducational Assessment 2015 :0734282915622855  [Google Scholar]

[14]Butler HA, Halpern Critical Thinking Assessment Predicts Real-World Outcomes of Critical Thinking Applied Cognitive Psychology 2012 26(5):721-29.  [Google Scholar]

[15]Halpern DF, Teaching for critical thinking: Helping college students develop the skills and dispositions of a critical thinker New directions for teaching and learning 1999 1999(80):69-74.  [Google Scholar]

[16]Manogue M, Kelly M, Bartakova Masaryk S, Brown G, Catalanotto F, Choo-Soo T, 2.1 Evolving methods of assessment European Journal of Dental Education 2002 6(s3):53-66.  [Google Scholar]

[17]Staib S, Teaching and measuring critical thinking Journal of Nursing Education 2003 42(11):498-508.  [Google Scholar]

[18]Ennis RH, Critical thinking assessment Theory into practice 1993 32(3):179-86.  [Google Scholar]

[19]Halpern D, Halpern critical thinking assessment using everyday situations: Background and scoring standards 2007 Claremont: Claremont McKenna College  [Google Scholar]

[20]Cox M, Irby DM, Epstein RM, Assessment in medical education New England Journal of Medicine 2007 356(4):387-96.  [Google Scholar]

[21]Behar-Horenstein LS, Niu L, Teaching critical thinking skills in higher education: A review of the literature Journal of College Teaching and Learning 2011 8(2):25  [Google Scholar]

[22]Soltani A, Allaa M, Moosapour H, Aletaha A, Shahrtash F, Monajemi A, Integration of cognitive skills as a cross-cutting theme into the undergraduate medical curriculum at Tehran University of Medical Sciences Acta Medica Iranica 2017 :68-73.  [Google Scholar]

[23]Khalili H, Hossein ZM, Investigation of reliability, validity and normality of the Persian version of the California Critical Thinking Skills Test; Form B (CCTST) 2003   [Google Scholar]

[24]Facione PA, Facione NC, Giancarlo C, The California critical thinking skills test 1992 Millbrae, CA: California Academic Press  [Google Scholar]

[25]Ku KY, Assessing students’ critical thinking performance: Urging for measurements using multi-response format Thinking skills and creativity 2009 4(1):70-76.  [Google Scholar]

[26]Aljarallah BM, Evaluation of modified essay questions (MEQ) and multiple choice questions (MCQ) as a tool for assessing the cognitive skills of undergraduate medical students International Journal of Health Sciences 2011 5(1):39-43.  [Google Scholar]

[27]Stein B, Haynes A, Engaging faculty in the assessment and improvement of students’ critical thinking using the critical thinking assessment test Change: the magazine of higher learning 2011 43(2):44-49.  [Google Scholar]

[28]Athari Z, Sharif M, Nematbakhsh M, Babamohammadi H, Evaluation of Critical Thinking Skills in Isfahan University of Medical Sciences’ Students and Its Relationship with Their Rank in University Entrance Exam Iranian Journal of Medical Education 2009 9(1):5-12.  [Google Scholar]

[29]Haghani F, Aminian B, Kamali F, Jamshidian S, Critical thinking skills and their relationship with emotional intelligence in medical students of introductory clinical medicine (icm) course in Isfahan University of Medical Sciences Iranian Journal of Medical Education 2011 10(5):906-17.  [Google Scholar]

[30]Facione P, Facione N, Blohm S, Giancarlo C, The California Critical Thinking Skills Test: CCTST. Form A, Form B, and Form 2000 Test manual 2002   [Google Scholar]