The 46th Annual Meeting of the Japanese Society of Intensive Care Medicine

Presentation Information

English Session

[EngO7] English Session 7

Saturday, March 2, 2019, 15:05–16:05, Room 11 (Kyoto International Conference Center, 1F Room C-2)

Chair: Kenji Wakabayashi (Department of Intensive Care Medicine, Tokyo Medical and Dental University, Japan)

[EngO7-4] Integrated clinical reasoning assessment in simulation crisis management class

Kanya Kumwilaisak1, Toonchai Indrambarya1, Danai Wangsaturaka2, Paweenuch Bootjeamjai1 (1. Division of Critical Care, Department of Anesthesiology, King Chulalongkorn Memorial Hospital, Thailand, 2. Department of Pharmacology, Faculty of Medicine, Chulalongkorn University, Thailand)

Background
High-fidelity human patient simulation has been advocated as an effective way to train crisis management. At KCMH, high-fidelity human patient simulation is used to train medical students in crisis management as part of the Anesthesiology curriculum. Medical students are usually assessed only with a checklist score, which cannot reflect the clinical reasoning behind their performance. Diagnostic justification is a tool for assessing clinical reasoning, and clinical reasoning ability is a crucial skill for medical students to deliver proper management.
Objectives
- To evaluate the level of clinical reasoning of medical students in simulation-based crisis management training
- To evaluate the correlation between the clinical reasoning score and the clinical checklist score of medical students in simulation-based crisis management training
Methods and materials
All 5th-year medical students at KCMH in 2016 and 2017 received simulation-based training in the crisis management class and were examined at the end of the course in a crisis scenario using high-fidelity human patient simulation. Two raters graded the clinical performance checklist and the diagnostic justification score for each student, and the diagnostic justification scores were compared with the clinical performance checklist scores.
Results
198 medical students were examined at the end of the crisis management class. Inter-rater reliability of the clinical performance checklist was moderate to high (kappa ≥ 0.6), and inter-rater agreement on the diagnostic justification score was high. 77.3% of students were classified as correct management on the clinical performance checklist, but only 47% were classified as complete or excellent in diagnostic justification performance; 32.2% of students classified as correct management were graded as poor or borderline in diagnostic justification performance. The median clinical checklist score was 11.17 and the median diagnostic justification score was 7.5. Inter-rater reliability of the clinical checklist score and of the diagnostic justification score was 0.92 and 0.90, respectively. The correlation between the clinical checklist score and the diagnostic justification score was 0.43.
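As an illustrative aside for readers who want to reproduce this kind of analysis, the short Python sketch below shows how agreement and correlation statistics of the type reported above could be computed. The data, variable names, and the choice of Cohen's kappa and Spearman correlation are assumptions for illustration only; the abstract does not specify the authors' actual statistical procedures.

    # Illustrative sketch only: hypothetical data, assumed statistical choices.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score
    from scipy.stats import spearmanr

    # Hypothetical categorical ratings from two raters on the same students
    rater1 = ["correct", "correct", "incorrect", "correct", "incorrect"]
    rater2 = ["correct", "incorrect", "incorrect", "correct", "incorrect"]

    # Inter-rater agreement on categorical ratings (Cohen's kappa)
    kappa = cohen_kappa_score(rater1, rater2)
    print(f"Cohen's kappa: {kappa:.2f}")

    # Hypothetical paired per-student scores
    checklist_scores = np.array([11, 12, 10, 13, 11, 9])
    justification_scores = np.array([7, 8, 6, 9, 7, 5])

    # Correlation between checklist and diagnostic justification scores
    # (Spearman's rho is assumed here)
    rho, p_value = spearmanr(checklist_scores, justification_scores)
    print(f"Spearman correlation: {rho:.2f} (p = {p_value:.3f})")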

Conclusions
In crisis management training with high-fidelity human patient simulation, students' diagnostic justification performance was inconsistent with their skill performance checklist results. Some students were classified as correct management on the clinical performance checklist, yet their clinical reasoning was not rational. Diagnostic justification performance may be beneficial for assessing clinical reasoning in a training program.