Abstract
Background: Canadian specialty training programs are expected to deliver curriculum content and assess competencies related to the CanMEDS Scholar role. We evaluated our residency research program and benchmarked it against national norms for quality improvement purposes.
Methods: In 2021, we reviewed departmental curriculum documents and surveyed current and recently graduated residents. We applied a logic model framework to assess whether our program’s inputs, activities, and outputs addressed the relevant CanMEDS Scholar competencies. We then descriptively benchmarked our results against a 2021 environmental scan of Canadian anesthesiology resident research programs.
Results: Local program content was successfully mapped to the competencies. The local survey response rate was 40/55 (73%). In benchmarking, our program excelled in providing milestone-related assessments; research funding; and administrative, supervisory, and methodologic support, and in requiring a literature review, a proposal presentation, and a local abstract submission as outputs. Acceptable activities for meeting research requirements varied greatly among programs. Balancing competing clinical and research responsibilities was a frequently reported challenge.
Conclusions: The logic model framework was easily applied and demonstrated that our program benchmarked well against national norms. National-level dialogue is needed to develop specific, consistent Scholar role activities and competency assessments to bridge the gap between expected outcome standards and educational practice.