Title:
An evaluation of programmatic assessment across health professions education using contribution analysis.
Authors:
Jamieson J; School of Medical and Health Sciences, Edith Cowan University, 270 Joondalup Drive, Joondalup, Perth, WA, 6027, Australia. j.jamieson@ecu.edu.au., Palermo C; Faculty of Medicine, Nursing and Health Sciences, Monash University, Melbourne, Australia., Hay M; Faculty of Medicine, Nursing and Health Sciences, Monash University, Melbourne, Australia., Bacon R; Faculty of Health, University of Canberra, Canberra, Australia., Lutze J; Faculty of Science, Medicine and Health, University of Wollongong, Wollongong, Australia., Gibson S; Faculty of Medicine, Nursing and Health Sciences, Monash University, Monash, Australia.
Source:
Advances in health sciences education : theory and practice [Adv Health Sci Educ Theory Pract] 2026 Feb; Vol. 31 (1), pp. 211-238. Date of Electronic Publication: 2025 Jun 04.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Springer Netherlands Country of Publication: Netherlands NLM ID: 9612021 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1573-1677 (Electronic) Linking ISSN: 13824996 NLM ISO Abbreviation: Adv Health Sci Educ Theory Pract Subsets: MEDLINE
Imprint Name(s):
Publication: <2009- > : [Dordrecht] : Springer Netherlands
Original Publication: Dordrecht ; Boston : Kluwer Academic Publishers, c1996-
References:
Ajjawi, R., & Kent, F. (2022). Understanding realist reviews for medical education. Journal of Graduate Medical Education, 14(3), 274–278. https://doi.org/10.4300/JGME-D-22-00334.1. (PMID: 10.4300/JGME-D-22-00334.1)
Allen, L. M., Hay, M., & Palermo, C. (2022). Evaluation in health professions education: Is measuring outcomes enough? Medical Education, 56(1), 127–136. https://doi.org/10.1111/medu.14654. (PMID: 10.1111/medu.14654)
Baartman, L., Baukema, H., & Prins, F. (2023). Exploring students’ feedback seeking behavior in the context of programmatic assessment. Assessment and Evaluation in Higher Education, 48(5), 598–612. (PMID: 10.1080/02602938.2022.2100875)
Baartman, L., van Schilt-mol, T., & van der Vleuten, C. (2022). Programmatic assessment design choices in nine programs in higher education. Frontiers in Education, 7, 1–13. https://doi.org/10.3389/feduc.2022.931980. (PMID: 10.3389/feduc.2022.931980)
Bate, F., Fyfe, S., Griffiths, D., Russell, K., Skinner, S., & Tor, E. (2020). Does an incremental approach to implementing programmatic assessment work? Reflections on the change process. AMEE MedEdPublish, 9(55).
Berger, R. (2015). Now I see it, now I don’t: Researcher’s position and reflexivity in qualitative research. Qualitative Research, 15(2), 219–234. https://doi.org/10.1177/1468794112468475. (PMID: 10.1177/1468794112468475)
Biggs, J. S., Farrell, L., Lawrence, G., & Johnson, J. K. (2014). A practical example of contribution analysis to a public health intervention. Evaluation, 20(2), 214–229. https://doi.org/10.1177/1356389014527527. (PMID: 10.1177/1356389014527527)
Bok, H. G. J., van der Vleuten, C. P. M., & de Jong, L. H. (2021). “Prevention is better than cure”: A plea to emphasize the learning function of competence committees in programmatic assessment. Frontiers in Veterinary Science, 8, 638455. https://doi.org/10.3389/fvets.2021.638455. (PMID: 10.3389/fvets.2021.638455)
Brousselle, A., & Buregeya, J.-M. (2018). Theory-based evaluations: Framing the existence of a new theory in evaluation and the rise of the 5th generation. Evaluation, 24(2), 153–168. https://doi.org/10.1177/1356389018765487. (PMID: 10.1177/1356389018765487)
Brown, C., Ross, S., Cleland, J., & Walsh, K. (2015). Money makes the (medical assessment) world go round: The cost of components of a summative final year objective structured clinical examination (OSCE). Medical Teacher, 37(7), 653–659. https://doi.org/10.3109/0142159x.2015.1033389. (PMID: 10.3109/0142159x.2015.1033389)
Budhwani, S., & McDavid, J. C. (2017). Contribution analysis: Theoretical and practical challenges and prospects for evaluators. Canadian Journal of Program Evaluation, 32(1), 1–24. https://doi.org/10.3138/cjpe.31121. (PMID: 10.3138/cjpe.31121)
Buregeya, J. M., Loignon, C., & Brousselle, A. (2020). Contribution analysis to analyze the effects of the health impact assessment at the local level: A case of urban revitalization. Evaluation & Program Planning, 79, 101746. https://doi.org/10.1016/j.evalprogplan.2019.101746. (PMID: 10.1016/j.evalprogplan.2019.101746)
Caretta-Weyer, H. A., Smirnova, A., Barone, M. A., Frank, J. R., Hernandez-Boussard, T., Levinson, D., Lombarts, K., Lomis, K. D., Martini, A., Schumacher, D. J., Turner, D. A., & Schuh, A. (2024). The next era of assessment: Building a trustworthy assessment system. Perspectives on Medical Education, 13(1), 12–23. (PMID: 10.5334/pme.1110)
Chen, H. T. (2004). The roots of theory-driven evaluation: Current views and origins. In M. C. Alkin (Ed.), Evaluation roots: Tracing theorists’ views and influences. SAGE.
Choi, T., Sarkar, M., Bonham, M., Brock, T., Brooks, I. A., Diug, B., Ilic, D., Kumar, A., Lau, W.-M., Lindley, J., Morphet, J., Simmons, M., Volders, E., White, P. J., Wright, C., & Palermo, C. (2023). Using contribution analysis to evaluate health professions and health sciences programs. Frontiers in Medicine, 10, 1–11. https://doi.org/10.3389/fmed.2023.1146832. (PMID: 10.3389/fmed.2023.1146832)
Cleland, J. A., Foo, J., Ilic, D., Maloney, S., & You, Y. (2020). “You can’t always get what you want …”: Economic thinking, constrained optimization and health professions education. Advances in Health Sciences Education, 25(5), 1163–1175. https://doi.org/10.1007/s10459-020-10007-w. (PMID: 10.1007/s10459-020-10007-w)
Dart, J., Twohig, C., Anderson, A., Bryce, A., Collins, J., Gibson, S., Kleve, S., Porter, J., Volders, E., & Palermo, C. (2021). The value of programmatic assessment in supporting educators and students to succeed: A qualitative evaluation. Journal of the Academy of Nutrition and Dietetics. doi: https://doi.org/10.1016/j.jand.2021.01.013.
de Jong, L. H., Bok, H. G. J., Schellekens, L. H., Kremer, W. D. J., Jonker, F. H., & van der Vleuten, C. P. M. (2022). Shaping the right conditions in programmatic assessment: How quality of narrative information affects the quality of high-stakes decision-making. BMC Medical Education, 22(1), 1–10. https://doi.org/10.1186/s12909-022-03257-2. (PMID: 10.1186/s12909-022-03257-2)
Delahais, T., & Toulemonde, J. (2012). Applying contribution analysis: Lessons from five years of practice. Evaluation, 18(3), 281–293. (PMID: 10.1177/1356389012450810)
Delahais, T., & Toulemonde, J. (2017). Making rigorous causal claims in a real-life context: Has research contributed to sustainable forest management? Evaluation, 23(4), 370–388. https://doi.org/10.1177/1356389017733211. (PMID: 10.1177/1356389017733211)
Downes, A., Novicki, E., & Howard, J. (2019). Using the contribution analysis approach to evaluate science impact: A case study of the national institute for occupational safety and health. American Journal of Evaluation, 40(2), 177–189. https://doi.org/10.1177/1098214018767046. (PMID: 10.1177/1098214018767046)
Dweck, C. S. (2019). The choice to make a difference. Perspectives on Psychological Science: A Journal of the Association for Psychological Science, 14(1), 21–25. https://doi.org/10.1177/1745691618804180. (PMID: 10.1177/1745691618804180)
Dybdal, L., Bohni Nielsen, S., & Lemire, S. (2010). Contribution analysis applied: Reflections on scope and methodology. Canadian Journal of Program Evaluation, 25, 29–57. https://doi.org/10.3138/cjpe.25.002. (PMID: 10.3138/cjpe.25.002)
Ellaway, R. H., Chou, C. L., & Kalet, A. L. (2018). Situating remediation: Accommodating success and failure in medical education systems. Academic Medicine, 93(3), 391–398. (PMID: 10.1097/ACM.0000000000001855)
Frank, J. R., Mungroo, R., Ahmad, Y., Wang, M., De Rossi, S., & Horsley, T. (2010). Toward a definition of competency-based education in medicine: A systematic review of published definitions. Medical Teacher, 32(8), 631–637. https://doi.org/10.3109/0142159x.2010.500898. (PMID: 10.3109/0142159x.2010.500898)
Frye, A. W., & Hemmer, P. A. (2012). Program evaluation models and related theories: AMEE guide no. 67. Medical Teacher, 34(5), e288–299. https://doi.org/10.3109/0142159x.2012.668637. (PMID: 10.3109/0142159x.2012.668637)
Gale, K. N., Heath, G., Cameron, E., Rashid, S., & Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Medical Research Methodology, 13(1), 117–124. https://doi.org/10.1186/1471-2288-13-117. (PMID: 10.1186/1471-2288-13-117)
Govaerts, M., van der Vleuten, C., & Schut, S. (2022). Implementation of programmatic assessment: Challenges and lessons learned. Education Sciences, 12(10), 717–722. https://www.mdpi.com/2227-7102/12/10/717. (PMID: 10.3390/educsci12100717)
Grant, J., & Grant, L. (2023). Quality and constructed knowledge: Truth, paradigms, and the state of the science. Medical Education, 57(1), 23–30. https://doi.org/10.1111/medu.14871. (PMID: 10.1111/medu.14871)
Haji, F., Morin, M. P., & Parker, K. (2013). Rethinking programme evaluation in health professions education: Beyond ‘did it work?’. Medical Education, 47(4), 342–351. https://doi.org/10.1111/medu.12091. (PMID: 10.1111/medu.12091)
Hall, A. K., Schumacher, D. J., Thoma, B., Caretta-Weyer, H., Kinnear, B., Gruppen, L., Cooke, L. J., Frank, J. R., & Van Melle, E. (2021). Outcomes of competency-based medical education: A taxonomy for shared language. Medical Teacher, 43(7), 788–793. https://doi.org/10.1080/0142159x.2021.1925643. (PMID: 10.1080/0142159x.2021.1925643)
Heeneman, S., de Jong, L. H., Dawson, L. J., Wilkinson, T. J., Ryan, A., Tait, G. R., Rice, N., Torre, D., Freeman, A., & van der Vleuten, C. P. M. (2021). Ottawa 2020 consensus statement for programmatic assessment - 1. Agreement on the principles. Medical Teacher, 43(10), 1139–1148. https://doi.org/10.1080/0142159x.2021.1957088. (PMID: 10.1080/0142159x.2021.1957088)
Hersey, A., & Adams, M. (2017). Using contribution analysis to assess the influence of farm link programs in the U.S. Journal of Agriculture, Food Systems, and Community Development, 7(3), 83–103. https://doi.org/10.5304/jafscd.2017.073.006. (PMID: 10.5304/jafscd.2017.073.006)
Iobst, W. F., & Holmboe, E. S. (2020). Programmatic assessment: The secret sauce of effective CBME implementation. Journal of Graduate Medical Education, 12(4), 518–521. https://doi.org/10.4300/JGME-D-20-00702.1. (PMID: 10.4300/JGME-D-20-00702.1)
Jamieson, J., Gibson, S., Hay, M., & Palermo, C. (2022). Teacher, gatekeeper, or team member: Supervisor positioning in programmatic assessment. Advances in Health Sciences Education, 28(3), 827–845. https://ro.ecu.edu.au/ecuworks2022-2026/1714. (PMID: 10.1007/s10459-022-10193-9)
Jamieson, J., Hay, M., Gibson, S., & Palermo, C. (2021). Implementing programmatic assessment transforms supervisor attitudes: An explanatory sequential mixed methods study. Medical Teacher, 43(6), 709–717. https://doi.org/10.1080/0142159X.2021.1893678. (PMID: 10.1080/0142159X.2021.1893678)
Jamieson, J., Jenkins, G., Beatty, S., & Palermo, C. (2017). Designing programmes of assessment: A participatory approach. Medical Teacher, 39(11), 1182–1188. (PMID: 10.1080/0142159X.2017.1355447)
Junge, K., Cullen, J., & Iacopini, G. (2020). Using contribution analysis to evaluate large-scale, transformation change processes. Evaluation, 26(2), 227–245. https://doi.org/10.1177/1356389020912270. (PMID: 10.1177/1356389020912270)
Koleros, A., & Mayne, J. (2019). Using actor-based theories of change to conduct robust evaluation in complex settings. Canadian Journal of Program Evaluation, 33(3). https://doi.org/10.3138/cjpe.52946.
Kusurkar, R. A., Orsini, C., Somra, S., Artino Jr, A. R., Daelmans, H. E. M., Schoonmade, L. J., & van der Vleuten, C. (2023). The effect of assessments on student motivation for learning and its outcomes in health professions education: A review and realist synthesis. Academic Medicine, 98(9), 1083–1092. https://doi.org/10.1097/acm.0000000000005263. (PMID: 10.1097/acm.0000000000005263)
Leeuw, F. (2023). John Mayne and rules of thumb for contribution analysis: A comparison with two related approaches. Canadian Journal of Program Evaluation, 37(3), 403–421. https://doi.org/10.3138/cjpe.75448. (PMID: 10.3138/cjpe.75448)
Lemire, S. T., Nielsen, S. B., & Dybdal, L. (2012). Making contribution analysis work: A practical framework for handling influencing factors and alternative explanations. Evaluation, 18(3), 294–309. https://doi.org/10.1177/1356389012450654. (PMID: 10.1177/1356389012450654)
Lodge, J. M., Howard, S., Bearman, M., & Dawson, P., & Associates. (2023). Assessment reform for the age of Artificial Intelligence. https://www.teqsa.gov.au/sites/default/files/2023-09/assessment-reform-age-artificial-intelligence-discussion-paper.pdf.
Mayne, J. (1999). Addressing attribution through contribution analysis: Using performance measures sensibly (discussion paper).
Mayne, J. (2001). Addressing attribution through contribution analysis: Using performance measures sensibly. Canadian Journal of Program Evaluation, 16(1), 1–24. https://doi.org/10.3138/cjpe.016.001. (PMID: 10.3138/cjpe.016.001)
Mayne, J. (2011). Contribution analysis: Addressing cause and effect. In K. Forss, M. Marra, & R. Schwartz (Eds.), Evaluating the complex. Transaction Publishers.
Mayne, J. (2012). Contribution analysis: Coming of age? Evaluation, 18(3), 270–280. https://doi.org/10.1177/1356389012451663. (PMID: 10.1177/1356389012451663)
Mayne, J. (2015). Useful theory of change models. Canadian Journal of Program Evaluation, 30(2), 119–142. https://doi.org/10.3138/cjpe.230. (PMID: 10.3138/cjpe.230)
Mayne, J. (2017). Theory of change analysis: Building robust theories of change. The Canadian Journal of Program Evaluation, 32(2).
Mayne, J. (2018). The COM-B Theory of Change Model (V3) (discussion paper).
Mayne, J. (2019). Revisiting contribution analysis. Canadian Journal of Program Evaluation, 34(2), 171. https://doi.org/10.3138/cjpe.68004. (PMID: 10.3138/cjpe.68004)
Moreau, K. A., & Eady, K. (2015). Connecting medical education to patient outcomes: The promise of contribution analysis. Medical Teacher, 37(11), 1060–1062. https://doi.org/10.3109/0142159X.2015.1060307. (PMID: 10.3109/0142159X.2015.1060307)
Oandasan, I., Martin, L., McGuire, M., & Zorzi, R. (2020). Twelve tips for improvement-oriented evaluation of competency-based medical education. Medical Teacher, 42(3), 272–277. https://doi.org/10.1080/0142159x.2018.1552783. (PMID: 10.1080/0142159x.2018.1552783)
Palermo, C., Conway, J., Beck, E. J., Dart, J., Capra, S., & Ash, S. (2016). Methodology for developing competency standards for dietitians in Australia. Nursing & Health Sciences, 18(1), 130–137. https://doi.org/10.1111/nhs.12247. (PMID: 10.1111/nhs.12247)
Palermo, C., Gibson, S. J., Dart, J., Whelan, K., & Hay, M. (2017). Programmatic assessment of competence in dietetics: A new frontier. Journal of the Academy of Nutrition and Dietetics, 117(2), 175–179. https://doi.org/10.1016/j.jand.2016.03.022. (PMID: 10.1016/j.jand.2016.03.022)
Palermo, C., Reidlinger, D. P., & Rees, C. E. (2021). Internal coherence matters: Lessons for nutrition and dietetics research. Nutrition & Dietetics, 78(3), 252–267. https://doi.org/10.1111/1747-0080.12680. (PMID: 10.1111/1747-0080.12680)
Pawson, R. (2006). Evidence-based policy: A realist perspective. SAGE. (PMID: 10.4135/9781849209120)
Pawson, R., & Tilley, N. (1997). Realistic evaluation. SAGE.
Paz-Ybarnegaray, R., & Douthwaite, B. (2017). Outcome evidencing: A method for enabling and evaluating program intervention in complex systems. American Journal of Evaluation, 38(2), 275–293. (PMID: 10.1177/1098214016676573)
Pearce, J., & Tavares, W. (2021). A philosophical history of programmatic assessment: Tracing shifting configurations. Advances in Health Sciences Education, 26(4), 1291–1310. https://doi.org/10.1007/s10459-021-10050-1. (PMID: 10.1007/s10459-021-10050-1)
Rees, C. E., Crampton, P. E. S., & Monrouxe, L. V. (2020). Re-visioning academic medicine through a constructionist lens. Academic Medicine, 95(6), 846–850. https://doi.org/10.1097/ACM.0000000000003109. (PMID: 10.1097/ACM.0000000000003109)
Rees, C. E., Monrouxe, L. V., O’Brien, B. C., Gordon, L. J., & Palermo, C. (2023). Foundations of health professional education research: Principles, perspectives and practices. John Wiley & Sons Ltd. (PMID: 10.1002/9781394322213)
Richardson, D., Kinnear, B., Hauer, K. E., Turner, T. L., Warm, E. J., Hall, A. K., Ross, S., Thoma, B., & Van Melle, E. (2021). Growth mindset in competency-based medical education. Medical Teacher, 43(7), 751–757. https://doi.org/10.1080/0142159x.2021.1928036. (PMID: 10.1080/0142159x.2021.1928036)
Riley, B. L., Kernoghan, A., Stockton, L., Montague, S., Yessis, J., & Willis, C. D. (2018). Using contribution analysis to evaluate the impacts of research on policy: Getting to “Good Enough”. Research Evaluation, 27(1), 16–27. (PMID: 10.1093/reseval/rvx037)
Roberts, C., Khanna, P., Bleasel, J., Lane, S., Burgess, A., Charles, K., Howard, R., O’Mara, D., Haq, I., & Rutzou, T. (2022). Student perspectives on programmatic assessment in a large medical programme: A critical realist analysis. Medical Education, 56(9), 901–914. https://doi.org/10.1111/medu.14807. (PMID: 10.1111/medu.14807)
Ross, S., Lawrence, K., Bethune, C., van der Goes, T., Pélissier-Simard, L., Donoff, M., Crichton, T., Laughlin, T., Dhillon, K., Potter, M., & Schultz, K. (2023). Development, implementation, and meta-evaluation of a national approach to programmatic assessment in Canadian family medicine residency training. Academic Medicine, 98(2), 188–198. https://doi.org/10.1097/acm.0000000000004750. (PMID: 10.1097/acm.0000000000004750)
Ryan, A., & Judd, T. (2022). From traditional to programmatic assessment in three (not so) easy steps. Education Sciences, 12(487), 1–13.
Ryan, A., O’Mara, D., & Tweed, M. (2023). Evolution or revolution to programmatic assessment: Considering unintended consequences of assessment change. Focus on Health Professional Education, 24(2), 185–195. https://doi.org/10.11157/fohpe.v24i2.703.
Sahagun, M. A., Moser, R., Shomaker, J., & Fortier, J. (2021). Developing a growth-mindset pedagogy for higher education and testing its efficacy. Social Sciences & Humanities Open, 4(1). https://doi.org/10.1016/j.ssaho.2021.100168.
Sandars, J. (2018). It is time to celebrate the importance of evaluation in medical education. International Journal of Medical Education, 9, 158–160. https://doi.org/10.5116/ijme.5aed.6f12. (PMID: 10.5116/ijme.5aed.6f12)
Schut, S., Heeneman, S., Bierer, B., Driessen, E., Tartwijk, J., & van der Vleuten, C. (2020). Between trust and control: Teachers’ assessment conceptualisations within programmatic assessment. Medical Education, 54(6), 528–537. (PMID: 10.1111/medu.14075)
Schut, S., Maggio, L. A., & Driessen, E. (2021). Where the rubber meets the road: An integrative review of programmatic assessment in health care professions education. Perspectives on Medical Education, 10(1), 6–13. (PMID: 10.1007/S40037-020-00625-W)
Steinert, Y. (2013). The “problem” learner: Whose problem is it? AMEE Guide No. 76. Med Teach, 35(4), e1035–1045. https://doi.org/10.3109/0142159x.2013.774082. (PMID: 10.3109/0142159x.2013.774082)
Stufflebeam, D. L. (2014). Evaluation theory, models, and applications (2nd ed.). Jossey-Bass & Pfeiffer Imprints, Wiley.
Toosi, M., Modarres, M., Amini, M., & Geranmayeh, M. (2021). Context, input, process, and product evaluation model in medical education: A systematic review. Journal of Education and Health Promotion, 10(1), 1–12. https://doi.org/10.4103/jehp.jehp_1115_20. (PMID: 10.4103/jehp.jehp_1115_20)
Torre, D., Rice, N. E., Ryan, A., Bok, H., Dawson, L. J., Bierer, B., Wilkinson, T. J., Tait, G. R., Laughlin, T., Veerapen, K., Heeneman, S., Freeman, A., & van der Vleuten, C. (2021). Ottawa 2020 consensus statements for programmatic assessment - 2. Implementation and practice. Medical Teacher, 43(10), 1149–1160. https://doi.org/10.1080/0142159X.2021.1956681. (PMID: 10.1080/0142159X.2021.1956681)
Torre, D., Schuwirth, L., van der Vleuten, C., & Heeneman, S. (2022). An international study on the implementation of programmatic assessment: Understanding challenges and exploring solutions. Medical Teacher, 44(8), 928–937. https://doi.org/10.1080/0142159x.2022.2083487. (PMID: 10.1080/0142159x.2022.2083487)
Torre, D., Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2020). Theoretical considerations on programmatic assessment. Medical Teacher, 42(2), 213–220. https://doi.org/10.1080/0142159X.2019.1672863. (PMID: 10.1080/0142159X.2019.1672863)
Touchie, C., & Ten Cate, O. (2016). The promise, perils, problems and progress of competency-based medical education. Medical Education, 50(1), 93–100. https://doi.org/10.1111/medu.12839. (PMID: 10.1111/medu.12839)
van der Vleuten, C. P. (1996). The assessment of professional competence: Developments, research and practical implications. Advances in Health Sciences Education, 1(1), 41–67. https://doi.org/10.1007/bf00596229. (PMID: 10.1007/bf00596229)
van der Vleuten, C. P. M., & Schuwirth, L. W. T. (2005). Assessing professional competence: From methods to programmes. Medical Education, 39(3), 309–317. https://doi.org/10.1111/j.1365-2929.2005.02094.x. (PMID: 10.1111/j.1365-2929.2005.02094.x)
van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., & van Tartwijk, J. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205–214. (PMID: 10.3109/0142159X.2012.652239)
Van Melle, E., Frank, J. R., Holmboe, E. S., Dagnone, D., Stockley, D., & Sherbino, J. (2019). A core components framework for evaluating implementation of competency-based medical education programs. Academic Medicine, 94(7), 1002–1009. https://doi.org/10.1097/acm.0000000000002743. (PMID: 10.1097/acm.0000000000002743)
Van Melle, E., Gruppen, L., Holmboe, E. S., Flynn, L., Oandasan, I., & Frank, J. R. (2017). Using Contribution Analysis to evaluate competency-based medical education programs: It’s all about rigor in thinking. Academic Medicine, 92(6), 752–758. https://doi.org/10.1097/ACM.0000000000001479. (PMID: 10.1097/ACM.0000000000001479)
Van Melle, E., Hall, A. K., Schumacher, D. J., Kinnear, B., Gruppen, L., Thoma, B., Caretta-Weyer, H., Cooke, L. J., & Frank, J. R. (2021). Capturing outcomes of competency-based medical education: The call and the challenge. Medical Teacher, 43(7), 794–800. https://doi.org/10.1080/0142159x.2021.1925640. (PMID: 10.1080/0142159x.2021.1925640)
Williams, C. A., & Lewis, L. (2021). Mindsets in health professions education: A scoping review. Nurse Education Today, 100, 104863. https://doi.org/10.1016/j.nedt.2021.104863. (PMID: 10.1016/j.nedt.2021.104863)
Wolcott, M. D., McLaughlin, J. E., Hann, A., Miklavec, A., Beck Dallaghan, G. L., Rhoney, D. H., & Zomorodi, M. (2021). A review to characterise and map the growth mindset theory in health professions education. Medical Education, 55(4), 430–440. https://doi.org/10.1111/medu.14381. (PMID: 10.1111/medu.14381)
Wong, G., Greenhalgh, T., Westhorp, G., & Pawson, R. (2012). Realist methods in medical education research: What are they and what can they contribute? Medical Education, 46(1), 89–96. https://doi.org/10.1111/j.1365-2923.2011.04045.x. (PMID: 10.1111/j.1365-2923.2011.04045.x)
Yun-Ruei, K., & Catanya, S. (2022). Rethinking the multidimensionality of growth mindset amid the COVID-19 pandemic: A systematic review and framework proposal. Frontiers in Psychology, 13, 1–14. https://doi.org/10.3389/fpsyg.2022.572220. (PMID: 10.3389/fpsyg.2022.572220)
Contributed Indexing:
Keywords: Competency-based education; Contribution analysis; Evaluation; Health professions education; Programmatic assessment; Theory of change
Entry Date(s):
Date Created: 20250604 Date Completed: 20260223 Latest Revision: 20260226
Update Code:
20260226
PubMed Central ID:
PMC12929344
DOI:
10.1007/s10459-025-10444-5
PMID:
40465148
Database:
MEDLINE

Abstract:

Programmatic assessment is gaining traction in health professions education. Despite this popularity, educators continue to grapple with complex contextual factors that impact implementation and outcome attainment. We used contribution analysis, a theory-informed evaluation method, to understand the mechanisms underpinning successful implementation. Applying the six steps of contribution analysis, we developed a postulated theory of change (ToC) and then conducted a qualitative study with programmatic assessment stakeholders (graduates n = 15, supervisors n = 32, faculty n = 19) from four Australian dietetic programs. These data were analysed using the Framework Analysis method and integrated with data derived from a literature review across health disciplines, to assemble contribution claims and the story, and verify the ToC. Impact pathways for programmatic assessment from inception to implementation, and contribution to outcomes were articulated in the ToC. Leaders drove implementation using compromise and worked with a design team to apply the versatile principles. All people required training, and purposefully designed tools were implemented within an ideologically aligned system. Re-orientation of responsibilities situated learners as leaders, contributing to a psychologically safe environment which promoted growth mindsets. Credible high-stakes progression decisions were enabled, people experienced less stress, and derived gratification from assessment. External factors (institutional and accreditation requirements) and threats (resource mismatches, ideological misalignments, and capabilities of the people) were identified. Contribution analysis revealed mechanisms that educators can apply to implement a contextually responsive programmatic assessment across diverse settings.
(© 2025. The Author(s).)

Declarations. Competing interests: The authors have no relevant financial or non-financial interests to disclose.