A decade of experience in high-stakes decision-making in competency-based education

Cornélie Martine Westermann, DVM, PhD
Department of Clinical Sciences, Faculty of Veterinary Medicine, Utrecht University, Utrecht, Netherlands

Harold Gerrit Johannes Bok, DVM, PhD
Department of Population Health Sciences, Faculty of Veterinary Medicine, Utrecht University, Utrecht, Netherlands
Introduction

Over the last few decades, the most influential paradigm shift in the health professions education literature has been the transition toward competency-based education (CBE). Oriented to graduate outcome abilities that address societal and patient needs, the underlying principle of CBE is preparing learners for practice using competencies as the organizing framework. Research has identified 5 core components that should be taken into account when designing and implementing competency-based curricula: 1) an outcomes competency framework, 2) progressive sequencing of competencies, 3) learning experiences tailored to competencies, 4) competency-focused instruction, and 5) programmatic assessment.1

As CBE focuses on individual developmental processes, effective assessment strategies need to support these processes in order to achieve the desired outcomes of training.2 The programmatic assessment model, founded on several key assessment principles, emphasizes the importance of combining formative and summative functions of assessment.3 It requires each individual assessment to provide meaningful feedback that drives learning toward the intended learning outcomes. Aggregated information on learners’ development should then help to optimize learners’ individual development and to allow robust high-stakes decisions for promotion and/or licensure. Currently, this model of programmatic assessment is receiving increased attention from veterinary educators.4,5 However, the “how” of developing transparent and trustworthy high-stakes decision-making procedures has, to date, received little scholarly attention.

At the Faculty of Veterinary Medicine, Utrecht University, Netherlands (FVMU), a 3-year clinical program, designed according to CBE principles and a programmatic approach to assessment, was implemented in 2010. The 3-year program is mainly built around clinical rotations in disciplines related to 3 tracks: equine health, companion animal health, and farm animal health. Apart from general rotations in different clinical departments, students mainly undertake rotations in disciplines related to their chosen animal species track. While working side-by-side with clinical staff during their rotations, students encounter a variety of learning activities. The high-stakes decision procedure is performed by 2 independent members of an assessment committee, based on a review of the myriad individual workplace-based assessments aggregated across all rotations.

In 2021, a consensus statement on the implementation and practice of programmatic assessment was published,6 identifying “establishing equitable and credible high-stakes decisions” as one of the main themes in implementing programmatic assessment. Based on our experience at FVMU, we identify the following elements as important with respect to robust high-stakes decision-making: saturation of information, triangulation, and a high-quality competence committee.

We identify the establishment of high-quality narrative feedback as the most important challenge with respect to saturation of information. This requires a learning environment that fosters and enhances the exchange of meaningful feedback, which can be difficult to accomplish in the busy daily clinical workplace.

The high-stakes decision is made based upon the triangulation of a multitude of complex and rich information.6 This is facilitated by an electronic portfolio that allows the aggregation and visualization of students’ individual competency development over time. At FVMU, standardization of the portfolio using an outcomes competency framework, together with clear instruction for both staff and students, turned out to be an important success factor in making triangulation of the data feasible.
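To make the idea of aggregating and triangulating portfolio data concrete, the following is a minimal sketch, assuming a hypothetical record format: each workplace-based assessment carries a rotation, a competency, a numeric rating, and narrative feedback. The field names and example data are illustrative and do not reflect FVMU’s actual portfolio schema; the point is only that grouping records per competency across rotations puts quantitative and qualitative evidence side by side for a reviewer.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical record of one workplace-based assessment;
# field names are illustrative, not an actual portfolio schema.
@dataclass
class Assessment:
    rotation: str
    competency: str   # e.g., "clinical reasoning", "communication"
    score: int        # rating on an ordinal scale
    narrative: str    # qualitative feedback text

def aggregate_by_competency(assessments):
    """Group assessments per competency across all rotations,
    pairing scores with narrative feedback so both quantitative
    and qualitative information can be triangulated."""
    grouped = defaultdict(list)
    for a in assessments:
        grouped[a.competency].append(a)
    summary = {}
    for competency, items in grouped.items():
        summary[competency] = {
            "n": len(items),
            "rotations": sorted({a.rotation for a in items}),
            "mean_score": sum(a.score for a in items) / len(items),
            "narratives": [a.narrative for a in items],
        }
    return summary

records = [
    Assessment("equine surgery", "clinical reasoning", 4, "Structured differential list."),
    Assessment("farm animal medicine", "clinical reasoning", 3, "Needs a broader workup."),
    Assessment("equine surgery", "communication", 5, "Clear client explanation."),
]
report = aggregate_by_competency(records)
print(report["clinical reasoning"]["n"])           # 2
print(report["clinical reasoning"]["mean_score"])  # 3.5
```

A low count or a narrow spread of rotations for a competency would signal to a committee that information is not yet saturated for that competency, whereas many concordant narratives across rotations support a robust decision.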

At FVMU the competence committee consists of approximately 11 members.5 This committee has received extensive training and meets on a regular basis. As high-stakes judgments are based upon the interpretation of a variety of qualitative and quantitative assessment information, it is important for the competence committee to establish a shared mental model.

After more than 10 years, we continue to learn how best to apply the theoretical model of programmatic assessment to daily educational practice. We have realized that there is no ‘one-size-fits-all’ recipe and making it a success is an ongoing process that requires further research, training of students and faculty, perseverance and, above all, commitment of all stakeholders involved.

References

1. Holmboe ES, Osman NY, Murphy CM, Kogan JR. The urgency of now: rethinking and improving assessment practices in medical education programs. Acad Med. Published online April 18, 2023. doi:10.1097/ACM.0000000000005251
2. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002-1009. doi:10.1097/ACM.0000000000002743
3. van der Vleuten CP, Schuwirth LW, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205-214. doi:10.3109/0142159X.2012.652239
4. Bok HGJ, Teunissen PW, Favier RP, et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ. 2013;13(1):123. doi:10.1186/1472-6920-13-123
5. de Jong LH, Bok HGJ, Kremer WDJ, van der Vleuten CPM. Programmatic assessment: can we provide evidence for saturation of information? Med Teach. 2019;41(6):678-682. doi:10.1080/0142159X.2018.1555369
6. Torre D, Rice NE, Ryan A, et al. Ottawa 2020 consensus statements for programmatic assessment - 2. Implementation and practice. Med Teach. 2021;43(10):1149-1160. doi:10.1080/0142159X.2021.1956681