What are EPAs?
EPAs are the next big thing in medical education. But what exactly are EPAs? Entrustable Professional Activities (EPAs) are units of professional practice. Specifically, an EPA defines a task or duty that a trainee can perform unsupervised within a set timeframe, once they demonstrate sufficient competency.
EPAs offer a practical, hands-on approach to assessing whether trainees are truly ready for independent practice. Instead of emphasizing only theoretical knowledge, they focus on how effectively a trainee applies their skills in real-world, patient-facing situations, without constant oversight from a supervisor.
This blog looks at where EPAs stand today: their benefits and the challenges of implementing them in medical education. If you are interested in the specific competencies, I recommend reading the articles by Profiles or the Australian Medical Council.
EPAs help with Professional Identity Formation (PIF)
Entrustable Professional Activities (EPAs) actively shape medical students’ professional identity formation (PIF): the continuous process of becoming a professional who embodies the values, ethics, and responsibilities of the field, moving beyond simply “doing” the job to truly “being” a professional. EPAs support this by getting students more involved in the clinical workplace. Students use EPAs to strategically generate learning opportunities, which helps them understand which activities best promote their development and makes them seek feedback more actively to improve their skills. The context of supervision also has a significant impact: some clinical settings simply provide better learning opportunities than others. EPAs boost students’ self-confidence, because completing the specified activities validates their skills. Yet while students gain confidence in their abilities through EPAs, they often find that trust from supervisors is established through casual conversations rather than through EPA documentation.
Challenges of implementing EPAs in undergraduate medical education
Not all countries are the sameÂ
The implementation of Entrustable Professional Activities (EPAs) cannot simply be copied from one country to another, due to disparities in medical education institutions, clinical exposure schedules, healthcare systems, and cultural expectations around medical training. As the two following examples show, in some nations, such as the United States, medical students’ clinical duties are more restricted during undergraduate studies. In contrast, countries such as the United Kingdom expose students to clinical settings considerably sooner, changing the dynamics of entrustment and supervision. Neither approach is necessarily better than the other, but each requires a different implementation process, which is one of the key challenges of implementing EPAs in medical education across diverse systems.
Defining EPAs
EPAs are undoubtedly a valuable system and will certainly play a big role in the future of medical education worldwide. But the concrete implementation into undergraduate and postgraduate curricula has its challenges. The major difficulty is defining EPAs as independent units of work and developing a reliable evaluation methodology. These issues are core to the challenges of implementing EPAs in medical education, especially across institutions with differing workflows and standards. The AAMC published a paper in 2017 that contains detailed guidelines on assessing each EPA and supporting faculty development.
To gain more insight into the implementation challenges, the Association of American Medical Colleges (AAMC) defined 13 Core EPAs in 2014:
- Gather a history and perform a physical examination
- Prioritize a differential diagnosis following a clinical encounter
- Recommend and interpret common diagnostic and screening tests
- Enter and discuss orders and prescriptions
- Document a clinical encounter in the patient record
- Provide an oral presentation of a clinical encounter
- Form clinical questions and retrieve evidence to advance patient care
- Give or receive a patient handover to transition care responsibility
- Collaborate as a member of an interprofessional team
- Recognize a patient requiring urgent or emergent care and initiate evaluation and management
- Obtain informed consent for tests and procedures
- Perform general procedures of a physician
- Identify system failures and contribute to a culture of safety and improvement
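To make the list above more concrete, here is a minimal sketch of how an assessment system might encode these Core EPAs and track the supervision level a student has reached for each of them. The structure, field names, and example values are assumptions made for illustration, not part of the AAMC specification.

```python
# Illustrative only: one way an assessment system could encode the AAMC Core EPAs
# and track a student's supervision level per EPA. Not an AAMC specification.
CORE_EPAS = [
    "Gather a history and perform a physical examination",
    "Prioritize a differential diagnosis following a clinical encounter",
    "Recommend and interpret common diagnostic and screening tests",
    # ... the remaining ten Core EPAs continue in the same way
]

# supervision level recorded for one hypothetical student, keyed by EPA title
student_record = {
    "Gather a history and perform a physical examination": "indirect",
    "Prioritize a differential diagnosis following a clinical encounter": "direct",
}

def entrusted_for_indirect_supervision(record: dict[str, str]) -> list[str]:
    """EPAs the student has already been entrusted to perform under indirect supervision."""
    return [epa for epa, level in record.items() if level == "indirect"]

print(entrusted_for_indirect_supervision(student_record))
# ['Gather a history and perform a physical examination']
```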
10-school pilot experiment
New residents were expected to be able to perform these activities on their first day under indirect supervision. To determine whether these EPAs could be introduced into undergraduate medical education, a 10-school pilot experiment was started. Running from 2014 to 2021, the project aimed to test implementation options and evaluate outcomes. The pilot schools developed nine guiding principles to steer their efforts, which shaped curriculum development and formative assessments for at least four EPAs at each institution, with some addressing all 13. The main goal was to identify how to measure students’ readiness for entrustment. Six schools established dedicated entrustment groups to review data and make mock decisions about whether students were prepared for specific EPAs.
The first group included six EPAs that were easily integrated into existing curricula, such as collecting patient histories, prioritizing differential diagnoses, and communicating with healthcare teams. These were carefully evaluated, and the majority of students appeared to be ready for indirect supervision.
The second group included three EPAs that are often linked with sub-internships, such as recommending and interpreting tests and handing over patient care responsibilities. These were at a higher level and not consistently assessed, resulting in a smaller percentage of students being ready for entrustment.
The third group consisted of four EPAs that are usually reserved for interns and residents, such as managing critical situations and obtaining informed consent. These were rarely performed by students in actual clinical settings, making evaluation difficult. Some schools therefore used simulations, but these lacked real-world context.
Creating a system for hospital staff to understand a student’s knowledge and skills
Implementing EPAs for undergraduates can be challenging for doctors, nurses, and other hospital staff, since they don’t know what skills to expect from a medical student at different stages of training. To address this, the University of Leeds implemented a simple solution: color-coded badges for students in different years. This allows any healthcare professional to immediately identify a student’s level of experience. Additionally, faculty development initiatives aim to create a shared understanding of student expectations among hospital staff.
Standardizing Clinical Expectations with the Leeds Expectations Guide
To determine these expectations, the team consulted with clinical teachers, asking what tasks students should be able to perform at what point in their education. Using a Delphi process involving clinicians, educators, and students, they developed the “Leeds Expectations Guide” (LEAF 2018), which clearly outlines expectations for each year of study. This guide is displayed in all clinical environments, ensuring that everyone from ward clerks to healthcare assistants is aware of what a student can be expected to do.
The benefits of this system are that staff no longer ask first-year students to perform tasks they haven’t learned yet, and students feel less pressure to attempt things they are not ready for. At the same time, fifth-year students are now tested more actively on tasks in which they have already gained competence, since instructors no longer hold back for fear of overestimating their students’ skills.
Furthermore, the Leeds Expectations Guide outlines core clinical skills for each year of study, along with corresponding levels of competence:
- Level one: students are expected to actively observe procedures and tasks, such as a drug round or a consultation, without direct involvement.
- Level two: students can perform tasks, but only under the direct supervision of a clinician; they are not expected to initiate these tasks independently.
- Level three: students complete tasks independently, but their work is reviewed afterward by a clinician.
- Level four: the clinician recognizes the student’s ability to perform the task independently but does not expect them to initiate it.
- Level five: students can both initiate and perform tasks independently, and they can even supervise others.
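As an illustration, here is a minimal sketch of how such a five-level scale could be encoded in an assessment tool. The enum member names are our own shorthand, not official LEAF terminology.

```python
# Sketch of the five competence levels as an ordered enum; names are our own
# shorthand for the descriptions above, not official LEAF terms.
from enum import IntEnum

class CompetenceLevel(IntEnum):
    OBSERVE = 1                 # actively observe, no direct involvement
    DIRECT_SUPERVISION = 2      # perform only under direct guidance, do not initiate
    INDEPENDENT_REVIEWED = 3    # perform independently, work reviewed afterwards
    RECOGNIZED_INDEPENDENT = 4  # trusted to perform, but not expected to initiate
    INITIATE_AND_SUPERVISE = 5  # initiate, perform, and even supervise others

def needs_direct_supervision(level: CompetenceLevel) -> bool:
    """Levels one and two still require a clinician to be directly involved."""
    return level <= CompetenceLevel.DIRECT_SUPERVISION

print(needs_direct_supervision(CompetenceLevel.INDEPENDENT_REVIEWED))  # False
```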
Challenges of implementing EPAs in postgraduate medical education
The implementation of EPAs in postgraduate medical education is very effective under the right circumstances, but it also has its difficulties. First, there is often a lack of shared understanding about competency frameworks, resulting in inconsistencies in teaching and evaluation. Faculty members may also have outdated or inadequate perspectives on specific competencies, which is understandable, since it is almost impossible to keep up with all the changes happening in medicine. Without targeted faculty development, educators may lack clarity on how to operationalize both milestones and EPAs.
Furthermore, confusion between learning milestones and EPAs is a well-documented challenge in the implementation of competency-based medical education (CBME), particularly in postgraduate settings. Milestones are developmental markers that describe a learner’s progression in specific competencies, while EPAs are units of professional practice that can be entrusted to a trainee once sufficient competence is demonstrated. Confusing the two complicates assessment: faculty may mistakenly use milestones as checklists for EPA entrustment decisions, resulting in fragmented or reductionist evaluation approaches. Ara Tekian’s work points out that this can undermine the holistic judgment required for entrustment, as milestones are often granular and descriptive, whereas EPA assessment requires a more global, integrative judgment.
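To make that distinction concrete, here is an illustrative sketch of a data model that keeps the two ideas apart: milestones record progress within individual competencies, while an entrustment decision is a single, holistic judgement about an EPA. All class and field names are assumptions made for the sake of the example.

```python
# Illustrative only: milestones are evidence about competencies; the entrustment
# decision for an EPA is a separate, global judgement, not a milestone checklist.
from dataclasses import dataclass

@dataclass
class Milestone:
    competency: str   # e.g. "communication" or "medical knowledge"
    level: int        # developmental stage within that competency

@dataclass
class EntrustmentDecision:
    epa: str                   # e.g. "Obtain informed consent for tests and procedures"
    supervision_required: str  # e.g. "direct", "indirect", "distant"
    rationale: str             # the supervisor's holistic judgement, informed by milestones

evidence = [Milestone("communication", 3), Milestone("medical knowledge", 4)]
decision = EntrustmentDecision(
    epa="Obtain informed consent for tests and procedures",
    supervision_required="indirect",
    rationale="Explains risks clearly; minor prompting needed for rare complications.",
)
```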
Another challenge is creating an EPA model that is understandable for everyone and leaves no room for misinterpretation. Without a shared understanding of what each EPA includes, evaluation and feedback become subjective and unreliable. This is further compounded by the fact that many EPAs demand a wide range of abilities, making it even more difficult to define and assess them correctly. Furthermore, healthcare practices are always evolving, so EPAs must evolve with them to reflect current models of care. This is both time- and resource-consuming.
Faculty Training Program for EPAs
We predict that in the coming years, EPAs will become an integral part of competency-based medical education. Video-based EPA evaluations offer a powerful tool to enhance assessment quality, feedback, and faculty calibration. Below is a comprehensive, structured faculty training program tailored to optimize EPA evaluations done through video recordings.
Establishing Training Objectives
- Develop faculty understanding of EPAs and construct evaluation activities that align with program objectives and learning milestones.
- Train faculty to effectively evaluate trainee performance using video recordings.
- Enhance objectivity and fairness in evaluation by having multiple trainers evaluate each trainee.
- Foster skills in giving constructive, actionable feedback asynchronously by offering trainers resources or templates.
- Ensure faculty compliance with privacy and ethical standards by assigning an external team leader.
- Promote faculty engagement through frequent check-ins and continuous professional development.
Program Structure and Content
MODULE 1: Introduction to EPAs and Competency-Based Medical Education
Provide an overview of EPAs and clarify the difference between EPAs, competencies, and milestones. This will help reinforce the role of these evaluations in Professional Identity Formation. As a group, discuss the challenges and benefits of asynchronous EPA evaluations in medical education for both trainers and trainees.
MODULE 2: Video-Based EPA Evaluation Fundamentals
Elaborate on the rationale for using video in EPA assessments, highlighting the experience of other institutions with this format. Follow up with an overview of the proposed video platform, including security features, privacy, and compliance requirements. Walk the trainers through the technical aspects of the platform (such as uploading, accessing, and completing assessments on the platform).
MODULE 3: Standardize EPA Evaluation Criteria
Review both the program objectives and the course objectives and discuss what skills need to be evaluated in order to satisfy both objectives. As a team, develop standardized rubrics and entrustment scales. During the training process, sample videos can be reviewed and trainers can practice scoring in order to align faculty judgements. This will also address subjectivity and bias in video evaluations.
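As an illustration of what such calibration could look like in practice, here is a minimal sketch (not a validated psychometric method) in which several trainers score the same sample video on a shared 1-5 entrustment scale, and the spread of scores highlights where judgements still diverge. The video names, scores, and threshold are arbitrary examples.

```python
# Minimal calibration sketch: flag sample videos where trainer scores diverge
# noticeably so they can be discussed as a group. The 0.8 cut-off is arbitrary.
from statistics import mean, stdev

scores_per_video = {
    "sample_video_01": [3, 4, 3, 3],   # four trainers scored the same video
    "sample_video_02": [2, 4, 5, 3],
}

for video, scores in scores_per_video.items():
    spread = stdev(scores)
    verdict = "discuss in the calibration session" if spread > 0.8 else "acceptable agreement"
    print(f"{video}: mean={mean(scores):.1f}, spread={spread:.2f} -> {verdict}")
```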
MODULE 4: Providing Effective Feedback via Video
Open the discussion by establishing the principles of constructive feedback and then dive into different feedback models. Allow each trainer to lean into the feedback model they feel most comfortable with, but reinforce that feedback should always be timely, specific, and actionable. Share the different techniques that the online platform offers for providing feedback (i.e. written comments, annotations, and video responses). Encourage self-reflection as well as peer feedback amongst faculty members. Provide opportunities for trainers to practice giving negative feedback, having difficult conversations, and handling other challenging feedback scenarios.
MODULE 5: Legal, Ethical and Privacy Considerations
Establish a clear process for acquiring patient consent for video recording (both when using simulated patients and real patients). Clarify doubts about confidentiality safeguards and data protection protocols. Clearly define faculty responsibilities and institutional policies and elaborate on the institution’s disaster recovery plan in case of data breaches or violations.
Training Methods
We recommend the use of interactive workshops, whether in person or virtual. The use of video demonstrations is particularly useful both for technical training and for familiarizing trainers with EPA evaluations. As we mentioned previously, there should be a hands-on portion of the training so that doubts can be clarified and feedback can be provided to trainers. This part is crucial for promoting faculty discussion on evaluation biases and scoring discrepancies. Ensure that ongoing support is provided after the training by having a series of practice videos available for trainers and a technical helpdesk where they can clear up doubts about features.
Implementation Recommendations
- Pilot Phase: Start with a small group of faculty to refine training materials and processes.
- Integration: Align training with institutional EPA rollout plans and clinical schedules. Ideally, start with one faculty at a time so that adaptations may be made from department to department.
- Continuous Quality Improvement: Collect feedback from faculty and learners to improve the program. Allow time for self-reflection between one deployment and the next.
- Leadership Engagement: Secure buy-in from educational leaders by documenting results in a quantifiable way.
How Videolab can help with the implementation of EPAs
To help universities integrate EPAs into their curriculum, Codific created Videolab, a secure video-recording platform that specifically targets the implementation challenges of some of the EPAs. It provides not only secure sharing of videos within the organisation but also different feedback mechanisms.
One of the challenges of implementing EPAs in medical education is ensuring that trainees receive meaningful feedback. With Videolab, students can review their interactions, request feedback from peers and faculty, and make targeted improvements, fostering Professional Identity Formation (PIF). This is particularly useful in clinical settings where supervisors are not always available at a given moment; with Videolab they can provide feedback asynchronously.
Another challenge, which arises particularly in postgraduate medical education, is subjectivity in entrustment decisions. Recording the specific skills acquired by students enables supervisors to be more consistent and evidence-based in their evaluations, and it allows students to challenge their supervisors in an open conversation through the fragmented feedback section.
Another way Videolab can create a fairer and more standardized evaluation is by improving faculty training. A big problem is that instructors differ in how up to date their practice is, which leads to different expectations depending on the instructor. This confuses students and makes them feel treated unfairly. Through lifelong learning programs, instructors can record their own patient interactions and receive feedback from specialists in the relevant field.
Ensuring Privacy and Compliance in Clinical Video Assessments
Using video recordings for EPA assessments raises serious privacy concerns, particularly in clinical settings, as recording patients in their most vulnerable moments is very delicate. Doctors therefore have to comply with strict regulations such as GDPR and HIPAA. Many conventional video platforms lack adequate security protections, increasing the likelihood of data breaches and ethical violations. Without sufficient safeguards, institutions struggle to find a balance between objective assessment and patient confidentiality.
Videolab overcomes these issues by offering a secure, regulatory-compliant platform for medical education. All recordings are encrypted and securely stored in the cloud with role-based access controls, which means the doctor who obtained the patient’s consent retains full control over the video. Furthermore, the platform is designed to prevent the creation of duplicate videos or their extraction from the cloud. Videolab is fully committed to protecting both patients’ privacy and doctors from potential legal issues.
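As a purely illustrative sketch of the idea behind role-based access, the snippet below shows how access to a recording could be restricted to the clinician who obtained consent and anyone they explicitly share it with. This is not Videolab’s actual API or implementation, just the general principle.

```python
# Illustrative only: a generic role-based access check for clinical recordings.
# Not Videolab's implementation; names and fields are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Recording:
    video_id: str
    owner: str                       # clinician who obtained the patient's consent
    shared_with: set[str] = field(default_factory=set)

def can_view(recording: Recording, user: str) -> bool:
    """Only the owner or explicitly invited reviewers may access the recording."""
    return user == recording.owner or user in recording.shared_with

rec = Recording(video_id="consult-001", owner="dr.jansen")
rec.shared_with.add("supervisor.lee")

print(can_view(rec, "dr.jansen"))      # True
print(can_view(rec, "supervisor.lee")) # True
print(can_view(rec, "random.staff"))   # False
```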