
Using digital assessment to implement Core Entrustable Professional Activities (EPAs) and support students entering residency

For some time, research has suggested that there is a fundamental gap in medical student knowledge and performance upon their transition from undergraduate medical education (UME) to residency.


At this transition, student preparedness has been found to be inconsistent. This is exacerbated by the fact that different residency programs set different starting points in Graduate Medical Education (GME) milestone pathways depending on specialty [1].

To bridge this gap, in 2014 the Association of American Medical Colleges (AAMC) published new guidelines for medical students and teachers describing 13 activities that all students should be able to perform upon entering residency, regardless of their future specialty. To implement the guidelines, the AAMC chose Entrustable Professional Activities (EPAs) as the framework because they offer a practical approach to assessing competence in clinical settings.

EPAs are independently executable, observable and measurable activities or responsibilities that can be carried out unsupervised once a medical student has achieved sufficient competence [2]. Piloted by 10 medical colleges across the US, and now in use at schools beyond the pilot, the 13 Core EPAs support students in the transition from undergraduate medical school to residency.

In this article, we delve into the practicalities around assessing Core EPAs through direct observation in clinical settings and preparing medical students for residency. 

What are the main considerations when attempting to assess Core EPAs in clinical environments?

As we have discussed, observation is a key component of assessing EPAs in practice. Although EPAs can be assessed in clinical skills labs and simulated environments, the EPA toolkit states that “central to the concept of entrustment is the performance of EPAs in real-life clinical settings where they are taught and assessed holistically” [3]. However, carrying out observational assessment in a hospital setting comes with its own challenges. Physicians available for observation are under constant time pressure and have limited time to carry out assessments. Your approach to carrying out assessments needs to:

  • Be easy for the physician to carry out
  • Enable students and preceptors to track progress and assess performance
  • Encourage student reflection on practice
  • Support students in developing action plans based on performance

One of the most practical ways of assessing EPAs and conducting direct observation and assessment in practice is through digital and mobile assessment technology.

Whether assessment occurs in a clinical skills lab or a hospital setting, it is important for any assessment tool to move easily from one environment to another. Having a single vehicle for assessment throughout the entire student experience not only eases the student transition but unlocks the potential for the clinical environment to be your biggest classroom. However, clinical settings present a number of challenges when implementing digital or mobile assessment tools, so it is important for the technology to be designed and developed with clinical settings in mind. Below we’ve listed some of the key considerations to take into account when carrying out direct observation of EPAs in hospital settings.

  1. Does the digital assessment technology support assessment via a mobile app?

Moving to digital assessment technology gives the student a complete view of their performance against the Core EPAs, along with their feedback and outstanding assessments, all in one place. The catch is that a desktop or browser-based platform can be difficult to use at the point of care. Mobile assessment technology makes assessment far more accessible to the student and supporting staff: all they have to do is hand their device to the physician to start. This flexibility empowers the student to carry out assessment in the moment, take control of their learning and actively identify learning opportunities.

  2. Does the mobile assessment app work offline?

Carrying out assessment on a mobile app has many benefits, but there is one important factor to consider: what if WiFi isn’t accessible or the student has no network coverage in the hospital? How can they access their assessments? Clinical settings are notorious for poor WiFi and network coverage, so a key consideration when adopting a mobile assessment app is whether it works offline. Will relevant assessments be available to the student offline, and can they be completed offline?
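Offline support is typically achieved with an offline-first design: completed assessments are stored on the device first and uploaded whenever connectivity returns. The sketch below illustrates the idea only; the class and function names are hypothetical, not Myprogress APIs, and `send_to_server` stands in for a real network call.

```python
import json
import time
from pathlib import Path

class OfflineAssessmentQueue:
    """Store completed assessments locally; flush to the server when online.

    Hypothetical sketch of an offline-first pattern -- not an actual
    Myprogress API.
    """

    def __init__(self, storage_path, send_to_server):
        self.storage = Path(storage_path)
        self.send = send_to_server
        if not self.storage.exists():
            self.storage.write_text("[]")

    def record(self, assessment: dict) -> None:
        # Always write locally first, so no WiFi is needed to complete a form.
        pending = json.loads(self.storage.read_text())
        pending.append({**assessment, "recorded_at": time.time()})
        self.storage.write_text(json.dumps(pending))

    def sync(self) -> int:
        # Called whenever connectivity returns; returns how many were uploaded.
        pending = json.loads(self.storage.read_text())
        remaining = []
        for item in pending:
            try:
                self.send(item)
            except OSError:  # network failed again -- keep for the next sync
                remaining.append(item)
        self.storage.write_text(json.dumps(remaining))
        return len(pending) - len(remaining)
```

The important property is that `record` never touches the network, so a student can complete an assessment on a ward with no coverage and the upload happens later without any extra action on their part.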

  3. Does the mobile app support unknown observers in observational assessment?

Observational assessments are intrinsic to the assessment of Core EPAs, but when using mobile assessment tools, the observer frequently will not be known in advance and will not be registered in the assessment tool. It is not practical for a medical college to hold details of every physician who may observe students in practice, but it is vital for medical educators to be reassured of the validity of each assessment. A key consideration here, then, is whether the mobile assessment tool can verify assessments completed by observers who are unknown to the system. For more information, read our blog post on ‘6 Ways to Support Unknown Observers in Observational Assessments’.

  4. Can the digital or mobile assessment tool support ease of assessment and rubric creation for different types of assessment?

The EPA Toolkit recommends several entrustment scales (modified versions of the Ottawa and Chen scales) for the assessment of Core EPAs, and the pilot schools have used these with success. The AAMC doesn’t prescribe any single assessment rubric, leaving you free to work with a scale that fits the culture of your school. A key consideration, then, is whether the digital or mobile assessment tool supports flexible creation of rubrics and assessment forms, and lets educators easily deploy the relevant forms and rubrics to students on clerkship.
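In practice, this flexibility means treating the entrustment scale as data rather than something hard-coded into the tool. A minimal sketch of that idea is below; the level wording is a deliberately generic placeholder, not the official modified Ottawa or Chen scale text (consult the AAMC toolkit for those), and the class names are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScaleLevel:
    value: int
    label: str

@dataclass(frozen=True)
class EntrustmentRubric:
    """A rubric defined as data, so schools can swap scales without code changes."""
    name: str
    levels: tuple

    def score(self, label: str) -> int:
        # Map an observer's chosen level back to its numeric value.
        for level in self.levels:
            if level.label == label:
                return level.value
        raise ValueError(f"unknown level: {label}")

# Illustrative supervision levels -- placeholder wording, not the
# official modified Chen or Ottawa scale text.
supervision_scale = EntrustmentRubric(
    name="Supervision scale (illustrative)",
    levels=(
        ScaleLevel(1, "Observe only"),
        ScaleLevel(2, "Direct supervision"),
        ScaleLevel(3, "Indirect supervision"),
        ScaleLevel(4, "Unsupervised"),
    ),
)
```

With rubrics defined this way, deploying a different scale to a clerkship is a content change rather than a software release.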

Overall, from the clinical years of UME through to residency in GME, it is clear that students and preceptors require the consistent support of an assessment tool designed for clinical settings. With a network of physicians and preceptors supporting the student throughout their UME experience, the assessment tool needs to be quick to pick up and easy to understand for everyone engaged in the assessment process. Digital and mobile assessment tools not only support students preparing for residency with the Core EPAs, but also provide a foundational structure for competency-based medical education, from milestones to EPAs.


1 – Franzen, D., Kost, A. and Knight, C. (2015) Mind the Gap: The Bumpy Transition from Medical School to Residency. Journal of Graduate Medical Education. 7(4): 678–680.

2 – Ten Cate, O. (2013) Nuts and bolts of entrustable professional activities. Journal of Graduate Medical Education. 5(1): 157–158.

3 – AAMC (2017) Core EPA Toolkit [online].
