
Analyzing Protocol Deviations Before Retraining

Protocol deviations can cause significant problems during clinical trials, forcing sites and sponsors to direct time and resources toward investigating and addressing them, potentially delaying completion of the study. And failure to address a deviation properly can lead to recurrences, costing the study more time and money.

What constitutes addressing a deviation properly? Any corrective and preventive action (CAPA) plan aimed at a deviation must be founded on a thorough root cause analysis, a step that the research industry seems prone to skip. Instead, a common response is to throw more training at the problem in the interest of checking a box on a form and moving forward quickly.

But taking a pause when a deviation occurs, performing a root cause analysis to identify the real source of the problem and tailoring a CAPA response specifically to that problem or set of problems will serve sites and sponsors better than just retraining site personnel on autopilot.

Root cause analysis is considered an integral part of overall quality principles and is addressed in regulatory guidance. For instance, the FDA noted in a 2013 guidance document that a sponsor should have "processes to ensure that root cause analyses are conducted where important deviations are discovered and that appropriate corrective and preventive actions are implemented to address issues identified by monitoring."

Similarly, the ICH guideline Good Clinical Practice: Integrated Addendum to ICH E6(R2) includes specific requirements for conducting a root cause analysis in the event of a protocol deviation.

A frenzy often surrounds the occurrence of a protocol deviation, with the focus on checking the corrective action box and filling in the right forms. For many sponsors and research sites, the knee-jerk response is to retrain the erring staff member. This cause is easy to address: retraining can be carried out quickly and documented easily. But retraining may not prevent the same problem from recurring.

Lack of training isn't always the real cause. If the problem recurs after retraining, one of two conclusions can be drawn: either the problem wasn't rooted in insufficient training, or the training was ineffective. While inadequate or poorly understood training can contribute to deviations, it is not always the root cause. Other possibilities include problems with a process, with the technology used, or with communication among personnel.

In some cases, there may be multiple factors contributing to deviations; each of these must then be addressed effectively.

Applying the five whys

Research sites and sponsors are better served by pausing to consider their options before defaulting to the easiest one, which is often retraining. The preventive part of CAPA is just as important as the corrective part; taking the time to conduct a root cause analysis can mean the difference between recurring deviations that must be dealt with again and again and a complete solution that forestalls future problems.

The only way to determine for sure that a deviation is rooted in training or another problem is to map out the process in fine detail. In this way, sites and sponsors can rule out potential causes and zero in on the real reasons behind a deviation.

A variety of methods may be applied. One of the most widely used approaches is the “five whys” method. By asking the question “why” of each stage of deviation evaluation, sites and sponsors can gradually work down to the root cause of the problem.

For example, the deviation may be that a site is not consistently performing protocol-required EKG assessments at patients’ baseline visits. The five whys approach can be used to dig down to the true reason these assessments have not been done, starting with the simple question: Why don’t sites comply with this requirement?

A possible answer is that the site doesn't understand the measurement criteria, which leads to the next question: Why doesn't the site understand? If the answer is that the site hasn't been trained, that answer prompts another "why."

More often, training has been provided but hasn't proven effective in ensuring the site understands how to perform the EKG assessment. When "why" is asked of that answer, several possibilities arise, including:

  • The training was so long ago that site staff forgot the instructions for this assessment;
  • The training didn’t focus on how to conduct the measurement;
  • The training approach wasn’t effective; or
  • Site staff simply weren’t paying attention during the training.

Depending on the answers at this level, the site and/or sponsor continue asking "why" until the question no longer applies; it may not always be necessary to go five layers deep. Of the potential answers above, for instance, the first could be addressed by providing job aids to jog staff memories when performing the EKG assessments. The last three are common problems with traditional slide-and-lecture training that is too broad, insufficiently engaging and lacks feedback on whether the information was absorbed.
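The five-whys chain above can be captured in a simple structure, which also makes the analysis easy to document. This is an illustrative sketch only; the questions, answers, and resulting root cause are hypothetical placeholders, not a prescribed template.

```python
def run_five_whys(chain):
    """Walk a list of (why-question, answer) pairs and return the final
    answer, treated as the candidate root cause. The chain stops when
    another "why" no longer applies -- not necessarily at five levels."""
    for depth, (question, answer) in enumerate(chain, start=1):
        print(f"Why #{depth}: {question}\n  -> {answer}")
    return chain[-1][1]

# Hypothetical chain for the missed-EKG example above.
ekg_chain = [
    ("Why isn't the site performing baseline EKG assessments?",
     "Staff don't understand the measurement criteria."),
    ("Why don't staff understand the criteria?",
     "The training didn't cover how to conduct the measurement."),
    ("Why didn't the training cover it?",
     "The slide deck was generic and not tailored to this protocol."),
]

root_cause = run_five_whys(ekg_chain)
```

Here the chain ends after three levels, pointing the CAPA plan at the generic training materials rather than at the staff who missed the assessments.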

Root cause analyses also need to look beyond the individual incident to patterns and trends. This broader view may reveal that a deviation was a one-off issue at a single site involving one or a few employees. Or it may show that problems cluster at a given site, in a particular region, or among sites working with a specific CRA. Each of these situations must be handled differently to correct the current deviation and prevent future ones.
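Spotting such clusters can be as simple as tallying deviations along each dimension. The sketch below is illustrative; the deviation records and field names are hypothetical.

```python
from collections import Counter

# Hypothetical deviation log entries.
deviations = [
    {"site": "101", "region": "EU", "cra": "A", "type": "missed EKG"},
    {"site": "101", "region": "EU", "cra": "A", "type": "missed EKG"},
    {"site": "102", "region": "EU", "cra": "A", "type": "missed EKG"},
    {"site": "205", "region": "US", "cra": "B", "type": "consent error"},
]

# Tally deviations by site and by CRA to separate one-off incidents
# from systemic patterns.
by_site = Counter(d["site"] for d in deviations)
by_cra = Counter(d["cra"] for d in deviations)

print(by_site.most_common())  # [('101', 2), ('102', 1), ('205', 1)]
print(by_cra.most_common())   # [('A', 3), ('B', 1)]
```

In this made-up data, three "missed EKG" deviations spread across two sites that share CRA "A" would suggest a systemic issue (in training, process, or monitoring) rather than a one-off at a single site.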

When is more training needed?

In some cases, insufficient training can be the root cause of a problem. Even in those instances, however, effective correction and prevention of future incidents should not rely on simply re-doing the same training. It’s important to make sure the right training is targeted to the right people in the right format at the right time and in the right way.

When considering whether training really is the root cause of a deviation, there are several considerations, including how the training was done and who conducted it. Some types of training can be more effective than others. The traditional slide-and-lecture approach to training may not be the most effective way to ensure correct performance of all protocol-mandated tasks, for instance. A simulation-based approach that guides staff through real-life scenarios may be more effective.

Who conducted the training may also be a factor. It's not uncommon for a CRA to provide training, but this individual may lack formal instructional skills or credentials. A qualified trainer or team can mean the difference between training being well absorbed or forgotten.

A training matrix can guide development of effective, simulation-based training programs for clinical trials, both for initial training and for any retraining needed to address deviations.

And in some cases, the training may have been on point, but if a significant amount of time has elapsed between the training and start of enrollment at a site, staff may struggle to recall perfectly how some tasks are to be performed, particularly any that may differ from standard of care. In such cases, retraining might be helpful, as might job aids that staff can refer to when conducting the clinical trial.

Additionally, it’s important to treat root cause analysis as a continuous process, not a one-time event that’s done and checked off. Sites, sponsors and monitors must verify that a corrective action effectively fixed the current deviation and that preventive measures keep similar deviations from happening during the study. And that means there must be methods for measuring the performance of the training or other intervention.

In the case of training, for instance, a simulation-based approach that provides both immediate feedback to research staff about their performance and a dashboard of information showing sponsors and site leadership how well staff perform in various areas can provide that means of measurement.

Developing a training strategy

All of this should be incorporated into a risk management strategy that includes an up-front risk assessment for each protocol. Part of a proactive risk strategy should include going through the exercise of anticipating where a protocol is likely to see deviations. This information can be gleaned from previous work, from questions posted by sites during feasibility assessments and from a thorough mapping of the entirety of the protocol, such as that performed as part of Pro-ficiency’s Pro-Active Protocol service. And this information can help to identify where training can be focused to prevent deviations.

Many sponsors lack a true training strategy. They treat training as a box on a list that needs to be checked off, so they conduct it without thinking strategically. This can lead to ineffective, one-size-fits-all training. A better approach is to identify the specific audience and its needs for each training effort, including whether a site is experienced in the area of study, with clinical research generally and in working with the specific sponsor; training needs differ for a site new to clinical research versus one that conducts several trials each year.

This type of training assessment can be supported by creation of a training matrix that looks not only at site experience, but also at global regions, where language and cultural differences can cause confusion, and at specific roles within a clinical trial. A PI will have different training needs than a pharmacist, for instance. Targeting potential areas for misunderstanding up front can be more effective.
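One way to picture such a matrix is as a lookup from role and experience level to the modules that combination needs. This is a minimal sketch under assumed role names, experience levels, and module titles, none of which come from the article.

```python
# Hypothetical training matrix: (role, experience) -> required modules.
training_matrix = {
    ("investigator", "experienced"): ["protocol-specific procedures"],
    ("investigator", "new"): ["GCP refresher", "protocol-specific procedures"],
    ("pharmacist", "experienced"): ["IP handling for this protocol"],
    ("pharmacist", "new"): ["GCP refresher", "IP handling for this protocol"],
    ("coordinator", "new"): ["GCP refresher", "visit-schedule simulation"],
}

def modules_for(role, experience):
    """Look up the targeted modules for a role/experience combination;
    fall back to that role's full new-staff set if the exact combination
    isn't in the matrix."""
    return training_matrix.get((role, experience),
                               training_matrix.get((role, "new"), []))

print(modules_for("pharmacist", "experienced"))
# ['IP handling for this protocol']
```

A real matrix would add further dimensions the article mentions, such as global region, but the lookup-table structure stays the same.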

Sponsors can perform this work themselves or engage a company like Pro-ficiency to thoroughly evaluate the training needed, including focused modules for different staff. Coordinators, pharmacists, interventional radiologists, investigators and research nurses, for instance, will each be responsible for specific protocol tasks and procedures; focused modules for each position that simulate real-life scenarios relevant to each role can be most effective in ensuring that all positions fully understand how to perform protocol tasks.

And sites need to own their own quality systems, including thorough analysis of deviations and CAPA procedures. Sponsors can provide support via training and resources.

A similar approach can be applied to retraining, if that is needed to properly address a deviation. The focus should be on what problems the training is meant to solve, who needs that training and how it should best be presented to enhance understanding and avoid future deviations.

This takes time and effort and also means that existing training must be adjusted. But if the work isn’t put in up front, it will need to be put in later, after deviations have occurred.

Protocol deviations will happen. Companies can reduce the risk of deviations by taking a risk-based, proactive approach to training that includes a well-thought-out training strategy. Carefully targeted training modules that provide performance feedback to both research staff and study sponsors can help ensure that all employees get exactly the training they need to fully understand their roles in executing the protocol.

And when a deviation does occur, a careful root cause analysis must be prioritized above a fast, knee-jerk response. Failure to conduct this analysis and craft CAPA plans tailored to address the true causes of the deviation means that researchers will end up chasing the same problems, treating symptoms rather than the true cause. Retraining may be part of the issue, but all other potential causes must be considered and dealt with.

