It astounded me when I found out, a number of years ago, that it takes on average 18 years to implement a new piece of evidence into routine practice. I have spent the last 10 years trying to bend that curve. As anyone who has tried it knows, changing clinical practice and behaviour is hard work – our clinicians are always busy, delivering increasingly complex care in a health system that is inherently resistant to change.
As an implementation scientist, it is sobering to read the major reviews that highlight the low impact that many interventions, including traditional approaches to education, have on clinical practice and behaviour. In the early nineties, online learning emerged as a new tool for delivering more targeted education. While I believe it can do this, I fear we have lost our way pedagogically, with many examples of mandatory online learning programs in health that rely on reviewing endless slide shows followed by poor-quality multiple choice questions that test knowledge that is not important.
When I started using online learning in the 90s, before the advent of Learning Management Systems, we quickly found that learning needed to be scaffolded within a structured program that included a variety of learning tasks, including multiple choice questions set within a clearly defined learning context such as a clinical encounter[i]. We supplemented this with structured forums seeded with questions and facilitated by subject matter experts. We demonstrated that such systems were as effective as face-to-face programs in teaching evidence-based care in high-stakes areas such as managing opioid addiction[ii].
While on sabbatical at Harvard University in 2008, I stumbled on a poster reporting early findings on a new platform called Spaced Ed and made the acquaintance of Price Kerfoot, the inventor of the platform. I was intrigued that, by using the spacing and testing effects alongside well-written and relevant clinical case scenarios, you could have impacts on behaviour that rival academic detailing and other higher-impact interventions.
I have subsequently used the Qstream platform that ultimately grew from Spaced Ed in a variety of contexts to implement behaviour change. I have used it as a standalone program to disseminate a change in practice arising from a new discovery, such as recommendations for familial referral for genetic testing in breast cancer[iii], or to disseminate changes in guidelines around head and neck cancer[iv]. I have also embedded education as one tool within a more complex intervention, such as reducing cancer pain, where Qstream sits alongside other interventions such as changed workflows and forcing functions. And I have found value in wrapping Qstream into integrated, longitudinal safety and quality programs that meet regulatory requirements or respond to adverse events happening in the organisation.
In most of my programs I have found real value in using the gamification aspects of Qstream. I initially included this as a nice-to-have function but increasingly see it as a must-have feature for encouraging participation. This was underlined for me when I ran a competition among young cancer doctors in Denmark, the United States and Australia and achieved over 80% completion in this voluntary program.[v]
As with any educational program, Qstream isn't the solution for all situations, but it has been an extremely useful tool in my armoury of interventions – especially when I identify a specific clinical behaviour change that will result in improved patient outcomes. Where to next? I am now focussing on how we can link micro-learning objects more directly with clinical practice and connect case feedback directly to clinical audit and feedback, which we know impacts practice.
I believe we are seeing a growing recognition of the value of targeted education in supporting implementation science.
References:
- [i] Shaw, T., Barnet, S., McGregor, D., & Avery, J. (2015). Using the Knowledge, Process, Practice Model (KPP) for driving the design and development of online postgraduate medical education. Medical Teacher, 37(1), 53-58.
- [ii] Ryan, G., Lyon, P., Kumar, K., Bell, J., Barnet, S., & Shaw, T. (2007). Online CME: an effective alternative to face-to-face delivery. Medical Teacher, 29(8), 251-257.
- [iii] Robinson, T., Janssen, A., Kirk, J., Defazio, A., Goodwin, A., Tucker, K., & Shaw, T. (2015). New Approaches to Continuing Medical Education: A QStream (spaced education) program for research translation in Ovarian Cancer. Journal of Cancer Education, 32, 476-482.
- [iv] Olver, I., von Dincklage, J., Nicholson, J., & Shaw, T. (2016). Improving uptake of wiki-based guidelines with Qstream education. Medical Education, 50(5), 590-591.
- [v] Janssen, A., Shaw, T., Bradbury, L., Moujaber, T., Nørrelykke, A., Zerillo, J., LaCasce, A., Co, J., Robinson, T., Starr, A., et al. (2016). A mixed methods approach to developing and evaluating oncology trainee education around minimization of adverse events and improved patient quality and safety. BMC Medical Education, 16(91), 1-9.
About Tim Shaw:
Tim is Professor of eHealth and Director of the Research in Implementation Science and eHealth Group (RISe) in the Faculty of Health Sciences at the University of Sydney.
Tim is Health Systems Lead and Director of Workforce Capability in the recently funded 7-year, $112M Cooperative Research Centre (CRC). The CRC brings together academia, industry, government and service providers to transform care.
Tim has led an active research and development team in the University of Sydney since 2000 that has focussed on implementation science and how we capture, use and transfer patient and organisational data to support multidisciplinary care. His current focus is on how we use health data to impact on clinical decision making, quality improvement and professional development. He co-leads the strategic development of Digital Health at the University of Sydney and chairs the Digital Health and Informatics Network.
Over the last 5 years he has been a Chief Investigator on over 30 competitive or commissioned research and development projects totalling over $124M.
Tim has worked closely with lead organisations including the Australian Digital Health Agency, eHealth NSW, the Australian Commission on Safety and Quality in Health Care, Cancer Australia, the Health Education and Training Institute NSW, the Cancer Institute NSW, WHO, the Institute for Healthcare Improvement, the International Society for Quality in Healthcare (ISQua) and Partners Healthcare.
Tim has also acted independently as a consultant to Joint Commission Resources (Chicago), Partners Healthcare International (Boston) and the Royal Australasian College of Surgeons (Melbourne).