Ahead of his upcoming presentation at Digital Health Rewired 2025, Devesh Sinha, Chief Clinical Information Officer and Stroke Physician at Barking, Havering and Redbridge University Hospitals NHS Trust, reflects on the trust’s experiences with implementing AI in clinical practice and the delicate balance required between innovation and cultural evolution.
As healthcare continues to evolve at an unprecedented pace, integrating Artificial Intelligence (AI) into clinical settings presents extraordinary opportunities and significant challenges. Drawing from our experiences at one of London’s largest acute NHS trusts, I have witnessed firsthand how AI can transform healthcare delivery, but only when we carefully consider the human elements alongside the technological capabilities.
When we began our AI journey, like many healthcare leaders, we were captivated by AI’s potential to revolutionise patient care. However, we soon learned that successful AI implementation in clinical practice requires much more than sophisticated technology: it demands a balance between innovation and cultural adaptation.
For example, our implementation of speech AI for clinical letter dictation at Barking, Havering and Redbridge University Hospitals NHS Trust represented a significant shift. In the traditional process, clinicians dictated letters after patient consultations, and medical secretaries then transcribed, formatted, and processed them before sending them to general practitioners (GPs), a workflow that could take over a month to complete. The initial pilot showed remarkable promise, reducing letter delivery times to GPs from 37 days to just four days, a success that could be attributed largely to the clinicians’ enthusiasm for the pilot.
However, as we scaled the solution, we encountered significant operational doubts, cultural resistance, and varying adoption rates among clinicians. Some staff expressed concerns about job security, as their roles were deeply embedded in this transcription workflow. Others were hesitant to change their established documentation practices and worried about losing essential administrative support.
The adoption curve: a journey of persistence
We have discovered that AI adoption typically follows a natural curve: initial resistance, gradual acceptance, and, ultimately, meaningful integration. During the initial implementation phase, we observed an increase in average letter delivery times, attributable to cultural barriers and a phenomenon known as ‘AI aversion.’ However, sustained support led to delivery times stabilising at 19 to 23 days, an improvement on our starting point, though not as dramatic as the pilot results had suggested.
Key factors contributing to our success included strong clinical foundations, operational partnerships, and perseverance. Over the course of a few months, we reduced delays from 37 days to 21 days, with some shorter letters sent out on the same day as the clinic visit.
This experience emphasises an important lesson: the success of AI in healthcare relies not only on the technology’s capabilities but also on our ability to manage the human aspects of change. As I often say, even the most advanced solution for a complex NHS issue is likely to fail if we do not adequately address how people will adopt it.
Ethics and accountability in AI implementation
One of our most important current initiatives is establishing an AI and analytics ethics group. This is not merely a bureaucratic exercise but a fundamental step in ensuring that our AI implementations serve our entire patient population fairly and effectively, and thorough governance discussions have helped us navigate this territory. The ethics group will help us address complex ethical challenges, particularly in population health-based AI, where biases can significantly impact equality.
The ethics group acts as a crucial checkpoint, ensuring that our clinical practice AI solutions not only solve technical problems but also align with our values as healthcare providers. This is especially important when working with diverse patient populations, where accountable organisations must carefully assess AI systems for potential biases or limitations.
Cultural change and professional impact
The impact of AI on healthcare professionals is a sensitive yet crucial aspect of its implementation. When we launched our stroke imaging AI solution, the response was so strong that colleagues joked we needed to wear a helmet and shield when walking past them. This reaction reflects healthcare professionals’ genuine concerns about AI potentially replacing their roles. The solution was initially met with doubt, and discussions focused on whether the product was merely alluring and on the need to compare different solutions.
Fortunately, AI has since become so integral to the stroke care pathway that regional clinical workflows cannot function effectively without it. Our experience demonstrates that successfully implementing AI in clinical practice is not about replacing existing processes but about enhancing them. By actively engaging with staff, addressing their concerns, and showing how AI can support, rather than replace, their expertise, we have built greater acceptance and trust in these technologies.
Looking ahead: a five-year vision
As we look to the future, our vision for AI in healthcare is ambitious yet rooted in practical reality. We focus on developing clear problem statements before implementing AI solutions, ensuring that we use technology to address specific, well-defined clinical needs rather than adopting AI merely for its own sake. One important lesson I have learned is to avoid “shiny toys” and overhyped AI products.
In the next five years, I envision a healthcare system that seamlessly integrates AI into clinical workflows, supporting decision-making while preserving the essential human elements of healthcare delivery. This vision includes robust ethical frameworks to guide AI implementation, enhanced clinical decision support that complements professional expertise, streamlined administrative processes that free up clinicians for patient care, and AI-enabled tools that improve access to specialist expertise.
Implementing clinical AI is not a sprint but a marathon. Success requires patience, persistence, and a deep understanding of both the technical and human factors involved. As we continue to advance in this field, we must remain focused on our ultimate goal: improving patient care and outcomes.
The lessons we have learned about the importance of cultural adoption, the need for ethical oversight, and the value of clear problem statements will be crucial as we continue to develop and implement AI solutions in healthcare. By sharing these experiences and insights, we hope to contribute to a broader dialogue on effectively and responsibly advancing healthcare through AI implementation.
Our experience at Barking, Havering and Redbridge shows that while AI holds tremendous promise for healthcare transformation, its successful implementation depends on navigating the complex interplay of technology, human factors, and organisational culture. Reflecting on my own background, the academic and scientific side of AI, its publications and degrees, contrasts sharply with the real-life human aspects of AI implementation and delivery. Both are important, but I have come to realise that balancing these perspectives is the key to successful implementation within the NHS. As we continue this journey, maintaining this balanced approach will be crucial for realising the full potential of AI in healthcare. I look forward to sharing more of these insights and experiences at the upcoming Digital Health Rewired Conference on 18-19 March 2025.
By Devesh Sinha, Chief Clinical Information Officer, Stroke Physician, Barking, Havering and Redbridge University Hospitals NHS Trust