03.2

The New Care Relationship

Chapter 3

Empowering Patients as Co-Pilots

People have become used to digital services that are intuitive, transparent and responsive. Yet many patients remain cautious about digital health - particularly where AI is involved - driven by concerns around safety, privacy and accountability.

This is why ensuring patients have autonomy and are engaged in their care is critical. Patients who are more engaged in their care see better health outcomes. Engaged patients cost healthcare systems 8% less than non-engaged patients in the first year, rising to 21% less in subsequent years.[V]

Concerns around safety and data privacy persist, as does a wider distrust of AI in providing clinical advice: a 2025 report from pharmaceutical manufacturer STADA Thornton and Ross[VI] found that around 60% of UK-based patients would not trust AI to deliver accurate or safe health advice. That caution is understandable: inconsistent experiences across platforms undermine trust, and digital exclusion remains a genuine risk.

But this also gives us a clear design brief for next generation care: trust is built through clarity, control and a dependable “human backstop.”

Wanting to be more engaged as a patient isn’t new; a 2022 survey by the Personalised Care Institute[VII] showed that around 45% of patients wanted more involvement in their healthcare decisions. Patients should be active co-pilots in their care, with more agency over access, information and next steps. That means designing services from the perspective of someone using digital health for the first time, especially when they are anxious, unwell, or supporting a family member.

In practice, that requires:  

  • Clarity about what happens next and why.
  • Transparency about available options and how choices are made.
  • Seamless escalation to a human clinician whenever needed.

Virtual care can and should be the default where appropriate, but with safety, choice and escalation built in. Trust grows when people understand where accountability sits, when plain language replaces jargon, and when outcomes and experience data are visible.

Recommendations

  • Set national standards for plain-language transparency on how digital tools (including AI) are used.
  • Provide digital navigation support for less confident users.
  • Co-design services with patient groups and publish outcomes to build trust.
  • Strengthen redress and accountability mechanisms alongside data protection.
  • Mandate real-time visibility via the NHS App: make it a standard that all diagnostic outcomes, referrals and next steps are visible to the patient immediately, ending the era of uncertainty and ‘waiting to hear’.

Chapter 4

Enabling Clinicians to Practise at the Top of Their Licence

Burnout is not simply about workload. It is driven by workflow: fragmented systems, duplication, and administrative burden that drain time and erode professional satisfaction. When clinicians spend large portions of their day documenting, chasing information, and managing logistics, empathy and relationship-building are the first casualties.

The British Medical Bulletin[VIII] recently analysed 15 separate studies comparing the empathy of AI chatbot responses with those of human healthcare professionals. In 13 of the 15 studies, the AI chatbot received significantly higher empathy ratings.

It is also why the debate about AI in healthcare can become distorted. The aim is not automation for its own sake, nor is it to replace clinicians. The aim is to remove avoidable friction so clinicians can spend more time doing what only humans can do: exercise complex judgement, show empathy and hold clinical accountability.

In next generation care, technology acts as a clinician co-pilot:

  • Triage and routing to the right clinician and modality first time
  • Documentation support (e.g., note drafting, coding prompts, structured summaries)
  • Workflow automation (results chasing, referrals, follow-up nudges, scheduling)
  • Standardised pathways that reduce variation while preserving clinician judgement

Operationally, this is where digital-first models unlock real capacity: fewer duplicated steps, fewer avoidable appointments, fewer broken handovers, and a single integrated view of the patient. Clinicians can work across modalities, from asynchronous messaging to chat, video and in-person care, without the system creating extra work at every transition. The impact of AI in healthcare delivery is clear. Research, in partnership with Microsoft and led by an academic at Goldsmiths, University of London,[IX] found that by using generative AI tools, doctors could save four hours of admin time each week, and nurses five.

Used well, technology makes the system more humane.

Ultimately, understanding the role AI can play in healthcare is key to visualising what the next generation clinician experience could look like.

Recommendations

  • Prioritise technology-driven solutions and tools that reduce workflow burden.
  • Involve professional bodies in setting standards and assurance.
  • Tie procurement to clinician time saved and documentation quality.
  • Invest in clinician training and change management.
