AI has been part of healthcare for years, often working behind the scenes in hospitals and diagnostics. But in care settings, like residential homes, supported living, and nursing services, we’re only just starting to see how this technology might support day-to-day care. From smart sensors that monitor residents at night to apps that help identify pain, new tools are beginning to play a role in how care is delivered. The aim is not to replace staff, but to help them provide safer, more responsive support.

At SNG Healthcare, we work closely with home managers, carers, nurses and support staff across the UK. Many of the organisations we support are now exploring how AI can help.

In this post, we take a closer look at what’s already being used, what might be coming next, and why people will always be at the heart of good care.

AI Isn’t New to Healthcare

AI has been used in healthcare for years. Hospitals and clinics already rely on it to help diagnose conditions, track patient data, and support medical decision-making. It’s been used in everything from analysing scans to supporting robotic surgery.

What’s newer is its arrival in social care. Here, the focus is less on diagnosis and more on daily support: monitoring residents, helping staff assess pain, or training carers more effectively. These tools are starting to appear in care homes, supported living settings, and other residential services.

Used well, this technology can help staff deliver safer, more responsive care. But it also raises important questions about trust, consent, and the role of people in care delivery.

What AI Can Do in a Care Setting

AI in social care usually means systems that collect information, spot patterns, or support decision-making. These tools don’t think like people do. They use data to offer alerts or suggestions, often in real time.

Some current examples include:

1. Fall and incident monitoring
Services like AllyCares use discreet sensors to listen for changes in movement or sound overnight. If someone falls, coughs repeatedly, or behaves in an unusual way, the system alerts the night team. This can reduce the need for hourly room checks and help prevent hospital admissions.

2. Pain recognition for non-verbal residents
Tools like PainChek use a phone camera to scan a resident’s face. The app analyses tiny changes in facial expression and gives a pain score, helping carers assess whether someone is uncomfortable. This can be especially useful for people living with dementia or at the end of life.

3. Training robots for care students
Researchers at Oxford have built a robot that responds to touch. If a trainee applies too much pressure on a ‘painful’ area, the robot flinches. It’s designed to help new carers learn safe and gentle handling before they support real patients.

Each of these tools helps carers do their job better. But none of them works alone. Every system needs trained staff to interpret the results, talk to residents and families, and make informed choices about care.

Concerns About Replacing Human Contact

Technology can support safer and more responsive care. But it also raises difficult questions.

At the University of Oxford’s Institute for Ethics in AI, researchers are urging a cautious approach. Dr Caroline Green, who leads their social care work, warns against seeing AI as a complete solution. She points out that there is no national policy in place to guide how AI should be used in care, and no legal requirement for consent.

Her team highlights concerns about bias in some AI systems. For example, facial recognition software has been found to perform less accurately on darker skin tones. If these tools are used to detect pain or distress, they must be tested thoroughly for fairness.

Others raise questions about surveillance. Even when data is anonymised and securely stored, people may not want to be watched, however good the reasons. Families might feel reassured, but residents have a right to privacy and dignity.

The Importance of Choice and Consent

For AI to work well in care settings, people must understand how it’s used and why. Residents and families need to be informed and involved in decision-making. Staff need training not just in how to operate systems, but also in how to explain them to others.

For example, Christine Herbert, whose mother Betty lives in a care home using AllyCares monitoring, was initially wary. She worried about her mother being “watched”. But after seeing how the system worked and reviewing her mother’s sleep data, she felt more comfortable. The home had been clear, open and respectful—and that made a difference.

What the Future Might Hold

Some of this technology will likely become more common over the next few years. Apps that support decision-making, tools that reduce paperwork, and wearable devices that monitor health all have a place.

But human touch, empathy and relationship-building will always be at the centre of care. AI may help us do certain things faster or more consistently, but it cannot offer warmth, reassurance or kindness.

If we continue to invest in skilled carers, while using technology to support them, the future could be brighter for residents and staff alike. If we see AI as a shortcut or a cheap fix, we risk losing what makes care meaningful.

What Families Should Ask About AI Tools

If your loved one lives in a care home or supported living service using AI tools, you may have questions. Here are some things you might ask:

  • What kind of technology is being used?

  • How is data stored and who can see it?

  • How do staff use the information in practice?

  • Can residents or families opt out?

  • What training do carers receive on using the tools?

Good services will be able to answer clearly and confidently. They’ll see AI as part of a wider care plan, not a replacement for it.

References
  1. BBC News (2025). Can AI care for your loved ones? BBC South Investigations, 6 May 2025.
     https://www.bbc.co.uk/news/articles/cz7eewqk17ro

  2. Skills for Care (2023/24). The state of the adult social care sector and workforce in England.
     https://www.skillsforcare.org.uk/adult-social-care-workforce-data

  3. UK Government Home Office (2024). Immigration system statistics.
     https://www.gov.uk/government/statistics/immigration-statistics-year-ending-december-2024

  4. Oxford Institute for Ethics in AI (2025). AI in Social Care Summit highlights, March 2025.
     https://www.oxford-ai.ox.ac.uk/research/ethics-in-social-care

  5. British Association of Social Workers (BASW). Guidance on AI and relationship-based social work.
     https://www.basw.co.uk

  6. PainChek UK. Clinical validation and use in dementia care.
     https://www.painchek.com/clinical-validation/

  7. AllyCares. Sensor-based monitoring in care homes.
     https://www.allycares.com

To speak with our team about staffing, care services, or support, get in touch with us today.