Can AI overhaul the social care system? That’s the pitch underpinning UK home care provider Cera’s plans. The startup has today taken the tiniest of baby steps toward launching an AI chatbot that it hopes will, at an unspecified point in the future, be able to assist carers with recommendations for the home care of people with conditions such as dementia. And even, potentially, head off medical emergencies via pre-emptive alerts.
The far more basic reality of the chatbot it’s launching today is that Martha (as it’s called) will be able to recommend care packages to potential customers. Which underlines how the inflated promises of AI really do hinge on data acquisition. In Cera’s case it’s largely leaning on its social care workers to generate the underlying data to train the AI. These human workers will be tasked with creating the data points to fill out the care records that will be used to power the chatbot’s future care recommendations and alerts.
And while there are plenty of symptom-checker type AIs already out there, Cera’s positioning in the social care space sets it apart from other platforms, argues co-founder Ben Maruthappu, given it’s not aiming for the chatbot to be used directly by its clients (who may not be capable of using a smartphone app, for example), but rather to act as decision support for their carers.
Here “AI has the scope to be very impactful”, he argues.
The startup, which bills itself as a “tech-enabled home care provider”, launched its social care matching platform last November, and has raised some $3.4 million to date from investors including Kima Ventures and Credo Ventures.
It has “hundreds” of care workers on its platform at this point, according to Maruthappu, and has delivered tens of thousands of care hours — “accruing millions of data points”, as it couches it.
Maruthappu says Cera is firstly using technology to accelerate the process of matching appropriate care workers to clients, as a route to outmanoeuvre traditional providers. It is also applying tech to squeeze back-end costs so it can spend more on front-end care and on compensation for care workers, in an attempt to raise quality standards in an industry that has been beset by scandals.
Ultimately, though, it is also hoping all those care-related “data points” being gathered by care workers on its platform will be able to power an AI that it can deploy to augment its future care services with decision support at scale, and provide even more of a differentiator vs traditional care providers.
The chatbot, which is being developed in concert with Bloomsbury AI, a machine reading spinout from London’s UCL, will use machine reading and deep learning to dispense personalized care advice.
Maruthappu gives the example of a care worker messaging Martha to say that a patient is feeling a bit hot and the AI then pulling relevant info from their care records — noting the patient had a cough last week, and telling them to check for a temperature and other symptoms in case the patient has a chest infection.
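The flow Maruthappu describes (a carer’s message triggering a lookup against the client’s care record, which then prompts a symptom check) can be illustrated with a deliberately simplified, rule-based sketch. To be clear, this is purely illustrative: all names, records, and rules below are hypothetical, and Cera’s actual system is being built on machine reading and deep learning with Bloomsbury AI, not keyword matching.

```python
from datetime import date, timedelta

# Hypothetical care-record entries: (entry date, free-text note).
care_record = [
    (date(2017, 5, 1), "Client had a persistent cough this week"),
    (date(2017, 5, 6), "Appetite good, mood settled"),
]

# Toy symptom rules: if a carer's message contains the trigger term and
# the care record mentions a related term within the lookback window,
# surface the associated advice.
RULES = {
    "hot": {
        "related": "cough",
        "lookback_days": 14,
        "advice": "Check temperature and other symptoms; "
                  "possible chest infection.",
    },
}

def decision_support(message, record, today):
    """Return advice strings for a carer's free-text message."""
    advice = []
    for trigger, rule in RULES.items():
        if trigger not in message.lower():
            continue
        window_start = today - timedelta(days=rule["lookback_days"])
        for entry_date, note in record:
            if entry_date >= window_start and rule["related"] in note.lower():
                advice.append(rule["advice"])
                break
    return advice

print(decision_support("The client feels a bit hot today",
                       care_record, today=date(2017, 5, 8)))
```

The point of the sketch is the shape of the interaction, not the method: the carer stays in the loop and the system only surfaces a suggestion, matching the decision-support framing Maruthappu describes.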
“We’re going to use Martha [for] supporting our care workers in providing better quality care. Essentially raising the ceiling on the standard that is delivered,” he says of this future plan.
“We [also] want Martha to be able to predict if people are going to deteriorate… Based on reading previous entries in care records Martha will flag alerts and essentially pre-empt a person’s deterioration so that care workers and family members can be adequately alerted and a proactive approach can be taken to their care.”
He won’t give a time frame for launching the predictive alerts, but decision-support should be coming later this year, he says.
Of course, should Martha actually be in a position to start dispensing care recommendations, it would likely need to be registered as a medical device with the UK’s regulatory body, the MHRA. And Maruthappu confirms Cera has not currently registered the app, since it’s merely dispensing sales suggestions to potential customers at this point.
Is the aim to use the AI to effectively upskill care workers with medical training? He says it’s not to upskill them to the level of trained nurses, for example, but to offer decision-support so they may be better able to identify when “escalated care” might be required.
Maruthappu also argues that a chatbot interface that can be used to keep track of individual patients’ care records can help quality of service in instances where a client might be seen by multiple care workers — helping to join the dots in their care over time.
Ten weeks in, Cera has partnered with ten NHS organizations, which Maruthappu says collectively cover a population of around six million people.
“We offer a higher quality, more efficient and transparent service,” he says, discussing the business’ pitch to the healthcare organizations it’s selling services to. “At the moment bed blocking is a tremendous issue in the NHS. This is essentially where a patient who’s in a hospital is medically fit to be discharged to go home but for non-medical reasons they don’t go home.”
“And if you look across winter the number one reason why people weren’t discharged when they could have been is because their home care package was not organized… This is a massively growing problem for the health service,” he adds.
What are the risks of having an automated technology dispensing what amounts to medical advice that may then be actioned by a human? “It’s fundamentally care advice, and it is decision-support, but these are all things that are within the remit of a high quality care worker. We’re simply trying to support and increase consistency in the care,” argues Maruthappu.
“As an analogy, if a taxi driver needs to go from A to B and they’re using a maps app, the maps app is supporting them but ultimately it’s the driver who is driving, who is making decisions about the route — and if they need to change the route it will do that accordingly.”
“Apps are simply an enabler which can potentially improve efficiency and quality,” he adds. “But ultimately it is up to the person delivering the services to make appropriate decisions and manage that responsibly.”