Key Findings From Social Care Leaders Roundtable

Digital tools have an important contribution to make in reducing risk in social care, but they should never be used to replace professional judgement. That was the key conclusion of a roundtable hosted by Together for Health and Salesforce Industries on the role of technology in reducing risk for vulnerable people.

Technologies such as AI-driven decision-support tools can only be as good as the data they are built on. As in every area of public service, the pandemic has driven online collaboration in social care and supported greater information sharing between teams.

“We had a workforce which was reluctant to utilise technology, but now they are much more comfortable about using it,” one of the expert panel said.

“Now we are thinking about what the technological approach will look like when we’ve got office-based staff again, and understanding whether they will continue to use technology in the way that they have – so how we engage with our staff, and how they engage with users.”

Collaboration to support people who were shielding during the pandemic created one opportunity to improve data sharing: “We started to talk much more about sharing data. It really has opened the doors and got people thinking about sharing information, particularly with our health colleagues.”

Over the years difficult lessons have been learned about the use of decision support tools, whether technological or paper-based. For example, they must avoid becoming tick-box exercises, and they must never be used as an excuse for a particular course of action that negates professional judgement.

One child protection expert said: “There have been recommendations from serious case reviews and elsewhere that we need to be really careful about ‘checklist social work’ around scoring and rating risk. In terms of child sexual exploitation, in the beginning there was a lot of reliance on waiting for three or four things that meant [action should be taken], but of course, they never, ever account for the whole circumstances of the child’s experience. This should never replace professional judgement.

“AI should be assistive, it should be used to inform, but it shouldn’t be relied upon to make decisions, or be an excuse for a decision that is being made. The technology is there to support professional judgement, not replace it.” 

Social care departments need to rigorously assess all the potential data sources that can help them understand and manage risk: “Have you actually got all of the data? Can you access it? Can you service it? Because it sits in multiple places, some of it will be in people’s heads, it could be paper-based. Can you aggregate it and then use technology to surface that information based on [the situation]?

“So we are going into somebody’s house, there is a potentially vulnerable child. Do we know about all of the adults or the other children within that household? Their interrelationships and so forth? Are we able to surface that kind of information so that when a social worker is walking into that situation they are better informed? That is where things like data surfacing can help inform that professional judgement.”

Please click here for further information on how Salesforce Industries could support your organisation.

Supported by Salesforce Industries
