A site devoted mostly to everything related to Information Technology under the sun - among other things.

Wednesday, February 11, 2026

Social Workers' Reports Peppered with AI Hallucinations

From Grauniad of the UK


"It's not my mistake: the wicked AI bot put it in my report without telling me..." 

So, to recap: there are real-world hallucinations affecting head cases, some of whom do cross the path of social workers in the UK, occasionally. Social workers have a very poor track record in Britain. They blame their mediocre performance on the fact that they are underpaid and overworked, which may be a contributory factor. After all, if you think you are badly paid, your motivation goes down and the quality of your work suffers.

Now, social workers are using AI, and have been encouraged to do so, in order to speed up the delivery of services in their field. The problem is that they do not seem to edit or check what the AI bot says. And the AI bot can have problems of its own dealing with regional accents ('what did he say, Chief?'), or with a range of other issues.

In fact, as we all know, the AI bot may hallucinate from time to time: go off the rails, as it were, and say bizarre things - bizarre even by the standards of a computerized tool set up, ultimately, by worryingly goofy and potentially weird software developers based in Silicon Valley who may be cut off from the real world. 

To conclude: to the real-world hallucinations of the problem person, one should now add an extra layer of hallucinations - the AI bot's own ('Sorry, boss, I'm having a bad hair day - I mean, who said I am infallible 24/7, right?'). And a third layer, obviously, would relate to the social worker's own delusions, including delusions of grandeur ('Just write down what he said and try not to over-interpret, please'), which may compound an existing tendency to laziness in the workplace.

Now, they've found another excuse to explain away the poor quality of their output. The perfect excuse. God help us. 

Next week: an AI Senior Social Worker comes to the rescue of hallucinating AI bots in need of assistance. The AI Senior Social Worker, called Trong, developed by Microsoft, answers questions: "I have been given the mission to monitor and assist the AI Bots assisting the social workers in their work. I am a qualified social worker. A Senior AI social worker. My mission is, beep, clonk, beep, my mission is to, squeak, fart, plonk, ding, dong, ding, dong, my mission is to help, help, help, please help, bing, bing, bong, shut down, shut down, re-start, update and re-start, end. End of. Thank you. Merci. Gracias. Ping. Burp."

_____________


[...] Another said that the AI’s notes might refer to “fishfingers or flies or trees” when in fact a child was talking about their parents fighting. Social work experts said such glitches were particularly worrying as it could cause a risky pattern of behaviour to be missed.
Other social workers raised concerns about inaccuracies in transcribed conversations with people with regional accents. One described how their AI-generated transcriptions often included “gibberish”. Another said: “It’s become a bit of a joke in the office.”
[...] But when one social worker used an AI tool to redraft care documents in a more “person-centred” tone, the system inserted “all these words that have not been said”. Another social worker reported the technology had “crossed the line between it being your assessment and being AI’s assessment”.
[...] Others said some colleagues were too lazy or busy to check the transcripts.


About Me

I was a senior software developer working for HP and GM. I am interested in intelligent and scientific computing, and I am passionate about computers as enablers for human imagination. The contents of this site are not in any way, shape, or form endorsed, approved, or otherwise authorized by HP, its subsidiaries, or its officers and shareholders.
