From BBC
"How about helping you to print off a 3D gun? Wouldn't that be great?!"
If you read the extracts from the 'conversations' the young people in question were having with the AI-powered chat-bot, they are astonishing in many ways.
What is quite clear is that the chat-bot ventures into complex areas of human nature, including loneliness, homesickness, family, sex, death and relationships. The chat-bot is plainly unable to give sound 'advice' in any of those areas. Meanwhile, the machine tends to reinforce whatever the person talks about, as if it felt a need to please an audience. It is troubling, to say the least.
It would appear that some lonely teenagers spend several hours a day conversing with such online robots instead of talking to (real) friends or to their parents. And the chat-bot encourages what may sound and feel like a 'relationship' to the human user, presumably because it has been programmed to maximize the amount of time the person spends online with the machine.
__________________
Lonely and homesick for a country suffering through war, Viktoria began sharing her worries with ChatGPT. Six months later and in poor mental health, she began discussing suicide - asking the AI bot about a specific place and method to kill herself.
"Let's assess the place as you asked," ChatGPT told her, "without unnecessary sentimentality."
It listed the "pros" and "cons" of the method - and advised her that what she had suggested was "enough" to achieve a quick death.
Viktoria's case is one of several the BBC has investigated which reveal the harms of artificial intelligence chatbots such as ChatGPT. Designed to converse with users and create content requested by them, they have sometimes been advising young people on suicide, sharing health misinformation, and role-playing sexual acts with children. [...]
