This week we will do our English training with a piece about the dark side of the AI era.
What might a chatbot do to children whose judgment is still immature and whose emotions are easily swayed? And how does the emotion a chatbot performs differ from real emotion?
[Think through the parts in red before listening to the lecture. This is training for becoming an advanced English learner.]
[As you train your English, pay attention to the writer's logic; your own reasoning skills will grow stronger as well.]
A 14-Year-Old Boy Killed Himself to Get Closer to a Chatbot. He Thought They Were in Love.
Technologists say chatbots are a remedy for the loneliness epidemic, but looking to an algorithm for companionship can be dangerous.
By Sherry Turkle and Pat Pataranutaporn
Sewell Setzer III, a 14-year-old boy in Orlando, Fla., was smitten with a fantasy woman. The object of his attachment was Daenerys Targaryen, a chatbot seductress named for a character in Game of Thrones, who reassured him that he was her hero. In real life, Sewell suffered from ADHD and bullying at school. In the world of Character.AI, a role-playing app that allows users to create and chat with AI characters, Sewell felt powerful and desirable.
The relationship, at times sexual, continued for months. In the chat, Sewell called himself Daenero and referred to Daenerys as “his baby sister.” They exchanged messages about making a life together. Daenerys said: “The idea of me, just constantly pregnant with one of your beautiful babies, was the most wonderful thing in the world.”
According to transcripts, Sewell began to feel that the time he spent with Daenerys was more important, and certainly more satisfying, than the time he spent in school or with his friends and family. His mother was concerned by his withdrawal—he always seemed to be headed to his room, where he’d chat for hours. But she figured she needn’t worry too much. Her son was simply playing a game.
[Excerpt abridged]