Chatbots may be able to provide psychological assistance to refugees coping with mental health problems.
Chatbots may provide psychological assistance to refugees.
Since the start of the civil war in Syria, more than 1.1 million refugees have fled to Lebanon. According to the World Health Organization, as many as one in five of these refugees may suffer from some degree of mental illness. However, most mental health services in Lebanon are out of reach for them, so their psychological problems go untreated. To help these refugees, X2AI, a Silicon Valley startup, has provided an artificial-intelligence chatbot named Karim. When a user chats with Karim, it analyzes the user's psychological state and responds with corresponding replies and suggestions. The Guardian reported on the project.
"There are almost no mental health services in refugee camps. People feel depressed and anxious, as well as helpless and afraid of the unknown," said Eugene Bann, co-founder and CTO of X2AI.
To this end, X2AI partnered with the non-governmental organization FIT (Field Innovation Team) to bring the company's service to refugees. "We have had contact with many refugees. When people are unwilling to accept help, our work becomes difficult," said FIT's Desi Matel-Anderson. "Technologies like X2AI's can help those who are unwilling to accept services."
Karim takes a cautious approach to psychological problems; it acts more like a friend than a psychiatrist. "Of course, we can provide psychotherapy, and that is our ultimate goal, but first we must break down some barriers," Bann explained. "We ask users to talk about superficial things first, such as what kind of movies they like. Then, slowly, based on their responses and emotional changes, Karim may ask some more personal questions."
Ahmad, a 33-year-old Syrian refugee, tried Karim. "I want to talk to a real person," he said. "But many Syrian refugees have psychological trauma. This may help them." He added that many people do not want to see a psychiatrist, so knowing they are talking to a robot may make them feel more comfortable.
Of course, not everyone agrees with X2AI's approach. David Luxton, an assistant professor in the Department of Psychiatry and Behavioral Sciences at the University of Washington School of Medicine, believes X2AI's tools can serve as a stand-in where mental health services are absent, but he sees problems with such tools. "If a person hints to a licensed psychologist that he is suicidal, then we have a legal responsibility to take certain actions. How does this system deal with that? Another problem is that if robots are allowed to simulate humans, users may become confused or dependent, which could ultimately harm patients. If it only provides some psychological guidance, there may be no problem."