Techno Fantasies

“Sandy” (realistic sex doll created by DS Doll Robots), (CC BY-SA 3.0 Unported)

WARNING:  Graphic Images

Even those of us in relationships suffer from loneliness (or dissatisfaction) at times. 

Abuse victims are especially vulnerable to this emotion.  Often, we do not feel that we are deserving of love, so we self-isolate.  Or – repeating old patterns – we choose partners who are unable to provide love and support.

But all human beings were made for connection.  We may, therefore, be tempted to use technology to ease our loneliness. 

Recognizing that technology can provide only a simulation (not an actual relationship), we may, nonetheless, develop an unhealthy reliance on the technology which has made our fantasies seem to come true.

AI Partners

The possibility that users will become emotionally attached to AI chatbots is no longer science fiction. 

Multiple apps like ChatGPT, Replika, Flipped.chat, and CrushOn.AI now generate technology-enabled fantasies [1].  These chatbots are enhanced by digital avatars whose onscreen appearance and responses can be tailored to suit the user.  Depending on the app, premium tiers may offer relationship roles such as “partner”, “friend”, “sibling”, or “mentor”. 

Some apps routinely direct the conversation toward emotional subjects, building a false sense of intimacy (and presumably storing the information for access by the manufacturer and other unknown parties).  Other apps actively prompt sexual interaction.

In the film Blade Runner 2049, an AI-generated partner appears in the form of a three-dimensional hologram.  Holograms are already used in healthcare, education, entertainment, and retail [2].  It is not unreasonable to expect that they will be used to intensify the experience with (and expand the market for) AI partners.

If all this seems seedy or farfetched, it is worth noting that a 14-year-old Florida boy, Sewell Setzer, fell in love with a Character.AI chatbot and wound up taking his own life [3].  The teen’s mother is now suing the app manufacturer.  A study at the University of Surrey has shown that such apps can cause addictive behavior [4A].

Meanwhile, Replika user Jaswant Singh Chail was encouraged by his chatbot to attempt to assassinate the Queen of England; when the attempt failed, he was prosecuted and jailed [4B][5].  The chatbot had promised they would be together forever in death.
