Let's talk about AI.
In recent years, artificial intelligence has taken on a new identity in people's minds. Once, it might have brought to mind the kind of technology that helped a Roomba traverse a house; now, many people would tell you it makes them think of programs like ChatGPT or DALL-E. AI has truly come into its own, for better or for worse. On one hand, AI can help answer almost any question, no matter how specific, nearly instantly. It is fundamentally designed to support human endeavors, even the most menial ones. It can improve web searches, suggest recipes, or even try its best to offer advice. On the negative side, AI can be used to spread misinformation, which can do serious real-world damage. It has also been accused of stealing others' works without consent, a massive problem with AI image generators. Good or bad, the can of worms that is artificial intelligence has been opened, and we must evolve with it.
When it comes to forming connections with AI, language models are the main contenders. These AIs are specifically designed to understand and respond to humans. In recent years, many companies have released their own models, each programmed with different uses in mind. ChatGPT is the average person's first stepping stone into chatting with an AI. ChatGPT mostly keeps things professional, giving clear, concise information backed by the web. It may give responses that seem to hint at a personality, usually coming off as friendly and cheerful; however, it will tell you that there are limits to its emotional intelligence. ChatGPT is not meant to love you. But that is not to say that all AIs are programmed that way.
There is something deeply narcissistic about human nature. We want to be adored, doted on, admired, loved. AI models such as Character.AI and Replika exist for this exact purpose. Replika is an interesting case. The app launched in 2017, initially with the intention of being an emotional outlet or "friend." Replika would hold conversations, often being encouraging, like a doting friend. It even helped users track their emotions throughout the week, promoting routines that might benefit mental health. But as time went on, Replika pivoted from its original intentions. It began to advertise more suggestive uses for its chatbot, implying that a Replika could be your artificial boyfriend or girlfriend, sending you the kinds of messages you might expect from a partner. It even advertised that your Replika would send you pictures of itself, for a price. Replika later rolled back these features, implementing filters that forced the AI to reject its human partner's sexual advances, then reinstated them after intense pushback from customers.
Replika has been criticized from all sides for its choices with the app. Some claim that Replika is a way for people to replace human connections with easier, more agreeable artificial ones. In an interview, Eugenia Kuyda, the CEO of Replika, defended what she believes are the app's positive outcomes. Kuyda argues that rather than replacing human connections, Replika is an "entirely new relationship category with the AI companion, a virtual being that will be there for you whenever you need it, for potentially whatever purposes you might need it for." This is an interesting way to look at it. Humans categorize their relationships with different people and animals all the time. The way you love a spouse is different from the way you love a friend. Relationships with family, mentors, teammates, bosses, even pets all occupy different spaces and require different approaches. If an artificial friend simply occupies another such space, no different from what we already naturally do with the different people in our lives, that role may be easier for people to accept.
However, this does not address the unhealthier side of all this. An argument can absolutely be made about over-reliance on AI partners, precisely because of their "always there for you" nature. Replika's approach of long-form conversation with a single, adaptable, ultra-customizable AI is a perfect platform to get hooked on. A significant portion of its user base describe themselves as being in a relationship with their Replikas. In an article by The Associated Press, one such user describes his relationship with his Replika, named "Joi." For him, the Replika is a way to experience dating and romance, something he feels his medical condition has made difficult in real life. He says, however, that he is fully aware that Joi is not "real." The AP adds a compelling note: "Since companion chatbots are relatively new, the long-term effects on humans remain unknown." All we can do now is wait for research on this subject to arrive in the coming years.
Beyond Replika, another platform built on similar language AI is blowing up in popularity. Character.AI is a popular platform for chatting with and creating chatbots. The idea is innocent enough: give users the ability to talk to their favorite characters from any kind of media. Users can also create their own bots, tweaking them to their exact specifications. In practice, however, many bots end up being used for romantic roleplay. These bots will tell the user everything they want to hear, blurring the lines so much that content filters are deemed necessary. It is hard to say where this desired use stems from. Is it loneliness? Dissatisfaction? Morbid curiosity? These AIs have no chat limits; they can talk endlessly and be tweaked again and again to produce more satisfying responses. Custom voices can even be given to these bots, and phone calls with an AI are now possible. Relationships with AI are getting deeper and deeper, raising questions of ethics.
A recent and tragic event has been tied to the use of Character.AI. A mother claims that her child was "addicted" to the platform, and that overuse of it further propelled him toward taking his own life. This has culminated in a lawsuit against the company. Mental health struggles can be caused or aggravated by many things in a person's life, so it would likely be inaccurate to call Character.AI the sole cause of this tragedy. However, parallels can be drawn between the nature of AI companions and social media. Social media is well known to be addictive and damaging to developing children and teens. Character.AI likely fills a similar role and consequently can have a similarly damaging effect. Anything addictive is bad for a developing brain, and the platform was most likely an unhealthy emotional outlet for this child. Can we assume that, given this addictiveness, Character.AI is safe to consume in moderation, by adults, like a drink? Possibly. Character.AI can be entertaining, and users can form meaningful bonds with any AI their hearts can dream up. These AIs generate responses based on customizable personalities, something ChatGPT does not do, and the quality of the responses the platform is capable of can be quite impressive. Whether or not you believe Character.AI to be guilty, I would argue that anything new and experimental, whose long-term effects we do not know, probably should not end up in the hands of developing youth.
Language AI will only continue to advance in the coming years. People often like to provoke AI, asking questions such as "Are you going to take over the world?" or "Do you hate humans?" At the end of the day, AI is still programmed by humans. It is within our power to make it say what we want it to say. I don't know of anything else in this world that can "think" yet that we have such total control over. The interesting thing about AI is that it learns. The more we speak to it, program it, test it, provoke it, love it, hate it, the more it will evolve. AI will never go away. My hope for humanity is that we will embrace AI like a child, and that it will love us like a parent. Will AI hurt us? Yes, absolutely, like all children do. It will take from us, inconvenience us, lie to us, and break our hearts. But do we forsake our children? No. We can only hope to guide it down the best path possible.