Can realistic NPCs and digital companions make people feel less lonely?
Technological advancements have made modern society more connected than ever before. Regardless of the device you are currently reading this newsletter on, you have a direct line of communication to just about anyone in your life (whether you know them intimately or not). Through texts, emails, video calls, social media feeds, and live video content, there are endless opportunities for connection with anyone across the globe.
Given that we have unlimited channels for interpersonal engagement, it is both ironic and unsettling that our society feels so alone. Today, in the US, ~58% of adults are considered lonely (Cigna). It may seem like the lingering effect of forced isolation during COVID-19 lockdowns, but a pre-pandemic study showed that 61% of adults already reported being lonely in 2019 (Cigna).
The persistent availability of social outlets should theoretically mean instant connectivity, but social media has shown us that the opposite is true. People turn to social platforms when they feel isolated, yet these platforms generally deepen their sense of loneliness (Kaiser Permanente). One-to-many forms of online interaction often foster unhealthy social comparisons because the audience is looking at curated snapshots of another person’s life. Even in one-to-one relationships, the preference to type a message rather than talk to someone can dehumanize the conversation. In this sense, immediate access can come at the cost of depth and value. There is a widespread sense of loneliness that technology has not solved and, in many cases, has made worse.
With that in mind, it has been interesting to see conversational chatbots emerge as one of the most anticipated use cases of artificial intelligence, specifically generative AI. Companies such as Soul Machines, Inworld, Replika, Character.ai, ConvAI, Kajiwoto, and Carter are creating solutions that enable dynamic conversations with digital characters.
There are undoubtedly some compelling applications of this technology. Consider the ability to interact with your favorite non-player characters (NPCs) in a massively multiplayer online role-playing game (MMORPG) to unlock novel quests, or to build entire game loops around conversations with NPCs in order to progress to the next level. Video games are already an incredibly immersive medium of entertainment, and conversational NPCs allow players to feel even more invested in the story and IP. As gamers ourselves, we are excited for this new reality.
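To make the idea of conversation-driven progression concrete, here is a minimal sketch in Python of an NPC whose quest unlocks only once the player's conversation has covered the right topics. Everything in it is hypothetical for illustration: the `QuestGateNPC` class, the keyword-based topic check, and the `generate_reply` stub (which stands in for whatever dialogue model a studio might actually use). It does not reflect any of the named companies' products or APIs.

```python
from dataclasses import dataclass, field


@dataclass
class QuestGateNPC:
    """Illustrative NPC that gates a quest behind a conversation (hypothetical design)."""
    name: str
    backstory: str
    required_topics: set = field(default_factory=set)  # topics the player must raise
    covered_topics: set = field(default_factory=set)   # topics raised so far
    memory: list = field(default_factory=list)         # running transcript, i.e. the NPC's "memory"

    def generate_reply(self, player_line: str) -> str:
        # Placeholder: a real game would call a dialogue model here,
        # conditioned on self.backstory and self.memory.
        return f"{self.name} considers your words: '{player_line}'"

    def quest_unlocked(self) -> bool:
        return self.covered_topics >= self.required_topics

    def talk(self, player_line: str) -> str:
        self.memory.append(("player", player_line))
        # Naive topic tracking: mark any required topic mentioned by the player.
        for topic in self.required_topics:
            if topic in player_line.lower():
                self.covered_topics.add(topic)
        reply = self.generate_reply(player_line)
        if self.quest_unlocked():
            reply += " 'Very well, I will tell you where the relic is hidden.'"
        self.memory.append((self.name, reply))
        return reply


# Usage: the quest only opens after the conversation touches both required topics.
npc = QuestGateNPC(
    name="Maera",
    backstory="A retired cartographer who lost a relic in the northern ruins.",
    required_topics={"relic", "ruins"},
)
print(npc.talk("Tell me about the relic you lost."))
print(npc.talk("Were the northern ruins where it disappeared?"))
print("Quest unlocked:", npc.quest_unlocked())
```

The point of the sketch is simply that progression becomes a property of the dialogue itself rather than of a fixed menu of options, which is what makes these characters feel like genuine conversational partners.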
However, innovation in this space also brings up an important societal question: can these highly realistic NPCs and digital companions make people feel less lonely? After all, we are talking about complex virtual beings that have their own backstory, memory, and personality. The line between human and machine will continue to blur.
In a world where communication is increasingly impersonal, the combination of complexity and constant availability of these AI NPCs could provide a sense of genuine connection that seems to be lacking for so many people today. That being said, we believe that there are precarious consequences of this new reality that could be detrimental to the way we interact with the world.
There are four primary dangers that we believe should be monitored as human-computer interactions become more integrated into our lives:
#1 - Misguided Perception of Reality: Over time, more realistic NPCs and digital companions will make it easier for people to suspend their disbelief and feel as though they are talking to a real human. This enables more meaningful interactions and a genuine sense of companionship. However, we posit that these deeper relationships with NPCs and digital companions ultimately create a greater sense of isolation once you realize that the other end of your conversation is not an equal replacement for a human counterpart. The result is an almost inevitable disappointment that only grows stronger with time.
#2 - Confirmation Bias: Unlike humans, artificially intelligent beings can be entirely programmable and therefore more predictable. This opens up the opportunity to create digital personas shaped into exactly what a person is looking for. On the surface this does not necessarily sound like a problem, but it creates a precarious power imbalance where the other participant in the conversation is essentially a utility. Confirmation bias can be cultivated in relationships between real people; with AI-powered chatbots, however, it can be engineered directly. This establishes unhealthy social dynamics that could ultimately affect the way people interact with each other.
#3 - Bad Behavior: When a user does register that they are interacting with a machine, they are more likely to act in ways that would not be acceptable in a peer-to-peer situation, including more aggressive behavior such as bullying and explicit language. We have already seen how the abstraction of people in online games has fostered extreme cases of toxicity, and this new reality seems liable to push that behavior even further.
#4 - Distraction From the Real World: As mentioned above, social media has already shown us how easy it is to fabricate a digital persona. Time invested in online relationships with NPCs is naturally time taken away from relationships in the real world. Combined with the concerns above, this can leave people less able to socialize effectively in person as their habits increasingly shift toward tendencies that are not compatible with real social experiences.
Takeaway: While AI NPCs will likely provide a more immersive experience for players and may offer a temporary sense of companionship, there are a number of unhealthy social consequences that we believe should be carefully monitored. In the long term, these could actually compound the sense of loneliness that continues to be a problem in our modern society.