Can AI help combat loneliness?

George Hoyland
Published in UX Collective
3 min read · Feb 15, 2017


Artificial intelligence (AI), and voice-based assistants in particular, is big news in 2017. Yet while the tech has matured to become part of our everyday lives, Siri, Alexa, Cortana, Google and all their friends are built on one simple principle: commands, not conversation.

Whilst the conditional logic used in these ‘forms’ may give the illusion that the interface is conversing with you, the experience can often feel like a monologue. The truth is, when it comes to interacting with our new intelligent robot friends, make no mistake: you are barking orders at them.
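
To make the distinction concrete, here is a minimal sketch (in Python, with entirely hypothetical intents and canned responses, not any real assistant’s code) of the command-style conditional logic these interfaces are built on. Each utterance is matched to a predefined intent and answered in isolation, with no memory of the previous turn:

```python
def handle_utterance(utterance: str) -> str:
    """Match a single utterance to a canned intent; no context is kept between calls."""
    text = utterance.lower()
    if "weather" in text:
        return "Today will be cloudy with a high of 12°C."
    if "play" in text and "music" in text:
        return "Playing your favourites playlist."
    if "timer" in text:
        return "Timer set for 10 minutes."
    return "Sorry, I didn't understand that."

# Each call stands alone: a follow-up like "Actually, make it 20 minutes"
# falls straight through to the fallback, because nothing from the previous
# turn is remembered. A command interface, not a conversation.
print(handle_utterance("What's the weather like?"))
print(handle_utterance("Actually, make it 20 minutes"))
```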

But what if this weren’t the case? Should we look to 2017 to bring contextually aware interfaces that can understand and converse rather than simply respond?

What would this mean for us as users? The possibilities are endless and, without proper consideration, can be scary. But with the right approach, could AI have a role to play in the fight against loneliness?

Loneliness among the elderly is by all accounts an epidemic. In the UK alone, a million people over 65 now experience chronic loneliness, and research shows this number is increasing. Compounding the problem, much is debated about the strain and cost it places on the reportedly vulnerable NHS.

Chronic loneliness is as bad for our health as smoking 15 cigarettes a day and as damaging as obesity and physical inactivity. It is linked with depression, dementia and high blood pressure alongside a number of other conditions.

The benefits of human interaction for someone suffering from loneliness are without question unparalleled, as are the efforts of those working in social care. It would be unethical to think of AI as a replacement for either, but with considered design, could it help ease the anguish between visits?

A listening ear…

AIs have become very good at giving us audio content on demand: they play our music, read our audiobooks, recite our recipes and announce our news. But rarely is there an option for our robotic counterparts to simply listen to what we have to say.

The scenario may seem alien to current lifestyles, and a little hollow, but given that by 2020 the average person is estimated to have more conversations with chatbots than with his or her spouse, is it really that far removed? Even in its current state, an AI with a “listen” function could be a silent party to run thoughts by or recount memories to. Would the knowledge that something is listening (and could one day converse back) spur those suffering to break the silence of their solitude?

Meanwhile, should we be looking to AI to detect when an individual is yearning for interaction with others? Companies such as Empath and Cogito offer technology that can detect and interpret a user’s emotional state by analysing voice inflection alongside the pattern and rhythm of speech. If an interface can identify when a user is feeling lonely, what is to stop it suggesting ways to remedy this: reminding a loved one to get in touch, suggesting a social or individual activity, anything that a troubled mind may not be able to conjure up on its own?
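
As a rough illustration of that idea, and not Empath’s or Cogito’s actual API, the sketch below assumes some voice-analysis service has already produced an estimated emotional state (the field names and thresholds here are made up) and shows how an interface might map that state to gentle, human-centred prompts:

```python
from dataclasses import dataclass

@dataclass
class VoiceAnalysis:
    """Hypothetical output of an emotion-from-voice service."""
    loneliness_score: float   # 0.0 (content) to 1.0 (acutely lonely)
    energy: float             # rough measure of vocal energy / engagement

def suggest_response(analysis: VoiceAnalysis) -> str:
    """Turn a detected emotional state into a gentle nudge towards connection."""
    if analysis.loneliness_score > 0.7:
        return "It's been a while since you spoke with your daughter. Shall I call her?"
    if analysis.loneliness_score > 0.4 and analysis.energy < 0.3:
        return "There's a coffee morning at the community centre tomorrow. Would you like a reminder?"
    return ""  # say nothing; not every moment needs an intervention

print(suggest_response(VoiceAnalysis(loneliness_score=0.8, energy=0.2)))
```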

Consider this…

Much emphasis has been placed on how AI can make our lives simpler, creating a market built on the principles of desire rather than necessity.

The popularity of AI could become reminiscent of the boom in apps, where after an initial period of growth, users simply stopped using them. Should we learn from this trend and look instead to the more ethical problems this pioneering technology can address, ones that benefit a user’s wellbeing as well as stand the test of time?
