Help Before We Know We Need It?
ManMade // Pete Trainor
It’s been a week since ManMade | The Conference at Himley Hall near Dudley, and there’s been a lot to think about. I was fortunate enough to spend time with the other speakers in the lead-up to the conference, and each one of them, along with all the organisers, is inspirational, brave and fighting this epidemic in their own way. All of them are helping to make it easier for men to find help without shame or stigma. It feels like we’re just at the tip of the iceberg on this one too.
If over a third of local authorities do not collect information about suicide, do not have a suicide prevention action plan, and do not have a multi-agency suicide prevention group, then the issue is going to need to be tackled outside the system.
As I explained on Monday 13th June to all the delegates, I’m fighting the problem with the gifts that I’ve been given — technology, design, psychology and a passion to disrupt traditional problems with unconventional methods. I’m also a man, so I know full well how tough it can be to carry on in this ever-changing world. I’m a fairly typical bloke about a lot of things and have had many a “crisis of masculinity” in which I failed to seek help even when catastrophic events hit my life. It could have led to tragic consequences for my family, and I’m thankful it didn’t. I’m one of the lucky ones. I’ve worked in technology for 20 years now and have been fortunate enough to work all over the world, with the biggest tech companies on the planet. When we set the business up several years ago, we vowed to use all the technological know-how we’d accumulated, and all the emerging technology we’re exploring with our clients, to actually do some good for society. We could build you a behavioural analytics platform that tracks browsing behaviour to sell people better products, or we could use exactly the same technology and build a platform that helps people with more human issues.
The rising suicide rates among men should be treated as a national public health issue on a par with smoking, obesity or pollution and yet the government does so little to support the amazing organisations tackling the problem. So I figured — maybe we could have a crack at helping there.
A large portion of men never talk to anyone about their problems, variously because they feel ashamed, do not want to discuss feelings or simply don’t “want to make a fuss”. What if we could give these men something they are happy to talk about? There’s a generation of men whose adult lives have been marked by major social changes affecting the workplace and family. They’re in pain.
They don’t have a way of offloading all that stress and inner turmoil that speaking so often releases. Maybe technology can help them to talk? In the technology world we recently hit an inflection point that’s going to give us a huge opportunity to do what I always dreamed we could do — help vulnerable people, even before they know they need help. Artificial intelligence, which has always been the subject of science fiction, is now mature enough to handle some of the most complex challenges. Perhaps even the kinds of human challenges that traditionally a trained professional would be relied upon to handle. Now, that statement might sound controversial, but I just want to point out that we have to train a machine to learn in much the same way that a human needs to train to be a fully qualified professional.
So we really can get a machine on a par with, if not smarter than, a human on any given subject. The real challenge comes with empathy — because surely that’s a trait only humans can learn, right? To a degree, yes, but we now have such sophisticated emotion and sentiment analytics software at our disposal that we can generate dialogues between a human and a machine that are so intelligent, and delivered in such an elegant style, that they become almost as good as, if not better than, the real thing. It’s also worth keeping in mind that artificial intelligence doesn’t sleep, it doesn’t eat, it can’t make mistakes or have a bad day, and it can serve tens, hundreds, thousands, even unlimited numbers of people simultaneously.
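To make the idea of sentiment analytics concrete, here is a deliberately minimal, lexicon-based sketch of how software can score the mood of a sentence. Real sentiment-analysis tools use far richer lexicons and machine learning; the word lists and scoring here are invented purely for illustration.

```python
# Minimal lexicon-based sentiment scorer -- an illustrative sketch only,
# far simpler than the commercial sentiment-analytics software described above.
# The word lists are invented examples, not a real lexicon.
POSITIVE = {"good", "great", "happy", "hopeful", "better", "calm"}
NEGATIVE = {"bad", "sad", "lonely", "hopeless", "worse", "anxious"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: -1 is entirely negative, +1 entirely positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    # Neutral if no sentiment-bearing words were found.
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I feel sad and lonely today"))                # -1.0
print(sentiment_score("Things are looking better, I feel hopeful"))  # 1.0
```

A dialogue system could track this score turn by turn and adapt its questions when the trend dips, which is the kind of signal the platforms discussed here rely on.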
The implications, although uncomfortable to some, are profound and game changing for others.
In my talk at ManMade I covered the three types of artificial intelligence: narrow (basic decision support tools), general (services like Siri), and super intelligence (cognitive intelligence), which is already used by a lot of companies to sell us more products, or to monitor our patterns in order to market to us in a more tailored way. All of this is going on while suicide, like a virus we don’t fully understand, is killing men in record numbers. It kills three times as many British men as women, although nothing has ever adequately explained why. While almost all other leading causes of death are slowly being eroded by medical and social progress, deaths caused by suicide are at their highest for decades.
So it seems such an incredible waste not to use these wonderful, powerful technological advancements to tackle the bigger issues, rather than trying to sell more shiny things to people. Let’s just imagine a man who is lonely and introverted, with difficulties expressing his feelings. He’s unhappy because of his impending divorce from his childhood sweetheart. He’s connected to his phone almost constantly, because the void created by his loneliness is filled with hours browsing the internet, reading the news (which usually contributes to the unhappiness!), being on social media and so on.
What if I could give that man a talking operating system with artificial intelligence, designed to adapt and evolve like a human being? He can choose the gender and personality of this operating system, and it will adapt very quickly to the individual just by listening to his voice biometrics and by reading environmental and other data from his smartphone. It can’t judge him, because it’s not programmed to judge — just to support. The artificial intelligence has the ability to learn and grow psychologically, to bond with the man over discussions about past times and life events, and maybe even eventually to talk with him about the factors that make him so lonely and unhappy. It’s Socratic, so it asks a lot of questions, and he’s happy to talk because he knows he’s not being judged. It’s literally ‘artificial’ and ‘intelligent’.
It might sound like science fiction, it might also sound a little sad to some people who think our reliance on technology is already eroding some of our humanity, but I see the opposite — an opportunity to give some vulnerable people back something missing.
I get the impression that we still think the type of men who will die by suicide are the unwell, the disturbed, the unlucky; the ones who stumble at life’s biggest hurdles and are too weak to get back up. But in reality, 75 per cent of people who take their own lives have either never been diagnosed with a mental health problem or not been in touch with mental health services in the previous year, and only five per cent of people who do suffer from depression go on to take their own lives. They’re what society would deem to be normal. But they still need someone to ask them how they are every day. Someone to talk with, someone to analyse their behaviour in intimate detail and provide help, wisdom and guidance, and potentially even to warn a family member, friend or professional if it is felt that the man is a danger to himself.
When asked what counts as emotional support, many men do not describe relationships based primarily on ‘talking about feelings’. What we count as support is ‘being there’, ‘being alongside’, understanding based on personal experience or knowledge of the person, and being reachable if needed. We can do that with technology in so many ways.
Let me start to conclude by telling you about an experiment we recently conducted with 200 male volunteers, who were studied using a chatbot we’d built that asked men questions about how they felt about life. Behind the chatbot was a powerful sentiment analysis tool to track how the men responded. We told half of them the chatbot was being controlled by a person (“like a puppet”), while the other half were told it was computer-controlled (“fully automated”) and there was no human on the other end.
The volunteers who thought they were talking to a computer tended to engage in less “impression management” and displayed emotions like sadness more intensely; they also said they felt less afraid to disclose personal details about themselves than those who talked with the supposedly human-controlled program. Who says the machines aren’t as valuable as people? At Nexus we want to build a future where stretched, professional people are augmented by smart, accessible, beautiful technology. We believe the future of supporting men of all ages, in an ever-changing world, is predictive, reactive, artificially intelligent support, and lies between two important things: anonymity and rapport.
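The shape of that between-groups comparison can be sketched in a few lines. Everything below is hypothetical: the numbers are invented, and response length is used only as a crude stand-in for the disclosure and sentiment measures the real study tracked.

```python
# Hypothetical sketch of the analysis behind the chatbot experiment:
# compare how openly two conditions respond, using response word count
# as a crude proxy for self-disclosure. All numbers are invented.
from statistics import mean

responses = {
    "told_computer": [42, 55, 61, 48, 70],  # words per reply (invented data)
    "told_human":    [20, 31, 18, 27, 25],
}

# Mean response length per condition.
means = {group: mean(counts) for group, counts in responses.items()}
print(means)

# The pattern reported above: people told they were talking to a
# machine disclosed more than people told a human was listening.
print(means["told_computer"] > means["told_human"])  # True
```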
We’re not trying to diminish the role of the professional in this battle against suicide; we’re trying to take up some of the slack. What if this kind of approach, using technology like artificial intelligence, is enough to keep somebody vulnerable talking in the black spot between an alert and a response from a community mental health team or the police? Is that not itself worth exploring? I vote yes.
About Pete Trainor & Nexus

Pete Trainor is a behavioural designer, mental health campaigner, accidental polymath and founder of Nexus, The Human Centred Design Company. He talks all over the world on creative and social technologies and the physiological and psychological effects on their audiences. Pete regularly appears in UK national and international press as an analyst on mental health, digital media, creative industries, emergent technologies, and tech markets. He has a very simple mantra for the business: Don’t do things better, do better things. @petetrainor / @nexushcd / www.nexus.design
ManMade | The Conference was organised by Midlands-based social enterprises Forward for Life and Common Unity. Together, they conceptualised, designed and delivered ManMade, an innovative peer-led support service aimed at reducing male suicide. Initially piloted and recommissioned in the Midlands, the developers of ManMade are looking to establish it as an approach further afield. Terry Rigby – Co-Founder of ManMade @ukManMade // @forwardFORlife // www.manmade.org.uk e. email@example.com t. 07585776800
If you are having thoughts of suicide or are concerned about someone else, please go to The Urbrum Waiting Room. Or contact – Samaritans // Listening service – 24 hours a day, any day – CALL 116 123 (UK, ROI) // EMAIL firstname.lastname@example.org CALM // Suicide prevention support for men (5pm – Midnight) CALL 0800 58 58 58 // SMS (text message) 07537 404717