Mood music will be the death of us
Posted on 29th September 2021
Over the years, we have published many blogs that highlight the great benefits that technology has brought to our lives, both personal and business. We have also flagged our concerns about the areas where tech can have a detrimental effect.
It’s not hard to find lots of examples of tech companies interfering in our lives or using their algorithms to, in effect, govern what we can and can’t do. Some are comical, such as the occasion in 2018 when Facebook branded parts of the United States Declaration of Independence ‘hate speech’ and removed them from its pages. More prosaically, a friend of a friend posted a reference to ‘Arabs,’ followed by a rather rude noun. He received a warning and a temporary ban from Facebook, but he was not referring to Arab people, rather Dundee United FC, whose nickname is ‘the Arabs.’
If you think that’s mad (and I’m aware there will be some who don’t), consider this: Apple is now deploying sensors to monitor our moods. It has declared that there are ten emotional states its algorithms can detect. Of these ten, only one is positive, and that’s the one actually called ‘positive.’ All the rest are negative to varying degrees, such as ‘angry,’ ‘anxious,’ ‘confused’ and ‘sad.’ And, believe it or not, one is ‘death.’
Apple is only following where Amazon has already gone. Last year, Jeff Bezos’ mob introduced their ‘Halo’ watch. As well as all the usual fitness-tracking features, it also listens to your voice and, if it thinks you are being too aggressive, tells you to moderate your language or, at least, your tone.
Now you may think this is fine. On paper, facial recognition technology can be thought of as a good thing, allowing the police to capture baddies. However, as we’ve written here many times, there are many dark sides to FRT, and it’s not hard to see how technology that monitors moods could be similarly misused, especially by unscrupulous employers. The huge focus on wellness is, for the most part, to be welcomed (it’s something we do at Be-IT), but what happens when your computer tells HR that you are getting too angry too often and you end up in a disciplinary hearing? The problem, as with the fans of Dundee United, is that nuance and visual and oral cues are missed and the wrong decisions are taken, potentially exacerbating a situation that a good manager would deal with far more efficiently using common sense, something that algorithms tend to lack.
Of course, there will be some advantages to this sort of technology, especially if it helps us spot people who are genuinely at risk (something that in this post-pandemic world is more likely than it was previously). However, getting the balance right is vital, and while I yield to no one in my admiration for the superb techies who come up with this stuff, I also think we need a bigger debate around the pitfalls that come with its use on a wider scale.
Scott Bentley, Be-IT