In 2014 I completed a project with more than 300 teenagers on their use of social media. I ran a number of sessions over the school year, talking to them about their online experiences. One thing became abundantly clear: nobody is talking to them about the internet. Currently, schools focus narrowly on ‘online safety’, a response to concerns about grooming, vulnerability to predatory adults and cyberbullying. While these issues are undeniably important, worries about online safety in the traditional sense are hugely overplayed and driven by media-fuelled hysteria.
When running the project, I was influenced by Danah Boyd’s book, It’s Complicated: The Social Lives of Networked Teens. A core theme of her work is summarised thus: ‘Teen “addiction” to social media is a new extension of typical human engagement. Their use of social media as their primary site of sociality is most often a byproduct of cultural dynamics that have nothing to do with technology. Teens aren’t addicted to social media; if anything, they’re addicted to each other’.
Boyd suggests that the technology itself is neutral, a view I once subscribed to. I listened to the kids and mediated our conversations while viewing social media through this lens. Three years on, I have come to a disturbing conclusion. The technology is far from neutral. I was looking in the wrong direction, at the wrong aspect of safety.
How do we make decisions?
Throughout Western society, the concept of reasoned, analytical thinking is championed as a core aspect of our moral and democratic ideals. Problematically, our belief in our capacity for reasoned thinking far outweighs our ability. As described by social psychologist Jonathan Haidt, reasoning is not the cause of our judgements. Reason usually happens after a judgement has been made. In reality, ‘the emotional tail wags the rational dog’. Just take a breath and think about this: most of our reasoning takes place after our decision has been made.
Thinking, Fast and Slow, an extraordinary book by Daniel Kahneman (cognitive scientist and Nobel laureate), presents decades of research evidence revealing our cognitive shortcomings across dozens of contexts. System 1 is ‘the hero of the book’, the main protagonist. Kahneman describes it as a ‘mechanism that offers effortlessly originating impressions and feelings that are the main sources of explicit beliefs’. It operates ‘automatically and quickly’ with ‘no voluntary control’.
System 2, in contrast, swings into action only with targeted, conscious effort and is able to complete ‘complex computation’. The difficulty is that, most of the time, even when we feel we are engaging in reasoned thought (System 2), our intuitions (System 1) tend to subvert it. It takes significant effort to engage System 2, override System 1 and ensure ‘the rational dog wags the emotional tail’. When we’re busy, distracted, or hold a strong core belief, it is almost impossible to avoid following our intuition, regardless of the evidence presented to us.
This isn’t comfortable; it simply is. We can’t help it. An Oxford don is as prone to these cognitive errors as an 18th-century chimney sweep. As summarised by Haidt, ‘each individual reasoner is really good at one thing, finding evidence to support the position he already holds, usually for intuitive reasons’.
Why does this matter?
It matters because those in the know are using knowledge of the above to get us hooked on technology. Much of this technology directly targets young people. Take Ramsay Brown, for example. He is a computer programmer who studied neuroscience before co-founding ‘Dopamine Labs’, an app design company named after dopamine, the brain chemical that aids in the creation of desire and pleasure. He designs computer algorithms that give rewards at just the right time to keep you coming back. These rewards have no intrinsic value, but make you want more and keep you online.
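To make this concrete, here is a minimal sketch of the kind of variable-ratio reward schedule such designers describe: rewards arrive unpredictably, so every check of the app feels like it might pay off. This is purely illustrative; the class name and probability value are hypothetical inventions of mine, not Dopamine Labs’ actual code.

```python
import random

class VariableRewardScheduler:
    """Illustrative sketch of a variable-ratio reward schedule.
    Hypothetical; not any company's actual algorithm."""

    def __init__(self, reward_probability=0.3):
        # Each action has a fixed chance of a reward. Because the user
        # cannot predict WHICH action will pay off, every action feels
        # like it might -- the classic intermittent-reinforcement hook.
        self.reward_probability = reward_probability

    def on_user_action(self):
        # e.g. a pull-to-refresh, or simply opening the app
        if random.random() < self.reward_probability:
            return "show_reward"   # new likes, messages, fresh content
        return "show_nothing"      # the 'near miss' that prompts the next check

scheduler = VariableRewardScheduler()
for _ in range(10):
    print(scheduler.on_user_action())
```

It is exactly the schedule a slot machine uses, which is why intermittent rewards are so much more compelling than predictable ones.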
Former Google Product Manager Tristan Harris has been speaking out about the insidious aspects of such technology, arguing that tech companies are incentivised to keep you online for as long as possible because that’s how they make their money. As Ramsay Brown also states: ‘Advertisers pay for Facebook. You get to use it for free because your eyeballs are what’s being sold there’.
Of most concern are recent inventions such as Snapchat ‘streaks’, which show the number of days in a row that you’ve sent a message back and forth with someone. In a TED talk, Harris states that many kids feel so compelled to keep a streak going that they give their password to other kids to maintain their streaks in their absence. This behaviour reveals a well-established cognitive trap that Daniel Kahneman calls ‘loss aversion’: a System 1 phenomenon hard-wired into our brains, acting at an unconscious level and ripe for manipulation.
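To see why a streak is such an effective loss-aversion trap, consider a minimal sketch of how such a counter might work (hypothetical, not Snapchat’s actual implementation). The crucial design choice is that a single missed day erases the entire accumulated count, so inactivity is framed as a loss rather than a mere absence of gain.

```python
from datetime import date, timedelta

class Streak:
    """Sketch of a streak counter (hypothetical, not Snapchat's code).
    One missed day wipes out the whole accumulated count, turning
    inactivity into a loss -- the loss-aversion hook."""

    def __init__(self):
        self.count = 0
        self.last_day = None

    def record_exchange(self, day: date):
        # A gap of more than one day destroys weeks of 'investment'.
        if self.last_day is not None and day - self.last_day > timedelta(days=1):
            self.count = 0
        if day != self.last_day:
            self.count += 1
            self.last_day = day

s = Streak()
s.record_exchange(date(2018, 6, 1))
s.record_exchange(date(2018, 6, 2))
s.record_exchange(date(2018, 6, 4))  # a missed day: the streak resets
print(s.count)  # 1
```

Because losses loom larger than equivalent gains, the prospect of a 200-day streak resetting to zero feels far worse than the pleasure of adding day 201, and that asymmetry is what keeps children handing over their passwords.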
If tech companies were unaware of the psychology, these developments could plausibly be seen as an unintended consequence. But knowing what they do and going ahead anyway is at best morally dubious, particularly when deliberately aimed at young people. Harris states: ‘Inadvertently, whether they want to or not, they are shaping the thoughts and feelings and actions of people. They are programming people. There’s always this narrative that technology’s neutral. And it’s up to us to choose how we use it. This is just not true’.
Alongside tech companies vying for attention and advertising revenue, others such as Cambridge Analytica (CA) used data to tap into System 1 and influence political perspectives. In 2018 its (now defunct) website stated: ‘We find your voters and move them to action… CA Political has redefined the relationship between data and campaigns. By knowing your electorate better, you can achieve greater influence while lowering overall costs’. Unashamedly, CA implied that, for a price, voting preferences can be purchased as simply as buying lunch. Chilling. The company officially dissolved in May 2018 following revelations that it had mined more than 50 million Facebook user profiles with the aim of influencing elections. That said, there are serious concerns that CA’s work will continue following a rebrand under the name Emerdata.
It is heartening to find some big movers in the tech arena also starting to raise serious concerns. James Williams, an ex-Google strategist, recently left the company and, together with Tristan Harris, co-founded the advocacy group ‘Time Well Spent’. Its website states that ‘advertising-fuelled technology companies are trapped in a race to get our attention’. Others, such as software developer Justin Rosenstein (co-creator of the Facebook ‘like’ button), have spoken about the negative psychological impact of apps and made a case for regulation of ‘psychologically manipulative advertising’ online. It is not difficult to see why.
At the same time, academics such as Dr Larry Rosen are studying the ‘psychology of technology’. Rosen raises concerns about ‘phantom pocket vibrations’ that ‘induce a sort of obsession or compulsion to constantly check in with our virtual worlds’ and may be contributing to ‘signs and symptoms of psychological disorders’; he coins the phrase ‘iDisorder’ for this phenomenon. Preliminary findings suggest that young people, the iGeneration, may be most susceptible.
In my work as a psychologist, I’ve seen clear indications that when we ‘use’ technology, rather than being ‘used’ by it, there are numerous potential benefits. All those mentioned above, from ex-Google employees to university academics, would agree with this position. However, there are certainly dangers, particularly when users are not self-aware and do not know how these systems work.
The danger
Some would argue that what we are seeing is the rapid, international commodification of human beings, in both consumer and political terms. A few powerful people in a tiny number of tech companies have the capacity to activate our unconscious biases and shape our sense of who we are.
And this is what my project should have focused on; this is what makes young people unsafe online. I was drawn to attention-grabbing, sensational reports about cyberbullying and online grooming, rather than the subtle and deliberate machinations of the tech industry, which take effortful System 2 reasoning to unpack.
Navigating information, using tech devices to suit our needs, and the psychology of decision-making must be embedded in the school curriculum. It is increasingly important for young people to have the skills to understand the impact of tech, deconstruct information presented online and pose questions such as: How is technology affecting my life? How much time online is too much? What is this information saying? What is its source? Who wrote it? What is their agenda? Doing so requires knowledge of the workings of System 1 and how to deliberately engage System 2 and our reasoning faculties. No easy task in a world where partial attention, at best, is the prevailing norm for most people with a smartphone.
We cannot afford to wait. This won’t be a tale of the hare and the tortoise. The battle for attention and influence shows no signs of fatigue and intensifies daily. Education didn’t even hear the starting pistol. Young people need to know how best to use technology to enhance, not ensnare them.
Dr Christopher Bagley is an educational psychologist. His goal is to co-create a psychologically healthy education system for young people.