Where’s the Money in That?
From the abstract:
Divisiveness appears to be increasing in much of the world, leading to concern about political violence and a decreasing capacity to collaboratively address large-scale societal challenges. In this working paper we aim to articulate an interdisciplinary research and practice area focused around what we call bridging systems: systems which increase mutual understanding and trust across divides, creating space for productive conflict, deliberation, or cooperation. We give examples of bridging systems across three domains: recommender systems on social media, software for conducting civic forums, and human-facilitated group deliberation. We argue that these examples can be more meaningfully understood as processes for attention-allocation (as opposed to “content distribution” or “amplification”), and develop a corresponding framework to explore similarities — and opportunities for bridging — across these seemingly disparate domains. We focus particularly on the potential of bridging-based ranking to bring the benefits of offline bridging into spaces which are already governed by algorithms. Throughout, we suggest research directions that could improve our capacity to incorporate bridging into a world increasingly mediated by algorithms and artificial intelligence.
This is all well and good, but the economic model of the social media platforms is to encourage outrage: outrage drives engagement, and engagement drives profit.
This is why Instagram turned a new father’s baby pictures into a horror show, bombarding his feed with sick and injured infants: the distressing images kept him engaged.
When my son was born last year, friends from all over wanted to share in my joy. So I decided to post a photo of him every day on Instagram.
Within weeks, Instagram began showing images of babies with severe and uncommon health conditions, preying on my new-parent vulnerability to the suffering of children. My baby album was becoming a nightmare machine.
This was not a bug, I have learned. This is how the software driving Instagram, Facebook, TikTok, YouTube and lots of other apps has been designed to work. Their algorithms optimize for eliciting a reaction from us, ignoring the fact that often the shortest path to a click is fear, anger or sadness.
This is not a lack of the appropriate technology; it is a deliberate policy of hurting their users (who are really the product, sold to advertisers) to keep them on the site clicking on ads.
Designing a more humane algorithm is easy. Absent government action, it will never be deployed: doing so would mean leaving money on the table, and the principle of shareholder value forbids that.
The for-profit publicly held corporation is a monster by design.