29 April 2021

The shadow sides of today’s digital business models… and what we can do about it

Despite the broad use of social media, we somehow experience an increasingly unsocial world. The Netflix documentary The Social Dilemma does a great job of helping us understand why. It sheds light on how technology firms have refined the use of machine learning and AI-driven algorithms so impeccably that they respond perfectly to our habits and preferences in the service of their business goals. Unfortunately, this perfection results in an unintended (and in some cases perhaps intended) crossing of the ethical line that separates creating an inspiring user experience from manipulating the user for the sake of reaching business goals.


When it comes to serving markets, today’s AI-driven business models work perfectly. But they produce side effects that do not favor the preservation of social peace and welfare in the long term. The technologists in the documentary make a great attempt to explain what is invisible to most of us: how the AI embedded in most digital platforms has the power to gradually shift our perceptions. The reality is: the news feed I see is not the news feed you see; my top news is not your top news. Whatever I click on every day influences what I get to see first, filtered by the machine. The rest stays in the dark, as if it did not exist. Only a few internet services are left, such as Wikipedia, where we all see the same thing. Clearly, getting the machine’s help in filtering the tons of news and feeds for us each day is a great thing. The fact that we do not know how it works and cannot define the filter lens ourselves is the manipulative act with dangerous side effects. It is as if we offered a child, every day, only what he or she most likes to eat, e.g. candy. Intuitively, we know this cannot be a good thing in the long run.

Indeed, constantly echoing our own digital choices back to us has severe behavioral consequences. If we are constantly shown only the news and posts that perfectly conform to our own views (and, on top of that, do not even know it), we end up in a so-called filter bubble. The daily reconfirmation that the whole world seemingly shares the same truths makes us feel that our view is superior to others’. As a result, we fight more aggressively for our views and show less compassion for other views. The societal polarization and division we currently observe in some countries is already taking its toll and might even lead to civil war.
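To make this feedback loop a little more tangible, here is a tiny, purely illustrative sketch in Python of how engagement-based filtering can narrow a feed over time. It is my own toy model under simple assumptions (the topic names, numbers, and click behavior are all invented), not any platform’s actual algorithm.

    import random

    # Five example topics; at the start, every topic is equally likely to be shown.
    TOPICS = ["politics_left", "politics_right", "sports", "science", "celebrity"]
    weights = {topic: 1.0 for topic in TOPICS}

    def pick_feed_item(weights):
        # The "machine filter": topics engaged with before are shown more often.
        topics = list(weights)
        return random.choices(topics, weights=[weights[t] for t in topics])[0]

    def user_clicks(topic):
        # A user with a slight preference for one topic; the loop amplifies it.
        return random.random() < (0.9 if topic == "politics_left" else 0.4)

    for day in range(1000):
        shown = pick_feed_item(weights)
        if user_clicks(shown):
            weights[shown] *= 1.05  # each click feeds back into the ranking

    total = sum(weights.values())
    print({t: f"{100 * w / total:.0f}%" for t, w in weights.items()})
    # After many iterations, one topic dominates what the feed shows.

Even a mild initial preference, amplified by this kind of multiplicative feedback, ends up crowding out everything else: exactly the filter bubble described above.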


This film is an alarming yet insightful wake-up call for all of us. Apart from pointing out that the industry calls its clients “users” (an obvious link to the business of illegal drugs), the documentary abstains from any kind of blaming. In fact, there is nobody to blame, apart from all of us. It is like climate change: together as a society we created these worrisome side effects that nobody wanted or intended. That is why I recommend this film as a must-watch for all of us: users, techies, parents, managers, business model designers, entrepreneurs, innovators, politicians, etc. We all need to better understand the mostly unintended shadow sides of today’s free-subscription business models based on usage-data analytics, and how they might impact our society negatively in the long run. Awareness is the first stepping stone to tackling these effects effectively.


A call for responsible digital business models. As much as I value wake-up calls, it is important to complement them with ideas about how to address the issues they raise. So allow me to briefly brainstorm, from my very naïve user perspective, what a world with more responsible digital business models might look like: Maybe I would experience social media platforms where it is up to me to decide which algorithm lens I want to apply to filter my feeds and news? Maybe even one offering an explicit algorithm called “counterintuitive” that presents me with opinions very different from my own? Maybe I would have the choice of where and how much of my usage data gets stored? Maybe subscription providers would alert me proactively, and early enough, to preserve my right to cancel in time?
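Just to make the first two of these ideas a little more concrete, here is a minimal toy sketch (my own illustration with invented names and scores, not a real product or API) of a feed ranker where the user picks the lens, including a “counterintuitive” one that deliberately surfaces opinions far from their own.

    def rank_feed(items, user_interest, lens="similar"):
        # Rank feed items by how closely their topic score matches the user's
        # interest profile; the "counterintuitive" lens deliberately reverses this.
        def closeness(item):
            return -abs(item["topic_score"] - user_interest)
        if lens == "similar":
            return sorted(items, key=closeness, reverse=True)  # most like me first
        if lens == "counterintuitive":
            return sorted(items, key=closeness)                # least like me first
        raise ValueError(f"unknown lens: {lens}")

    feed = [
        {"title": "Opinion A", "topic_score": 0.9},
        {"title": "Opinion B", "topic_score": 0.5},
        {"title": "Opinion C", "topic_score": 0.1},
    ]

    # user_interest = 0.9 stands for a user who usually engages with Opinion-A-style content.
    print([i["title"] for i in rank_feed(feed, user_interest=0.9, lens="similar")])
    print([i["title"] for i in rank_feed(feed, user_interest=0.9, lens="counterintuitive")])

The point of the sketch is not the ranking itself but who chooses it: the lens is an explicit parameter the user sets, rather than an invisible default set by the platform.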


It starts with all of us: our daily digital micro-decisions can make a difference. Melinda Jacobs from the UX community did a great job of summarizing a charter that designers can follow to embed more values and ethics into digital user design. She advocates taking each design micro-decision consciously, such as: do I put the cross that closes the advert pop-up in the usual place on the right-hand side, do I put it somewhere difficult to find, or do I leave it out entirely so that users are forced to sit through the ad? She reminds us that with each of these seemingly little micro-decisions, companies and designers “vote” each day for the type of digital future we will live in. I believe the same applies to us, the users. With each click on a video or a comment, and with each use of a certain web service or search engine, we “vote” for certain digital norms, or against them. Let’s take this voting right seriously. Our future and the future of the next generation depend on it.


Dr. Eva Bilhuber