FAIRDIENSTE: Designing fair data economy approaches

In 2022, an increasing number of digital businesses are building their business models on the collection, use, and further processing of data. Data, and especially so-called big data, holds great potential for innovation and drives the advancement of everyday technologies, knowledge engineering, and the efficiency of economic processes. But behind these advantages lie underlying value conflicts.

Data-driven companies need user data to build capable software and digital services, while users, on the other hand, claim privacy and self-determination – values that are potentially denied when data collection happens without users' awareness and without transparency.

In addition, the ownership and processing of huge amounts of personal data mean a shift in power relations, giving companies the ability to manipulate, nudge, or discriminate against marginalized user groups.

When talking about data economics on a societal level, two growing systems can be observed. On the one hand, there is the rise of hegemonic companies like Google, Amazon, Meta (formerly known as Facebook), and Apple, which dominate Western and US-American culture. One resulting problem is that these companies have become gatekeepers of information, web infrastructure, and e-commerce itself. They have thereby gained lobbying power: their economic power turns into political power, and they have a great impact on users and on democratic opinion-forming. Sometimes they even use this power maliciously, by spreading misinformation or supporting specific political factions.

On the other hand, the Chinese system is best described by the paradigm of (mass) surveillance by the state. Data is collected on a daily basis; in fact, there is no lasting way to avoid its collection, because the Chinese government uses a network of over 200 million so-called China Central Television (CCTV) monitoring cameras, connected as “Skynet”, to watch its citizens. Furthermore, a nationwide scoring system, “The Social Credit System”, has been established with the help of local companies like Tencent, Huawei, and ZTE. To operate the scoring system, facial recognition technology, big data analysis, surveillance drones, and large-scale data collection targeting online social media platforms are used. The system was established to prevent crime, to reward community-centered behavior, and, in general, to strengthen China’s power.

Kai Strittmatter, a German journalist who has studied China for more than 30 years, says that “the Chinese state has amassed an astonishing amount of data about its citizens, which it uses to punish people for even minor deviations from expected norms.” By claiming to create value for Chinese society as a whole, the state asserts that the ownership and use of data belong to it – fitting the Chinese self-conception of community over individual.

Caught between data capitalism and a surveillance state, Europe is struggling to find its role by offering a decentralized and human-centered solution – one that, in the best case, ensures values like data sovereignty, privacy, transparency, informational self-determination, and non-discrimination.

For Europe, deploying its own model – if one may call it that – is difficult, because American tech companies are not tied to geographical boundaries and have spread their operational range across nearly the whole of Western society. The European solution therefore needs to be compelling in order to compete.

The main efforts have recently been made on the judicial level, bringing the Digital Services Act and the Digital Markets Act into play. Recently passed by the European Parliament, the Digital Services Act focuses on the well-being of the user: preventing manipulation, protecting freedom of speech and privacy, and fostering transparency.

These efforts include, for example, prohibiting the use of “dark patterns”, prohibiting companies from using sensitive personal data – such as health data or data on political views and sexual orientation – to build user profiles, and requiring companies to disclose their underlying recommendation algorithms.

The Digital Markets Act, on the other hand, addresses the market power of the big platform giants in order to prevent them from hindering fair competition. These two laws claim to be the first fundamental laws for the internet. But it will take at least until the beginning of 2023 before they fully enter into force, are passed by the European member states, and are implemented.

In addition, Europe must not only restrict existing platforms but also develop a satisfactory counteroffer that guarantees a fair and still compelling data economy.

Steps have already been taken on a small scale, for example by non-profit organizations like MyData.org, which advocate a personal data concept built on digital self-determination and a balanced relationship between businesses and users. Another example is the company Polypoly, which even claims to ‘build the European data economy’: it works on several levels and with several technical products toward a user-centered data economy, while also aiming to foster data usage for the European state and the European economy.

The scientific community likewise recognizes the necessity of investigating data economies – their infrastructures, opportunities, and development potential – and of addressing the research question of how to design a compelling European counteroffer in data economics that takes the interests of individuals, businesses, and society into account.

The Research Center of Information Systems Design of the University of Kassel and the Munich School of Management of the Ludwig-Maximilians-University Munich conduct collaborative research on design methods for alternative business models in the data economy. The project is called ‘Fair digital services: Co-valuation in the design of data-economic business models’ (in German: Faire digitale Dienste: Ko-Valuation in der Gestaltung datenökonomischer Geschäftsmodelle) and is a three-year project funded by the Ministry of Education and Research.

The focus lies on fairness – a term which is to be further defined within the project and which all stakeholders involved are trying to achieve – and on the application of the multidimensional methodology of “co-valuation”. This methodology operates on three levels of research:

  • How can different understandings of value be translated into economic value (money) and become part of a fair share?
  • How can companies use their economic shaping power to channel value conflicts?
  • How can the discussion and negotiation of value conflicts be brought into social media channels and elements in order to foster a culture of fairness among users?

By addressing these questions, the project not only investigates the fair data economy in a multi-dimensional way but from a multi-disciplinary research viewpoint as well.

The project draws on the perspectives of sociologists, who are in charge of project coordination, as well as data engineering scientists, business informatics specialists, and human-computer interaction researchers, all working together in the project.

Finally, two practice partners complement the project consortium: the Burda Forward media group and the Institute for Technology and Journalism. Burda Forward unites many digital media products – such as Focus Online, Chip, and Efahrer.com – in one media company. Burda Forward gives the project members insight into a field of work that has been, and is still being, massively changed by the data economy. This is very useful when it comes to identifying value conflicts and to finding and defining use cases.

The Institute for Technology and Journalism, on the other hand, offers connections to developers in the mobile app industry. Furthermore, it has established its own scoring system that rates mobile applications on privacy criteria.

The work group Gender/Diversity in Informatics Systems (GeDIS), one part of the scientific team and interdisciplinary itself, investigates data economy issues from a human-computer interaction and diversity studies point of view.

Its research focuses on the question of how to design fair technical artifacts and the development processes that create them. One approach is to use participatory design to democratize those development processes.

In the first year of the three-year project, theoretical considerations about fairness were developed, the ecosystem was analyzed, value conflicts were defined, and the cooperation between academia and practice was fostered.

In the recently started second year of the project, these theoretical considerations are being tested with the practice partners, with the aim of developing an integrative process model.

GeDIS’s responsibility is to develop and advance methods that ensure a diverse mapping of values into technological artifacts, especially considering the perspectives of users and marginalized groups.