New study on platform labour in the EU & Project Liberty

I am happy to report that our datafication course is still alive and thriving. This semester, I am again teaching the course at Leuphana University of Lüneburg to a group of highly motivated master's students. So far, we have addressed the theoretical foundations and key constructs of the scholarly debate around datafication.

This week, we talked about algorithmic management and platform labour. Based on the lecture by Armin Beverungen, we reflected on how algorithmic management puts human dignity in jeopardy when humans are reduced to sets of data points and only their input and output matter to the firm or client. In this regard, the newest publication from the Friedrich-Ebert-Stiftung on platform labour in the European context might be of interest to students of this class: it shows that, besides product delivery and driving services, sectors such as care, cleaning and domestic work, and routine office tasks are increasingly organized through platforms.

In previous sessions, we reflected upon the notion of governance in/and datafication and discussed which actors can and should engage in the governance of datafication. A key insight from our lectures so far is that the Internet consists of highly centralized infrastructures that are often privately governed by a few firms. Users therefore have limited control over or ownership of their data within these infrastructures, as well as of the relationships they forge within them. If, for example, a user does not accept the rules of a given platform, they have few options other than to leave it. Yet leaving platforms or other digital infrastructures is extremely “costly”, because it means being cut off from one's social networks of friends, colleagues, and family. Even more problematic, if a platform owner decides to kick a user out, that user involuntarily loses access to their social networks. So the question is: how can users (re-)gain control and ownership of their “social graph”, and how can private governance be overcome?

An intriguing project attempting to address this concern is Project Liberty, an initiative that works on the development of a “Decentralized Social Networking Protocol” (see here for a layman's description of the technology). With this protocol, the initiative essentially aims to “create a new civic architecture for the digital world that returns the ownership and control of personal data to individuals, embeds ethical values into technology, and expands economic opportunities for web users and developers alike”. The project makes the important argument that the Internet needs to be fundamentally restructured in order to allow for the creation of a more equal, fair, and human-centred digital ecosystem in which users are in control of their data. More information on the project can be found here. An interesting side note: the project is funded by Frank McCourt, a man who made his fortune in real estate and who is considered an antagonist of Elon Musk. (Thanks to my student Michael, who mentioned this case in class!)

A new era of (data-based) climate transparency? The Climate Trace Project

Our datafication course addresses two topics that are currently discussed together in the media: transparency and sustainability. Recently, a consortium of non-profits, backed by former US presidential candidate Al Gore, published a database that makes it possible to identify the top global CO2 hotspots. The project, Climate Trace, allows users to trace global hotspots that produce CO2 emissions, and the site also lets users select and compare various polluters. The project builds on the analysis of large amounts of data, machine learning technology, and additional forms of analysis. Representatives of the project suggest that accountability can follow from this transparency, as the project would allow policymakers to make better-informed decisions with regard to regulating emissions. After all, the project indicates that some polluters in the oil and gas sector have much higher CO2 emissions than they claim. The obvious question is: does transparency readily translate into accountability? At the very least, the initiators hope that the project will fuel collaboration between firms, regulators, and other actors to reduce CO2 emissions. Find out more about the project here.

Course update, October 2021

Hi there!

This course is still alive and thriving. This semester, it will be taught at Leuphana University Lüneburg (Germany). Again, I (Hannah) will be teaching the course to an interdisciplinary group of students in the master-level complementary studies (Komplementär), in the module “Connecting science, responsibility, and society”!

For the upcoming semester, the course has received additional material on three aspects of datafication from three outstanding contributors: Ursula Plesner contributes a lecture on datafication in the public sector, Maximilian Heimstädt lectures on accountability and datafication, and Gazi Islam reflects on the ethics of quantification, a phenomenon closely associated with datafication. We thank them all for their valuable contributions!

We also added new animated videos to illustrate two additional core concepts of the course: platformization and algorithmic governance.

Based on student feedback from the previous year, the course material was also slightly rearranged. It now follows this logic: the first lectures establish a broad foundation for datafication, the next lectures focus on datafication in organizations, the following lectures turn to the societal implications of datafication, and the final lectures address ethical and sustainability-related concerns.

Update, June 2021

We are currently updating and extending the content of this open access course!

Recently, we added the first of three animated videos on the core concepts of this course to this site. You can find our short animated introduction to the concept of datafication here: https://dataandorganisations.org/class-1-introduction-to-this-course-and-to-datafication/

We are also working on new material for additional lectures on topics such as the quantified self, openness, and corporate digital responsibility. We will announce the publication of these lectures in due time!

End of project funding, but teaching and blog continue

We have reached the official end of the funding period of our 2020 global classroom project. The goal of the project was to develop and implement a fully virtual seminar entitled “Global Issues in a Global Classroom”, which Leuphana University of Lüneburg conducted in collaboration with Copenhagen Business School. We thank the state of Lower Saxony for funding the project. The project lead sums up the project in a short video published on the open education portal of the state of Lower Saxony. You can find the video here.

Because of the great success with the students, Nanna Bonde Thylstrup and Hannah Trittin-Ulbrich will be teaching this seminar again in autumn 2021. This page will also be continuously updated until then.

Exemplary student response to a course assignment

The course “Challenges and Opportunities of Datafication: Interdisciplinary Perspectives” is designed so that you can customize the content to your needs. Whether you want to use all the material or just individual classes, you will always find a full set of lectures, assignments, and readings organized around eight specific themes. To give you a sense of how students interact with the material, we present here the response of one of our students, Sinziana Thurm, who takes part in our global classroom project. Sinziana is in her third semester of the Sustainability Science (Nachhaltigkeitswissenschaften) Master's programme at Leuphana University Lüneburg. She chose to take this course following her positive experiences last semester in the module “Digital Media and Sustainability”, where she touched upon sustainability in the ICT sector and sustainable, energy-efficient software and data centres. For her, “digitalization is an important megatrend inside the ICT sector, which is why I wish to link it to the interdisciplinary perspectives that I am obtaining in my Master's since I believe that not only technologies (hardware) but digitalization itself should be shaped in a sustainable and ethical way in the future.”

Sinziana submitted the following excellent response to the assignment linked to Mikkel Flyverbom's lecture on transparency and the digital prism: “Reflect on the notion of transparency and visibility in relation to your own usage of digital technologies. In which ways do you feel visible? Does this concern you? Why, why not? Note down your reflections in about 500 words.”

Sinziana writes (for the purposes of this weblog, we have slightly shortened her response):

Personal data privacy rights have only recently started to become acknowledged as an intrinsic human right (United Nations, 2020). While privacy is recognized as a universal human right (Article 12, Universal Declaration of Human Rights, 1948), data privacy is not yet recognized as a human right from a legislative point of view (Lathrop-Melting, 2019). Institutions and bodies have the “power” (or, better said in Foucault's words, the “metapower”, which is everywhere; see Foucault, 1998) to take legislative decisions affecting the lives of millions of people. Power thereby pervades the digitalized world, since it is constituted through knowledge, a resource that any data-collecting company possesses, for example.

Nevertheless, such decisions must be taken openly and transparently, since it is one's data that is being collected, used, and processed in many ways. Indeed, “transparency is a core principle of data protection” (European Data Protection Supervisor, 2020). At the same time, Article 14 of Regulation (EC) No 45/2001 (EC, 2001) requires transparent information and communication together with fairness concerning the usage of data. The regulation is based on the principle that “fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes”, while Article 15 suggests that the regulating institutions managing our data must be open and transparent (ibid.).

Thus, we have the right to know about the stakeholders involved in the collection of data and the true motives of data collection. Although measures have been taken in this direction, their effects were rather unsatisfactory, since only internal rather than external transparency measures were developed (Albu & Flyverbom, 2016). Relating the notion of transparency to my own use of digital technologies, I am somewhat concerned (approximately a 6 on a scale from 1 to 10), yet I do not feel that the information that stakeholders, companies, or the government collect about my persona/profile is necessarily used for “dark” purposes rather than for economic gains from advertisements or similar. Nevertheless, I am rather concerned about other purposes for which such information could be used. At the same time, I am not a big fan of social media, nor do I use any, except for LinkedIn. Users, for example, trust social media platforms less with their data than other parties such as doctors (Morey et al., 2015). Nevertheless, I feel that because I am aware of data privacy issues, I am more detached from social and digital media/technologies, and I always try to decrease my usage and the time I spend on any apps, although this is not the solution, since not everybody is aware of this issue and many users continue using digital technologies without taking into consideration the risks this poses.

An ideal solution for improving the transparency of digital technologies would be, for example, integrating transparency as a non-functional requirement into software systems (Spagnuelo et al., 2016), or using blockchain technologies, which provide an open-source and decentralized database for tracking transaction information (Francisco & Swanson, 2018). Furthermore, companies need to promote a fair trade in data that could benefit customers more than the companies themselves (Morey et al., 2015).

Nevertheless, much legislative progress is being made in this direction, especially with the accountability principle and the new transparency requirements under the General Data Protection Regulation (GDPR), whose guidelines on transparency require data controllers to be able to prove that data is processed transparently with regard to the data subject (Regulation, 2016). Finally, I believe that users should inform themselves better about the current situation regarding data privacy before choosing which information to share, especially on social media platforms, and be more aware overall.

Bibliography

Lathrop-Melting, A. (2019, October 11). Human rights horizons: Are data rights human rights? Center On Human Rights Education. https://www.centeronhumanrightseducation.org/data-rights/

United Nations General Assembly. (1948). Universal Declaration of Human Rights.

European Data Protection Supervisor. (2020). European Data Protection Supervisor – European Data Protection Supervisor. https://edps.europa.eu

Foucault, M. (1998). The History of Sexuality: The Will to Knowledge. London: Penguin.

Francisco, K., & Swanson, D. (2018). The supply chain has no clothes: Technology adoption of blockchain for supply chain transparency. Logistics, 2(1), 2.

Morey, T., Forbath, T., & Schoop, A. (2015). Customer data: Designing for transparency and trust. Harvard Business Review, 93(5), 96-105.

United Nations. (2020). Government Policy for the Internet Must Be Rights-Based and User-Centred. https://www.un.org/en/chronicle/article/government-policy-internet-must-be-rights-based-and-user-centred

Regulation (EC) No 45/2001 of the European Parliament and of the Council of 18 December 2000 on the protection of individuals with regard to the processing of personal data by the Community institutions and bodies and on the free movement of such data.

Exposing “invisible” information – Investigation with the help of the Kit

If you are a student or a scholar, or even a journalist or citizen searching for information, the Internet can be a fairly daunting place. Where to start? Is a popular search engine the best place, given that its algorithms expose you to information that is most likely filtered? Or, on the contrary, do you struggle to find any information on a given topic at all?

The Kit is an attempt to help anyone interested in investigating social phenomena with tools and methods that allow for alternative forms of investigation. In its own words, the team behind the Kit views the project as “a starting point for those who believe in the power of information as evidence but who recognise that working with information does not necessarily lead to immediate results or desired changes.” They want to help “people to develop the ability to question information that is false, find information when it is scarce and filter information when it becomes overwhelming”. To this end, the Kit is a great resource for students and scholars interested in learning about the investigation of information and in enhancing their investigation skills! Importantly, the Kit also provides several interesting examples of how to investigate the invisible. Find more information here.

Research talk on the dark sides of AI and how to govern AI responsibly

Yesterday, I (Hannah) had the honour of presenting my research on the dark and unexpected sides of digitalization and artificial intelligence at the virtual workshop “AI in academia and practice: Beyond the hype”, organized by the German association of business professors (‘VHB’), the #diginomics research group at the Universität Bremen, and the European New School of Digital Studies. Thanks to the organizers and to the audience, who posed interesting questions. Here you can find my slides!

Automating Society Report 2020

Recently, AlgorithmWatch and the Bertelsmann Stiftung launched the #AutomatingSociety Report 2020. The report gives an insightful overview of how private companies and public authorities in Europe use automated decision-making and outlines differences and commonalities among European countries with regard to approaches to and applications of automated decision-making.

The report can be found here, and here you can find the video recording of the launch event.

This report is a nice teaching resource if you teach this course to students from different backgrounds, just as we do in our global classroom project between Leuphana University and CBS. You may, for example, want to ask students to read through the section on their home country and let them reflect on whether they have had experiences with these automated systems, how they evaluate them, and what could explain the considerable differences between national approaches to automated decision-making.

Teaching material now online

All teaching material for the course “Challenges and Opportunities of Datafication: Interdisciplinary Perspectives” is now available (see course material). We have included many open access readings, as well as additional material for preparing the lectures and for allowing students to explore further aspects relating to them. The additional material will be updated as we continue to teach the course.