
The Power of Artifacts

Emma Jung - Author (emma.jung@osqledaren.se)

Emma Jung - Illustrator (emma.jung@osqledaren.se)

When developing new technology, it’s often natural for us to consider its purpose and who it’s intended for. However, we seldom acknowledge against whom we discriminate through our own limited perspective. Join me on a journey from discriminatory algorithms to racist soap dispensers, and to why we need diverse development teams.

In recent months, AI products such as ChatGPT by OpenAI have gained popularity, and with them the discussion around discrimination through digital technologies has intensified. “AI algorithms […] find patterns within datasets that reflect our own implicit biases and, in so doing, emphasize and reinforce these biases as global truth.” (Howard and Borenstein 2018, p. 1524). [1]

Algorithms require data to make their decisions. This is the crux of the problem: both the quality and the composition of that data shape the assumptions the algorithm makes. If datasets do not represent the diverse group of people about whom a decision is to be made, there is a risk of discrimination. [2] Researcher Caroline Criado-Perez points out that datasets are predominantly made up of data from men and thus exclude "half the population". [3] In addition, a study by US researchers Joy Buolamwini and Timnit Gebru found that large technology companies trained their facial recognition algorithms on images of predominantly European-looking men.

As a result, black women were either not recognized at all by the software or were wrongly classified as male. [4] This is due to the uniform nature of the data, which excludes many of the features the algorithm relies on for its decisions. [5]
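To make this mechanism concrete, here is a minimal, hypothetical sketch, not taken from any of the studies cited above: two synthetic groups follow different patterns, and because group B makes up only a small fraction of the training data, a simple model learns group A's pattern and misclassifies group B far more often. The group names, the data-generating rules, and the 95/5 split are purely illustrative assumptions; the sketch uses Python with NumPy and scikit-learn.

```python
# Hypothetical illustration of dataset bias: groups, rules, and proportions are
# made up for this sketch and do not come from the studies cited in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)

def make_group(n, informative_feature):
    """Synthetic data: each group's label depends on a different feature."""
    X = rng.normal(size=(n, 2))
    y = (X[:, informative_feature] > 0).astype(int)
    return X, y

# Training data: 95% from group A, only 5% from group B (hypothetical split).
X_a, y_a = make_group(1900, informative_feature=0)
X_b, y_b = make_group(100, informative_feature=1)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_a, X_b]), np.concatenate([y_a, y_b])
)

# Evaluation: equally sized, previously unseen samples from both groups.
X_a_test, y_a_test = make_group(1000, informative_feature=0)
X_b_test, y_b_test = make_group(1000, informative_feature=1)
print("Accuracy for group A:", round(model.score(X_a_test, y_a_test), 2))  # typically well above 0.9
print("Accuracy for group B:", round(model.score(X_b_test, y_b_test), 2))  # typically close to 0.5, i.e. chance
```

The exact numbers do not matter; the point is that a single overall accuracy score would hide this gap, and it only becomes visible when the model is evaluated per group, as Buolamwini and Gebru do. [4]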

Even though the current public debate about discrimination through technology focuses heavily on AI algorithms, it should not be forgotten that no form of technology is fundamentally neutral, rational, and prejudice-free. So it is not only future computer scientists, but also developers and engineers, who should consider this aspect during development.

But what power can objects have at all?

The US philosopher of technology and professor Langdon Winner has classified the ways in which artifacts can hold and exercise power. [6]

Here, an artifact is to be understood as a man-made object, whether material or immaterial.

So it is not only humans who can act and exercise power, but also the objects we develop. Actors can be “human or non-human” (Latour 1996) [12], and they can discriminate, intentionally or unintentionally, just as their developers can. The realization that “autonomous technical systems [...] manifestly make decisions and perform actions in ways that were previously the exclusive responsibility of human actors” (Weyer and Fink 2011, p. 40) [13] should definitely be taken into account in the development process.

However, it is important not to project responsibility onto the objects or even the disadvantaged groups. As Winner already wrote in 1980 in one of his most famous publications "Do Artifacts Have Politics?": “We all know that people have politics, not things. [...] Blaming the hardware appears even more foolish than blaming the victims when it comes to judging conditions of public life.” [14]

So what can we do?

One way to achieve less discrimination through technology is to provide comprehensive social education for technology developers. Whether at school, in a company, or at university, prospective developers should be informed and educated about existing power structures in order to be able to question and break through them. “The real goal of building capacity for racial literacy in tech is to imagine a different world, one where we can break free from old patterns” (Daniels et al. 2019, p. 2). [15]

Further, there is a need to revise and rethink the development process. The weakest spot in the standard development process for new technologies is the "I-Methodology" (Akrich 1995) [16]: developers see themselves as users of the technology and thus unconsciously assume that future users will be like themselves. Their perspective on the world and the future is limited by their own horizon, and so is their development work. A homogeneous development team does not intentionally discriminate or act maliciously, but the problem of discrimination creeps unnoticed and unwanted into the end result of the development process and makes it biased.

Participatory technology development can compensate for this weakness through a "We-Methodology": it involves as many relevant social groups as possible in the development process in order to identify weaknesses. To return to the example from the beginning: in a diverse team with participatory development, a racist soap dispenser could never have been created and brought to market in this form. [17]

So when developing, we should not only think about what people will do with the objects we develop (e.g. weapons) or for whom we plan them (e.g. companies), but also consider against whom we discriminate through our own limited horizon.

It is in our hands to develop technologies in such a way that in the future, articles like this one will hopefully report on the past and not the present.

[1] Howard, Ayanna/Borenstein, Jason (2018). The Ugly Truth About Ourselves and Our Robot Creations: The Problem of Bias and Social Inequity. In: Science and Engineering Ethics 24 (5), p. 1524.
[2] Schelenz, Laura (2022). Rassismus und Diskriminierung durch Algorithmen. Available online: https://rise-jugendkultur.de/artikel/rassismus-und-diskriminierung-durch-algorithmen/
[3] Criado-Perez, Caroline (2020). Unsichtbare Frauen. Wie eine von Daten beherrschte Welt die Hälfte der Bevölkerung ignoriert. Munich: btb Verlag.
[4] Buolamwini, Joy/Gebru, Timnit (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. In: Proceedings of Machine Learning Research 81, pp. 1–15.
[5] Noble, Safiya Umoja (2018). Algorithms of Oppression. How Search Engines Reinforce Racism. New York: New York University Press, p. 70.
[6] Winner, Langdon (1980). Do Artifacts Have Politics? In: Daedalus, Vol. 109, No. 1, Technology: Problem or Opportunity?, pp. 121–136.
[7] Nyberg, M./Isberg, C. (2015). Här är det förbjudet att ligga ner. In: SVT Nyheter. Available online: https://www.svt.se/nyheter/lokalt/norrbotten/nya-bankar-i-stationshallen-kritiseras
[8] Schwarz, Carolina (2020). „Die Datenlücke tötet Frauen“: Der männliche Körper gilt als Norm für die Wissenschaft. Das benachteiligt Frauen vielfach. In der Coronapandemie schadet es aber auch Männern. In: Tageszeitung (taz). Available online: https://taz.de/Gender-und-Wissenschaft/!5685021/
[9] Wolf, Thembi (2019). Sexistische Crash-Test-Dummies: Warum Frauen in Unfällen öfter sterben. In: Spiegel. Available online: https://www.spiegel.de/panorama/autosicherheit-immer-nur-maennliche-crash-test-dummies-gefaehrden-frauen-a-76b3034e-31bf-4788-bbda-330658e73b1a
[10] Pavey, Harriet (2017). Automatic soap dispenser sparks 'racism' outrage after footage shows it doesn't work for dark-skinned people. In: Evening Standard. Available online: https://www.standard.co.uk/news/world/automatic-soap-dispenser-sparks-racism-outrage-after-footage-shows-it-doesn-t-work-for-darkskinned-people-a3615096.html
[11] The icons have been designed using graphics from Flaticon.com.
[12] Latour, Bruno (1996). On Actor-Network Theory: A Few Clarifications. In: Soziale Welt, Vol. 47, No. 4, pp. 369–381.
[13] Weyer, Johannes/Fink, Robin (2011). Die Interaktion von Mensch und autonomer Technik in soziologischer Perspektive. In: TATuP - Zeitschrift für Technikfolgenabschätzung in Theorie und Praxis, Vol. 20, pp. 39–45 (p. 40).
[14] Winner, Langdon (1980). Do Artifacts Have Politics? In: Daedalus, Vol. 109, No. 1, Technology: Problem or Opportunity?, p. 122.
[15] Daniels et al. (2019). Advancing racial literacy in tech. New York: Data & Society Research Institute, p. 2.
[16] Akrich, M. (1995). User Representations: Practices, Methods and Sociology. In: Rip, A./Misa, T. J./Schot, J. (eds.), Managing Technology in Society: The Approach of Constructive Technology Assessment, pp. 167–184. London: Pinter Publishers.
[17] Weber, Sara (2016). Wenn Algorithmen Vorurteile haben. In: Süddeutsche Zeitung. Available online: https://www.sueddeutsche.de/digital/diskriminierung-wenn-algorithmen-vorurteile-haben-1.2806403-2

Published: 2023-11-02
