Inequalities, Algorithms, and Data

By Jorge Rojas-Alvarez

The production of inequality and technology's influence on it were active topics at the most recent annual meeting of the Society for Social Studies of Science (4S) in New Orleans, LA. Several panels showed how communities and minorities subjected to inequality have found in technology greater obstacles to having their voices recognized, to participating in the improvement of services that address their problems, and to being included in forms of economy that would allow them to grow and strengthen. In these problems, the intensive use of algorithms and data in information technologies plays a role worth examining in more detail.

This article discusses three presentations that addressed these issues, given by researchers Karina Rider (Queen's University), Jingyi Gu (University of Illinois at Urbana-Champaign), and Janine Slaker (Michigan State University). Respectively, the presentations examined data activism in Civic Tech volunteering in the San Francisco Bay Area, automated content moderation of LGBTQIA communities' videos on YouTube, and a textual analysis of academic discourse on algorithms and their relationship with race. In these presentations, inequality based on algorithms and data manifested itself in three ways: first, an underestimation of the structural factors of inequality affecting socioeconomically vulnerable communities; second, the exclusion of communities from new forms of Internet-based economy; and third, the symbolic inequality generated by imaginaries that claim algorithms are neutral.

Data activism and inequality apps

Karina Rider's presentation examined the Civic Tech volunteering program in the San Francisco Bay Area[1]. In this program, volunteer developers from technology companies in the area design applications that seek to improve government services or meet the data needs of communities in the Bay. Examples of these applications include websites that map frequent locations of bicycle accidents, tools that track the use of resources in local political campaigns, and mobile applications that help homeless people locate shelters.

Rider framed inequality as the absence of informed reflection on the structural factors that deepen disparity in the populations Civic Tech intends to serve. For example, when a mobile app assumes that finding suitable shelters for the homeless is merely a matter of distributing resources efficiently, it evades and renders invisible the precariousness of the resources dedicated to this population. Civic Tech thus remains unaware of the complexity of the situations the apps try to influence and automates the inequity already produced by the growing limitations of government programs serving these populations. The purpose of the intervention becomes creating an app, without knowing what happens next or what effect it has on the situation.

Although Civic Tech is well received by volunteers, the program represents an escape from their dissatisfaction with their jobs and with the lack of social impact of their products. According to Rider, Civic Tech recreates for volunteers an idealized workplace without questioning the dreams offered by large companies within a hypercompetitive digital economy. Moreover, for some volunteers, Civic Tech represents a contingency or fallback in a precarious labor market.

LGBTQIA communities in Internet economies

In the relations between LGBTQIA communities and new Internet-based economies, inequality was addressed as content moderation algorithms impeding these communities from monetizing their videos on streaming platforms such as YouTube. Monetization on these platforms works by paying content producers commissions for the advertising users watch when they play published content. However, researcher Jingyi Gu showed how LGBTQIA communities have publicly denounced that many videos uploaded to the platform are excluded from receiving advertising payments[2]. The platform argues that such exclusion occurs when the moderation algorithm finds content that may be inappropriate for most audiences. But through a digital ethnography of influencers' social networks and interviews with users, Gu demonstrated that this explanation falls short: the use of keywords such as transgender when publishing videos is often punished and the content demonetized. Although these videos are not removed from the platform, they cannot earn advertising commissions. Additionally, some users ask why, on content that remains monetizable, the algorithms place anti-gay advertising without any moderation. Inequality is reinforced by the lack of response from those in charge of moderation and by the absence of any clear explanation for the demonetizations and moderation failures. Consequently, the moderation algorithm performs actions that are supposed to maintain the participation and commitment of the user community, yet those same actions exclude communities such as LGBTQIA.

In the case of YouTube, one effect of inequality is the economic disincentive to producing content both by and for LGBTQIA communities. This discouragement reduces the agency and presence of these communities in the virtual public sphere because it limits their voices and the strength and reach of their content. However, Gu noted that these communities have found ways to triangulate across multiple content publishing platforms and to build communities of practice that discover and respond to the actions of the moderation algorithms. Through these strategies, the communities subvert the algorithmic bias they have encountered so far.

Purified algorithms and the politicization of race

Another kind of inequality was presented through the use of terms associated with race and their representation in technologies. An analysis of how words such as algorithm and race are used in contexts of technological discussion reveals the meanings assigned to each term. Through textual analysis, scholar Janine Slaker demonstrated how very different visions of the two words are presented at academic information technology events and how they construct an imaginary for their use[3]. For example, the term algorithm is associated with words such as achieve, complex, fair, learning, potential, protection, and value. The term race, on the other hand, is associated with words such as arrest, black, discrimination, face, group, non-white, policing, and prisoner. The term algorithm is thus understood as racially neutral, innocent, and well-intentioned: the algorithm merely follows processes, consumes data, and takes the variables needed for fair decision making. The imaginary provides a space of meanings within which the term can move; the algorithm appears as an object external to society that transcends racial barriers and hierarchical structures and purifies the actions of a technology.
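To make the kind of word-association analysis described above more concrete, the sketch below counts which words appear near a target term across a small corpus. It is purely illustrative: the toy abstracts, the five-word window, and the helper function co_occurrences are assumptions made for this example, not Slaker's actual method or data.

```python
# Illustrative sketch of a word co-occurrence count; corpus and parameters are hypothetical.
from collections import Counter
import re

def co_occurrences(documents, target, window=5):
    """Count words appearing within `window` tokens of `target` across documents."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        for i, tok in enumerate(tokens):
            if tok == target:
                # Collect neighboring tokens on both sides of the target word.
                neighbors = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
                counts.update(neighbors)
    return counts

# Hypothetical usage with a toy corpus of presentation abstracts:
abstracts = [
    "the algorithm can achieve fair and complex learning with potential value",
    "race and policing data show discrimination against non-white groups",
]
print(co_occurrences(abstracts, "algorithm").most_common(5))
print(co_occurrences(abstracts, "race").most_common(5))
```

Comparing the neighbor counts for the two target terms is one simple way to surface the contrasting vocabularies that the analysis describes.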

Although the spaces of meanings that constitute the terms are shared and often overlap, in the relationship between algorithm and race the term algorithm restricts, obscures, and limits that of race. Slaker calls this relationship a sterilization or purification of meanings. For instance, while the algorithm is neutral, race cannot be neutral; if the algorithm is apolitical, race is political. The algorithm is constituted as an abstract object, devoid of moral values and clean of the issues of racial relations. An algorithm that ignores the inequalities and complexities beyond its reach therefore becomes part of the social instruments and strategies that deepen them. The very concept of the algorithm is not neutral, and inequalities are alleviated only through reflection and purposeful inclusion that address the factors generating them.

The cases presented above reinforce concerns about the effects of algorithms and the intensive use of data as deepeners of inequity. They show how the lack of a comprehensive understanding of the structural factors that produce inequality can be imprinted in algorithms, as the case of the Civic Tech program suggested. The other extreme is also concerning: the example of YouTube's content moderation demonstrates that algorithms can be built with actions that discourage the participation of communities such as LGBTQIA. Finally, the imaginary of algorithms as pure procedures, isolated from the political complexity of society, also contributes to blind spots in technological design.


[1] Karina Rider, “Digital Justice in the Bay: Material Politics After the Internet,” in Annual Meeting of the Society for Social Studies of Science (Innovations, Interruptions, Regenerations, New Orleans, LA, 2019), https://www.4s2019.org/.

[2] Jingyi Gu, “Confused and Furious: Encountering Algorithmic Bias Against LGBTQIA Content on YouTube,” in Annual Meeting of the Society for Social Studies of Science (Innovations, Interruptions, Regenerations, New Orleans, LA, 2019), https://www.4s2019.org/.

[3] Janine Slaker, “Data In-Furs: A Narrative Critique of Data-Driven Solutions for Countering Social Marginalization,” in Annual Meeting of the Society for Social Studies of Science (Innovations, Interruptions, Regenerations, New Orleans, LA, 2019), https://www.4s2019.org/.