Making the Invisible Visible: How Forum Theatre Can Reveal the Impact of Social Media Algorithms on Local and Global Justice Issues
Frontlines of Activism
Abstract: What is the connection between the genocide of the Rohingya in Myanmar, the rapid spread of mis- and dis-information, and the rise of mental health and attention difficulties? Whilst seemingly disparate, each of these issues has been exacerbated by the widespread use of engagement-based algorithms on social media platforms. Engagement-based algorithms are designed to capture our attention and keep us engaging with the platform for longer, thereby raising advertising revenues for the companies. These algorithms identify content that is attracting strong engagement and suggest it to other users. Engagement means interaction with the content, such as ‘likes’, shares or comments; it can also simply be the amount of time a user spends looking at content. Alongside the adorable kitten videos, extreme content holds our attention, inducing algorithms to promote this material. There is a lack of awareness amongst the public about how these social media algorithms amplify alarming content, causing polarisation and driving inequality.
To highlight these issues, a theatre group working with Creativity and Change in Munster Technological University (MTU) explored the local and global impact of social media engagement-based algorithms. This took place through a mentorship with Brazilian theatre practitioner Julian Boal. In this article, we explore algorithmic awareness as an important missing aspect of global citizenship education (GCE), before illustrating what we learned from exploring this topic through forum theatre. We conclude by drawing some lessons for GCE on using socially engaged theatre to explore complex, invisible topics. We also draw out the links between forum theatre and Freire’s pedagogy of the oppressed, including forum theatre as a form of praxis, conscientisation and collective action that makes invisible power structures visible.
Key words: Forum Theatre; Algorithmic Injustice; Socially Engaged Theatre; Participatory Arts Practice; Paulo Freire.
Introduction
Algorithms are increasingly important in shaping decisions that impact our lives, from what music or television shows we are recommended, to whether we get a mortgage, insurance, or a job (Birhane, 2021; Gran et al., 2021; O’Neil, 2016). Research suggests that many people are unaware of what an algorithm is, or that social media platforms like Facebook use algorithms to filter their feeds, deciding what we see and read on the platform (Eslami et al., 2015; Gran et al., 2021; Oremis et al., 2021; Smith, 2018). Scholars of critical algorithm studies suggest that this is a digital divide that needs to be addressed to enable people to participate meaningfully in democratic life (Lythreatis et al., 2022). In contexts such as Myanmar, this is likely to be even more important, as Facebook is often essentially the Internet (The United Nations’ Independent International Fact-Finding Mission on Myanmar [IIFMM], 2018): because Facebook offers users free data, it is often the only platform used to access information. A lack of algorithmic awareness means a lack of awareness of engagement-based algorithms, which are designed to keep us ‘engaging’ by capturing our attention. Engagement-based algorithms favour sensationalism, misinformation and extremism, and have been associated with an increase in youth mental health difficulties, attention deficits, the rapid spread of mis- and dis-information and, in the worst instances, genocide (Amnesty International, 2022; Barger et al., 2016; Center for Humane Technology, n.d.; Golbeck, 2020; Jin and You, 2023; Naughton, 2022; IIFMM, 2018). GCE has not yet critically engaged with software as a global justice issue at any widespread level, particularly with how algorithms impact the global South. In this article, we demonstrate how forum theatre, an innovative form of theatre for development, is an effective tool to increase algorithmic awareness as a form of GCE.
This approach supports an embodied form of conscientisation, that is, a process of developing a conscious awareness of algorithmic justice and how to bring about change. First, we introduce the problem of algorithmic justice as an important focus for GCE, and then introduce the process by which we developed a play based on this issue. Finally, we discuss lessons learned from using a socially engaged art approach that might be used by others wishing to explore this issue within GCE.
Algorithmic justice
Algorithmic justice is a concept increasingly used to highlight growing concern about the impact of AI and ‘Big Data’ enabled algorithmic decision-making (Birhane, 2021). We draw on this concept to highlight the need to address the harm associated with social media platforms’ engagement-based algorithms. An engagement-based algorithm is designed to boost platform engagement (i.e., time and interaction) by prioritising content that will provoke a reaction. Unfortunately, high-engagement content is often sensational, divisive and/or a form of misinformation, reportedly contributing to a worsening of youth mental health (Haugen, 2022; Naughton, 2022), racist and anti-immigrant actions (Michael, 2023), violence in countries in the global South, including Ethiopia, India and Myanmar (Haugen, 2022; McIntosh, 2021; Paul, 2021), and interference in democratic processes (Wong, 2019). The more engaged a user is, the more social media companies profit from advertising revenue (Naughton, 2022). Engagement is based on how much attention and interaction each piece of content receives: how often it is shared, liked, commented on and so on. In tandem, the algorithms capture what each person likes and suggest other content that they may like. If a user is depressed, they are likely to be shown content about depression. For example, British teenager Molly Russell died by suicide months after harmful content was pushed her way by Instagram and Pinterest; the coroner stated that social media content had contributed ‘in more than a minimal way’ to Molly’s death (Walker, 2022). Social media whistle-blower Frances Haugen revealed that Facebook was aware that Instagram, for example, was harmful for teenage girls and led to an increase in body image issues (Haugen, 2022).
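For readers unfamiliar with how such systems work mechanically, the logic described above can be sketched in a few lines of code. The following is a purely illustrative toy, not any platform’s actual system; the interaction weights and post data are invented for the example. It shows why content that provokes reaction, even angry reaction, rises to the top of a feed:

```python
# Toy sketch of engagement-based ranking. Weights are invented for
# illustration; real platforms use far more complex, proprietary models.

def engagement_score(post):
    """Weighted sum of interactions: every reaction counts as 'engagement'."""
    return (post["likes"]
            + 2 * post["comments"]
            + 3 * post["shares"]
            + 0.1 * post["seconds_viewed"])

def rank_feed(posts):
    """Surface the highest-scoring posts first, regardless of whether the
    engagement reflects delight or outrage."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "kitten_video", "likes": 120, "comments": 5,
     "shares": 10, "seconds_viewed": 900},
    {"id": "divisive_rant", "likes": 80, "comments": 200,
     "shares": 150, "seconds_viewed": 2400},
]

feed = rank_feed(posts)
# The divisive post outranks the kitten video: its angry comments and
# shares count towards its score just as much as approving ones would.
```

The point of the sketch is that the scoring function is indifferent to the *quality* of the reaction; it optimises only for the quantity of attention, which is precisely the dynamic the critiques above describe.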
In Myanmar, the same algorithmic approach has incited genocide against the Rohingya minority (Amnesty International, 2022; IIFMM, 2018). Rising hate speech against the Rohingya on Facebook was not moderated effectively. To reduce hate speech that may have been unknowingly shared, Facebook and Burmese civil society developed a design feature, a sticker, to alert users to hate speech. When users applied the sticker, however, the Facebook algorithm recognised it as ‘engagement’, and as a result the platform’s machine-learning algorithms actually increased the sharing of tagged hate speech rather than reducing it (Amnesty International, 2022).
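The perverse feedback loop that Amnesty International describes can also be made concrete with a small illustrative sketch. Again, this is an invented toy, not Facebook’s actual code: it simply shows what happens when a warning interaction is fed into the same counter as every other interaction:

```python
# Toy sketch of the failure mode: the warning sticker is just another
# interaction, so applying it increases the post's engagement count.
# All names and numbers are invented for illustration.

def record_interaction(post, kind):
    """Naively count every interaction -- likes, shares, and even the
    hate-speech warning sticker -- towards the post's engagement total."""
    post["engagement"] += 1
    return post

hate_post = {"id": "hate_post", "engagement": 50}

# Thirty concerned users apply the warning sticker...
for _ in range(30):
    record_interaction(hate_post, "sticker")

# ...and the post's engagement rises from 50 to 80, making the ranking
# system MORE likely to amplify it, not less.
```

A system that distinguished interaction types (treating a warning as a signal to demote rather than promote) would avoid this; the sketch illustrates why treating all attention as positive is the root of the problem.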
Facebook whistle-blower Frances Haugen showed that the company spent 87 percent of its misinformation budget on English-language content, whilst reportedly only 9 percent of users are English speakers (Paul, 2021). This means that there are fewer (if any) misinformation-detection features and moderators for the global South, and especially for minority languages, leading to disproportionate harm (Amnesty International, 2022; McIntosh, 2021; Paul, 2021; IIFMM, 2018).
In Ireland, activist Mark Malone wrote an open letter naming social media algorithms as an underlying cause of the rise of anti-migrant racism (Michael, 2023). The Far Right Observatory (FRO) group brought these concerns before the Oireachtas (Irish parliament) committee on Children, Equality, Disability, Integration and Youth in February 2023. In her opening statement to the committee, FRO director Niamh McDonald stated:
“Algorithms drive the content people see - amplifying toxic and manipulative content that fosters engagement via shares, likes, views. The scale and speed of viral content circulating has been instrumental to amplifying protests, and flashpoints, resulting in multiple violent incidents and escalation of vigilante mobs” (McDonald, 2023: 1).
What we see happening in Ireland is part of a larger pattern in which racist groups exploit the infrastructure of online algorithms to disseminate hateful messages about migrants and refugees (Noble, 2018). Ireland is home to many social media platforms, and whistle-blower Frances Haugen has pointed out that Ireland has a unique role to play in accountability (Molony, 2022). She also noted that moderators for contexts such as Myanmar are often based in Dublin. Having introduced what we mean by algorithmic injustice, we now turn to forum theatre, the method we used to explore this topic.
Method
Forum theatre
Forum theatre was initially developed in 1973 by Brazilian theatre director Augusto Boal (Boal, 1985) as a form of participatory theatre that engages spectators, called ‘spect-actors’, in meaningful dialogue and action on social issues that impact their lives. In what becomes a ‘rehearsal for life’, a team of actors presents a social issue to the spect-actors. A protagonist is often depicted trying to overcome a form of oppression and the everyday barriers they face whilst trying to bring about social change. The spect-actors are then able to enter the play, replacing the protagonist, to offer alternative solutions and strategies. The actors respond in character, giving the spect-actors instant feedback on the potential effectiveness of each strategy performed (Ibid.). Through this process, pathways of change become clearer, and hope is affirmed. Forum theatre has been used all over the world to make systems of oppression visible, from gender-based violence to housing, to what has been called ‘theatre of the techno-oppressed’, which makes visible the ways in which technology can obscure new types of labour exploitation, such as that of Uber drivers or Amazon Mechanical Turk workers (van Amstel and Serpa, n.d.). It is a particularly good method for making the invisible visible and the abstract concrete.
Creativity and Change mentorship on forum theatre
Through an Irish Aid innovation-funded mentorship with Brazilian theatre practitioner Julian Boal (son of Augusto), facilitated by Creativity and Change, a group of artists, activists and academics prepared and performed forum theatre plays in Ireland reflecting local and global social issues. The mentorship began with a three-day, in-person workshop with Julian in Cork in December 2022, where we devised plays related to forms of oppression that impacted our lives, including gender-based violence, housing, care and, in our case, algorithmic injustice. Based on these thematic areas of interest, we separated into small groups and continued to research and develop plays on the topics between December 2022 and an initial public performance in April/May 2023. Each month we sent videos of our work to Julian and then met with him online for feedback. The algorithm group began with six participants, which over time became a core group of writers/actors: Sarah Robinson, Claudia Barton, Pat McMahon and Chriszine Backhouse. We were greatly supported by the wider forum theatre mentorship group, who improvised new scenes, provided feedback and performed in the play, particularly actors Ivy Favier and Kevin McCaughey, and later Catherine Murray, who performed in the final version of the play.
In March 2023, the algorithm group invited a group of software engineers from diverse backgrounds (Iran, Pakistan and Ireland) from Lero, the Irish Software Research Centre, members of the Comhairle na nÓg (n.d.), and members of Jigsaw’s (n.d.) Youth Advisory Panel to give us feedback on our scenes. Jigsaw is Ireland’s Centre for Youth Mental Health and Comhairle na nÓg are child and youth councils in the 31 Irish local authorities, which give children and young people the opportunity to be involved in the development of local services and policies. Some of the young people who attended pointed out that the youth voice was not adequately expressed in the play, while some of the software engineers thought the play lacked nuance regarding the infrastructure of algorithms. The feedback helped us to identify areas of improvement, as well as build connections with people directly impacted by the issues we were portraying.
Synopsis of the play
Connecting the local and global impacts of algorithms was a key task for our forum theatre play. We had to choose characters, locations and dilemmas that would create meaningful opportunities for the spect-actors to engage with the complexity of the issue. After many iterations, we chose to centre the play on a mother who becomes alarmed by the negative impacts of social media on her teenage daughter. The mother discovers that the same algorithms that have harmed her daughter have implications for everything from a rise in misogyny in schools to genocide in Myanmar. The spect-actors are invited to go on this journey with the protagonist, putting together the pieces of the algorithmic puzzle and discovering routes to systemic change.
In a strong forum theatre play, the protagonist will try and fail to find allies in each intervention scene. As Julian Boal taught us, ‘Forum theatre is not about life as it is, it’s about life as it is when you’re trying to change life itself’. As the protagonist tries to make allies, the would-be allies reflect the societal conflicts and contradictions that maintain the oppression: for example, a school dean concerned about the reputation of the school if they go to the media, or a software engineer afraid of losing their job if they perform a virtual walkout.
As we created our play, Julian encouraged us to develop a protagonist who is trying to change the structures of oppression. To allow for a realistic scenario which encompasses local and global areas for activism, and to emphasise the personal as political, the protagonist is a parent and an employee of a social media firm, with potential to effect changes in both areas of her life. The following is a synopsis of the scenes we developed for the forum theatre play:
Scene one: Suggested for you/Click on me
Adolescent Anna is scrolling on her phone and after a few ‘likes’ is suggested posts promoting eating disorders. The charismatic algorithmic suggestions are personified by actors who vie for Anna’s attention.
Image One: ‘Click on me’ scene. Actors: Ivy Favier, Sarah Robinson, and Chriszine Backhouse (Photo: Helen O’Keeffe).
Scene two: School run
Anna’s mother (our protagonist) is getting Anna ready for school. Anna is absorbed in her phone and rejecting suggestions for breakfast. She accidentally leaves the phone behind when running out of the door, giving her mother an opportunity to see the content Anna has been engaging with.
Scene three: Timeprime TV show
In a current affairs programme, a whistle-blower and a representative of a non-governmental organisation (NGO) are interviewed about the global consequences of social media algorithms, connecting the amplification of extreme content to genocide in Myanmar and to declining adolescent mental health in the UK and Ireland. Questions are invited from the audience, with actors planted amongst the spect-actors raising questions about how restricting social media companies might affect jobs in Ireland and how moderating content could undermine freedom of speech. This is a ‘dirt’ scene, which conveys information but also allows for questions from the spect-actors. Real newspaper articles are projected as a backdrop.
Image Two: ‘Timeprime TV Show’ scene. Actors: Sarah Robinson, Ivy Favier, and Claudia Barton. (Photo: Helen O’Keeffe).