Web Exclusive

Issue 01: Year in Review

Exploring the rise of streaming in the year of a global pandemic.

Over the past year, streaming not only met our individual needs and preferences but also brought people with shared interests together.

Solidarity in streaming, or life in a bubble?

Streaming platforms allow communities to organise and demonstrate their collective power from wherever they are.

We saw this last summer as Black Lives Matter protestors streamed demonstrations from multiple US cities following the death of George Floyd. Woke.net’s Twitch channel pulled simultaneous broadcasts of these protests from around the country into a single feed, showcasing the strength and solidarity of the movement.

However, the same algorithms that create communities may also produce “filter bubbles” that distort the information people receive. Such echo chambers, like the one that alleges the 2020 US election was “stolen”, reinforce mistruths rather than expose people to different views.

Over time, these “bubbles” have a polarising effect, legitimising hate speech and violence towards certain communities.

The road to insurrection

A study showed that 64 per cent of users who joined extremist groups on Facebook did so because of its algorithm’s recommendations. Similarly, YouTube was found to be the most discussed cause of “red-pilling” in far-right chat rooms, internet slang for converting someone to fascist, racist and anti-Semitic beliefs.

Used uncritically, these technological tools render people more susceptible to misinformation. On the morning of January 6, 2021, stoked by weeks of anger at the election being “stolen”, a pro-Trump mob stormed the Capitol. Many alt-right personalities live-streamed the chaos to online communities who cheered them on.

“Whose House? Our House!”

Live-streaming an insurrection

As live-streaming platforms have grown in popularity, alt-right personalities like Anthime Gionet (“BakedAlaska”) have built followings on services such as DLive and YouTube.

Digital collectives like Woke.net archive live feeds to “amplify the voices of demonstrators”. Woke.net archived an estimated 30,000 streams in 2020.

November 4, 2020

While votes are counted, Donald Trump alleges “fraud on the American public”. A Facebook group called “Stop the Steal” is created and attracts 300,000 members in two days.

January 6, 2021

Trump addresses thousands of supporters, saying, “we’re going to walk down Pennsylvania Avenue… and we’re going to the Capitol”. Rioters respond by marching, breaching barricades and violently confronting US Capitol Police.

Many live-stream their actions.

With more than 4,000 watching on Facebook, Derrick Evans stands with the mob shouting “Whose House? Our House!”, before moving into the Capitol.

“That was SO EPIC”, says Stephen Baker (“Stephen Ignoramus”) as he live-streams the break-in.

Comments on Gionet's stream goad the rioters on. Once in, he announces, “We are in the Capitol Building; 1776 will commence again!” His viewers send over US$2,000 in “tips” via DLive.

Woke.net pulls the live streams into a Twitch channel, where users can watch official news coverage alongside wobbly mobile-phone footage. Some 64,000 people see the insurrection unfold on Woke.net’s channel, while close to 150,000 viewers tune in to DLive’s far-right streams that day.

Hours pass before US Capitol Police secure the building once again.

In the aftermath, platforms ban Trump; Twitter removes 70,000 QAnon-linked accounts, and Facebook filters all “Stop the Steal” content.

Sources: Boston Globe, MSN, New York Post, The New York Times, NPR, PC Gamer, Vice (1) (2), Wired, World Tribune

Photo Credits: Anthime Gionet, CCDHate, FBI, The Guardian, Hannah Gais, Jacquelyn Martin, Jordan Fischer, Ted Eytan, Woke.net

In cases like the Capitol insurrection, which was fuelled by user-generated streaming content, who should be held responsible?

Streaming ethically

Section 230 of the Communications Decency Act in the US shields websites from liability for user-generated content. This seems like common sense: users should be held responsible for what they say. In practice, reviewing every post from every account is also a Herculean task, given the millions of accounts on these platforms.

Yet it is also too simplistic to absolve the platforms of all responsibility. By encouraging creators to monetise content that maximises user engagement, platforms incentivise the spread of whatever messages draw the most attention, however divisive.

Despite positioning themselves as mere conduits, platforms play a huge role in amplifying the views of content creators and must take responsibility for what ends up on their site.

A shared duty

In reality, platforms, users and even policymakers all have a role to play in standing against viral hate speech and misinformation that harm the public good.

Governments and civil society groups can help define reasonable and transparent standards that consider freedom of expression, safety and privacy. Platforms can deploy their algorithms not just in service of personalisation, but to offer more comprehensive checks against misleading and offensive content.

Users of these platforms must also understand that every “like” and “share” sends a signal to content creators and platform algorithms. By recognising that our actions have power, we can use our clicks to shape the digital spaces we inhabit.

Only by acting collectively can we realise the potential these streaming platforms have to do good.

Sources: Bellingcat, The Verge
Masthead Photo Credit: FBI