TikTok’s algorithm misreads creator’s pro-Black profile as a threat

The error occurred on TikTok’s beta Creator Marketplace, which links content makers with sponsors

Adam Smith @adamndsmith
Friday 09 July 2021 17:03

TikTok blocked users of its Creator Marketplace from using the word “black” and phrases like “Black Lives Matter” in their bios, as the algorithm flagged them as “inappropriate content”.

Creator Ziggi Tyler discovered the issue while attempting to update his bio; the words “Black,” “Black Lives Matter,” “Black people,” “Black success,” “Pro-Black,” and “I am a Black man” were not accepted. “Pro-white” and “supporting white supremacy,” however, were accepted by TikTok’s algorithms without issue.

TikTok’s Creator Marketplace is currently in invite-only beta testing, and aims to connect creators with brands for sponsorship deals.

TikTok said that the app mistakenly flagged the phrases because its hate speech detector paired the word “black” with the word “audience”, which contains the string “die”, and treated the combination as hate speech.

“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a TikTok spokesperson said in a statement.
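A filter of the kind TikTok describes can be illustrated with a short, purely hypothetical sketch: if flagged terms are matched as substrings and in any order, an innocuous phrase like “Black audience” trips the detector because “audience” happens to contain “die”. This is not TikTok’s actual code, and the term pair below is an assumption drawn only from the company’s own description of the bug.

```python
# Hypothetical sketch of an order-insensitive, substring-based filter,
# based on TikTok's description of the bug; NOT its real implementation.

# Term pairs that, appearing together in any order, trigger a flag.
FLAGGED_PAIRS = [("black", "die")]

def is_flagged(bio: str) -> bool:
    """Flag a bio if both terms of any pair occur as substrings,
    regardless of word order or word boundaries."""
    text = bio.lower()
    return any(a in text and b in text for a, b in FLAGGED_PAIRS)

print(is_flagged("Black audience"))  # True: "audience" contains "die"
print(is_flagged("pro-white"))       # False: no flagged pair matches
```

Because the check ignores word boundaries, any word containing a flagged string ("audience", "soldier", "diet") can produce a false positive; matching whole tokens instead of raw substrings would avoid this particular failure.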

“We recognize and apologize for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.”

The issue is the latest in a series of examples of automated systems working against minorities. Instagram’s CEO Adam Mosseri said in June 2020 that the company needed to better support the Black community and was looking into how its “policies, tools, and processes impact black people”, including its own algorithmic bias.

Algorithmic censorship has also seen posts from Palestinians about violence in Gaza taken down on Facebook, Instagram, and Twitter, prompting criticism of the black-box nature of these systems.

Outside of social media, other algorithms, including facial recognition systems, routinely fail to properly identify the faces of people of colour – who are already targeted disproportionately by police. In February 2019, Nijeer Parks spent 10 days in jail and paid $5,000 (£3,627) to defend himself after being misidentified by facial recognition software and subsequently arrested by police.

“Regardless of what the algorithm is and how it picked up, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there have been racial controversies?”
