Instagram bug saw Chinese food emojis appear under searches for ‘dog’
The issue was caused by the phrase “doggy bag”, according to Instagram head Adam Mosseri
Searching for the word “dog” on Instagram’s Story function resulted in an emoji for a Chinese takeaway box, a company employee found.
While the racist bug occurred on Apple devices, it could not be replicated on Android. Similar Story functions on Twitter, Snapchat, and Facebook either were not searchable in this way or did not show the Chinese takeaway box.
The associations between dogs and Chinese food are based on racial stereotypes.
“How are the emoji’s being recommended in this and can we remove this so this doesn’t perpetuate Asian racial stereotypes?” wrote an Instagram product integrity program manager, according to BuzzFeed News.
“I’ve tested this with 3 of my family members and it shows up for them.”
In response to BuzzFeed News’s story, Adam Mosseri, the head of Instagram, tweeted that the issue was caused by the associated phrase “doggy bag”.
“So it turns out one keyword associated with [Takeout box] in our system was ‘doggy bag’, so the search term ‘dog’ produced the emoji as a match. We have since removed that search term and we apologize that it was misconstrued, and to anyone we offended,” Mosseri said.
The fault appears to lie at Instagram’s doorstep, according to the Unicode Emoji Subcommittee, which helps new emojis gain approval.
Jennifer 8. Lee, a vice chair of the subcommittee, told BuzzFeed News that “‘Dog’ is not a keyword for ‘takeout box’ in Unicode” and that, while emojis can be linked to certain keywords, such associations are made within the app rather than in Unicode itself.
Instagram is not the only company recently found to have coded racist connotations into its products. Apple’s adult-content filter on iOS, when enabled, blocks web searches containing the word “Asian”. Phrases such as “Asian food” or “Asian countries” cannot be searched for while this restriction is active.
“The URL was blocked by a content filter”, the error message states. The message does not appear for the words “black”, “white”, “Arab”, “Korean” or “French” – other racial and national categories popular on pornographic websites – nor for “schoolgirl”, but it is triggered by searches for “teen”, “amateur” and “mature”.
The issue was spotted by iOS developer Steven Shen, who originally filed a report with Apple in December 2019 stating that its filters were showing “Incorrect/Unexpected Behaviour”, but he received no reply from the company.