Instagram boss says it will change algorithm to stop mistreatment of black users, alongside other updates

'We need to take a harder look at the underlying systems we’ve built, and where we need to do more to keep bias out of these decisions,' the company said.

Instagram’s CEO Adam Mosseri said that the company needs to better support the black community, and is looking into how its “policies, tools, and processes impact black people”.

The Facebook-owned photo sharing platform will focus on four issues: harassment, verification, distribution, and algorithmic bias.

In a blog post, Instagram was vague about the changes it would make in these areas.

“Any work to address the inequalities Black people face has to start with the specific safety issues they experience day to day, both on and off platform,” Instagram says, claiming it will address “potential gaps” where its policies fall short.

It is changing its account verification system to “ensure it’s as inclusive as possible,” but gave no further indication of what those changes would be.

The company is also looking into the ways its algorithm filters content, both with regards to “shadowbanning” and structural biases in its systems.

Shadowbanning, as Instagram describes it in the post, is the practice of “filtering people without transparency, and limiting their reach as a result”.

The company says that it will be releasing more information about the types of content it does not recommend on its Explore tab “and other places”.

“We need to take a harder look at the underlying systems we’ve built, and where we need to do more to keep bias out of these decisions,” the company said.

Questions about algorithmic bias have plagued social media companies for years. In 2019, Instagram was criticised when it was thought to be limiting users' posts to only a small percentage of their followers.

TikTok had to apologise for algorithmically hiding posts that included the Black Lives Matter or George Floyd hashtags from view, with the company saying it had to “regain and repair [the] trust” between it and the black community.

Facebook has also recently been criticised over how it manages its algorithm, after it reportedly shuttered research that would have made the platform less divisive but would also have been “antigrowth” and required “a moral stance”.

Systemic bias in technological algorithms is not unique to social media platforms either. IBM said it would not continue to develop general purpose facial recognition because of the ways in which it harms communities of colour.

Similarly, Amazon put a one-year moratorium on its own Rekognition facial recognition technology, following the protests in the US over the death of George Floyd.
