According to Black Enterprise, Meta, the parent company of Facebook and Instagram, will examine whether its platforms treat users differently based on their race. Black Facebook users and employees have criticized the platform's racial bias for years.
In a 2019 USA Today report, Black Facebook users said their posts about racism had been taken down for violating hate speech rules. The social media site also apologized earlier this fall after its artificial intelligence software labeled Black men as primates. Additionally, Facebook employees told NBC News that the company ignored racial bias research.
According to NBCNews.com, in mid-2019, researchers at Facebook began studying a new set of rules proposed for the automated system that Instagram uses to remove accounts for bullying and other infractions.
What they found was alarming. Users on the Facebook-owned Instagram in the United States whose activity on the app suggested they were Black were about 50 percent more likely under the new rules to have their accounts automatically disabled by the moderation system than those whose activity indicated they were white, according to two current employees and one former employee, who all spoke on the condition of anonymity because they weren’t authorized to talk to the media.
The findings were echoed by interviews with Facebook and Instagram users who said they felt that the platforms’ moderation practices were discriminatory, the employees said.
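To put that figure in perspective, here is a minimal illustrative sketch of how a "50 percent more likely" disparity in automated account disabling can be expressed as a ratio of per-cohort rates. The counts below are hypothetical and are not Facebook's data; the reporting did not publish raw numbers.

```python
def disable_rate(disabled: int, total: int) -> float:
    """Fraction of accounts in a cohort that the automated system disabled."""
    return disabled / total

# Hypothetical cohorts (made-up counts, for illustration only).
black_rate = disable_rate(disabled=150, total=10_000)  # 1.5%
white_rate = disable_rate(disabled=100, total=10_000)  # 1.0%

# Relative disparity: how much more likely one cohort is to be disabled.
disparity = black_rate / white_rate - 1
print(f"Black users were {disparity:.0%} more likely to be disabled")  # 50%
```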
The researchers took their findings to their superiors, expecting that it would prompt managers to quash the changes. Instead, they were told not to share their findings with co-workers or conduct any further research into racial bias in Instagram's automated account removal system. Instagram ended up implementing a slightly different version of the new rules but declined to let the researchers test the new version.
It was an episode that frustrated employees who wanted to reduce racial bias on the platform but one that they said did not surprise them. Facebook management has repeatedly ignored and suppressed internal research showing racial bias in the way that the platform removes content, according to eight current and former employees, all of whom requested anonymity to discuss internal Facebook business.
The lack of action on this issue from management has contributed to a growing sense among some Facebook employees that decisions on these matters rest with a small inner circle of senior executives, including Chief Executive Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, Nick Clegg, vice president of global affairs and communications, and Joel Kaplan, vice president of global public policy.
Facebook did not deny that some researchers were told to stop exploring racial bias but said that it was because the methodology used was flawed.
Alex Schultz, Facebook's vice president of growth and analytics, said research and analysis on race are important to Facebook, but that race is a "very charged topic" and so the work needs to be done in a rigorous, standardized way across the company.
“There will be people who are upset with the speed we are taking action,” he said, adding that “we’ve massively increased our investment” in understanding hate speech and algorithmic bias.
“We are actively investigating how to measure and analyze internet products along race and ethnic lines responsibly and in partnership with other companies,” Facebook spokeswoman Carolyn Glanville added, noting that the company established a team of experts last year, called Responsible AI, focused on “understanding fairness and inclusion concerns” related to the deployment of artificial intelligence in Facebook products.
In an effort to be neutral, the company's hate speech policies treat attacks on white people or men in exactly the same way as they treat attacks on Black people or women, an approach that employees said does not take into account the historical context of racism and oppression.
“The world treats Black people differently from white people,” one employee said. “If we are treating everyone the same way, we are already making choices on the wrong side of history.”
Investors need to pay attention to Facebook for this reason
Tags used in this video:
black women,black community,social media,Facebook,Instagram,Whatsapp,Twitter,Social Dilemma,black people,social,media,Facebook status,messenger,mark zuckerberg,black men,black children,bias,algorithm,Black people differently from white people,policies prohibiting hate speech that attacks,Sheryl Sandberg,Nick Clegg,Joel Kaplan,racial bias,employees,moderation practices were discriminatory,Facebook Wants to Know if It’s Racist or Not?,bullying,hate speech, Investors need to pay attention to Facebook for this reason
#Facebook #socialmedia #markzuckerberg #blackwomen
For more information visit https://the-fly-nubian-queen-network.teachable.com