Facebook (NASDAQ: FB) removed millions of pieces of harmful content, including child exploitation, hate speech, and terrorist propaganda, over the past three months.
On Wednesday, the social network giant released its Community Standards Enforcement Report, which details its progress in taking down content that violates its policies.
Facebook reported that it also eliminated harmful content posted on Instagram, its photo- and video-sharing platform.
According to the report, Facebook removed 11.6 million pieces of content showing child nudity and sexual exploitation from its platform in Q3, up from 6.9 million in Q2. On Instagram, the social network giant took down 754,000 pieces of content in Q3, up from 512,000 in Q2.
Facebook improved its systems for detecting and removing content
According to the company, the number increased due to improvements in its processes for detecting and removing content that endangers children. The improvements included fixing a bug that was mentioned in its last report.
Facebook eliminated 7 million posts promoting hate speech in Q3, up from 4.4 million in the previous quarter. The company defines hate speech as dehumanizing language, statements of inferiority, insults, or calls for exclusion or segregation based on protected characteristics, including race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disability or disease.
“In recent years, using machine learning, we have developed detection technology that can find and flag hate speech using several different methods,” according to the social network giant.
In the third quarter, Facebook also deleted 3.2 million posts that violated its policies against bullying and harassment, as well as 5.2 million pieces of content promoting terrorism.
During a press conference, Facebook CEO Mark Zuckerberg said, “We are going to keep publishing these reports so people can see the scale of these issues and hold us accountable for improving our system.” He added, “We are making some real progress here, but our systems aren’t perfect and being transparent helps us keep pressure on ourselves to always keep doing better.”