Facebook and Instagram Remove Posts!

Facebook said it deleted millions of posts from April to September for breaching its rules against hate speech, child nudity and other offensive content.

Facebook deleted 58 million posts for adult nudity and sexual activity during the second and third quarters, 5.7 million posts for harassment and bullying, and 11.4 million posts for hate speech, according to its community standards enforcement report covering that period.
Facebook deleted 11.6 million pieces of content from July to September for breaching its policies on child nudity and sexual exploitation, up from almost 7 million in the previous quarter.
The company attributed the rise in these takedowns to improvements in how it detects and removes such content, including how Facebook stores digital fingerprints, known as “hashes,” that are matched against material violating its rules on child nudity and sexual exploitation.
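For illustration only, here is a minimal Python sketch of the general idea behind hash-based matching: a newly uploaded file's digest is compared against a stored set of fingerprints from previously removed material. The hash values, file paths and function names are hypothetical, and production systems are far more sophisticated; they typically use perceptual hashing so that cropped or re-encoded copies can also be matched, which a plain cryptographic hash cannot do.

import hashlib

# Hypothetical set of fingerprints ("hashes") of previously removed images.
KNOWN_VIOLATION_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path):
    # Compute the SHA-256 digest of a file, reading it in chunks.
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_violation(path):
    # True if the file's fingerprint is already on the block list.
    return sha256_of_file(path) in KNOWN_VIOLATION_HASHES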
The report also included new data on content depicting suicide and self-injury, as well as terrorist propaganda.

From April to September, Facebook pulled 4.5 million posts depicting suicide and self-injury. On Instagram, 1.7 million posts were taken down for breaching the same policy.
The latest transparency report comes as regulators around the world continue to call on Facebook, and the rest of Silicon Valley, to be more effective at preventing the viral spread of harmful content such as terrorist propaganda, graphic violence and hate speech.
Calls for regulation intensified after the deadly shooting at two mosques in Christchurch, New Zealand, in March. Footage of the gunman spread rapidly across social media, including Twitter, evading the systems tech companies had built to stop such videos from going viral.