Social media giant Facebook, which also owns the popular apps Instagram, Messenger and WhatsApp, reportedly assesses about 5 lakh (500,000) reports of revenge porn per month -- a figure that seems low given the platform's roughly 2.6 billion monthly active users.
Earlier this year, Facebook launched artificial intelligence (AI) tools that could spot revenge porn, also known as non-consensual intimate images, before it is reported by users, NBC News reported.
In 2017, the company had launched a pilot project that let users submit intimate pictures to the platform to train its AI tool to identify and remove such pictures if they appeared on the platform.
"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," NBC News quoted Radha Plumb, head of product policy research at Facebook, as saying.
Facebook has a team of around 25 people -- excluding content moderators -- that works full-time on fighting revenge porn.
The team's goal is not only to quickly remove pictures or videos once they have been reported, but also to use AI to detect the images the moment they are uploaded, preventing them from being shared at all.
(IANS)