A recent report showed that the team handling child sexual abuse content on the platform for the Asia Pacific region, home to 4.3 billion people and to Japan, the country with the second-most Twitter users after the US, is down to a single employee.
Twitter, which allows adult pornography but lacks the technology to distinguish consenting adults from children without human staff, simply needs in-house help. Organisations like the Internet Watch Foundation (IWF) and the National Center for Missing & Exploited Children don’t have access to the internal data, detection code, and other tools that Twitter could use to prevent child sexual abuse material from being shared in the first place.
Child sexual abuse content is already eating into Twitter’s business: Both Dyson and Forbes have suspended advertising on the platform after their ads appeared directly next to child abuse content.
During that six-month reporting period, the company suspended more than half a million accounts for disseminating child exploitation content, a 31 per cent increase over the previous six months.
In October, Delhi Commission for Women chairperson Swati Maliwal said that Twitter’s replies to the Commission’s child pornography complaints were incomplete and that the Commission was not satisfied with them.
"I am shocked with the kind of rape and child pornographic videos available freely on Twitter. The nauseating material needs to be immediately removed from Twitter and FIR should be registered by Delhi Police in the matter. Systems must be developed so that all such videos are immediately deleted and the perpetrators reported to the law enforcement agencies. Twitter must be held accountable for this filthy and objectionable content being available and even sold on its platform," said DCW chief Swati Maliwal at a press briefing at the end of September.