On Wednesday, WhatsApp pushed back against accusations that it was not doing enough to prevent child sexual abuse. The company insisted it had “no tolerance” for such behavior.
WhatsApp: ‘No Tolerance For Child Sexual Abuse’
A WhatsApp spokesperson told TMO:
WhatsApp cares deeply about the safety of our users and we have no tolerance for child sexual abuse. We rely on the signals available to us, such as group information, to proactively detect and ban accounts suspected of sending or receiving child abuse imagery. We have carefully reviewed this report to ensure such accounts have been banned from our platform.
In the statement, the spokesperson insisted that the messaging app is “constantly stepping up our capabilities to keep WhatsApp safe.” They said this included “working collaboratively with other technology platforms,” and that the firm will “continue to prioritize requests from law enforcement that can help confront this challenge.” The spokesperson’s comments came in response to an investigation by The Next Web, which found that apps designed to help people discover child sexual abuse groups could still be sideloaded.
In February, the Facebook-owned messaging service outlined ways it was seeking to fight child exploitation. These included using photo-matching technology to proactively stop child exploitation on its platform. It said it suspends approximately 250,000 accounts suspected of sharing child sexual abuse imagery every month.