During a recent episode of TMO’s Daily Observations podcast, John Martellaro intrigued me with a scenario. More on that in a moment. Here’s the thought that popped into my head: what if our voice assistants became mandated reporters, legally required to report suspected child abuse?
What Is a Mandated Reporter?
In legal terms, a mandated reporter is a professional who has frequent contact with children and is required to notify the authorities when there’s reason to believe a child is a victim of abuse. Each state sets its own standards. The types of people who are mandated to report child abuse include, but are not limited to:
- Social workers
- School personnel like teachers, principals, nurses, and others
- Health care workers, such as doctors and nurses
- Mental health professionals
- Child care providers
- Medical examiners and coroners
- Law enforcement officers
Now, you might be wondering how this can apply to something like Alexa or Siri. After all, these are software constructs, not professionals who have frequent contact with children. Some states cast the mandated-reporter net wider than child-facing professions, though. In 12 states, Guam, and Puerto Rico, companies that process and/or print photographs have to report suspected child abuse. In six states, even computer technicians have to notify the authorities when they suspect child abuse.
How Can Siri or Alexa Be Forced to Report Child Abuse?
In the case of the Echo lineup of devices, Alexa is always listening. Granted, the voice assistant doesn’t respond, record, or transmit anything to Amazon until it hears its wake word, but the microphone is always on, listening locally for that word. If the Siri Speaker comes to fruition, that device will likely work the same way.
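A minimal sketch of that gating behavior, in Python. Everything here is hypothetical: detect_wake_word() stands in for an on-device keyword-spotting model and stream_to_cloud() for the vendor’s upload path; neither is a real Amazon or Apple API.

```python
from collections import deque

BUFFER_CHUNKS = 4  # keep only the last few chunks of audio on-device

def detect_wake_word(chunk: bytes) -> bool:
    """Toy stand-in for a local keyword-spotting model ("Alexa", "Hey Siri")."""
    return b"wake" in chunk

def stream_to_cloud(audio: bytes) -> None:
    """Toy stand-in for the upload that only happens after a wake word."""
    print(f"uploading {len(audio)} bytes for cloud speech recognition")

def listen_forever(microphone):
    buffer = deque(maxlen=BUFFER_CHUNKS)  # old audio simply falls out
    for chunk in microphone:              # the mic never stops sampling...
        buffer.append(chunk)
        if detect_wake_word(chunk):       # ...but only a local match opens
            stream_to_cloud(b"".join(buffer))  # the pipe to the cloud

# Only the chunk containing the (toy) wake word triggers an upload.
listen_forever(iter([b"dinner chatter", b"hey wake word", b"goodnight"]))
```

The point of the design is that everything before the wake word stays on the device and is continuously discarded; only a local match opens a connection to the vendor’s servers.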
Children love these voice assistants. My own children have a blast talking to Alexa, asking her questions, and telling her stories. They’ve come to view her as a friend, and viral videos of kids carrying on lengthy conversations with Siri or Alexa suggest they’re hardly alone. Many children already treat these disembodied assistants as trusted confidants.
From the Mouths of Babes
John Martellaro posited that if Alexa became sophisticated enough, she might get parents into trouble. The voice assistant might suggest a child contact the police after hearing about a case of domestic violence. If Alexa were legislated to be a mandated reporter, the device could contact the authorities all on its own. That could be a good thing, but some people are going to cry foul and warn it could be devastating.
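For the sake of argument, here is a hedged sketch of what such a pipeline might look like. Nothing in it exists in Alexa or Siri; the scoring function, thresholds, and dispositions are all invented for illustration, and the gap between “queue for human review” and “file a report automatically” is exactly where the judgment-call problem lives.

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.5        # flag for a trained human to review
AUTO_REPORT_THRESHOLD = 0.95  # a bar a real system might never auto-cross

@dataclass
class Assessment:
    transcript: str
    risk_score: float  # 0.0-1.0, from some hypothetical disclosure model

def score_utterance(transcript: str) -> Assessment:
    """Toy stand-in for a model estimating abuse-disclosure likelihood."""
    keywords = ("hurt me", "hit me", "afraid of")
    hits = sum(k in transcript.lower() for k in keywords)
    return Assessment(transcript, min(1.0, hits * 0.4))

def handle(assessment: Assessment) -> str:
    if assessment.risk_score >= AUTO_REPORT_THRESHOLD:
        return "file report with authorities"    # the scenario in question
    if assessment.risk_score >= REVIEW_THRESHOLD:
        return "queue for trained human review"  # the safer middle ground
    return "discard"                             # ordinary chatter

print(handle(score_utterance("Alexa, tell me a joke")))            # discard
print(handle(score_utterance("He hit me and I'm afraid of him")))  # review
```

A keyword count obviously can’t distinguish a genuine disclosure from play-acting or an angry exaggeration; that’s precisely the sophistication gap discussed below.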
You see, there are occasional false reports of child abuse and sexual abuse. Some people will assume that a child angry with his or her parents will run to Alexa or Siri, concoct some wild story about an innocent parent victimizing them, and let the device contact the police. The courts could even try to hold Amazon itself liable, since the device is always listening. In such a case, the retailer could likely defend itself easily: Alexa doesn’t send any data to Amazon unless she hears her wake word.
The Truth Is in the Numbers
Let me set your mind at ease somewhat. Your children are highly unlikely to tell Alexa you beat them or sexually abused them, even if you just grounded them for skipping their homework. According to a literature review conducted by The Leadership Council on Child Abuse & Interpersonal Violence, intentionally false accusations happen very infrequently. In four different states, such allegations made up less than one percent of all unsubstantiated reports of child abuse.
Personally, I don’t think Alexa or Siri will become mandated reporters anytime soon. That kind of judgment call is too sophisticated for our voice assistants … for the time being. Human mandated reporters go through hours of training before taking on the role. Such devices might become mandated reporters at some point in the future, but we, as a society, have plenty of time to prepare. Honestly, if a voice assistant helps reduce the number of unreported child abuse cases, that might not be a bad thing.
A clarification: As Jeff Gamet and I pointed out in the original podcast, none of us condones child or spousal abuse. These are crimes. But as Jeff B. points out, there could be unexpected events, and corresponding social and legal consequences to be worked out, as these devices become more pervasive – and trusted. This excellent article explores just a few of the implications.