Jeff Gamet and Bryan Chaffin recently wrote nice articles hoping that Apple can achieve what many think is impossible: an AI agent that knows all about us, the better to serve our wants and needs, but that can keep its mouth shut and not destroy our privacy. The good news is that Apple can totally snoop on all its users and still maintain everyone's privacy.
Effective AI + Privacy = Doable
If you listen to the pundits, they say it's not possible. It seems most of the tech intelligentsia believes that to compete with Amazon, Google, Facebook, et al., Apple needs technologies that mine the heck out of people's personal data and store a nightmare of privacy-invasive information just waiting to be used to crush people's 4th, 5th, and 14th Amendment rights to privacy.
But that's wrong. It's a false dichotomy. You can have a know-more-about-you-than-your-mom-does AI agent and still preserve your privacy; it is totally doable in at least two ways. Here's how.
Actionable Cryptography
First, there are technologies that allow for actionable encryption (e.g., "fully homomorphic encryption") and searchable encrypted data. I'll spare the reader the trip down into those esoteric topics, other than to say they exist. Apple is smart. It could use such actionable encryption to work on your data while having no idea what is in your data.
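For the curious, here's the flavor of it. Below is a toy, deliberately insecure sketch of an additively homomorphic scheme (in the style of Paillier) in Python. The key sizes, the numbers being added, and the whole scenario are made up for illustration, and none of this describes anything Apple actually ships; the point is simply that a server can add two numbers it never gets to see.

```python
# Toy additively homomorphic encryption (Paillier-style). NOT secure; tiny fixed
# primes are used only to keep the example readable. Python 3.9+ for math.lcm
# and modular-inverse pow().
import math
import random

def generate_keys(p=61, q=53):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    n_sq = n * n
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n_sq = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(priv, c):
    lam, mu, n = priv
    n_sq = n * n
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

def add_encrypted(pub, c1, c2):
    # Multiplying ciphertexts adds the hidden plaintexts; whoever does this
    # multiplication never sees the numbers being added.
    n, _ = pub
    return (c1 * c2) % (n * n)

pub, priv = generate_keys()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = add_encrypted(pub, c1, c2)
print(decrypt(priv, c_sum))  # 42, computed entirely on encrypted values
```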
Before we get more into a second method for 'how,' a bit on the way some AI works. A lot of AI systems rely on machine learning. Machine learning is a fancy name for a bunch of tech, including neural nets, which are a fancy(ier) way of building 'fuzzy' if-then constructs by running tons of scenarios (e.g., your [often private] transactions and data) through them. The neural nets can 'catch' complex patterns that would be too onerous for humans to find and too difficult to hand-program. The above is a gross, not-entirely-accurate over-simplification that will have proper computer scientists spitting (or worse) at the screen, but it will suffice for purposes of this conversation.
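To make that a little more concrete, here's a minimal sketch in Python: a single artificial 'neuron' (a logistic unit) learning a fuzzy if-then rule from a handful of examples. The features and labels are invented purely for illustration, and real systems use far bigger networks and far more data.

```python
# One artificial neuron trained by stochastic gradient descent on made-up data.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Made-up examples: [bought_cupcakes, watched_spiderman] -> liked_puppy_hat_video
examples = [([1, 1], 1), ([1, 0], 1), ([0, 1], 0), ([0, 0], 0)]

weights, bias, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(5000):
    x, y = random.choice(examples)
    prediction = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
    error = prediction - y
    weights = [w - lr * error * xi for w, xi in zip(weights, x)]
    bias -= lr * error

# After training, the pattern lives in a few numbers (the weights), not in the examples.
for x, y in examples:
    score = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
    print(x, "->", round(score, 2), "(actual:", y, ")")
```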
Learn and Purge
The second way Apple could snoop on you yet keep your information private is perhaps even more powerful: learn and purge. You have to pass information through an AI agent so that it can learn, but there is no need to keep that information after the AI is trained. You see, after you pass information about what you order, eat, visit, etc., through a machine learning algorithm, the AI has learned about your likes and no longer needs the underlying data. You can get rid of it.
Here's a good example: Tesla used machine learning to teach its cars how to drive themselves. The company pumped zillions of miles of road video and driving scenarios through the system to train the AI 'mind' about roads. Yet it's not as if a Tesla stores those zillions of miles of road video; beyond being impractical, it's totally unnecessary. The cars just keep the resulting AI 'brain' that was trained on those scenarios.
In some ways, it's just like a human being. If you buy your wife a vacuum cleaner as an anniversary gift and you see her look of abject disappointment, you, hopefully, learn from that bit of data. But you do not need to keep a photo of her disappointed face and the receipt for the vacuum to remember that lesson. Neither do the machine learning algorithms. You can just purge that underlying data.
So maybe Apple pumps your purchases, website visits, iTunes music plays, etc. (perhaps in encrypted form) through machine learning algorithms and learns that you like cupcakes and Spider-Man and puppies wearing funny hats. The great thing is, it can then purge all of the evidence that helped the algorithm learn this about you. There would be no actionable records, logs, etc., of your visits to puppy-hat websites.
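For a sense of how little needs to be kept, here's a minimal learn-and-purge sketch in Python, assuming the scikit-learn library is available. The 'history' features and labels are entirely fabricated for illustration; the point is that once the model is fit, only a few learned weights survive and the raw records can be deleted.

```python
# Learn a preference, then throw away the data that taught it.
import pickle
from sklearn.linear_model import LogisticRegression

# Fabricated features: [visited_puppy_sites, bought_cupcakes, watched_spiderman]
history = [[1, 1, 0], [1, 0, 1], [0, 0, 1], [0, 1, 0]]
liked_puppy_hat_video = [1, 1, 0, 0]

model = LogisticRegression().fit(history, liked_puppy_hat_video)

# Purge the underlying data; all that survives is the trained model.
del history, liked_puppy_hat_video
trained_brain = pickle.dumps(model)  # a few learned coefficients, no raw records

# Later: recommend based on what was learned, with no logs of past behavior.
model = pickle.loads(trained_brain)
print(model.predict([[1, 0, 0]]))  # expected [1]: the preference was learned, the history is gone
```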
And although the resulting AI agent would be devoid of any actual substantiation or private data, it would be totally useful. Much as you still have an intuitive mastery of at least some of your old school subjects long after you purged the textbooks, the AI agent learns about you over time without holding onto its 'books' on you. It doesn't need evidence or proof to know you like puppies wearing hats. It just knows. And so, when a new puppy-hat video comes to iTunes, it might suggest it when you ask Siri, "What's new and good on TV tonight?"
Apple, please just buy DuckDuckGo, and show up the tech data hoarders
For instance, Apple could buy a search engine like DuckDuckGo and pump your searches through its AI to learn more about you, yet purge, or simply never keep, any logs or history of what you search for or visit. And although it's a bit of a tangent, that Apple hasn't bought DuckDuckGo is almost as baffling as Yahoo! never having bought it or developed a search engine of its own. Someone is going to buy it at some point, and everyone else is going to really regret missing the boat.
Hoarders gonna hoard
Anyway, if that backing store of data isn't needed after teaching an AI about you, why do Facebook, Google, Reddit, and the rest still store everything about you? Much like your crass hoarder neighbors, whose garage holds every bit of crap they've ever come across in life (except their cars), these companies have problems. They use the data mostly to sell you advertising. But how likely are you to care about a purchase of candy corn 12 years ago, versus more recent trends in your taste (e.g., Mamba Sour, yum)?
To be fair, they can also use the information to deep-mine and deep-test alternative, improved machine learning algorithms. However, nothing stops them from simply A/B testing new algorithms on the newer data coming through their pipes. In the end, they hoard for the same reason all hoarders hoard: they just can't help themselves.