There is a time-honored technique, used around much of the world, of using sex to sell goods. And by sex, I mean anything from something as innocent as a pretty woman selling laundry soap to something as upsetting as a glimpse of a strangled female movie star used to lure people into a crime drama. Even so, the networks have standards and practices groups that dictate just how far these scenes can go.
In fact, we expect corporations to have values, everywhere we turn. We expect oil companies to take considered, careful technical measures to protect the earth’s environment. We expect companies that sell food to provide oversight on the quality of what we eat. We expect our insurance companies to have integrity and not throw legal trickery at families with children dying of cancer. In every aspect of western culture, we expect companies to obey the law, but more than that, to demonstrate sound moral values.
Apple’s Pickle
Apple has made a conscious decision to curate apps. They can’t be harmful, and they can’t inappropriately violate a person’s privacy. Apple does this for a reason, which I’ll get into in a minute. But one might first ask whether the company has a right to do that. I’ll argue that just because Apple never enforced that right on Apple IIs and Macintoshes doesn’t mean it can’t start now, in a much more vicious Internet atmosphere that arrived long after the first Apple II in 1977 and the first Macintosh in 1984.
Companies enforce their rights to protect their customers all the time. Restaurants reserve the right to refuse service to a customer who may be packing a legal, unconcealed firearm, lest their other customers become unduly alarmed. Newspapers are not compelled by law to accept ads from terrorist organizations. Cities have the right to enforce no-smoking laws in certain buildings, public and private, recognizing their responsibility to protect citizens from harmful cigarette smoke.
These are all laudable values by companies. But let those values get in the way of a businessman who wants to use sex to market his goods, and watch the fur fly.
Apple’s approach is, in my opinion, based on two concepts. First, it has a right and a responsibility to protect its own customers; if it fails to do so, the company will be sued anyway and, in some instances, found liable for not exercising due diligence and, in fact, implicitly condoning bad app behavior. Second, Apple recognizes the Internet’s dirty little secret: apps that appeal to customers because of their sexual content, harmless though they appear on the surface, often use that appeal to engage in scurrilous, under-the-hood activity, for example, stealing credit card numbers or reporting your location to car thieves.
Another fact of legal life that’s conveniently ignored is that angry victims take out their anger not only on the creator of the offending product but also on the conveyor of that product.
For example, if a restaurant customer finds a piece of a mouse carcass in her hamburger, she’ll sue not only the company that ground the hamburger but also the restaurant for not exercising due diligence in the food it served her. If an Apple customer is financially harmed by an app, who’s he going to sue? A little one-man outfit in, say, Berlin? Or Apple, a company with deep pockets, for not protecting him? This happens all the time, and victims win because of juror sympathy for the little guy.
This will become even more important when our smartphones become our digital wallets.
Even though Apple tries to balance freedom of expression against the safety of its customers, there are those who cry foul. There are unscrupulous businessmen who want complete freedom to dupe their customers at Apple’s expense. Some even, outrageously, claim censorship. So far, Apple has stood its ground, but there may be a compromise.
A Way Out
Apple so far has not seen fit to create an adult version of the App Store. In some cases, the company has asked customers to certify that they’re at least 17 years old by touching a button, but that hasn’t kept Apple from prohibiting certain apps that it believes are using sex to lure customers. Yes, it’s a line in the sand, but it does create confidence on the part of the customer who’s about to install an app.
What about an adult’s right to choose and take responsibility for his or her actions? I agree with that. If Apple were to set up a mechanism, an adult section of the App Store, with proper legal and practical considerations, it would make a lot of people happy, and the whole problem would, I believe, just go away. For example, Apple could require a customer to enter a name, street address, and driver’s license number. That data could then be checked for validity in a court case, if the customer decided to sue anyway. Apps there could be paid apps only, a factor that seems to weigh on this. And, I believe, Apple could enforce terms of service that prohibit legal action if the app turns out to harm the customer. A customer might even have to sign a piece of paper and mail it in, distasteful as that may be to a company out to save our trees.
I’m not a legal expert. These are just my personal observations and opinions, put forward for your consideration. I’m sure all of our astute readers will weigh in, including “Nemo,” who can help clarify the legalities.
The fundamental issue here, it seems to me, is that our society, in 2010, tends to think only in terms of what’s legal and not legal. When a company takes a moral, practical stand that has underlying legal and business implications, we become annoyed. The wild, wild west businessmen on the Internet become infuriated. They, in turn, appeal to governments to take away the rights of a company that is exercising its own right to have corporate values and protect its customers. It’s not a pretty sight.
Apple could make both sides happy with a formal, properly configured adult App Store. Then, those who want to carry the burden of risk themselves can do so freely, and we can all move on.
Let the discussion begin.