When Apple added an emoji search bar to iMessage, it let you quickly find a particular emoji among more than 3,000 options. For instance, if you type in “love,” it’ll show you heart emojis in many color options. Or if you type “sad,” it’ll show you the loudly crying face emoji.
However, when people search for certain geographic regions or countries, the Apple emoji keyboard surfaces suggestions that reinforce stereotypes.
For example, if you search “Africa,” it’ll show you three suggestions: a globe, a flag, and a hut. While the first two are probably what you’d expect, the hut reflects a Western stereotype about Africa.
By contrast, when you type “Europe,” it’ll show emoji options for a globe, the European Union flag, the euro currency, a European post office building, a football, and a castle.
Instagram, too, previously faced backlash for perpetuating a racist stereotype by showing users the dog emoji for a Chinese takeout box.
A more politically charged suggestion surfaced online earlier: when a user typed “Jerusalem” (the capital of Israel, according to Google), the iPhone keyboard suggested the Palestinian flag.
It’s unclear in this instance whether Apple’s emoji search relies on more complex machine learning algorithms that adapt to user behavior, or simply on keywords and tags.
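The simpler of those two approaches can be sketched as a plain tag lookup. This is a minimal illustration, not Apple’s actual implementation, and the tag lists below are invented for the example:

```python
# A minimal sketch of keyword-and-tag emoji search (NOT Apple's real system):
# each emoji carries a hand-written list of tags, and a query matches any
# emoji whose tags contain the query text. Whatever a human put in the tag
# list is exactly what the search will suggest -- stereotypes included.

EMOJI_TAGS = {
    "❤️": ["love", "heart", "red"],
    "💔": ["broken heart", "sad", "love"],
    "😭": ["loudly crying face", "sad", "cry"],
    "🏰": ["castle", "europe"],
}

def search_emoji(query: str) -> list[str]:
    """Return every emoji with a tag containing the query, case-insensitively."""
    q = query.strip().lower()
    return [
        emoji
        for emoji, tags in EMOJI_TAGS.items()
        if any(q in tag for tag in tags)
    ]
```

With data like this, `search_emoji("sad")` returns the broken-heart and crying-face emojis; the point is that such a system has no model of offensiveness at all, only whatever associations its curators wrote down.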
Apple gets its emojis straight from the Unicode Consortium, a collaboration of some of the world’s largest technology companies, including Google, Facebook, and Microsoft, tasked with standardizing emoji usage across internet browsers, social media platforms, and devices.
This isn’t the first time Apple has been accused of building bias into its emojis. In 2017, Apple’s predictive text feature, which suggests emojis as users type words on the keyboard, faced criticism for offering all-male emojis for terms such as “CEO” and “CTO.”