Google Images’ search algorithm is fantastically useful a lot of the time. Other times, it is kind of creepy, especially if you’re searching for female stars. Wired looked at the different results the service produced when users searched for pictures of men and women.
If you’re searching for a man, Google’s image-hunting algorithm will mostly focus on his career. If you’re searching for a woman, Google’s algorithm will focus on her body. Do a Google Images search for Robert Downey, Jr., for example, and only four of the 30 algorithmically generated related search terms relate to his physical appearance: workout, body, handsome, cute. The vast majority focus on films he has starred in and actors he has appeared alongside. Do the same for Avengers co-star Scarlett Johansson and Google will recommend no fewer than 14 related visual searches based on her physical appearance, from “stomach” to “figure” and “big” to “rack”. It’s the male gaze, algorithm style.
Check It Out: Google Image Search’s Creepy Sexism Problem
It is NOT Google’s responsibility to alter search query data AT ALL to suit uptight, hypocritical human defectives – you know, the ones who believe in tooth fairies and the afterlife but can’t even handle the after-birth. It doesn’t take much of an algorithm to bounce back WHAT PEOPLE WANT as the first results listed. The irony is that porn (aka “The Body”) makes more money than ALL pro sports combined, and as THE viewable pastime by FAR over any TV, movies, books, etc., you wouldn’t guess what finally overtook it. Yep, social media – the same medium now used to talk about and view the thing that used to dominate the media… and the suckers who use web search get back what THEY reflect, as proven by the data, and then complain about it like children. Man, and I thought Apple clones were simple. What a world. 📺
I was under the impression that the image search algorithm is trained by the people using the tool. I am not sure whether this is an indictment of Google or of Google’s search customers.
On one hand, it is exceedingly hard to train an algorithm to return the results most relevant to the people doing the search.
On the other hand, it isn’t Google’s job to ignore that and tune the program to a specific social justice agenda.
On one hand, it is exceedingly hard to train an algorithm not to have the biases of the programmer or the dataset – sexual, racial, or whatever.
On the other hand, it is THEIR JOB to make these algorithms as neutral as possible. It’s what they are getting the big bucks for.
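To see why commenters point at the users, here is a minimal toy sketch (an illustrative assumption, not Google’s actual pipeline): if “related searches” are generated from nothing more than co-occurrence counts over a query log, the suggestions simply echo whatever people type most often.

```python
from collections import Counter

def related_searches(query_log, name, top_n=3):
    """Return the terms most often co-searched with `name` in the log.

    A naive co-occurrence model: no notion of fairness or relevance,
    it only mirrors the log it was given.
    """
    counts = Counter()
    for query in query_log:
        words = query.lower().split()
        if name in words:
            counts.update(w for w in words if w != name)
    return [term for term, _ in counts.most_common(top_n)]

# Hypothetical, hand-made query log skewed the way the article describes.
log = [
    "downey movies", "downey films", "downey avengers",
    "johansson body", "johansson figure", "johansson body",
    "johansson movies",
]

print(related_searches(log, "downey"))     # career-focused terms
print(related_searches(log, "johansson"))  # appearance terms dominate
```

With this skewed log, the “algorithm” suggests career terms for one name and body terms for the other – the bias lives in the data it was fed, which is the crux of the Google-vs-users debate above.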