For some reason, when you go to Google Image Search and search for the term [hot], the only images that come up are pictures of women in underwear, lingerie, swimsuits and other sexual situations. There are no photos of hot coffee, desserts, fireplaces, melting ice cream or other images that would be more appropriate for young children.
A school administrator complained about this in the Google Web Search Help forums the other day, writing:
Why does the word “hot” bring up only inappropriate images in Safesearch?
We are a primary school with students aged between 5 yrs and 12 yrs old. We had an 8 year old search “hot” as a query to translate the word into Māori. The child ended up with a complete page of soft pornography, even though we have SafeSearch turned on via our filtering service and on our computers. Our filtering service recommended that we report this issue directly to Google to find some resolution. Is there some way we can make this more child friendly?
Even with the “filter explicit results” setting enabled, Google still shows these images in the image search results.
I wonder if this has anything to do with the recent image search algorithm changes Google announced?
I don’t believe Google always returned these images for the search [hot], because “hot” has many different meanings.
Here is a screenshot of the results with the filter for explicit images turned on.
Forum discussion at Google Web Search Help.