@enkiv2 « [Google] also told the Verge that “its machine learning detects what objects are in the frame, and the camera is smart enough to know what color they are supposed to have.” Consider how different that is from a normal photograph. Google’s camera is not capturing what is, but what, statistically, is likely. »

Great, more bias from machine learning, polluting the photographic record. D-:

@ultimape @enkiv2 Not the same phenomenon, but also, it doesn't matter. Cameras aren't supposed to do *that*, either. :-)

(Also, those strawberries look gray to me.)

@varx @enkiv2 ah, I assumed Google was talking about the auto white-balancing they're doing to adjust for low light and color temperature. They discuss it a bit on their blog: ai.googleblog.com/2018/11/nigh

They specifically refer to the color constancy phenomenon in humans (that seems to be the source of the color tint on the strawberries).
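(For anyone curious what a non-ML take on color constancy looks like: the classic "gray-world" heuristic assumes the scene's average color should be gray and rescales each channel to match. This is just a toy sketch for contrast with the learning-based approach the blog describes, not Google's actual algorithm.)

```python
def gray_world_balance(pixels):
    """Gray-world white balance over a list of (r, g, b) tuples, 0..255.

    Assumes the average scene color is neutral gray, then scales each
    channel so its mean matches the overall mean.
    """
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3  # target: all three channel means equal
    gains = [gray / m if m else 1.0 for m in means]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3))
            for p in pixels]

# A warm (reddish) image gets pulled back toward neutral:
warm = [(200, 120, 80), (180, 100, 60), (220, 140, 100)]
balanced = gray_world_balance(warm)
```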

@ultimape @enkiv2 Oh, it's not the white-balancing that bothers me (cameras have done that for like forever), it's the imposition of meaning via machine learning. They don't just use it for white-balance.

What's more troubling to me about the current state of affairs is how they try to "beautify" faces, which can't do good things for people's self-image and perceptions of the world.

@varx @enkiv2 Oh yeah, that is really annoying. We're basically giving cameras a form of apophenia. It's one thing to try to model a person's attention via gaze tracking and apply that to a photograph, and another entirely to make assumptions about what I find interesting in a photo and adapt it toward that bias.

I should see if someone has an iPhone and try to get it to find smiling faces in some bird poop.

Eldritch Café

An instance meant to be welcoming to queer, feminist, and anarchist people as well as their sympathizers. We are mostly French-speaking, but you are welcome whatever your language.