
For Rest of World, Victoria Turk breaks down bias in generative AI in the context of national identity.

Bias in AI image generators is a tough problem to fix. After all, the uniformity in their output is largely down to the fundamental way in which these tools work. The AI systems look for patterns in the data on which they’re trained, often discarding outliers in favor of producing a result that stays closer to dominant trends. They’re designed to mimic what has come before, not create diversity.

“These models are purely associative machines,” Pruthi said. He gave the example of a football: An AI system may learn to associate footballs with a green field, and so produce images of footballs on grass.
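Pruthi's football example reduces to a toy sketch. This is a hypothetical illustration, with made-up counts and plain frequency tallying standing in for an actual image model: once the training data skews toward one context, argmax-style generation returns that context every single time, and the outliers vanish.

```python
from collections import Counter
import random

# Toy "training data": contexts in which footballs appear.
# The skew toward grass mirrors the skew in real image datasets.
training_contexts = ["grass"] * 90 + ["street"] * 7 + ["beach"] * 3

counts = Counter(training_contexts)

def generate_background(greedy: bool = True) -> str:
    """Pick a background for a 'football' image.

    Greedy (argmax) generation always returns the dominant context,
    which is how outliers get discarded; proportional sampling at
    least keeps the tail alive about 10% of the time.
    """
    if greedy:
        return counts.most_common(1)[0][0]
    return random.choices(list(counts), weights=list(counts.values()))[0]

print(generate_background())              # 'grass', every time
print(generate_background(greedy=False))  # 'grass' ~90% of the time
```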

Between this convergence to stereotypes and the forced diversity of Google’s Gemini, has anyone tried coupling models with demographic data to find a middle ground?
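A minimal sketch of what that coupling might look like: sample an attribute from a real-world distribution on each request and fold it into the prompt, so a batch of generations tracks measured frequencies rather than either the dataset’s mode or a hard-coded quota. The attribute names and shares below are invented for illustration; in practice they would come from census data for the population being depicted.

```python
import random

# Hypothetical attribute shares; real values would come from census
# data for the population named in the prompt. Numbers are invented.
DEMOGRAPHICS = {
    "an elderly woman": 0.18,
    "a middle-aged man": 0.30,
    "a young woman": 0.26,
    "a young man": 0.26,
}

def diversify_prompt(base_prompt: str, distribution: dict[str, float]) -> str:
    """Sample one attribute in proportion to its real-world share,
    so repeated generations approximate the measured distribution
    instead of collapsing to the dominant stereotype."""
    attrs = list(distribution)
    weights = list(distribution.values())
    chosen = random.choices(attrs, weights=weights)[0]
    return f"{base_prompt} of {chosen}"

# Over a batch, outputs track the distribution rather than one look.
for _ in range(5):
    print(diversify_prompt("a photo", DEMOGRAPHICS))
```

The difference from a fixed diversity rule is that the weights are tied to the population actually being depicted, rather than applied uniformly to every prompt.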
