Private changes

Diversity & Ethics

Manipulated media can be detected using AI.

Now, a New York-based startup is working to prevent photos and selfies uploaded to social media from being used by AI systems for facial recognition. The team’s software alters images, but the changes are imperceptible to the human eye.

The result? Photos posted online still look natural to users while being unrecognisable to AI systems, leaving algorithms unable to link them to a given person when compared against the faces in their database.
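The idea behind this kind of "cloaking" can be sketched in miniature: nudge every pixel by an amount too small to see, but in exactly the direction that moves the AI's internal representation of the face the most. The sketch below is a hypothetical toy, not the startup's actual software; the linear "embedding" stands in for a real face-recognition network, and the one-step perturbation is a simplified, FGSM-style approximation of how such tools work.

```python
# Toy sketch of image "cloaking": tiny, bounded pixel changes that are
# invisible to people but noticeably shift an AI model's face embedding.
# The linear embed() below is a hypothetical stand-in for a real network.

EPSILON = 2 / 255  # maximum per-pixel change, imperceptible to the eye


def embed(pixels, weights):
    """Toy 1-D 'face embedding': a weighted sum of pixel values."""
    return sum(p * w for p, w in zip(pixels, weights))


def cloak(pixels, weights, eps=EPSILON):
    """Nudge each pixel by at most eps in the direction that moves the
    embedding furthest (a one-step, FGSM-style perturbation)."""
    return [min(1.0, max(0.0, p + eps * (1 if w > 0 else -1)))
            for p, w in zip(pixels, weights)]


# A tiny 8-pixel "image" (values in [0, 1]) and stand-in model weights.
image = [0.2, 0.5, 0.8, 0.3, 0.6, 0.1, 0.9, 0.4]
weights = [3.0, -2.0, 4.0, -1.0, 2.5, -3.5, 1.5, -2.5]

cloaked = cloak(image, weights)

pixel_shift = max(abs(a - b) for a, b in zip(image, cloaked))
embedding_shift = abs(embed(cloaked, weights) - embed(image, weights))

print(f"max pixel change: {pixel_shift:.4f}")      # tiny (~0.0078)
print(f"embedding change: {embedding_shift:.4f}")  # far larger than any pixel change
```

In this toy, no pixel moves by more than about 0.8% of its range, yet the embedding shifts by roughly twenty times that amount, which is the asymmetry such cloaking systems exploit.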

In contrast to the laissez-faire privacy laws of many countries, the startup commits to upholding privacy and does not store original images.

A research team created a similar system to protect musicians’ intellectual property online.
