New website shows you how much Google AI can learn from your photos


Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020 he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even ones with no tie to the Pentagon project. “I don’t control any of the future outcomes that this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”

Mohandas, who taught himself programming and is based in Bengaluru, India, decided to build an alternative service for storing and sharing photos, one that is open source and end-to-end encrypted. Something “more private, wholesome, and trustworthy,” he says. The paid service he designed, Ente, is profitable, and the company says it has more than 100,000 users, many of whom already belong to the privacy-obsessed crowd. But Mohandas has struggled to articulate to wider audiences why they should reconsider relying on Google Photos, despite all the conveniences it offers.

Then one weekend in May, an intern at Ente came up with an idea: Give people a sense of what some of Google’s AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
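The article doesn't specify which Google Cloud model or prompt Theyseeyourphotos.com uses behind the scenes. As a rough illustration of the pattern it describes, here is a minimal sketch that sends a local photo to Google's Gemini API via the google-generativeai Python SDK and asks for a detailed three-paragraph description; the model name, prompt wording, and file path are assumptions for illustration, not Ente's actual implementation.

```python
# Minimal sketch: send a photo to a Google multimodal model and ask for a
# detailed description. Assumes the google-generativeai SDK and the Gemini
# API; the model name, prompt wording, and file path are illustrative only.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

model = genai.GenerativeModel("gemini-1.5-flash")
photo = Image.open("selfie.jpg")  # any local image

prompt = (
    "Describe this photo in three paragraphs. "
    "Document small details: objects, brands, clothing, location cues, "
    "and anything that can be inferred about the people shown."
)

response = model.generate_content([prompt, photo])
print(response.text)
```

The website presumably does something similar server-side, forwarding each uploaded image to the model along with its prompt and returning the generated description to the visitor.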

One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even documenting the specific watch model that his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to tweak the prompts to make it slightly more wholesome but still spooky,” Mohandas says. Ente started asking the model to produce short, objective outputs—nothing dark.
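Ente hasn't published the revised prompt; a hypothetical adjustment in the spirit of what Mohandas describes, dropping the speculation while keeping the unsettling level of detail, could be swapped into the sketch above in place of the original prompt.

```python
# Hypothetical revised prompt: shorter, objective output, no speculation
# about affiliations or intent (a drop-in replacement for the prompt above).
prompt = (
    "Describe this photo in three short, objective paragraphs. "
    "Note visible details such as objects, brands, clothing, and setting, "
    "but do not speculate about the subjects' affiliations, beliefs, or intentions."
)
```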


