Google’s Gemini AI Keeps Your Conversations for Up to 3 Years (Even If You Delete Them)

Have you got a secret you don’t want anyone to know? Don’t tell any of humanity’s fancy new AI-powered assistants, because the companies behind these new tools are probably keeping your data a lot longer than you think.

Google’s Gemini, the AI assistant formerly known as Bard, has received rave reviews, with many people hailing it as head and shoulders above OpenAI’s ChatGPT. But if you plan on using Gemini, it might be a good idea to give the privacy policy a quick read-through.

Not only does Google explicitly warn users not to give Gemini any sensitive information they wouldn’t want a human reviewer to read, but Google is also retaining many of your questions to help make its tools better. In fact, everything you tell Gemini might be kept by the company for up to three years—even if you delete all your information from the app.

Google uses human reviewers to look at how Gemini is being used and make any changes necessary when things don’t go as planned. But keeping that information available for Google workers to look over means the tech giant is holding on to vast quantities of user data.

“Conversations that have been reviewed or annotated by human reviewers (and related data like your language, device type, location info, or feedback) are not deleted when you delete your Gemini Apps activity because they are kept separately and are not connected to your Google Account. Instead, they are retained for up to three years,” Google explains in the latest version of Gemini’s privacy policy.

The new privacy notice for Gemini, first flagged by ZDNet Tuesday, tries to reassure users that data is anonymized to some extent.

“We take your privacy seriously, and we do not sell your personal information to anyone. To help Gemini improve while protecting your privacy, we select a subset of conversations and use automated tools to help remove user-identifying information (such as email addresses and phone numbers),” Google’s privacy policy explains.

But just as there’s a lot you can learn from mere metadata, stripping someone’s name out of the information available to a human reviewer doesn’t mean it’s anonymous. And Google knows that.

“Please don’t enter confidential information in your conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies,” Google warns.

Google didn’t immediately respond to questions emailed Tuesday afternoon about whether users can see if something they wrote in Gemini has been reviewed by a human. We’ll update this post if we hear back.

Again, don’t give Gemini or any AI-powered tool information you wouldn’t want another human to read. It should go without saying, but you should especially avoid typing anything into Gemini you wouldn’t want to hear read out in a court of law. It’s standard practice for prosecutors to obtain search histories and other electronic records for anyone accused of serious crimes. As just one recent example, investigators in Minnesota obtained the search history of a woman who allegedly hit an Amish buggy with her SUV, killing two kids. Her most damning search? “What happens if you get in an accident with an Amish buggy and kill two people?”
