The most alarming aspect of your conversations with Gemini is that Google can keep “a subset of conversations” and store them on its cloud servers. Now, there is both good and bad news here. The bad news is that Google can save that data for up to three years. That’s a big shift, and it exposes a fundamental tension with generative AI chatbots: they are far more hungry for training data than a regular voice assistant like Siri or Alexa. Unlike Gemini’s three-year retention window, there is no such policy in place for the good ol’ Google Assistant. “By default, we don’t retain your audio recordings on Google servers,” Google notes on Google Assistant’s data disclosure safety page.
More bad news is that we don’t know exactly what this subset of data is. The only saving grace here is that Google uses an automated system to remove personally identifiable information such as email addresses and phone numbers. However, these conversations can still be seen by human reviewers. In a nutshell, you don’t want to disclose any personal information in your Gemini conversations, or include details you wouldn’t want another person to read.
Another sigh of relief is that all your Gemini conversation data is stored in a separate data container, detached from the Google account dashboard that is home to all the data collected from your use of mainstream Google services such as Maps and Workspace, among others. Specifically, your Gemini conversations and all the data gleaned from them are stored in the Gemini Apps Activity Center. Google says it only picks random excerpts from conversations for human review, and that only a small portion of Gemini chats are reviewed.