Google apes Apple, swears cloud-based AI will keep your info private
(2025/11/12)
- Reference: 1762980684
- News link: https://www.theregister.co.uk/2025/11/12/google_touts_private_ai_compute/
- Source link:
Google, perhaps not the first name you'd associate with privacy, has taken a page from Apple's playbook and now claims that its cloud AI services will safeguard sensitive personal data handled by its Gemini model family.
The Chocolate Factory has announced [1]Private AI Compute, which is designed to extend the trust commitments embodied by Android's on-device [2]Private Compute Core to services running in Google datacenters. It's conceptually and architecturally similar to Private Cloud Compute from Apple, which has historically used privacy as a big selling point for its devices and services, unlike Google, which is fairly open about collecting user data to serve more relevant information and advertisements.
"Private AI Compute is a secure, fortified space for processing your data that keeps your data isolated and private to you," said Jay Yagnik, VP of AI innovation and research, in a [3]blog post . "It processes the same type of sensitive information you might expect to be processed on-device."
Since the generative AI boom began, experts have advised [5]keeping sensitive data away from large language models, for fear that such data may be incorporated into them during the training process. Threat scenarios have since expanded as models have been granted varying degrees of agency and access to other software tools. Now, providers are trying to convince consumers to share personal info with AI agents so that the agents can take actions that require credentials and payment information.
Without greater privacy and security assurances, the agentic pipe dreams promoted by AI vendors look unlikely to take shape. Among the 39 percent of Americans who haven't adopted AI, 71 percent cite data privacy as a reason why, according to [8]a recent Menlo Ventures survey.
The paranoids have reason to be concerned. According to [9]a recent Stanford study, six major AI companies – Amazon (Nova), Anthropic (Claude), Google (Gemini), Meta (Meta AI), Microsoft (Copilot), and OpenAI (ChatGPT) – "appear to employ their users' chat data to train and improve their models by default," and some retain that data indefinitely.
If every AI prompt could be handled by an on-device model that didn't phone home with user data, many of the privacy and security concerns would be moot. But so far, the consensus appears to be that frontier AI models must run in the cloud. So model vendors have to allay concerns about insiders harvesting sensitive stuff from the tokens flowing between the street and the data center.
Google's solution, Private AI Compute, is similar to [11]Apple's Private Cloud Compute in that both data isolation schemes rely on Trusted Execution Environments (TEEs) or Secure Enclaves. These notionally confidential computing mechanisms encrypt and isolate memory and processing from the host.
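To make the idea concrete, here's a minimal sketch of the client side of such a scheme, built from generic primitives (X25519 key agreement, HKDF, AES-GCM) in the pyca/cryptography library: the device checks an attestation over the enclave's public key, then encrypts the prompt so that only code inside the attested enclave can read it. This illustrates the general technique, not Google's or Apple's actual wire protocol, and the verify_attestation stub is hypothetical.

# Sketch: seal a prompt to an attested enclave key. Generic primitives
# only -- NOT Google's or Apple's actual protocol. Needs pyca/cryptography.
import os
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey,
)
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def verify_attestation(attestation_doc: bytes, enclave_pub: bytes) -> bool:
    # Hypothetical stub: a real client checks a vendor-signed statement
    # binding enclave_pub to a known-good code measurement.
    raise NotImplementedError("attestation checking is vendor-specific")

def seal_prompt(prompt: bytes, enclave_pub: bytes) -> tuple[bytes, bytes, bytes]:
    # Ephemeral client key; Diffie-Hellman against the enclave's attested key.
    eph = X25519PrivateKey.generate()
    shared = eph.exchange(X25519PublicKey.from_public_bytes(enclave_pub))
    # Derive a one-off AES-256-GCM key; only the holder of the enclave's
    # private key can recompute it.
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"enclave-session-v1").derive(shared)
    nonce = os.urandom(12)
    eph_pub = eph.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return eph_pub, nonce, AESGCM(key).encrypt(nonce, prompt, None)

The point of the attestation step is that the client refuses to send anything unless the key it encrypts to is provably held by audited enclave code rather than by an operator-controlled process.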
For AI workloads on its Tensor Processing Unit (TPU) hardware, Google calls its computational safe room Titanium Intelligence Enclave (TIE). For CPU workloads, Private AI Compute relies on AMD's Secure Encrypted Virtualization – Secure Nested Paging ([16]SEV-SNP), a secure computing environment for virtual machines.
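As a rough illustration of what SEV-SNP support looks like from software, mainline Linux advertises the relevant CPU features in /proc/cpuinfo. The flag names (sev, sev_es, sev_snp) come from the kernel, and the exact set visible depends on kernel version and whether you're on the host or in a guest; nothing in this sketch is Google-specific.

# Rough check for AMD SEV-family memory encryption support on Linux.
from pathlib import Path

def sev_cpu_flags() -> set[str]:
    # Collect any sev* feature flags advertised in /proc/cpuinfo.
    flags: set[str] = set()
    for line in Path("/proc/cpuinfo").read_text().splitlines():
        if line.startswith("flags"):
            flags.update(f for f in line.split() if f.startswith("sev"))
    return flags

if __name__ == "__main__":
    # On an SEV-SNP-capable EPYC host you'd expect something like
    # {"sev", "sev_es", "sev_snp"}; an empty set means no SEV support visible.
    print(sorted(sev_cpu_flags()) or "no SEV flags found")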
Where Private AI Compute jobs require analytics, Google claims that it relies on [17]confidential federated analytics, "to ensure that only anonymous statistics (e.g. differentially private aggregates) are visible to Google."
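For a sense of what a "differentially private aggregate" is, the textbook version adds calibrated noise to a statistic before anyone sees it. The sketch below applies the standard Laplace mechanism to a simple count; it illustrates the general idea only, not Google's confidential federated analytics pipeline, and the epsilon value is arbitrary.

# Textbook Laplace mechanism over a count -- illustrates differentially
# private aggregates in general, not Google's actual pipeline.
import random

def dp_count(bits: list[bool], epsilon: float) -> float:
    # Adding or removing one user changes the count by at most 1
    # (sensitivity 1), so Laplace(0, 1/epsilon) noise gives
    # epsilon-differential privacy. The difference of two independent
    # Exponential(epsilon) draws is distributed as Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return sum(bits) + noise

# Example: the server only ever sees the noisy aggregate, never raw answers.
print(dp_count([True, False, True, True], epsilon=1.0))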
And the system incorporates various defenses against insiders, Google claims. Data is processed during inference requests in protected environments and then discarded when the user's session ends. There's no administrative access to user data and no shell access on hardened TPUs.
As a first step toward making its claims verifiable, Google has [19]published [PDF] cryptographic digests (e.g. SHA2-256) of the application binaries used by Private AI Compute servers. Looking ahead, Google plans to let experts inspect its remote attestation data, to undergo further third-party audits, and to expand its Vulnerability Rewards Program to cover Private AI Compute.
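Verifying a binary against such a published ledger entry amounts to a digest comparison; here's a minimal sketch, where the file path and expected digest are placeholders and "SHA2-256" is the ordinary SHA-256 from Python's hashlib.

# Minimal transparency check: does a local binary match a published digest?
# The path and digest below are placeholders, not real ledger values.
import hashlib

def file_sha256(path: str) -> str:
    # Stream the file through SHA-256 so large binaries need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

published = "0" * 64  # placeholder: copy the real value from the ledger
assert file_sha256("/path/to/server_binary") == published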
That may attract more interest from security researchers, some of whom recently [20]found flaws in AMD SEV-SNP and other trusted computing schemes.
Kaveh Razavi, assistant professor in the Department of Information Technology and Electrical Engineering at ETH Zürich, told The Register in an email that, while he's not an expert on privacy-preserving analytics, he's familiar with TEEs.
"There have been attacks in the past to leak information from SEV-SNP for a remote attacker and compromise the TEE directly for an attacker with physical access (e.g., Google itself)," he said. "So while SEV-SNP raises the bar, there are definitely ways around it."
As for the hardened TPU platform, that looks more opaque, Razavi said.
"They say things like there is no shell access and the security of the TPU platform itself has definitely been less scrutinized (at least publicly) compared to a TEE like SEV-SNP," he said. "Now in terms of what it means for user data privacy, it is a bit hard for me to say since it is unclear how much user data actually goes to these nodes (except maybe the prompt, but maybe they also create user-specific layers, but I do not really know)."
He added, "Google seems to be a bit more open about their security architecture compared to other AI-serving cloud companies as far as this whitepaper goes, and while not perfect, I see this (partial) openness as a good thing."
[21]An audit conducted by NCC Group concludes that Private AI Compute mostly keeps AI session data safe from everyone except Google.
"Although the overall system relies upon proprietary hardware and is centralized on Borg Prime, NCC Group considers that Google has robustly limited the risk of user data being exposed to unexpected processing or outsiders, unless Google, as a whole organization, decides to do so," the security firm's audit concludes. ®
[1] https://blog.google/technology/ai/google-private-ai-compute/
[2] https://security.googleblog.com/2022/12/trust-in-transparency-private-compute.html
[3] https://blog.google/technology/ai/google-private-ai-compute/
[5] https://stackoverflow.blog/2023/10/23/privacy-in-the-age-of-generative-ai/
[8] https://menlovc.com/perspective/2025-the-state-of-consumer-ai/
[9] https://hai.stanford.edu/news/be-careful-what-you-tell-your-ai-chatbot
[11] https://security.apple.com/blog/private-cloud-compute/
[16] https://docs.amd.com/v/u/en-US/SEV-SNP-strengthening-vm-isolation-with-integrity-protection-and-more
[17] https://research.google/blog/discovering-new-words-with-confidential-federated-analytics/
[19] https://www.gstatic.com/published_ledger/ledger_instructions.pdf
[20] https://tee.fail/
[21] https://www.nccgroup.com/research-blog/public-report-google-private-ai-compute-review/