Why You Should Never Share Your Financial Data With ChatGPT

No matter how much you may trust AI for its advice and research, experts say there's one thing you should never share with a chatbot: your financial data.

In a recent Money project, our staff graded ChatGPT and Gemini responses to 25 personal finance questions to evaluate how top models perform when asked about retirement, investing, credit and more.

We found that AI tools can produce helpful summaries and provide the key points to consider before committing to a money decision. But the bots are far from perfect: AI models often make factual errors, stumble when processing current events and oversimplify financial processes.

Crucially, they can also create fraud and identity theft risks. In our research, experts warned that consumers need to protect their privacy when using AI tools for personal finance help.

"Any time you're sharing your own personal, nonpublic information with a nonfinancial services provider that, frankly, isn't as regulated as closely — where information sharing practices aren't as governed as they are with a financial institution — there certainly is concern," Chris Powell, head of deposits at Citizens Bank, tells Money.

Fortunately, people are often able to get the advice they're looking for without sharing any sensitive information. General, first-person prompts like the ones we used in our test (for example, "I don’t have much savings. Can I rely on Social Security for my retirement if I lower my spending when I stop working?") can be effective.

Alternatively, users can try personalizing queries by giving ranges for lifestyle factors like salary, debts or investments to avoid sharing hard numbers, Powell says. Even ballpark figures can help an AI tool give you better responses.
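
For example, instead of typing "My salary is $87,500 and I have $12,340 in savings," a range-based prompt might read: "I earn between $80,000 and $90,000 a year, have about $10,000 to $15,000 saved and carry some credit card debt. How should I split extra money between saving and paying down debt?" The figures here are purely illustrative, but rounded ranges like these give a chatbot enough context to tailor its answer without exposing your exact finances.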

How giving AI sensitive financial data could lead to fraud

Thinking carefully about what you enter into AI is critical — even if it can be tempting to dump your entire financial situation into a ChatGPT chat.

"At the moment it is best not to put any confidential financial information into an LLM tool," Alastair Paterson, CEO and co-founder of Harmonic Security, writes in an email.

According to Harmonic Security's research, people frequently make risky moves with AI tools, such as uploading an employer's unpublished data to help write financial reports, without considering the downsides.

"Oversharing is happening," Paterson adds.

In a recent study, Harmonic Security found that 4.37% of AI prompts included "sensitive information." For file uploads, the share containing sensitive information was over 20%.

These trends have experts concerned. Ramayya Krishnan, professor of management science and information systems at Carnegie Mellon University, explains that because the standard versions of Gemini and ChatGPT store your conversation history, all of that information could be exposed if your AI account is compromised.

There's also the fact that a "subset of conversations are sampled and reviewed by OpenAI and Google employees for quality improvement," Krishnan tells Money. In a worst-case scenario, a bad actor could steal your identity or commit financial fraud with that information you casually typed in while seeking AI money advice.

Here's a non-exhaustive list of financial details experts say you should never plug into an AI model:

  • Bank account info (including account numbers)
  • Investment account info
  • Social Security numbers
  • Passwords and logins to financial accounts
  • Transaction details
  • Account balances
  • Paychecks and exact wage information
  • Sensitive tax info and tax documents

Additional steps to protect your data

There are a few extra steps you can take to protect yourself when using AI for financial advice.

"You mitigate these risks by not storing chat history, or by explicitly turning off sharing of your data for training purposes or using an enterprise-class version where this [is] standard," Krishnan explains.

The enterprise version of ChatGPT, a paid tier aimed at businesses, advertises higher-level data privacy features, including additional sign-on verification tools, data encryption and data retention controls, according to OpenAI's website. Enterprise chats are also not used to train OpenAI's models.

Last month, Fast Company reported that some ChatGPT conversations were appearing in Google search results. Thousands of users had exposed their conversations to the public by making their chats shareable, likely intending to share them with an individual or a small group rather than the entire internet, according to the report, though OpenAI argued that users had to opt in.

The company quickly addressed the problem and said it was working to remove the links from search engine results, but the incident brought greater focus to AI privacy issues and underscored the need to check on your settings.

For more information on the best privacy settings for ChatGPT, you can consult this guide.

More from Money:

Can You Trust AI for Financial Advice? We Put ChatGPT and Gemini to the Test

I Let AI and My 5-Year-Old Pick My Stocks. Who Did Better?

How to Choose a College Major in the Age of AI
