How to Protect Client Data When Using AI

Jun 18, 2025 / By Sean Bailey, Horsesmouth Editor in Chief
AI for Advisors: Financial advisors can confidently use AI without jeopardizing client privacy—if they understand the differences between free and paid tools and take a few smart precautions. Here’s how to avoid common pitfalls and use AI responsibly while safeguarding client privacy.

While AI privacy doesn’t come up in every conversation I have with advisors, it’s a fair concern among professionals exploring AI. And for good reason.

Working with personal client information means any new tool, especially one as powerful as generative AI, deserves a clear-eyed look at how data is handled.

The key is to move beyond vague worry and toward practical understanding. What’s being stored? Who has access? And how can you configure your tools to align with your responsibilities? These are the questions that matter on the privacy front.

The answers start with understanding how different versions of ChatGPT (or other AI models) handle your data, and why some options are safer than others when it comes to client-related work.

The good news? You can use AI thoughtfully and effectively without compromising client confidentiality. It just requires a little awareness and some smart practices.

Why privacy matters

Privacy isn’t just a “nice to have” in our profession. It’s foundational. Clients choose you because they trust that you will safeguard their personal information. If you betray that trust, even unintentionally, you risk both regulatory consequences and reputational damage.

Using AI introduces a new layer of risk if you aren’t careful. That’s why understanding and managing that risk is so important.

If you enter sensitive, personally identifiable client information into a free AI tool, you might be handing that data over for model training without even realizing it. Even paid versions vary: not all guarantee that your data won’t be retained or used in some way. You have to check.

Understanding ChatGPT’s privacy levels

ChatGPT offers several subscription levels: Free, Plus, Team, and Enterprise. The privacy protections vary significantly across them.

The Free and Plus tiers are designed for individual use and include fewer safeguards. Conversations at these levels may be stored and used to improve OpenAI’s models.

Users can opt out of this data sharing in their settings (go to Settings > Data controls > Improve the model for everyone > Off). Double-check this on each device or browser you use. Note that your chat history remains stored until you delete it manually.

For professionals handling sensitive or client-related data, the Team and Enterprise tiers offer stronger protections. These business-level accounts come with default privacy settings that prevent your conversations from being used to train the AI, and they include better data controls and admin oversight.

5 strategies for protecting privacy

Here’s the good news. With a few straightforward practices, you can use AI effectively without risking your clients’ trust.

  1. Avoid free tiers: Don’t use the free tier of any AI service for anything client-related. For ChatGPT, choose Team or Enterprise to get a higher level of privacy.
  2. Don’t consent to model training: Whenever possible, use AI tools that offer enterprise-grade security and privacy commitments. Look for clear policies stating that your data is not used for training.
  3. Anonymize client information: Of course, never input real names, Social Security numbers, account numbers, or any identifying client details into an AI prompt. If you want to brainstorm solutions or draft ideas related to a client situation, fictionalize the names. For example, instead of “John Smith, age 63, retiring from IBM,” you might say “a client retiring soon from a large corporation.”
  4. Check provider data policies: Before using any AI tool professionally, take a few minutes to read their privacy policies. Specifically, look for:
    • Whether your data is stored
    • Whether your data is used for training
    • How long data is retained
    • Whether you can opt out of data retention or training
  5. Isolate sensitive activities: Use AI for tasks like brainstorming marketing ideas, creating educational materials, or drafting internal documents. If in doubt, treat AI interactions the way you would treat email: only send what you’d be comfortable having others see.
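For advisors whose firms build their own tooling, the anonymization step above can even be automated before a draft prompt ever leaves your machine. The sketch below is purely illustrative and is not a complete PII scrubber: the regex patterns, placeholder labels, and `scrub` function are my own simplified assumptions, and any real redaction workflow still needs broader coverage and human review.

```python
import re

# Illustrative sketch only: simplified patterns, not a complete PII scrubber.
# Real redaction also needs names, addresses, dates of birth, employers, etc.,
# plus human review before anything is sent to an AI tool.
PATTERNS = {
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # e.g. 123-45-6789
    "[ACCOUNT]": re.compile(r"\b\d{8,12}\b"),               # bare account numbers
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
}

def scrub(text: str, client_names=()) -> str:
    """Replace obvious identifiers with neutral placeholders."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    for name in client_names:  # names you already know appear in the note
        text = text.replace(name, "[CLIENT]")
    return text

note = "John Smith (SSN 123-45-6789, acct 4485712290) retires in June."
print(scrub(note, client_names=["John Smith"]))
# prints: [CLIENT] (SSN [SSN], acct [ACCOUNT]) retires in June.
```

Even with a helper like this, the article’s manual habit still applies: fictionalize details an automated pass can’t catch, such as a recognizable employer or job title.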

Privacy as a professional practice

When I started using AI in early 2023, I too was concerned about privacy. Was using AI “safe”? Would it somehow expose my data or chat history?

What I’ve learned, and what I believe many other advisors have realized, is that once you understand how to manage your privacy settings and prompt within a solid framework, privacy concerns can be addressed without limiting how you use AI. You’re free to take full advantage of AI’s creative and strategic capabilities without second-guessing yourself. And you do it with purpose, not fear.

Sean Bailey is editor in chief at Horsesmouth, where he has led editorial strategy for over 25 years. He is the co-author of Hack Proof Your Life Now! and has spent over 3,000 hours researching how AI can transform the way financial advisors work. Through his AI-Powered Financial Advisor and AI Marketing for Advisors programs, he helps advisors save time, deliver better client experiences, and market their services with unprecedented speed, quality, and confidence.

IMPORTANT NOTICE
This material is provided exclusively for use by Horsesmouth members and is subject to Horsesmouth Terms & Conditions and applicable copyright laws. Unauthorized use, reproduction or distribution of this material is a violation of federal law and punishable by civil and criminal penalty. This material is furnished “as is” without warranty of any kind. Its accuracy and completeness is not guaranteed and all warranties express or implied are hereby excluded.

© 2025 Horsesmouth, LLC. All Rights Reserved.