Synthesis of published tips for using AI chatbots for health advice

Use of AI chatbots for health advice is no longer niche—about one-third of US adults report using them (Montero et al. 2026). While users are generally positive about their experiences, many also report receiving incorrect or confusing information. Emerging evidence suggests that how people interact with chatbots—not just the technology itself—plays a meaningful role in these shortcomings (Felt-Lisk 2026, Bean et al. 2026).

This post synthesizes and organizes advice from nine recently published tip sets into a practical framework for real-world use. Below I group the tips into four stages of the user journey where missteps are most likely and where better guidance might improve outcomes:

  • When to use (and not use) chatbots for health advice

  • Privacy and information-sharing decisions

  • Prompting for better answers

  • Validating and extending chatbot responses

The tip sets used here were published between September 2025 and April 2026. They are listed in the Sources section at the end of this post, with full-text links. Selecting which chatbot to use for health questions is also relevant, but the options are changing so rapidly that I will save it for a future post.

When to use (and not use) AI chatbots for health advice

Good AI chatbot uses—to help you:

  • Brainstorm lifestyle modifications

  • Prepare for appointments

  • Learn where to go for care

  • Decode medical jargon

  • Obtain general medical information (e.g. possible diagnoses or treatments)

  • Understand your care plan

  • Stay informed

  • Simplify or summarize complicated information

Do not use AI chatbots to:

  • Make a diagnosis on your own

  • Make treatment decisions on your own, or

  • Get advice on urgent or concerning symptoms (seek health care instead)

Privacy and information-sharing decisions

  • Use the most constrained, privacy-aware chatbot option available to you

    • New health-specific options like ChatGPT Health and Copilot Health are rolling out in 2026 and protect your health information more than the free or paid general-use chatbots

  • With general-use chatbots, be ready to share context and details, but don’t upload images or medical records that contain identifying information

    • If you do, the data you share are not protected by HIPAA, the law that protects your health information from inappropriate disclosure

Prompting for better answers

  • Give your chatbot a clear role

    • Such as, “you are a medical information assistant,” or “you are a health journalist,” or “you are a careful, experienced primary care doctor”

  • Ask a clear, specific question, and share relevant context: a summary of your personal characteristics (age, weight, sex), health status and history, medications, and details of the issue, within reason

  • Ask the question in an unbiased way, rather than letting it know what you think up front

    • Chatbots tend to be responsive to user framing and may reinforce leading assumptions (sycophancy)

  • Require it to use current, evidence-based sources and to cite a source for each fact

  • Tell it you want the response in clear, plain language

  • Ask it to note if there is uncertainty or disagreement among experts, or if there is something it can’t determine without a physical exam or test

Validating and extending AI chatbot responses

This stage matters because chatbots sometimes “hallucinate,” producing incorrect information that sounds plausible.

  • Get a second opinion from another AI chatbot

  • Check the sources it used (we’ve all heard stories of chatbots making up sources)

  • Invite more questions

    • For instance, tell it “Ask me any additional questions you need to reason safely”

  • Keep your doctor in the loop

  • Use internal critique

    • For instance, ask it to critique its own answer, or answer the same question with a different role (such as “careful, experienced specialist physician”), and then ask it to reconcile the two responses

Across these tip sets, two themes stand out:

  • User behavior greatly influences output quality. The same chatbot can produce different answers depending on how questions are framed and how follow-up is handled.

  • Chatbots are best used as complements, not substitutes, for clinical care. Their strongest role is in preparation, interpretation, and navigation, rather than decision-making.

A more comprehensive, rigorously developed resource, “The Health Chatbot Users’ Guide,” is expected in mid-2026 (Khair et al. 2026). Until then, the emerging guidance summarized here offers a practical starting point for safer, more effective use.

References

Felt-Lisk, S. Improving human-AI chatbot interactions on health should be a research and health leadership priority. Bluemont Health Consulting LLC Insights Brief: April 2026. https://www.bluemonthealth.com/insights/improving-human-ai-chatbot-interactions-in-health

Khair, D., Kale, A.U., Agbakoba, R., et al. Building The Health Chatbot Users’ Guide. Nature Health, February 2026. https://doi.org/10.1038/s44360-026-00074-5

Montero, A., Montalvo, H., Kearney, A., Valdes, I., Kirzinger, A., and Hamel, L. KFF Tracking Poll on Health Information and Trust: Use of AI For Health Information and Advice. KFF, March 25, 2026. https://www.kff.org/public-opinion/kff-tracking-poll-on-health-information-and-trust-use-of-ai-for-health-information-and-advice/

Sources

These sets of tips for AI chatbot use for health advice were reviewed for this post:

Bajaj, Simar. Asking ChatGPT for medical advice? Here’s how to do it safely. New York Times. October 30, 2025. https://www.nytimes.com/2025/10/30/well/chatgpt-health-questions.html

Cunningham, Eve. Patients are using AI for medical advice. Here’s how to do it safely. Forbes, February 24, 2026. https://www.forbes.com/sites/evecunningham/2026/02/24/patients-are-using-ai-for-medical-advice-heres-how-to-do-it-safely/

Dalton, Meg. Using an AI chatbot for health advice? Keep these tips in mind. YaleNews, February 12, 2026. https://news.yale.edu/2026/02/12/using-ai-chatbot-health-advice-keep-these-tips-mind

Haupt, Angela. 9 Doctor-approved ways to use ChatGPT for health advice. Time, October 2, 2025. https://time.com/7321821/chatgpt-ai-how-to-use-for-health-safely/

Perrone, Matthew. 5 things you should consider before asking an AI chatbot for health advice. PBS News. Associated Press: March 2, 2026. https://www.pbs.org/newshour/health/5-things-you-should-consider-before-asking-an-ai-chatbot-for-health-advice

Rampurwala, Mariya S. The dos and don’ts of using AI chatbots for health advice. Duly Health and Care, undated. https://www.dulyhealthandcare.com/health-topic/the-dos-and-donts-of-using-ai-chatbots-for-health-advice

Whelan, Luke. How (and how not) to use ChatGPT for health advice. University of Washington, Right as Rain by UW Medicine, September 2, 2025. https://rightasrain.uwmedicine.org/well/health/ai-chatbot-health-advice

Winter, David. Can you trust an AI doctor? How to use AI safely for health advice. Baylor Scott & White Health, April 2, 2026. https://www.bswhealth.com/blog/can-you-trust-an-ai-doctor

Young, Colleen. Can you trust AI for health advice? Mayo Clinic Connect, March 11, 2026. https://connect.mayoclinic.org/blog/about-connect/newsfeed-post/can-you-trust-ai-for-health-advice/
