OpenAI Bans ChatGPT from Giving Medical, Legal, and Financial Advice
OpenAI has officially restricted ChatGPT from providing specific medical, legal, or financial guidance, redefining the AI as an educational tool rather than a consultant.
Key Changes Effective October 29
According to NEXTA, as of October 29, ChatGPT will no longer offer:
- Specific medication names or dosage information
- Lawsuit templates, court strategies, or specific legal actions
- Investment tips or buy/sell suggestions
The chatbot is now limited to explaining general principles, outlining mechanisms, and directing users to consult qualified professionals.
Real-World Incidents Prompt Policy Change
The restrictions follow several concerning cases where users relied on ChatGPT for critical advice:
In August, a 60-year-old man was hospitalized for three weeks after replacing table salt with sodium bromide based on ChatGPT’s suggestion. A case report in the Annals of Internal Medicine documented that the man, who had no psychiatric history, developed paranoia and hallucinations and required an involuntary psychiatric hold.
In September, Warren Tierney, a 37-year-old from Ireland, consulted ChatGPT about swallowing difficulties. The AI told him cancer was “highly unlikely,” leading him to delay seeing a doctor. He was later diagnosed with stage-four esophageal cancer.
“I think it ended up really being a real problem, because ChatGPT probably delayed me getting serious attention,” Tierney told the Mirror. “It sounded great and had all these great ideas. But ultimately I take full ownership of what has happened.”