OpenAI Clarifies ChatGPT Policy on Tailored Legal and Medical Advice
OpenAI reaffirms that ChatGPT can offer general legal and medical information but, to limit liability, prohibits tailored advice without review by a licensed professional.
- On October 29, OpenAI updated its usage policies to bar tailored legal or medical advice without the involvement of a licensed professional, emphasizing human review for high-stakes decisions.
- ChatGPT is increasingly used for health questions, with about 1 in 6 people relying on it monthly, and some harms have been reported, including a psychiatric case described in the Annals of Internal Medicine.
- In practice, ChatGPT cannot diagnose users or provide in-depth personalized medical advice; it suggested general remedies for a head cold and urged calling 911 for facial immobility.
- The policy change could constrain OpenAI's healthcare push, since ChatGPT cannot replace clinicians for serious conditions requiring professional judgment.
Contrary to earlier reports, OpenAI did not add a ban on providing legal and medical advice to its ChatGPT platform.
Coverage Details
- Total News Sources: 38
- Leaning Left: 5, Center: 3, Leaning Right: 5
- Bias Distribution: 39% Left, 23% Center, 38% Right