OPENAI
No, ChatGPT didn’t ban legal or medical talk
OpenAI says ChatGPT’s behaviour has not changed after online posts claimed it would no longer give legal or medical information.
Karan Singhal, OpenAI’s head of health AI, said the claims are incorrect.
He added that ChatGPT has never been a replacement for real lawyers or doctors, but can still help people understand these topics.
The confusion came after an October policy update that listed activities requiring a licensed professional, such as personalised legal or medical advice.
Quick facts
Claims about ChatGPT “banning” legal/medical guidance are false
Rules about expert involvement existed before
OpenAI simply merged its policies into one document
Panic for what?
OpenAI says this language is not new. Previous rules already required expert involvement for tailored advice in legal, health, or financial areas.
The only real update is that OpenAI combined several separate policy documents into one shared version. The rules themselves stay the same.
In short: ChatGPT can still explain legal and medical topics, but personalised advice must involve a qualified professional.
ICYMI: Policies didn’t change; people just can’t read. - MG