On Wednesday, OpenAI announced a new feature in ChatGPT, allowing users to connect medical records and wellness data, raising concerns among some experts and advocacy groups over the use of personal data.
The San Francisco-based AI giant said the tool, dubbed ChatGPT Health and developed with physicians, is designed to support care rather than diagnose or treat ailments. The company is positioning it as a way to help users better understand their health.
For many users, ChatGPT has already become the go-to platform for questions about medical care and mental health.
OpenAI told Decrypt that ChatGPT Health only shares general, “factual health information” and does not provide “personalized or unsafe medical advice.”
For higher-risk questions, it will provide high-level information, flag potential risks, and encourage people to talk with a pharmacist or healthcare provider who knows their specific situation.
The move follows shortly after the company reported in October that more than 1 million users discuss suicide with the chatbot each week. That amounted to roughly 0.15% of all ChatGPT users at the time.
While those figures represent a relatively small share of the overall user population, the company will still need to address security and data privacy concerns, experts say.
“Even when companies claim to have privacy safeguards, consumers often lack meaningful consent, transparency, or control over how their data is used, retained, or repurposed,” Public Citizen’s big-tech accountability advocate J.B. Branch told Decrypt. “Health data is uniquely sensitive, and without clear legal limits and enforceable oversight, self-policed safeguards are simply not enough to protect people from misuse, re-identification, or downstream harm.”
OpenAI said in its statement that health data in ChatGPT Health is encrypted by default, stored separately from other chats, and not used to train its foundation models.
According to Center for Democracy and Technology senior policy counsel Andrew Crawford, many users mistakenly assume health data is protected based on its sensitivity, rather than on who holds it.
“When your health data is held by your doctor or your insurance company, the HIPAA privacy rules apply,” Crawford told Decrypt. “The same is not true for non-HIPAA-covered entities, like developers of health apps, wearable health trackers, or AI companies.”
Crawford said the launch of ChatGPT Health also underscores how the burden of responsibility falls on consumers in the absence of a comprehensive federal privacy law governing health data held by technology companies.
“It’s unfortunate that our current federal laws and regulations place that burden on individual consumers to analyze whether they’re comfortable with how the technology they use every day handles and shares their data,” he said.
OpenAI said ChatGPT Health will roll out first to a small group of users.
The waitlist is open to ChatGPT users outside the European Union and the UK, with broader access planned in the coming weeks on web and iOS. OpenAI’s announcement did not mention Google or Android devices.
The FSNN News Room is the voice of our in-house journalists, editors, and researchers. We deliver timely, unbiased reporting at the crossroads of finance, cryptocurrency, and global politics, providing clear, fact-driven analysis free from agendas.
