Microsoft announced on Thursday that it's launching Copilot Health, a "separate, safe space" in Copilot for asking questions about lab results and medical records, searching for healthcare providers, analyzing data from wearables, and other health-related chats. The feature will have a phased rollout, so it won't be available to everyone immediately, but users can join a waitlist to get access.
Microsoft says Copilot Health "doesn't replace your doctor" and isn't intended to provide medical diagnoses or treatment, but rather to help users understand their health information. Users can import medical records from over 50,000 US hospitals and healthcare organizations via HealthEx, and import lab test results via Function. Copilot Health is also compatible with "over 50 wearable devices," including those from Apple, Oura, and Fitbit. The Copilot Health homepage can show data from wearables, like current step count, as well as reminders for upcoming appointments, depending on what data users choose to share.
Users can also find medical professionals through Copilot Health. It's connected to "real-time US provider directories" that can help users search for providers based on specialty, location, languages spoken, and insurance plans accepted.
In its press release on Copilot Health, Microsoft states that it has "improved the quality and reliability of answers by elevating information from credible health organizations across 50 countries." It also says responses in Copilot Health will include citations with links to sources and "expert-written answer cards from Harvard Health."
According to Microsoft, users' chats in Copilot Health "are isolated from general Copilot and kept under additional access, privacy, and safety controls." It also claims that data from Copilot Health chats isn't used to train its AI models. Users are also able to delete their health data or disconnect data sources at any time, such as toggling off access to wearable data.
OpenAI launched a very similar feature in January called ChatGPT Health, which also offers an isolated, sandboxed environment for medical chats, encourages users to connect their medical records, and doesn't use health chats for model training. However, Microsoft doesn't currently have a HIPAA-compliant version of Copilot Health, unlike ChatGPT for Healthcare and Amazon's Health AI, which was opened up to more users on Tuesday. Anthropic's Claude for Healthcare is similarly "HIPAA-ready."
When asked about HIPAA compliance in a briefing ahead of Thursday's announcement, Dr. Dominic King, VP of health at Microsoft AI, stated: "HIPAA is not required for a direct-consumer experience like this when you're using your own data." The Health Insurance Portability and Accountability Act includes security requirements for protecting patients' electronic health information and prohibits certain kinds of its use and disclosure. Violators of HIPAA can face fines and potentially even a prison sentence. Since companies like Microsoft aren't legally required to be HIPAA compliant, they're not subject to the consequences that a hospital or doctor might face for violating a patient's HIPAA rights. King added, "However, at Copilot, we think it's incredibly important that we're meeting all the best standards out there. So, we will be announcing some updates here on our status in terms of what are called 'HIPAA controls.'" King didn't elaborate on what exactly that entails.
King also noted that Copilot Health has an ISO 42001 certification. ISO 42001 is an independent, international standard for AI systems that's intended to promote "responsible use of AI" as well as "traceability, transparency, and reliability." Microsoft 365 Copilot and Microsoft 365 Copilot Chat also have this certification.
However, even with that certification and any future intentions for voluntary HIPAA compliance, users may still want to be cautious about sharing their medical data with an AI. As experts have pointed out, AI companies can change their data privacy policies at any time. AI also has a history of giving users inaccurate or unsafe medical advice, and has an especially concerning track record when it comes to mental health.