The Risks and Benefits to Healthcare Providers Using ChatGPT to Support Billing Claims


On November 30, 2022, OpenAI launched ChatGPT and immediately set off a firestorm. An AI chatbot that can write stories, plays, poems, and essays;1 pass licensing and other exams;2 produce code;3 engage in conversations;4 and even write judicial opinions,5 ChatGPT had more than one million users in its first five days and over 100 million users in January, earning it the label of “the fastest-growing ‘app’ of all time.”6

Inevitably, technological innovation begets further innovation. Within days of its launch, a video demonstrating how ChatGPT could be used to write an insurance appeal letter appeared on TikTok and had been viewed more than 143,000 times by early January.7 On February 10, 2023, Doximity, a digital platform for medical professionals, released a beta version8 of ChatGPT for doctors called DocsGPT. Using the free open beta site, doctors who are Doximity subscribers can access a library of medical prompts to produce a range of documents, including insurance appeals, prior authorizations, progress notes, post-op instructions, procedure notes, imaging reports, referrals, certificates of medical necessity, disability letters, letters of recommendation, and even letters to excuse patients from jury duty.
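
Doximity has not published DocsGPT’s implementation, but the mechanics of such a tool are straightforward to approximate: a stored prompt template is filled in and sent to a general-purpose chat model. The short Python sketch below illustrates that pattern against OpenAI’s public chat completions API; the prompt text, function name, and model choice are illustrative assumptions, not Doximity’s actual code.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # A hypothetical entry from a library of medical prompts (illustrative only).
    APPEAL_PROMPT = (
        "Draft a concise insurance appeal letter for a denied service. "
        "Service: {service}. Diagnosis: {diagnosis}. "
        "Argue medical necessity in general terms; do not invent clinical facts."
    )

    def draft_appeal_letter(service: str, diagnosis: str) -> str:
        """Return an AI-drafted appeal letter. The result is a draft only and
        must be reviewed and edited for accuracy before it is sent."""
        # Note: no patient identifiers are sent here. Entering protected health
        # information into a third-party API raises the HIPAA concerns
        # discussed later in this article.
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "You draft administrative letters for physicians."},
                {"role": "user", "content": APPEAL_PROMPT.format(service=service, diagnosis=diagnosis)},
            ],
        )
        return response.choices[0].message.content

    print(draft_appeal_letter("MRI of the lumbar spine", "chronic low back pain"))

As the sketch suggests, the model drafts the letter from a template and whatever clinical facts the user supplies; nothing in the pipeline independently verifies those facts, which is precisely why the review warnings discussed below matter.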

Since its release, DocsGPT has been hailed for its potential to relieve the administrative burdens on healthcare providers, reportedly saving as much as two hours a day that can be redirected to direct patient care.9 With administrative costs accounting for 15% to 25% of total national healthcare expenditures,10 the industry-wide benefits of that saved time are obvious. Increased productivity, greater efficiency, and reduced costs, especially when they may contribute to improved care, offer great promise to a healthcare system under increasing strain.

However, as Doximity continues to fine-tune the product with input from its physician subscribers, healthcare providers will need to weigh the risks, as well as the benefits, of using AI to record and document the care they provide to their patients. The website cautions users: “Since the letter content is AI-generated, please make sure to review and ensure accuracy before you submit,” and each response that DocsGPT generates to a medical prompt comes with a reminder to “PLEASE EDIT FOR ACCURACY BEFORE SENDING.” These warnings are especially pertinent given that AI-generated responses may be used to justify services (referrals, prior authorizations, certificates of medical necessity); support claims for payment (progress notes, procedure notes); and affect care (post-op instructions, imaging reports). With healthcare still the leading source of False Claims Act (“FCA”) cases, providers who use AI to generate the information supporting their billing claims should consider these warnings and their implications carefully, especially because the very features that make the program attractive as a timesaver also make it easy to simply accept whatever it generates.

Providers planning to use Doximity — or other AI apps — to document patient care, especially the services provided, should consider the following:

First, even as these apps improve with use and incorporate feedback from their users, chatbots can still produce false or entirely fabricated information. When AI creates something that looks convincing but is not real, it is called a “hallucination.” A healthcare provider who, inadvertently or otherwise, relies on a false or inaccurate AI-generated report or record to support a claim for payment is nonetheless submitting a false and possibly fraudulent claim. Such claims could be subject to denial or recoupment on post-payment review and could even lead to liability under the FCA if the false claims are deemed to have been submitted “knowingly” (for example, with reckless disregard of the risk of inaccuracy in AI-generated reports or records).

Second, similar prompts, such as those for patients of similar age with similar conditions, are likely to generate responses with similar or identical language, which in turn is likely to draw scrutiny from auditors and investigators. How claims reviewers will treat multiple claims containing the same or similar language remains to be seen, but they will very likely give such claims less deference; one way providers might screen for this themselves is sketched after this list.

Third, chatbots, including DocsGPT, are trained on available data, which may be outdated. Thus, in addition to reviewing responses for accuracy, providers will need to verify that each response reflects the best and most current information.

Fourth, notwithstanding the best possible protections, there is still a risk that protected health information will be entered into the system, potentially raising significant HIPAA and privacy concerns.

Fifth, those privacy risks are compounded where information is stored in a system vulnerable to identity theft, ransomware, and other cyber-attacks.
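
On the second point above, duplicate language is something providers can screen for before claims go out. The following minimal sketch, using only Python’s standard library, flags pairs of AI-drafted notes whose text is nearly identical; the 0.90 threshold is an illustrative assumption, not a regulatory standard.

    from difflib import SequenceMatcher
    from itertools import combinations

    SIMILARITY_THRESHOLD = 0.90  # illustrative cutoff, not a regulatory standard

    def flag_near_duplicates(notes: dict[str, str]) -> list[tuple[str, str, float]]:
        """Return pairs of note IDs whose text similarity meets the threshold."""
        flagged = []
        for (id_a, text_a), (id_b, text_b) in combinations(notes.items(), 2):
            ratio = SequenceMatcher(None, text_a, text_b).ratio()
            if ratio >= SIMILARITY_THRESHOLD:
                flagged.append((id_a, id_b, round(ratio, 3)))
        return flagged

    # Example: two near-identical AI-drafted notes for different patients.
    notes = {
        "claim-001": "Patient presents with chronic knee pain; MRI is medically necessary.",
        "claim-002": "Patient presents with chronic knee pain; an MRI is medically necessary.",
        "claim-003": "Follow-up visit for hypertension; medication adjusted.",
    }
    print(flag_near_duplicates(notes))  # flags the claim-001 / claim-002 pair

A screen like this does not make duplicated notes accurate, of course; it simply surfaces the boilerplate that an auditor would likely notice so it can be reviewed and individualized first.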

It is important to remember that, while the technology has advanced exponentially, it is still in its infancy and still being tested. The tools are promising, but they require human oversight and careful review. Healthcare providers using these apps should continue to review and monitor their records before relying on them to support claims for payment.

As a final matter, it is worth noting that ChatGPT at least understands its limitations. Asked to explain how DocsGPT might pose risks under the FCA, the bot listed the requirements of the FCA, described what DocsGPT does — and then recommended that a lawyer be consulted.


[1] https://www.jpost.com/business-and-innovation/tech-and-start-ups/article-725910.

[2] https://aibusiness.com/verticals/chatgpt-passes-medical-board-exam.

[3] https://www.zdnet.com/article/chatgpt-can-write-code-now-researchers-say-its-good-at-fixing-bugs-too.  

[4] https://capitalizemytitle.com/funny-conversations-with-chatgpt.

[5] https://www.theguardian.com/technology/2023/feb/03/colombia-judge-chatgpt-ruling.

[6] https://www.zdnet.com/article/chatgpt-just-became-the-fastest-growing-app-of-all-time.  

[7] https://www.aha.org/aha-center-health-innovation-market-scan/2023-02-21-will-chatbot-be-just-what-doctor-ordered-reimbursement-appeals.

[8] https://www.easytechjunkie.com/what-is-a-beta-version.htm.

[9] https://www.fiercehealthcare.com/health-tech/doximity-rolls-out-beta-version-chatgpt-tool-docs-aiming-streamline-administrative.

[10] https://www.fiercehealthcare.com/health-tech/doximity-rolls-out-beta-version-chatgpt-tool-docs-aiming-streamline-administrative.