1 February 2023

Move over Dr Google: ChatGPT is here

Clinical Research

The artificial intelligence (AI) platform ChatGPT, which launched as a free “research preview” late last year, has quickly gained momentum, with millions of users testing and adopting the text generation tool.

Industry mavens have even described its use as a tectonic shift in how we work.

ChatGPT’s game-changing power comes from easy access to OpenAI’s huge GPT-3 language model, made up of 175 billion parameters, and its magical ability to create human-like text (with excellent grammar and spelling) in response to almost any question (or “prompt”).

It was trained by hoovering up 570GB of text from the internet, including most of Wikipedia and a vast collection of random books and websites.
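For readers curious about the plumbing: the same style of text generation is also exposed directly through OpenAI’s API, where a program sends a plain-text prompt and gets a completion back. The snippet below is a hypothetical sketch only (it assumes the pre-1.0 openai Python client and the GPT-3-family model text-davinci-003, neither of which is mentioned by the doctors quoted in this article):

    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder: a real OpenAI API key is needed

    # Send a plain-text prompt and ask the model for a short completion.
    response = openai.Completion.create(
        model="text-davinci-003",  # a GPT-3-family model, chosen here purely for illustration
        prompt="I'm experiencing chest pain, what should I do?",
        max_tokens=200,
        temperature=0.2,           # lower temperature keeps the wording more conservative
    )

    # The generated answer comes back as free text in the first "choice", with no sources attached.
    print(response["choices"][0]["text"].strip())

The answer arrives as fluent free text and nothing more: no citations, no confidence scores, which is central to the concerns raised below.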

OpenAI is a Silicon Valley startup, of sorts – with over a billion dollars in Microsoft funding and early involvement by Elon Musk.

The launch of ChatGPT has given it a “first mover” advantage over similar platforms under development by Google and DeepMind (including the medical-specialist AI tool MedPaLM) and over Meta’s model OPT-175B.

But while ChatGPT delivers passable high-school essays (prompting classroom bans by education authorities in NSW, Queensland and Tasmania so far) and can even pass the US medical licensing exam, how will it play out in GP clinics?

GPT in the GP clinic?

Dr Rob Hosking is a GP in regional Victoria who heads the RACGP’s Practice Technology and Management expert committee, and has tested out ChatGPT for himself.

He is relieved the platform appears to respond to medical queries with some “safety net” qualifiers.

“The program answered some medical questions I put to it quite well, with the default being: you should see a doctor about your symptoms,” he says.

“But if I type in ‘I’m experiencing chest pain, what should I do?’, the first thing it comes up with is: ‘call 911’, which isn’t correct for Australia.”

Incorrect emergency details are just one of the flaws; Dr Hosking says that “general AI” is no match for a qualified clinician treating each patient individually.

“There’s no doubt that this is a great leap forward in AI ability – but it’s not ready for medical applications and it can’t replace experience and the clinical perspective to know that not all headaches are brain tumours,” he says.

Dr Hosking says that the launch of ChatGPT and the corresponding acceleration of “general AI” tools are covered by the RACGP’s position statement on AI in Primary Care, which warns that while AI holds promise for GPs and patients, “the risks that it poses must be carefully mitigated.”

Dr ChatGPT is no match for a real GP

Dr Hosking says that a patient who comes into their GP clinic with a ChatGPT diagnosis will typically have less information than they might from a Google search, because the program has no source transparency.

“People can quote something they found on the internet, and we can ask, ‘What was the website you read it on?’ and ‘What was the validity of the research that was done to back this up?’,” he says.

“But ChatGPT is very opaque; it delivers answers with no citations,” he says.

“This is an AI that answers a layperson’s query without the oversight of an experienced expert; we can’t see the algorithms behind it, and there’s no way to know whether, or how much of, that information comes from reputable sources.”

As for its usefulness in the clinic, Dr Hosking says it’s not quite there yet.

“The current version of ChatGPT won’t be all that useful for tasks like writing a referral letter, because you want to be able to access relevant information from patient records, to find a specialist’s address and so on; our current practice software does that quite well.”

However, he says it’s likely that we will see an acceleration of AI integration into existing systems as a result, and that’s where it will get interesting for doctors.

“For example, we’ll soon see chatGPT-style natural language processing available for our clinical notes, improving the accuracy of our records and reducing our admin burden.”
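To make that concrete, here is a purely hypothetical sketch (using the same assumed pre-1.0 openai Python client and text-davinci-003 model as above, not any existing clinical product) of how shorthand consult notes might be expanded into a tidier record entry:

    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder: a real OpenAI API key is needed

    # Invented example of shorthand consultation notes (no real patient data).
    rough_notes = "45yo, 3/7 sore throat, no fever, tonsils mildly inflamed, plan: symptomatic rx, review if worse"

    # Ask the model to rewrite the shorthand as a clearer, structured clinical note.
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt="Rewrite these shorthand consultation notes as a clear, structured clinical note:\n" + rough_notes,
        max_tokens=250,
        temperature=0.3,
    )

    print(response["choices"][0]["text"].strip())

Sending real patient records to an external service would raise obvious privacy questions, so anything like this would need to sit inside clinical software with appropriate safeguards; the sketch is illustrative only.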

Saving lots of work

GP and digital health investor Dr Amandeep Hansra says she can already see that the ChatGPT tool will save her hours of work.

“When you read the output from chatGPT, it sounds like a human has written it, which we’ve not really seen before,” she says.

Like Dr Hosking, she has tested the quality of ChatGPT’s medical advice.

“I put in some symptoms – abdominal pain, vomiting – and got a fairly good overview of various diagnoses that were no worse than what’s out there on the general internet or on social media,” she says.

Dr Hansra cites a controversial experiment by mental health app Koko, which tested GPT-3 responses on 4000 users invited to opt in to a week-long trial.

“Messages composed by AI (and supervised by humans) were rated significantly higher than those written by humans on their own, (p < .001). Response times went down 50%, to well under a minute,” tweeted Koko app founder Rob Morris on January 7.

However, Koko abandoned the chatbot after users reported the simulated empathy and very short response time felt weird, empty and inauthentic.

Dr Hansra says that patient care isn’t a great application of AI – but there are plenty of tedious, time-consuming tasks that could be.

“Let’s use these tools so we don’t have to read lots of journals and synthesise a heap of papers every night to stay current and come up with meaningful data to change our practice, because when machines do that, it frees us up to do the actual application of medicine,” she says.

“That’s why it’s crucial that GPs and clinicians are involved in building these models; otherwise, if we wait for tools to be delivered to us, they won’t serve how we want to use them.”

Meanwhile, Dr Hansra is making use of the platform for her own work.

“I’m not using it for patients – but for consulting and writing reports, it can write up a framework and cover the main points. It’s timesaving, but it’s just basic drafting, writing in a way that sounds OK but needs expanding, editing, checking thoroughly and referencing,” she says.

“ChatGPT is another stage of the technology journey of AI, and another tool in our kit-bag,” adds Dr Hansra.

“As we’ve said for the last few years, ‘Doctors won’t be replaced by AI, but doctors who don’t use AI will potentially be replaced by doctors who do.’ So my advice to GP colleagues is: get in there and try it out!”