Dr Deepali Misra-Sharp uses AI to take notes
This is the fifth article in a six-part series examining how AI is changing medical research and treatment.
Difficulty getting an appointment with a GP is a familiar complaint in the UK.
Even when an appointment is secured, doctors’ increasing workload means these meetings may be shorter than either the doctor or patient would like.
But Dr Deepali Misra-Sharp, a GP partner in Birmingham, has found that AI has taken over some of the administrative burden of her job, allowing her to focus more on patients.
Dr Misra-Sharp started using Heidi Health, a free AI-assisted medical transcription tool that listens to and transcribes patient appointments, about four months ago, and says it has made a big difference.
“Usually when I’m with a patient, I am writing things down, and it takes away from the consultation,” she says. “This now means I can spend my entire time locking eyes with the patient and actively listening. It makes for a more quality consultation.”
She says the technology streamlines her workflow, saving her “two to three minutes per consultation, if not more”. She lists other benefits: “It reduces the risk of errors and omissions in my medical note taking.”
With staff numbers declining while patient numbers continue to rise, GPs are facing immense pressure.
A single full-time GP is now responsible for 2,273 patients, an increase of 17% since September 2015, according to the British Medical Association (BMA).
Could AI be the solution to help GPs reduce administrative tasks and alleviate burnout?
Some research suggests it could help. A 2019 report prepared by Health Education England estimated a minimal saving of one minute per patient from new technologies such as AI, equating to 5.7 million hours of GP time.
Meanwhile, research carried out by the University of Oxford in 2020 found that 44% of all administrative work in general practice can now be largely or fully automated, freeing up time to spend with patients.
Lars Maaløe (left) and Andreas Cleve, co-founders of Danish medical AI firm Corti
One of the companies working on this is Danish firm Corti, which has developed AI that can listen to healthcare consultations, either over the phone or in person, and suggest follow-up questions, prompts and treatment options, as well as automating note taking.
Corti says its technology processes around 150,000 patient interactions per day in hospitals, GP practices and healthcare settings in Europe and the US, totaling around 100 million encounters per year.
“The idea is that the doctor can spend more time with a patient,” explains Lars Maaløe, co-founder and chief technology officer at Corti. He says the technology can suggest questions based on previous conversations it has heard in other healthcare situations.
“The AI has access to related conversations, and it might then think that in 10,000 similar conversations most doctors asked question X, and it has not been asked here,” says Maaløe.
“I imagine GPs have one consultation after another and so have little time to consult with colleagues. It’s like getting advice from that colleague.”
He also says the AI can review a patient’s historical data. “It could ask, for example: have you considered asking whether the patient still has pain in their right knee?”
But do patients want technology to listen and record their conversations?
Maaløe says “the data does not leave the system”. He adds, however, that it is good practice to inform the patient.
“If the patient objects, the doctor cannot record it. We see few examples of that, as the patient can see the benefit of better documentation.”
Dr Misra-Sharp says she lets patients know she has a listening device to help her take notes. “No one has had a problem with it yet, but if they did, I wouldn’t do it.”
C the Signs’ software can analyse a patient’s medical records
Meanwhile, some 1,400 GP practices across England are using C the Signs, a platform that uses AI to analyse patients’ medical records and check for signs, symptoms and risk factors of different cancers, and recommend what action should be taken.
“It can capture symptoms, such as cough, cold and bloating, and essentially within a minute it can see if there is any relevant information from their medical history,” says Dr Bea Bakshi, chief executive and co-founder of C the Signs, and a practising GP.
The AI has been trained on published medical research papers.
“For example, it might flag that the patient is at risk of pancreatic cancer and would benefit from a pancreatic scan, and then the doctor decides whether to refer them to that pathway,” says Dr Bakshi. “It won’t make the diagnosis, but it can facilitate it.”
She says the platform has conducted more than 400,000 cancer risk assessments in a real-world setting, detecting more than 30,000 patients with cancer across more than 50 different cancer types.
A report on AI released this year by the BMA reveals that “AI is expected to transform, rather than replace, jobs in the healthcare sector by automating routine tasks and improving efficiency.”
In a statement, Dr Katie Bramall-Stainer, chair of the UK General Practice Committee at the BMA, said: “We recognise that AI has the potential to transform NHS care completely – but if not enacted safely, it could also cause considerable harm. AI is subject to bias and error, can potentially compromise patient privacy and is still very much a work in progress.
“While AI can be used to enhance and supplement what a GP can offer as another tool in their arsenal, it’s not a silver bullet. We cannot wait on the promise of tomorrow’s AI to deliver the much-needed improvements in productivity, consistency and safety that are needed today.”
Alison Dennis, partner and co-head of the international life sciences team at law firm Taylor Wessing, warns that GPs need to exercise caution when using AI.
“There is a very high risk that generative AI tools may not provide complete or correct diagnoses or treatment pathways, or may even give incorrect diagnoses or treatment pathways – that is, producing hallucinations or basing outputs on clinically incorrect training data,” says Dennis.
“AI tools that have been trained on reliable datasets and then fully validated for clinical use – which will almost certainly be a specific clinical use – are more suitable for clinical practice.”
She says specialist medical products must be regulated and receive some form of official accreditation.
“The NHS would also want to ensure that all data entered into the tool is retained securely within NHS system infrastructure, and is not absorbed for further use by the provider of the tool as training data without the appropriate GDPR (General Data Protection Regulation) safeguards in place.”
For now, for GPs like Dr Misra-Sharp, AI has transformed their work. “It has made me enjoy my consultations again, instead of feeling time pressured.”