In an opinion piece published by CNBC, Scott Gottlieb, former Commissioner of the Food and Drug Administration (FDA), argued that artificial intelligence (AI) is poised to take over responsibilities traditionally handled by doctors, and that this shift could come sooner than expected.
Gottlieb emphasized that in the not-so-distant future, AI systems may perform duties typically carried out by medical practitioners. He divided AI applications in healthcare into two categories: machine learning, which uses algorithms to identify patterns in data and make predictions, and natural language processing, which comprehends and generates human language.
In certain instances, according to Gottlieb, large language models are being used to examine patient medical records and suggest diagnoses and treatments directly to the patient, bypassing the need for a doctor.
He suggested that the main obstacle may be developing an appropriate regulatory pathway. Regulators remain hesitant out of concern that these models may be error-prone, and that the clinical datasets used to train them might contain incorrect decisions, causing AI models to replicate medical errors.
Gottlieb noted that surmounting these challenges could yield significant improvements in patient outcomes, and could also address the cost pressures that have driven health systems to hire more non-physician providers as a way to reduce labor costs.
He also mentioned that these large language models are utilized for administrative functions, such as processing medical claims, analyzing medical records, or powering clinical decision support software. This software uses specific patient data to propose diagnoses and treatments. If a doctor is involved in these processes, the FDA may not regulate the tool, Gottlieb added.
He further stated that machine learning technology is applied to evaluate clinical data, images, and scans, with the software often classified as medical devices by the FDA. These tools are trained using verified data sets, providing the FDA with enhanced assurance in assessing the devices' integrity.
Although the current stage of development may not entirely exclude doctors from decision-making processes, Gottlieb believes these tools will increasingly augment providers' productivity and, in many instances, begin to replace providers outright.
Highlighting the pace of AI advancement, Gottlieb pointed to OpenAI's ChatGPT, a large language model that passed the U.S. Medical Licensing Exam, a test that roughly 10% of medical students fail each year.
Gottlieb's perspective aligns with a growing body of research indicating that AI may displace human jobs in the future. A recent report from the McKinsey Global Institute suggested that nearly 30% of the hours currently worked in the U.S. could be automated by 2030.