
Expediting Drug Development of Novel Therapeutics: Regulatory and Commercialization Implications of Digital Twin Technology in Clinical Trials

Posted on Wednesday, January 24, 2024 in Blog Posts.

By Colleen Carroll

Clinical trials are a major bottleneck for new drug development.[1] No drug will make it to market without first meeting rigorous safety and efficacy standards. This requires extensive testing across multiple phases of clinical trials, which take, on average, ten and a half years.[2]

But advances in AI may transform clinical trials and increase patient access to life-saving drugs. Digital twins (DTs) are virtual clones of human patients enrolled in clinical trials. Using AI, DTs simulate how a patient’s disease would progress if she had not received the tested drug.

As the field advances, DT technology in medicine may soon be capable of reducing or even eliminating the need for human participants in clinical trials, accelerating drug development timelines. But the FDA has not yet indicated whether it will accept the use of digital twins in clinical trials. How regulators respond to DTs in clinical trials will shape new drug development, with effects reaching a range of legal areas.

The Status Quo: Challenges in Clinical Trial Enrollment

Before testing a new drug in humans, drug makers (sponsors) must submit a trial design protocol, including a projection of the sample size needed to show statistically significant outcomes. Once the FDA approves the protocol, the sponsor must recruit and enroll the specified patient group.

Patient recruitment and retention frequently delay or end clinical trials. Pivotal late-stage trials require hundreds, and sometimes thousands, of patients. Trials must overcome patients’ concerns about the safety and efficacy of experimental drugs, and they risk losing participants who suspect they have received a placebo. For these reasons, trials risk early termination if sponsors fail to recruit enough patients or lose too many along the way.[3]

Digital Twins: Transforming Drug Development and Commercialization

Digital twins are virtual clones of the human patients enrolled in a clinical trial. Using proprietary computational models trained on aggregated, anonymized patient data from past trials, a DT simulates how a patient’s disease would progress if she had not received the tested drug.[4]

At the beginning of a trial, sponsors collect each patient’s baseline health information. A digital twin is then created for each participant using a generative AI model trained on patient data from thousands of previous clinical trials for the same disease. The digital twin predicts how the patient’s disease would have progressed had she not received the experimental drug.[5]
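To make that workflow concrete, the sketch below mocks up the prediction step in Python. The record fields, model class, and prediction horizon are hypothetical placeholders rather than any vendor’s actual system; a real digital-twin model would be learned from historical trial data, not hard-coded.

```python
# Illustrative sketch only: a hypothetical digital-twin prediction step.
from dataclasses import dataclass
from typing import List

@dataclass
class BaselineRecord:
    """A trial participant's baseline health information (hypothetical fields)."""
    age: float
    biomarker_level: float
    cognitive_score: float

class DiseaseProgressionModel:
    """Stand-in for a generative model that would be trained on historical control-arm data."""

    def predict_untreated_trajectory(self, baseline: BaselineRecord,
                                     months: List[int]) -> List[float]:
        # A real model would be learned from thousands of past trials for the
        # same disease; here a simple linear decline stands in for its output.
        return [baseline.cognitive_score - 0.4 * m for m in months]

# Build a "digital twin" prediction for one enrolled patient from her baseline data.
patient = BaselineRecord(age=72, biomarker_level=1.8, cognitive_score=26.0)
twin_model = DiseaseProgressionModel()
predicted_course = twin_model.predict_untreated_trajectory(patient, months=[0, 6, 12, 18])
print(predicted_course)  # the disease course the model expects had she not received the drug
```

Each participant’s predicted course then serves as her individualized point of comparison against the outcome actually observed on the experimental drug.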

Digital twins decrease the number of patients needed for a trial’s control group, and a smaller requisite sample size speeds up the trial. They also increase the likelihood that a participant will receive the experimental treatment rather than a placebo, guarding against retention challenges. With fewer participants to find and enroll, recruitment timelines shorten, and as digital twin models become more accurate, control groups could shrink by 75 percent or more.[6]
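The arithmetic behind those smaller control arms can be sketched with standard covariate-adjustment theory: the more closely the digital twin’s prediction tracks the outcome actually observed, the less residual variance the concurrent control arm must account for, and the fewer control patients are needed. The numbers below are hypothetical illustrations, not the EMA-qualified PROCOVA procedure or any sponsor’s actual power calculation.

```python
# Back-of-the-envelope sketch with hypothetical numbers: how a baseline
# prediction that correlates with the trial outcome can shrink the control arm.

def adjusted_control_size(unadjusted_n: int, rho: float) -> int:
    """Standard covariate-adjustment arithmetic: adjusting for a baseline
    covariate (here, the digital twin's predicted outcome) with correlation
    rho to the observed outcome scales the residual variance, and hence the
    required sample size, by roughly (1 - rho**2)."""
    return round(unadjusted_n * (1 - rho ** 2))

conventional_controls = 400  # hypothetical control-arm size for a conventional design
for rho in (0.5, 0.7, 0.9):  # hypothetical accuracy levels of the twin's predictions
    print(f"rho={rho}: about {adjusted_control_size(conventional_controls, rho)} controls needed")

# rho=0.9 cuts the requirement to roughly 76 patients (an ~81% reduction),
# illustrating why more accurate models permit much smaller control groups.
```

The reduction a regulator would actually credit depends on how well the model’s predictions generalize to the trial population, which is one reason the qualification and documentation questions discussed below matter.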

As an emerging technology, uncertainty surrounds the use of DT models in drug development and commercialization. Anticipating legal challenges surrounding regulation, product liability, and licensing issues will support integration of DT technology into clinical trials.

  • Drug Development and Regulation. While the FDA has recognized the potential of DT models to speed clinical trials, the Agency has no qualification process for the technology.[7] The European Medicines Agency (“EMA”), by contrast, has qualified DT models for use in clinical trials, publishing an opinion permitting their use in Alzheimer’s Disease trials.[8] As the EMA context shows, documenting model frameworks, code, and training data sets will aid drug sponsors preparing trial design protocols that use DT models in FDA-regulated clinical trials, mitigating the risk of the FDA’s ad hoc regulatory approach.
  • Product Liability. Digital twin models raise new kinds of product liability claims. Model errors may go undetected during development, leading to under-reporting of rare but dangerous side effects. In the absence of established case law on liability in AI-based product development, traditional product liability precedent offers guidance on the duty of care owed by drug sponsors leveraging DT models.[9] Dedicating in-house teams to oversee DT models in drug development is one risk-mitigation measure that may improve drug safety in DT trials years before these drugs reach the market and protect drug sponsors from future product liability claims.[10]
  • Licensing. Biopharmaceutical partnerships with AI start-ups developing proprietary DT models may demand new licensing structures. Biopharmaceutical firms’ data assets from past clinical trials are important to produce high-quality DT models and may aid regulatory acceptance.[11] Deal structures with technology partners may contemplate data-sharing and exclusivity, requiring new aspects of pre-transaction diligence and post-transaction compliance. As these deal elements become more common, data oversight will be an important aspect of deal diligence and compliance.

By recognizing the transformative effects of digital twin technology in clinical trials now, practitioners will be well-equipped to shape the areas of law that DT technology will implicate.


Colleen Carroll is a 2L at Vanderbilt Law School. Before attending Vanderbilt, Colleen worked in economic and litigation consulting on matters addressing emerging technologies, including those in the biopharmaceutical industry.


[1] David Thomas et al., Clinical Development Success Rates and Contributing Factors 2011–2020, at 1, Biotechnology Innovation Organization (last visited Oct. 22, 2023), https://go.bio.org/rs/490-EHZ-999/images/ClinicalDevelopmentSuccessRates2011_2020.pdf.

[2] David Thomas et al., Clinical Development Success Rates and Contributing Factors 2011–2020, at 1, Biotechnology Innovation Organization (last visited Oct. 22, 2023), https://pharmaintelligence.informa.com/~/media/informa-shop-window/pharma/2021/files/reports/2021-clinical-development-success-rates-2011-2020-v17.pdf.

[3] See, e.g., Rashmi Ashish Kadam et al., Challenges in Recruitment and Retention of Clinical Trial Subjects, 7 Perspect. Clin. Res. 137, 139–40 (2016), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4936073/.

[4] Urtė Fultinavičiūtė, Digital Twins: Easing the Clinical Trial Conduct but Regulatory Oversight Is Needed, Clinical Trials Arena (Dec. 14, 2022), https://www.clinicaltrialsarena.com/features/digital-twins-clinical-trial/?cf-view.

[5] Big Picture Medicine, Clinical Trials – Dr Charles Fisher (CEO Unlearn.AI), at 12:00–21:00 (Mar. 15, 2021) (downloaded using Spotify); Axial Podcast, Founding Unlearn and Revolutionizing Clinical Trials with Charles Fisher, at 13:45–17:30 (Jul. 8, 2023) (downloaded using Spotify).

[6] Id. at 18:00–20:00.

[7] See generally Food & Drug Admin., Using Artificial Intelligence & Machine Learning in the Development of Drug & Biological Products at 8 (May 2023), https://www.fda.gov/media/167973/download?attachment.

[8] Eur. Medicines Agency, Qualification opinion for Prognostic Covariate Adjustment (PROCOVA™) (Sept. 20, 2022), https://www.ema.europa.eu/en/documents/regulatory-procedural-guideline/qualification-opinion-prognostic-covariate-adjustment-procovatm_en.pdf.

[9] W. Nicholson Price II et al., Liability for Use of Artificial Intelligence in Medicine 1 (2022) (working paper), https://repository.law.umich.edu/cgi/viewcontent.cgi?article=1352&context=law_econ_current (“[T]he field of tort liability for AI is still evolving … health-care AI liability has still not been directly addressed in court cases, mostly because the technology itself is so new and is still being implemented.”); see also Scott J. Schweikart, Who Will Be Liable for Medical Malpractice in the Future? How the Use of Artificial Intelligence in Medicine Will Shape Medical Tort Law, 22 Minn. J.L. Sci. & Tech. 1, 14 (2021); W. Nicholson Price II, Medical AI and Contextual Bias, 33 Harv. J.L. & Tech. 65, 85 (2019).

[10] Price II, Medical AI and Contextual Bias, supra note 9, at 86–87.

[11] Id. at 86.
