
Fit The Curve Blog

Coding, Automation, and AI in Bioanalysis

I’m excited to welcome my colleague, Dr. Jean Custodio, as a co‑blogger on our Fit The Curve platform. Jean brings deep expertise in bioanalysis along with a genuine passion for emerging technologies and the future of bioanalytical laboratories. This week, he tackles a topic he and I often find ourselves discussing: What truly differentiates artificial intelligence from traditional computer coding and automation? The term AI has become ubiquitous across our industry, frequently appearing in conversations about the evolution of bioanalytical labs. Jean cuts through the noise with a clear, thoughtful exploration of how these concepts diverge, and why those distinctions matter. I hope you find his article as insightful and thought‑provoking as the conversations we have every day here at IQVIA Laboratories.

January 26, 2026

Why Coding, Automation, and AI Are Not the Same in Bioanalytical Workflows
If you work in the industry, you probably understand the pressure that bioanalytical laboratories are under. Scope keeps growing in complexity, and that directly affects what labs need to do to remain competitive. Demand is driven by current needs and a growing number of drug modalities, which pushes the limits of instrumentation, scalability, and technical expertise. At a time when digital lab transformation is still fairly new, bioanalytical labs rarely fail due to poor science. Issues arise when complexity outpaces systems.

As turnaround times tighten, laboratory demand increases, and matrices become more challenging, many labs have been “adding technology” to workflows to adapt. Scripts here, automation there, AI on top… tools accumulate faster than our understanding of how best to use each one. The confusion matters because coding, automation, and AI solve very different problems.


Coding is already part of most bioanalytical labs
Python and/or R scripts extract and analyze LC-MS output files, calculate accuracy and precision, apply dilution factors, generate tables for reports, and evaluate instrument performance. The simplest if/then rules are the gold standard for flagging calibration failures and internal standard drift. At an operational level, scripts summarize backlog, reruns, and turnaround time into dashboards.
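To make that concrete, here is a minimal sketch of the kind of rule-based script described above. The column names and thresholds are illustrative placeholders, not the acceptance criteria of any actual method:

```python
import pandas as pd

# Illustrative thresholds; real acceptance criteria come from the validated method.
ACCURACY_TOL = 15.0   # allowed % deviation from nominal for QC samples
IS_DRIFT_TOL = 0.50   # allowed internal standard area ratio vs. the run median

def flag_run(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple if/then rules to one LC-MS batch export.

    Expects (hypothetical) columns: sample_type, nominal_conc,
    measured_conc, is_area.
    """
    out = df.copy()

    # Rule 1: flag QC samples outside the accuracy tolerance.
    is_qc = out["sample_type"] == "QC"
    pct_dev = 100 * (out["measured_conc"] - out["nominal_conc"]) / out["nominal_conc"]
    out["acc_flag"] = is_qc & (pct_dev.abs() > ACCURACY_TOL)

    # Rule 2: flag internal standard drift relative to the run median.
    is_ratio = out["is_area"] / out["is_area"].median()
    out["is_flag"] = (is_ratio < IS_DRIFT_TOL) | (is_ratio > 1 / IS_DRIFT_TOL)

    return out
```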


The main added value is consistency. Incorporating scripting into these repeated calculations minimizes analyst-to-analyst variability and keeps studies aligned with the precision and accuracy required by regulatory bodies. Once stable code is established, labs typically see greater efficiency in report review, fewer reporting errors, and more rapid data processing.


Code is static. That is an advantage and a bottleneck at the same time. Scripts don’t adjust when a matrix behaves differently, method scope changes, or priorities shift mid-study. Maintenance has to be planned for: deviations require updated thresholds, logic, assumptions, and models, and edge cases are common and must be addressed. This is simply how deterministic systems are maintained in changing environments; it does not represent a failure of coding.
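One common way to manage that maintenance burden is to pull the thresholds out of the script into a per-method configuration that a human revises and re-approves whenever the method, matrix, or guidance changes. The structure below is a hypothetical sketch, not a standard:

```python
# Hypothetical per-method configuration. Every method change, matrix deviation,
# or regulatory update means someone edits and re-approves these values;
# the script itself never adapts on its own.
METHOD_CONFIG = {
    "MET-001": {"accuracy_tol_pct": 15.0, "cal_r2_min": 0.990, "matrix": "plasma"},
    "MET-002": {"accuracy_tol_pct": 20.0, "cal_r2_min": 0.985, "matrix": "urine"},
}
```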


Automation shines when execution, not logic, is the bottleneck
Now you have a task that can be divided into multiple subtasks, each fully scriptable. Add an equipment layer to it, where software instructions control physical or digital processes. That’s automation. Liquid handlers, automated protein precipitation, SPE robots, and autosamplers remove repetitive manual steps from sample preparation and analysis. Bioanalytical labs using automated tools can see a reduction in sample preparation time and analyst hands-on time on the order of 20–40%.
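As an illustration of the "scriptable subtasks plus an equipment layer" idea, the sketch below expands one high-level task, preparing a plate, into a deterministic worklist a liquid handler could consume. The CSV format and column names are hypothetical, since each vendor defines its own:

```python
import csv

def build_worklist(samples: list[str], sample_vol_ul: float = 50.0,
                   is_vol_ul: float = 10.0, path: str = "worklist.csv") -> None:
    """Expand one high-level task (prep a 96-well plate) into deterministic
    subtasks: one pipetting step per row, in a made-up worklist format."""
    wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["sample_id", "dest_well", "reagent", "volume_ul"])
        for sample, well in zip(samples, wells):
            writer.writerow([sample, well, "sample", sample_vol_ul])
            writer.writerow([sample, well, "internal_standard", is_vol_ul])
```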


Given the deterministic nature of coding and automation, it is no surprise that these gains are mechanical and reliable. Analysts’ hours spent on pipetting can now be used for data review. For high-volume studies, automation can be the difference between meeting and missing agreed timelines.


However, similar to code, automation assumes stability. Tasks are scheduled and executed as designed only as long as instruments don’t drift subtly, plates behave as expected, and sponsors’ priorities don’t change. Without human intervention, automation runs well, but it runs blindly.


AI supports roles that coding and automation cannot fill
AI becomes relevant when scripts and robots struggle to answer certain questions. Why do accuracy and precision indicators fail more often late in the week? Why do some matrices result in more reruns? Why does turnaround time skyrocket when volume spikes? Why can’t instrument A hold a calibration long enough?


These issues are hard to script because they depend on dynamic variables distributed across many systems. Humans tackle this kind of variability by learning from experience and analyzing historical and real-time data across instruments, sample preparation logs, batches, QCs, and metadata. So does AI. By employing neural networks trained on historical data, AI systems learn patterns and trends and can adapt as conditions change quickly.
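As a toy illustration of that pattern-learning step, the sketch below trains a small neural network (scikit-learn's MLPClassifier) on synthetic "historical batch" features to predict reruns. The features, data, and model size are placeholders, not a production pipeline:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder per-batch features: day_of_week, queue_depth, matrix_code,
# instrument_age; label = 1 if the batch needed a rerun. All synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small neural network learns which combinations of conditions precede reruns.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```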


Process optimization often focuses on reducing surprises. Predictive monitoring can learn the sources of instrument drift and identify trends hours to days before they become bottlenecks. Smart scheduling can reduce missed timelines by rerouting work based on early signals of equipment or personnel unavailability.
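Here is a minimal sketch of what such predictive monitoring could look like, assuming internal standard response as the monitored signal. The window and threshold are illustrative and would be tuned per instrument in practice:

```python
import pandas as pd

def early_drift_signal(is_area: pd.Series, window: int = 24,
                       z_limit: float = 2.0) -> pd.Series:
    """Flag slow internal-standard drift before it breaks acceptance criteria.

    Compares each injection to a trailing baseline built from the previous
    `window` injections and returns True where the deviation is unusual.
    """
    baseline = is_area.rolling(window).mean().shift(1)
    spread = is_area.rolling(window).std().shift(1)
    z = (is_area - baseline) / spread
    return z.abs() > z_limit
```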


Beyond operational optimization, AI also supports a class of problems that coding and automation cannot handle. Deep research and knowledge synthesis require context, judgment, and the ability to connect information across sources that were never designed to work together. Integrating scientific literature, regulatory guidance, historical method data, and internal knowledge cannot be reduced to deterministic rules or scripted execution. This is a different role for AI: supporting how scientists think through problems, rather than how workflows are executed.


AI does not replace analysts or methods. It changes timing. Problems become evident earlier, when they are cheaper to address and do not disrupt workflows.


Where we are now, and where we are headed
Coding and automation are already core components of bioanalytical workflows and have been for a long time. The consistency gained from structured, scripted tasks is readily scaled up by adopting automated equipment. This works well for most studies, until new challenges turn exceptions into routine events. At that point, systems feel less efficient, small disruptions cascade, and staff often find themselves spending most of their time reacting rather than reviewing.


At the same time, other sources of complexity continue to build. Regulatory expectations have expanded over time, adding documentation, traceability, and audit-readiness requirements to already complex workflows. Trial designs have also become more adaptive, with shifting priorities, interim analyses, and irregular sample arrival patterns. Together, regulatory creep and trial-driven variability add significant strain to scheduling and coordination, especially in environments that depend on static rules and fixed workflows.


Where we are now already reflects part of this shift. At IQVIA, we not only understand the difference between coding, automation, and AI, but we have applied that understanding in practice through the development of AI-enabled deep research capabilities. The IQVIA Med-R1 Deep Research Agent, for example, was built to support complex scientific and regulatory research by synthesizing information across diverse biomedical sources, demonstrating how AI can be used in regulated environments to generate insight without adding risk to execution.


Looking ahead, the operating environment for bioanalytical laboratories is continuing to shift. Market pressure is rapidly raising the operational bar for excellence-driven laboratories. Complexity grows as sponsors now submit more biologics, peptides, ADCs, oligonucleotides, hybrid modalities, and more. New modalities come with new stability profiles, new interferences, and new failure scenarios. Static workflows struggle in this environment, not because they are poorly designed, but because the market has set a bar these systems were never meant to reach.


Bioanalytical labs are not ready for full autonomy, and at this point it is not clear they ever will be. We are headed toward a clearer separation of roles and better use of time. Coding remains the foundation for consistency, and automation continues to scale execution. AI sits above both, learning from data and outcomes and enabling experienced staff to reclaim time previously spent reacting and redirect it toward innovation. Labs that understand the role of each of these pillars, and that recognize and align with this operational shift, are better equipped to thrive, even as market pressure and demand continue to rise.

 

About the blog author
Jean Custodio, Ph.D., is a Senior Medical and Scientific Advisor at IQVIA Laboratories with over a decade of experience spanning analytical chemistry, immunology, and translational work across drug development. Much of Jean’s career has been spent moving between chemical and biological problem spaces, rather than staying in a single lane.

Jean tends to work in the gray areas between discovery and bioanalysis, where biology is messy, constraints are unavoidable, and decisions rarely come with perfect information. He cares deeply about solid scientific thinking but is most interested in whether the choices teams make actually help programs move forward without creating problems that surface later.

Outside of work, Jean enjoys traveling and immersing himself in different cultures. Exposure to different ways of thinking and living has shaped how he approaches uncertainty, resilience, collaboration, and problem-solving more than any single technical framework ever could.

Stephen Lowes, Ph.D.

Senior Director, Bioanalytical Services

Welcome! I'm Steve Lowes, and I'm thrilled to share my journey, thoughts, and insights with you through this blog. As the Senior Director of Scientific Affairs at IQVIA Laboratories in Ithaca, NY, I've dedicated over 30 years to the fascinating field of regulated bioanalysis.

Throughout my career, I've had the privilege of presenting at numerous conferences and authoring publications that aim to advance our science and foster dialogue within our discipline. I'm proud to be the co-editor of the book "Regulated Bioanalysis: Fundamentals and Practice," and I enjoy sharing my knowledge and experience from the lab, as well as troubleshooting bioanalytical data. Recently, my interests have focused on the exciting applications of LC-MS in modern drug modalities and biomarker bioanalysis. This has expanded into biologic molecules, adding new dimensions to the future potential and importance of the bioanalyst's role in bringing safe and effective therapies to market.

Outside of work, I cherish life with my wife and two wonderful teenage daughters. You can often find me fly fishing on trout streams and salmon rivers or hiking the beautiful gorges and forests of central NY with my black Labrador, Josie. I look forward to diving into and exploring current bioanalytical topics and more with you!
