Grow your brand, now and through patent expiry
This blog is part of an ongoing series, A Brave New World: Modern Analytics & Next-Gen Insights.
There are countless use cases for generative AI and agentic AI within the life sciences. Many center on drug discovery and development, where exciting advances are being made with these new technologies. Less far along, but equally important, are commercial operations and commercial analytics, where there remain opportunities to drive efficiency and improve business insights. There is a significant imperative to do both.
Over the past three years, AI utilization has proliferated in everyday life. Consumers encounter it in search engines, mobile applications, e-commerce, education, and more. Commercial use within life sciences, however, remains nascent, as the industry works through multiple challenges on the journey to broader utilization.
Data complexity and disparate data sources that do not easily connect have collided with an explosion in the quantity of information available. For example, the sophistication required to connect deidentified claims data to lab data, to first-party data such as sales force operations, or to other third-party data such as digital activity has created exponentially larger needs and costs.
The journey to commercial GenAI use cases in life sciences is fraught with high risk and common breakage points. People, processes, technology, and data all represent critical paths that must be navigated.
Over the past 10 years, pressure on the balance sheets of pharmaceutical manufacturers has accelerated. Between 2012 and 2023, price discounts in dollar terms grew 400% for the top 15 manufacturers and now account for 38% of topline revenue, versus just 13% a decade ago. The inherent trade-offs that occur as P&Ls shift have reduced investment in selling, general, and administrative expenses (SG&A) and in profits, while focus has been on preserving research and development (R&D). Many organizations have felt the resulting pinch through workforce reductions, serial reorganizations, aging technical debt, and other austerity measures.
Commercial operations and analytics functions have long been targets of cost reductions. The resulting industry dynamic drove the widespread adoption of data lakes, where information is centralized and then processed by managed services that are offshored to reduce labor costs. The result is too often a rigid technology and people infrastructure with scale and capacity constraints that make incremental improvements difficult to attain. Technical debt, process complexity, and organizational change management become significant barriers to change.
There is little to suggest that industry pressures will ease in the future, which only raises the need to drive ever more efficient processes. For an industry under constant pressure, and one that needs to protect R&D spend, next-generation analytics and commercial operations technologies are needed to unlock greater gains.
For the industry, a new question is emerging: what is the cost and time to business insight, relative to the value it brings?
Discussions on analytics often focus on technology choices, data explosion, insourcing/outsourcing trends, and operating models. These are all relevant considerations. However, it is critical to begin any discussion on analytics first and foremost with the lens of business value. The value of analytics hinges on the ability to provide critical business insights in a timely and efficient manner: right insight, right time, right cost, and of course, right place.
“Better analytics” in pharma commercial use is not just about more dashboards or fancier models. It is about changing decisions and outcomes for patients, HCPs, payers, and brands. Better insights require more explanatory, predictive, and prescriptive analytics that answer “why did it happen,” “what will happen,” and “what should we do.” Getting to these more sophisticated insights, though, is difficult with today’s infrastructures and data availability. Individually, manufacturers struggle with the scope and complexity of data while also questioning whether they have enough information to fuel these models. The quest for more data to drive predictive power taxes existing operations, which are already stressed, often to the point of breaking.
Consider the amount of information required to complete a single integrated profile of a healthcare provider. The information is spread across dozens of source data nodes; several attributes require understanding behaviors or affiliations larger than any one dataset; and the correlation to patient treatments may be a secondary or tertiary effect that is not always immediately visible. Understanding both a rearward view of descriptive analytics and a forward-looking view of prescriptive analytics is elusive. Even identifying that an event of note occurred can take months of analysis to uncover, leaving a brand manager blind and unable to act in time to effect positive change.
In July 2025, IQVIA conducted market research exploring the most significant barriers to GenAI entry faced by organizations of different sizes within the life sciences sector (n = 156). The research focused on assessing the level of difficulty each barrier presented in implementing and scaling AI within an organization.
While many companies believe they are organizationally prepared and recognize the potential benefits of AI, they face significant challenges in trusting data accuracy and in accessing and managing datasets effectively to generate high-quality, compliant outputs. Notably, 81% of respondents across all companies identified achieving analytic accuracy and operational efficiency as the top barrier to implementation.
The market research also revealed notable differences in how small/mid and large pharmaceutical organizations perceive and experience barriers to AI adoption. Smaller companies reported feeling less organizationally prepared and demonstrated a limited understanding of AI’s potential impact, particularly struggling to measure its quantifiable benefits. Specifically, small/mid-sized pharma organizations reported facing significant challenges in prioritizing the selection and adoption of a consistent AI standard across their operations.
With these fundamental challenges facing an industry ripe for change, it becomes imperative to broadly rethink the path forward. If the underlying structure of data lakes and managed services needs to evolve as data complexity increases, resource allocation and strategies must also adapt. Continued investment in rigid, siloed data infrastructure must give way to more dynamic approaches. Moving toward the AI promise of scaled efficiency with faster and better insights means climbing a new hierarchy of needs.
Foundational questions in the new hierarchy of needs include industry concerns about privacy and regulatory compliance. Work done in new ways must continue to be held to the highest standard of healthcare-grade analytics and withstand changing laws and regulations. As new synthetic data is emerging and new use cases merge previously disparate datasets, the risk of re-identification rises. The industry must continue to seek internal and external counsel to ensure the ethical use of information is maintained.
The new foundational question, however, as reinforced in the IQVIA research, rests on reducing or removing data and analytic hallucinations to improve accuracy. This is where GenAI or agents on top of raw data struggle. The sheer volume of information leads to many false-positive and false-negative interpretations. Multiplied over ever larger swaths of information and data stacking, the challenge quickly cascades, breaking models before they can even start.
Beyond foundational questions are challenges of data scale and interoperability. Consider the aforementioned HCP profile: the traditional approach to understanding prescribing focuses on a narrow slice of brand or therapeutic data. As market dynamics become more complex, new requirements are emerging for the scale of data needed to fuel large data models. Because of the data complexity, this quickly leads to questions of whether disparate data sources can interoperate with one another. For example, in therapeutic areas like oncology, where there are dozens of data sources and few complete ones, it is necessary to look through multiple data prisms to see the full picture, which only adds to the scale of the underlying problem.
If the foundational questions have not been adequately addressed, then questions of data scale and interoperability cannot truly be solved. Additionally, the underlying technologies also play a role in addressing these data questions and are among the most significant escalators of rigid cost, which the industry must avoid amid continued margin compression.
Addressing these first two levels of the hierarchy requires a combination of data linkage, structure, and tokenization. The resulting pre-run, analytic-ready data accelerates time-to-insight while reducing the effort to get there. Because data is pre-aggregated at scale with documented methodologies, hallucinations are greatly reduced, if not eliminated. At the same time, the breadth and diversity of the data sources exponentially increase the capability to generate insights. While deep custom analytics will always be a staple for understanding complexity, the speed and scale driven by analytic-ready data both improve efficiency for core business operations and generate new, innovative use cases.
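To make the idea of linkage, tokenization, and pre-aggregation concrete, the following is a minimal sketch. All dataset names, fields, and the salted-hash token scheme are illustrative assumptions for this example, not an actual IQVIA pipeline: two deidentified sources are joined on one-way tokens rather than raw identifiers, and the claims data is aggregated once, with a documented method, before any AI layer touches it.

```python
import hashlib

import pandas as pd

def tokenize(identifier: str, salt: str = "example-salt") -> str:
    """One-way salted hash so raw identifiers never appear in linked data."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:16]

# Hypothetical deidentified claims: which HCP wrote which prescriptions
claims = pd.DataFrame({
    "hcp_id": ["A1", "A2", "A1"],
    "brand": ["X", "X", "Y"],
    "rx_count": [3, 1, 2],
})

# Hypothetical third-party digital activity for the same HCPs
digital = pd.DataFrame({
    "hcp_id": ["A1", "A2"],
    "web_visits": [12, 4],
})

# Replace raw identifiers with tokens in both sources
for df in (claims, digital):
    df["token"] = df.pop("hcp_id").map(tokenize)

# Pre-aggregate claims once, with a documented methodology,
# instead of re-deriving it for every downstream use case
rx_by_hcp = claims.groupby("token", as_index=False)["rx_count"].sum()

# Analytic-ready HCP profile: linked on tokens, never on raw identifiers
profile = rx_by_hcp.merge(digital[["token", "web_visits"]], on="token")
print(profile)
```

In practice the token salt would be managed by a privacy-preserving third party so no single data holder can reverse the linkage; the point of the sketch is only that downstream models consume a pre-linked, pre-aggregated table rather than raw records.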
Utilizing this hierarchy, real-world use cases at IQVIA such as performing market landscape assessment, physician segmentation, patient journey, omni-channel orchestration, product forecasting, and payer deal modeling have gone from taking weeks of effort to days, and many are now capable of being rendered in just a few seconds utilizing the latest industry technology, all in a compliant and privacy-protected manner.
Addressing the foundational and data-level needs frees human resources to focus on insight generation and on testing the veracity of the underlying analytics. Time is then spent on the core question of where humans in the loop are needed to turn insight into action, rather than on cleansing and structuring data for dozens of separate use cases. The underlying structure also makes agentic workflows easier to unlock, rapidly reducing the time, cost, and effort needed to power modern analytics at scale.
Is life sciences behind other industries in the pursuit of AI? While advances are being made, real barriers impede commercial progress, as the underlying data, and how it is used, is not easily consumed by large language models. Historic approaches of data lakes and managed services, and the accumulated technical debt, create inertia on the journey to broader AI adoption and utilization. Simultaneously, the data fidelity needed as the industry moves to smaller patient populations is inflating the volume and complexity of the data sources that must be drawn from. It is no longer sufficient for brand teams to rely on slices of information to understand how a product is performing, let alone predict future performance.
The real power of AI comes when accurate, regulatory- and privacy-adherent, multi-dimensional, multi-factorial insights are generated while reducing the effort and time required to produce them. This requires new approaches to how analytics are structured, and access to the broadest datasets possible, which can then be AI-enabled through data engineering. Future data strategies are evolving to meet these new business demands.
IQVIA understands the many dimensions of navigating to broad life sciences commercial AI use. New tools allow very large, complex data models to be manipulated quickly, reducing the time and effort required to achieve insights. Along the journey, new perspectives on market, system, physician, and patient behavior are unlocking key insights that allow the industry to move from looking rearward at descriptive analytics to looking forward with data-driven probabilistic modeling. Use cases already range across the entire life sciences commercial spectrum, from gaining resource efficiency to improving business performance, and are growing quickly. Remember, it is not where you are in the middle of the race that counts, it is where you finish.
Please contact your IQVIA representative for more information.