2025 CDISC + TMF US Interchange Program

12 October: Early Registration / Check-In; Workshops
13 & 14 October: Main Conference
15, 16, & 17 October: CDISC Authorized Training
Starstruck Foyer (Mezzanine Level)
Starstruck (Mezzanine Level)
Starstruck Foyer (Mezzanine Level)
Symphony Foyer (Lobby Level)
Symphony Foyer (Lobby Level)
Symphony Ballroom (Lobby Level)
Sarah Dolan was diagnosed with young-onset Parkinson’s Disease 7 years ago. After leaving a 30-year career in the biopharma industry, she has participated in multiple clinical trials, is a current member of the Critical Path to Parkinson’s Endpoints Team, a current Consumer Representative for the FDA PCNS Advisory Committee, and an active Ambassador for the Davis Phinney Foundation. Sarah lives in a cabin on a river in Cody, Wyoming and loves cycling, her horses, and spending time with her husband and three children.
Symphony Ballroom I & II (Lobby Level)
CDISC 360i is redefining the future of clinical research by digitizing and connecting standards across the entire study lifecycle. Phase 1 laid the foundation for intelligent automation—linking digital protocols to biomedical concepts to generate core study artifacts like eCRFs, SDTM specs, and define.xml files. It also advanced the transformation of source data into SDTM datasets. Despite challenges such as metadata gaps and tool limitations, strong community collaboration drove innovation. Looking ahead, CDISC 360i will expand its reach, integrate AI, and begin showcasing analysis concepts—ushering in a new era of scalable, interoperable, and insight-driven clinical data workflows.
This interactive presentation showcases CDISC 360i in action—a visual demonstration of how digital protocols enable automation from study design through SDTM. Using an interactive Sankey visualization, attendees will see how each deliverable—from protocol to CRFs, SDTM datasets, and validation reports—is connected through automation and data standards. Each step in the process links to live demonstration videos and an open-source Google Colab notebook that powers the underlying workflow. The session concludes with a resource library and presentation deck available to all attendees for hands-on exploration. Participants will leave with a clear understanding of how CDISC 360i enables standardized, machine-readable study definitions, end-to-end data traceability, and scalable conformance across the research lifecycle—bridging the gap between standards, technology, and real-world implementation.
Symphony III (Lobby Level)
The TMF Reference Model is entering a new era — one that will redefine how we think about clinical documentation. Building on the momentum from Geneva, this session unveils what’s been accomplished and what’s next as we shape Version 4. We’ll explore the foundational elements now finalized and the exciting work ahead: harmonized metadata, consistent dating conventions, and the integration of all Working Group contributions into a unified, future-ready model. This is more than an update — it’s a transformation. Together, we’re creating a TMF framework built to evolve with our industry, strengthen collaboration, and enable smarter oversight. Join us as we set the stage for a future where the TMF isn’t just a repository, but a powerful enabler of quality, compliance, and innovation.
Join your TMF management industry colleagues in this session to explore the latest thinking behind the TMF Reference Model’s next version! This engaging, panel-style discussion, led by members of the V4 TMF Project Management Committee alongside select industry thought leaders, will unveil and explain foundational concepts slated for the new release.
Symphony I (Lobby Level)
The CDISC Tobacco Implementation Guide (TIG) v1.0, developed with the FDA Center for Tobacco Products (CTP), provides a framework for collecting, analyzing, and exchanging tobacco product data to ensure consistent standards and high data quality. The accompanying TIG Conformance Rules v1.0 facilitate the creation of accurate, compliant data packages aligned with TIG standards. These rules build on existing SENDIG, SDTMIG, and ADaMIG rules, while introducing new ones for tobacco-specific use cases.
This presentation offers a high-level overview of TIG v1.0 and the pilot submission project, along with insights into integrating TIG rules into CDISC Open Rules. The process revealed key differences between implementations and provided valuable feedback to strengthen existing conformance rules.
By sharing insights and experiences, the importance of these conformance rules in achieving consistent and compliant data will be emphasized, encouraging both broader industry adoption of the TIG framework and continued enhancement of conformance rule standards across implementations.
CDISC CORE v1.0 marks a major milestone for the clinical research community: the first production-ready release of an open, executable standard for study conformance validation. Built collaboratively by industry, technology providers, and CDISC, CORE delivers a governed, transparent set of conformance rules that can be used by any validation engine to ensure data meets CDISC and regulatory conformance requirements.
In this session, we will walk through the journey towards CORE v1.0, the architecture behind separating rules from engines, and how this launch moves the industry toward faster, more reliable, and standardized validation. We will discuss production deployment considerations, integration into regulatory workflows, and early feedback from pilot implementations.
From the e-protocol to the analysis results and back, this presentation will illustrate a few interconnected process tasks in clinical research and summarize a few process challenges. The process tasks will be demonstrated using a Proof of Concept (POC) implementation based on open-source software, either freely available or through other supported agreements.
By leveraging data based on the CDISC modeling standards USDM, CDASH, SDTM, ADaM, and ARS, their underlying controlled terminology (CT), plus the data-exchange standards ODM, Define-XML, and Dataset-JSON, along with the CDISC CORE initiative, the presenters will cover some potential gaps that would need to be addressed to maintain data integrity and consistency across the different standards and the data-exchange process.
This presentation aims to encourage users to start taking steps toward a consistent implementation of CDISC standards from start to end, using available open-source software and service resources. Sponsors could gradually grow their 360i capabilities as more knowledge is acquired.
Symphony II (Lobby Level)
The authors have been actively involved in the CDISC ADaM team for a long time: one for more than 20 years, and the other for over 10 years. During that time, we have led and contributed to a variety of ADaM documents, and we are both authorized CDISC ADaM trainers. Because of our extensive ADaM expertise, we end up reviewing many ADaM submissions before they are sent to regulatory agencies. This presentation highlights some of the common mistakes that we have seen ADaM developers make. For each issue, we provide a better and/or more conformant approach.
This paper addresses ambiguities in the CDISC ADaM Integrated Document/ADaM v3.0 regarding the derivation of parameters and the use of variables carried over from SDTM into ADaM. It highlights inconsistencies and grey areas in the current Implementation Guide (IG) and aims to clarify best practices for populating these variables, such as xxSEQ, xxDTC, xxVISIT, and xxVISITNUM, on derived records in an analysis. The paper focuses on the use of DTYPE to indicate derived records and provides general best practices for deriving records. It discusses different strategies for retaining SDTM variables, along with their pros and cons, and provides examples to illustrate cases where SDTM variables are carried over into ADaM. The goal is to enhance traceability and improve the quality of the analysis package before submission to regulatory authorities.
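As a toy illustration of the DTYPE convention the paper discusses, the sketch below is a minimal pandas example; the dataset, the values, and the choice to leave carried-over variables unpopulated on the derived record are assumptions for illustration, not the paper's recommendation.

```python
# A minimal sketch, assuming a BDS-style analysis dataset: add a derived LOCF
# record flagged with DTYPE, leaving SDTM-carried variables (here the ADT
# stand-in) unpopulated so only collected records trace directly back to SDTM.
import pandas as pd

adlb = pd.DataFrame({
    "USUBJID": ["001", "001"],
    "PARAMCD": ["ALT", "ALT"],
    "AVISIT": ["Baseline", "Week 4"],
    "AVAL": [30.0, 34.0],
    "ADT": ["2025-01-01", "2025-01-29"],
    "DTYPE": [None, None],          # collected records: DTYPE is null
})

# Subject 001 missed Week 8: carry the last observation forward as a new record.
last = adlb.sort_values("ADT").iloc[-1]
derived = {
    "USUBJID": last["USUBJID"],
    "PARAMCD": last["PARAMCD"],
    "AVISIT": "Week 8",
    "AVAL": last["AVAL"],           # value carried forward
    "ADT": None,                    # no collection date for a derived record
    "DTYPE": "LOCF",                # flags the record as derived
}
adlb = pd.concat([adlb, pd.DataFrame([derived])], ignore_index=True)
print(adlb)
```

Other retention strategies the paper weighs, such as carrying xxSEQ or dates onto derived records, would change only how the derived row is populated.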
Designing effective data collection instruments involves much more than simply adding questions and responses to a form. In this session, we will explore strategies for creating fit-for-purpose, standardized data collection instruments utilizing a helpful checklist of references, real-life applications, data collection principles, and tips. We start with the end in mind: ensuring we are collecting data to support analyses, mocking up data to share a story with the study team and reviewers, and gathering cross-functional stakeholder feedback.
In the context of data collection, we recommend adopting lean design principles thereby reducing site burden. By embedding quality into the design phase, we minimize errors and improve the overall data integrity, ensuring collected data adequately serves the study's objectives.
As Data Standard SMEs, we share our knowledge of best practices and industry standards, empowering the team to make informed decisions and enhance their understanding of data collection requirements.
Symphony III (Lobby Level)
The 2025 AI Innovation Challenge: A global call to vendors, researchers, and innovators to create AI/ML-driven solutions that advance the digitization and automation of clinical research using CDISC Standards. During this session, the winner and runner-up for each of the Challenge's three use cases will present their solutions.
Interested in joining the Challenge? Get a comprehensive overview of the Challenge's objectives, timeline, and three targeted use cases, ranging from protocol design to metadata traceability throughout the lifecycle. You will hear directly from CDISC leaders about how your solutions can help shape the future of standards-driven research and earn a spotlight at the 2025 CDISC US Interchange.
Faro’s AI-powered protocol digitization solution transforms static clinical study protocols into structured, reusable data that drives efficiency across the drug development lifecycle.
Zifo’s solution transforms protocols from static documents into reusable, machine-readable assets - accelerating study design, standardization, and reuse.
The Smart BC suite, submitted by Saama, is a sophisticated framework designed to extract, standardize, and link Biomedical Concepts from clinical documents using an advanced AI-driven approach.
Lindus’ BC Registry Framework registers new BCs, checks against local registries and CDISC Library for matches, then falls back to an advanced NCIT ontology search.
Merck’s solution is an open-source traceability engine that automatically pulls together study files like protocols, CRFs, SDTM, ADaM, and TLFs. The engine then builds one clear “lineage graph” that shows how results, variables, and endpoints connect back to where they came from.
Zifo establishes end-to-end traceability across trial artifacts by applying AI-driven metadata extraction and CDISC Standards, enabling dependency queries and impact analysis from study design through statistical outputs.
Starstruck (Mezzanine Level)
This session will explore the newly launched Investigator Site File (ISF) Reference Model, released for public review in July 2025. Developed in alignment with TMF Reference Model 3.3.1, the ISF RM provides a standardized framework to organize and manage essential documents at the site level. Attendees will gain an understanding of the model’s structure and key components, as well as insight into plans for integrating the ISF Standard under TMF RM Version 4. The session will also highlight ongoing engagement efforts with investigator sites, including valuable feedback received during development. Additionally, Matt Lowery will share his site perspective, practical experiences, and why the ISF RM was needed. Join us to learn how this new model supports harmonization between sponsors, CROs, and sites—promoting efficiency, consistency, and inspection readiness across the clinical research ecosystem.
The adoption of electronic Trial Master Files (eTMFs) in Real-World Data (RWD) studies presents unique challenges compared to traditional clinical trials. While eTMFs offer benefits in organization and accessibility, the nature of RWD—often derived from disparate sources like EHRs and registries—introduces complexities in document collection, standardization, and quality control. Unlike protocol-driven documentation, RWD studies require a flexible approach to defining essential documents and their lifecycle. This includes less frequent but more targeted Quality Control (QC) reviews, focusing on data lineage and contextual relevance rather than strict timelines.
A configurable eTMF, emerging standards and the power of AI can be leveraged to overcome many of the specific challenges of RWD studies with notable application in the following areas:
- Adaptability: Accommodating evolving data and analysis
- Efficiency: Streamlining document workflows using AI
- Compliance: Meeting regulatory expectations despite RWD complexity
- Alignment: Standardizing documents via RWS-DI and TMF v4
This session explores the transformation of Trial Master File (TMF) management in the digital era, focusing on how organizations can balance essential records management with the adoption of modern digital solutions enhancing compliance and operational efficiency. Attendees will gain insights into regulatory drivers such as the EU Clinical Trials Regulation (EU-CTR) and the evolving standards for TMF documentation in the next version of the TMF Reference Model. The presentation highlights best practices for integrating Regulatory Information Management (RIM) systems and TMF for trial applications, ensuring interoperability, and leveraging automation to enhance data integrity and streamline clinical trial processes. Practical examples and interactive elements will provide participants insights on TMF digital transformation and the future of the TMF Reference Model in today’s regulatory landscape.
Blackbird Studio A&B (Mezzanine Level)
In clinical trials, the TMF serves as the backbone of regulatory compliance. Far beyond technology and services, the success of a TMF program hinges on the underlying cultural dynamics. This presentation, a collaboration between Intellia and Cencora, focuses on the critical role TMF culture plays in shaping effective sponsor/CRO/vendor relationships and ensuring the operational success of a sponsor's TMF program.
Our session will discuss four key strategies for fostering an environment where all stakeholders—Sponsors, CROs, and Vendors—are aligned and motivated towards common goals.
These four key strategies are:
- Ensuring Operational Partnership Success
- Cultural Missteps and Lessons Learned
- Culture and Accountability
- Engagement and Relationship Building
By weaving together these strategies, our presentation seeks to illuminate how to make an impact through TMF culture within organizations. We aim to provide attendees with actionable insights, drawn from our diverse experiences, to enhance their own TMF culture.
Until recently, there was no active community of TMF professionals in Japan, and each company struggled on its own to manage challenges in TMF operations. To address this, the CDISC Japan User Group (CJUG) launched a TMF team in January 2025, bringing together 20+ TMF professionals from pharma, CROs, tech service providers, consultancies, and academia. We focus on four key challenges: ICH E6(R3) vs. the TMF, sponsor-CRO collaboration, TMF in academia, and TMF RM penetration in Japan.
One of the practical challenges for Japanese pharma companies and CROs is managing Japan-specific regulatory requirements while complying with global standards from the EMA, MHRA, and FDA. This presentation highlights a case study on safety information documentation in Japan, which uniquely requires IRB review for study continuation rather than just the PI's review as in the EU/US. This process creates additional documentation and potential duplication, so the CJUG TMF team aims to streamline TMF management practices while fostering greater integration with the global TMF community.
- Matt Lowery, MGH
- Colleen Butler, Syneos
- Steph Viscomi, Apellis
Symphony I (Lobby Level)
Artificial intelligence is reshaping clinical trial operations, particularly the build phase. This session offers a pragmatic blueprint for integrating AI into study build workflows, emphasizing safe, effective adoption using CDISC standards as the foundation for efficiency. We’ll examine how advances in AI, the maturation of CDISC standards, and technology capabilities are converging to enable protocol-to-study automation, without losing sight of the importance of human oversight. Attendees will gain insight into how AI can reduce study build timelines while enhancing quality and consistency. We’ll address key risks such as AI bias, validation, regulatory compliance, and implementation challenges, and present a phased adoption roadmap, real-world use cases, and practical success factors. Participants will leave with actionable insights for strategic planning, vendor selection, and responsible AI deployment, ensuring rigorous quality and safety while amplifying human expertise.
The digital transformation of clinical research is redefining how protocols are developed, managed, and executed across trials. A digital protocol can empower research teams to design studies with an increased probability of success by integrating real-time analytics and streamlining collaboration among stakeholders. This paradigm shift from documents to data enables new efficiencies in study startup, faster and more representative patient recruitment, automation, and superior data quality throughout the clinical trial lifecycle.
Join this session to explore how your organization can embrace this transformation. Learn about implementation options, common challenges, and tools to help you succeed.
Symphony II (Lobby Level)
A robust standards metadata repository lays a strong foundation upon which automation and other cycle-time reductions can be built. AstraZeneca is in the midst of transforming its delivery of Study Instance Metadata through the implementation of a standards metadata repository, moving away from Excel spreadsheets. It was important to have a system that provided the study instance metadata in a machine-readable format that could easily be consumed by other systems to enable automation and data re-use. We undertook a phased approach, first migrating global clinical data standards into the metadata repository and then introducing new practices for building study instance metadata. We have laid a strong foundation that will allow consistent re-use of metadata and metadata-driven automation across the organization.
Comprehensive data collection from underrepresented populations is essential to advancing health equity in clinical research. Historically, these groups have been underrepresented, leading to gaps in regulatory submissions and scientific generalizability. This presentation evaluates the impact of broadening data acquisition to include diverse demographic, socioeconomic, and clinical variables, aligned with CDISC standards for standardized data representation. Incorporating these enriched datasets enhances trial validity and regulatory readiness. Additionally, by leveraging CDISC compliant frameworks, clinical trial teams can accelerate the integration of inclusive data elements and improve the timeliness and quality of reporting. This data-driven approach fosters more representative clinical evidence, ultimately driving better health outcomes and regulatory compliance.
The biopharmaceutical industry’s digital transformation is accelerating, yet collaboration among sponsors, regulators, CROs, and technology partners remains fragmented by siloed systems and static documents. Trusted Regulatory Spaces (TRS) introduce a new model for secure, cloud-native collaboration across the product lifecycle. Designed to enable real-time data exchange and communication, TRS supports structured authoring, regulatory validation, and protocol development while leveraging standards such as HL7 FHIR, XML, and JSON for interoperability. By aligning with initiatives like TransCelerate and CDISC’s Digital Data Flow (DDF), TRS delivers end-to-end automation, built-in compliance, and a scalable foundation for digital transformation. Proven through successful implementations—including ICH M11 protocol execution and DDF pilots—TRS demonstrates how structured, data-driven workflows can replace document-centric processes.
What You’ll Learn:
- How Trusted Regulatory Spaces (TRS) enable secure, real-time collaboration across sponsors, regulators, and partners
- The role of TRS in supporting Digital Data Flow (DDF) and frameworks like ICH M11 using HL7 FHIR, XML, and JSON
- Real-world examples, lessons learned, and strategies for implementing TRS within your organization
Symphony III (Lobby Level)
The FDA's Real-Time Oncology Review (RTOR) program accelerates the review of oncology clinical trials by allowing for the early submission of top-line results and datasets. However, RTOR requires meticulous preparation and adherence to the FDA's specifications, particularly concerning the submission of comprehensive Analysis Data Model (ADaM) datasets based on data specifications provided by the FDA Oncology Center of Excellence (OCE) and Office of Oncologic Diseases (OOD) Safety Team.
This presentation delves into some of the nuances and challenges related to the implementation of these specifications, focusing on the adaptation of ADaM dataset requirements for RTOR. A key area of focus will be the non-standard Adverse Events Analysis Dataset for Cytokine Release Syndrome (CRS) and Neurotoxicity (NT) – ADCRSNT.
In a significant collaboration, seven leading vaccine companies have formed the Vaccines Industry Standards Group (VISG) which has focused on harmonizing interpretations of regulatory submission guidance and CDISC data standards and sharing regulatory feedback. The group recognizes that aligning the understanding of requirements — such as participant diary data collection and the submission of reactogenicity and efficacy data — accelerates time to market and benefits global health. This unified approach could facilitate future collaboration with Health Authorities and CDISC, aiming to update the CDISC Vaccines TAUG to meet current Health Authorities' expectations, thereby ensuring clarity and consistency in submission standards. Our collaborative model can serve as a blueprint for other therapeutic areas within the pharmaceutical industry, demonstrating how organizations can work together to streamline regulatory processes while maintaining a competitive edge in product innovation.
This paper aims to present an overview of data traceability flow, data standard requirements, and lessons learned during the preparation of the Study Data Submission Plan (SDSP), tabulations, analyses, and BIMO package submissions. It will address key differences between agencies and their divisions, focusing on the FDA (CBER and CDER), PMDA, and NMPA. Special emphasis will be placed on guidance requirements, including TCG, technical specifications, rejection criteria, and validation rules, as well as the importance of documenting these elements. Study start dates will influence the implementation of data standards and their implications for submissions. We will highlight the differences between the SDSP and the SDSP Appendix, as well as the size requirements for SAS XPT files. These documents are subject to ongoing changes, which impact submissions throughout the product development lifecycle.
Starstruck (Mezzanine Level)
In this session we will introduce a proposal developed under the TMF RM V4 initiative to define the core record types for computerized systems used in clinical trials. These records are those that sponsors and their service providers should consider essential for inclusion in the Trial Master File (TMF), supporting both oversight activities and regulatory submissions.
The proposal reflects inputs from a multidisciplinary consultation and is aligned with the guiding principles of the TMF RM V4 initiative and CDISC standards. It is designed to meet known regulatory expectations while offering a practical and adaptable approach for diverse sponsor organizations and trial needs.
Additionally, we will present a template for documenting the trial system inventory, aimed at enhancing inspection readiness and facilitating efficient retrieval of both trial-specific and enterprise-level records.
The presentation explores the evolution and standardization of metadata within the CDISC TMF Reference Model (Version 4). It emphasizes how metadata—structured information describing documents and records—forms the backbone of efficient, compliant Trial Master File (TMF) management. The session explains what is changing in V4, covering record types, levels, and hierarchy metadata. It will then go behind the curtain at CDISC to show which working groups are meeting to carry out this important job. Come for a fun, informative session on how metadata can ensure consistent, user-friendly data practices across sponsors, CROs, and vendors.
Although 2027 may seem far off, the TMF Reference Model Version 4 development is well underway, making now the ideal time to start preparing for implementation and change management. The transition to TMF Reference Model V4 will bring significant updates that impact TMF standards, processes, and technology. Engaging in early preparation will help your organization align resources, assess readiness, and plan for a smooth transition.
This session will guide you through key considerations for planning and implementing TMF V4 changes. You will learn about the major updates introduced in TMF V4, strategic approaches to developing an organizational roadmap, and practical steps for managing change across teams. You’ll leave with actionable insights to begin shaping your TMF V4 readiness plan and position your organization for a smooth transition well before the planned 2027 go-live. Join us to explore what’s ahead and prepare for the future of the TMF Reference Model.
Blackbird Studio A&B (Mezzanine Level)
This presentation will share how Beacon Therapeutics, a mid-sized biotech company, ensures high-quality and compliant oversight of the Trial Master File (TMF). Although many TMF Quality Control (QC) tasks are often outsourced, Beacon remains accountable for inspection readiness and regulatory compliance. From the perspective of the Sr. Clinical Documentation Manager, the session will cover practical strategies, governance frameworks, defined roles, SOPs, vendor oversight, and escalation pathways. The focus will be on proactive, data-driven QC oversight using key metrics such as rework frequency and common errors. The presentation will show how regular QC reviews, spot-checks, and vendor monitoring can help improve quality over time and identify areas needing support or training. It will also touch on effective communication, audit-ready documentation, and governance routines. Attendees will leave with practical strategies and insights to implement a consistent, compliant, and inspection-ready TMF QC oversight model across studies.
The partnership between Sponsors and External Service Providers (ESP/CRO) is vital for effective TMF management. A proactive approach to defining TMF responsibilities, ESP deliverables, and associated costs early, during the RFI, RFP, or contracting phase, enhances operational efficiency and drives inspection-readiness improvements.
Misaligned expectations and poorly defined budget assumptions often lead to downstream inefficiencies. Clearly defining scope and budget helps prevent miscommunication, compliance risks, and costly change orders.
Sponsors should focus on understanding how ESPs will manage the TMF and what services are included. In turn, ESPs should maintain transparency on their pricing models and TMF capabilities.
Standardizing definitions and cost structures across partnerships minimizes redundancy, improves governance, and ensures TMF quality. Leveraging this unified framework, grounded in mutual agreement, the TMF plan delineates the defined responsibilities and accountability for all TMF-related activities.
Ultimately, early budget alignment strengthens partnerships and drives a more efficient, inspection-ready TMF management model.
This presentation highlights how a standardized reporting solution transformed Gilead’s TMF completeness process within four months. The initiative addressed key challenges including heavy manual effort, inconsistent processes, complex CRO portfolios, and ongoing inspection readiness risks. An interim reporting framework was developed that integrates milestone-driven automation, governance documentation, and industry best practices aligned with the TMF Reference Model. The result was real-time completeness reporting, providing proactive oversight and consistent execution across studies. The program’s success established a scalable foundation for future automation, CTMS integration, and sustained inspection readiness supported through managed services and data-driven monitoring.
Symphony Foyer (Lobby Level)
Symphony I (Lobby Level)
The CDISC 360i initiative was launched to realize CDISC’s mission of creating connected standards across the study information lifecycle - enabling accessible, interoperable, and reusable data to support more meaningful and efficient clinical research. The program aims to deliver a complete, metadata-driven study package: from protocol design to submission, including test data, executable tools, and an automated pipeline for generating analysis results.
In this presentation, we will showcase how OpenStudyBuilder is integrated into the CDISC 360i ecosystem. As one of the key components in this collaborative initiative, OpenStudyBuilder functions as a central clinical metadata and study definition repository. We will highlight its role in supporting end-to-end automation and standardization, and how it interfaces with other tools and solutions within the 360i framework. Join us to learn how OpenStudyBuilder contributes to making the CDISC 360i vision a reality - and how its integration helps pave the way for a more streamlined and interoperable future in clinical research.
In January 2025, a CDISC working group was established to define and model Analysis Concepts, aiming to enhance automation in clinical research as part of the CDISC 360i initiative. While the Unified Study Definition Model (USDM) standardizes key elements like study objectives and endpoints, there is a gap in structured metadata for derived and analyzed data. Analysis Concepts aim to fill this gap by providing a standardized framework for translating clinical questions into analytical outputs, supporting the creation of a digital Statistical Analysis Plan (eSAP), and informing data collection and analysis programming. This work aligns with CDISC's Analysis Results Standards (ARS), which describe statistical results and their derivation. Analysis Concepts focus on clinical intent and analytical approach, serving as upstream metadata for the ARS framework. The presentation will cover the rationale, key components, early models, and future directions for standardizing analysis planning.
With growing emphasis on standardized analysis results data (ARD) to improve regulatory compliance and data reuse, organizations must navigate complex decisions when adopting CDISC’s Analysis Results Specification (ARS). This presentation shares our ongoing journey of evaluating ARS capabilities against operational requirements for flexible, reusable analysis results.
Our objective was to enable a ‘derive once, render many’ approach—where a single set of analysis results supports multiple output formats, from in-text and post-text summaries to dynamic dashboards and publications. Achieving this within ARS presents unique challenges that we aim to address.
This presentation details our decision-making framework and evaluation process, weighing immediate operational flexibility against future-proofing for potential ARS regulatory requirements. We'll share practical insights on ARS implementation challenges we've identified, potential workarounds for the single-output constraint, and lessons learned from balancing innovation with standards compliance.
Symphony II (Lobby Level)
Version 3.0 of the Study Data Tabulation Model is now in review prior to publication. This session reviews the changes that have taken place in the model since the release of version 2.0, which accompanied SDTMIG 3.4.
Definitions are now present for all variables, and model information about variables has been expanded to include variable groups and tables of variable relationships. A new structure for non-standard variables has been developed, replacing SUPPQUAL datasets. A special-purpose domain (DC) has been added to support multiple participations. We will identify variables that have been added to the model or changed since the previous release.
Version 4.0 of the SDTM Implementation Guide is now in review prior to publication. This session reviews the changes that have taken place in the SDTMIG since the release of version 3.4.
The introductory sections about the fundamentals of SDTM, conformance, and general assumptions have been updated and re-organized. Metadata tables have been restructured, variable definitions have been added, and documentation of variable relationships has been standardized. Non-standard variable datasets (NS--) have replaced supplemental qualifier datasets. New domains include DC (for multiple participation), EA (for event adjudication) and GI (for gastrointestinal system findings). We will also identify significant changes to variables and within existing domains since the previous release.
The Demographics for Multiple Participations (DC) domain, introduced with SDTMIG v4.0, addresses multiple subject participations within a single clinical trial.
This presentation traces the eleven-year history of DC, which involved collaboration with industry and regulatory authorities. DC supports the SEND and SDTM standards but excludes site transfers. CDISC's approach diverges slightly from the FDA Technical Conformance Guide, which will be covered in greater detail.
DC complements the existing Demographics (DM) domain, which remains largely unchanged but now includes additional variables like CRACE and CETHNIC. DC is required when at least one subject has multiple participations, capturing detailed data for each subject, but is otherwise prohibited. DC also includes FOCID, supporting study designs involving distinct treatments applied to multiple parts of the body, and DCSEQ, aligning with other special-purpose domains that allow multiple records per subject. Sponsors are able to detail the specifics of variable population strategies in submission documents such as the Define-XML and cSDRG.
Symphony III (Lobby Level)
The life sciences industry stands at a critical juncture, as astonishing technological advancements with Artificial Intelligence (AI) are opening doors to exponentially accelerate and optimize legacy processes. In parallel, organizations like CDISC and TransCelerate have been instrumental in developing crucial data standards, which are essential for driving efficiency and innovation in clinical research.
There are evident synergies between standards and automation in clinical trials, but are we truly leveraging both? Large Language Models (LLMs) and Machine Learning (ML) can augment the valuable work already being done in standards development.
This presentation aims to foster a dialogue about how the industry can responsibly incorporate AI to evolve standards and ultimately continue to meet the evolving needs of clinical research, facilitating faster drug development, more efficient trials, and improved patient outcomes.
Decentralized Clinical Trials (DCTs) are rapidly transforming the clinical research landscape, shifting the focus from sites to participants, and technology is the engine behind this shift. This session explores how digital tools and infrastructure are enabling faster, more inclusive, and more efficient trials.
Today’s DCTs rely on a connected ecosystem that integrates wearables, telehealth, mobile apps, and e-consent with backend systems in real time. Data from smartwatches and home devices flows securely to cloud platforms and EHRs, automatically structured into CDISC-compliant formats for seamless downstream use. Edge computing enables near-real-time processing, while AI-powered algorithms support quality checks, risk-based monitoring, and early anomaly detection.
The impact is measurable: DCTs have been shown to improve patient retention by 15% (Medidata, 2021), and McKinsey estimates they can reduce trial timelines by 25%. With 80% of sponsors planning DCT adoption by 2027 (Everest Group), this approach is moving from experimental to essential.
Artificial Intelligence (AI) is transforming clinical trial research, and this presentation explores its integration into the CDISC Open Rules project. I will demonstrate a custom-trained GPT-powered chatbot embedded in the CDISC Open Rules editor, designed to assist users in creating and validating CDISC Open Rules efficiently—without programming expertise.
The session will cover the chatbot’s architecture, highlighting prompt engineering’s role in generating high-quality outputs. I’ll guide attendees through document preparation for accurate, context-aware results and showcase real-world examples of rule creation and validation. The AI-driven chatbot automates rule drafting, test data generation, and interactive support, reducing manual effort while enhancing efficiency.
This presentation will inspire attendees—whether data managers, statisticians, or other clinical trial professionals—to embrace AI-driven solutions in their work. By demonstrating a practical application of a GPT-powered chatbot, I’ll provide actionable insights to enhance workflows, improve efficiency, and foster innovation.
Starstruck (Mezzanine Level)
This presentation explores the meaning and practical application of a risk-based approach to Trial Master File (TMF) management. It introduces the TMF Risk Initiative, which aimed to foster greater understanding and consistency in risk-based TMF practices. The initiative produced a white paper, toolkit, and training resources to guide organizations in defining essential TMF content, quality control and documentation requirements. The presentation emphasizes that risk is multifactorial, shaped by trial design, technology, and organizational context. It advocates for tailored risk management strategies, periodic risk reviews and leveraging technology for quality and compliance. Key takeaways include the importance of regulatory alignment, proactive risk mitigation, and continuous improvement to ensure TMF completeness, integrity, and inspection readiness.
Regulators around the globe are encouraging Sponsors and related stakeholders to take a risk-based approach to support their clinical research work. But what does this look like and how should you apply, document and manage this process? The TMF Reference Model Risk initiative developed a tool to support Risk Management for the TMF. This session will introduce the tool and discuss how to use it as well as discuss how to develop a plan for mitigation of risk.
As ICH E6(R3) reshapes expectations for clinical trial oversight, risk-based TMF review must evolve beyond document completeness. True compliance risk isn’t revealed by counting files—it’s discovered through data and context. This session explores how sponsors and CROs can integrate Critical to Quality (CTQ) factors and operational insights to identify where oversight truly matters.
Through practical examples, we’ll show how leading teams are reframing inspection readiness and using study-specific data to confirm accountability, detect emerging risks, and demonstrate meaningful oversight. Attendees will gain actionable strategies and tools—such as the TMF Risk-Based Review Spreadsheet—to move from static checks to dynamic, quality-driven review.
Join us to learn how to build a defensible story of compliance that reflects not just task completion, but the intent, quality, and ownership regulators expect under ICH E6(R3).
Blackbird Studio A&B (Mezzanine Level)
Audit trails in the electronic Trial Master File (eTMF) are often viewed as a basic compliance requirement—but when used strategically, they offer far greater value. Beyond confirming workflows, they provide visibility into system use, team behaviors, and potential areas for process improvement. The EMA’s 2023 guidance emphasizes the importance of secure, time-stamped audit logs and encourages regular review of user activity to ensure data integrity and system oversight. Audit trails can help detect issues early, long before they escalate into compliance or inspection risks. This presentation will explore how audit trail data can be used not just to meet regulatory expectations, but to strengthen documentation quality, enhance team engagement, and support inspection readiness. The key takeaway is this: audit trails are not just for compliance, they’re tools for visibility, insight, and continuous improvement.
The electronic Trial Master File (eTMF) is evolving from a static archive into a critical, active operational tool to manage increasingly complex clinical trials. Currently, TMF content remains fragmented across disparate eClinical systems, leading to compliance risks, duplicated effort, and inconsistent data. This presentation discusses the need for clinical operations to require the eTMF to function as an "intelligent," integrated repository. Achieving true interoperability demands connecting systems via shared metadata standards and standardised integrations. We will examine how industry standards provide the structural and technical framework necessary to bridge system gaps. By leveraging these standards, organisations can ensure TMF completeness, timeliness, and quality in real-time. Attendees will gain practical strategies for turning the eTMF into a living, integrated driver of trial governance, efficiency, and regulatory success.
- Jamie Toth, BeOne Medicines USA, Inc.
- Bryan Souder, Merck
- Deb Wells, Novartis
Symphony I (Lobby Level)
Join us for a 360i Roundtable Discussion including an overview and interactive roundtable collaboration on specific topics critical to the 360i initiative and transformation to digital standards. Topics will include:
- Identifying the Value Proposition and Elevator Pitch for Connected Standards
- Developing and Leveraging Analysis Concepts
- Accelerating Tool Development to Enable Digital Standards
- Understanding and Accelerating the Development and Use of Biomedical Concepts
- Leveraging AI to Help Develop and Use the Connected Standards
Attend, engage, and provide your input into the 360i future!
Symphony II (Lobby Level)
FDA guidance requires Real World Data (RWD) submissions to follow formats listed in the FDA’s Data Standards Catalog, currently CDISC SDTM. However, transforming RWD—collected from billing and clinical care sources like claims and electronic health records—into SDTM is challenging. SDTM is designed for clinical research, and this complex transformation often causes data loss and unquantifiable biases, making RWD unreliable for regulatory use. To address this, CDISC’s RWD Lineage project is developing a standardized metadata model that provides detailed lineage and traceability for each RWD data point. This approach enables auditing of source data and assessment of data transformation quality. By standardizing lineage across diverse data sources, the project aims to unify tools, workflows, and analytics, supporting validation and regulatory audits. This presentation will share the latest progress and findings from the RWD Lineage team, highlighting efforts to improve the reliability and regulatory acceptance of RWD through enhanced data transparency and traceability.
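For readers unfamiliar with the idea, the following sketch is purely illustrative; the field names are hypothetical and do not represent the RWD Lineage project's actual metadata model. It shows what per-data-point lineage metadata might capture so that a transformed SDTM value can be traced back to its real-world source:

```python
# A toy illustration (not the RWD Lineage team's model) of per-data-point
# lineage metadata. All field names here are hypothetical.
from dataclasses import dataclass

@dataclass
class LineageRecord:
    source_system: str     # e.g., an EHR or claims feed
    source_field: str      # original field in the source extract
    transformation: str    # rule applied during SDTM conversion
    target_domain: str     # SDTM domain receiving the value
    target_variable: str   # SDTM variable receiving the value

example = LineageRecord(
    source_system="hospital_ehr",
    source_field="encounter.admit_date",
    transformation="ISO 8601 reformat; imputed day dropped",
    target_domain="SV",
    target_variable="SVSTDTC",
)
print(example)
```

Recording the transformation alongside source and target is what makes auditing a conversion, and quantifying its losses, tractable.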
As digital health technologies like wearables increasingly capture clinical trial data, sponsors face challenges in integrating and mapping this data to CDISC standards for regulatory submissions. The diversity of data sources, high granularity, large volumes, use of software algorithms, and inclusion of supportive contextual data complicate this process. Current CDISC guidance offers limited examples, making standardization difficult. Some data may not fit clearly into existing SDTM domains, while other data presents challenges related to how it’s collected or reported by devices or vendors. This presentation will examine these challenges in mapping digital health technology (DHT) data and outline Pfizer’s approach to determining target mappings. It will also share practical examples illustrating how Pfizer navigates the complexities of aligning diverse DHT data with CDISC standards to support regulatory compliance and data integrity.
The growing use of Real-World Data (RWD)—from electronic health records and claims—is transforming regulatory submissions for drugs and biologics, especially as digital health technologies expand. However, Real-World Evidence (RWE) derived from RWD presents challenges for regulatory reviewers due to missing contextual variables not captured under study protocols. This paper outlines key issues and proposes solutions using existing standards for observational and healthcare delivery data. It focuses on enhancing CDISC SDTM, particularly the Subject Visits (SV) domain, to better support RWE submissions. By examining standards like HL7 FHIR, OMOP, and Sentinel, the authors suggest updates to represent patient encounters with added context—such as provider identity, care site, and encounter details. These enhancements aim to improve the interpretability and reliability of RWD, reduce bias, and support regulatory analysis. The recommendations also contribute to ongoing efforts by standards development organizations to harmonize real-world data with randomized clinical trial frameworks.
Symphony III (Lobby Level)
This presentation walks through the use of the CDISC Library API to obtain and process data standards metadata. Many organizations still rely on manual Excel-based processes instead of direct API integration into systems like Python, Java, SAS, or R.
By the end of the presentation, you will understand how to retrieve CDISC standards metadata in JSON format via API calls, navigate through the hierarchical JSON structure and transform it into tabular, relational references.
The solution requires virtually no knowledge of the standards content or model, relying instead on the nature of the JSON content to generate an integrated, self-defined structure. The result, while similar in general structure to the Excel representation, fully represents the richer, more integrated model available from the CDISC Library.
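As a minimal sketch of the workflow described above, assuming the public CDISC Library API base URL and a response shape of product → classes → datasets → variables (verify both against the current API documentation for your account), the following Python retrieves one implementation guide and flattens it into relational rows:

```python
# A minimal sketch (not the presenter's actual code) of pulling CDISC Library
# metadata as JSON and flattening it into tabular rows. The endpoint path and
# response field names below are illustrative assumptions.
import requests

BASE = "https://library.cdisc.org/api"       # CDISC Library API base URL
HEADERS = {"api-key": "YOUR_API_KEY",        # issued with a Library account
           "Accept": "application/json"}

def get_json(path: str) -> dict:
    """Retrieve one Library resource as a parsed JSON document."""
    resp = requests.get(f"{BASE}{path}", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def flatten_variables(ig: dict) -> list[dict]:
    """Walk the hierarchical JSON (product -> class -> dataset -> variable)
    and emit one flat, relational row per variable."""
    rows = []
    for cls in ig.get("classes", []):
        for ds in cls.get("datasets", []):
            for var in ds.get("datasetVariables", []):
                rows.append({
                    "class": cls.get("name"),
                    "dataset": ds.get("name"),
                    "variable": var.get("name"),
                    "label": var.get("label"),
                    "role": var.get("role"),
                })
    return rows

if __name__ == "__main__":
    ig = get_json("/mdr/sdtmig/3-4")   # hypothetical example: SDTMIG v3.4
    for row in flatten_variables(ig)[:5]:
        print(row)
```

The flattened rows drop straight into a data frame or relational table, which is the "tabular, relational references" step the abstract describes.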
The term omics refers to fields of research that use large scale sets of bioinformatics data to identify, describe, and quantify the entire set of molecules and molecular processes that contribute to the form and function of cells, tissues and organisms. Types of omics, such as genomics and transcriptomics, are a fundamental part of clinical research and contribute to the development of precision medicine approaches to improve patient outcomes. Currently, CDISC develops and maintains comprehensive standards for the representation of omics data and is part of a connected standards landscape that supports clinical research.
This presentation will provide:
- A brief introduction to omics and supporting CDISC standards
- An overview of CDISC and related standards in clinical research
- A reference guide to CDISC standards and resources for the community
Harmonizing data from multiple studies to develop Integrated Summaries of Safety (ISS) is a cornerstone of pharmaceutical regulatory submissions, yet it remains a complex and resource-intensive process. Open-source tools, particularly those aligned with CDISC standards, have emerged as innovative solutions to address these challenges. This case study examines the potential of open-source tools such as sdtm.oak and admiral to streamline ISS programming.
Through the analysis of hypothetical scenarios and real-world examples, this paper investigates how open-source options enable consistent data formatting, reduce manual effort through automation, and improve efficiency without compromising quality. The study identifies key benefits, including enhanced productivity and scalability, and highlights the challenges encountered during implementation.
The findings aim to provide actionable insights into leveraging open-source tools for ISS integration, contributing to the growing body of knowledge on innovative practices in clinical analytics and regulatory submissions.
Starstruck (Mezzanine Level)
In the world of music, remixes take old tracks and add new energy, styles, and layers. TMF migrations are similar: you start with raw, unprocessed data and transform it into something efficient, structured, and ready for the future using technology and standardized processes.
The presentation will explore the strategic, technical, and operational considerations involved in TMF migration projects. Key topics will cover the end-to-end process, including pre-migration planning, validation, and stakeholder communication, while aligning with regulatory requirements. Drawing insights from case studies, the session will discuss challenges and risks, lessons learned, and effective strategies for successful migration. There will be a focus on technology and how it can be harnessed for areas such as data mapping to reduce manual workload. The presentation will also highlight the importance of industry standards in promoting interoperability, enabling consistent, seamless, and stress-free migration.
When acquisitions occur, TMF integration teams are often faced with the perfect storm: different TMF processes, regulatory pressures with impossible deadlines, or TMFs held in multiple or different systems. Based on information provided upfront such as study status and document location, count, format, and metadata, decisions are made on transfer methods, management of scattered records, and deployment of risk-based approaches. Skilled navigation is required when charting acquisition courses, especially with different TMF structures where the standardized Reference Model isn’t used to guide the way. Strategic voyage planning and collaborative teamwork between the divesting company and internal crews is vital to ensure safe passage through migration waters. This presentation will share experiences from an actual acquisition voyage and lessons learned about upfront questioning, content location and managing unexpected complications. We’ll discuss tools and steps taken to rescue problematic migrations, turning a TMF chaotic storm into controlled success.
Clinical trial regulations and guidelines are becoming stricter about the requirement for sponsors to retain essential records and documents. The EU CTR and, come April 2026, the MHRA require the retention of the TMF for a minimum of 25 years. Additionally, sponsors must retain data in line with ALCOA+ and ICH E6(R3) for the entire data lifecycle.
In this session, Arkivum CEO Chris Sigley will outline strategies for meeting these retention and compliance requirements, including:
- Consolidating long-term records and data to combat data sprawl.
- Applying a risk-based approach to determine what to retain and how to preserve it.
- Maintaining data integrity over time in line with ALCOA+ principles.
- Leveraging digital preservation to ensure data remains readable and usable.
Attendees will gain a clear understanding of best practices for retaining essential records—from strategic planning to execution.
Blackbird Studio A&B (Mezzanine Level)
Pfizer (Von’Diza Flix, TMF Operations Lead, Sr. Manager) and Veeva Systems (Neharika Ramani, Manager, Customer Success) will share how Pfizer is embarking on a journey to modernize their TMF operating model.
We will share the journey of implementing this modern strategy, focusing on practical steps for change management, process improvement, and leveraging technology that can be applied to any clinical transformation. Attendees will learn how a unified TMF approach enables better data quality, faster document access, and stronger team alignment—all critical elements for accelerating trials and achieving broader clinical development goals. Discover how modernization isn't just about technology, but about creating a simplified, connected, and ultimately more efficient clinical research ecosystem.
As clinical trials become increasingly complex, the need for actionable insights to guide Trial Master File (TMF) decisions grows. This session offers a strategic perspective based on years of collaboration with sponsors, CROs, and biotechs. Drawing from extensive data analysis, it highlights key trends, patterns, and metrics that enhance team performance and support inspection readiness. Attendees will explore industry-wide TMF health benchmarks—covering quality, timeliness, and completeness—and learn to distinguish impactful metrics from distracting ones. The session emphasizes how data storytelling can drive compliance and informed decision-making. By leveraging visibility at scale, organizations can shift TMF operations from reactive to resilient. Sharing metrics not only reflects performance but also fosters trust and alignment across teams and partnerships.
Lyric Room (Lobby Level)
Is it possible to interact with the CDISC Library API using natural language in plain English? The answer is yes. This paper introduces a web application enabling users to query ADaM/SDTM variable information (metadata and codelists) through natural language by leveraging AI-driven Natural Language Processing (NLP).
The underlying logic involves initially creating a SAS macro to extract ADaM/SDTM variable and codelist information from the CDISC Library API, followed by converting this SAS code into Python while preserving its original functionality. This Python script serves as the application's core logic, interfaced with AI to manage both input and output. The code is open-sourced via GitHub. During this conference, the step-by-step implementation and practical applications of CDISC Genius will be demonstrated.
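As a highly simplified stand-in for the approach described (illustrative only; CDISC Genius's actual NLP layer and code differ, and all names below are hypothetical), the sketch routes a plain-English question to a metadata table previously extracted from the CDISC Library API:

```python
# An illustrative sketch only: a regex-based intent parser stands in for the
# AI/NLP layer, answering questions from a pre-extracted metadata table.
import re
import pandas as pd

# Pretend this frame was built by the extraction step described above.
METADATA = pd.DataFrame({
    "variable": ["AESEV", "AETERM"],
    "label": ["Severity/Intensity", "Reported Term for the Adverse Event"],
    "codelist": ["AESEV", None],
})

def answer(question: str) -> str:
    """Find a variable name in the question and return its metadata."""
    match = re.search(r"\b([A-Z]{2}[A-Z0-9]{1,6})\b", question)
    if not match:
        return "Sorry, I could not find a variable name in your question."
    row = METADATA.loc[METADATA["variable"] == match.group(1)]
    if row.empty:
        return f"No metadata found for {match.group(1)}."
    r = row.iloc[0]
    return f"{r['variable']}: label='{r['label']}', codelist={r['codelist']}"

print(answer("What is the label and codelist for AESEV?"))
```

In the real application, an LLM replaces the regex step, but the pattern of parsing intent and dispatching to extracted metadata is the same.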
This poster addresses the increasingly important requirement for anonymized clinical trial data for academic research, regulatory evaluation, and the training of AI models. The FDA, EMA, and Health Canada, among others, require traceability of data, creating a conflict between protecting patient privacy and regulatory requirements. We propose an approach utilizing CDISC SDTM, ADaM, and value harmonization techniques. The regulatory framework created through laws like GDPR and HIPAA restricts the reuse of identifiable data but permits reuse once the data is properly anonymized. Our suggested approach categorizes variables according to EMA Policy 0070 as direct identifiers, quasi-identifiers, and non-identifying variables. Generalization, suppression, and perturbation are approaches used to handle these variables. Quasi-identifiers are assessed based on how easily they can be replicated, distinguished, and known. Overall, the poster contributes practical guidance on anonymizing data to satisfy privacy and regulatory needs, enabling responsible worldwide data sharing.
Digital Health Technologies (DHTs), such as wearable sensors, offer real-time, quantitative insights into patient activity, enhancing the evaluation of therapeutic efficacy in clinical trials. However, their integration presents operational challenges, including device misassignment, incomplete data uploads, and unfamiliarity among sites and participants. These issues can lead to data loss and reduced data quality. To address this, it is key to implement automated quality control checks and leverage cloud-based APIs for secure data access. CDISC-compliant data entry (e.g., DXSTDTC, DXENDTC, DXTRT, SPDEVID) anchors DHT data to participant timelines, enabling cross-validation and query generation. Standardized data models allow for reusable programmatic checks, improving efficiency and consistency across studies. This approach ensures robust data monitoring, preserves statistical power, and supports the generation of novel endpoints. By addressing DHT-specific complexities early in study design, sponsors can enhance data integrity and optimize the value of digital endpoints in modern clinical research.
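A minimal sketch of one such reusable programmatic check, assuming pandas data frames and the DM reference-period variables RFSTDTC/RFENDTC (the DX variable names come from the abstract; everything else is illustrative):

```python
# A minimal sketch, not the authors' implementation, of a reusable QC check
# over SDTM-style device exposure records: flag DHT data collected outside a
# participant's reference period, a common sign of device misassignment.
import pandas as pd

def flag_out_of_window(dx: pd.DataFrame, dm: pd.DataFrame) -> pd.DataFrame:
    """Return device-exposure records falling outside each participant's
    reference period (RFSTDTC..RFENDTC from DM)."""
    merged = dx.merge(dm[["USUBJID", "RFSTDTC", "RFENDTC"]], on="USUBJID")
    for col in ("DXSTDTC", "DXENDTC", "RFSTDTC", "RFENDTC"):
        merged[col] = pd.to_datetime(merged[col], errors="coerce")
    bad = (merged["DXSTDTC"] < merged["RFSTDTC"]) | (merged["DXENDTC"] > merged["RFENDTC"])
    return merged.loc[bad, ["USUBJID", "SPDEVID", "DXSTDTC", "DXENDTC"]]

# Toy data: subject 002's upload extends past their reference period.
dx = pd.DataFrame({"USUBJID": ["001", "002"], "SPDEVID": ["WATCH-1", "WATCH-2"],
                   "DXSTDTC": ["2025-01-05", "2025-01-04"],
                   "DXENDTC": ["2025-01-20", "2025-03-01"]})
dm = pd.DataFrame({"USUBJID": ["001", "002"],
                   "RFSTDTC": ["2025-01-01", "2025-01-01"],
                   "RFENDTC": ["2025-02-01", "2025-02-01"]})
print(flag_out_of_window(dx, dm))
```

Because the check keys only on standard variable names, the same function can run unchanged across studies, which is the reuse benefit the abstract highlights.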
Behind every high-quality, CDISC-compliant SDTM dataset lies a team of data-driven heroes: Data Managers, Standards Managers, Clinical Programmers, and Biostatisticians, each bringing their own unique skillsets and expertise to the fight for clean, conformant data.
Our heroes may hail from different domains, but they share a common trait: a working knowledge of CDISC standards and familiarity with the tools of the trade. This foundational knowledge becomes their secret weapon for enabling smarter, more collaborative workflows. Understanding the strengths of each role unlocks opportunities to rethink traditional processes, distribute tasks more effectively, and innovate with workflows that leverage each team member’s unique superpowers.
This poster will present our SDTM Squad, outlining the signature skills each role possesses, supercharged by emerging contributions of AI. By highlighting the strengths of this dynamic team, we’ll demonstrate that with the right strategy and collaboration, regulatory compliance can be both faster and more heroic.
As a Canadian academic research institute, the Population Health Research Institute (PHRI) has historically utilized multiple clinical data management systems (CDMS) across studies and clinical trials. The challenge is to create harmony among these CDMS in study design, Case Report Form (CRF) creation, variable names, data types, formats, and codes in order to optimize studies and clinical trials. Each CDMS has its own preferences in study design and programming. The PHRI CDASH Working Group was established to manage the standardization process and to create a custom PHRI CDASH library.
The poster will explain the process of creating the PHRI CDASH-P library and provide examples of the challenges met and the solutions found in applying CDASH to clinical trials across different studies and multiple CDMS in an academic organization. Currently the standardization applies to the CDMS DFnet DFdiscover and Anju TrialMaster. Our goal is to expand the library to REDCap and other systems.
Processing Trial Design domains in clinical research is a complex and time-consuming task, primarily because much of the required information is embedded within unstructured documents such as clinical trial protocols, rather than being derived directly from subject-level data. This poster explores how Artificial Intelligence (AI) and Machine Learning (ML) techniques can be leveraged to automate and streamline the generation of Trial Design datasets. By extracting relevant data from protocols, Case Report Forms (CRFs), and other Study Data Tabulation Model (SDTM) domains, AI/ML models can significantly reduce manual effort and improve accuracy. We propose an ensemble approach that combines multiple models to classify and interpret various sections of the protocol, enabling precise extraction of key elements necessary for constructing Trial Design domains. Furthermore, we highlight the integration of the Unified Study Definition Model (USDM) as a foundational framework to standardize and facilitate the creation of these domains.
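To make the ensemble idea tangible, here is a toy scikit-learn sketch that votes two simple text classifiers over TF-IDF features to label protocol snippets with a target Trial Design domain. The snippets, labels, and model choices are invented for illustration and do not represent the authors' models.

```python
from sklearn.ensemble import VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy protocol snippets labeled with the Trial Design domain they feed
# (texts and labels invented for illustration).
texts = [
    "Subjects will be randomized 1:1 to drug or placebo across two arms.",
    "The treatment period consists of a 2-week run-in followed by 12 weeks of dosing.",
    "Eligible participants are adults aged 18 to 65 with confirmed diagnosis.",
    "Each arm receives study drug during epoch 2 after screening.",
]
labels = ["TA", "TE", "IE", "TA"]

# Ensemble of two simple text classifiers over TF-IDF features.
ensemble = make_pipeline(
    TfidfVectorizer(),
    VotingClassifier(
        [("lr", LogisticRegression(max_iter=1000)), ("nb", MultinomialNB())],
        voting="soft",
    ),
)
ensemble.fit(texts, labels)
print(ensemble.predict(["Participants are assigned to one of two treatment arms."]))
```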
Autumn is the perfect time to focus on the changing landscape of TMF culture.
Let go of outdated habits and embrace a new season of consistency and shared purpose for all.
Key Themes:
Harvesting Data
• Conduct thorough TMF reviews to gather valuable insights.
• Use this data to improve processes and ensure compliance.
Turning Over a New Leaf
• Refresh and update your TMF Plan and workflows.
• Leave behind inefficient habits and adopt best practices.
Storing for Winter
• Focus on inspection readiness to prepare for future audits.
• Ensure your TMF is complete, organized, and ready for any challenge.
“This Autumn, let’s embrace the season of change and prepare for a future of success!”
Effective management of the Trial Master File (TMF) is essential for ensuring regulatory compliance and maintaining inspection readiness. However, clinical research operational teams frequently encounter challenges related to clarity, confidence, and consistent engagement in fulfilling their TMF-related responsibilities.
This poster presents how the design and implementation of a TMF Reference Model tool, built in Smartsheet, has empowered multidisciplinary operational teams to gain a clearer understanding of TMF zones and artifacts and to clarify responsibilities and expectations.
For a clinical pharmacology organization operating within short, high-intensity timelines, the development of a streamlined, practical resource to support TMF comprehension and process adherence was crucial. This targeted approach has driven stronger engagement, improved role clarity, and heightened accountability, ultimately enhancing operational efficiency in achieving inspection readiness, and ensuring the delivery of complete and accurate final TMFs.
Gilead Clinical Data Science (CDS), in partnership with Trialwise, launched a TMF Inspection Readiness Program to enhance proactive document evaluation and operational efficiency using data-driven solutions. Guided by Total Quality Management, the initiative enhances operational visibility through structured workflows, targeted training, tailored metrics, and real-time feedback from a dedicated Trialwise TMF Documentation Specialist. Quarterly insights highlight readiness trends across studies, fostering continuous enhancement. Since launch, 21 studies and over 1,100 documents have been onboarded, resulting in measurable gains in filing efficiency, early challenge identification, minimized rework, and accelerated timelines. This scalable, data-driven approach reflects Gilead’s continued commitment to operational excellence and proactive TMF management through trusted partnership.
This abstract reviews the process the HEALEY ALS Platform Trial TMF team follows to ensure audit readiness of the eTMF by conducting an internal quality control (QC) review of the eTMF every six months using a risk-based approach, and describes the methods the TMF team uses to conduct this regular QC along with the lessons learned. In each biannual review, records are sampled by risk level, with 50% of high-risk records and 25% of moderate- and low-risk records reviewed (a sketch of this sampling scheme follows below). The basic elements of the records are reviewed, and results are recorded and shared with the record owners and teams. Performing this QC review biannually will continue to support the overall health of the TMF for the HEALEY ALS Platform Trial.
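A minimal sketch of that risk-based sampling scheme, assuming each record carries a pre-assigned risk level (the data and implementation are illustrative, not the team's actual tooling):

```python
import pandas as pd

# Hypothetical eTMF record index with a pre-assigned risk level per record.
records = pd.DataFrame({
    "record_id": range(1, 101),
    "risk": ["high"] * 20 + ["moderate"] * 40 + ["low"] * 40,
})

# Sampling fractions from the abstract: 50% of high-risk records,
# 25% of moderate- and low-risk records.
fractions = {"high": 0.50, "moderate": 0.25, "low": 0.25}

sample = (
    records.groupby("risk", group_keys=False)
    .apply(lambda g: g.sample(frac=fractions[g.name], random_state=1))
)
print(sample["risk"].value_counts())
```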
{sdtm.oak} is an EDC- (electronic data capture) and data-standard-agnostic solution that enables the pharmaceutical programming community to develop CDISC SDTM datasets in R. The reusable-algorithms concept in {sdtm.oak} provides a framework for modular programming and can potentially automate SDTM creation from a standard SDTM specification.
Symphony Ballroom I & II (Lobby Level)
The digitalization of healthcare and advances in science and technology are driving rapid evolution. Biopharma companies, regulatory agencies, academic institutes, technology vendors, and standards development organizations must adapt to this new environment. A significant challenge for the data standards community is therefore keeping pace with the sheer breadth and depth of disease areas for which therapies are being developed across the biopharmaceutical industry, and with the myriad data types (including biomarker, genomic, imaging, real-world data, and data from various digital health technologies) being collected and used for analysis, reporting, and decision-making.
Leaders in the data standards community will share examples of current data standards governance and evolving data standards roles, the challenges being faced, where the field is going, and how data standards roles are evolving to thrive in the Digital Age amid rapidly advancing science and healthcare needs.
Speakers and Panelists:
- Jonathan Chainey, Roche
- Miho Hashio, GSK
- Brooke Hinkson, Merck, CDISC Board Chair
- Shannen McGinnis, Amgen
- Rhona O'Donnell, Novo Nordisk, CDISC Board Member
- Tushar Sakpal, Novartis
- William Standen, Eisai
- Nicole Thorne, BMS
- Aatiya Zaidi, Gilead
Symphony Ballroom III (Lobby Level)
This presentation outlines the strategic roadmap for the evolution of the TMF Reference Model. Led by Paul Carter, CEO of Montrium and Chair of the TMF Reference Model Steering Committee, the session explores key developments in core standardization, including record types, artifact consolidation, and controlled terminology updates. It highlights interoperability efforts through enhanced metadata standards and alignment with USDM and audit trail frameworks. The roadmap also addresses ancillary mappings to ICH M11, ISF, RWE, and Medical Device, alongside the development of tools, APIs, and implementation guides to support standard management. Training programs, risk frameworks, and regulator engagement strategies are discussed to ensure widespread adoption and compliance. Attendees will gain insight into the future direction of TMF standards and their critical role in enabling efficient, transparent, and compliant clinical trial documentation.
- Rob Jones, Cencora
- Guillaume Gerard, Agatha
- Eleanor Hewes, Syneos
Symphony Ballroom I & II (Lobby Level)
Moderator: Nicole Harmon
Panelists:
- Chris Decker, President & CEO
- Julie Smiley, VP, Data Science
- Peter Van Reusel, Chief Standards Officer
Symphony Ballroom III (Lobby Level)
The Trial Master File (TMF) remains a cornerstone of clinical trial operations, yet it continues to rely heavily on document-centric, manual, and disconnected processes. This presentation explores the transition toward a fully digital, metadata-driven ecosystem. It highlights how TMF Version 4 and related CDISC initiatives aim to modernize interoperability across the clinical ecosystem, and how key industry drivers are accelerating this transformation. Through AI and real-time data exchange, the TMF is evolving from an archival repository into a proactive intelligence layer that enhances compliance, inspection readiness, and operational efficiency.
Attendees will gain insights into:
- Current challenges, strategic roadmaps, and practical steps toward a data-first TMF paradigm,
- The strategic and operational implications of digital transformation in TMF management,
- Lessons learned from other similar standardization efforts.
Join us to demystify the future of the TMF and to see how a digital TMF can drive efficiency and quality in clinical trial management.
Moderator: Aaron Grant, Just in Time GCP
Panelists:
- Paul Carter, Montrium
- Gui Gerard, Agatha
- Jim Horstman, Veeva
- Rob Jones, Cencora
- Ricky Lakhani, Pharmaseal
- Chris Sigley, Arkivum
- Jay Smith, Transperfect
- Leah Weitz, Egnyte
Green Room Foyer (Lobby Level)
Southern Ground Foyer (Mezzanine Level)
Southern Ground A (Mezzanine Level)
This training will focus on how the TMF Reference Model can be utilized to improve TMF Management. Starting with the basics of the TMF Reference Model itself, the training will walk through the importance of people, process, and technology, setting up a TMF, performing QC, developing oversight approaches, and finally, surviving the dreaded inspections!
Tracking Room (Mezzanine Level)
Creating high-quality data packages for analysis, reporting, and submission is crucial in drug development. To guide the industry with data quality checks, CDISC has developed the CDISC Conformance Rules as part of the Foundational Standards. Regulatory authorities have also defined their own business rules. The biggest current challenge is that these rules are descriptive and not executable.
To automate conformance checks, CDISC has launched the CORE project (CDISC Open Rules Engine). This project aims to provide clear, executable Conformance Rules, along with an open-source execution engine, available from the CDISC Library. This training offers a comprehensive overview of the CORE project, including its open-source components, hands-on practice, and guidance for adoption within your company. It’s time to prepare your processes and data packages for the future.
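Actual CORE rules are authored as structured definitions and executed by the open-source engine, so the Python snippet below is a conceptual illustration only: it hard-codes one classic conformance check (an adverse event must not start after it ends) to show what turning a descriptive rule into an executable one means in practice.

```python
import pandas as pd

def check_ae_dates(ae: pd.DataFrame) -> pd.DataFrame:
    """Illustrative executable check: AESTDTC must not be after AEENDTC.
    (Conceptual only -- real CORE rules are structured definitions run
    by the open-source engine, not ad hoc functions like this.)"""
    start = pd.to_datetime(ae["AESTDTC"], errors="coerce")
    end = pd.to_datetime(ae["AEENDTC"], errors="coerce")
    return ae[start > end]  # rows violating the rule

ae = pd.DataFrame({
    "USUBJID": ["S1", "S2"],
    "AESTDTC": ["2025-02-10", "2025-01-03"],
    "AEENDTC": ["2025-02-01", "2025-01-09"],  # S1 violates the check
})
print(check_ae_dates(ae))
```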
Southern Ground Foyer (Mezzanine Level)
Southern Ground Foyer (Mezzanine Level)
Southern Ground B (Mezzanine Level)
Join us for our first CDISC Biomedical Concepts training session covering both Biomedical Concepts (BCs) and Dataset Specializations. In this half-day hands-on training, delivered by leading experts, you'll learn how BCs and Dataset Specializations are modeled and curated. The training will include step-by-step instructions and demonstrations with hands-on exercises and guided activities to help you gain proficiency in creating BCs and associated Dataset Specializations, and understanding their role in the broader 360i ecosystem. Certificates of Achievement and digital badges will be available for attendees who successfully complete this hands-on training.
Tracking Room (Mezzanine Level)
Join us for our first Dataset-JSON hands-on implementation training session covering the new Dataset-JSON v1.1 standard. In this half-day hands-on training delivered by leading experts, you'll get a head start on learning everything you need to know about the new, published version of Dataset-JSON. You'll build new skills with hands-on exercises and demonstrations to help you to master this new standard. Certificates of Achievement and digital badges will be available for attendees who successfully complete this hands-on implementation training.
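For orientation, here is a simplified Python sketch that writes a small Dataset-JSON-style file. The attribute names reflect the published v1.1 structure as best we can summarize it, but the Dataset-JSON v1.1 specification itself is authoritative and should be checked before use.

```python
import json
from datetime import datetime, timezone

# Minimal, illustrative Dataset-JSON v1.1-style payload for a two-record
# DM dataset; verify attribute names against the published specification.
dataset = {
    "datasetJSONCreationDateTime": datetime.now(timezone.utc).isoformat(),
    "datasetJSONVersion": "1.1.0",
    "itemGroupOID": "IG.DM",
    "name": "DM",
    "label": "Demographics",
    "records": 2,
    "columns": [
        {"itemOID": "IT.DM.USUBJID", "name": "USUBJID",
         "label": "Unique Subject Identifier", "dataType": "string"},
        {"itemOID": "IT.DM.AGE", "name": "AGE",
         "label": "Age", "dataType": "integer"},
    ],
    "rows": [["S1", 34], ["S2", 61]],
}

with open("dm.json", "w") as f:
    json.dump(dataset, f, indent=2)
```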
Southern Ground Foyer (Mezzanine Level)
Green Room Foyer (Lobby Level)
Lyric Foyer (Lobby Level)
Lyric (Lobby Level)
Enhance your expertise with our SDTM Advanced training, a comprehensive program now featuring an additional half-day of instructor-led, hands-on implementation. This immersive course provides an in-depth exploration of SDTM, covering both complex theoretical approaches and practical applications. Through interactive group activities and practical exercises, you’ll deepen your understanding of CRF annotation, creating and validating SDTM-conformant spreadsheets, and using CORE.
On Day #2, the hands-on component will focus on applying SDTM standards in a project-based setting, enabling you to practice and refine key standardization techniques in real-world scenarios. Join us to build the skills you need to confidently implement SDTM standards in your work. Certificates of Achievement and digital badges will be available for attendees who successfully complete this hands-on implementation training.
Melody (Lobby Level)
Analysis results play a crucial role in the drug development process, providing essential information for regulatory submission and decision-making. However, the current state of analysis results reporting is suboptimal, with limited standardization, lack of automation, and poor traceability. Currently, analysis results (tables, figures, and listings) are often presented in static, PDF-based reports that are difficult to navigate and vary between sponsors. Moreover, these reports are expensive to generate and offer limited reusability.
To address these issues, the CDISC Analysis Results Standard (ARS) team has developed a logical model to support consistency, traceability, and reuse of results data. This hands-on implementation training will provide an in-depth overview of the ARS model and practical examples illustrating the implementation of the model using common safety displays.
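To give a feel for what a logical model of a result means, the toy sketch below stores one analysis result as structured, traceable data rather than a static table cell. The class and field names are invented for illustration and do not reproduce the actual ARS schema.

```python
from dataclasses import dataclass

# Toy illustration (field names invented, not the ARS schema): one analysis
# result captured as structured data, with traceability back to its inputs.
@dataclass
class AnalysisResult:
    analysis_id: str   # links back to the planned analysis
    dataset: str       # input ADaM dataset
    variable: str      # analysis variable
    group: str         # treatment group / subgroup
    operation: str     # statistical operation performed
    value: float       # the number rendered in the output table

result = AnalysisResult(
    analysis_id="AN01.SAF.AE_SUMMARY",
    dataset="ADAE",
    variable="TRTEMFL",
    group="Drug A",
    operation="count_of_subjects",
    value=42,
)
print(result)
```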
Lyric Foyer (Lobby Level)
Lyric Foyer (Lobby Level)
Lyric Foyer (Lobby Level)
Lyric Foyer (Lobby Level)
Lyric (Lobby Level)
Enhance your expertise with our SDTM Advanced training, a comprehensive program now featuring an additional half-day of instructor-led, hands-on implementation. This immersive course provides an in-depth exploration of SDTM, covering both complex theoretical approaches and practical applications. Through interactive group activities and practical exercises, you’ll deepen your understanding of CRF annotation, creating and validating SDTM-conformant spreadsheets, and using CORE.
On Day #2, the hands-on component will focus on applying SDTM standards in a project-based setting, enabling you to practice and refine key standardization techniques in real-world scenarios. Join us to build the skills you need to confidently implement SDTM standards in your work. Certificates of Achievement and digital badges will be available for attendees who successfully complete this hands-on implementation training.