Presentation by ShaAvhree Buckman, M.D., Ph.D., Director, Office of Translational Sciences, Center for Drug Evaluation and Research, U.S. Food and Drug Administration
If there is one thing that clearly caught my attention during Dr. Buckman’s keynote speech, it is that the FDA needs a standards-based environment, from end to end, to support fully electronic receipt, review, and dissemination.
Dr. Buckman started her presentation by talking about the current challenges of 21st-century drug development.
Patients want the most up-to-date information on products, and they want it quickly; and this is only one part of an extremely complicated healthcare system. “The amount of time spent in the preclinical and clinical trials as well as the amount of time it takes for one FDA drug approval, involves tremendous effort, time and money!” Dr. Buckman stated.
She portrayed the current situation with the example of a single New Molecular Entity, which entails over 9 GB of data and over 8.8 GB of documentation. She went on to mention that a single printed copy of such a submission would require roughly 1.7 million pages, while five copies printed at the FDA would require 8.5 million pages! The FDA spends a great deal of money on paper, and solving this problem requires an electronic submission interface, which would also allow for much greater efficiency. With new ways to receive eSubmissions, the FDA can then improve regulatory decision-making through advanced analytics.
According to Dr. Buckman, “Even though the number of electronic submissions is increasing, we are still not in a totally electronic environment. And the key to harmonization and success is being in an electronic environment.” She again stressed: “We cannot improve efficiency or innovation without having standards!”
A survey conducted by CDER to determine how much time is spent on data management showed that it consumes 40% of reviewers’ resources. Dr. Buckman noted that the FDA needs to spend less time on data management (just getting the data into the right shape) and more time on primary and exploratory analysis.
Dr. Buckman continued by describing the challenges their reviewers encounter. They follow Good Review Practices and a 21st-century process. However, one of the biggest challenges of the current state of data submission is that the FDA holds massive amounts of clinical research data in extremely disparate formats; in some cases, a single patient’s data may be distributed across several different datasets. She described it as “having Legos and Tinker Toys and blocks and trying to take the time to put these together… you cannot do it!”
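The integration problem Dr. Buckman describes can be sketched in a few lines: when one patient’s records are scattered across separate datasets, a reviewer (or a tool) must first collect them via a shared subject identifier. The USUBJID key below follows CDISC convention, but the datasets and records are invented purely for illustration.

```python
# Illustrative only: one subject's data split across disparate datasets,
# reassembled by matching on the shared subject identifier (USUBJID).
demographics = [
    {"USUBJID": "STUDY1-001", "AGE": 54, "SEX": "F"},
    {"USUBJID": "STUDY1-002", "AGE": 61, "SEX": "M"},
]
adverse_events = [
    {"USUBJID": "STUDY1-001", "AETERM": "HEADACHE"},
    {"USUBJID": "STUDY1-001", "AETERM": "NAUSEA"},
]

def patient_profile(usubjid, *datasets):
    """Collect every record for one subject across all datasets."""
    return [rec for ds in datasets for rec in ds if rec["USUBJID"] == usubjid]

profile = patient_profile("STUDY1-001", demographics, adverse_events)
```

With a common identifier the join is trivial; without one (the “Legos and Tinker Toys” situation), the reviewer must reconcile each sponsor’s naming by hand before any analysis can start.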
How do we change this situation? It requires Better Data: required electronic submissions, standardization of application data, communication, and training. It also requires Better Tools: analytic tool development, validation, and implementation; streamlined business processes; infrastructure; and training. With these in place come Better Decisions.
The FDA needs effective interaction with sponsors, so that sponsors understand what the Agency needs and what they should submit. “We need them to submit standardized data,” said Dr. Buckman.
She further explained that a repository to hold the data, such as Janus (the Clinical Trial Repository), together with analysis tools that work with that standardized data, helps the FDA facilitate the process. CDER is trying to address this through the Computational Science Center (CSC), whose vision is to aid CDER decisions by supporting high-quality quantitative analysis of efficacy, safety, and product quality over the product life-cycle.
“Building a CSC in this environment that we are trying to develop is like building a house, and part of building the house is that we need the first level of infrastructure, and that represents the data that we need,” Dr. Buckman continued. One of the major obstacles is that, even with electronic data, and despite CDISC standards having been accepted for submissions since 2004, some sponsors still use “quasi-standards.” Non-standard data is a major obstacle to conducting a timely, efficient, quality review: it makes datasets difficult to integrate, removes traceability of the sponsor’s statistical analysis, and affects the timeliness of reviews.
Dr. Buckman referenced the CDER Data Standards Plan of May 2010, noting that a significant part of the solution is the adoption of the CDISC data standards from the beginning, as this will help create a Modern Technologically Advanced Review Environment. Data standards are of the utmost importance to the FDA and to the entire healthcare industry.
Another challenge the FDA faces is revealed in the complaints it receives from sponsors about poor communication. CDER is trying to address these issues by developing clearer communications. One such effort is the website listing the CDER Data Standards Plan (for example, the study data section referencing the CDISC Study Data Tabulation Model, SDTM) and the recent CDER Data Standards Common Issues document. CDER is also training reviewers on CDISC standards (through an agreement with CDISC and a new course directed toward reviewers).
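To make “standardized data” concrete: in SDTM, every domain dataset carries a small set of required identifier variables (STUDYID, DOMAIN, USUBJID). A sponsor’s pre-submission check might verify their presence before the data ever reach a reviewer. The toy sketch below is illustrative only, not an FDA or CDISC validation tool, and the sample records are invented.

```python
# Toy conformance check (not an FDA/CDISC tool): verify that a tabulation
# dataset carries the identifier variables SDTM requires in every domain.
REQUIRED_VARS = {"STUDYID", "DOMAIN", "USUBJID"}

def missing_required_vars(records):
    """Return, sorted, the required SDTM identifiers absent from any record."""
    missing = set()
    for rec in records:
        missing |= REQUIRED_VARS - rec.keys()
    return sorted(missing)

dm_ok  = [{"STUDYID": "S1", "DOMAIN": "DM", "USUBJID": "S1-001", "SEX": "F"}]
dm_bad = [{"STUDYID": "S1", "SUBJID": "001"}]  # legacy, non-standard naming

# missing_required_vars(dm_ok)  -> []
# missing_required_vars(dm_bad) -> ['DOMAIN', 'USUBJID']
```

Real conformance tooling checks far more (controlled terminology, variable types, cross-domain consistency), but even this minimal check illustrates why “quasi-standard” data defeats automated review.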
An FDA panel discussion, chaired by Dr. Edward Helton and with open Q&A from the audience, followed Dr. Buckman’s presentation. The panel included representatives from FDA CBER and FDA CDER (CSC, CTS, OBI, Office of Planning and Informatics, OSI). Here is a brief summary, with most of the questions included along with the FDA’s responses.
Q: Looking out several years, would the Clinical Trial Repository be a sharable database?
A: The FDA doesn’t ‘own’ the data. We are always excited when sponsors want to share data; we have PPPs and consortia where sponsors are willing to share. Once the warehouse is built and the data are in a relational database, sharing would not be difficult. An FR (Federal Register) notice is coming out soon focused on finding ways to facilitate data sharing.
Comment from the Session Chair: Janus will provide institutional memory to drive decision support in the future. Queries to the database could give FDA answers to share generically with sponsors, appropriately.
Q: There are some parallel projects through the science enclave, e.g., with PPPs and with the CDC and others, to look at legacy data conversion outcomes.
A: Stay tuned for the FR notice; FDA’s ‘hands are tied’: getting something out of the warehouse requires going to each sponsor to ask for permission.
Q: About Janus: this is either the 3rd or 4th generation. It is a difficult and admirable project; are we starting over for the fourth time? What will make it happen, and what lessons have been learned?
A: The basic vision of a CTR has not changed; what has changed for me is a better appreciation for the complexity of clinical trial data and the complexity of modeling that data for warehousing and analyses. If you look at the first model, the richness has increased since there is a better understanding of the standards that are needed. The new model will be industrial strength. It is “BRIDG informed”; BRIDG has matured tremendously over the past years and really enables technologies now.
Session Chair: We have learned a lot and we are at a maturity model where we have never been before. What we recognize as well is that it will not be perfect and there will be gaps to be filled. But, this should now fill the needs of the Agency.
FDA Comment: The new Janus model will be made publicly available by 30 Sept 2013, if not sooner.
Session Chair: I anticipate that there will be significant discussions with CDISC in the coming months where we are going to start integrating and, as it evolves, we need to talk with you all. Some events would need to take place.
FDA: The scope originally was all data would go into Janus. That is too large to measure success. In this iteration, we have narrowed the scope to clinical data only and the model is expandable so that we can add other data into it in the future.
Session Chair: Last year it was “tell us what you want us to do and we will do it”; then we said “here is what we want to do”; now we are doing it. It is SDTM-based now, then ADaM and SEND; in the future, perhaps HL7 models will be accepted as well.
Q: The SDTM 3.1.2 Amendment sends mixed messages about how we should submit data, and also about the removal of white space – should we follow these in future submissions?
FDA: As many of you know, the size of datasets has increased dramatically (over 100 MB), to the point where reviewers cannot open them and they exceed the size limit of the gateway. Because character fields are stored at a fixed width, a one-character value such as “M” or “F” occupies the same space as a large text stream, so there is a lot of wasted space in the datasets. We would like companies to start shrinking that space down so that they do not need to split the datasets, and we would like to see companies move in that direction.
FDA: Some of the items in the amendment are things that the reviewers have been asking for over and over and over again. These recommendations come from reviewers and they won’t complain if you follow the amendment. If you are uncertain, you should contact the reviewer.
FDA: For CBER, the file size limit is 1 GB. The death date in the demographics domain is also very important.
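The “wasted space” the FDA describes comes from fixed-width storage in SAS V5 transport files: a character variable occupies its full declared length in every record, whether the value is “M” or a long text stream. The back-of-the-envelope sketch below shows the arithmetic; the declared length, record count, and column are hypothetical.

```python
# Back-of-the-envelope padding estimate for a fixed-width character column
# in a SAS V5 transport (XPT) file. All numbers are hypothetical.
def wasted_bytes(n_records, declared_len, observed_values):
    """Bytes lost to padding if the column were trimmed to the longest
    actual value instead of its declared length."""
    needed = max(len(v) for v in observed_values)
    return n_records * (declared_len - needed)

# e.g. a one-character code column declared at 200 characters:
waste = wasted_bytes(1_000_000, 200, ["M", "F"])  # 199 bytes per record
```

At a million records, that single over-declared column wastes roughly 199 MB, which is why trimming declared lengths can keep a dataset under the gateway limit without splitting it.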
Q: Is there a grace period for implementation?
FDA: We have not discussed that. We are encouraging sponsors to follow this and to submit standard data.
FDA: It is worth noting that, as we adopt standards and they evolve, we recognize that sponsors cannot adopt them instantaneously so there will always be a grace period. We will take into consideration that you have cycles such as budget cycles.
FDA: There is a CDER/CBER/CDRH document that is on the website and this will go into the Guidance.
FDA: We fully intend to collaborate on the development of the scripts, with CDISC data elements being the target data content. There is a lot of activity and energy around the CSC conference – come and join the effort.
FDA: Sponsors need to decide whether they are going to share their scripts openly. FDA will share those that were generated by reviewers.
FDA: We are very grateful that CDISC standards are available.
Q: Is it indeed time to take the next step and replace SAS V5 transport files with an XML format based on define.xml?
FDA: Please submit your questions to the e-mail provided (eData@fda.hhs.gov).
FDA: We don’t monitor what each reviewer says to each sponsor – although that might be helpful – so the only way we know where we have challenges is if you tell us. Avail yourselves of the e-mail address we gave you; our folks on the eData team need this information.
FDA: Reviewers are on tight schedules with lots of oversight, but asking a serious question will not cause a problem for your application. That is how we learn about these things – through legitimate questions: eData@fda.hhs.gov
By Diana Harakeh, CDISC, Manager of Communication and Marketing