For the market approval of a new biopharmaceutical drug, manufacturers are obliged to ensure its safety and efficacy. But the drug itself is not the only thing under scrutiny: the production process, too, must be proven stable, robust and compliant with regulatory demands. Identifying critical process parameters and acceptable process operating ranges is becoming ever more complex and is, in its entirety, subject to the requirements of Good Manufacturing Practice and regulatory approval.
But how can we accurately define all the critical parameters before we even run the production process? Digital twins based on mechanistic models offer a promising approach to solve this quandary.
In the past few years, the concept of digital twins has entered the stage in the biopharmaceutical industry. Building on decades of successful application in other industries such as aviation and automotive, digital twins are now becoming established in biopharma as well, ringing in the era of bioprocessing 4.0.
Mechanistic models: The core of digital bioprocess twins
Digital bioprocess twins allow for accurate predictions, for example of commercial processes or different production scales. Their range of application is broad, extending from early-stage process development through late-stage process characterization to manufacturing control. The key asset of a digital twin is the model behind it: the better the model, e.g. a mechanistic model, the more applicable and valuable the twin.
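To make the notion of a mechanistic model concrete, here is a minimal sketch of a lumped kinetic chromatography model with a Langmuir adsorption isotherm, solved with a simple upwind finite-difference scheme. This is an illustrative toy, not the model of any real process or software, and all parameter values are hypothetical placeholders:

```python
import numpy as np

def simulate_breakthrough(n_cells=50, t_end=200.0, dt=0.01,
                          u=0.1, L=1.0, eps=0.5,
                          q_max=10.0, k_eq=2.0, k_kin=5.0,
                          c_feed=1.0):
    """Toy lumped kinetic model of a chromatography column:
    convective transport along the column plus Langmuir adsorption
    kinetics. All parameters are illustrative placeholders."""
    dz = L / n_cells
    c = np.zeros(n_cells)        # mobile-phase concentration per cell
    q = np.zeros(n_cells)        # stationary-phase concentration per cell
    phase_ratio = (1 - eps) / eps
    outlet = []
    times = np.arange(0.0, t_end, dt)
    for _ in times:
        # Langmuir kinetics: relaxation toward the equilibrium loading
        q_eq = q_max * k_eq * c / (1.0 + k_eq * c)
        dq = k_kin * (q_eq - q)
        # upwind convection plus exchange with the stationary phase
        c_in = np.concatenate(([c_feed], c[:-1]))
        dc = u / dz * (c_in - c) - phase_ratio * dq
        c = c + dt * dc
        q = q + dt * dq
        outlet.append(c[-1])
    return times, np.array(outlet)

times, outlet = simulate_breakthrough()
# the outlet concentration rises from ~0 toward the feed
# concentration as the column saturates (a breakthrough curve)
```

Once such a model is calibrated against a handful of experiments, it can in principle predict column behavior under conditions that were never run in the lab, which is what makes the digital twin predictive rather than merely descriptive.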
As the pioneers of in silico – computer-guided – process development, we strongly believe in the strengths and advantages of mechanistic models and therefore initiated the Chromatography Modeling Days (CMD). This conference is dedicated to the chromatography modeling community and focuses on all facets of mechanistic chromatography modeling. During the three days of the event, our speakers present case studies, industrial use cases and interesting insights into their workflows. The CMD offer a great opportunity for professional and informal exchange, knowledge transfer and getting in touch with other passionate modelers.
We are happy to announce Dr. Joey Studts (Boehringer Ingelheim) as our keynote speaker at CMD. Joey has extraordinary experience in the field of downstream process development. He and his team have been implementing mechanistic modeling in several projects over the past years, especially in late stage development. We asked him to answer a few questions to shorten the waiting time until the next CMD…
Modeling [...] was starting to make some strong progress in the community as a whole and we see this technology as a game changer for the industry.
Dr. Joey Studts, Director Purification and Formulation Development at Boehringer Ingelheim
GoSilico: Joey, you have been working in the field of downstream process development for almost 20 years. From your observations, which technologies will have the largest impact on process development in the near and long term?
Dr. Joey Studts: At Boehringer Ingelheim (BI), we spend a lot of time and resources to stay at the forefront of biopharma process development, both as a contract manufacturer and for our internal pipeline. For me as a downstreamer, a lot of that focus has been on optimization of unit operations, analytical methods, automation and PAT. But all these approaches have only brought us incrementally forward.
A few years ago, there was a push to make bigger steps; we wanted to put more focus on game changers. At this point, we started looking at column-free processing, continuous processing and modeling. My team took a large interest in modeling because of two key aspects: First, BI had a lot of data from our internal pipeline and many years of experience as a contract manufacturer. Second, modeling, both using statistical tools and mechanistic models, was starting to make some strong progress in the community as a whole and we see this technology as a game changer for the industry.
Where and how did you get started with the application of mechanistic models? What was your path at BI?
Working together with our service providers and academic partners as a team, we set up some questions such as: How well does mechanistic modeling work? What are the minimal experimental requirements to set up a usable model? How far can we stretch the model, and what are its limitations? And finally, how do we prove to ourselves that the model works and gain regulatory acceptance for the knowledge generated from the models?
In the very beginning, we focused on process development but then realized that we needed proof that the models are predictive. We also evaluated our workflows, costs and timelines, and based on these findings we decided that the largest impact would be in late stage with process characterization. In late-stage development, we have a lot of process knowledge across different scales to compare the modeling results against, but also big work packages, stressful timelines and direct interaction with regulatory authorities – thus a huge need for optimization and the possibility of relatively direct regulatory feedback.
Thus, we changed our strategy and started applying mechanistic models during process characterization to establish a strong base of knowledge. With the large amounts of data, we could really see how accurate and scalable the models are. We then put a strong focus on model quality and validation.
With that process in mind, where would you say Boehringer Ingelheim and the technology of mechanistic modeling stand now?
I have to honestly say we were surprised how accurate the models proved to be early on. We thought there would be more work to optimize the models to fit our needs. Well, maybe I was more surprised, as the team was always very confident. But to be honest, we were lucky enough to recruit a few really key team members and PhD students, along with very strong working relationships with our academic partners and software providers, so the technology developed very quickly.
We have several publications on the strategy with these partners, and the technology is indeed impacting many of our late-stage processes at this point. Probably the most immediate success was to accurately predict the impact of a technical failure on a manufacturing process. I won’t go into detail, but the data provided by the model accurately predicted the impact on product quality of a commercial-scale run, and as a result the run was saved and not discarded. This represented a very large cost saving.
In addition, we are proud of our publication record in such a short time and we are confident that the model data have been successfully integrated into the regulatory strategy of several projects. The next steps will be to continue to expand the impact of models on process characterization and scale-up and to get more detailed feedback from regulatory authorities on our model quality and model validation strategies.
How do you see this developing in the direction of regulatory interactions? How do you think in silico solutions and mechanistic modeling will influence regulatory requirements?
We are really confident that many of the regulatory authorities are ready for, and interested in, looking at modeling data. Through interactions with both regulatory colleagues and colleagues throughout the industry, we do not believe that modeling will pose a large question for the authorities.
We believe the clarity will need to come around how the models and the data from the models are being applied: How can BI demonstrate the quality of the models and the model validation strategy being used to show their predictability and robustness? In fact, relative to other industries, even the NCE world, we as biopharma companies might be holding ourselves back by being too conservative and relying too heavily on wet-lab data.
As I said, we have impressed ourselves with the data that can be generated from mechanistic models in downstream, and I think a smart combination of these models with statistical approaches will bring a wealth of process understanding and can have a significant impact on what is currently meant by Quality by Design (QbD). In our view, the current standard of Design of Experiments-based approaches provides a relatively limited amount of data compared to a fully implemented mechanistic model, so the impact on the industry might be to raise the bar of process understanding.
Joey, thanks a lot for your answers! We are looking forward to getting into more detail with you at CHROMATOGRAPHY MODELING DAYS in September where you will be giving the keynote!
The answers given are the opinions of Dr. Joey Studts and do not represent the official strategies of Boehringer Ingelheim.