In light of advancing digitalization, mechanistic modeling has gained traction in the biopharmaceutical industry. In particular, mechanistic models describing chromatography processes have proven to be a very promising technique. These models offer numerous advantages, such as cost and time savings, and are applicable at any point in the product life cycle. Digital twins based on mechanistic models enable an in-depth understanding of even very complex separation problems.
However, the main drawback of mechanistic modeling is the cumbersome model calibration process. Multicomponent feedstocks lead to multidimensional parameter estimation problems with many unknown protein parameters. Standard model calibration techniques may result in unreasonable correlations and unphysical parameter estimates. But how can we mitigate this risk, improve the model’s quality, and at the same time reduce the time needed for model calibration?
Here are GoSilico’s recommendations for a straightforward and GMoP (Good Modeling Practice) compliant modeling workflow.
1. Definition of the modeling purpose: Why am I modeling this process?
The first step is to set the scope and define the project goal. It is important to understand the process challenges and to define the expected model capability, so that the model will meet the desired requirements later on. These considerations are essential for the experimental planning of the calibration runs and for the definition of a model validation strategy.
2. Experimental planning: What information to include for mechanistic model calibration?
The model requirements with respect to quality need to be defined first, and the required capacity, resources, and expertise need to be specified. Only then can suitable experiments be identified. Two aspects are especially important for the calibration experiments. First, the quality of the experiments is crucial to eliminate as many uncertainties as possible (e.g. unknown porosities, system dead volumes, ionic capacities, …). Second, the calibration experiments need enough variation to allow deriving all information relevant to the intended model application.
The validation strategy should also be planned, or at least roughly specified, at this stage. This helps to ensure that the model can meet its design purpose later.
3. Model selection: How much complexity is required?
Moving on to model selection, it is again important not to introduce additional uncertainty by choosing overly complicated models with parameters that are irrelevant to the project at hand. The selected column, pore, and isotherm models should only be as complex as necessary.
4. Data plausibility check: Does my data make sense?
Before the generated data is used for model calibration, the plausibility of the experimental outcomes should be checked. It is important to account for the characteristics of the experimental setup; for example, back mixing effects need to be modeled accordingly.
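One common way to account for back mixing is to model the system's extra-column dead volume as an ideal continuous stirred tank (CSTR) in series with the column, so that the simulated outlet signal is smeared with an exponential residence time distribution. A minimal sketch, with a hypothetical dead volume and flow rate (illustrative only, not ChromX's internal implementation):

```python
import numpy as np

def apply_cstr(signal, dt, dead_volume, flow_rate):
    """Smear a concentration signal with the residence time distribution
    of an ideal CSTR: E(t) = exp(-t / tau) / tau, with tau = V / Q."""
    tau = dead_volume / flow_rate
    t = np.arange(signal.size) * dt
    rtd = np.exp(-t / tau) / tau
    # discrete convolution, truncated to the original signal length
    return np.convolve(signal, rtd)[: signal.size] * dt

# Example: a sharp pulse leaving the column acquires an exponential tail
dt = 0.01                              # time step [min]
pulse = np.zeros(200)
pulse[10] = 1.0 / dt                   # unit-area pulse at t = 0.1 min
smeared = apply_cstr(pulse, dt, dead_volume=0.1, flow_rate=1.0)  # 0.1 mL at 1 mL/min
```

Because the CSTR only redistributes mass in time, the area under the smeared peak stays (approximately, up to discretization error) equal to the injected amount, which is a useful plausibility check in itself.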
5. Mechanistic model calibration: How to determine the unknown model parameters?
To avoid parameter correlations and parameter estimates without physical meaning, the model calibration workflow should ideally follow a bottom-up approach: start with a simple model and few calibration experiments, then add complexity step by step. Certain attributes of a chromatogram carry information about specific parameters, and this fact should be exploited for parameter determination.
As an example, in IEC, linear gradient elutions (LGEs) at low column loading can be used to determine the protein charge and the equilibrium constant using the Yamamoto approach. To do so, at least three LGEs with different gradient lengths are needed; the charge and equilibrium constant are then determined from the gradient slopes and the salt concentrations at peak retention. The parameter values obtained can then be transferred to additional calibration experiments. The Yamamoto approach is directly implemented in GoSilico’s simulation software ChromX as the “Peak finder”.
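The regression behind the Yamamoto approach can be sketched as follows: for each LGE, the normalized gradient slope GH and the ionic strength at peak retention I_R obey log10(GH) = (B + 1)·log10(I_R) − log10(K_eq·Λ^B·(B + 1)), so a linear fit in log-log space yields the characteristic charge B and, given the ionic capacity Λ, the equilibrium constant K_eq. An illustrative sketch with synthetic numbers (not the ChromX “Peak finder” implementation):

```python
import numpy as np

def yamamoto_fit(gh, i_r, ionic_capacity):
    """Estimate characteristic charge B and equilibrium constant K_eq
    from >= 3 LGE runs via linear regression in log-log space:
    log10(GH) = (B + 1) * log10(I_R) - log10(K_eq * Lambda**B * (B + 1))."""
    slope, intercept = np.polyfit(np.log10(i_r), np.log10(gh), 1)
    b = slope - 1.0
    k_eq = 10.0 ** (-intercept) / (ionic_capacity ** b * (b + 1.0))
    return b, k_eq

# Synthetic example: three gradients generated with assumed "true" values
B_true, K_true, Lam = 4.0, 0.02, 0.5            # charge, K_eq, ionic capacity [M]
i_r = np.array([0.30, 0.35, 0.40])              # retention ionic strengths [M]
gh = i_r ** (B_true + 1) / (K_true * Lam ** B_true * (B_true + 1))

B_est, K_est = yamamoto_fit(gh, i_r, Lam)       # recovers B and K_eq
```

With real data the three points will not fall exactly on a line, and the residuals of the fit give a first feel for how consistent the gradient experiments are.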
Once the charge and equilibrium parameters have been identified, other parameters such as the binding kinetics, mass transfer, and pore diffusion characteristics can be determined from a step elution. Ligand shielding due to steric hindrance and repulsion between protein molecules can be determined from an additional highly loaded LGE. This step-by-step approach mitigates the risk of unreasonable correlations and unphysical parameter estimates.
6. Parameter uncertainty analysis: Can I trust my calibrated model?
After mechanistic model calibration, the model quality needs to be investigated. The first indicator for model quality is the visual fit. A high-quality model describes all calibration runs accurately.
Once a good visual fit has been obtained, the model quality can be evaluated more rigorously. To investigate parameter sensitivity, the 95% confidence intervals (CIs) need to be determined; these can be calculated directly in ChromX. Well-determined parameters have small CIs, whereas large CIs indicate that the respective parameters could not be determined accurately from the calibration data.
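How such confidence intervals are typically obtained can be sketched for a generic least-squares fit: linearize the model at the optimum, approximate the parameter covariance as cov = s²·(JᵀJ)⁻¹ with J the residual Jacobian and s² the residual variance, and take 1.96·√diag(cov) as the 95% CI half-widths. A toy example with a hypothetical two-parameter decay model (not ChromX's internal procedure):

```python
import numpy as np
from scipy.optimize import least_squares

def model(p, t):
    """Hypothetical two-parameter model: amplitude * exp(-rate * t)."""
    return p[0] * np.exp(-p[1] * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)
y = model([2.0, 0.8], t) + 0.01 * rng.standard_normal(t.size)  # synthetic data

res = least_squares(lambda p: model(p, t) - y, x0=[1.0, 1.0])

dof = t.size - res.x.size                 # degrees of freedom
s2 = 2.0 * res.cost / dof                 # residual variance (cost = 0.5 * SSR)
cov = s2 * np.linalg.inv(res.jac.T @ res.jac)
ci95 = 1.96 * np.sqrt(np.diag(cov))       # 95% CI half-widths per parameter
```

Parameters whose half-width is large relative to the estimate itself are the ones the calibration data cannot pin down.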
To investigate the influence of parameter insensitivity on quality attributes such as yield or purity, the parameter space can be sampled within the CIs. This helps to decide whether the achieved parameter sensitivities are sufficient. Based on the outcome of the parameter uncertainty analysis, additional calibration experiments can be performed to improve the model quality.
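The sampling step can be sketched as a simple Monte Carlo study: draw parameter sets within the CIs, evaluate the model for each draw, and inspect the spread of the resulting quality attribute. All numbers and the yield function below are hypothetical placeholders for a full chromatography model run:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fitted parameters with 95% CI half-widths (illustrative numbers only)
p_hat = np.array([4.0, 0.02])   # e.g. characteristic charge, equilibrium constant
ci = np.array([0.3, 0.004])     # half-widths of the 95% confidence intervals

def simulated_yield(p):
    """Placeholder for a full model run returning a quality attribute;
    here, yield simply degrades as parameters move off the optimum."""
    return 0.95 - 0.05 * abs(p[0] - 4.0) - 2.0 * abs(p[1] - 0.02)

# Sample uniformly within the confidence box and propagate to the yield
samples = rng.uniform(p_hat - ci, p_hat + ci, size=(1000, 2))
yields = np.array([simulated_yield(p) for p in samples])

spread = yields.max() - yields.min()   # small spread -> CIs are tight enough
```

If the spread of the quality attribute across the sampled parameter sets is negligible, the remaining parameter uncertainty does not matter for the intended application; if it is large, targeted additional calibration experiments are warranted.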
7. Model validation: Is my model’s prediction accurate?
Once the final model has been calibrated, it must be validated experimentally. As mentioned before, the validation strategy should already have been defined during experimental planning. Model validation can be done with one or multiple experiments.
A common approach is to perform model validation with in silico optimized process conditions. Another possibility is to choose process conditions at the model’s edges of failure. It is also possible to include experiments at different scales in the model validation, provided all systems and columns used have been characterized. Every calibrated mechanistic model should be able to extrapolate to process conditions outside the calibration space.
The validation is typically done with respect to peak shapes and positions as well as critical quality attributes. The validation runs can be imported to ChromX to compare experimental data and model prediction.
8. Model application: Stop experimenting and go silico with your mechanistic model.
Once successfully calibrated and validated, the model can be used along the entire development life cycle of the product.
The desired model application should be kept in mind throughout the entire modeling workflow, as different model purposes have different implications for the required model quality and capability; this guarantees a suitable ratio between effort and benefit.
This clearly defined, straightforward workflow facilitates the calibration of mechanistic models significantly: it increases the speed of model calibration, reduces model uncertainty, and mitigates the risk of parameter overfitting. Following this approach, the main drawback of mechanistic modeling can be overcome, allowing the technique to unfold its full potential.