
Bioprocessing 4.0

Digital transformation of biopharmaceutical process development


Digital transformation and Biopharma 4.0

Digital transformation has become an increasingly important topic for the biopharmaceutical industry, as it opens up new opportunities for value creation and delivers an essential competitive advantage. Digital biomanufacturing takes advantage of the Internet of Things to connect disparate sources of data, equipment, materials and people. In combination with artificial intelligence, augmented reality, robotics and digital twins, Biopharma 4.0 changes legacy concepts from the ground up.

Biopharma 4.0: IT gets connected with operational technology

Benefits arise along the entire value chain, from the development of new product candidates and their associated manufacturing processes through regulatory approval, market launch, manufacturing, quality control and much more:

  • Digital lab notebooks and data lakes enable standardized aggregation of structured and contextualized data across the entire organization. Such data hubs are a prerequisite for most Pharma 4.0 applications.
  • Big data and artificial intelligence are used to screen potential target molecules virtually and play a key role on the path toward personalized medicine.
  • Digital twins of production assets enable more efficient operations, flexible and agile process designs, shorter time to market, lower process development costs, and an improved process understanding.
  • Virtual process control strategies and soft sensors enable real-time process monitoring and control and, ultimately, real-time release testing of the product.
  • Virtual and augmented reality support the design of the manufacturing facilities of the future, while increasing efficiency and reducing downtime in existing plants. Operator training and product change-overs become more efficient.

Digital twins in the biopharmaceutical industry

A digital twin is commonly understood as a virtual representation of a real-world process that allows the process to be understood, optimized and monitored. Industries such as aerospace and automotive engineering rely fundamentally on digital twins from concept and design through procurement, manufacturing and service.

The virtual representation is typically accomplished by precise process simulation, based either on statistical approaches, such as data analytics and machine learning, or on fundamental natural sciences. Digital twins based on process simulation have revolutionized many industries, among them the chemical industry:

Process modeling is the single technology that has had the biggest impact on our business in the last decade.

Frank Popoff, former CEO of Dow Chemical

Next generation bioprocesses are driven by digital twins

Although other industries seem to be far ahead, digital twins of bioprocesses have begun to make their mark on the biopharmaceutical industry as well. As a game-changing solution, digital twins allow laboratory experiments to be replaced with in silico simulations, providing a fast and cost-effective environment for research, development and innovation. Digital bioprocess twins hold enormous potential for value creation.

Benefits of digital bioprocess twins along the product life cycle

Digital bioprocess twins: statistical vs. mechanistic models

Digital twins of bioprocesses can be accomplished with statistical or mechanistic models. It is essential to consider the differences, since both modeling concepts have their pros and cons. In a nutshell:

  • Mechanistic models require a profound and fundamental understanding of the driving mechanisms, denoted in terms of physical/mathematical equations.
  • Statistical models are the only feasible method for processes with limited fundamental and quantitative understanding, e.g. living systems like the entire metabolism of a cell.
  • Statistical models are typically the method of choice for technical processes with low geometrical complexity, such as a bioreactor, which can often be simplified to an ideally stirred tank reactor.
  • Mechanistic models are required to simulate processes with high complexity, such as chromatography or filtration. As an example: all effects along the chromatography column and within the adsorber pores contribute to the final chromatogram. Statistical models are typically unable to represent this level of complexity; instead, they oversimplify the digital twin, which may jeopardize project success.

To comply with regulatory requirements on process understanding and Quality by Design, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) further suggests utilizing mechanistic models for process characterization and process validation (PC/PV).
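To make the contrast concrete, a mechanistic model encodes the driving physics directly as equations. The sketch below, with purely illustrative parameter values (none are taken from this article), integrates Langmuir adsorption kinetics for a well-mixed batch adsorption step, the kind of building block a mechanistic chromatography model is assembled from:

```python
# Minimal mechanistic-model sketch: Langmuir adsorption kinetics in a
# well-mixed batch system. All parameter values are illustrative
# assumptions, not taken from any real process.

def simulate_langmuir_batch(c0=1.0, q_max=2.0, k_ads=1.5, k_des=0.1,
                            dt=0.001, t_end=10.0):
    """Integrate dq/dt = k_ads * c * (q_max - q) - k_des * q with a
    simple explicit Euler scheme. The mass balance of the closed system
    (c + q = c0) eliminates the liquid-phase concentration c."""
    c, q, t = c0, 0.0, 0.0
    while t < t_end:
        dq = k_ads * c * (q_max - q) - k_des * q  # adsorption - desorption
        q += dq * dt
        c = c0 - q  # conservation of mass in the closed system
        t += dt
    return c, q

# After sufficient time the system approaches the Langmuir equilibrium,
# where adsorption and desorption rates balance.
c_final, q_final = simulate_langmuir_batch()
```

A statistical model, by contrast, would fit a response surface to measured (input, output) pairs without representing the adsorption mechanism at all; the mechanistic variant extrapolates to unseen operating conditions because the rate law itself is encoded.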

Video: digital bioprocess twins.

Digital twins based on mechanistic models turn data into deep process knowledge

Digital downstream twins built from mechanistic models turn sparse process data from development or manufacturing into deep process understanding. This knowledge is expressed in the form of model equations and parameters, allowing the development and management of process knowledge in simple, clear and precise language.

Compared to purely experimental approaches, they allow more and better data to be gathered and enable sound, science-based decision-making along the entire product lifecycle. Associated risks can be mitigated early and at the lowest cost.

Integrated into a development lab or a manufacturing site, digital twins built from mechanistic models serve as soft sensors and thereby facilitate a data-based approach to more effective biopharma process control and root-cause analysis.
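The soft-sensor idea can be sketched in a few lines: an unmeasured quantity is estimated in real time from a measured signal through a model calibrated offline. The linear (Beer–Lambert-like) relation and all numbers below are hypothetical illustrations, not part of any actual product:

```python
# Hypothetical soft-sensor sketch: estimate an unmeasured product
# concentration from an online UV absorbance signal. Calibration data
# and the linear relation are illustrative assumptions.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Offline calibration: paired (UV signal, lab-measured concentration)
uv_cal   = [0.0, 0.5, 1.0, 1.5, 2.0]
conc_cal = [0.0, 1.1, 2.0, 3.1, 4.0]
a, b = fit_linear(uv_cal, conc_cal)

def soft_sensor(uv_signal):
    """Real-time concentration estimate from the online UV reading."""
    return a * uv_signal + b
```

In practice the calibrated model would be a full mechanistic process model rather than a straight line, but the pattern is the same: measured signals in, an otherwise inaccessible process state out, continuously and without additional offline assays.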

Downstream process simulation – why not use chemical tools?

Size of pharmaceuticals: a typical small chemical molecule (acetylsalicylic acid) is composed of 21 atoms; an antibody is composed of more than 25,000 atoms

For decades, downstream process simulation tools have played a crucial role in the chemical industry. The core mechanisms in chemical downstream processing may seem similar to those in biopharmaceutical downstream processing. However, biopharmaceuticals are significantly more complex than small chemical molecules, so mechanistic simulation tools designed for chemical processes are not suitable for biopharmaceutical process simulation.

Biopharmaceutical process simulation is a highly interdisciplinary affair. It requires a fundamental understanding of biotechnological processes, substantial knowledge of applied mathematics, solid software programming skills and profound industry knowledge. Despite the technology's popularity in the chemical industry, biopharmaceutical downstream simulation has remained an academic topic for decades.

DSPX facilitates biopharma process simulation


GoSilico's DSPX simulation software overcomes this barrier of complexity and opens the door to digital bioprocess twins for large biomolecules. DSPX is based on a new generation of physical and biochemical models that accurately describe the behavior of large proteins. State-of-the-art mathematics delivers unsurpassed simulation speed, while machine-learning-based model calibration minimizes the need for expert knowledge and reduces human error.

DSPX enables the full utilization of mechanistic models by providing an accurate, fast and robust simulation framework that is also easy to use. Check out GoSilico’s industrial success stories and references for an overview of how this technology is changing downstream bioprocessing.

Discover in silico case studies