    September 8, 2019

    Why Bioprocessing 4.0 Is A Force To Be Reckoned With

    Whether you’re ready or not, the fourth industrial revolution is on its way. Far from just a funky new buzzword, “Industry 4.0” refers to a new approach to manufacturing that we can’t afford to ignore, bioprocessing engineers included.

    What is Industry 4.0?

    To give bioprocessing 4.0 some context, we need to consider a trend that is shaking up manufacturing across sectors: Industry 4.0, an umbrella term for a new, digitally connected approach to manufacturing.

    Preceded by the first (think steam engines, mechanization, and the first factories), second (electricity, gas, and oil), and third (electronics, computers, the internet) industrial revolutions, each of which enabled a leap in productivity, this one is... different. It’s digital. Connected. Integrated.

    While there’s no hard and fast definition, these are the general features of Industry 4.0:

    • Physical processes affect computations and vice versa. Through feedback loops known as cyber-physical systems, the physical components of a manufacturing process are monitored and steered by computer-based algorithms (a minimal sketch of such a loop follows this list).
    • Connection via the Internet of Things (IoT). Computing devices and machines are connected to each other via the internet. Through inbuilt sensors, data from interrelated devices are fed into a cloud platform, allowing algorithm-driven analysis and optimization.
    • Smart factories. Factories run autonomously, self-optimize, and learn in real time. Powered by artificial intelligence, advanced robotics, and sensor technology, the manufacture and delivery of products adapt to real-time shifts in demand and transport availability.
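
    To make the cyber-physical idea concrete, here is a minimal sketch in Python of such a feedback loop. The bioreactor, setpoint, and gain values are all hypothetical; it illustrates only the sense-compute-actuate cycle, not any real control system:

```python
# Minimal sketch of a cyber-physical feedback loop: a hypothetical bioreactor
# whose temperature is corrected each cycle by a proportional controller.
# All names and constants are illustrative, not from a real rig.

SETPOINT_C = 37.0   # target culture temperature (degrees Celsius)
GAIN = 0.5          # proportional gain of the controller

def read_sensor(true_temp: float) -> float:
    """Stand-in for an IoT temperature probe; a real one adds noise and latency."""
    return true_temp

def heater_adjustment(measured_temp: float) -> float:
    """Algorithmic half of the loop: compute a correction from the error."""
    return GAIN * (SETPOINT_C - measured_temp)

temp = 30.0  # vessel starts below setpoint
for cycle in range(15):
    temp += heater_adjustment(read_sensor(temp))  # actuator applies correction
    temp -= 0.1                                   # ambient heat loss per cycle
    print(f"cycle {cycle:2d}: {temp:5.2f} C")
```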

    The term was coined in 2011 by the German government, which announced “Industrie 4.0” as one of the key initiatives in its high-tech strategy to drive manufacturing forward.

    When these concepts are applied to different industries, loosely defined variations arise, such as “pharma 4.0”, “bioprocessing 4.0”, or “digital bioprocessing”.

    Bioprocess engineers set to benefit from the evolving industry

    So, what does this mean for bioprocessing engineers? How could it shake up the way they work? According to Dr James Rutley, Senior Bioprocessing Engineer at Synthace, the potential benefits are enormous: “Connected devices and automated collation, organization and analysis of data structures would enable greater data integrity and faster processing of time-critical information. Furthermore, adaptive control of unit operations using real-time data processing opens the door to much more efficient bioprocess optimization, potentially removing whole experimental iterations in the development of a robust, commercially viable bioprocess.”

    Automated data processing combined with physical process execution has the potential to boost many bioprocesses. In protein production, the number of possible parameter combinations grows exponentially with each parameter, and there are many to choose: expression construct, secretion signal peptide, inducer concentration, induction time, temperature, and substrate feed rate in fed-batch operation. Using standard microtiter plate cultivation, parameter selection is often based on empirical knowledge, and only a few process modifications can be tested. In contrast, the search for optimal bioprocess parameters can be dramatically accelerated by feeding robotics with data that has been rapidly and automatically analyzed.
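
    To see how quickly that combinatorial space outgrows a plate-based workflow, here is a back-of-the-envelope sketch. The factor names and level counts are assumptions chosen purely for illustration:

```python
# Back-of-the-envelope sketch of the combinatorial growth described above.
# Factor names and level counts are hypothetical, for illustration only.
from math import ceil, prod

levels = {
    "expression_construct": 4,   # candidate constructs
    "signal_peptide": 3,         # secretion signals
    "inducer_concentration": 5,  # titration points
    "induction_time": 4,         # sampling times
    "temperature": 3,            # cultivation temperatures
    "feed_rate": 5,              # fed-batch feed profiles
}

total = prod(levels.values())
print(f"full-factorial combinations: {total}")                 # 3600
print(f"96-well plates to run each once: {ceil(total / 96)}")  # 38
```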

    In a more recent example, the value of automation has been demonstrated in cell culture, where the generation of retinal pigment epithelium (RPE) cells is an attractive therapy for treating age-related macular degeneration. Recognising that the differentiation of human pluripotent stem cells (hPSCs) remains a laborious, expensive, and lengthy process, Regent et al. describe in Scientific Reports a fully automated RPE cell differentiation process, from hPSC thawing to the banking of differentiated cells. Following their protocol, it is theoretically possible to produce a cell bank far larger than any previously described.

    Overall, a shift towards a more automated, data-driven system would allow engineers to spend more time on innovative aspects of their role, and less time navigating tedious re-runs and breakdowns.

    The same applies to bench scientists; few would complain about spending less time preparing plates, re-doing assays, and drowning in masses of data.

    As summarised in this PharmaTimes article by Dr Peter Crane, Corporate Strategy Manager at Synthace, scientists should be at the centre of the shift:

    “Digital solutions on their own will not deliver the productivity gains the industry seeks, only by introducing tools that empower R&D teams to work more efficiently in the laboratory setting will the true power of data science in biopharma be realized.”

    How automation can help biopharma

    Dr Crane explains: “Biopharmaceutical companies are under pressure to get products to market quicker, with every day of delay projected to cost approximately $1-13M in lost revenue for a best-in-class product in a major indication. Likewise, in some rare diseases, it is highly possible to treat an entire latent patient population in the trial, leading to a natural acceleration amongst companies chasing these indications. As such, many companies with innovative products are choosing to compress their clinical trials through an accelerated approvals route, essentially weighing up patient benefit versus the risk of less Chemistry, Manufacturing, and Controls (CMC) data. This means that innovative products are being launched into the market, often without the extensive CMC studies that would accompany a standard clinical development pathway. Within the cell and gene therapy industry, challenges around process and assay robustness, and the increased stringency of the commercial label, are also hampering the ability of companies to deliver reimbursable product to patients (or expand their label into other indications). Likewise, companies adopting an expedited 505(b)(2) route for less innovative products are running into expensive failures, often due to an incomplete understanding of the formulation design space. In all these cases a potential solution is to fully adopt a Quality by Design (QbD) approach earlier, with preliminary design space exploration through a DoE (design of experiments) perhaps even being done within R&D. Enabling this burdensome task earlier in the value chain, through QbD coupled to physical and data automation, is where we see the potentially transformative impact of new digitally enabled tools in biopharma.”
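
    As a loose illustration of the early design-space exploration Dr Crane describes, the sketch below builds a two-level full factorial over three hypothetical formulation factors and fits a main-effects model to simulated responses. In practice the responses would come from the lab, and the factors from the product at hand:

```python
# Hedged sketch of early DoE-style design-space exploration. Factor names,
# the simulated response, and all coefficients are assumptions.
import itertools
import numpy as np

factors = ["pH", "ionic_strength", "excipient_conc"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 8 runs

# Simulated responses (e.g. product stability); real values come from assays.
rng = np.random.default_rng(0)
response = (10
            + 2.0 * design[:, 0]
            - 1.5 * design[:, 2]
            + rng.normal(0, 0.2, len(design)))

# Fit main effects: response ~ intercept + coded factor levels.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

for name, effect in zip(["intercept"] + factors, coef):
    print(f"{name:>15}: {effect:+.2f}")
```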

    CAR T therapy and other innovative personalized therapies hold great promise, yet they place significant pressure on bioprocessing companies to deliver complicated, high-quality products, often for small patient subsets, at affordable costs. It’s fair to say that biopharma could do with a helping hand.

    Here, “scaling out” (bioreactors remain small, but more are used) is expected to enable the delivery of these innovative products to patients where the traditional model of “scaling up” (increasing the size of bioreactors) may not. Having small, closed, single-use bioreactors means that multiple patients can be treated in parallel. This in turn introduces further complications around sample traceability, release testing, and facility planning: all challenges that technology can partially resolve. The end goal is localized manufacturing of patient-specific therapeutics, controlled for quality by a suite of Industry 4.0 multi-site software and hardware.
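
    One corner of that traceability problem can be sketched as a data structure. The record below is hypothetical, not drawn from any standard or vendor software; it only illustrates the kind of identity and QC state a multi-site system has to carry for every patient-specific batch:

```python
# Hypothetical batch record for a patient-specific, scaled-out process.
# Field names and the release rule are assumptions for illustration only.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict

@dataclass
class BatchRecord:
    patient_id: str        # pseudonymized patient identifier
    bioreactor_id: str     # which single-use vessel ran the batch
    site: str              # manufacturing site
    started: datetime      # run start, for traceability
    qc_results: Dict[str, bool] = field(default_factory=dict)

    def releasable(self) -> bool:
        """A batch may ship only if every release test exists and has passed."""
        return bool(self.qc_results) and all(self.qc_results.values())

batch = BatchRecord("PT-0042", "SUB-07", "London", datetime(2019, 9, 1))
batch.qc_results.update({"sterility": True, "identity": True, "potency": True})
print(batch.releasable())  # True
```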

    Below are examples of challenges in the bioprocessing industry, paired with ways in which Bioprocessing 4.0 can help:

    Challenge: R&D processes are typically not robust and require extensive optimization.
    How Bioprocessing 4.0 can help: Physical and digital automation enables researchers to optimize their experiments more quickly and accelerate the learning cycle.

    Challenge: Manufacturing cell therapies is complex and far from optimized.
    How Bioprocessing 4.0 can help: Better analytics combined with a Quality by Design approach would provide a deeper understanding of process intermediates as well as the final product.

    Challenge: Maintenance of bioprocessing components requires downtime.
    How Bioprocessing 4.0 can help: Automated sensors that predict wear and tear would minimize maintenance downtime, reduce spare-part inventory, and enable the continual improvement of systems.

    Challenge: Release testing can be a lengthy process.
    How Bioprocessing 4.0 can help: Real-time release testing enabled by Process Analytical Technology (PAT) can shorten testing timelines, allowing faster release of therapeutics to patients.

    Challenge: It takes time to design, plan, and set up bioprocessing runs.
    How Bioprocessing 4.0 can help: Automating data structuring and processing, and feeding analysed data back into subsequent experimental designs, would reduce the time between runs and improve run reproducibility.

    What’s it going to take?

    If the bioprocessing 4.0 future is to be realized, what technical changes are needed? Dr Crane highlights three places to start:

    • More equipment needs to be digitally- or cloud-enabled

    • Investment in internal IT systems to enable high data volumes

    • Investment in pre-processing systems to cater for experiments that produce large data sets (e.g. metabolomics, a technique that is starting to be adopted in bioprocessing); see the sketch after this list
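
    As a sketch of what such a pre-processing system might do, the snippet below streams a large (hypothetical) metabolomics results file in fixed-size chunks so the full data set never has to fit in memory. The file name and its columns are assumptions:

```python
# Chunked pre-processing sketch for a large results file. The file name and
# its "metabolite"/"intensity" columns are hypothetical.
import pandas as pd

running_sums, running_counts = {}, {}
for chunk in pd.read_csv("metabolomics_run.csv", chunksize=100_000):
    grouped = chunk.groupby("metabolite")["intensity"].agg(["sum", "count"])
    for name, row in grouped.iterrows():
        running_sums[name] = running_sums.get(name, 0.0) + row["sum"]
        running_counts[name] = running_counts.get(name, 0) + int(row["count"])

means = {m: running_sums[m] / running_counts[m] for m in running_sums}
print(f"mean intensity computed for {len(means)} metabolites")
```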

    What would a 4.0 advocate say to those concerned about the time it could take to transition into a more connected laboratory environment? Crane says we must consider why the transition may be time-consuming:

    “Often it will be time-consuming due to the high CapEx requirement of replacing equipment, or due to internal IT. In our vision, software should work with existing lab hardware, breaking down a slow procurement blocker and delivering benefits to bioprocessing teams, today.”

    In this context, it is worth contemplating a quote by Ralf Speth, automotive executive and current CEO of Jaguar Land Rover: “If you think good design is expensive, you should look at the cost of bad design.”

    Dr Crane also notes that if data science is to become the new de facto standard, then all scientists need to learn the basics of working in a digitally enabled way. This demand is being recognized by some institutions; UCL recently advertised for the role of Lecturer in Digital Process Engineering.

    Other industries, such as finance, have shifted away from a “this is how we’ve always done it” mindset… now it seems it’s only a matter of time for biopharma to make the leap too.
