By Dr. Barbara Paldus, CEO Finesse Solutions, Inc.
The biomanufacturing industry is undergoing a major shift, from single-product processes and stainless steel infrastructure to flexible, multi-product facilities using single-use technology. Though single-use technology is being widely adopted, automation and measurement still lag behind it, limiting how fully companies can exploit the equipment and integrate its data.
Next Challenges in Biomanufacturing
The next set of biomanufacturing challenges go hand in hand: (1) the scale-down of the bioprocess, and (2) perfusion/continuous processing (CP). The first is driven by the fact that production titers continue to increase; a 6 g/L titer is no longer uncommon, and that figure already exceeds the limits of many downstream processing capabilities. The progress in production titers has created a gap in single-use processing, where upstream productivity is mismatched with the throughput and capacity of the downstream equipment. For a typical 200 liter bioreactor (far below the 2000 liter scale), companies will need to run multiple cycles downstream (specifically chromatography) or run parallel single-use units to meet throughput demands, raising important questions about how to automate and configure the plant for that level of throughput. Chromatography equipment will have to be configured differently to handle the large volume, as will filtration equipment, since high protein loads will clog filters. In these cases, companies must ask themselves how each step will be configured and automated.
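The upstream/downstream mismatch can be made concrete with a little arithmetic. The sketch below (all figures hypothetical; real loadings depend on dynamic binding capacity at the chosen residence time and on step yield) estimates how many capture-chromatography cycles a single harvest would require:

```python
import math

def chromatography_cycles(volume_l, titer_g_per_l, resin_l, capacity_g_per_l):
    """Number of capture-chromatography cycles needed to process one harvest."""
    harvest_mass_g = volume_l * titer_g_per_l      # total product to capture
    load_per_cycle_g = resin_l * capacity_g_per_l  # mass bound per column cycle
    return math.ceil(harvest_mass_g / load_per_cycle_g)

# A 200 L harvest at 6 g/L against a 20 L column binding 40 g/L of resin:
# 1200 g of product vs. 800 g per cycle -> 2 cycles
print(chromatography_cycles(200, 6, 20, 40))  # -> 2
```

Doubling the titer without changing the column doubles the cycle count, which is exactly the kind of scheduling question the plant's automation must absorb.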
CP offers many benefits, including increased productivity, but it brings complexities in process control, data recording, and regulatory compliance. In the past, when engineers designed an end-to-end process, it was divided into unit operations with clear boundaries: a unit operation followed by a transition point, the next unit operation followed by a transition point, and so forth. A batch was composed of this series of unit operations, and regulatory oversight followed the same organized, unit-by-unit structure.
With CP, there is continuous flow of material from the bioreactor through the last polishing step. How is a batch defined, now that product runs through all the unit operations continuously end to end? How is a process deviation documented when a batch fails? What are the affected materials in a failure? If a failure is detected at the end of the process, how does one know when the batch went bad, and can one keep the materials previously produced? With batch operations, it is much easier to determine the affected lots in a deviation.
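One way to begin answering these questions is to work backward from the detection time through each unit operation's residence time. A minimal sketch, assuming ideal plug flow (no back-mixing, which real surge vessels would violate) and hypothetical hold times:

```python
def suspect_window(detection_time_h, residence_times_h):
    """Earliest bioreactor-outlet time whose material could be implicated
    when a failure is detected at the end of a continuous train.

    Assumes ideal plug flow through each unit operation; back-mixing and
    surge vessels in a real train would widen this window.
    """
    total_residence = sum(residence_times_h)
    # Material that left the bioreactor after this instant is suspect:
    return detection_time_h - total_residence

# Failure detected at t = 48 h; capture, viral inactivation, and polishing
# hold material for 6 h, 2 h, and 4 h respectively:
print(suspect_window(48, [6, 2, 4]))  # -> 36
```

Everything produced before the returned time could, under these assumptions, still be released; everything after it is part of the deviation investigation.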
The benefits of CP coupled with single-use technology are higher throughput, increased flexibility, elimination of cleaning issues, and reduced operating costs. Because downtime is minimized and titers are increased, products can be processed much more quickly than in standard batch operations. But as facilities scale processes down and link everything together, automation must improve accordingly.
Measurement in Continuous Processing
The addition of CP also drives the need for analytics. Continuous operations do not have the same intermediate cut-off points for performing analytics that batch operations do, so in-line measurements are needed to continuously quantify the protein or antibody at key points in the process.
This presents a challenge for the analyzers themselves, as well as the issue of how process data is fed back and how the process reacts to it. In some cases, the analyzers simply do not exist yet, but where they do exist, they may take considerable time to provide results. If there is a reduction in quality or a failure during processing, this lag makes it difficult to determine the point in time at which the issue began and diagnose the quality or contamination issue. In the meantime, the CP process is producing materials and incurring operating expenses.
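The cost of that lag can be estimated directly: while an analyzer result is pending, the process keeps producing material of unknown quality. A small illustration with hypothetical figures:

```python
def at_risk_mass(flow_l_per_h, titer_g_per_l, analyzer_delay_h):
    """Product mass made while an in-line analyzer result is still pending.

    Until the result arrives, this material's quality is unknown; all
    figures here are hypothetical.
    """
    return flow_l_per_h * titer_g_per_l * analyzer_delay_h

# A perfusion stream at 2 L/h and 1.5 g/L with a 4 h analyzer turnaround
# leaves 12 g of product of unverified quality per result:
print(at_risk_mass(2, 1.5, 4))  # -> 12.0
```

Shrinking the analyzer delay shrinks the exposure linearly, which is why faster in-line analytics matter as much as the measurements themselves.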
Automation: Addressing Challenges and Making Use of Existing Infrastructure
Automation will play a major role in facilitating new processing strategies such as semi-continuous, perfusion, and continuous and will help bridge the gap for companies with existing infrastructure. It will enable more efficient dosing/feeding in the bioreactor, buffer dilution, pH changes in the process, and creation of gradients in the chromatography. In a continuous process with large amounts of buffers, automation will allow facilities to dilute concentrated buffers in-line to avoid sizeable liquid storage.
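In-line dilution itself reduces to a simple mass balance between the concentrate pump and the diluent pump. A minimal sketch, assuming ideal instantaneous mixing and volume additivity, with hypothetical flow and concentration values:

```python
def dilution_flows(total_flow_l_min, c_stock_molar, c_target_molar):
    """Pump setpoints for in-line dilution of a concentrated buffer.

    Mass balance: Q_conc * C_stock = Q_total * C_target. Assumes ideal
    mixing and volume additivity.
    """
    if c_target_molar > c_stock_molar:
        raise ValueError("target concentration exceeds stock concentration")
    q_conc = total_flow_l_min * c_target_molar / c_stock_molar
    q_diluent = total_flow_l_min - q_conc
    return q_conc, q_diluent

# Deliver 10 L/min of 0.05 M buffer from a 0.5 M concentrate:
# 1 L/min of concentrate plus 9 L/min of water-for-injection
print(dilution_flows(10, 0.5, 0.05))  # -> (1.0, 9.0)
```

The storage saving is the point: the facility holds one tenth of the liquid volume and lets the automation layer maintain the two flow ratios in real time.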
Updating Existing Infrastructure
One of the attractive qualities of single-use is that the components are easy to modify, unlike stainless steel equipment, which can require pipe cutting, welding, and the addition of ports. In a single-use context, the alteration is simply adding another tube, a sensor, or a junction by melting plastic. Though modification in the single-use sphere is relatively uncomplicated, automation must be added to maximize the potential of the technology. Flexible plug-and-play control systems will become critical in enabling older generations of single-use equipment to carry out more complex operations.
Additionally, components such as pumps and valve controls must be scaled down and made more accurate for continuous and smaller-scale processing. These physical improvements will be necessary as dose amounts become smaller: at a certain point, a dose may be a single droplet from a tube, which puts the burden of accuracy (or sheer availability) on the system components themselves.
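The droplet limit can be expressed as a quantization problem: the smallest deliverable increment bounds the achievable accuracy. A small illustration with hypothetical volumes:

```python
def dose_error(requested_ul, droplet_ul):
    """Best-achievable dose and relative error when the smallest deliverable
    increment is one droplet (volumes are illustrative)."""
    n = round(requested_ul / droplet_ul)   # whole droplets closest to target
    delivered = n * droplet_ul
    rel_error = abs(delivered - requested_ul) / requested_ul
    return delivered, rel_error

# Requesting 125 uL with 20 uL droplets: 6 droplets = 120 uL, a 4% shortfall
print(dose_error(125, 20))  # -> (120, 0.04)
```

No amount of control-software sophistication recovers that 4%; only a finer physical increment does, which is why the burden falls on the components.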
Global Process Optimization: From Islands of Data to Integrated Facilities
In the past, users would normally start with the unit operations of a batch process, but as processes move from single process steps to continuous operation, these “islands of data” must be tied together through an automation layer that handles communication. There is an interdependency between measurement and real-time “action” that requires global process optimization.
OPC, an industrial interoperability standard maintained by the OPC Foundation (originally built on Microsoft's OLE/COM technology), allows for real-time vertical integration: it connects different kinds of controllers (e.g. Siemens or Emerson DeltaV), consolidates data in one large repository, and lets all the unit steps interact through an upper layer. A facility can also add a manufacturing execution system (MES), such as Werum's PAS-X or Emerson's Syncade, so that the plant network architecture consists of multiple layers, each with its own function.
Without the “upper layer,” data is not integrated and nothing ties the transitions between operations together. The integrated approach is more streamlined because a facility's information systems and actions can function as a whole. If a problem is detected in one part of the process, the batch can be stopped before it consumes a significant amount of materials (if a protein were of suboptimal quality, a facility would not want to waste chromatography resins purifying a protein that would never go into an end product).
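This stop-before-waste behavior can be sketched as a simple supervisory rule: halt everything downstream of the first detected deviation. The unit-operation names and status flags below are hypothetical, not a real MES or OPC interface:

```python
def halt_downstream(train, status):
    """Return the unit operations to stop, given a quality status per step.

    train  -- ordered list of unit-operation names, upstream to downstream
    status -- dict mapping name -> True (in spec) / False (deviation)
    """
    for i, op in enumerate(train):
        if not status.get(op, True):
            return train[i:]   # the deviating step and everything after it
    return []                  # no deviation: nothing to stop

train = ["bioreactor", "capture", "viral_inactivation", "polishing"]
status = {"bioreactor": True, "capture": False,
          "viral_inactivation": True, "polishing": True}
print(halt_downstream(train, status))
# -> ['capture', 'viral_inactivation', 'polishing']
```

The rule only works if the upper layer can actually see every step's status; without the integration layer, each island would keep running blind.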
The benefit of integrated facilities is user flexibility and, in the case of multi-product facilities, the ability to reconfigure the process train depending on the campaign and the molecule being produced. The ability to mix and match equipment is extremely important, as certain vendors may have better solutions than others for a particular process step. This flexibility also has benefits for supply chain integrity and price: if a vendor has a quality or supply issue, or if their prices rise over time, the option to add a new supplier can provide stability or reduce costs.
Reconfiguring the Process Train
For multi-product facilities, it is critical that reconfiguring the process train is as fast and easy as possible. Because two products may have very different titers, variables like equipment selection, the order in which equipment is used, and the number of steps or cycles per piece of equipment should be easily re-configurable.
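One way to make reconfiguration fast is to treat the process train as ordered configuration data rather than hard-wired logic, so that switching products is a data change, not a re-engineering effort. The product names, steps, and cycle counts below are hypothetical:

```python
# Each campaign's train is an ordered list of (step, parameters) pairs.
TRAINS = {
    "product_A": [("capture", {"cycles": 2}),
                  ("polish", {"cycles": 1})],
    "product_B": [("capture", {"cycles": 4}),   # higher titer: more cycles
                  ("viral_inactivation", {}),
                  ("polish", {"cycles": 2})],
}

def configure(product):
    """Return the ordered (step, parameters) list for a campaign."""
    return TRAINS[product]

print([step for step, _ in configure("product_B")])
# -> ['capture', 'viral_inactivation', 'polish']
```

Under this kind of scheme, changing the order of steps, the equipment selected, or the cycle count per step is an edit to validated configuration data, which is what makes rapid product changeover plausible.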
The traditional MES is designed for one product and one process, with the goal of maximizing yield at the lowest possible cost. But multi-product facilities must be able to rapidly respond to product (and thus, process) changes using existing equipment.
The top MES layer is what can allow a facility to transition from Product A to B to C in a validated manner quickly, as it enables a facility to take unit operations in and out of utilization without losing integration.
The current challenges in biomanufacturing will be overcome through innovation in both automation and measurement. The complexity of CP means the challenge goes beyond simply automating each step to understanding the interdependencies and transfers between those steps, which makes monitoring more complicated. Advanced sensors, yet to be developed, will further enable the flexibility of multi-product and CP facilities. But with the ability to measure new parameters comes a need for more sophisticated automation to react to the new information, which may, in turn, require new measurements.
With many companies building new plants based on single-use, the technology has clearly moved into production over the last five years. The limits of bioprocessing have also shifted: companies such as Juno Therapeutics and Novartis hope to extract cells (for example, stem cells or the T cells used to make CAR T-cell therapies) from an individual, process them in a “micro-factory,” and then re-inject them into the patient. Each “micro-factory” is envisaged to integrate all required process steps to produce an injectable, conceptually comparable to a medical device. With hundreds of micro-factories creating personalized products under one roof, facilities will have to ensure robust data management, process control, and sample tracking are in place so that the benefits are maximized safely and end products are delivered to the right patients. Biomanufacturing companies will have to address these hurdles, but the future is filled with opportunities for novel process control, analytic technologies, and, ultimately, more effective therapeutics.