Chapter 32 - Downstream Process Design

Biopharmaceutical Processing. https://doi.org/10.1016/B978-0-08-100623-8.00032-3
© 2018 Elsevier Ltd. All rights reserved.
Downstream Process Design, Scale-Up Principles, and Process Modeling*
Karol M. Łącki*, John Joseph†, Kjell O. Eriksson‡
*Karol Lacki Consulting AB, Höllviken, Sweden; †GE Healthcare Lifesciences, Amersham, United Kingdom; ‡Gozo Biotech Consulting, Gozo, Malta
32.1 INTRODUCTION
Once a therapeutic biological molecule has the potential to become a new product candidate, a complex, multi-faceted 
development program must begin [1]. A big part of this program is the chemistry, manufacturing, and control (CMC) activ-
ity that focuses on development of a reliable manufacturing process, including the necessary quality control system (see 
Chapter 50). As stated in ICH Q8 guidelines [2], “the aim of pharmaceutical development is to design a quality product 
and the manufacturing process to deliver the product in a reproducible manner.” Therefore, the developed process must be 
validated to ensure that the produced product meets the safety requirement for human administration throughout the whole 
product lifecycle.
Process validation is defined as the collection and evaluation of data, from development through to commercial produc-
tion. It establishes scientific evidence that a process is capable of consistently delivering quality product and involves a series 
of activities taking place over the lifecycle of the product and process. These activities can be classified into three stages [3]:
●	 Stage 1—Process Design: The commercial manufacturing process is defined during this stage based on knowledge 
gained through development and scale-up activities.
●	 Stage 2—Process Qualification: During this stage, the process design is evaluated to determine if the process is capable 
of reproducible commercial manufacturing.
●	 Stage 3—Continued Process Verification: Ongoing assurance is gained during routine production that the process re-
mains in a state of control. The guidance [3] describes activities typical of each stage but, in practice, some activities might occur in multiple stages.
In this chapter, we will only focus on Stage 1, the process design. The objective of designing a manufacturing process 
for a biopharmaceutical product is to find the best tools and procedures to consistently and economically make sufficient 
quantity of the target molecule, isolate it from the production system, and then purify the target molecule to the level of 
purity specified for the final product, in other words, the active pharmaceutical ingredient (API). Each API must have well-
understood characteristics, must meet predetermined quality attributes, and must be manufactured according to a robust 
and validated manufacturing process. As such, the process design should ensure that the production process (i) is suitable for routine commercial manufacturing and can consistently deliver a product that meets its quality attributes, and (ii) allows these attributes to be reflected in planned master production and control records [3].
Robustness of a manufacturing process has been of key importance since the early days of the biopharmaceutical indus-
try. Rightfully, patient safety has been the focus for the early developed processes. However, as the industry has matured and 
manufacturing and regulatory experience increased, an additional challenge has emerged, that of cost of manufacturing. The 
pressure is now on to reduce the cost of manufacture without compromising product quality, and hence patient safety.
The pathway to achieve a robust manufacturing process can be rather complex and, in some cases, lengthy. It involves 
several key business and scientific decisions. In this chapter, we will focus on the latter, thus assuming that the business 
case is solid and that the molecule to be produced will be introduced to the market, providing it fulfils all regulatory 
specifications.
*Parts of text, figures, and tables used in this chapter are reproduced from L. Hagel, G. Jagschies and G. Sofer, 3—Process-design concepts, Handbook of 
Process Chromatography, second ed., 2008, Academic Press, Amsterdam, 41–80, with permission.
SECTION VI: Industrial Process Design
This chapter will provide the reader with the basic understanding of process design and introduce a few concepts impor-
tant for consideration during this stage of the product lifecycle. The main focus of the chapter will be on the downstream 
purification part of a manufacturing process. However, because the successful design of a downstream process always 
takes into account the various interdependencies between upstream and downstream parts, some important aspects of the 
upstream process design will also be discussed. Throughout the chapter, an example of a universal roadmap for process and 
control strategy development will be outlined and discussed by focusing on general rules and dedicated tools for develop-
ing, characterizing, and scaling-up of downstream processes. Generally applicable methodologies for sizing of filtration 
and chromatography unit operations will be outlined. Because the process design should also account for the functional-
ity and limitations of commercial manufacturing equipment [3], examples of relevant constraints will be given. The final 
section of this chapter discusses the use of various computational modeling strategies that could facilitate the design and 
scale-up of manufacturing processes.
32.2 PROCESS DESIGN LANDSCAPE
While the aim of every firm is to design a single process that remains robust and consistent from the production of material for toxicology studies through to the manufacture of a licensed product, in reality, changes will need to be introduced as the development
and clinical phases progress. The process evolves in several stages, each addressing a specific need: (1) the production of 
material for pre-clinical studies; (2) the production of material for clinical studies Phase I/II, and (3) large-scale production, 
usually first conducted for Phase III, but ultimately for the commercial market (see Fig. 32.1). During each of these stages, 
one or more well-managed transfers between the development lab and internal or external manufacturing sites and groups 
will be required. Costly changes to the process, and concomitant changes to critical attributes of the product, create a risk of delays or even failure of the project. Therefore, the impact of these types of changes needs to be minimized during the project
progression from toxicology studies to licensed product, through coordinated and well- documented activities related to 
process development and establishment of a process control strategy. Efficient and robust process design can help mitigate 
the risk of change and delay on the overall process.
32.3 THE CORE ELEMENTS OF PROCESS DESIGN
Process design relies on two equally important and interdependent elements: (i) process development, and 
(ii) process control.
32.3.1 Process Development
Process development (PD) activities should lead to the establishment of what is termed the “design space,” which represents 
the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that 
have been demonstrated to provide assurance of quality [4]. Regulatory authorities expect process development to be based 
FIG. 32.1 Evolution of a process to manufacture a biopharmaceutical from process development to commercialization. [Stage 1: manufacturing for toxicology, using the first version of the process developed with the initial product quality specification (preclinical). Stage 2: manufacturing for clinical trials (Phases I–III); the process should have the potential to scale to full-scale manufacturing without much modification. Stage 3: manufacturing at scale, with a need for a cost-efficient process (high-dose indications and other high-dose therapies may not be covered by health insurance systems long term). The internal/external process transfers between stages are high-risk points and a source of potential delays.] Artwork courtesy of GE Healthcare, reproduced with permission.
on sound scientific methods and principles, in combination with risk management tools which are applied throughout the 
development process. The current industry dogma is that quality, generally expressed as identity, concentration, and purity of the product, cannot be tested into the product, but instead should be assured through process understanding and process control. This dogma is commonly referred to as assuring quality by design (QbD).
In general, process development studies should provide the basis for process improvement, process validation, con-
tinuous verification, and any process control requirements [4]. The PD program will identify any critical and key process 
parameters that should be monitored and controlled; that is, those parameters which could affect the product critical quality 
attributes (CQAs) and those that will be key to process performance from the economical perspective (that is, generally 
aiming for an optimization of process yield).
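As a toy illustration of the design-space idea (all parameter names and ranges below are hypothetical, not taken from any real process), a design space can be reduced to proven acceptable ranges per process parameter, with a proposed operating point checked against them:

```python
# A toy "design space" check: proven acceptable ranges per process
# parameter (all names and ranges below are hypothetical).
DESIGN_SPACE = {
    "load_ph": (4.8, 5.4),              # pH of the load material
    "conductivity_mS_cm": (4.0, 8.0),   # load conductivity, mS/cm
    "residence_time_min": (4.0, 10.0),  # column residence time, min
}

def within_design_space(operating_point, design_space=DESIGN_SPACE):
    """Return (ok, violations) for a proposed operating point."""
    violations = {
        name: value
        for name, value in operating_point.items()
        if name in design_space
        and not (design_space[name][0] <= value <= design_space[name][1])
    }
    return (not violations, violations)

ok, bad = within_design_space(
    {"load_ph": 5.1, "conductivity_mS_cm": 9.5, "residence_time_min": 6.0}
)
# conductivity lies outside its proven range, so ok is False
```

A real design space is rarely a simple box of independent ranges; interactions between parameters typically make it an irregular multidimensional region, which is why the sketch above is only a first approximation.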
Depending on the level of experience within the organization, overall process development efforts and timelines can 
vary tremendously. Even within an experienced organization, a new molecule type can generate unexpected delays and 
challenges.
Process development is usually performed first on individual unit operations. These are then connected in a logical se-
quence to form a process capable of delivering a drug substance with the specified quality attributes. The process sequence 
itself may vary between the different product development stages (Fig. 32.1), in other words, between an early process 
development and the processes used to manufacture material for clinical phases I and II, and even phase III and commercial 
manufacturing. These changes might be dictated by large-scale manufacturing constraints, introduction of new technolo-
gies, or even an improved process understanding leading to a more robust process. Furthermore, because the manufacturing 
process is based on sequential operations, any change in one step may have some impact on subsequent steps. The earlier in 
the sequence the change will take place, the stronger might be its impact. To confound this situation further, development of 
the fermentation/cell culture process (upstream) and the recovery and purification process (downstream) is almost always 
carried out by different groups, often without much coordination of these activities. Hence, while the upstream team is al-
ready working on a new process version, the downstream team is still developing a process for material from the previous 
upstream process iteration. Similarly, development of process steps may occur without sufficient awareness of the existing 
operation limits and the capability of large-scale equipment in manufacturing. Therefore, an early awareness of the final 
scale of manufacturing and its potential constraints is essential for successful process design.
The above-mentioned interdependencies have been realized over the years and companies have developed manage-
ment frameworks and associated workflows to assure that potential effects of technology changes or process optimization 
are minimized. To manage this, companies have introduced decision points, or project progression toll gates [5], in their 
upstream process development by considering the impact of any key process variables on product manufacturability. For 
instance, as discussed in Chapter 4, should product titer be improved through optimization of the cell culture process, 
the impact is evaluated on the downstream purification steps of the process. Although a higher titer may lead to gains in 
product quantity, it may also lead to unfavorable changes in the impurity profile of feed material entering the purification 
operations. This could necessitate an increase in the number of purification steps or even change in methods, which may 
ultimately lead to a reduction in overall process yield.
32.3.2 Control Strategy
Process control aims to ensure that process variability is kept within specifically defined boundaries, derived from current product and process understanding, to guarantee quality of the product. At a minimum, the process control strategy should address monitoring and control of critical process parameters. These are parameters whose variability has a direct impact on a physical, chemical, biological, or microbiological property or characteristic of the product. These characteristics are more formally termed critical quality attributes (CQAs). CQAs should be within an appropriate limit, range,
or distribution to ensure the desired product quality [2]. Control strategy can also encompass material analysis, equipment 
and facility monitoring, in-process controls, and the finished product specifications. It should even cover the description of 
associated methods and frequency of monitoring and control [6].
In general, the concept of design space and the appropriate process control should lead to more flexible, and ideally 
cheaper, manufacturing processes over time. This can be achieved through process improvements and real-time quality 
control, eventually leading to a reduction of end-product release testing. In theory, improvements of an approved process, 
if they occur within the design space, should not be considered a change, and would thus not initiate a regulatory post-
approval process change procedure.
Process knowledge and understanding is the basis for establishing an approach to process control for each unit opera-
tion, and the process overall [3]. Strategies for process control can be designed to reduce input variation, to adjust for input variation during manufacturing (and so reduce its impact on the output), or to combine both approaches.
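The minimal mechanics of monitoring a parameter against predefined boundaries can be sketched as follows; the step-yield history and the simple three-sigma limits below are hypothetical illustrations, not a recommended control scheme:

```python
import statistics

def control_limits(historical_values, k=3.0):
    """Shewhart-style limits: mean +/- k sample standard deviations."""
    mean = statistics.mean(historical_values)
    sd = statistics.stdev(historical_values)
    return mean - k * sd, mean + k * sd

def out_of_control(value, limits):
    """True if the observed value falls outside the control limits."""
    low, high = limits
    return not (low <= value <= high)

# Hypothetical step-yield history (%) for one purification step:
history = [92.1, 93.4, 91.8, 92.7, 93.0, 92.5, 91.9, 92.8]
limits = control_limits(history)
```

In practice, control charts also use run rules and distinguish alert from action limits, but the same compare-against-boundaries logic underlies them.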
As with the approach to process development, decisions regarding the type and extent of process controls can be aided 
by early risk assessments, and later enhanced and improved as process experience is gained during performance qualifica-
tion (PQ) and continuous manufacturing.
32.4 A PROCESS DESIGN FRAMEWORK
To keep the time and cost of process development under control, a structured approach, or framework, to process design 
should be followed. The framework will guide different process-development groups and make sure that the experience of 
process-development staff is shared and recorded over time. This approach creates a corporate culture in which individual 
knowledge and experience is turned into corporately owned and easily accessible assets yielding well-understood process 
solutions and ideally forming technology and process platforms.
Although the structured approaches will differ in detail between companies, it could be argued that all should contain
the following three core elements: selection of industrial tools, selection of industrial methods, and process integration 
(Table 32.1).
The first element should address selection of relevant industrial technologies and materials. These will include cell lines 
with full traceability of their origin and history of development, preferred chromatography resins and filters, and raw mate-
rials (chemicals), again with full traceability of their origin. The second element will contain a list of selected methods for 
the intended purpose and subsequent process development and process control activities. The last element should address 
integration of all processing steps, including minimization of all associated activities, such as buffer preparation or column 
packing.
32.4.1 Example of Process Design Workflow
An example of a structured approach to process design incorporating these three elements is shown in Fig. 32.2. The workflow starts with a target molecule, which is chosen based on the medical indication and the molecule’s mechanism of
action. At this point, it is known if the molecule belongs to a class of molecules that have already been used as APIs (e.g., 
monoclonal antibodies), or if the class the molecule belongs to has a potential of becoming a next API platform, or if its 
uniqueness suggests that the process design project will be a one-time development effort. In the first case, the process 
design project should be based on the already existing knowledge and experience about processes designed for similar mol-
ecules to take a full advantage of the similarities of these molecules. Such processes are referred to as platform processes, 
and consist of standardized technologies, procedures, and methods. If the organization has already worked with a similar 
molecule, the process developer should be able to find instructions and guidance in the use of the company platform con-
cepts referenced in the company development plan.
If the developed molecule is the first in its class and there are indications that this class has the potential of becoming the next biopharmaceutical platform, it could be advisable, already at this point, to consider desired features of a platform and to invest in high-throughput process development to study as many process parameters as possible for a solid understanding of
TABLE 32.1 Three Core Elements of Process Design Framework

Element: Selection of industrial tools
  Examples: cell lines; raw materials; consumables
  Comment: documented evidence, internal and vendor audits, manufacturing experience

Element: Selection of technologies and methods
  Examples: analytical methods; cell separation methods; purification methods; viral clearance
  Comment: product and impurity profiles, risk analysis, heuristic designs, experimental performance evaluation

Element: Integration
  Examples: use of one buffer system for multiple steps; column packing; use of disposables
  Comment: reduction of time-consuming associated activities; elimination of non-productive steps

Reprinted from L. Hagel, G. Jagschies, G. Sofer, 3—Process-design concepts, in Handbook of Process Chromatography (second ed.), Academic Press, Amsterdam, 2008, pp. 41–80, with permission.
the process. Specialized analytical instruments, and even novel upstream and downstream technologies, may be part of such an extended development plan. In the long run, when new processes for the next candidate molecules from this molecule
class are developed, this would result in significant time and resource savings.
Examples of respective focus areas within those parts that need to be addressed during the process design are given in 
Fig. 32.2 and discussed in the paragraphs below.
Regardless of whether the process might become part of a platform or not, it needs to be thoroughly developed, charac-
terized, and, finally, described so the relations between process parameters and CQAs of the product are well-understood. 
This is accomplished by following a generic process-design guide that is based on a commanding set of rules that reduce 
process failure, risk for product quality problems, and time to reach an acceptable process. The process design will need 
to focus on the upstream and downstream processes, and on the development of an analytical package that will support 
development of these processes.
Each process design starts with a selection of an expression system that will be used to produce the drug candidate. The 
expression system will, to some extent, define parts of the impurities profile the purification process will need to deal with. 
Therefore, from a process design perspective, the best place to start is with careful characterization of the target product and 
its impurity profile, followed by a risk assessment that leads to prioritization of the purification process tasks.
Defining the different steps in the most appropriate sequence is the next activity. Already at this stage, initial process 
integration should be discussed. Process integration facilitates the transfer of intermediate product between steps by op-
timizing the steps’ links, and should reduce the amount of non-productive activities required for each process step (e.g., 
preparation of too many buffers, unnecessary hold times, the need for buffer exchange between steps, extensive column 
packing, etc.).
Product stability under a range of typical process conditions needs to be in focus as early as possible into the process 
design activities. Understanding of the product stability will help to avoid process-related issues that impact the product’s 
biological activity. The analytical methods that are suitable for assessing stability and biological activity need to be avail-
able early in process design. As a matter of fact, they should be a part of the analytical package that is used for product and 
impurity profile characterization. Furthermore, the process development team needs to understand the capability of each 
assay (be familiar with the variability, the limits of quantification, LQ, and limits of detection, LD, for each of the methods) 
to avoid making false conclusions (e.g., about satisfactory levels of impurity clearance). At the same time, the analytical 
methods development team needs information about acceptable levels of removal of each key impurity category so that 
assays with sufficient sensitivity and specificity can be developed. More details on this topic can be found in Chapter 47. 
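Where the limits of detection and quantification are estimated from calibration data, the common ICH Q2-style relations LOD = 3.3σ/S and LOQ = 10σ/S (σ being the standard deviation of the blank or low-level response, S the calibration-curve slope) can be sketched as below; the blank readings and slope are invented purely for illustration:

```python
import statistics

def lod_loq(blank_responses, slope):
    """ICH Q2-style limits from response variability and calibration slope:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
    sigma = statistics.stdev(blank_responses)  # sample standard deviation
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical blank readings (signal units) and calibration slope
# (signal units per ng/mL of impurity) -- illustration only:
lod, loq = lod_loq([0.011, 0.013, 0.010, 0.012, 0.014, 0.012], slope=0.05)
# lod and loq are then expressed in ng/mL
```

Knowing these numbers is what prevents a "below detection limit" result from being mistaken for proof of satisfactory impurity clearance.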
FIG. 32.2 An example of a generic process design workflow. Adapted from L. Hagel, G. Jagschies, G. Sofer, 3—Process-design concepts, in Handbook of Process Chromatography (second ed.), Academic Press, Amsterdam, 2008, pp. 41–80. [The workflow starts from the target molecule (medical indication, mechanism of action, product quality attributes) and branches on whether a platform exists or is feasible, leading to platform or non-platform process development; existing platforms use validated methods for CQA determination. Analytics focus areas: product and impurity profile characterization; detection and quantification limits; throughput; reporting system; prioritization of in-process measurements vs. QC lab analysis. Upstream focus areas: cell line development; cell bank preparation, validation, and maintenance; media system; product and impurity profile characterization; clarification and recovery; model systems; scale-up principles. Downstream focus areas: product and impurity profile characterization; buffer system; model systems for unit operations; purification step sequence (capture, removal/intermediate, polishing); virus clearance methods (if necessary); risk analysis and task prioritization; purification step integration; on-/in-line process monitoring system; scale-up principles. Platform TECHNOLOGY: a common or standard method, equipment, procedure, or work practice that may be applied across multiple products under development or manufacture. Platform PROCESS: a common or standard set of technology platforms combined in a logical sequence that may be applied across multiple products under development or manufacture.]
A development timeframe should be developed by the analytics and validation teams. Those teams are also best suited to 
select assays that can support ongoing development, process monitoring, and process validation.
Furthermore, while the development of the different parts of the process is carried out by different scientists and/or sepa-
rate groups or sites, understanding limitations associated with different scale of operations is very important. What may be 
practical in the laboratory can lead to equipment design needs that are very tricky and costly to realize on the manufacturing floor. For example, a complex product peak fractionation scheme easily realized with a laboratory fraction collector would need to be translated into a cascade of valves whose dead volume is large enough to risk eliminating some of the resolution achieved by the purification step. Establishing communication channels through proper reporting procedures will reduce
potential surprises. For instance, communicating with manufacturing teams can eliminate issues related to constraints in 
the manufacturing facility. Examples of such constraints could be pumping capabilities (flow rates required for operating 
within the desired time frame), a different sensitivity of process monitoring/measurement equipment used in development 
labs and in manufacturing, or the type of wetted material and column distribution system used at the two scales.
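As one concrete example of such a scale constraint, the classic chromatography scale-up rule (constant bed height and constant linear velocity, so that volumetric flow scales with column cross-sectional area) immediately shows the pumping capacity a facility must support; the column dimensions below are hypothetical:

```python
def scale_up_flow(lab_diameter_cm, lab_flow_ml_min, prod_diameter_cm):
    """Chromatography scale-up at constant bed height and constant linear
    velocity: volumetric flow scales with column cross-sectional area,
    i.e., with the square of the diameter ratio."""
    return lab_flow_ml_min * (prod_diameter_cm / lab_diameter_cm) ** 2

# Hypothetical columns: 1.0 cm i.d. lab column run at 1.0 mL/min,
# scaled to a 60 cm i.d. production column:
prod_flow_ml_min = scale_up_flow(1.0, 1.0, 60.0)  # 3600.0 mL/min (3.6 L/min)
```

If the manufacturing skid cannot deliver the computed flow rate, either the bed height, the cycle time, or the column diameter has to be renegotiated, which is exactly the kind of constraint best discovered early.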
Finally, a word about reporting in a process-design project: it is essential that activities and studies resulting in process 
understanding be documented at every stage, and that documentation should reflect the basis for decisions made about the 
process. A good reporting practice includes both describing what has been done and documenting what has been left out. 
The development records make the selection of methods and all major decisions on options clearly understandable and all 
open issues easily identifiable. Modern electronic document storage and retrieval systems are recommended to enable ap-
propriate documentation management with the lowest risk for error and most efficient use of time. Use of template reports 
for the same or similar types of studies is recommended, as it allows for easier comparison of experiments performed at 
different time points, both from the data as well as the troubleshooting perspective.
32.4.2 Platform Processes
In some instances, the process design activity can be significantly simplified if experience with process development and 
with manufacturing of a similar product exists within the organization. In those cases, one can utilize so-called platform 
technology and platform manufacturing concepts.
The platform technology/process is “a common or standard method, equipment, procedure, or work practice that may 
be applied to the research, development, or manufacture of different products sharing a set of common properties” [7]. 
Platform manufacturing is defined as “implementation of standard technologies, systems, and work practices within manu-
facturing facilities, and their use for the manufacture of different products” [7]; thus, it is the approach of developing a production strategy for a new drug starting from manufacturing processes similar to those used by the same manufacturer to make other drugs of the same type.
In principle, any platform technology, regardless of the industry/area that it is applied in, must be built on a founda-
tion of knowledge and experience. The same applies to platform approaches employed in process design for a biologic. 
Experience in processing the same class of proteins such as IgG antibodies provides the basis for successful repeated use of 
platform technologies, such as Protein A chromatography and the subsequent low-pH virus inactivation step, as well as for 
applying the same or a very similar design of the cell culture sequence and the methods to run the different culture stages.
The evolution of platform‐based approaches in biotechnology product development and manufacture is presented in 
Fig. 32.3. In the 1980s, platform approaches were not employed, but as the industry matured, the concept of a platform ap-
proach to PD and manufacture started to emerge in the late 1990s, within companies’ development programs, and started 
to be discussed in public in the early 2000s [8]. Currently, platforms are widespread within product development organiza-
tions, and many production facilities [7].
The advantages of using more standardization with the development process are multi-fold. One of the more attractive 
benefits is that the time required for process design and development can be more accurately estimated [8]. Because plat-
form processes are composed of several defined unit operations and methods, they lend themselves to simplified process 
design. Not only the order and type of steps can be templated, but also many process conditions (e.g., buffer, flow rate) can 
be fixed. The use of platform technologies for cell culture and cell clarification provides a greater likelihood of success for 
the downstream platform because the process impurities, as well as most host cell impurities, will be similar. Consequently, 
analytical approaches will also be the same, or can be leveraged, and as such, require reduced development time. A reduc-
tion in overall development time facilitates a reduction in the time to toxicology and first-in-human (FIH) studies. Process 
standardization also delivers benefits at manufacturing scale, as the modern commercial manufacturing facilities use stan-
dard/platform technologies and work practices to improve efficiencies and execution timelines.
Platform-process-derived data at many drug developers/manufacturers have been accumulated in extensive databases. The biopharmaceutical industry, including the regulatory bodies/health authorities, is now able to exploit these data to improve biopharmaceutical drug development, manufacture, and regulation. It is also expected that these platform-derived data will allow the industry, in cooperation with regulatory bodies/health authorities, to improve regulatory procedures based on more solid, in-depth information.
Although it is well understood that there will always be some product‐specific nuances to deal with in every development 
program, the available platform data could be presented for review to support clinical trial and marketing authorization applications, instead of presenting the same type of data anew for each molecule developed on the platform.
These platform data packages might include those describing viral clearance capabilities of process steps, clearance of pro-
cess related impurities, cleaning regimes for process equipment, etc. [7]. Other potential advantages are the reduction of the 
number of suppliers for raw materials and established waste disposal routines for the selected consumables [5].
One of the most widely used platform approaches is that for the development of monoclonal antibodies. This is described in more detail within Chapters 39 and 51; readers are also referred to a recent review of platform monoclonal antibody processes from several biopharma companies and of future trends in platform processing for mAbs [9].
32.5 DOWNSTREAM PROCESS DESIGN METHODOLOGIES AND TOOLS
Details of process design for the upstream and the recovery parts of a representative manufacturing process are provided 
within Chapter 31, so here we are only briefly summarizing the most important facts about these parts.
Typically, the upstream process is started from a working cell bank (WCB), which has been prepared from a master cell bank 
(MCB). A seed train with several stages of increasing culture volume and cell mass leads to the final-scale fermenter, or bioreac-
tor production phase. Fermentation (microbial cells) and cell culture (mammalian and insect cells) require carefully controlled 
growth conditions that are supported by the culture medium, which contains nutrients and chemicals in dilute aqueous solution.
Whether mammalian or microbial cells are used, recovery of the product involves product isolation from the cell mass. 
Cell debris or whole cell removal is achieved with techniques such as centrifugation or filtration. A similar basic process 
layout is also applied to transgenic and insect cell production systems. More detailed descriptions of the recovery part of 
downstream processing are provided in Section III of this book (Chapters 9 and 15).
From the downstream process development perspective, the type of expression system has no impact on the development 
methodology itself. The type of process-related impurities, and most likely even the product-related impurities, will 
vary with the expression system, but the development workflow and the goals for downstream process development will 
be the same (i.e., establishing a design space that will guarantee manufacturing of quality product with its specific char-
acteristic and purity profile). Therefore, starting downstream process development with a thorough characterization of the 
product, its impurity profile, and listing potential contaminants should be the focus from the beginning of the project.
(Figure 32.3 is a timeline with four eras, 1980, 1990, 2000, and 2010+, tracing downstream (purification) and general industry developments: from the 1980s, when any technology that worked was considered good enough (aqueous two-phase extraction, membrane filtration, precipitation, nonspecific modes of preparative chromatography) in an innovation- and research-driven environment with a limited number of products, little focus on manufacturability, and no need for platform-like processes; through the approval of biotechnology products, new dedicated facilities, the establishment of chromatography as the workhorse for purification, the introduction of commercial Protein A resins, and the first platform approaches within company development programs; to matured platforms, the HTPD concept, multimodal chromatography, the QbD paradigm, alkali-stable Protein A, the establishment of CMOs, and mergers creating larger internal manufacturing networks; and, in 2010+, single-use technologies, improved manufacturing flexibility, multiproduct facilities, prepacked columns, single-pass filtration, two-step purification, continuous technologies, the modularity concept, approval of biosimilars, and the quest for manufacturing platforms for new molecular entities such as Fab's and nanobodies.)
FIG. 32.3 The evolution of general platform-based approaches in biotechnology product development and manufacture. Adapted from E. Moran, et al., 
Platform Manufacturing of Biopharmaceuticals: Putting Accumulated Data and Experience to Work. EBE Publications, 2013, pp. 1–21.
644 SECTION | VI Industrial Process Design
32.5.1 Product in-Process Stability and Impurity Profiles
With the expression system chosen, and with the information about the target molecule acquired, downstream process de-
sign can begin. Characteristic features of the target molecule that are relevant to process designers include: (i) the physico-
chemical and biological properties, (ii) its stability under the chemical and physical conditions it might be in contact 
with throughout the process, and (iii) the type of impurities that will need to be removed before the target molecule can 
be formulated into the drug product. For a successful process design all these features need to be characterized, and their 
dependency on the process conditions should be investigated, understood, and described. Methods for process and product 
characterization are discussed in Chapter 47.
Stability
In general, keeping the molecule in its natural environment helps preserve its activity. It should be noted, though, 
that this environment may contain harmful enzymes or lack stabilizing conditions that the production cell did provide. 
During biosynthesis of a therapeutic protein and its isolation from other biological material, the environmental conditions 
the target molecule is exposed to will most certainly need to be changed, sometimes toward harsh ones. Various changes to 
process conditions will affect the protein properties to a different extent, but the goal of the process development scientist 
is to characterize these changes and to make sure they are minimized or, preferably, prevented.
Factors affecting the stability of protein-based target molecules include the presence of proteases, protein concentration, 
pH, temperature, co-solvents, salts (concentration and type), co-factors, and redox potentials [5]. Knowledge of the 
conditions, such as pH, salt concentrations, additives, etc., that preserve the product will be crucial to a cost-effective 
manufacturing strategy. Some of this information will be obtained during preliminary testing of the starting material, and 
some during process development. Furthermore, because molecule stability is almost always tied to some type of chemical 
reaction, the effect of the time of exposure to a given condition needs to be considered, and the rates of target molecule 
degradation need to be determined.
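As a sketch of how such exposure-time effects can be quantified, the snippet below assumes, purely hypothetically, that degradation of the target molecule under a given hold condition follows first-order kinetics with rate constant k; the value of k and the acceptance limit are illustrative and not taken from the text.

```python
import math

def remaining_fraction(k_per_h: float, hold_time_h: float) -> float:
    """First-order degradation: intact fraction C(t)/C0 = exp(-k * t)."""
    return math.exp(-k_per_h * hold_time_h)

def max_hold_time_h(k_per_h: float, min_intact_fraction: float) -> float:
    """Longest hold time that keeps the intact fraction above a given limit."""
    return -math.log(min_intact_fraction) / k_per_h

# Hypothetical rate constant for a low-pH hold condition
k = 0.002  # 1/h
fraction_after_24h = remaining_fraction(k, 24.0)
longest_hold = max_hold_time_h(k, 0.99)  # hold giving <= 1% degradation
```

With these illustrative numbers, roughly 95% of the product would remain intact after a 24 h hold, and a hold of about 5 h would keep degradation below 1%; fitting k from time-course data at each candidate condition is what the rate determination above amounts to.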
Finally, stability of the target molecule may also strongly affect its propensity toward certain types of interactions that can be 
exploited in a purification process. For instance, chromatographic separations are based upon the controlled interaction between 
the solute and a sorbent (resin). The critical properties, from an interaction perspective, of the protein may be exposed on the 
surface of the molecule or hidden in the interior parts and only available for interaction after modification (e.g., denaturation). 
This exposure of interior parts can be used to achieve a high degree of separation (e.g., as in reversed-phase chromatography), 
but may also result in irreversible loss of material and/or activity. Understanding the limits of the stability window, both 
for the target molecule and for the environment it is kept in, is thus essential knowledge to be gained during process 
development.
Impurities and Contaminants
The objective of purification is either to completely remove any quality-deteriorating components from the drug substance, 
or to reduce their content below acceptable levels from the patient safety perspective. As discussed in Chapter 4, in gen-
eral, one distinguishes between product-related and process-related impurities, which differ both in origin and strategy for 
prevention or removal.
Product-related impurities are molecular variants arising during manufacture and/or storage, which do not have properties 
comparable to those of the desired product with respect to activity, efficacy, and safety [10]. They may be present as a result 
of an inherent degree of structural heterogeneity occurring in proteins due to the biosynthetic processes used by living organ-
isms to produce them [10], or can be formed during processing as a consequence of enzymatic activity or exposure to certain 
process conditions such as high or low pH, extended storage, high shear conditions, etc. Any of these modifications will lead to 
different forms of the product, which may or may not be acceptable from the patient safety or mechanism of action perspective. 
When variants of the desired product are acceptable, they are considered product-related substances, and not impurities [10]. 
Consequently, purity demands will need to be determined based on the efficacy, potency, and safety profiles of the various forms.
Removal of product-related impurities is among the most challenging tasks for purification technology, as it usually 
requires resolution of substances with a very similar chemical composition and molecular structure. This similarity also 
creates a challenge for analytical departments, not only because the applicable methods need to be used to help develop and 
later monitor the manufacturingprocess, but also because the exact composition of the final product could be considered as 
an important company asset that could be used in legal disputes when a biosimilar version for the drug is developed. More 
information about different properties of biological products and analytical methods used in bioprocessing can be found in 
Chapters 3 and 47, respectively.
Process-related impurities can be classified into three groups (Table 32.2) where two of them are related to the upstream 
operation and one to the downstream operation. The upstream-derived process impurities are dependent on the selected 
expression system (cell-substrate derived) and process conditions (cell-culture derived). A list of typical cell-substrate 
derived process impurities can be found in Chapter 4 (Table 4.5). These impurities will also depend on the methods required 
to isolate the product. The lowest initial impurity levels are generally achieved with secretion systems grown in chemically 
defined, protein-free culture media and when the product is secreted into the cell culture medium. The highest levels are 
present when production cells need to be disrupted in order to release an intracellular product. The resulting cellular debris 
formed contaminates the target product during the initial recovery steps. From the process design perspective, it should be 
TABLE 32.2 Examples of Typical Process-Related Impurities

Description: Molecules and compounds that are derived from the manufacturing process, classified into three general categories [10]: (i) cell substrate-derived (e.g., host cell proteins, host cell DNA); (ii) cell culture-derived (e.g., inducers, antibiotics, or media components, and extractables and leachables); and (iii) downstream processing-derived (e.g., enzymes, chemical and biochemical processing reagents, inorganic salts, solvents, ligands, and other leachables and extractables).

Examples, with brief explanations:

Cell-culture nutrients, chemicals: Non-consumed components of the cell culture media and byproducts of the cell metabolic pathways.

Host cell proteins: Any production source has its own native proteins, which will need to be removed to obtain the target product with the desired purity. In cellular production systems one refers to them as host cell proteins (HCP). Typically, with mammalian cells such as CHO, HCP are released when the cells are damaged during recovery of the product from the bioreactor, or because of natural cell death toward the end of cell culture. With E. coli cells, the isolation of inclusion bodies and extensive washing steps allow removal of most of the bacterial HCP despite their significant release upon cell disruption (see Chapter 9).

Proteolytic enzymes, other enzymatic activity: A particularly important category of HCP characterized by enzymatic activity causing degradation and/or modification of the desired product and other HCP. Examples: proteases and glycosidases.

Endotoxins: In E. coli and other Gram-negative bacteria, endotoxins are present in the cell wall and are released during cell disruption, which is one reason why there has been an effort to design cells that secrete the product rather than sequester it in inclusion bodies. For mammalian cultures, endotoxin should not be an issue, or at least it is one that is controlled by compliance with good manufacturing practices. In cell culture, endotoxins are considered a contaminant, not an impurity derived from the host organism. Endotoxins can be introduced into purification processes by contaminated water, buffers, additives, and resins.

Cellular DNA, other nucleic acids: Cellular nucleic acids, such as genomic DNA, are released in the same way that HCP are released. The amount of nucleic acid in the starting material is also dependent on the degree of cell death during late cell culture or the disruption of producer cells during isolation of the product.

Virus: Production sources of mammalian origin, such as cells, tissue, human plasma, and transgenic milk, can be contaminated by exogenous virus. Mammalian cells often carry endogenous virus; e.g., CHO cells inherently contain retroviral particles. There is also a risk that transmissible spongiform encephalopathy (TSE) agents can contaminate mammalian production sources as well as in-process raw materials that are of mammalian origin or manufactured with materials of mammalian origin. From a potential virus contamination perspective, use of a bacterial expression system could remove the contamination risk, but those systems suffer from other limitations, e.g., higher HCP levels and endotoxins.

Cell debris, lipids: The lowest initial impurity levels are generally achieved with secretion systems grown in chemically defined, protein-free culture media and when the product is secreted into the cell culture medium. The highest levels are present when production cells need to be disrupted to release the intracellular product and cellular debris contaminates the target product during the initial recovery steps.

Antifoams, antibiotics: Antifoams are used to reduce foaming during the cell culture process and are usually based on surface-active agents (surfactants) that have a deteriorating effect on columns and filters. Antibiotics are considered an API and should not be present in another drug formulation.

Leakage: Leached ligands present in the elution pools. Example: Protein A.

Extractables, e.g., from plastic surfaces: With the widespread use of single-use components (e.g., bags, flow paths, connectors), potentially harmful leachable or extractable impurities that could be released from the plastic material into the product stream under harsh conditions also fall into the process impurities.
remembered that the amount and type of process-related impurities present in the starting material for purification 
operations can be minimized through the selection of processing methods or process controls. Downstream-derived impurities 
include column leachables, buffer components, surfactants, and flocculation and precipitation agents. These days, they also 
include compounds introduced via the use of single-use components (e.g., bags, flow paths, connectors): potentially harmful 
leachable or extractable impurities that could be released from the plastic material into the product stream under harsh 
conditions.1 In general, the level and type of leachables and extractables will depend on the process conditions, including 
exposure time, temperature, pH, and the type of buffer salts used. Attempts to build consensus on the standardization of 
procedures and methods for single-use system testing have resulted in the BioPhorum Operations Group's (BPOG) extractables 
protocol [11]. The protocol reflects the testing needs discussed and agreed upon by the members of BPOG and can be followed 
by non-members as well.
As discussed in Chapter 4, high levels of process-related impurities and cell debris require greater effort during recovery 
and purification, as they might have detrimental effects on the recovery and/or purification steps, leading to a higher 
cost of the downstream process.
32.5.2 Basics of Downstream Process Development
With the product characteristics and impurity profile in hand, process development can focus on identifying potential 
technologies to be used in the process being designed. As discussed earlier, if the process is to be based on a platform 
process, the choice of these technologies is already made. If, on the other hand, a process needs to be developed from 
scratch, then a list of the most promising technologies can be compiled, and a few process sequence alternatives proposed, 
based on the product characteristics and types of impurities, process data available from product development activities, 
heuristic information, and literature reviews. These alternatives should be ranked and/or quickly tested to determine those 
on which focus should be placed. Also at this stage, it is advisable to consider the functionality and limitations of commercial 
manufacturing equipment. Furthermore, before starting any extensive process development work, care must be taken to ensure 
that PD can be performed using established laboratory and pilot-scale scale-down models, and that the scientific principles 
employed will make the results obtained and conclusions drawn from PD studies representative of commercial manufacturing. 
The use of high-throughput process development, high-end modern analytical techniques, and often modeling, both statistical 
and mechanistic, should also be discussed at this stage, and a decision made about which of these will be applied to which 
question or challenge throughout the PD activities.
Although in the past, experimental work was addressed in a random fashion, today the approach to process development 
is based on systematic risk management, structured experimental planning, and execution. The focus of the new approach, 
sometimes called the enhanced approach, as compared with the traditional approach that focused on specifying set points 
and operating ranges for process parameters, is to use risk management and scientific knowledge to identify and understand 
the material attributes and process parameters that influence the critical quality attributes of a product [12].
There are a few methods to assure that the right level of scientific knowledge is attained. These include (i) evaluation of 
the effect of process variables one at a time, (ii) applying mathematical/mechanistic modeling to support experimentation 
effort to formulate the model that is later used to describe the process, (iii) use of the concept of the design of experiments, 
DoE. Of course, any combination of these methods can be applied in practice, depending on the level of complexity and 
initial knowledge about the challenge at hand.
Evaluation of process variables one at a time is not recommended as a first choice, unless it is known from previous 
studies that the effect of one variable on the process outcome can be decoupled from the effects of the other variables. In 
the case of protein purification, this is rarely so. A few simple examples include the effect of load concentration in a 
capture step, or of elution pH (more examples can be found in Table 32.5).
Mechanistic modeling provides a very powerful tool for guiding process design. With a formulated, and validated, 
model, an in silico assessment of the impact of process parameters can be performed, thus saving time and resources. 
However, what needs to be remembered is that a mechanistic model does not replace experimental effort; after all, in the 
majority of cases, initial experimental data are needed to formulate the model. Instead, the model is used as a guiding tool 
for designing more relevant experiments and testing effects of various process variables (including those difficult to test 
experimentally, e.g., lot-to-lot variability of chromatography resins and filtration membranes). The validated model can, 
and should, be used for optimization to find the best compromise between the optimum conditions and process robustness. 
Model validation is very important, as the use of an incorrect physical model, or even wrong model assumptions, will lead 
to erroneous results. However, it could be argued, at least from the scientific perspective, that the underlying principles 
for almost all unit operations employed in today's downstream processing are well understood and described in the scientific 
literature. This fact, together with the unprecedented pace of improvement in computational resources, makes mechanistic 
modeling the future method of choice for process development and process control.
1. The same impurity concern regarding the presence of leachable and extractable compounds applies to the use of single-use bioreactors (SUBs) commonly utilized at different stages of the upstream operation.
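As a minimal illustration of the kind of mechanistic calculation involved, the sketch below solves the mass balance for batch binding of a protein to a resin described by a Langmuir isotherm. The isotherm parameters and volumes are hypothetical, and a real chromatography model would of course also describe transport and binding kinetics rather than equilibrium alone.

```python
import math

def langmuir_batch_equilibrium(c0, v_liquid, v_resin, q_max, k_d):
    """Solve the batch-binding mass balance c0*Vl = c*Vl + q(c)*Vr
    for a Langmuir isotherm q(c) = q_max*c / (k_d + c).

    Substituting the isotherm into the mass balance gives a quadratic in c:
        Vl*c^2 + (Vl*k_d + Vr*q_max - Vl*c0)*c - Vl*k_d*c0 = 0
    Returns (free concentration c, bound capacity q)."""
    a = v_liquid
    b = v_liquid * k_d + v_resin * q_max - v_liquid * c0
    c = -v_liquid * k_d * c0
    c_free = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root
    q_bound = q_max * c_free / (k_d + c_free)
    return c_free, q_bound

# Hypothetical values: 1 g/L feed, 1 L liquid, 10 mL resin, q_max 50 g/L, K_d 0.01 g/L
c_free, q_bound = langmuir_batch_equilibrium(1.0, 1.0, 0.01, 50.0, 0.01)
```

Once such a model is fitted to a small set of binding experiments, the effect of, for example, load concentration or resin lot capacity can be explored in silico before committing to further wet-lab runs.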
While mechanistic modeling is gaining popularity, the use of DoE remains the industry standard. 
DoE, if applied correctly, will help, from a statistical perspective, in quantifying relationships between process parameters 
and process outcomes, including the product CQAs. The correctness of this approach is built on the foundation of process 
know-how and all empirical knowledge available. Results from DoE studies are evaluated using statistical analysis, and 
are expressed in terms of the so-called statistical model. The statistical model can be used to quantify the impact of the 
important process variables (chosen from the list of the variables tested experimentally) on the process output, and define 
the robustness of the process through analysis of normal process variability [13].
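The idea behind a DoE-derived statistical model can be sketched with a two-level full factorial design and simple main-effect estimates; the factors, coded levels, and yield responses below are purely illustrative, and a real study would also estimate interactions and test significance.

```python
import itertools

def two_level_design(n_factors):
    """Full 2^n factorial design in coded units (-1/+1)."""
    return list(itertools.product((-1, 1), repeat=n_factors))

def main_effects(design, responses):
    """Main effect per factor: mean(y at +1) minus mean(y at -1)."""
    effects = []
    for j in range(len(design[0])):
        hi = [y for x, y in zip(design, responses) if x[j] == +1]
        lo = [y for x, y in zip(design, responses) if x[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical 2^2 study: factors (pH, conductivity), response = step yield (%)
design = two_level_design(2)            # (-1,-1), (-1,+1), (+1,-1), (+1,+1)
responses = [78.0, 74.0, 90.0, 86.0]
effects = main_effects(design, responses)
```

In this toy data the first factor dominates the response, which is exactly the kind of quantified parameter-to-outcome relationship the statistical model is meant to capture.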
32.5.3 Introduction to Risk Analysis, Design Space Concept, and Process Control
Risk Analysis
Regulatory authorities endorse a “science and risk-based approach” for designing and validating biomanufacturing pro-
cesses. They recommend that process design, including development and control strategies, is facilitated by early risk as-
sessments that are subsequently enhanced and improved as process experience is gained. Process development activities 
play a pivotal role in this process as they provide answers on what risks may be present, and how they can be eliminated. 
Elimination of these risks should be based on thorough characterization of different process steps to establish relations 
between process parameters/variables and the process outcome from the product quality perspective. These relations will 
allow a multidimensional (several variables) operating space, the so-called design space, to be defined, either for the whole 
process or for separate unit operations.
Different types of risks associated with bioprocesses have been discussed in Chapter 4 (Table 4.3). Some of them can 
be used in the early risk assessment process, which should help in specifying a preliminary list of process parameters, and 
their interdependencies, to be considered in the process development.
Design Space
The concept of the design space is based on identification of process parameters and their subsequent classification into: 
(1) those affecting the product CQAs, the so-called critical process parameters (CPPs); (2) those that need to be controlled 
to maintain process performance but do not influence CQAs, the so-called key process parameters (KPPs); and (3) the non-key 
process parameters, also referred to as general process parameters (GPPs), that have no meaningful effect on product 
quality or process performance (Table 32.3).
The parameter classification process starts with identification of potential process variables that might need to be tested 
(e.g., process parameters such as flow rate, pH, and temperature). Although the initial list of potential parameters can be 
quite extensive, typically it can be simplified by performing a preliminary ranking of parameters applying risk assessment 
tools based on prior knowledge and existing experimental data. In principle, the list will be refined (both the parameters and 
TABLE 32.3 Classification of Process Parameters

Quality-related, narrow range(a): Critical. An adjustable parameter (variable) of the process that should be maintained within a narrow range so as not to affect critical product quality attributes.

Quality-related, wide range(b): Non-key (General). An adjustable parameter of the process that has been demonstrated to be well controlled within a wide range, although at extremes it could have an impact on quality.

Process-related, narrow range(a): Key. An adjustable parameter of the process that, when maintained within a narrow range, ensures operational reliability.

(a) And/or difficult to control. (b) And/or easy to control.
Adapted from PDA, Technical Report No. 42: Process Validation and Protein Manufacturing, PDA J. Pharm. Sci. Technol. 59(S-4) (2005).
their ranking) throughout the whole process development program by determining the significance of individual process 
parameters and their potential interactions. At the end of PD the list will yield a ranking of the significant process variables 
that were identified based on studies performed to achieve a higher level of process understanding, and establishing the 
design space. The type of studies that can be considered to achieve those goals include a combination of design of experi-
ments, mathematical models, or studies that lead to mechanistic understanding [2].
It is recommended by regulatory authorities that the rationale for inclusion of a given parameter in the design space 
is presented. Also, it can be expected that, in some cases, arguments behind exclusion of parameters should be provided. 
Although the inclusion of the parameters should be self-evident from the results describing the design space, the exclusion 
rationale might be more difficult to explain as the rationale can be based on multiple factors.
An example of a risk-assessment approach and subsequent process characterization workflow is presented as follows. It 
is based on one of the approaches described in CMC’s Biotech Working Group A-mAb case study [14]. It uses risk ranking 
to classify process variables based on their potential impact on CQAs, process performance, and possible interactions with 
other parameters. In this risk assessment method, each parameter is assigned two rankings: one addressing the parameter's 
potential impact on CQAs (main effect) and the other describing the parameter’s potential interactions with other param-
eters. The rankings for impact on the main effects have higher weights assigned than the rankings for impact on lower 
criticality quality and process attributes. In the case of lack of data and/or a weak rationale for the assessment, the parameter 
should always be ranked at the highest level. It should be remembered that the impact assessment is always based on the 
variation of the parameters within the anticipated design space.
The highest main-effect and interaction rankings are then multiplied to give an overall "severity score" (SS). The severity 
score is used to classify parameters and to determine which type of study (i.e., DoE, univariate, or none) should be 
performed during process characterization (Table 32.4).
Severity score calculations help to assign process parameters to three categories: (i) primary parameters warranting 
multivariate evaluation (e.g., DoE studies); (ii) secondary parameters whose evaluation can be based on univariate studies, 
either if a proper justification can be provided or if the severity score is low; and (iii) parameters that do not require 
new studies, whose ranges can be established based on prior knowledge or modular claims. Following the logic used in the 
impact assessment step, if no data or rationale is available to justify either no study or univariate studies, the parameter 
will be considered part of the multivariate studies.
An example of risk ranking for the Protein A chromatography step is shown in Table 32.5. In this case, all variables 
with a severity score equal to or greater than 8 would be identified for multivariate studies: load flow rate, protein load, 
equilibration/wash flow rate, elution buffer molarity, and end-pool collection point; all parameters with a score of 4 would 
warrant univariate studies; and the remaining parameters would be classified as having no impact (within the ranges included 
in the expected design space). The results obtained in those studies, and possibly in follow-up experiments, would lead to 
establishment of the parameter classifications (CPP, KPP, etc.), their ranges, and corresponding controls.
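The ranking logic of this worked example can be expressed directly in code. The rows below reproduce a few parameters from the Protein A example, and the numeric thresholds (8 for multivariate, 4 for univariate) follow the classification described in the text; the scheme itself is a sketch, not a validated risk assessment tool.

```python
def severity(main_scores, interaction_scores):
    """Severity = highest main-effect score (hM) x highest interaction score (hI)."""
    return max(main_scores) * max(interaction_scores)

def study_type(severity_score):
    """Map a severity score to the characterization study it warrants."""
    if severity_score >= 8:
        return "multivariate"
    if severity_score >= 4:
        return "univariate"
    return "none"

# (parameter, main-effect scores (CQA, PA), interaction scores (CQA, PA))
rows = [
    ("protein load (g/L bed)",   (4, 4), (4, 1)),
    ("load flow rate (CV/h)",    (4, 2), (2, 2)),
    ("column bed height (cm)",   (1, 1), (4, 2)),
    ("load concentration (g/L)", (1, 1), (1, 1)),
]
classified = {name: study_type(severity(m, i)) for name, m, i in rows}
```

Running this reproduces the ranking discussed above: protein load (severity 16) and load flow rate (severity 8) fall into the multivariate studies, bed height (severity 4) warrants a univariate study, and load concentration (severity 1) requires no additional study.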
One of the inherent drawbacks of the DoE approach used for process validation studies is that the number of experi-
ments increases exponentially with the number of parameters. Consequently, for purely practical reasons, it is expected 
that the risk assessment exercise should end up with a suitably low number of relevant parameters (4–6), which in turn can 
compromise its credibility [15]. To address this DoE drawback and to address the process validation guidelines [3], a new 
approach to process validation and thus to classification of process parameters has been proposed [15,16]. The approach is 
based on what is known in the field of mathematics as the Latin Hypercube Sampling (LHS). In the LHS approach adopted 
for process design, all the process parameters that are considered for the risk assessment exercise are tested experimentally, 
but only in combination with other process variables/parameters. Thus, each parameter is tested at N distinct levels (values) 
chosen from a statistically relevant parameter range (assuming a relevant probability density function describing the param-
eter variability, e.g., normal or uniform), where N is equal to the number of process parameters to be tested. As a result, the 
TABLE 32.4 Parameter and Severity Classification

Severity Score | Parameter Classification | Type of Studies
Very high      | Primary                  | Multivariate study
Medium-high    | Primary or secondary     | Multivariate or univariate with justification
Low-medium     | Secondary                | Univariate
Low            | No impact                | No additional study required
LHS approach delivers a statistically relevant experimental design, which produces data representative of a "routine manu-
facturing outcome" (i.e., a sort of process control chart). In principle, the LHS approach has huge potential to simplify 
risk assessment discussions and to identify process parameters based on hard evidence rather than on qualitative historical 
data that are often not even properly archived. However, widespread implementation of this method will require examples 
to be presented to regulatory authorities and discussed within the bioprocessing community. Until then, other approaches 
will need to be used to justify process designs based on a risk assessment approach.
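A basic Latin hypercube sample can be generated with the standard library alone. The sketch below works in normalized [0, 1) units, splitting each parameter's range into N equal strata and sampling each stratum exactly once, with the strata randomly paired across parameters; rescaling to physical ranges (and to non-uniform probability density functions) is left out for brevity.

```python
import random

def latin_hypercube(n_runs, n_params, seed=0):
    """One point per stratum per parameter, strata randomly permuted per parameter."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        strata = list(range(n_runs))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n_runs for s in strata])
    # Transpose into one row (parameter combination) per experimental run
    return [tuple(col[i] for col in columns) for i in range(n_runs)]

# Five runs covering five parameters, as in the N-level scheme described above
plan = latin_hypercube(5, 5)
```

Each of the five runs thus tests all five parameters simultaneously, while every parameter is still exercised across its whole range, which is what makes the resulting data resemble a routine manufacturing outcome.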
A design space can be established for each single-unit operation, or it can be developed for a series of unit operations 
constituting a part of, or the whole, process. The former approach is simpler to develop, but it might require more overall 
effort to characterize the whole process and carries a risk that potential interactions between unit operations are not 
uncovered. The latter approach might, on the other hand, provide higher operational flexibility. Indeed, some novel DoE 
methods [17] and risk assessment approaches [18] can be used to characterize multiple sequential process steps simultane-
ously. These new approaches hold the potential for uncovering hidden interdependencies between steps while still being in 
line with the design space concept.
A design space can be developed at any scale [2]. From the scientific perspective, this is correct as long as the model 
systems used in the lab and/or the pilot scale are representative of the final manufacturing scale operation. However, if 
potential risks in the scaled-up operations exist, they should be described and a control strategy proposed. In general, it is 
recommended that the design space be described in terms of relevant scale-independent parameters (e.g., column volumes, 
residence and contact times, loads normalized per volume or surface area, etc.). This approach allows the same design space 
description to be applicable to multiple operational scales. Principles of scale-up of downstream processes are described 
later in this chapter.
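As an illustration of describing operating points in scale-independent terms, the helpers below keep residence time constant by fixing bed height and linear velocity, so that only the volumetric flow changes with column cross-section; the column dimensions used are hypothetical.

```python
import math

def residence_time_min(bed_height_cm, linear_velocity_cm_per_h):
    """Residence time = bed height / linear velocity (scale-independent)."""
    return 60.0 * bed_height_cm / linear_velocity_cm_per_h

def volumetric_flow_l_per_h(linear_velocity_cm_per_h, column_diameter_cm):
    """Volumetric flow (L/h) giving the same linear velocity at a new diameter."""
    area_cm2 = math.pi * (column_diameter_cm / 2.0) ** 2
    return linear_velocity_cm_per_h * area_cm2 / 1000.0

# Same 20 cm bed at 300 cm/h gives a 4 min residence time at any diameter;
# only the pump flow changes between a 1 cm lab column and a 60 cm plant column
lab_flow = volumetric_flow_l_per_h(300.0, 1.0)
plant_flow = volumetric_flow_l_per_h(300.0, 60.0)
```

Because the design space is expressed in bed height, linear velocity, and loads per column volume rather than in absolute flow rates, the same description remains valid as the column diameter grows.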
As discussed before in connection to the process design workflow, good documentation practice is pivotal when designing a process, and therefore it plays an important role in the establishment of the design space. Good documentation of development and characterization studies, even those with unexpected results, is critical in enabling technology transfer, troubleshooting, and investigations of out-of-specification (OOS) results. Development reports are also critical for
future process changes. When a process change is made and there are no development data, it may not be clear what effect 
the change will have on product quality.
TABLE 32.5 Example of Risk Ranking Analysis for Protein A Chromatography Stepa

Phase | Parameter | Main Effect (CQA/PA) | Highest Main Effect Score (hM) | Interaction Effect (CQA/PA) | Highest Interaction Score (hI) | Severity (hM × hI)
All phases | Column bed height (cm) | 1/1 | 1 | 4/2 | 4 | 4
Load (CCCF) | Flow rate (CV/h) | 4/2 | 4 | 2/2 | 2 | 8
Load (CCCF) | Protein load (g/L bed) | 4/4 | 4 | 4/1 | 4 | 16
Load (CCCF) | Load concentration (g/L) | 1/1 | 1 | 1/1 | 1 | 1
Equilibration and wash(es) | Buffer pH | 1/1 | 1 | 1/1 | 1 | 1
Equilibration and wash(es) | Buffer molarity (mM TRIS) | 1/1 | 1 | 4/1 | 4 | 4
Equilibration and wash(es) | Buffer molarity (mM NaCl) | 1/1 | 1 | 4/1 | 4 | 4
Equilibration and wash(es) | Flow rate (CV/h) | 4/2 | 4 | 4/1 | 4 | 16
Equilibration and wash(es) | Volume (CV) | 1/1 | 1 | 4/1 | 4 | 4
Elution | Buffer molarity (mM acid) | 4/1 | 4 | 4/1 | 4 | 16
Elution | Flow rate (CV/h) | 1/2 | 2 | 1/1 | 1 | 2
Elution | Start pool (CV) | 1/1 | 1 | 1/1 | 1 | 1
Elution | End pool (CV) | 1/1 | 1 | 8/1 | 8 | 8

a The list of parameters is provided here for illustration purposes only.
Adapted from CMC Biotech Working Group, A-Mab: A Case Study in Bioprocess Development, 2009, pp. 1–278.
650 SECTION | VI Industrial Process Design
Finally, a word about an important, yet non-mandatory, part of design space analysis: the so-called edge of failure analysis. The edge of failure is defined as the point in the design space where a boundary to a variable or parameter exists, beyond which the relevant quality attributes or specifications can no longer be met [14]. Although the FDA does not generally expect manufacturers to develop and test the process until it fails [3], an edge of failure analysis might help with assessing and defining process risks [19]. The edge of failure can be determined either experimentally, by exploring the design space until failures are found, or computationally, by simulating the extrapolated design space even when no failures were experimentally detected. The latter approach is preferred simply on cost grounds, but it requires a validated model to be developed.
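As a sketch of the model-based variant, the toy response surface below (purely illustrative, not a real process model) is scanned outward from the center of the design space until a hypothetical impurity specification is first violated:

```python
def impurity_model(ph_norm, load_norm):
    """Illustrative quadratic response surface: predicted impurity level as a
    function of two normalized process parameters (0 = center point)."""
    return 0.5 + 0.8 * load_norm ** 2 + 0.3 * ph_norm * load_norm

def find_edge_of_failure(spec_limit, step=0.05, max_range=2.0):
    """Walk outward along the load axis (pH held at center) until the
    specification is violated; return the first failing point."""
    load = 0.0
    while load <= max_range:
        if impurity_model(0.0, load) > spec_limit:
            return load  # first point beyond the edge of failure
        load += step
    return None  # no failure found within the explored (extrapolated) range

edge = find_edge_of_failure(spec_limit=1.0)
```

In practice the model would be a validated mechanistic or statistical model, and the scan would cover all relevant parameter directions rather than a single axis.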
Process Control
The critical and key process parameters identified and their operating ranges become an essential part of the process control 
strategy. The process understanding gained during the process development should, in addition to classification of process 
parameters, identify sources of process variability and propose adequate control mechanisms to reduce their impact on the 
product quality and process performance. As discussed in Chapter 4, variability can be caused by the biological production 
system, but also by post-cell culture effects, such as the impact of process conditions on the biological molecule, incompletely removed enzymatic activity, or inadequate process control. Additionally, chromatographic resins, filter membranes, and other re-usable consumables exhibit a certain batch-to-batch variability, and their performance can drift as they approach the end of their usable lifetime. As such, process design includes the definition of a window of operation for each step that ensures consistency of product quality and accommodates this 'natural' variability as well as that of raw materials and their performance. In practice, this is applied through the use of a safety factor/margin that involves underutilizing the full capacity of a specific operation (e.g., chromatographic resin or filter loading). Underutilizing the capacity of a resin or filter ensures a consistent level of performance for longer before degradation reduces, or introduces variability into, the specified performance.
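A minimal sketch of such a safety margin, with hypothetical capacity figures:

```python
def operating_load(dynamic_binding_capacity_g_per_l, safety_factor=0.8):
    """Set the routine protein load below the measured dynamic binding capacity
    so that batch-to-batch resin variability and aging do not cause breakthrough."""
    return dynamic_binding_capacity_g_per_l * safety_factor

def column_load_mass(operating_load_g_per_l, column_volume_l):
    """Mass of product loaded per cycle at the derated capacity."""
    return operating_load_g_per_l * column_volume_l
```

For example, a resin with a measured dynamic binding capacity of 40 g/L operated with an 80% safety factor would be loaded at 32 g/L, i.e., 3200 g per cycle on a 100 L column. The safety factor itself would be justified from resin lifetime and lot-to-lot variability studies.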
At the same time, success of the control strategy proposed will heavily rely on the fact that all sources of variability, 
such as different component lots, production operators, environmental conditions, and differences in measurement systems 
in the production and laboratory settings, are considered. Furthermore, it should be ensured during the PD phase that variability typical of commercial processes is tested while establishing the design space. The importance of having a focus on
process variability is exemplified by the notion that the ability to minimize process variability as early as possible in the 
process train will simplify the overall process control strategy, improve process robustness, and even minimize the need for 
end product testing [2].
On the other hand, it could be envisioned that with the right level of process understanding, future manufacturing control 
paradigms will allow for the variability of input materials to be less tightly constrained. Instead, the process parameters can 
be automatically adjusted to account for the variability of incoming material through an appropriate process control strategy 
to ensure consistent product quality. An example of such an approach, dealing with effects related to lot-to-lot variability of an HIC resin, has been proposed [20] (see Chapter 19).
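One simple instance of such a parameter adjustment is feed-forward control of the load step: the loaded feed volume is recalculated from the measured titer of each incoming lot so that the mass loaded per litre of resin stays constant. The sketch below is illustrative only; the target load and titer values are hypothetical:

```python
def adjusted_load_volume(target_load_g_per_l, column_volume_l, measured_titer_g_per_l):
    """Feed-forward adjustment: scale the loaded feed volume so that the mass
    of product per litre of resin stays constant despite titer variability."""
    target_mass_g = target_load_g_per_l * column_volume_l
    return target_mass_g / measured_titer_g_per_l
```

For a target load of 35 g/L on a 100 L column (3500 g per cycle), a lot arriving at 5.0 g/L would be loaded as 700 L of feed, while a 4.2 g/L lot would require ~833 L; the resin sees the same mass either way.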
32.6 COMBINING STEPS FOR AN EFFICIENT PROCESS
As already mentioned when discussing the design space for sequential operations, in a manufacturing process for biologics, 
the outcome from a previous step will influence the performance of the subsequent steps. However, the steps are not neces-
sarily linked to each other in an optimal fashion in the first version of the process. Sub-optimal links can be time consuming 
and costly; they may even force the introduction of additional steps or adjustments to the intermediate product (associated 
operations). Process integration is the part of the process development phase where these issues are resolved. For process integration to be successful, the effect of process stream variability must be accounted for when characterizing the different purification stages. Furthermore, strategies for minimizing these effects need to be developed.
If process steps are developed independently of each other, it is likely that they will use different buffers and that the elution conditions of one step will not be adjusted to allow direct loading onto the next. This can be changed so that each step uses only one or two basic buffers, and so that the elution conditions of one step match the loading conditions of the next as closely as is feasible for good overall performance. An extreme, yet elegant, example of such a strategy was proposed by Mothes et al. [21]. The
strategy, also referred to as ASAP (Accelerated, Seamless Antibody Purification), is based on a buffer system that allows a 
seamless integration of three chromatography steps where the elution buffer from one step is the load buffer for the subse-
quent step. Specifically, the eluate from the Protein A step can be loaded directly onto a second-step column and the eluate 
from the second step is applied directly onto the polishing column. A key benefit of ASAP continuous processing is the 
elimination of intermediate product storage, adjustment of the intermediate for load onto the next step, and potential extra 
steps such as UF/DF associated with pH, buffer molarity, and protein concentration adjustment. With those steps removed, 
either process cycle times or column sizes can be reduced.
If the use of this, or a similar, buffer system is not possible, one may still design for the possibility of modifying the intermediate product composition with in-line adjustments (i.e., without hold tanks between the steps). An example of a technology enabling the operation of several chromatography steps connected in series is the so-called straight through processing (STP) concept developed in collaboration between GE Healthcare and Janssen (see Chapter 27). Alternatively, the recently introduced single-pass tangential flow filtration (SPTFF) technology [22] makes in-line adjustment, including a concentration step, feasible.
In general, the number of different buffers and cleaning solutions across the whole process should be minimized to three or four, and sodium hydroxide can be made the standard agent for cleaning columns and filters in many processes. With this strategy, process integration becomes much simpler.
32.7 SCALE-UP AS PART OF PROCESS DESIGN
Once the correct steps and sequence of unit operations have been identified for achieving the required product purity and 
quality, the process design itself should be evaluated for its feasibility of operation at the manufacturing scale. Process pa-
rameters, including considerations for the constraints at larger scales, should be accounted for at the bench level. However, 
quite often not all data required for successful scale-up can be collated from the bench, particularly for new processes 
without any historical implementation at larger scales. Therefore, refinements of the small-scale process may need to be 
evaluated and tested at an intermediate scale between that of the bench and production to allow a more efficient scale-up. 
Thus, small-scale processes are generally scaled to a bridging, pilot scale for further process optimization before scaling to 
the final manufacturing level.
Before performing scale-up, one needs to decide at which scale the final process is to be operated. Scale-up is driven by market demand, but also by the manufacturing capacity of the facility where the process is to be installed. As discussed
in Chapter 4, the demand is addressed by calculating how many batches will need to be run per year, at a given overall 
process yield, assuming: (i) a range of realistic titer levels and (ii) feasible working volumes of the bioreactors. The former 
will be based on the advances in cell culture technologies and the latter on facility capability (e.g., available floor space, 
auxiliary equipment, ability to run multiple bioreactors in parallel and/or staggered fashion, etc.). Once feasible scenarios 
are identified and the mass of product per batch is known, it is a simple task to scale all unit operations in the manufacturing process. For instance, once it is known how much product is produced in a single bioreactor, the size of the DSP equipment is determined from the available processing time for production. The DSP may be sized to accommodate the batch to be
processed either in one single lot or in several cycles using smaller equipment. Although the latter approach is sometimes 
more cost effective, it should be noted that cycle time is independent of the scale of operation, hence more cycles will take 
longer to complete.
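The sizing logic described above can be sketched in a few lines; all demand, titer, yield, and equipment figures below are hypothetical:

```python
import math

def batches_per_year(annual_demand_kg, titer_g_per_l, bioreactor_volume_l, overall_yield):
    """Number of batches needed to meet annual demand at a given titer,
    bioreactor working volume, and overall downstream yield."""
    product_per_batch_kg = titer_g_per_l * bioreactor_volume_l * overall_yield / 1000.0
    return math.ceil(annual_demand_kg / product_per_batch_kg), product_per_batch_kg

def capture_cycles(product_per_batch_kg, column_volume_l, load_g_per_l):
    """Cycles needed to process one batch on a capture column of a given size.
    Cycle time is scale-independent, so more cycles means a longer batch."""
    capacity_per_cycle_kg = column_volume_l * load_g_per_l / 1000.0
    return math.ceil(product_per_batch_kg / capacity_per_cycle_kg)
```

For example, meeting a demand of 500 kg/year at 5 g/L titer in a 10,000 L bioreactor with 70% overall yield gives 35 kg per batch, i.e., 15 batches per year; processing one such batch on a 200 L capture column loaded at 35 g/L takes 5 cycles.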
Once equipment sizes are determined, their feasibility should be evaluated against any constraints of the manufacturing facility. For instance, work shift patterns for batch operations and process scheduling in cases where the facility is to
manufacture multiple products will need to be considered. This is to ensure that the total process time does not exceed the 
allocated manufacturing campaign window. If a process is to be retro-fitted into an existing facility, there may be space 
restrictions for certain equipment, limiting the feasible sizes that can be utilized. In the case of a new facility, constraints 
related to equipment size can be relaxed by purchasing larger equipment or more units. Flexibility for equipment dimen-
sioning at large scale can be created by avoiding locking the key dimensioning parameters too early. In general, a successful 
scale-up should balance the needs of the technical solutions utilized with that of the economy of a manufacturing run, while 
ensuring the product quality requirements for patient safety.
32.7.1 Upstream
A cell culture process is typically developed in bench scale bioreactors and then scaled up for commercial production. 
However, although recent advances in cell technologies have enabled high-titer processes, the resulting high cell density cultures pose a challenge from the perspective of process scale-up. Factors such as mixing time, oxygen transfer, and carbon dioxide removal in particular need to be carefully considered when scaling up high cell density processes. For
example, poor mixing can result in local nutrient gradients within the bioreactor, leading to reduced cell growth and productivity. At the same time, however, low-power-input mixing is recommended because of the high sensitivity of mammalian cells to shear stress. The mixing will, in turn, affect both oxygen transfer and dissolved CO2 removal. The former is of critical importance, as mammalian cultures are aerobic processes and oxygen is often the limiting nutrient, while the latter has been linked to lower cell productivities and even to altered product quality due to changes in glycosylation. In other words, optimizing
oxygen supply and carbon dioxide removal, while avoiding cell damage, is the key to the mammalian cell culture scale-up. 
A more detailed discussion on cell culture scale-up is provided in Chapter 31. For the purpose of this chapter we are only 
providing a very brief example of three criteria for scaling suspension cell cultures [23]. The criteria are listed in Table 32.6.
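As a generic illustration (not necessarily the exact criteria of Table 32.6), two commonly used scale-up rules, constant power input per volume (P/V) and constant impeller tip speed, each fix the impeller speed at large scale. The sketch below assumes geometrically similar stirred tanks in turbulent flow, where the power number is constant, so P scales as N^3 D^5 and P/V as N^3 D^2:

```python
def scaled_impeller_speed(n_small, d_small, d_large, criterion="P/V"):
    """Impeller speed at large scale for geometrically similar stirred tanks
    in turbulent flow (constant power number: P ~ N^3 D^5, P/V ~ N^3 D^2).

    n_small: impeller speed at small scale (e.g., rpm)
    d_small, d_large: impeller diameters at the two scales (same units)
    """
    ratio = d_large / d_small
    if criterion == "P/V":
        # Constant N^3 D^2  ->  N_large = N_small * (D_small / D_large)**(2/3)
        return n_small * ratio ** (-2.0 / 3.0)
    if criterion == "tip speed":
        # Constant N D  ->  N_large = N_small * (D_small / D_large)
        return n_small / ratio
    raise ValueError(f"unknown criterion: {criterion}")
```

For a tenfold increase in impeller diameter, constant tip speed cuts the speed tenfold (e.g., 200 rpm to 20 rpm), while constant P/V gives roughly 43 rpm; a constant-kLa criterion additionally requires correlations for gassed power and superficial gas velocity and is omitted here.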
In terms of manufacturability and scalability, mammalian cells have historically been considered difficult to work 
with due to factors such as low yield, medium complexity, serum requirement, and shear sensitivity, although the latter 
has generally been incorrectly overemphasized. After two decades of intensive development work in cell line, media, and 
bioreactor condition optimization, cell densities of 15–25 million viable cells/mL can be routinely achieved for monoclonal antibody fed-batch processes, giving production titers of 3–5 g/L. Higher titers of up to ~10 g/L, together with cell densities of more than 50 million viable cells/mL in fed-batch processes, have recently been reported by a few companies at major conferences [24,25]. The enhancement of specific productivity per cell is achieved not only by the selection of highly productive clones, but also by optimization of medium composition and bioreactor operating conditions.
Cell line stability is another factor that should be considered because volumetric and specific productivity decline as cell 
age increases for some cell lines. Such unstable clones are not suitable for large-scale production because cell age increases 
with scale as the cell culture process is scaled up through serial culture passages of the seed train and inoculum train. In ad-
dition to cell line stability, growth and metabolite characteristics that can affect process robustness and scalability also need 
to be assessed. Robust cell growth with high viability and low lactate synthesis is usually desirable. High-lactate-producing clones are not preferred, because the addition of base needed to maintain pH brings an accompanying increase in osmolality.
Therapeutic antibodies are produced in mammalian host cell lines, including NS0 murine myeloma cells and Chinese hamster ovary (CHO) cells [25–27]. The selection of an expression system is determined by its ability to deliver high productivity with acceptable product quality attributes, and by the preferences of individual companies, which are often influenced by their historical experience.
As discussed in Chapters 5 and 31, a typical cell culture manufacturing process begins with thawing of a cryopreserved 
cell-bank vial, followed by successive expansions into larger culture vessels such as shake flasks, spinners, rocking bags, and 
stirred bioreactors [27]. When culture volume and cell density meet predetermined criteria, the culture is transferred to the 
production bioreactor in which cells continue to grow and eventually express product. Getting the required cell density and 
volume of culture to inoculate the production bioreactor is achieved via a cell culture seed train. The cells are usually run 
through many cultivation systems that become larger with each passage.
