6/19/2019
1
MVA and DOE: Throughout the Product Lifecycle
Dr. Charles E. (Chuck) Miller, Camo Analytics
• Background:
  • MVA and DOE: “The Tools” vs. “The Philosophies”
  • Synergies
  • Product Lifecycle
• Thesis: The tools and concepts of MVA and DOE are relevant throughout the product lifecycle
• Case studies:
  • On-line NIR Spectrometer: 30 years old!
  • Pharma RTRT Application
  • PAT Biotech Applications
Outline
Multivariate Analysis (MVA)
• Multiple Linear Regression (MLR)
• Principal Components Analysis (PCA)
• Partial Least Squares (PLS) Regression
• Cluster Analysis
• Linear Discriminant Analysis (LDA)
• Some others…
• Real-world relationships are seldom binary or linear
• Combine both “Statistical and Chemical” thinking (Martens, Naes 1989)
• Utilize Domain Knowledge whenever possible
• Accept, embrace and utilize increasing multivariate nature of data
• “All models are wrong, but some models are useful”
The Tools / The Philosophy
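As a quick illustration (my own sketch, not part of the original slides), the first unsupervised tool on the list, PCA, can be written in a few lines of NumPy; the function name `pca` and the toy data are assumptions for the example:

```python
import numpy as np

def pca(X, n_components=2):
    """Minimal PCA via SVD: center the data, take the top right
    singular vectors as loadings, and project to get scores."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_components].T           # (n_vars, n_components)
    scores = Xc @ loadings                   # (n_samples, n_components)
    explained = (s ** 2) / np.sum(s ** 2)    # variance fraction per PC
    return scores, loadings, explained[:n_components]

# Toy spectra-like data: 10 samples x 5 variables
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))
scores, loadings, expl = pca(X, 2)
```

Production chemometrics packages add preprocessing, cross-validation, and diagnostics on top of this core decomposition.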
• Experimental Design Schemes:
  • Screening, Factorial, Composite, Mixture, …
• Confounding
• ANOVA, Regression
Design of Experiments (DOE)
• Taguchi’s three concepts:
  • Design quality into the product
  • Achieve quality by minimizing deviation from the target
  • Measure the cost of quality as a function of deviation from the standard (continuous improvement)
• Key Principles: Randomization, Replication, Blocking and Control
The Tools / The Philosophy
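To make the DOE tools concrete, here is a minimal sketch (mine, not from the talk) of a replicated, randomized two-level full-factorial run sheet, illustrating two of the key principles above, randomization and replication:

```python
import itertools
import random

def full_factorial(levels_per_factor, replicates=1, seed=0):
    """Build a full-factorial run sheet, replicated and randomized.
    levels_per_factor: one tuple of levels per factor, e.g. (-1, 1)."""
    runs = list(itertools.product(*levels_per_factor)) * replicates
    rng = random.Random(seed)
    rng.shuffle(runs)   # randomization guards against drift and bias
    return runs

# 2^3 factorial in coded units, each combination run twice
runs = full_factorial([(-1, 1)] * 3, replicates=2)
```

Blocking and control would be layered on top of a sheet like this; screening and composite designs modify which combinations are kept.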
• Mixture DOEs for Multivariate Calibration of PAT analyzers
• Use MVA and DOE for candidate selection, optimization
• Use DOE to optimize MVA calibration model development process
• Use MVA to analyze multivariate responses generated from a DOE
• Multi-way MVA tool (PARAFAC) for multiplicative ANOVA
• “Design Space” concept in QbD
The Many Synergies of MVA and DOE
Geir Rune Flåten, Frank Westad, Pat Whitcomb, Synergy of DoE and MVA, IFPAC 2019
• Generate data that sufficiently cover the range of PAT analyzer responses expected to be generated during real-time operation, and
• Generate data that can be used to sufficiently characterize any non-linear effects in the analyzer data.
DOE for Multivariate PAT Calibration: Objective(s)
[Diagram: FEED → REACTOR → PRODUCT, with an ANALYZER on the process stream]
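Mixture designs of the kind used for PAT calibration can be sketched as a simplex-lattice; this toy generator (an assumption of mine, not the presenter's tool) enumerates all blends whose component proportions are multiples of 1/m and sum to 1:

```python
from itertools import combinations_with_replacement

def simplex_lattice(n_components, m):
    """{n, m} simplex-lattice mixture design: all compositions whose
    proportions are multiples of 1/m and sum to exactly 1."""
    points = set()
    for combo in combinations_with_replacement(range(n_components), m):
        p = [0] * n_components
        for idx in combo:
            p[idx] += 1                      # distribute m shares
        points.add(tuple(x / m for x in p))  # convert shares to fractions
    return sorted(points)

design = simplex_lattice(3, 2)   # {3, 2} lattice: 6 blends
```

For analyzer calibration these blend points would then be spanned across the expected concentration ranges, with extra points added to probe non-linear response.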
• Non-Linearities: X-X and X-Y
• Modeling Math: Inverse modeling math (MLR, PLS, PCR) vs. direct modeling math (CLS, and extensions thereof)
• Interactions: Don’t need to model them, but their presence needs to be taken into account
• Y Variable Type: Compositional (ex. API concentration), or non-compositional (ex. dissolution rate, hardness).
• Uncontrolled Factors: Environmental factors that could influence the response (X) variables
• Limited Resources: ex. API (active ingredient) for generating standards, especially during small-scale process development.
DOE for Multivariate PAT Calibration: Key Considerations
• Design Space: “The multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality” (ICH Q8R2)
• Regulatory flexibility –Working within the design space is not considered as a change
QbD: “Design Space” Concept
S. Chatterjee, IFPAC 2012
[Figure: three-dimensional design-space plot with axes x1, x2, x3 and regions A, B, C]
Linear algebra says:
• A space (sub-space) extends infinitely in all relevant dimensions
• A region represents only a specific part of the space

DOE:
1. Establish relevant process variables (screening)
2. Define the space in terms of manipulated process variables
MVA: used to handle multivariate outputs from DOE, and to enforce Design Space Compliance in real‐time
Should it be called “Design Region” instead?
• Non-DOE (happenstance) data: balance it, avoid redundancies
• Even within a DOE, subset selection is critical:
  • Calibration set, validation set, test set
Sample Selection in NIR
Næs, T. and Isaksson, T. (1989). Selection of samples for calibration in near-infrared spectroscopy. Part I: General principles illustrated by example. Appl. Spectrosc. 43(2), 328-335.
Isaksson, T. and Næs, T. (1990). Selection of samples for calibration in near-infrared spectroscopy. Part II: Selection based on spectral measurements. Appl. Spectrosc. 44(7), 1152-1158.
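One widely used sample-selection scheme in this spirit is the Kennard-Stone algorithm, which picks calibration samples that span the measurement space. A minimal NumPy sketch (my own, not code from the papers cited above):

```python
import numpy as np

def kennard_stone(X, n_select):
    """Kennard-Stone subset selection: start from the two most distant
    samples, then repeatedly add the sample farthest from the set."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(D), D.shape)
    selected = [int(i), int(j)]
    while len(selected) < n_select:
        remaining = [k for k in range(len(X)) if k not in selected]
        # for each candidate, distance to its nearest already-selected sample
        dmin = D[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(np.argmax(dmin))])
    return selected

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))         # toy spectra: 30 samples x 4 variables
cal_idx = kennard_stone(X, 10)       # indices of a 10-sample calibration set
```

In practice the distances are often computed in PCA score space rather than on raw spectra, so that noise does not drive the selection.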
• Traditionally, DOE tools are relegated to the early (development) stages of the lifecycle
• MVA tools are gaining traction throughout the lifecycle (especially for PAT calibration), but are still relatively under-utilized, considering the increased data volumes being generated
Product Lifecycle
• “PAT” = Process Analytical Technology
• All four value propositions depend on DOE and MVA
• PAT calibrations in development
• PAT calibrations in supply
• High-volume data screening/analysis
• Scale-up process MVA monitoring
• PAT = “catalyst” for more DOE, MVA usage!
Lifecycle: “PAT”-Centric View
Manoharan Ramasamy, Nathan Pixley, Bruce Thompson, Chuck Miller, Louis Obando, John Higgins, Mark Eickhoff, IFPAC 2015
Product Lifecycle Table

Pre-clinical / Phase I (Development)
• Business: Safety testing
• Technical: HTS, basic R&D, exploration
• DOE: Screening, development DOEs
• MVA: Exploratory MVA; lab analytical development

Phase II (Development)
• Business: Patient testing (efficacy, side effects)
• Technical: Clinical studies, MFG for studies, process scale-up, control strategy
• DOE: Clinical trial designs, PAT mixture/cal design; pilot plant designs
• MVA: PAT MV calibration models for R&D; exploratory MVA

Phase III (Development)
• Business: Patient testing (efficacy, effectiveness, safety, PAI)
• Technical: Clinical studies, MFG for studies, assess therapeutic effect, prepare filing, MFG & PAI readiness
• DOE: Clinical trial designs, PAT mixture/cal design; pilot plant designs
• MVA: Scale-up/down process modeling; PAT method optimization, filing preparations

Launch (Marketable Product)
• Business: Start manufacturing per filing
• Technical: Closely monitor process, QA systems
• DOE: Designs for PAT method updates, process updates
• MVA: “Hypercare” process monitoring & fault detection; PAT method monitoring (outlier diagnostics), PAT updates

Growth (Marketable Product)
• Business: Manufacture per filing, possible filing updates
• Technical: Scale up MFG to demand; ongoing process monitoring
• DOE: Designs for PAT and analytical method transfers & method updates
• MVA: PAT calibration transfer, monitoring (outlier diagnostics), method maintenance; retrospective process MVA

Maturity-Decline (Marketable Product)
• Business: Generic competition
• DOE: Designs to support process improvements
• MVA: PAT calibration maintenance, MVA process investigations, deviation support; retrospective process MVA
• CASE 1: In-line NIR Reaction Monitoring: since 1989
• CASE 2: Pharma Real Time Release Testing (RTRT) Application: since 2005
• CASE 3: Upstream Biotech PAT and MVA Applications: since 2008
Case Studies
Acknowledgements: DuPont, Merck
CASE 1: In-line FTNIR Feed Monitoring
• Purpose: Reactor control in a continuous process
• Redundant with on-line GC, process model
• Complex composition space!
  • 4 production units across 2 plant sites
  • 100s of product grades
  • >10 constituents
  • 31 PLS models total
[Timeline annotations:]
• 10.3.2010 (A): 132 MD alarms over 8 batches, all negatives! Triggered investigation of outlier metric limits (too “tight”?)
• 18.3.2011 (all): new MD limits set using 95% CL
• 30.6.2011: A model update; B and C model updates
• 29.7.2009 (B): 6 MD alarms over 2 weeks, all negatives
• B: 3 MD alarms, all confirmed positives!
[Plot: F value vs. sample number, with F-above-limit and F-below-limit points marked]
27.1.2009 (A): 2 F alarms, confirmed positives!
• The A model generated useful metrics right away; B and C models did so after a model update
• Since 2006, only 7 confirmed positive tablets (out of >120k!)
  • All were flagged by the outlier detector!
  • None resulted in tablet quality issues!
• However, 132 false alarms for model A in 2010
• B model update effect on F value
• 12.2.2007 (A): 2 F alarms, confirmed positives! XRCT confirmed “lumps” in tablet!
[Legend: A strength, B strength, C strength]
Model A Outlier metrics: 2006-2011
[Histograms of Model A outlier metrics:
 M-distance: Calibration (N=167) and Validation (N=123) vs. Routine Analysis (N=101310); 132 MD-based false alarms
 F-value: calibration/validation vs. routine-analysis distributions]
• MD and F metrics of routine samples follow F‐distribution “fairly well”
• A few high-F outliers in the calibration set caused some “inflation” in the F-value limit
• All 132 false alarms were caused by MD (not F value)
Evidence to support increase in CL for MD limits
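The limit-setting step can be sketched as follows, assuming (consistent with the F-distribution remark above, and as is common practice) that the squared M-distance is treated as a Hotelling T² statistic; the sample and component counts are illustrative:

```python
from scipy import stats

def md_limit(n, k, cl=0.95):
    """Upper control limit for a Hotelling T^2-style squared M-distance,
    from an F-distribution percentile. Raising the confidence level `cl`
    widens the limit and so reduces false alarms."""
    f_crit = stats.f.ppf(cl, k, n - k)           # F critical value
    return k * (n - 1) / (n - k) * f_crit        # T^2 scaling

# Illustrative numbers: 167 calibration samples (as in the histograms), 5 PCs
lim95 = md_limit(n=167, k=5, cl=0.95)
lim99 = md_limit(n=167, k=5, cl=0.99)
```

The trade-off is explicit: the 99% limit is looser than the 95% one, cutting false alarms at the cost of slower detection of marginal outliers.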
Model A Update 2011
[Plot: MD Index vs. spectrum number (~1,200 spectra), 2006 Model vs. 2011 Model]
Model update evaluation includes studying the behavior of outlier metrics!
[Plots: F-Probability vs. spectrum number; mean difference and RMSEP by batch; each comparing the 2006 Model and the 2011 Model]
• The 2006 Model had very good prediction results; hard to further improve!
• Both outlier metrics had improved behavior, M-distance in particular
[Panel labels: Prediction Performance / Outlier Performance]
Life Cycle of RTRT NIR method
[Diagram: NIR 1a (Site 1; Doses A, B, C) transferred to NIR 2a (Site 2; Doses A-E); inter-instrument transfers to NIR 2b; intra-site transfer to NIR 2c; inter-site transfers to NIR 3a and NIR 3b (Site 3); a further inter-site transfer marked with a question mark]
Manoharan Ramasamy, Nathan Pixley, Bruce Thompson, Chuck Miller, Louis Obando, John Higgins, Mark Eickhoff, IFPAC 2015
RTRT still in service today‐ at three sites!
• Exploratory MVA models
• Multi-scale MVA models
• Pilot Plant MVA monitoring
• In-line Raman (PAT)
CASE 3: MVA Upstream Biotech Applications
Charles E. Miller, Louis Obando, John P. Higgins, Gert Thurau, AAPS 2012 Annual Meeting, Chicago IL, 10/17/12
Product Lifecycle Table: CASE 3
(Same lifecycle table as presented earlier, repeated to frame the CASE 3 activities.)
Upstream Biotech: Data Types
[Diagram: OFAV (day × batch), SV (time × batch), and QV (per batch) data arrays]
• In-line sensors: e.g., P, T, flow, DO, OUR, CER; high frequency (~1/min); more relevant to operations
• Off-line analytical: e.g., viability, Glu, Lac; low frequency (~1/day); relevant to operations and quality
• Post-batch analytical: e.g., IEX, N-glycan %s; once per batch; relevant to quality

PPQ2 Batch, 21 June 2013:
Engineer: “Air flow had been lowered to 2100 slpm before inoculation, but it did not get re-set to 3360 slpm for BATCH phase”
Pilot Plant MVA Monitoring
Model Development → On-Line Monitoring
• “Wider” model space: includes DOE process states
• Analytics platform auto-integrates process data for later modeling work
• Training opportunities for upcoming full-scale process validation, launch
[Plot: Raman-predicted glucose (g/L) vs. Bioprofile glucose (g/L); one Raman prediction flagged as OUTLIER]
Charles E. Miller , John P. Higgins, Louis Obando, Jorge Vazquez, IFPAC 2018
• MVA and DOE have strong synergy
  • Driven by PAT applications
• Tools and concepts apply throughout the product lifecycle
Summary
Case 3: PAT Method Events-Original Site
• The C model generated useful metrics right away; A and B models did so after a model update
• Since 2006, only 6 confirmed “positives” (out of >120k!)
  • All of which were flagged by the outlier detector!
  • None of which had product quality issues!
• However, 132 false alarms for the C model in 2010: re-evaluate outlier metric limits (too conservative?)
• In service since 2006
• Uses Bruker MPA NIR diffuse transmittance
• Three dosages (A, B, C)
• To date: billions of tablets manufactured, 100,000s of tablets analyzed
• Original models based on a mixture of DOE + normal manufacture
• Ongoing verification strategy supported by:
  • Multivariate outlier diagnostics
  • Comparison to reference (LC)
CASE 2: Pharma Real Time Release Testing (RTRT) Application
John Higgins, Zhihao Lin, Charles E. Miller, Nathan Pixley, Manoharan Ramasamy, George Zhou, and Niya Bowers, IFPAC 2014
• “In-space” metric (M-distance)
  • Expresses “distance from model center”
  • Generally describes structured variance in the data
  • For some calibration data, do NOT expect a random distribution!
    • Especially for Pavia’s [“DOE” + process] calibration data!
• “Out-of-space” metric (F-value)
  • Describes less-structured variance (“noise”) in spectral space
  • Therefore, expect more randomly distributed metric values
The Two Outlier Metrics: Different Expectations!
[Histogram: frequency vs. distance from model center]
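A minimal sketch of how the two metrics can be computed from a PCA model (the function names, scalings, and toy data are my own assumptions; PAT software packages use their own exact scalings for the F-value):

```python
import numpy as np

def outlier_metrics(model_scores, model_loadings, x_new, x_mean):
    """Two PCA-based outlier diagnostics for a new spectrum x_new:
      - in-space: Mahalanobis distance of its scores
        (a "distance from model center")
      - out-of-space: sum of squared spectral residuals after
        projection (a Q-style statistic)."""
    t = (x_new - x_mean) @ model_loadings            # new-sample scores
    cov = np.cov(model_scores, rowvar=False)         # calibration score cov.
    md2 = t @ np.linalg.solve(cov, t)                # squared M-distance
    residual = (x_new - x_mean) - t @ model_loadings.T
    q = float(residual @ residual)                   # out-of-space metric
    return float(np.sqrt(md2)), q

# Build a toy 3-PC model from 50 calibration "spectra" of 8 variables
rng = np.random.default_rng(2)
Xcal = rng.normal(size=(50, 8))
x_mean = Xcal.mean(axis=0)
U, s, Vt = np.linalg.svd(Xcal - x_mean, full_matrices=False)
P = Vt[:3].T                     # loadings
T = (Xcal - x_mean) @ P          # calibration scores

md, q = outlier_metrics(T, P, Xcal[0], x_mean)
```

This makes the slide's distinction concrete: the M-distance lives in the structured score space, while the F-value-style statistic lives in the residual space left over after the model.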
• 23 15L vessels
• Two 1000L vessels
• One product in development
• Two processes: seed and production fermentations
  • Each has three process phases
• Observation- AND batch-level models
• >500 batches
• 11 process variables
• 25 SBOL models running concurrently, since Dec 2011