Heather Runes, Ph.D., MMTech Genentech, A Member of the Roche Group Wei-Meng Zhao and Dieter Schmalzing
Genentech, A Member of the Roche Group
January 27, 2014
CMC Strategy Forum
Washington, D.C.
Lifecycle Management of a Commercially Approved QC Potency Assay for a Biotech Product – A Case Study
Presentation outline
• Method lifecycle management
• Considerations for post approval potency method change
• Three Case Studies:
1. Assay Replacement
2. Assay Enhancement
3. Assay Replacement
• Conclusions
Method lifecycle management
• Method validation for commercial use
• Assay training and transfer to QC Sites
• Critical reagents management
• Cross-site assay monitoring
• Technical support
• Continuous improvements:
• Targeted technical enhancements
• Complete replacement
• Retirement of preceding method
* For more information, see Paul Motchnik’s presentation on 1/28 (11:40 – 12:05)
[Lifecycle diagram: Product Launch → Tech Transfer and Routine Method Maintenance → Pro-active Assessment of Method Performance and Regulatory Expectations* → New Technologies]

Drivers and Activities – Post-approval Potency Method Change
• Drivers:
  • New regulatory requirements
  • Better understanding of mechanisms of action (MOAs)
  • Change in vendor support: reagents, hardware, software
  • Superior technology (e.g., automation)
  • Increased efficiency (e.g., higher throughput)
  • Work safety (e.g., ergonomic risk reduction)
• Activities:
  • Assessment of the criticality of the change (i.e., regulatory impact)
  • Development
  • Robustness
  • Validation
  • Comparability study (new vs. old method)
  • Submission
Method comparability
• Detect quantitation differences between the new and current methods
• Comparing validation data from the two methods is not sufficient
  • Only the reference standard is used in validation
  • Validations were performed at different times with different personnel, equipment, etc.
• Head-to-head comparison using the same samples
  • Lot release samples
  • Stability samples
  • Stressed samples
• Pre-defined acceptance criteria
  • Consider specification and manufacturing capability
• Sample size
  • Statistically determined to ensure a reasonable chance of passing the acceptance criteria
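The statistically determined sample size can be illustrated by simulation: assume a level of assay variability and a pre-defined acceptance criterion on the mean difference, then estimate the probability that a head-to-head comparability study passes. The sketch below is illustrative only; the standard deviation, criterion, and sample sizes are assumptions, not values from the presentation.

```python
import random
import statistics

random.seed(0)

def pass_probability(n_samples, true_bias=0.0, assay_sd=0.03,
                     criterion=0.05, n_sim=5000):
    """Chance that a head-to-head comparability study passes.

    Each simulated study runs n_samples lots on both methods; the study
    passes if the mean difference stays within +/- criterion (on a
    relative-potency scale where 1.0 = 100%). All numbers here are
    illustrative assumptions, not values from the presentation.
    """
    passes = 0
    for _ in range(n_sim):
        diffs = [random.gauss(1.0 + true_bias, assay_sd)
                 - random.gauss(1.0, assay_sd)
                 for _ in range(n_samples)]
        if abs(statistics.mean(diffs)) <= criterion:
            passes += 1
    return passes / n_sim

# A tighter criterion (narrow specification) needs more samples for the
# same chance of passing:
for n in (5, 10, 20):
    print(n, pass_probability(n, criterion=0.03))
```

This mirrors the point of the two examples that follow: when the specification range is close to manufacturing capability, the acceptance criterion must be tighter and the required sample size grows.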
Example 1
Specification Range ≈ Manufacturing Capability
Tolerance for uncertainty: Lower
Acceptance Criteria: Tighter
Sample size for comparability: Higher
Example 2
Specification Range > Manufacturing Capability
Tolerance for uncertainty: Higher
Acceptance criteria can be wider but
tighter criteria are applied in practice
Sample size for comparability: Lower
Case study – The Product
• Legacy protein therapeutic, on the market for many years
• Narrow potency acceptance range
• Multiple and different types of changes to the potency assay during the lifespan of the product up to the present
[Timeline: BLA Method 1 release and stability acceptance criteria; Method 2 with Ref Std 2 and Assay Control 2; Method 3 with Ref Std 3 and Assay Control 3, plus specification change; Method 4 with Ref Std 4]
Method History

Method | Format | Mechanism of Action | Driver for Change
1 | Automated Technology 1 | Original | Original BLA method
2 | Manual Micro-titer Plate 2 | Same as BLA Method 1 | Instrument no longer supported
3 | Manual Micro-titer Plate 3 | Same as BLA Method 1 | Non-specific binding property of the plate used in Method 2
4 | Automated Technology 4 | Same as BLA Method 1 | Introduction of automation to improve assay precision
Case Study 1
Case Study: Method 1 (BLA) → Method 2 – Replacement
• Method 2 was developed and validated, and comparability between the two methods was shown
  • 21 release samples were tested in the comparability exercise
  • No predefined acceptance criteria
  • No extensive statistical analysis
  • Comparison of the same control: 0.5% difference
  • RSD increased from 3% to 6%
• Increase in potency of release results after implementation
  • iOOS events exceeding the upper limit
• Root cause: non-specific binding property of the plate
  • Higher potency in certain sample positions
  • This was not identified during development and validation
Case Study 2
Case Study: Method 2 → Method 3 – Enhancement
• Plate changed to remove the non-specific binding of the plate used in Method 2
  • The plates use the same type of material but are surface-treated differently
• Positional effects were removed and validation was performed
• No head-to-head comparison was performed
  • Removal of the positional effect was deemed sufficient
• iOOS events at the lower end of the acceptance criteria during routine testing
• Due to the shift introduced by the method change, a specification change was proposed
Approach for specification change
• A head-to-head comparison between Method 2 and Method 3 was performed
  • This approach was not accepted by the agency, since the specification had been set based on BLA Method 1 data
• No possibility of a head-to-head comparison between BLA Method 1 and Method 3
• Define the potency method bias using the most well-controlled historical data set available
  • The control represents a sample that is constant over time
  • Method changes are expected to affect control and release samples in a similar fashion
  • A large data set with no changes in control or reference-standard lots was chosen
Control trending data
• The same reference standard and the same assay control over time allow a direct comparison of the performance of the three different methods over time
Analysis of the control data
• Shift in results from the original method (BLA Method 1) to Method 3 is –3.5%
Difference in potency between methods | Mean shift | 95% CI lower limit | 95% CI upper limit
Method 1 – Method 3 | 3.5% | 2.6% | 4.4%
Method 1 – Method 2 | 0.7% | -0.1% | 1.5%
Method 2 – Method 3 | 2.8% | 1.8% | 3.9%
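A mean shift with a 95% confidence interval like those in the table can be computed from control results under the two methods. The sketch below uses hypothetical control potencies and a simple normal-approximation interval; the actual analysis in the presentation was based on a large historical control data set.

```python
import statistics
from math import sqrt

def mean_shift_ci(results_new, results_old):
    """Mean relative shift between two sets of assay-control results, with a
    normal-approximation 95% confidence interval.

    Illustrative sketch only: equal weighting, independent groups, no
    trending adjustments.
    """
    mean_new = statistics.mean(results_new)
    mean_old = statistics.mean(results_old)
    shift = (mean_new - mean_old) / mean_old        # e.g. -0.035 = -3.5%
    se = sqrt(statistics.variance(results_new) / len(results_new)
              + statistics.variance(results_old) / len(results_old)) / mean_old
    return shift, (shift - 1.96 * se, shift + 1.96 * se)

# Hypothetical control potencies (% of reference) under two methods:
method1 = [103.0, 105.0, 104.0, 106.0, 102.0]
method3 = [100.0, 101.0, 99.0, 100.0, 100.0]
shift, (lo, hi) = mean_shift_ci(method3, method1)   # negative: Method 3 reads lower
```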
Impact of reference standard change
• The change in reference standard added an additional bias of –1.8%
• Root cause: use of the international reference standard value instead of the experimentally determined value
Summary
• Shift in results from the original method (Method 1) to Method 3 is –3.5%
• The change in reference standard added an additional bias of –1.8%
• The overall change in results is –5.3%, which is highly relevant in relation to the narrow specification
• Proposal: revise the acceptance criteria by lowering both the upper and lower acceptance limits by 5.3%, based on extensive data mining and experimental data
• The proposal was accepted by the agency
Case Study 3
Technical limitation of Method 3
• To further mitigate the risk of iOOS events, a new method (Method 4) using automation was developed to improve assay precision
Challenge | Solution
Results strongly dependent on analyst technique (labor intensive; ergonomic risks; high training cost; high assay-to-assay variability; high system-suitability failure rate) | Automated process from dilution to loading
Outdated potency calculation (potency determined by interpolation of a standard curve; no information on the dose-response comparison between sample and standard) | Current standard for potency calculation: parallel line analysis (slope-ratio criterion for parallelism; linearity criterion for standard and sample)
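The parallel-line analysis named above can be sketched briefly: fit response versus log dose for the reference standard and the test sample, check the slope-ratio parallelism criterion, and derive relative potency from the horizontal shift between the two fitted lines. This is a bare outline under simplified assumptions; compendial parallel-line analysis (e.g., USP <1034>) adds linearity checks and formal statistical tests.

```python
import math

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return b, my - b * mx

def relative_potency(doses, std_resp, smp_resp, slope_ratio_limit=0.2):
    """Parallel-line analysis sketch.

    Fits response vs. log dose for standard and sample, rejects the assay
    if the slope ratio strays too far from 1 (parallelism criterion), and
    reports relative potency from the horizontal offset of the lines.
    The 0.2 slope-ratio limit is an illustrative assumption.
    """
    logd = [math.log10(d) for d in doses]
    b_std, a_std = linfit(logd, std_resp)
    b_smp, a_smp = linfit(logd, smp_resp)
    if abs(b_smp / b_std - 1.0) > slope_ratio_limit:
        raise ValueError("parallelism criterion failed")
    common_b = (b_std + b_smp) / 2
    return 10 ** ((a_smp - a_std) / common_b)   # horizontal shift -> potency ratio
```

Unlike interpolation against a standard curve, this approach makes the comparison of the sample and standard dose responses explicit, which is exactly the gap the table identifies in the outdated calculation.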
Method 4 assay development
• Lessons learned from the previous method changes were applied
• Extensive development work was performed to understand Method 4 accuracy and precision, and any assay bias between Method 3 and Method 4
• Multiple DOE studies were performed
• A loading-order effect was investigated and resolved
• Validation and method comparability studies were initiated only after this extensive development work; both served as confirmatory exercises for what had been shown during assay development
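The DOE studies mentioned above typically start from a designed run matrix. A minimal sketch of a two-level full-factorial design follows; the factor names and levels are hypothetical, not taken from the presentation.

```python
from itertools import product

# Hypothetical two-level factors for an automated potency assay
# robustness DOE (illustrative names, not from the presentation):
factors = {
    "incubation_time_min": (55, 65),
    "reagent_lot": ("A", "B"),
    "loading_order": ("forward", "reverse"),
}

def full_factorial(factors):
    """Yield every run of a full-factorial design, one dict per run."""
    names = list(factors)
    for combo in product(*(factors[n] for n in names)):
        yield dict(zip(names, combo))

runs = list(full_factorial(factors))
print(len(runs))  # 2**3 = 8 runs
```

Running every combination lets main effects and interactions (such as the loading-order effect noted above) be estimated before validation, so that validation and comparability remain confirmatory rather than exploratory.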