
Overview of Amgen's Commissioning and Qualification Program

Ronald Brunelle
Bob Buhlmann
Nick Haycocks
Agenda

Overview of Amgen's Commissioning and Qualification Program - Ronald Brunelle
Workshop Warm-up: Review of Key Deliverables - Nick Haycocks
Computer Systems: Developing a Risk Based Approach - Bob Buhlmann
Coffee break - 3.20
Overview of Amgen's Commissioning and Qualification Program

Ronald Brunelle - Quality Assurance Director, Amgen
Agenda

Introducing Change

Highlights of the Program

Maturing the Program

Lessons Learned and Next Steps

Workshop Warm-up - Review of Key Deliverables
Why do this?

Goal:
Simplify Validation
Better, Cheaper, Faster

Industry Guidance Utilized:
ASTM E2500
ICH
Introducing Change
From Impact Assessments to System Boundaries

C&Q impact assessments were time consuming
Impact assessments gave inconsistent results across sites
Component and system impact assessments are not helpful in the risk-based ASTM E2500 model
Boundary Diagrams

Boundary diagrams were created for all systems at Amgen
Allowed a consistent approach to system and component assessments
No need to perform the C&Q impact assessment process
Set the stage for a risk-based approach to qualification
Change from Impact Categorization to Levels - a Mindset Change

Level 1 - Equipment assets within a system where operation or maintenance activities can affect the critical quality attributes, the critical aspects, or the critical process parameters of the product the system delivers.
Level 2 - Equipment assets within a system where operation or maintenance activities can have an adverse business impact.
Level 3 - Equipment assets not included in the definitions of Level 1 or 2.
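These definitions reduce to a simple decision rule. A minimal sketch in Python of how an asset inside a system boundary might be classified; the attribute names are hypothetical, since the slides do not define a data model:

```python
# Minimal sketch of the Level 1/2/3 decision rule described above.
# Attribute names are hypothetical; the slides do not define a data model.

from dataclasses import dataclass

@dataclass
class EquipmentAsset:
    name: str
    affects_cqa_ca_or_cpp: bool    # operation/maintenance can affect CQAs, critical aspects, or CPPs
    adverse_business_impact: bool  # operation/maintenance can have an adverse business impact

def classify_level(asset: EquipmentAsset) -> int:
    """Return 1, 2, or 3 per the level definitions above."""
    if asset.affects_cqa_ca_or_cpp:
        return 1
    if asset.adverse_business_impact:
        return 2
    return 3

# Example: a product-contact pump vs. a cabinet cooling fan in the same system boundary.
print(classify_level(EquipmentAsset("Transfer pump", True, True)))          # -> 1
print(classify_level(EquipmentAsset("Cabinet cooling fan", False, False)))  # -> 3
```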
Highlights of the Program
The ASTM E2500 Standard Guide Provides a Framework for a New Approach to Commissioning and Qualification

A science- and risk-based approach to assure that GMP equipment and systems:
Are fit for use
Perform satisfactorily

Amgen interpretation and application of:
ASTM E2500
ICH Q9
EU Vol. 4 Annex 15
FDA Process Validation Guidance
ISPE Baseline Guide Vol. 12: Verification (Draft)
Amgen Document Hierarchy plays a key role in change

(Diagram) Document hierarchy: Quality Manual, GMP Quality Policy, GMP Operating Standards, Specifications, Procedures, Records

Policy:
Governing principle/position that mandates or restricts action.

Standard:
Operational principles that are typically translated into other required/controlled documents (SOPs) for how-to/execution information.

Procedure:
Instructions for performing tasks, activities, or practices that includes the identification of roles and responsibilities.
Using the Document Hierarchy

(Diagram) Operating Standards: Commercial Operating Standard for Facility Design; Commercial Operating Standard for Validation of Level 1 Systems
Supporting procedures: GEP SOPs, Automation SOP, Site Validation SOPs, Quality Risk Assessment

Integration of Engineering and Quality can be achieved through the Document Hierarchy
GEP Framework

(Diagram) Framework elements: Design Review; Engineering Change Management; Automation Project Delivery; Quality requirements checks at the concept design and detail design stages; GEP review at the concept and detail design stage; Commissioning Planning and Execution.
Sample deliverables: Commissioning Plan / template guidance; Commissioning Specifications; sample commissioning reports (HVAC system, pipe system, process equipment, Security System, BMA/BAS Facility and Automation, Facility System Commissioning); Summary Reports.
GEP Framework

(Diagram) Engineering Quality Systems elements: Vendor Data Requirements; Vendor Assessment; CMMS Data; Instrument Calibration; Receipt Verification; Engineering Walk Down and Punch Listing; PDI / FAT Testing; User Requirements; Construction Inspection; Instrument Check Out; Installation Verification.
Validation, Qualification, and Commissioning Processes

(Diagram) System lifecycle: User Requirements, Design, and Risk Assessment lead into Installation Inspection and Testing, Operational Testing, and Development Testing (Commissioning); IQ and OQ (Qualification); SIP / PQ, Cleaning Validation, Process Validation, and Performance Monitoring; followed by Implement for Use, Change Control, and Decommissioning (Validation).

Verification: activities within any of these processes
Qualification Process and Quality Oversight

(Diagram) Risk Assessment and Controls and Quality Oversight apply across the process steps:
Requirements
Specifications and Design - risk assessment controls identified
Build / Construct
Verification - strategy, scope, and acceptance criteria for risk controls
Accept and Release - risk assessment controls confirmed
Quality oversight includes approvals of program standards, GEPs, and Qualification SOPs.
Quality Oversight model at full maturity of the Engineering, Automation, and Quality processes

(Diagram) Phases: Plan - Design - Build - Test
Deliverables: User Requirements, Quality Risk Assessments, Commissioning Plan, Validation Plan, Design Reviews, FAT, Receipt Verification and Installation Verification, Commissioning Tests (automation checkout, performance tests, SAT), Development Testing, Commissioning Summary Report, Qualification Performance Testing / PQ
Quality oversight: Quality pre- and post-approval, Quality review, Quality approval
Quality Oversight model at early stages of maturity of the Engineering, Automation, and Quality processes

(Diagram) Phases: Plan - Design - Build - Test
Deliverables: User Requirements, Quality Risk Assessments, Commissioning Plan, Validation Plan, Design Reviews, FAT, Receipt Verification and Installation Verification, Commissioning Tests (automation checkout, performance tests, SAT), Development Testing, Commissioning Summary Report, Qualification Performance Testing / PQ
Quality oversight: Quality pre- and post-approval, Quality review, Quality approval
Amgen Relational Model for Qualification

(Diagram) SMEs (Process, Product, System, Regulatory, Quality, Vendors) contribute to the User Requirements.
Traceability runs from User Requirements through Risk Assessment, Specifications, Commissioning / Testing, and Performance Qualification to the System Qualification report.
Supporting elements: Vendor Assessment, Design Review, Vendor Documents, Good Engineering Practices, Change Management.
These feed into ongoing maintenance management systems.
Maturing the Program

Process Maturity
(Diagram) The Engineering, Quality, and Qualification processes mature toward integrated processes as overall program maturity increases.

Overall Maturity
Early Adaptation (we are here):
Some or few of the program elements incorporated
Quality pre- and post-approval of most qualification documents
Risk-based approach not fully developed or well understood
Use and overlap of existing processes (SOPs, etc.)
Validation and engineering integration not instituted

Transitional (and here):
Many but not all program elements incorporated
Quality pre- and post-approval of most qualification documents
Risk-based approach not fully utilized and consistent
Reduction and movement away from existing processes (SOPs, etc.)
Validation and engineering integration in progress

Mature:
Most or all program elements properly implemented
Quality approvals focused on critical aspects as defined in mature program
Risk-based approach fully utilized and consistent
Existing processes (SOPs, etc.) eliminated or completely aligned with program
Validation and engineering integration fully realized
Assessing Maturity

Program items assessed against the maturity stages (Early Adaptation, Transitional Stage, Mature Stage):
User requirements
Planning - Validation and Commissioning
Supplier Assessment / Use of Supplier documentation
Automation Requirements and Specifications and Testing
Engineering Standards & Specifications
Risk Assessment
Design Review
Commissioning
Calibration
Confirmation of Qualification
Quality Oversight
Maturity - Commissioning

Early Adaptation (we are here):
No predefined commissioning strategy
Company documents developed independently of vendor documents
Limited use of SMEs

Transitional (and here):
Defined commissioning strategy
Partial integration of testing, multiple test documents
Some duplication of testing
Some use of SMEs

Mature:
Defined integrated test strategy
Acceptance criteria for Quality and Engineering identified
Extensive reliance and accountability on SMEs
Commissioning considered as part of qualification
Assessing Maturity
Maturity stage of the program as defined by program item: User requirements

Early Adaptation:
New Systems - URS not implemented or inconsistent; site SOP and/or template utilized; use of other docs such as R/D; project user requirements only
Legacy Systems - Guidance in OS-000034; use of change control records; update existing R/D docs or other spec documentation

Transitional Stage:
New Systems - URS implemented but inconsistent; site SOP and/or template utilized; use of other docs such as R/D; GEP SOP referenced
Legacy Systems - Guidance in OS-000034; use of change control records; update existing R/D docs

Mature Stage:
New Systems - URS consistently implemented; approved SOP and template; access to library of existing URSs
Legacy Systems - Guidance in OS-000034 and GEP SOP enforced; approved SOP and template; access to library of existing URSs
Lessons Learned - Wins

One site inspection by an agency - the process was not challenged
Resulted in Quality focusing on critical aspects
Different groups understand and appreciate roles and interactions more clearly
Faster implementations and more robust systems have been recognized
System-based vs. component-based thinking
Lessons Learned - Challenges

Expectations around documentation practices
GEPs not well established and understood
Comfort zones are challenged and resistance varies
Use escalation process for resistance
Maturing the Process

Engineering is intensifying its practices and delivery models to provide more robust systems
Alignment of procedures and practices between groups (Engineering, Automation, Validation)
Understanding and accountability of the SME role
Legacy systems applicability
Intensify training for risk assessments
Individual site assessments to ensure maturity progression
Benefits Beyond Large Capital Projects

(Diagram) Project scope: Plan - Design - Build - Test - Operate & Maintain
Adopted practices - SMEs (Process, Product, System, Regulatory, Quality, Vendors), User Requirements, and the Quality Risk Assessment - provide additional value beyond the project.
Additional benefits: Change Control Assessments; Failure Event Impact Assessments; Preventive Maintenance; Others?
Thank You!

Ronald Brunelle - rbrunell@amgen.com

Workshop Warm-up
Review of Key Deliverables

Nick Haycocks - Quality Assurance Senior Specialist
Review of Key Deliverables

User Requirements
Risk Assessment
Qualification Summary Report
User Requirements - Purpose
Serves as:
The guiding document for the engineering design, commissioning, and qualification process for a system.
A focal point for documenting and communicating requirements.
It is maintained throughout the lifecycle of the system.
User Requirements Structure
Sections include:
Performance Requirements
Design Requirements
Operational Requirements
Automation Requirements
Miscellaneous Requirements
User Requirements Structure

ID No | Category and Parameter | Requirement | Type | Source
1 | Performance | (This is a good requirement, for a granulator) | |
1.1 | Process | The system must be capable of a chopper speed range of 1500 to 3000 rpm ±10%. | Quality | SPP
1.2 | General | (This is an appropriate requirement) The system must include a scale capable of measuring the 100 liter tank and contents with an accuracy of ±0.1 kg. | Quality | SPP
2 | Automation | | |
2.1 | General | (This requirement is inappropriate) The control system must be operable in automatic or manual modes; manual will allow stepping through the control sequence for each recipe. Why: standard functionality should not be specified within the URS; this requirement should be defined in the detailed design documents. | Business | Quality / QRAES
3 | Etc. | | |
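An acceptance criterion such as requirement 1.1 can be checked mechanically during commissioning or OQ. A minimal sketch in Python, assuming the ±10% tolerance applies to the speed setpoint; names and values are hypothetical, not an Amgen test script:

```python
# Minimal sketch: checking a measured chopper speed against the URS requirement
# 1.1 acceptance criterion of 1500-3000 rpm with an assumed +/-10% tolerance
# on the setpoint. Values are illustrative only.

REQUIREMENT_ID = "1.1"
SPEED_RANGE_RPM = (1500.0, 3000.0)  # required operating range from the URS
TOLERANCE = 0.10                    # assumed +/-10% allowed deviation from setpoint


def verify_chopper_speed(setpoint_rpm: float, measured_rpm: float) -> bool:
    """Return True if the measured speed meets the acceptance criterion."""
    low, high = SPEED_RANGE_RPM
    in_range = low <= setpoint_rpm <= high
    within_tolerance = abs(measured_rpm - setpoint_rpm) <= TOLERANCE * setpoint_rpm
    return in_range and within_tolerance


if __name__ == "__main__":
    # Example verification points at the low, middle, and high end of the range.
    for setpoint, measured in [(1500, 1480), (2250, 2300), (3000, 3400)]:
        result = "PASS" if verify_chopper_speed(setpoint, measured) else "FAIL"
        print(f"Req {REQUIREMENT_ID}: setpoint {setpoint} rpm, measured {measured} rpm -> {result}")
```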
Quality Risk Assessment
(Quality Risk Assessment for Equipment & Automated Systems, QRAES)

A risk assessment method for assessing and evaluating quality risks as they pertain to equipment and automated systems
Utilizes Severity, Likelihood, and Detection
Business risks and safety risks are not part of the Quality Risk Assessment
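The slides name the three factors but not the scoring scales or decision matrix. As a hedged illustration only, an FMEA-style combination of Severity, Likelihood, and Detection into a risk level might look like this:

```python
# Illustrative sketch only: the slides name Severity, Likelihood, and Detection
# as the QRAES factors but do not give the scales or decision matrix.
# This FMEA-style scoring (1-5 scales, risk = S x L x D) is a common convention
# shown purely to make the combination of the three factors concrete.

def risk_level(severity: int, likelihood: int, detection: int) -> str:
    """Classify a hazard from 1-5 scores (5 = worst; detection 5 = least detectable)."""
    for score in (severity, likelihood, detection):
        if not 1 <= score <= 5:
            raise ValueError("scores must be on a 1-5 scale")
    rpn = severity * likelihood * detection  # risk priority number
    if rpn >= 60 or severity == 5:
        return "High"
    if rpn >= 20:
        return "Medium"
    return "Low"


# Example: severe consequence, unlikely, easily detected by an interlock.
print(risk_level(severity=5, likelihood=1, detection=1))  # -> "High" (severity-driven)
print(risk_level(severity=2, likelihood=2, detection=2))  # -> "Low"
```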
Applicability of the Quality Risk Assessment

To identify the equipment design and performance elements that impact or control the system quality attributes
To provide a quality focus within the design, commissioning, and testing activities
To assess the impact of changes to legacy equipment and automated systems
Requisites

System Quality Attributes and System Process Parameters
Draft or Final User Requirement Document or Equivalent
Appropriate SMEs
Preliminary or Final Design Specifications
Quality Risk Assessment

SQAs - A physical, chemical, biological, or microbiological property or characteristic of the system output that should be within an appropriate limit, range, or distribution.
SQAs should always be able to be mapped to: Safety, Identity, Strength, Quality, Purity

SPPs - A process parameter whose variability has an impact on the SQA and therefore should be monitored or controlled to ensure the process produces the desired quality.
SPPs should always be able to be mapped to the SQAs
Compressed Air Example SQAs; SPPs
System Quality Attributes (SQAs)
??
??
??
System Process Parameters (SPPs)
??
??
??
Compressed Air Example SQAs; SPPs
System Quality Attributes (SQAs)
Particle count - viable and non-viable
Hydrocarbon content - particulate and vapor
Moisture content
System Process Parameters (SPPs)
Pressure?
Flow?
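To make the "SPPs map to SQAs" idea concrete, here is a small illustrative sketch in Python; which parameter influences which attribute is assumed for the example and is not stated in the slides:

```python
# Illustrative traceability sketch for the compressed air example above.
# The SPP-to-SQA assignments are assumptions made only for illustration.

SQA_TO_SPPS = {
    "Particle count (viable and non-viable)": ["Flow"],
    "Hydrocarbon content (particulate and vapor)": ["Pressure", "Flow"],
    "Moisture content": ["Pressure"],
}

def spp_to_sqas(mapping):
    """Invert the mapping so each SPP lists the SQAs it must be controlled for."""
    inverted = {}
    for sqa, spps in mapping.items():
        for spp in spps:
            inverted.setdefault(spp, []).append(sqa)
    return inverted

for spp, sqas in spp_to_sqas(SQA_TO_SPPS).items():
    print(f"{spp} -> {', '.join(sqas)}")
```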
Quality Risk Assessment (QRAES)

Columns: Reference No. | SQA | Potential hazard to SQA | Effect / potential consequence of the hazard (what is the result if the risk occurred?) | Source or cause of the hazard (how can the hazard occur?) | Severity | Design features to control the likelihood of occurrence (instrumentation, alarms, and interlocks) | Occurrence | Procedures required to control the hazard likelihood of occurrence (SOPs, training manuals) | Detection mechanisms that would detect hazard occurrence | Activities needed to verify the risk controls are in place working as specified (development testing, IV, OV) | Impacted systems | Risk level | Comments

Example row 1:
Reference No.: 1
SQA: Purity
Potential hazard to SQA: Particulate contamination (metallic)
Effect: Potential contamination of the system product
Source or cause: Breakdown of the product contact materials of construction (metallic) due to the product, excipients, or cleaning materials
Design features: Specification of appropriate materials
Procedures: PMs will include the requirement to inspect the equipment during maintenance and report any degradation or corrosion found
Detection: Compliance with the specifications for the product or ingredient contact materials will be confirmed through installation verification (IV)
Verification activities: Installation verification (IV) could not be completed if there are deviations from the specifications that cannot be reviewed and accepted through engineering change management
Risk level: N/A
Comments: Consider rouge

Example row 2:
Reference No.: 2
SQA / hazard / effect / source: N/A
Design features: The specification includes the requirement to clean and passivate the system prior to ...
Verification activities: Operational verification (OV) will confirm that the system was cleaned and passivated prior to ...
Detection: The report from maintenance of any damage, degradation, or corrosion found
Risk level: N/A
Qualification Summary Report

Confirms that identified risk controls were tested and are working properly
Confirms that commissioning has been successfully completed
Provides traceability of testing activities to the QRAES, verification documentation, and requirements / specifications
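A minimal sketch of what such a traceability check could look like, with hypothetical document identifiers (not an Amgen tool): every risk control from the QRAES should map to at least one passing verification record before the summary report is approved.

```python
# Minimal sketch (hypothetical identifiers): confirm that every risk control
# identified in the QRAES is covered by at least one executed, passing
# verification activity before the qualification summary report is approved.

qraes_controls = {
    "QRAES-1": "Specification of appropriate product contact materials",
    "QRAES-2": "Cleaning and passivation prior to use",
}

verification_records = [
    {"doc": "IV-001", "covers": ["QRAES-1"], "status": "Pass"},
    {"doc": "OV-001", "covers": ["QRAES-2"], "status": "Pass"},
]

def untraced_controls(controls, records):
    """Return control IDs with no passing verification record."""
    covered = {c for r in records if r["status"] == "Pass" for c in r["covers"]}
    return sorted(set(controls) - covered)

missing = untraced_controls(qraes_controls, verification_records)
print("Ready for summary report" if not missing else f"Untraced controls: {missing}")
```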
Example - Summary Report
TABLE OF CONTENTS
1. PURPOSE
2. SCOPE
3. SYSTEM DESCRIPTION
4. APPROACH
5. REFERENCES
6. RESPONSIBILITIES
6.1 Operations
6.2 Validation
6.3 Quality
7. System Verification Summary
7.1 System Design Documentation
7.2 System Engineering Management Review
7.3 Quality Risk Assessment for Validation Traceability Verification
8. DISCUSSION
9. CONCLUSION
Thank You!

Nick Haycocks
haycocks@amgen.com

Computer Systems
Developing a Risk Based Approach

Bob Buhlmann - Director, Corporate Quality Assurance, Amgen, Inc.
Agenda
GAMP
Risk Management Process
Functional Risk Assessment
Introduce Data Integrity Assessments
Why
What
Do NOT
When SDLC
Uses and Benefits
Summary

Data Integrity Assessment

The concept for delivering a risk-based approach to computer system validation
How to identify which data and records are important
A shift of Quality's focus toward the data of a system and less on the functional testing of the system
GAMP Risk Management Process

(Flowchart)
System GxP determination
What is the overall system impact?
Detailed assessment required?
If yes: identify functional risks; assess functional risks; identify required controls
If no: proceed directly to the control steps
Implement required controls
Verify controls
Review risk / monitor controls
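As a hedged illustration of the decision flow above (the threshold for requiring a detailed assessment and the example risks are assumptions, not GAMP or Amgen definitions):

```python
# Illustrative sketch of the GAMP-style risk management flow shown above.
# The decision rule for when a detailed assessment is required, and the
# example risks/controls, are assumptions for illustration only.

def gamp_risk_management(system_name: str, is_gxp: bool, overall_impact: str) -> list:
    """Walk the flow: GxP determination -> impact -> optional detailed assessment -> controls."""
    if not is_gxp:
        print(f"{system_name}: not GxP regulated, risk process ends here")
        return []

    controls = ["baseline controls from initial assessment"]
    detailed_assessment_required = overall_impact == "high"   # assumed decision rule

    if detailed_assessment_required:
        functional_risks = ["unauthorized result change", "loss of audit trail"]  # example risks
        for risk in functional_risks:                          # assess each functional risk
            controls.append(f"control for: {risk}")            # identify required controls

    # Implement, verify, then keep under periodic review / monitoring.
    for control in controls:
        print(f"{system_name}: implement and verify -> {control}")
    print(f"{system_name}: review risk / monitor controls periodically")
    return controls


gamp_risk_management("LIMS", is_gxp=True, overall_impact="high")
```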
GAMP Functional Risk Assessment

(Diagram) Product Quality - Data Integrity - Patient Safety

Should be used to identify and manage risks to patient safety, product quality, and data integrity that arise from failure of the function under consideration
Identify functions with impact on patient safety, product quality, and data integrity through the User Requirements Specifications and Functional Specifications
How do you provide for Product Quality and Patient Safety?

(Diagram) Data Integrity supports Product Quality and Patient Safety

The data of computer systems contains the risks that relate to product quality and patient safety
Data Integrity Assessments - Why
The growing need for electronic data:
Globalization
Operational efficiency
Paperless
Companies need to:
Validate systems for intended use
Validate systems to ensure data integrity
Meet regulatory requirements (e.g., Part 11, SOX)
Prepare for FDA's Part 11 inspection initiative
Computer systems and deployments have grown in complexity
Data Integrity Assessments - Why (cont.)
Ensure data integrity and identify controls to prevent:
Inaccurate, incomplete, or missing data
Uncontrolled data modification (e.g., no recorded reason)
Cross-outs of data
Evaluate high-impact computer systems for:
Electronic records identification and documentation
Data flow and audit trails
Data backup/restore/archival
Security and training
Electronic signatures
Critical Quality Parameter = Data (integrity)
Data Integrity Assessments - What
Data systems with high impact to product quality (e.g., lab instruments, LIMS, CDS, MES)
Identify design elements that impact or control the system quality attributes (data) in the areas of:
Data output
Data change/entry
Data security
Data backup
Provide a quality focus within the design, testing, operations, and periodic review activities
Provide a basis for assessing the impact of changes to systems
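A hedged sketch of how those four areas could be captured as an assessment checklist; the example questions are generic data integrity prompts, not the actual Amgen questionnaire:

```python
# Illustrative only: a minimal checklist structure for the four assessment
# areas named above, with generic example questions (not Amgen's questionnaire).

ASSESSMENT_AREAS = {
    "Data Output": [
        "How are results reviewed/approved, and by whom?",
        "Which data fields appear on reports (e.g., the C of A)?",
    ],
    "Data Change/Entry": [
        "Is a reason required and recorded for every data modification?",
        "How would reprocessing or modification be detected (audit trail)?",
    ],
    "Data Security": [
        "Are user roles and access rights defined and periodically reviewed?",
    ],
    "Data Backup": [
        "Are backup, restore, and archival verified and documented?",
    ],
}

for area, questions in ASSESSMENT_AREAS.items():
    print(area)
    for question in questions:
        print(f"  - {question}")
```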
Data Integrity Assessments - Do NOT
Determine the need or level (rigor) of testing
Identify the need to perform a supplier assessment
Identify project-related risks:
Schedule impacts
Resource constraints
Financial-related risks
Safety-related risks (e.g., data center)
Data Integrity Assessments - SDLC

(Diagram) The Data Integrity Assessment sits at the center of the SDLC phases: Plan, Design, Build/Configure, Test, Maintain
Data Integrity Assessments - Uses and Benefits
Determine/define the risk and controls for the system's electronic data:
Defines critical data fields
Identifies data that requires review
Ensure systems remain validated for intended use:
Requirements/design specification completeness
Supports the periodic review process
Provide continual improvement
Supports the change control process
Summary

Reviewed GAMP and the need for data integrity assessments
Data integrity provides assurance of product quality and patient safety
Provides a foundation for conducting periodic reviews
Thank You!
Bob Buhlmann

Technique for Data Integrity Assessment

Question:
If reprocessing is performed, or if data is modified, describe how and when someone would detect that it was performed and by what means it would be detected.

Response:
Reprocessing of data in the system is called reactivation. Reactivation of the data is performed if there is any correction to be made to data (results) that are already approved; this reactivation process and the impact assessment of the reactivation of data entities in LIMS are controlled and governed by procedures.
Data Changes - Example

Columns: # | What if (Risk) | Effect | Controls to Reduce and Detect Risk | Action Required

Area: Testing and Incident Management
What if (Risk): Invalid results are approved by the approver
Effect: Test results - invalid results will be printed on the C of A
Controls to Reduce and Detect Risk:
Procedural control: the C of A review process will catch the error. The batch is reactivated with a reason code, the sample is reactivated with a reason code, and the test is reactivated with a reason code. The test is rejected by the approver with a reason and the batch is reapproved, per SOP-XXXXXX.
The result updates are captured in the audit trail, which is reviewed in the lot review process per SOP-XXXXXX.
System control: results that are out of range appear in purple font and are crossed out in purple, providing visual recognition of out-of-range results.
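The audit trail review described above could be supported by a simple automated check. A hedged sketch follows; the record layout and field names are hypothetical, not the LIMS schema:

```python
# Hedged sketch of an automated check supporting the audit trail review
# described above: flag result updates that lack a recorded reason code.
# Record layout and field names are hypothetical.

audit_trail = [
    {"sample": "S-001", "field": "result", "old": "12.1", "new": "11.8",
     "user": "analyst1", "reason": "Reactivation - transcription error"},
    {"sample": "S-002", "field": "result", "old": "8.4", "new": "7.9",
     "user": "analyst2", "reason": ""},   # missing reason -> should be flagged
]

def entries_missing_reason(entries):
    """Return audit trail entries where a result changed without a reason code."""
    return [e for e in entries if e["field"] == "result" and not e["reason"].strip()]

for entry in entries_missing_reason(audit_trail):
    print(f"Flag for lot review: {entry['sample']} changed by {entry['user']} with no reason recorded")
```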
Technique for Data Integrity Assessment

Question:
Describe how (electronic/hardcopy) results are reviewed/approved and by whom. What data is being reviewed/approved (e.g., Sample ID, lot number, particle size)?

Response:
An authorized user logs samples and generates sample labels and the sample collection work list. Once the sample plans/jobs are logged, the following are printed and sent to the client for sample collection:
Sample labels
Mfg, EM, Water, and Stability collection work lists
Samples are collected, the chain of custody is initiated, and samples are received into the Sample Management area and distributed to the lab.
Data Output - Example

Columns: # | What if (Risk) | Effect | Controls to Reduce and Detect Risk | Action Required

Area: Sample Management
What if (Risk): Sample labels are reprinted
Effect: Label control - multiple sets of labels for the same sample. Labels would be printed and applied to samples that would be tested and, if the results did not pass, the label could be reprinted on a second sample that would then be tested.
Controls to Reduce and Detect Risk:
System control: once the chain of custody is initiated on a sample, custody cannot be re-initiated on the same sample. Once custody is started on a particular sample, it drops off the collection work list. Samples are scanned in using a bar code and no label can be scanned in more than one time. The event log captures how many labels have been printed for each label.
Procedure
