Regulatory Matters

Implementing 21 CFR Part 11 in Analytical Laboratories
Part 4, Data Migration and Long-Term Archiving for Ready Retrieval

by Ludwig Huber and Wolfgang Winter

The fourth installment of the series describes the need for, and how to achieve, long-term data archiving and retrieval of electronic records and explains which types of data should be stored. Recommendations for selecting media and software ensure not only long-term storage, but also ready retrieval for data reprocessing as required by the FDA rule.

Wolfgang Winter is worldwide product manager, data systems, and corresponding author Ludwig Huber is worldwide product marketing manager, HPLC, at Agilent Technologies GmbH, PO Box 1280, D-76337 Waldbronn, Germany, +49 7243 602 209, fax +49 7243 602 501, ludwig_huber@agilent.com, www.agilent.com.
The first three parts of this series described the requirements of the FDA rule on electronic signatures and records. We focused on data security and data integrity, demonstrating how access to the system and to critical system functions could be limited to authorized personnel and how the integrity of the data could be assured at the time of data analysis and evaluation (1-3). We discussed how the creation, modification, and deletion of records are recorded in a computer-generated audit trail. According to a column by Barbara Immel, records and validation issues were among the top problem areas discovered in domestic and international preapproval inspections in 1998 (4).
But 21 CFR Part 11 contains more than
what we have discussed. It also requires
ensuring data integrity through the entire
retention period detailed in the applicable
rules for good laboratory practices (GLP),
good clinical practices (GCP), and current
good manufacturing practices (CGMP).
Regulations in Part 11 refer only to the use of electronic records and signatures, not to which records need to be retained or for how long. Those requirements can be found in previously published regulations known as predicate rules (rules such as the GLPs, GCPs, or CGMPs that mandate what records must be maintained, the contents of those records, whether signatures are required, and how long the records must be retained). Long-term archiving and ready retrieval of records throughout the mandated lifetime is probably the most difficult requirement to implement.
Archiving and Retrieval
Long-term storage and ready retrieval are mentioned in paragraphs 11.10(b) and 11.10(c) of 21 CFR Part 11:
. . . procedures and controls shall include
the following: (b) The ability to generate
accurate and complete copies of records in
both human readable and electronic form
suitable for inspection, review, and copying
by the agency. . . . (c) Protection of records to
enable their accurate and ready retrieval
throughout the records retention period. (5)
Implementation challenge. The difficulty in implementing this part of the rule comes from a combination of three challenges.

• Records must be stored and available in electronic form. Data can be printed, but printed material is not a substitute for an electronic record, and the electronic record must not be deleted. The "typewriter excuse" is now invalid (3,6).

• Records must be stored as complete and accurate copies. A complete copy includes metadata, such as processing parameters and audit trail logs (3). In chromatography, metadata include integration parameters and calibration tables (2). Metadata allow reviewers to obtain the original results from the raw data.

• Records must be readily available throughout the entire retention period. Inspectors want to be able to replay data using the same process the system operator used when the data were initially generated. Although that is usually possible shortly after the analysis was carried out, it is more difficult several years later. The retention period is specified in the predicate rules and local legislation and may be 10 years or more.
FDA reasoning. At the time the proposed rule was prepared, industry comments suggested alternative wording that would make providing electronic copies optional; for example, companies could provide FDA with paper copies only (5). Dissenters to the rule argued that providing FDA with electronic copies was unnecessary, unjustified, and impractical considering the different types of computer systems that might be in use. In the preamble to the rule and at conferences, FDA and its representatives defended their position with three primary reasons why paper printouts are no substitute for electronic records.

• You cannot print on paper everything that's available in electronic form.

• FDA wants to use the same tools to evaluate the data that the operator used to create the data. For example, FDA wants to be able to reintegrate chromatograms using original integration parameters to see whether they make sense.

• FDA wants to take advantage of modern electronic search tools, which are expected to make inspection work more efficient. Without such tools, inspections would take longer to complete, resulting in delays in the approval of new medical products. To operate effectively, the agency must function on the same technological plane as the industries it regulates.
Archiving and replaying data in
laboratories is usually easy as long as the
same computer hardware, operating system
(OS), and application software are still in
use. Processing parameters are stored in the
same folder as raw data. The original raw
data and the processing parameters can be
reloaded, and the data can be reprocessed.
The problem occurs when new application
software, a new operating system, or an
entirely new computer system is loaded, and
the original system is retired.
FDA Guidelines
FDA has stated clearly in an industry guidance and at several conferences that Part 11 regulations extend beyond the retirement of a computer system. For example, in the following passage, FDA refers to the use of computers in clinical trials:
Recognizing that computer products may be
discontinued or supplanted by newer (possibly
incompatible) systems, it is nonetheless vital
that sponsors retain the ability to retrieve and
review the data recorded by the older systems.
This may be achieved by maintaining support
for the older systems or transcribing data to the
newer systems.
. . . FDA expects to be able to reconstruct a
study. This applies not only to the data, but also
how the data were obtained or managed.
Therefore, all versions of application software,
operating systems, and software development
tools involved in processing of data or records
should be available as long as data or records
associated with these versions are required to be
retained. Sponsors may retain these themselves
or may contract for the vendors to retain the
ability to run (but not necessarily support) the
software. Although FDA expects sponsors or
vendors to retain the ability to run older versions
of software, the agency acknowledges that, in
some cases, it will be difficult for sponsors and
vendors to run older computerized systems. (7)
The question is how to achieve that ability to access old records. Many in the industry are concerned that old computer hardware will have to be retained. Section 71 of the Part 11 summary states, "The agency notes that . . . persons would not necessarily have to retain supplanted hardware and software systems provided they implemented conversion capabilities when switching to replacement technologies" (8).
Similarly, Paul Motise, senior staffer and consumer safety officer in CDER's Office of Compliance, said during a conference in Berlin, "The agency did not expect companies to save computer hardware and software for the sole purpose of recreating events. We anticipated that it would be possible to make an accurate and complete copy of those electronic records."
Consequences for laboratories. From those comments, we conclude that records must be kept electronically and be retrievable whether or not the computer system used for original data entry and evaluation is still in use, and that it is the user's responsibility to make sure the data can be retrieved in their original form. That can be achieved by keeping an older system's hardware and software available or by converting data to a new system. Two primary considerations in selecting a method for retrieving data in their original form are the type of storage media selected (because the stored digital information needs to last for 10 years or more) and how records will be read, reevaluated, and printed if the software used to generate the original result is obsolete.
We use chromatographic data as examples.
We approach the problem by dividing the
solution into two parts: the selection of
storage media and considerations for long-
term storage, and the selection of software to
read and reevaluate the data. Before
addressing those considerations, we first
discuss the types of data that need to be
archived and retrievable. We also briefly
discuss the storage requirements for signal
and spectral detectors in chromatography.
Types of Records to Store
FDA's guidance for using computers in
clinical trials describes which types of
information need to be stored and migrated to
new systems. It is important to generate
accurate and complete copies of study data
and collateral information relevant to data
integrity. That information would include, for
example, audit trails and computational
methods used to derive the data. Any data
retrieval software, script, or query logic used
for the purpose of manipulating, querying, or
extracting data for generating reports should
be documented and maintained for the life of
the report. The transcription process must be
validated (7).
Types of data. In analytical measurements,
we typically have three types of data: raw or
original data, processed data, and metadata.
When the definition of raw data for computerized systems was discussed a few years ago, the recommendation was that a user could decide whether the raw data were the original data captured on the computer or the first computer printout. At that time, raw data could be defined as, and stored on, paper. Part 11 has changed that: raw data are created when they are saved for the first time on a durable storage device, typically a computer's hard drive. In chromatography, raw data are usually area slices calculated from predefined time slices and from the intensity of the signal (Figure 1).
From the raw data, the software calculates peak areas and amounts, and the results are processed data. (In long division, 1,000 ÷ 5 = 200: the numbers 1,000 and 5 would be the raw data, the work you had to show on your paper in fourth-grade math class would be the metadata, and 200 would be the processed data.) The parameters used for calculating the processed data from the raw data are metadata. In chromatography, metadata are mainly integration parameters and calibration factors.

DATA MIGRATION is translating data from one format or storage device to another. It is needed when a company adopts a new system that is incompatible with the one it used previously.
The predicate or applicable regulatory rules
define what type of data must be archived.
For example, some rules require archiving
raw data; others do not.
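To make the three data types concrete, here is a minimal sketch, with an invented integration function and threshold, of how processed data are derived from raw area slices using metadata; it is an illustration, not any vendor's actual algorithm.

```python
# Illustrative only: the relationship between raw data, metadata, and
# processed data in chromatography. The integration logic is invented.

def integrate_peak(area_slices, threshold):
    # The threshold (metadata) determines which raw slices belong to the peak.
    return sum(a for a in area_slices if a > threshold)

raw_data = [0.1, 0.3, 5.2, 9.8, 6.1, 0.4]        # raw data: area slices
metadata = {"threshold": 1.0}                     # metadata: integration parameter
processed = integrate_peak(raw_data, **metadata)  # processed data: peak area
print(f"peak area = {processed:.1f}")             # 21.1
```

Change the threshold and the same raw data yield a different peak area, which is exactly why the metadata must be archived along with the raw data.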
Spectral data. UV-visible (UV-vis) diode array detectors in HPLC and mass spectrometers are both spectral detectors that add a third dimension, wavelength or mass, to run time and signal intensity. Spectra are acquired during the run and used to confirm a compound's identity and to check a peak's purity. Those runs can generate a large amount of data. Although storage capacity is no longer much of a problem, we need to be creative in handling all that data in a database.
With a long chromatogram (for example, one hour with a few peaks), acquiring all spectra during the entire run would be a burden because the only spectra needed are those from a peak's elution. We can restrict acquisition to the peaks with appropriate diode array detector (DAD) firmware (a combination of hardware and software with the programming written directly into read-only memory). The benefit of acquiring spectra only during a peak's elution becomes apparent when we look at the storage capacity needed for that example (Figure 2): 14 megabytes of storage are needed for that single chromatogram when all spectra are stored for the entire run, but only 400 kilobytes are needed when we acquire spectra just during a peak's elution. Acquiring spectra only during a peak's elution requires a peak detector in the firmware of the DAD, which cannot be accomplished with computer software alone.

[Figure 2. Storage requirements for signal and spectral detectors with different spectral acquisition modes (UV-vis diode array, one-hour run, 10 peaks): one signal, 290 KB; peak spectra (apex and slopes), 32 KB; all spectra during a peak, 400 KB; all spectra during a run, 14,000 KB]
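The arithmetic behind Figure 2 can be reproduced with a back-of-envelope estimate; the acquisition rate and bytes per spectrum below are assumptions chosen only to land in the right order of magnitude, not instrument specifications.

```python
# Back-of-envelope storage estimate for the two spectral acquisition modes.
# The rate and spectrum size are assumed values, not instrument specifications.

spectrum_bytes = 2 * 190      # assumed: 190 wavelength points, 2 bytes each
rate_hz = 10                  # assumed: 10 spectra per second
run_s = 3600                  # one-hour run
peak_elution_s = 10 * 10      # assumed: 10 peaks eluting for about 10 s each

full_run_kb = spectrum_bytes * rate_hz * run_s / 1024
peaks_only_kb = spectrum_bytes * rate_hz * peak_elution_s / 1024
print(f"all spectra, entire run:   {full_run_kb:,.0f} KB")   # roughly 14,000 KB
print(f"spectra during peaks only: {peaks_only_kb:,.0f} KB") # roughly 400 KB
```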
The type of data stored depends on the
applicable predicate rule. Part 11 requires
storing all data in electronic form if they are
saved on a durable storage device. That
includes the original or raw data, metadata,
and processed data. Motise makes that clear in his statement about raw data storage in chromatography: "GMPs require you to keep all laboratory data for as long as the batch record must be kept and that includes the chromatographic raw data itself" (9).
Storage Media
Electronic records can be stored on a variety
of media, such as computer hard drives,
digital tapes, CD-ROMs, and digital video
disks (DVDs). Selection criteria for the most
appropriate media are the information
technology (IT) environment, existing
practices, storage capacity, anticipated
physical life-span, and most important, the
anticipated lifetime until obsolescence.
When you are determining retention time,
obsolescence is more important than the
physical life-span of storage media. Based
on historical evidence, storage media can be
expected to become obsolete within five
years. Most types of media may have longer
life-spans, but there are no guarantees that
they will be available longer than five years.
The market makes older storage media
obsolete when better new ones are available.
Examples of short life-spans are the 8-inch floppy disk, the 5.25-inch floppy disk, tape cartridges, hard-sectored disks, and seven-track tapes. Not only do the media disappear from the market, but the drives capable of reading the information on those media disappear also.
Damaged data. Another problem with digital media is that it is difficult to determine when the data are damaged. Because of our limited experience with the longevity of digital media and the risk of media and drive obsolescence, records should be copied periodically to new media, either to the same type of media or to the new media the company has selected. The chain of media storage must be unbroken throughout the entire lifetime of the records.
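A refresh-and-verify step of that kind can be sketched as follows; the file names are hypothetical, and the hash comparison is our illustration of how to confirm that a copy is complete and accurate before the old medium is retired.

```python
# Illustrative sketch: copying a record to new media and verifying the copy
# before retiring the old medium. Paths are hypothetical.
import hashlib
import shutil

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def refresh_to_new_media(src, dst):
    shutil.copy2(src, dst)          # copy the record, preserving timestamps
    if sha256(src) != sha256(dst):  # verify the copy is complete and accurate
        raise IOError(f"copy verification failed for {src}")

refresh_to_new_media("archive/run_0001.dat", "new_media/run_0001.dat")
```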
More difficult challenges in the storage and retrieval of digital information lie in the software's ability to make the information available in human-readable form.
Software to Read/Interpret Records
Digital information is recorded as 0s and 1s.
Therefore, that information makes sense
only with software that understands how to
access and convert the sequence of 0s and 1s
into meaningful numbers. We all know
about that from word processors and
presentation programs. To replay an animated presentation requires the original software or selected revisions of that software. If the recipient does not have such software, viewer software is distributed together with the presentation file. Similarly, text documents lack correct headers, footnotes, and sometimes formatting, such as paragraph types, highlighting, and indentation, when reprocessed with something other than the original word processing software. And if we transform a spreadsheet into a table, we need the formulas that related the cells of the table to each other.
The situation is similar with records
acquired from analytical instruments such as
chromatographs and spectrometers. The
software from chromatographic data systems
integrates peaks and calculates amounts of
analyte by comparing the results of the
unknown sample with known standards. If
you want to obtain the same results years
after an analysis was done, you still need the
same software. Before Part 11 was issued, a
common practice was to save the information
in a more generic file format like the
analytical data interchange (ANDI) format (10,11). Analysts could reload, display, and print chromatograms and reports, together with information on the operator's name, analytical methods, and calibration data, across data systems from different vendors. The disadvantage of that approach is that it did not allow replaying the data on a different system to yield the same result.

[Figure 1. Raw data, metadata, and processed data in chromatography: raw data (area slices) are converted into processed data (peak area, amount) using metadata (processing parameters, integration parameters, calibration tables).]
That disadvantage makes the practice of saving information in a generic file format unacceptable for the current interpretation of Part 11. Inspectors want to be able to reprocess information and generate results in the same way as the original operator. That means we need to store and retrieve raw data, processing parameters, and other transformations such as method-specific calculations, calibration tables, and recalibration history data. As long as the same data system software is used, that poses little problem. Instrument vendors usually take care that the same results are obtained when moving from one revision to another, as long as the software stays on the same platform (the same hardware and software). Ideally, processing parameters are stored in a folder together with raw data and results. Typically, problems arise if the vendor moves to a new platform or if the company selects a new vendor, whether to meet business or technical requirements or because the previous vendor has gone out of business. If the system is retired, the company may no longer have the software to reprocess older data.
Alternative solutions. In theory, several solutions are available for reprocessing older data. In practice, each has severe limitations. Before Part 11, four solutions existed, the first being to print and save digital information as hard copy. With Part 11 in effect, three solutions remain: using data and software standards so that data can be shared between different software packages; keeping old computer hardware, application software, and operating systems; and migrating data to new systems in a supervised, controlled process.
Data and Software Standards
Using data and software standards means that the user of a system generates raw data and analytical results using a specific set of processing parameters. The data format is generic, agreed to by different vendors, and is therefore interchangeable. Some attempts
at standardizing processing parameters have
been made through the Analytical
Instrument Association (AIA, Alexandria,
VA), which developed the ANDI protocols.
Their efforts are insufficient for compliance with Part 11; the current limitations
include missing spectral functionality, for
example. Now, the American Society for
Testing and Materials (ASTM, West
Conshohocken, PA) committee is working
on further definitions, but no results are in
sight.
Standard functions. The real difficulty for the
standards approach is finding a common set
of data processing functions, rather than
agreeing on a standard data file format. The standards approach works only if all software products have the same functionality or feature set and probably use the same algorithms (an algorithm is a formula or set of steps for solving a particular problem) for data evaluation. To survive in the market, products (including software) must introduce proprietary functionality. But if ready retrieval requires that all software be able to regenerate the results, then all software must have the same functionality.
To secure and extend their market positions,
vendors will always introduce new
functionalities requested by users to increase
their productivity. That natural behavior
conflicts with the generic standards
approach. Whereas the standards approach
can work for well-established and static
techniques, standards will be difficult to
achieve for the emerging technologies used
when developing new drugs.
Even though instrument vendors intend to work on developing standards, no such generic solution is available now. Without an available generic standard, system portability (using a program on a different computer without modification) and a software vendor's track record for migrating legacy systems (systems inherited from earlier platforms) should be the selection criteria for a new data system.
Designing portable software. The key challenge in using portability as the data retention solution is the technical feasibility of data migration. R.D. McDowall (McDowall Consulting, Bromley, UK), visiting research fellow at the University of Surrey, cautions that although many vendors adhere to the netCDF format, "there are small differences that may prevent a full match between systems and the results they produce" (12). The migration must be carefully considered when moving data between systems from different vendors and when systems are replaced with new ones from the same vendor. The data, when retrieved, must be able to be reanalyzed with the same results. "To do this you may need to have the original software or the software will need to ensure that historical data can be imported and reprocessed to obtain the same results," suggests McDowall (13).
Portable algorithms. By applying state-of-the-art software design methodology, a vendor can prevent most of the difficulties associated with changing software and hardware platforms. Proper encapsulation (combining elements to make a new entity)
of the algorithmic portions of a system from
the user interface and other operating-system-dependent layers of the physical system lessens the dependency on a
particular system environment or operating
system. That method of structuring software
enables portable algorithms that vendors can
use across generations of systems or with
new ones specifically designed for a
different operating system.
Agilent Technologies (then called
Hewlett-Packard) successfully used that
approach when designing the Generic
Integration Engine (GENIE) integration
algorithm almost 20 years ago and for the
algorithms for peak identification,
quantification, and calibration first used in
Agilent ChemStations in 1991. Portable
algorithms allow a software vendor to test
the algorithms separately from the rest of the
system and to test them under a different
operating system. For example, Agilent tested its quantitative algorithms under both Windows (Microsoft, Redmond, WA) and Unix (Bell Labs, Murray Hill, NJ) long before Windows NT became dominant. Using the same algorithms in established data systems like the ChemStations and in the new Cerity networked data systems (Agilent) is a prerequisite for enabling the migration of legacy electronic records. Portable algorithms allow vendors to continue maintenance, providing users with a viable support path for the future. Designing laboratory-specific applications that use a component-based software platform was recently discussed in the literature (14,15).
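A minimal sketch of the encapsulation idea, in modern terms and with invented names that do not reflect Agilent's actual design: the algorithm depends only on its inputs, so any user interface, batch engine, or regression test harness can drive it on any platform.

```python
# Sketch of encapsulating an algorithm behind an OS- and UI-independent
# interface. Class and method names are invented for illustration.
from abc import ABC, abstractmethod

class Integrator(ABC):
    """Pure algorithmic layer: no file formats, no GUI, no OS calls."""
    @abstractmethod
    def integrate(self, area_slices: list, params: dict) -> float: ...

class ThresholdIntegrator(Integrator):
    def integrate(self, area_slices, params):
        # Identical behavior on any platform that runs the language.
        return sum(a for a in area_slices if a > params["threshold"])

# Any frontend (GUI, batch processor, test suite) depends only on the
# Integrator interface, so the algorithm ports across system generations.
print(ThresholdIntegrator().integrate([0.2, 4.5, 7.1], {"threshold": 1.0}))
```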
Revalidation and regression testing. The
described encapsulation approach allows the
test engineering teams employed by vendors
to develop powerful computer-based,
automated regression test suites (software
that tests changes to computer programs to
make sure that the older programming works
with the new changes). Even in the early
stages of a software product's life cycle,
those test suites allow a vendor to perform
extensive software testing on a particular
module.
Known input can be fed into the module
interfaces, and the resulting output can be
compared with predefined test specification
results (Figure 3). Regression test software
can automatically determine whether the
result is within the defined acceptance
limits, and it can flag deviations. Invaluable during system development and testing, the same approach can be applied effectively and consistently to other aspects of system validation, particularly the revalidation or requalification of a system after a revision update supplied by the vendor.
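Reduced to its essentials, such a regression check can be sketched in a few lines; the module under test, the test data, and the acceptance limit below are invented for illustration.

```python
# Minimal automated regression check: feed known input to the module and
# flag any deviation from the prerecorded test specification. All values
# here are invented for illustration.

def module_under_test(area_slices, threshold):
    return sum(a for a in area_slices if a > threshold)

TEST_SPEC = [  # (input slices, threshold, prerecorded result, limit in %)
    ([0.1, 5.2, 9.8, 6.1, 0.2], 1.0, 21.1, 0.1),
]

for slices, threshold, expected, limit_pct in TEST_SPEC:
    result = module_under_test(slices, threshold)
    deviation_pct = abs(result - expected) / expected * 100
    status = "PASS" if deviation_pct <= limit_pct else "FLAG"
    print(f"{status}: result={result:.3f}, expected={expected}, "
          f"deviation={deviation_pct:.4f}%")
```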
System verification tools. For chromatography data systems, our company has implemented a system verification utility in the ChemStations. The utility allows chromatography results to be recorded using freely definable chromatograms and evaluation parameters; those results are stored in a binary, checksum-protected register (the checksum counts the number of bits in the stored data so that the reading program can check whether the same number of bits is retrieved). When operational qualification is necessary after a revision update, the utility reruns the verification test and compares the newly determined results with the prerecorded results. The outcome of that comparison is documented in an appropriate system verification report (Figure 4).

[Figure 4. The system verification test for ChemStations (Agilent Technologies) shows that such tests can include the complete data path, including digital data acquisition using a prerecorded chromatogram stored in the 1100 array detector.]
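A checksum-protected register can be sketched as follows; the file layout and names are assumptions for illustration, not the actual ChemStation format.

```python
# Sketch of a checksum-protected results register. A CRC over the stored
# payload lets the verification step detect corruption of the prerecorded
# results before they are compared with newly determined ones.
import json
import zlib

def write_register(path, results):
    payload = json.dumps(results).encode()
    with open(path, "wb") as f:
        f.write(zlib.crc32(payload).to_bytes(4, "big") + payload)

def read_register(path):
    with open(path, "rb") as f:
        blob = f.read()
    crc, payload = int.from_bytes(blob[:4], "big"), blob[4:]
    if zlib.crc32(payload) != crc:
        raise ValueError("register checksum mismatch: results corrupted")
    return json.loads(payload)

write_register("verify.reg", {"peak_area": 21.1, "amount": 0.52})
print(read_register("verify.reg"))
```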
Keeping Old Hardware in Museums
The second method for data retention and
ready retrieval is to keep a generation of
computer hardware, application software,
and operating systems in some kind of a
computer museum. Old computers run the
original software to access, reevaluate, and
display original results. The only advantage
to this approach is that it makes your
company independent of vendors. It can be a
temporary solution if a specific vendor goes
out of business.
Problems associated with the museum
solution are well known. Computer chips
have a limited lifetime. Integrated circuits
decay because of processes such as metal
migration and dopant diffusion. Obsolete
computers are difficult to keep running at a
reasonable cost for a long period of time.
Even if large companies could afford it,
access would be limited to one or a few sites
in the world, which would prohibit ready
retrieval. So computer museums are an
unreliable solution to data retention and
ready retrieval.
Data Migration to New Systems
Data migration entails making sure either
that new systems can process data generated
on older systems or that data can be
converted to work on a new system.
Typically that approach works well when a
single vendor is involved, the time periods
are short, and the software remains on the
same platform. Vendors usually make sure
that data are backward compatible so data
from a legacy system can be processed on
new systems. That practice (and sometimes
its lack) is also well known from office
programs. For example, newer versions of
word processing software typically can read
formatted text documents written on older versions.

[Figure 3. Regression test tool used to qualify results from Cerity (Agilent Technologies) against prerecorded test specifications]
Compatibility. Vendors play a major role
here. The ideal scenario would be for the
new software product, which can be either
an update of an existing product or a new
software platform, to automatically
guarantee full compatibility with previous
models, either directly or after data
conversion. Full compatibility means that
the new product must have all the
functionality that the old product had. New
functions can be added, but previous
functions must not be removed.
Checking validity. With every software
revision, either of an application or an
operating system, the validity of previously
recorded data files should be checked. That
is best done by using a few sets of raw data
and the associated processing parameters
acquired from real samples. Again, the
software vendor can help. Ideally, validity-
check software should be provided with a
revision that will automatically compare
results generated on the new product with
those from the older one. The validity check
software must also be validated.
In a validity check, results are calculated and compared with previously specified acceptance limits. Specifying such limits is important and can prevent later trouble. For example, the acceptance values for chromatographic peak areas and amounts should be in the range of 0.05 to 0.1%, which is about the best precision of the analytical results. Specifying better accuracy, for example up to seven digits, is unnecessary and could cause problems with slightly different rounding algorithms.
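Such a check can be scripted directly from those acceptance limits; in this sketch the sample results are invented and the 0.1% limit follows the recommendation above.

```python
# Illustrative validity check after a software revision: reprocessed
# results are compared with the originals against a 0.1% relative
# acceptance limit. The sample values are invented.

old_results = {"sample_1": 1523.4, "sample_2": 887.2}  # original software
new_results = {"sample_1": 1523.9, "sample_2": 887.2}  # after the revision
LIMIT_PCT = 0.1  # about the best precision of the analytical results

for sample, old in old_results.items():
    dev_pct = abs(new_results[sample] - old) / old * 100
    print(sample, "OK" if dev_pct <= LIMIT_PCT else "OUT OF LIMIT",
          f"({dev_pct:.3f}%)")
```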
Migration procedures. Although less than ideal, validated migration procedures now seem to be the only viable solution to data retention and ready availability. Vendors can help reduce the burden of the migration process by providing conversion routines and software to check the validity of the conversion on data sets specified by the user. Those functions should be built into products, not developed as an afterthought. Companies purchasing software are advised to include those functions in their user requirements and functional specifications list. The company should also develop test data sets that represent typical samples and use those to test the compatibility of the new software with the older versions. Performing such tests should be part of the change control procedures when changing an existing system, and it should be part of the migration strategy.
A STRUCTURED APPROACH TO SYSTEM MIGRATION

A recent article in BioPharm's sister publication LCGC Europe proposed seven steps for data migration and system retirement (12). The article suggests that companies inventory the existing system and which departments use it, perform a risk assessment, write a system retirement plan including roles and responsibilities, gather information about the current hardware and software, write a system decommissioning and data migration plan, execute the work, and report the work in a retirement report.
The team chartered to define the system migration needs to make well-informed decisions on which data need to be migrated and which do not. Writing the decommissioning and retirement plan and executing the work will be the biggest items on the to-do list.
We propose the following refinements
to the data migration plan.

• Develop a migration policy and strategy for your company.
• Develop an active implementation plan with time schedules and checkpoints.
• Define data and metadata for all system categories.
• Try to reduce the amount of data to be archived. For example, save spectra only during a peak's elution instead of during the entire run. Adapt the calculations and report formats used in the data systems so they produce the results mandated in your standard operating procedures. Prevent results that are irrelevant for laboratory purposes and that create more overhead when migrating to a newer system. The latter requires flexibility in the data system.
• Define the type of data to be retained. For example, define whether raw data must be archived.
• When running the analysis, save the processing parameters in the same directory as the data (both the raw data and the results).
• Validate the proper functioning of the previous step by retrieving data and metadata and reprocessing the analysis.
• Include backward compatibility in the user requirements and functional specifications for future revisions and platforms.
• Select proper storage media for long-term archiving. Adhere to the prescribed storage conditions; for example, when archiving on tape, regular tape retensioning is required.
• Develop and implement a procedure to check the integrity of data at regular intervals. Again, save the processing parameters in the same directory with the data.
• Before you retire a system, make sure that the data can be accurately processed on the new systems. Results should be within the limits specified during the original analysis.
Compliance for Older Data
One question that frequently comes up is what to do with data and metadata that were recorded, or that will be recorded, until all necessary administrative and technical controls required to implement Part 11 have been developed. Part 11 became effective on 20 August 1997. Before that date it does not apply, and records generated by a computer could be stored on paper; Part 11 is not retroactive.
From 20 August 1997 on, all records generated by a computer and stored on a durable storage device must be recorded and archived electronically. Most laboratories did not have, and many still do not have, the procedures and tools in place to comply with Part 11. Some software is unable to electronically store metadata together with the raw and processed data. In its compliance policy guide, FDA makes clear what level of compliance it expects (16). FDA representatives have also made it clear in numerous discussions that they expect administrative controls to be in place, including procedures and policies. Technical controls can take a little longer, but the implementation process should be a best effort. FDA expects an active implementation plan with a time schedule and checkpoints.
Investing time in an effective migration procedure is important because with each migration cycle, the amount of digital information will increase almost exponentially.

Looking Ahead
The final installments of "Implementing 21 CFR Part 11 in Analytical Laboratories" will discuss the importance of appropriate computer control of analytical instruments to ensure compliance with Part 11, followed by a discussion of biometrics. In computer security, biometrics refers to authentication techniques for system access and electronic signatures that rely on measurable physical characteristics that can be automatically checked, such as fingerprints, retinas and irises, voice patterns, facial patterns, and hand measurements.

Acknowledgments
The authors would like to thank Risto Peltonen and Michael Beck of Agilent Technologies in Waldbronn, Germany, for their help in configuring and running the Cerity regression test suites.

References
(1) L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 1, Overview and Requirements," BioPharm 12(11), 28-34 (1999).
(2) W. Winter and L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 2, Security Aspects for Systems and Applications," BioPharm 13(1), 44-50 (2000).
(3) W. Winter and L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 3, Ensuring Data Integrity in Electronic Records," BioPharm 13(3), 45-49 (2000).
(4) B. Immel, "GMP Issues: Step Up to the Responsibility of QA and QC," BioPharm 13(2), 58-59, 70 (2000).
(5) Code of Federal Regulations, Food and Drugs, Title 21, Part 11, Sections 11.10(a) and 11.10(b), "Electronic Records; Electronic Signatures; Controls for Closed Systems" (U.S. Government Printing Office, Washington, DC, 1999). Also Federal Register 62(54), 13429-13466. Available at www.access.gpo.gov/nara/cfr/waisidx_99/21cfr11_99.html.
(6) B. Immel, "GMP Issues: An Electronic Eye Opener," BioPharm 12(6), 60-63 (1999).
(7) Center for Biologics Evaluation and Research, Guidance for Industry: Computerized Systems Used in Clinical Trials (FDA, Washington, DC, April 1999). Also Federal Register 64(89). Available at www.fda.gov/ora/compliance_ref/bimo/ffinalcct.htm.
(8) Code of Federal Regulations, Food and Drugs, Title 21, Part 11, Summary, "Electronic Records; Electronic Signatures; Controls for Closed Systems" (U.S. Government Printing Office, Washington, DC, 1999). Also Federal Register 62(54), 13446. Available at www.fda.gov/ora/compliance_ref/part11/.
(9) P. Motise, "FDA Requirements for Computers in Analytical Laboratories," paper presented at the ECA Conference, Berlin, September 1999. Available at www.labcompliance.com/conferences/august99.htm.
(10) E1947-98, "Standard Specification for Analytical Data Interchange Protocol for Chromatographic Data" (American Society for Testing and Materials, West Conshohocken, PA, 1999). Available at www.astm.org.
(11) E1948-98, "Standard Guide for Analytical Data Interchange Protocol for Chromatographic Data" (American Society for Testing and Materials, West Conshohocken, PA, 1999). Available at www.astm.org.
(12) R.D. McDowall, "Chromatography Data System V: Data Migration and System Retirement," LCGC Europe 13(1), 30-35 (2000).
(13) R.D. McDowall, "Just e-Sign on the Bottom Line?" LCGC Europe 13(2), 79-86 (2000).
(14) T.A. Rooney, "Computers in Chemistry: Chromatography Data Systems Just Got Easier: New Networked Software Delivers Flexibility and Ease-of-Use," Today's Chemist at Work 9(2) (ACS Publications, Washington, DC, 2000), pp. 17-24. Available at http://pubs.acs.org.
(15) L. Doherty, J. Welsh, and W. Winter, "A Networked Data System for Specific Chromatography Applications," Am. Lab. 32(3), 50-58 (2000). Also available as Agilent Technologies (Palo Alto, CA) publication number 5980-0231E.
(16) Compliance Policy Guide: 21 CFR Part 11; Electronic Records, Electronic Signatures (CPG 7153.17) (FDA, Washington, DC, 13 May 1999). Available at www.fda.gov/ora/compliance_ref/cpg/cpggenl/cpg160-850.htm.
Reprinted from BioPharm, Volume 13, Number 6, pages 58-64, June 2000. Agilent Technologies publication number 5980-2324E.