A. | Company Overview | Done | Notes
A.1 | Audit details (address, audit team, supplier representatives) | |
A.2 | Company size, structure and summary of history (number of sites, staff, organizational charts, company history) | |
A.3 | Product/service history (main markets, how many sold, use in healthcare sector) | |
A.4 | Summary of product/service under audit (product literature) | |
A.5 | Product/service development plans | |
A.6 | Tour of facility (to verify housekeeping, general working environment, working conditions) | |
B. | Organization and Quality Management | Done | Notes
B.1 | Management structure (roles, responsibilities) | |
B.2 | Method of assuring quality in product/service (quality system, responsibilities for quality) | |
B.3 | Use of a documented Quality Management System (QMS), e.g., existence of a quality policy and objectives, quality manual, process definitions/procedures, standards | |
B.4 | Maturity of QMS (relevance to product/service under audit) | |
B.5 | Control of QMS documentation (reviews, approvals, distribution, updates) | |
B.6 | Maintenance of QMS documentation (regularly reviewed and updated when appropriate) | |
B.7 | QMS certified to a recognized standard (e.g., ISO 9001) | |
B.8 | Method of checking compliance with QMS (internal audits, management reviews) | |
B.9 | Qualification and suitability of staff | |
B.10 | Independence of auditors, inspectors, testers, reviewers | |
B.11 | Staff training (general, QMS, product/service related, new staff, changes to QMS, regulatory issues, training records) | |
B.12 | Use of sub-contractors (individuals, companies): | |
| Method of selection | |
| Sub-contractor qualifications and training records | |
| Specification of technical and quality requirements in orders placed | |
| Method of accepting product delivered by sub-contractor | |
B.13 | Experience of the validation process (with other customers, previous Supplier Audits, services provided by the supplier, involvement in regulatory inspections) | |
B.14 | Awareness of healthcare regulatory requirements (knowledge of regulations, subscription to publications, attendance at relevant events, involvement in industry groups) | |
B.15 | Continuous improvement programme (use of metrics to evaluate and improve the effectiveness of the QMS) | |
C. | Planning and Product/Project Management | Done | Notes
C.1 | Use of quality and project plans (per project/product, defining activities, process definitions/procedures, responsibilities, timescales) | |
C.2 | Status of planning documentation (reviews, approvals, distribution, maintenance and update) | |
C.3 | Documentation of user/supplier responsibilities | |
C.4 | Use of Validation Plan where supplied by user company | |
C.5 | Project management and monitoring (mechanism, tools, progress reports) | |
C.6 | Accuracy of, and conformance to, planning and management process definitions/procedures | |
C.7 | Use of formal development life cycle | |
C.8 | Evidence of formal contract reviews where applicable | |
D. | Specifications | Done | Notes
D.1 | User Requirements Specifications | |
D.2 | Functional Specifications | |
D.3 | Software Design Specifications | |
D.4 | Hardware Design Specifications | |
D.5 | Relationship between specifications (together forming a complete specification of the system which can be tested objectively) | |
D.6 | Traceability through specifications (e.g., for a given requirement; see the sketch after this table) | |
D.7 | Status of specifications (reviews, approvals, distribution, maintenance and update) | |
D.8 | Accuracy of, and conformance to, relevant process definitions/procedures | |
D.9 | Use and control of design methodologies (CASE tools) | |
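When reviewing item D.6, auditors typically look for a traceability matrix linking each requirement to the specification sections and test scripts that cover it. As an illustration only, the Python sketch below shows one minimal way such a matrix could be recorded and checked; all identifiers (URS-001, FS-2.3, TS-014, and so on) and field names are hypothetical and not taken from any particular supplier's system.

```python
# Minimal sketch of a requirements traceability matrix (illustrative only).
# All identifiers and field names are hypothetical examples.

trace_matrix = {
    "URS-001": {  # user requirement reference
        "functional_spec": ["FS-2.3"],          # Functional Specification sections
        "design_spec": ["SDS-4.1", "SDS-4.2"],  # Software Design Specification sections
        "test_scripts": ["TS-014", "TS-015"],   # test scripts that verify the requirement
    },
}

def untested_requirements(matrix):
    """Return requirement references with no linked test scripts."""
    return [req for req, links in matrix.items() if not links["test_scripts"]]

if __name__ == "__main__":
    print(untested_requirements(trace_matrix))  # [] when every requirement is covered
```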
E. | Implementation | Done | Notes
E.1 | Specification of standards covering the use of programming languages (e.g., naming and coding conventions, commenting rules) | |
E.2 | Standards for software identification and traceability (e.g., for each software item: unique name/reference, version, project/product reference, module description, list of build files, change history, traceability to design document); see the sketch after this table | |
E.3 | Standards for file and directory naming | |
E.4 | Use of build files to compile and link individual software configuration items into a formal release of the software product | |
E.5 | Use and version logging of development tools (e.g., compilers, linkers, debuggers) used for each software configuration item and the formal build and release of the software product | |
E.6 | Evidence of source code reviews prior to formal testing (checking design, adherence to coding standards, logic, redundant code, identification of critical algorithms) | |
E.7 | Independence and qualifications of reviewers | |
E.8 | Source code reviews recorded, indexed, followed up and closed off (with supporting evidence); evidence of timely management action for reviews which remain open | |
E.9 | Listings and other documents used during source code reviews identified, controlled, and retained with review reports | |
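Items E.2 and E.5 describe what a supplier should record for each software item and each formal build. Purely as a hedged illustration of the kind of record involved, the sketch below models an identification record and a build record in Python; every class, field, and value shown is an assumption made for the example rather than a prescribed format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SoftwareItem:
    """Identification record for one software configuration item (E.2) -- illustrative only."""
    name: str                  # unique name/reference
    version: str               # version of the item
    project_ref: str           # project/product reference
    description: str           # module description
    build_files: List[str] = field(default_factory=list)     # build files used
    change_history: List[str] = field(default_factory=list)  # change references applied
    design_doc: str = ""       # traceability back to the controlling design document

@dataclass
class BuildRecord:
    """Formal build/release record including tool versions (E.5) -- illustrative only."""
    release: str
    items: List[SoftwareItem]
    tools: dict = field(default_factory=dict)  # e.g. {"compiler": "gcc 12.2", "linker": "ld 2.40"}

# Hypothetical example entry
item = SoftwareItem(
    name="dose_calc.c",
    version="1.4",
    project_ref="PRJ-042",
    description="Dose calculation module",
    build_files=["Makefile"],
    change_history=["CR-101"],
    design_doc="SDS-4.1",
)
build = BuildRecord(release="2.0", items=[item], tools={"compiler": "gcc 12.2"})
```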
F. | Testing | Done | Notes
F.1 | Explanation of the test strategy employed at each level of development (e.g., module testing, integration testing, system acceptance testing or alpha/beta testing) | |
F.2 | Software Test Specifications | |
F.3 | Hardware Test Specifications | |
F.4 | Integration Test Specifications | |
F.5 | System Acceptance Test Specifications | |
F.6 | Structure and content of each test script (unique reference, unambiguous description of test, acceptance criteria/expected results, cross-reference to controlling specification); see the sketch after this table | |
F.7 | Relationship between test specifications and controlling specifications (demonstrating that the system has been thoroughly tested, with traceability between specifications and tests) | |
F.8 | Evidence that test specifications cover: | |
| Both structural and functional testing | |
| All requirements | |
| Each function of the system | |
| Stress testing (repeat testing under different conditions) | |
| Performance testing (e.g., adequacy of system performance) | |
| Abnormal conditions | |
F.9 | Status of test specifications (reviews, approvals, distribution, maintenance and update) | |
F.10 | Formal testing procedure to execute test specifications (method of recording test results, use of pass/fail, retaining raw data, reviewing test results, progressing and resolving test failures) | |
F.11 | Status of test results and associated review records (indexed, organized, maintained, followed up on failure) | |
F.12 | Involvement of QA function (as witnesses and/or reviewers) | |
F.13 | Independence and qualifications of testers and reviewers | |
F.14 | Accuracy of, and conformance to, relevant test process definitions/procedures | |
F.15 | Control of test software, test data, simulators | |
F.16 | Use of testing tools (documented, controlled, and versions recorded in the test results) | |
F.17 | Traceability of test results back to the appropriate specifications/test scripts | |
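Item F.6 lists the elements every test script should carry. The sketch below is a minimal, hypothetical Python model of one such test script record, including the pass/fail and raw-data fields implied by F.10 and F.17; the references and field names are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class TestScript:
    """One test script record with the elements listed in F.6 -- illustrative only."""
    reference: str          # unique reference
    description: str        # unambiguous description of the test
    expected_result: str    # acceptance criteria / expected result
    spec_ref: str           # cross-reference to the controlling specification
    result: str = ""        # pass/fail recorded at execution (F.10)
    raw_data_ref: str = ""  # pointer to retained raw data (F.10/F.17)

# Hypothetical example entry
script = TestScript(
    reference="TS-014",
    description="Verify an alarm is raised when temperature exceeds the configured limit",
    expected_result="Alarm message displayed and event logged within 5 seconds",
    spec_ref="FS-2.3",
)
```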
G. | Completion and Release | Done | Notes
G.1 | Documented responsibility for release of product, such as certificate of conformity, authorization to ship (including evidence that testing has been accepted with/without reservations) | |
G.2 | Handover of project material in accordance with quality plan/contract (e.g., release notes, hardware, copies of documentation/software) | |
G.3 | Provision of user documentation (user manuals, administration/technical manuals, update notice with each release) | |
G.4 | Records of releases (i.e., which customers have which version of system/software) | |
G.5 | Warranties and guarantees | |
G.6 | Archiving of release (software, build files, supporting tools, documentation) | |
G.7 | Availability of source code and documentation for regulatory inspection (e.g., use of escrow accounts, and which product releases are covered) | |
G.8 | Customer training (summary of courses provided by staff or third parties) | |
G.9 | Accuracy of, and conformance to, release process definitions/procedures | |
H. | Support/Maintenance | Done | Notes
H.1 | Explanation of support services (agreements, scope, procedures, support organization, responsibilities, provision locally/internationally, use of third parties, maintenance of support agreements) | |
H.2 | Duration of guaranteed support (number of versions, minimum periods) | |
H.3 | Provision of Help Desk (levels of service, hours of operation) | |
H.4 | Fault reporting mechanism (logging, analysing, categorizing, resolving, informing, closing, documenting, distribution, notification of other customers with/without support agreement); see the sketch after this table | |
H.5 | Link between fault reporting mechanism and Change Control | |
H.6 | Method of handling customer complaints | |
H.7 | Accuracy of, and conformance to, support process definitions/procedures | |
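Items H.4 and H.5 ask how faults are logged and how the fault log feeds into Change Control. As one possible illustration, assuming nothing about the supplier's actual tooling, the Python sketch below shows a fault report entry that carries a link to a change request; the categories and identifiers are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FaultReport:
    """Fault/complaint log entry (H.4) with a link to Change Control (H.5) -- illustrative only."""
    reference: str                        # unique fault reference
    reported_by: str                      # customer or internal reporter
    category: str                         # e.g. "critical", "major", "minor" (assumed categories)
    description: str
    resolution: str = ""
    status: str = "open"                  # open / resolved / closed
    change_request: Optional[str] = None  # linked change request, if a change is required

# Hypothetical example entry
fault = FaultReport(
    reference="FR-230",
    reported_by="Customer A",
    category="major",
    description="Report export fails for batches larger than 10,000 records",
    change_request="CR-101",  # demonstrates the H.5 link into Change Control
)
```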
I. | Supporting Processes and Activities | Done | Notes
I.1 | Documentation management (covering QMS and product/project documents): | |
| In accordance with QMS/Quality Plan | |
| Following documentation standards | |
| Indexed, organized | |
| Reviews carried out prior to approval and issue | |
| Reviews recorded, indexed, followed up and closed off (with supporting evidence) | |
| Evidence of timely management action for reviews which remain open | |
| Formal approvals recorded, meaning of approvals defined | |
| Distribution controlled | |
| Document history maintained | |
| Removal of superseded/obsolete documents | |
I.2 | Software Configuration Management: | |
| System for identifying, controlling and tracking every version of each software configuration item | |
| System for recording the configuration of each release (i.e., which software configuration items and versions are used) | |
| Identification of the point at which Change Control is applied to each software configuration item | |
| Control of build tools and layered software products, including the introduction of new versions | |
I.3 | Change Control covering software, hardware, documentation (see the sketch after this table): | |
| All change requests formally logged, indexed and assessed | |
| Rejected requests identified as such, reasons documented, signed by those responsible and the originator informed | |
| Changes authorized, documented, tested and approved prior to implementation (except for emergencies) | |
| Emergency process definition/procedure documented covering reviewing, testing, approving, and recording | |
| Impact of each change (on other items and on requirements for re-test) assessed and documented | |
I.4 | Security process definitions/procedures (physical access, logical access to accounts/software, virus controls) | |
I.5 | Backup and recovery process definitions/procedures (secure storage and handling of media, on-site, off-site, recovery procedure exercised) | |
I.6 | Disaster recovery process definition/procedure (tried) and documented evidence of disaster recovery testing | |
I.7 | Control of purchased items bought on behalf of user (e.g., computer hardware, layered software products), including associated packaging, user documentation, warranties | |
I.8 | Accuracy of, and conformance to, relevant process definitions/procedures | |
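Items I.2 and I.3 cover configuration records and change control. The following minimal Python sketch, built entirely on assumed names and identifiers, shows one way a release configuration and its associated change request entries could be recorded so that the audit questions above (which items and versions make up a release, and what impact each change had) can be answered from the records.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChangeRequest:
    """Change Control log entry (I.3) -- illustrative only."""
    reference: str
    description: str
    status: str = "logged"   # logged / assessed / approved / rejected / implemented
    impact: str = ""         # impact on other items and re-test requirements
    approved_by: str = ""

@dataclass
class ReleaseConfiguration:
    """Configuration of one release (I.2): which items and versions are used -- illustrative only."""
    release: str
    items: dict = field(default_factory=dict)              # {"dose_calc.c": "1.4", ...}
    changes: List[ChangeRequest] = field(default_factory=list)

# Hypothetical example entry
cfg = ReleaseConfiguration(
    release="2.0",
    items={"dose_calc.c": "1.4", "report_gen.c": "2.1"},
    changes=[ChangeRequest("CR-101", "Correct rounding in dose calculation",
                           status="implemented",
                           impact="Re-test TS-014 and TS-015")],
)
```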
Comments/Views: ___________________________________________________________________
Vendor: _______________
Address: _______________
Auditor: _______________
Persons Present: _______________