1. Do software programmes which have been developed by outside vendors or academia to support calculations by test facilities using an OECD Test Guideline, and are referenced in OECD Test Guidelines, need to be validated by test facilities?
It is important to note that use by test facilities of the software programmes for these Test Guidelines is voluntary; a test facility is not required to use the programmes to complete a test. However, if a test facility does use such a programme, it must have appropriately scaled validation documentation, including a risk assessment, user requirement specifications and acceptance testing records, and must list the programme in its computerised systems inventory.
Currently, software programmes have been developed for four OECD Test Guidelines:
While these software programmes were not developed by OECD, they are referenced in the Test Guidelines, and can be downloaded for free on the OECD public website. These software programmes are provided as a courtesy of the developers of the test methods, but have not been validated, reviewed or approved by OECD. The developer, and not the OECD, should be described as the supplier of the software. In addition, the maintenance of the software or of the calculation sheets is not guaranteed by OECD.
If a test facility does use one of these products, and if that facility wishes its test results to be compliant with OECD GLP, it must follow the guidance as described in Advisory Document 17 Application of GLP Principles to Computerised Systems.
These software programmes are considered “Commercial Off-The-Shelf (COTS)” products, as described in OECD Advisory Document 17. Therefore, they require appropriate validation, depending on the risk and the complexity of any customisation. If an application (e.g. a spreadsheet) is not complex, it might be sufficient to verify its functions against user requirement specifications. In addition, user requirement specifications should be written for any application that is based on a COTS product. Documentation supplied with a COTS product should be verified by test facility management to ensure it is able to fulfil the user requirement specifications.
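Verifying functions against user requirement specifications can be as simple as re-running documented test cases with known inputs and comparing the output to expected values within a stated tolerance. The sketch below illustrates the idea; the endpoint calculation, test cases and tolerance are purely hypothetical and are not taken from any OECD Test Guideline.

```python
# Minimal sketch (hypothetical) of acceptance testing a spreadsheet-style
# calculation against user requirement specifications (URS). The function,
# test cases and tolerance below are illustrative assumptions only.

def percent_inhibition(control_mean: float, treated_mean: float) -> float:
    """Example endpoint calculation as it might appear in a COTS spreadsheet."""
    return 100.0 * (control_mean - treated_mean) / control_mean

# URS-style acceptance criteria: known inputs must reproduce expected
# outputs within a documented tolerance.
URS_TEST_CASES = [
    # (control_mean, treated_mean, expected_result)
    (50.0, 25.0, 50.0),
    (40.0, 30.0, 25.0),
    (80.0, 80.0, 0.0),
]
TOLERANCE = 0.01  # as documented in the URS

def run_acceptance_tests() -> bool:
    """Return True only if every URS test case passes; the outcome would be
    retained as part of the acceptance testing records."""
    return all(
        abs(percent_inhibition(c, t) - expected) <= TOLERANCE
        for c, t, expected in URS_TEST_CASES
    )
```

The record of such a run (inputs, expected and observed outputs, pass/fail) is the kind of appropriately scaled evidence a non-complex application might need.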
A test facility should validate a COTS software programme to ensure that it meets the facility's needs; the extent of that validation depends on any prior validation performed by the supplier. Where the supplier has performed a validation, documentation of it must be available from the supplier. Documentation must also be available on the validation performed by the test facility itself.
Therefore, if a test facility uses one of the software programmes to support data intended for a regulatory submission or to support a regulatory decision, and the vendor has not conducted a validation (or documentation to support that validation is not available), the test facility would be expected to perform a full validation to ensure that the programme meets its needs, as described in Advisory Document 17. Even if the vendor has conducted a validation and documentation exists, it remains the test facility's responsibility to determine whether that validation is sufficient or whether further validation is necessary. As mentioned above, the extent of the validation would likely depend upon prior validation (and documentation to support such a validation) as well as any adaptation by the test facility (configuration) to meet its individual needs. If the user modifies the software programme (customisation), the test facility would be expected to perform a full validation to ensure that it meets the facility's individual needs, and documentation of that full validation must be available.
Software programmes available for Test Guidelines on OECD's Public Websites:
(Posted on 27 March 2017)
1. Section 2.5 of OECD Advisory Document Number 16 (Guidance on the GLP Requirements for Peer Review of Histopathology) indicates that all correspondence between the sponsor, the representative of the test facility and the peer reviewing pathologist regarding the histopathological evaluation of the slides used for peer review should be retained in the study file. Could the interpretation of this requirement be clarified?
The clarification is as follows: Correspondence refers to any communication that is needed to reconstruct how slides were selected and reviewed. This should include communications regarding the interpretation of any observations (preliminary or final) on adverse or non-adverse effects made during the review.
(Posted on 27 March 2017)
2. Section 2.8 of OECD Advisory Document Number 16 (Guidance on the GLP Requirements for Peer Review of Histopathology) indicates that where the peer reviewing pathologist’s findings were significantly different from the original interpretation of the study pathologist, a description of how differences of interpretation were handled and changes made to the study pathologist’s original interpretation should be discussed in the final report. Does this apply to both retrospective and contemporaneous peer review?
The clarification is as follows: Section 2.8 relates specifically to a retrospective peer review. For a contemporaneous peer review, there is an expectation that all correspondence (letters, e-mail etc.) relating to differences in the interpretation (preliminary or final) of slides between the original pathologist and the peer reviewing pathologist which may impact on the conclusions of the study (e.g. NOEL/NOAEL) are to be retained in the study file.
(Posted on 27 March 2017)
The OECD Principles of Good Laboratory Practice (GLP) provide for errors in the final report to be corrected and omissions to be addressed by issuing a study report amendment. However, it would not be appropriate to use a study report amendment to facilitate the reanalysis of data or to add new data to a final report, except under exceptional circumstances.
Exceptional circumstances will include requests from receiving authorities to reopen a GLP study. Such requests are usually made so that data can be reanalysed. For example, studies may be reopened to reassess statistical analyses or to review histology findings.
Monitoring authorities will usually not allow a study to be reopened if the test facility or study sponsor wants to reanalyse or add data. However, most monitoring authorities will assess each request to reopen a study on a case-by-case basis.
If a GLP study is reopened, any changes to the original text or the addition of new text must be presented in the form of a report amendment. All the original data must be retained in the final report and the reason for reopening the study should be documented in the amendment. If additional work is performed that was not required in the original study plan, it should be covered by a study plan amendment.
As detailed in Section 9, Part 9.1 (5) of The OECD Principles of GLP, reformatting of the final report to comply with the submission requirements of a national registration or regulatory authority does not constitute a correction, addition or amendment to the final report (posted on 21 January 2016).
The early termination of a study may occur prior to, or after, the completion of the experimental phase of the study, but before the data has been assessed or incorporated in a final report. In both situations, a study plan amendment must be produced in order to provide an explanation of why the study was terminated. Some compliance monitoring authorities may expect that the key findings up to the point of termination are summarised and that the summary report is subject to a Quality Assurance (QA) audit (posted on 21 January 2016).
There is no requirement to finalise the validation of all methods that will be used to conduct a GLP study before the initiation of the study. However, there is an expectation that methods are fully validated before the results of the study are considered to be valid (posted on 21 January 2016).
Unless stipulated in national regulations, there is no requirement to perform method validation in compliance with GLP. Since parameters of the validated method are used in the GLP study (for example threshold, linearity, accuracy, precision, stabilities, equipment settings, etc.), data should be accurately recorded and stored in a manner that protects its integrity. Validation data may be required for study reconstruction and, consequently, it should be retained for an appropriate period of time.
The quality assurance statement should clearly identify the study and include all the types of inspections that are relevant to the study (including inspections performed as part of a study phase or phases). Associated information should include the dates the inspections were performed and the dates inspection results were reported to management, the study director and, if applicable, the principal investigator. Some monitoring authorities will require the quality assurance statement to include confirmation that facility audits have been performed.
The statement should confirm that the final report reflects the raw data and some monitoring authorities will require this fact to be clearly stated in the statement.
Verification of the study plan by QA personnel should be documented (see OECD Principles on Good Laboratory Practice, Doc. No. 1). This would also apply to study plan amendments. The GLP Principles do not formally require that these verifications are included in the quality assurance statement but this is often the case (posted on 15 July 2014).
The OECD Principles on GLP (Section II, Par. 2.2.1.f) require QA personnel to sign the statement. The Principles do not restrict this responsibility to specific QA employees such as, for example, the manager of a QA department. However, the procedures for compiling the statement and the responsibility for signing the statement should be described in QA procedures (posted on 15 July 2014).
As is the case for all operative procedures covered by the GLP Principles, the QA programme of inspections and audits should be subject to management verification. What constitutes verification will differ from one monitoring authority to another. In some cases verification will include a requirement for the independent inspection of QA activities. In all cases both QA staff and management should be able to justify the methods used for the conduct of the audit programme (posted on 15 July 2014).
National GLP monitoring authorities may request information relating to the types of QA inspections conducted and the dates they were performed and reported to management. They may also request the names of the QA auditors who performed specific activities so that their training records can be reviewed. However, QA inspection findings will not normally be examined by inspectors as this is likely to have a negative impact on the way in which some QA personnel report findings. Nevertheless, some national monitoring authorities may occasionally require access to the contents of inspection reports in order to verify the adequate functioning of QA or to verify that management has received and acted upon reports from QA concerning problems that are likely to seriously affect the quality or integrity of the facility or a study.
Under no circumstances should QA reports be inspected as an easy way to identify inadequacies within the facility or problems associated with a specific study.
Compliance monitoring authority inspectors will need to verify the effectiveness of QA activities as part of the inspection of QA. In order to do this, it is highly likely that they will routinely review QA procedures and other supporting records (with the exception of the inspection reports). These documents will be used to verify key requirements, including the independence of QA from study-specific activities, that critical study phases are monitored in accordance with the facility’s policies, and that the frequency of audits is sufficient (posted on 15 July 2014).