Bioanalytical methods, based on a variety of physico-chemical and biological techniques such as chromatography, immunoassay and mass spectrometry, must be validated prior to and during use to give confidence in the results generated. Validation is the process used to establish that a quantitative analytical method is suitable for biomedical applications. Bioanalytical method validation includes all of the procedures that demonstrate that a particular method used for quantitative measurement of analytes in a given biological matrix, such as blood, plasma, serum, or urine, is reliable and reproducible for the intended use. The present manuscript focuses on the consistent evaluation of the key bioanalytical validation parameters: accuracy, precision, sensitivity, selectivity, standard curve, limits of quantification, range, recovery and stability. These validation parameters are described, together with an example of validation methodology applied to chromatographic methods used in bioanalysis, taking into account the recent Food and Drug Administration (FDA) and European Medicines Agency (EMA) guidelines.
Keywords: Bioanalytical method validation, Validation parameters, Application, FDA and EMA guidelines
A bioanalytical method is a set of procedures involved in the collection, processing, storage, and analysis of a biological matrix for a chemical compound. Bioanalytical method validation (BMV) is the process used to establish that a quantitative analytical method is suitable for biomedical applications. Reassurances as to the quality of the method and its reliability come from adopting a minimum series of validation experiments and obtaining satisfactory results. Characterization of the stability of analytes in biological samples collected during clinical studies together with that of critical assay reagents, including analyte stock solutions, is recognized as an important component of bioanalytical assay validation. Bioanalytical method validation includes all of the procedures that demonstrate that a particular method used for quantitative measurement of analytes in a given biological matrix, such as blood, plasma, serum, or urine, is reliable and reproducible for the intended use [1].
Validation involves documenting, through the use of specific laboratory investigations, that the performance characteristics of the method are suitable and reliable for the intended analytical applications. The increased number of biological agents used as therapeutics (in the form of recombinant proteins, monoclonal antibodies, vaccines, etc.) has prompted the pharmaceutical industry to review and redefine aspects of the development and validation of bioanalytical methods for the quantification of these therapeutics in biological matrices in support of preclinical and clinical studies.
Bioanalytical method validation employed for the quantitative determination of drugs and their metabolites in biological fluids plays a significant role in the evaluation and interpretation of bioavailability, bioequivalence, pharmacokinetic, and toxicokinetic study data [2]. These studies generally support regulatory filings [3]. The quality of these studies is directly related to the quality of the underlying bioanalytical data. It is therefore important that guiding principles for the validation of these analytical methods be established and disseminated to the pharmaceutical community.
Bioanalytical method validation is vital not only in terms of regulatory submission but also for ensuring generation of high quality data during drug discovery and development. BMV assures that the quantification of analyte in biological fluids is reproducible, reliable and suitable for the application [4].
Method validation is a process that demonstrates that the method will successfully meet or exceed the minimum standards recommended in the Food and Drug Administration (FDA) Guidance [1,5] for accuracy, precision, selectivity, sensitivity, reproducibility, and stability. Chromatographic methods (high-performance liquid chromatography [HPLC] or gas chromatography [GC]) have been widely used for the bioanalysis of small molecules, with liquid chromatography coupled to triple quadrupole mass spectrometry (LC/MS/MS) being the single most commonly used technology [6].
The objective of validation of a bioanalytical procedure is to demonstrate that it is suitable for its intended purpose. The most widely accepted guideline for method validation is the ICH guideline Q2 (R1), which is used in both pharmaceutical and medical science [7]. Other guidelines, which are much more detailed, require more extensive validation, and define strict limits for most of the determined parameters, are focused directly on bioanalysis. They are represented by the “Guideline on Bioanalytical Method Validation” by the EMA [3,8] and the “Guidance for Industry, Bioanalytical Method Validation” by the FDA [1,5]. Additionally, as a matter of discussion in recent years, new parameters are required within the validation process, including matrix effects, carryover and dilution integrity. Detailed study of the stability of analytes under various conditions during method application is an important specific feature of bioanalytical methods [5,9].
The present manuscript highlights the different bioanalytical method validation parameters that could be used for the validation of a newly developed routine analytical method. The manuscript could be used as a guide in therapeutic drug monitoring, bioavailability and bioequivalence studies of existing and new drug candidates.
The reason for validating a bioanalytical procedure is to demonstrate the performance and reliability of a method and hence the confidence that can be placed in the results. In addition, Shah et al. [10] have stated that all bioanalytical methods must be validated if the results are used to support the registration of a new drug or the reformulation of an existing one. It should be noted that the initial validation is only a beginning, as a method should be monitored continually during its application to ensure that it performs as originally validated [11].
1. It is essential to use well-characterized and fully validated bioanalytical methods to yield reliable results that can be satisfactorily interpreted.
2. It is recognized that bioanalytical methods and techniques are constantly undergoing changes and improvements; they are at the cutting edge of the technology.
3. It is also important to emphasize that each bioanalytical technique has its own characteristics, which will vary from analyte to analyte; specific validation criteria may therefore need to be developed for each analyte [12].
4. Moreover, the appropriateness of the technique may also be influenced by the ultimate objective of the study. When sample analysis for a given study is conducted at more than one site, it is necessary to validate the bioanalytical method(s) at each site and provide appropriate validation information for different sites to establish inter-laboratory reliability [13-15].
Bioanalytical method validation is classified into three types:
A. Full validation
B. Partial validation
C. Cross validation
Full validation
Full validation is the establishment of all validation parameters that apply to sample analysis for the bioanalytical method for each analyte [1,15-19]. Full validation is important:
1. When developing and implementing a bioanalytical method for the first time.
2. For a new drug entity.
3. A full validation of the revised assay is important if metabolites are added to an existing assay for quantification [19-21].
Partial validation
Partial validations are modifications of already validated bioanalytical methods that do not necessarily call for full revalidation [15,16,18]. Partial validation can range from as little as one intra-assay accuracy and precision determination to a nearly full validation. Typical bioanalytical method changes that fall into this category include, but are not limited to:
1. Bioanalytical method transfers between laboratories or analysts
2. Change in analytical methodology (e.g., change in detection systems)
3. Change in anticoagulant in harvesting biological fluid
4. Change in matrix within species (e.g., human plasma to human urine)
5. Change in sample processing procedures [21]
6. Change in species within matrix (e.g., rat plasma to mouse plasma)
7. Change in relevant concentration range
8. Changes in instruments and/or software platforms
9. Limited sample volume (e.g., pediatric study)
10. Rare matrices
11. Selectivity demonstration of an analyte in the presence of concomitant medications
12. Selectivity demonstration of an analyte in the presence of specific metabolites [1,17-19]
Cross validation
Cross-validation is a comparison of validation parameters when two or more bioanalytical methods are used to generate data within the same study or across different studies [15,18,22].
1. An example of cross-validation would be a situation where an original validated bioanalytical method serves as the reference and the revised bioanalytical method is the comparator. The comparisons should be done both ways.
a. When sample analyses within a single study are conducted at more than one site or more than one laboratory, cross validation with spiked matrix standards and subject samples should be conducted at each site or laboratory to establish inter laboratory reliability.
b. Cross-validation should also be considered when data generated using different analytical techniques (e.g., LC-MSMS vs. ELISA) in different studies are included in a regulatory submission [1,15,17,21].
In today’s drug development environment, highly sensitive and selective methods are required to quantify drugs in matrices such as blood, plasma, serum, or urine. Chromatographic methods are the most commonly used technology for the bioanalysis of small molecules, and the general terms presented below take into account this type of analytical method.
The FDA Guidance for Industry: Bioanalytical Method Validation (2001) is well accepted as a reference for current validation practice, and a brief description of it is given in the common terminology below [23].
The common terms used in bioanalytical method validation are given as follows; these are available in the FDA guidance and other publications, but are provided here for convenience.
Accuracy
The degree of closeness of the observed concentration to the nominal or known true concentration [14,21,24-26]. It is typically measured as relative error (%RE) [27]. Accuracy is an absolute measurement, and an accurate method depends on several factors such as specificity and precision [11,28]. Accuracy is sometimes termed trueness. Accuracy is determined by replicate analysis of samples containing known amounts of the analyte (i.e., QCs) [29]. Accuracy should be measured using a minimum of five determinations per concentration. A minimum of three concentrations in the range of expected study sample concentrations is recommended. The mean value should be within 15% of the nominal value, except at the LLOQ, where it should not deviate by more than 20%. The deviation of the mean from the nominal value serves as the measure of accuracy [14]. The two most commonly used ways to determine the accuracy or method bias of an analytical method are (i) analysis of control samples spiked with analyte and (ii) comparison of the analytical method with a reference method [7,24].
Accuracy is best reported as percentage bias, which is calculated from the expression [27]:

%Bias = [(Mean observed concentration − Nominal concentration)/Nominal concentration] × 100
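For illustration, this calculation can be scripted directly. The following is a minimal Python sketch using hypothetical QC replicate data and the ±15% (±20% at the LLOQ) limits quoted above; names and values are illustrative, not from the guidance.

def percent_bias(measured, nominal):
    """Mean percentage bias (%RE) of replicate measurements vs. the nominal value."""
    mean_obs = sum(measured) / len(measured)
    return (mean_obs - nominal) / nominal * 100.0

# Five determinations of a QC sample with a nominal value of 50 ng/mL
qc_replicates = [48.2, 51.0, 49.5, 52.3, 50.1]
bias = percent_bias(qc_replicates, 50.0)
print(f"%bias = {bias:+.1f}%  (acceptance: within 15% of nominal, 20% at LLOQ)")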
Precision
The precision of a bioanalytical method is a measure of the random error and is defined as the closeness of agreement between a series of measurements obtained from multiple sampling of the same homogeneous sample under the prescribed conditions [15,26]. It is a measurement of the scatter of the concentrations obtained for replicate samplings of a homogeneous sample, typically expressed as the coefficient of variation (%CV) [27] or relative standard deviation (R.S.D.) of the replicate measurements [27,30].
Precision should be measured using a minimum of five determinations per concentration. A minimum of three concentrations in the range of expected concentrations is recommended. The precision determined at each concentration level should not exceed 15% coefficient of variation (CV), except for the LLOQ, where it should not exceed 20% CV [5,21,25,31]. Precision may be considered at three levels: repeatability, intermediate precision and reproducibility.
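As with accuracy, the %CV computation is straightforward to script. This minimal sketch reuses hypothetical replicate data and the 15%/20% limits from the text.

import statistics

def percent_cv(measured):
    """Coefficient of variation (%CV) of replicate measurements."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100.0

qc_replicates = [48.2, 51.0, 49.5, 52.3, 50.1]  # five determinations per level
cv = percent_cv(qc_replicates)
print(f"%CV = {cv:.1f}%  (acceptance: <=15%, <=20% at LLOQ)")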
Repeatability
Repeatability expresses the analytical variability under the same operating conditions over a short interval of time (within-assay, intra-assay). Repeatability means how the method performs in one lab and on one instrument within a given day. It is precision measured under the best conditions possible (short period, one analyst, etc.).
Intermediate precision
It includes the influence of additional random effects within laboratories, according to the intended use of the procedure, for example, different days, analysts or equipment, etc. (between-assay, inter-assay). Intermediate precision refers to how the method performs, both qualitatively and quantitatively, within one lab, but from instrument to instrument and from day to day [24,32]. It is a precision measure of the within-laboratory variation due to different days, analysts, equipment, etc.
Reproducibility
Reproducibility is the precision between laboratories (collaborative or interlaboratory studies); it is not required for submission, but can be taken into account for standardisation of analytical procedures. It reflects the ability of the method to yield a similar concentration for a sample when measured on different occasions [27]. Reproducibility refers to how the method performs from lab to lab, from day to day, from analyst to analyst, and from instrument to instrument, again in both qualitative and quantitative terms [7,32].
Linearity
The ability of the bioanalytical procedure to obtain test results that are directly proportional to the concentration of analyte in the sample within the range of the standard curve [15,18,24,27]. The concentration range of the calibration curve should at least span those concentrations expected to be measured in the study samples. If the total range cannot be described by a single calibration curve, two calibration ranges can be validated. It should be kept in mind that the accuracy and precision of the method will be negatively affected at the extremes of the range if it is extensively expanded beyond necessity. Correlation coefficients are the most widely used measure of linearity.
Selectivity
The ability of the bioanalytical method to measure and differentiate the analytes in the presence of components that may be expected to be present. These could include metabolites, impurities, degradants, or matrix components [27]. Selectivity is the documented demonstration of the ability of the bioanalytical procedure to discriminate the analyte from interfering components [30,33]. It is usually defined as “the ability of the bioanalytical method to measure unequivocally and to differentiate the analytes in the presence of components, which may be expected to be present” [1,34]. Blank samples of the appropriate biological matrix (plasma, urine, or other matrix) should be obtained from at least six sources. Each blank sample should be tested for interference, and selectivity should be ensured at the lower limit of quantification (LLOQ) [35]. These interferences may arise from the constituents of the biological matrix under study. They may depend on characteristics of the individual under study, be it an animal (age, sex, race, ethnicity, etc.) or a plant (development stage, variety, nature of the soil, etc.), or they could also depend on environmental exposure (climatic conditions such as UV light, temperature and relative humidity) [30]. The current FDA guidance for bioanalytical method validation requires the use of at least six independent sources of matrix to demonstrate method selectivity.
Specificity
Specificity is the ability to assess unequivocally the analyte in the presence of components that may be expected to be present [36-38]. For example, in high-performance liquid chromatography with UV detection (HPLC-UV), a classic chromatographic method, the method is specific if the assigned peak at a given retention time belongs to only one chemical entity; in liquid chromatography with mass spectrometric detection (LC-MS), the detector can measure an analyte selectively even if it is not fully separated from endogenous compounds. Despite the ongoing debate over the two terms, there is broad agreement that specificity/selectivity is the critical basis of each analytical procedure.
Limit of detection (LOD)
The lowest amount of analyte that can be detected but not quantified [24]. The calculation of the LOD is open to misinterpretation, as some bioanalytical laboratories measure the lowest amount of a reference solution that can be detected and others the lowest concentration that can be detected in the biological sample [11]. There is overall agreement that the LOD should represent the smallest detectable amount or concentration of the analyte of interest.
Limit of quantification (LOQ)
The quantitation limit of an individual analytical procedure is the lowest amount of analyte in a sample which can be quantitatively determined with suitable precision and accuracy [24,25,39].
The range of concentrations, including the LLOQ and ULOQ, that can be reliably and reproducibly quantified with suitable accuracy and precision through the use of a concentration-response relationship [25,27,38]. The FDA Bioanalytical Method Validation document defines the lower limit of quantification (LLOQ) and the upper limit of quantification (ULOQ) as follows:
Lower limit of quantification (LLOQ)
The lowest concentration of an analyte in a sample that can be quantitatively determined with an acceptable precision and accuracy [1,19,27,30,34].
Upper limit of quantification (ULOQ)
The highest amount of an analyte in a sample that can be quantitatively determined with an acceptable precision and accuracy [1,19,27,30,34].
Several approaches exist to estimate the lower limit of quantification (LLOQ). The first is based on the well-known signal-to-noise (S/N) ratio: an S/N of 10:1 is considered sufficient to discriminate the analyte from the background noise [11]. The other approaches are based on the standard deviation of the response and the slope. The computation for the LLOQ is:

LLOQ = 10σ/S

where σ is the standard deviation of the response and S is the slope of the calibration curve. Another approach to estimate the LLOQ is to plot the RSD versus concentrations close to the expected LLOQ.
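The 10σ/S estimate is trivial to compute once σ and S are known; the sketch below uses illustrative values and also shows the analogous ICH Q2(R1) estimate for the LOD (3.3σ/S), which is an addition not stated in the text above.

# Minimal sketch of the "standard deviation of the response and slope" approach.
# sigma and slope values are illustrative, not from any real validation.
sigma = 1.8e3    # standard deviation of the response (arbitrary detector units)
slope = 4.5e4    # slope of the calibration curve (response units per ng/mL)

lloq = 10 * sigma / slope
lod = 3.3 * sigma / slope  # analogous ICH Q2(R1) estimate for the LOD
print(f"Estimated LLOQ = {lloq:.2f} ng/mL, LOD = {lod:.2f} ng/mL")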
Standard curve
The standard curve for a bioanalytical procedure is the relationship, within a specified range, between the response (signal, e.g., peak area, peak height, absorbance) and the concentration (quantity) of the analyte in the sample, i.e., the relationship between instrument response and known concentrations of the analyte; it is also called the calibration curve. This standard or calibration curve should preferably be described by a simple monotonic (i.e., strictly increasing or decreasing) response function that gives reliable measurements, i.e., accurate results, as discussed thereafter [30].
A calibration curve should be prepared in the same biological matrix as the samples in the intended study by spiking the matrix with known concentrations of the analyte. A calibration curve should consist of a blank sample (matrix sample processed without internal standard), a zero sample (matrix sample processed with internal standard), and six to eight non-zero samples covering the expected range, including the LLOQ. The lowest standard on the calibration curve should be accepted as the limit of quantification if the analyte response is at least five times the blank response and if the analyte response is identifiable, discrete, and reproducible with a precision of 20% and an accuracy of 80 to 120% [3,14].
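As an illustration of constructing and using such a curve, the sketch below fits an unweighted least-squares line to hypothetical spiked-matrix data and back-calculates a sample concentration. In practice the curve fitting and weighting (e.g., 1/x or 1/x²) are chosen during pre-study validation; all data here are illustrative.

import numpy as np

nominal = np.array([1, 2, 5, 10, 25, 50, 100, 200.0])                   # ng/mL, 8 non-zero levels
response = np.array([0.051, 0.099, 0.26, 0.50, 1.27, 2.48, 5.05, 9.9])  # e.g., peak-area ratio

slope, intercept = np.polyfit(nominal, response, 1)   # unweighted linear fit
r = np.corrcoef(nominal, response)[0, 1]              # correlation coefficient
print(f"y = {slope:.4f}x + {intercept:.4f}, r = {r:.4f}")

def back_calc(resp):
    """Interpolate a concentration from an instrument response."""
    return (resp - intercept) / slope

print(f"Sample with response 0.75 -> {back_calc(0.75):.2f} ng/mL")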
Recovery
The extraction efficiency of an analytical process, reported as a percentage of the known amount of an analyte carried through the sample extraction and processing steps of the method [27]. Recovery pertains to the extraction efficiency of an analytical method within the limits of variability. Recovery of the analyte need not be 100%, but the extent of recovery of an analyte and of the internal standard should be consistent, precise, and reproducible. Recovery experiments should be performed by comparing the analytical results for extracted samples at three concentrations (low, medium, and high) with unextracted standards that represent 100% recovery [3,5,14,15,35,39,42]. It can also be expressed as absolute recovery [43]:

Absolute recovery (%) = (Response of extracted sample/Response of unextracted standard) × 100
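A minimal sketch of the absolute-recovery comparison at the three QC levels, using hypothetical mean peak areas:

# Mean peak areas at low/mid/high QC; unextracted standards represent 100% recovery.
# All values are illustrative.
extracted = {"low": 8.1e3, "mid": 4.1e4, "high": 8.3e4}
unextracted = {"low": 9.0e3, "mid": 4.5e4, "high": 9.1e4}

for level in ("low", "mid", "high"):
    rec = extracted[level] / unextracted[level] * 100.0
    print(f"{level:>4}: recovery = {rec:.1f}%")
# Recovery need not be 100%, but should be consistent across the three levels.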
Stability
The chemical or physical stability of an analyte in a given matrix under specific conditions for given time intervals. The aim of a stability test is to detect any degradation of the analytes of interest during the entire period of sample collection, processing, storage, preparation, and analysis. The conditions under which stability is determined depend largely on the nature of the analyte, the biological matrix, and the anticipated time period of storage (before analysis). The FDA guidelines on bioanalytical method validation, as well as the recent AAPS/FDA white paper, require evaluating analyte stability at different stages. Stability should be confirmed for every step of sample preparation and analysis, as well as for the conditions used for long-term storage [8]. They also include the evaluation of analyte stability in the biological matrix through several freeze-thaw cycles, bench-top stability (i.e., under the conditions of sample preparation), long-term stability at, for example, -20°C or -70°C (i.e., under the storage conditions of the samples) and stability of samples in the auto-sampler [1,44].
Generally, stability should be evaluated at least at two concentration levels, using blank biological matrix-matched samples spiked at a low and a high concentration level. It should be assessed in each matrix and species in which the analyte will be quantified. The stability of the analyte must also be investigated under various conditions: in the standard solutions used to prepare calibration curves, in any biological matrix stored at -20°C and at room temperature prior to analysis, and in the final extract awaiting analysis. There may also be a need to investigate the stability of the analyte between the sample being taken and stored: some compounds are metabolized by esterases in the blood and have very short half-lives; therefore, to stabilize the compound an inhibitor should be added, the effectiveness of which will need to be assessed and validated [11]. Percent stability could be calculated as follows [45]:

Percent stability = (Mean concentration of stability samples/Mean concentration of freshly prepared samples) × 100
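A minimal sketch of this calculation, comparing hypothetical stored QCs against freshly prepared ones:

def percent_stability(stored_mean, fresh_mean):
    """Mean concentration of stability samples as a % of freshly prepared samples."""
    return stored_mean / fresh_mean * 100.0

stored = [47.1, 46.5, 48.0]  # e.g., after three freeze-thaw cycles (ng/mL); illustrative
fresh = [49.8, 50.4, 50.1]   # freshly prepared QCs at the same level (ng/mL)

stab = percent_stability(sum(stored) / len(stored), sum(fresh) / len(fresh))
print(f"Stability = {stab:.1f}% (results should be within 15% of nominal)")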
Stability samples should be compared to freshly made calibrators and/or freshly made QCs. At least three replicates at each of the low and high concentrations should be assessed. Assessments of analyte stability should be conducted in the same matrix as that of the study samples. All stability determinations should use samples prepared from a freshly made stock solution. Conditions used in stability experiments should reflect situations likely to be encountered during actual sample handling and analysis (e.g., short-term, long-term, bench-top, and room-temperature storage; and freeze-thaw cycles). If, during sample analysis for a study, storage conditions change and/or exceed the sample storage conditions evaluated during method validation, stability should be established under the new conditions. Stock solution stability should also be assessed. Stability sample results should be within 15% of nominal concentrations [35].
Short-term stability
The stability of the analyte in biological matrix at ambient temperature should be evaluated. Three aliquots of low and high concentration should be kept for at least 24 hours and then analysed [15,25].
Long-term stability
The stability of the analyte in the matrix should equal or exceed the time period between the date of first sample collection and the date of last sample analysis [15,25,46].
Freeze-thaw stability
During freeze/thaw stability evaluations, the freezing and thawing of stability samples should mimic the intended sample handling conditions to be used during sample analysis. Stability should be assessed for a minimum of three freeze-thaw cycles [15,19].
Bench-Top stability
Bench top stability experiments should be designed and conducted to cover the laboratory handling conditions that are expected for study samples [19].
Stock solution stability
The stability of stock solutions of drug should be evaluated. When the stock solution exists in a different state (solutions vs. solid) or in a different buffer composition (generally the case for macromolecules) from the certified reference standard, the stability data on this stock solution should be generated to justify the duration of stock solution storage stability [19].
Processed sample stability
The stability of processed samples, including the time until completion of analysis, should be determined [19].
Range
The range of an analytical procedure is the interval between the upper and lower concentration (amounts) of analyte in the sample (including these concentrations) for which it has been demonstrated that the analytical procedure has a suitable level of precision, accuracy and linearity [24,47]. The range of a bioanalytical assay is the concentration interval over which an analyte can be measured with acceptable precision and accuracy [11,48].
Robustness
According to ICH guidelines, the robustness of an analytical procedure is the measure of its capacity to remain unaffected by small but deliberate variations in method parameters, and it provides an indication of its reliability during normal usage [24,25,36]. Robustness can be described as the ability to reproduce the (analytical) method in different laboratories or under different circumstances without the occurrence of unexpected differences in the obtained result(s), and a robustness test as an experimental set-up to evaluate the robustness of a method.
Ruggedness
Ruggedness covers the use of different analysts, laboratories, columns, instruments, and sources of reagents, chemicals and solvents. The ruggedness of an analytical method is the degree of reproducibility of test results obtained by the analysis of the same samples under a variety of normal test conditions. The ruggedness of the method can be studied by changing experimental conditions such as [49]:
a. Changing to another column of similar type
b. Different operators in the same laboratory
Specific recommendations for method validation
1. For validation of the bioanalytical method, accuracy and precision should be determined using a minimum of five determinations per concentration level (excluding blank samples). The mean value should be within 15% of the theoretical value. Other methods of assessing accuracy and precision that meet these limits may be equally acceptable (a minimal programmatic check of these limits is sketched after this list).
2. The accuracy and precision with which known concentrations of analyte in biological matrix can be determined should be demonstrated. This can be accomplished by analysis of replicate sets of analyte samples of known concentrations (QC samples) from an equivalent biological matrix.
3. Reported method validation data and the determination of accuracy and precision should include all outliers; however, calculations of accuracy and precision excluding values that are statistically determined as outliers can also be reported.
4. The stability of the analyte in biological matrix at intended storage temperatures should be established.
5. The stability of the analyte in matrix at ambient temperature should be evaluated over a time period equal to the typical sample preparation, sample handling, and analytical run times.
6. Reinjection reproducibility should be evaluated to determine if an analytical run could be reanalyzed in the case of instrument failure.
7. The specificity of the assay methodology should be established using a minimum of six independent sources of the same matrix [13,15].
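The accuracy and precision limits in recommendations 1 and 2 lend themselves to a simple programmatic check. The following minimal Python sketch applies the 15% limit, relaxed to 20% at the LLOQ; the replicate data are hypothetical.

import statistics

def passes_validation(replicates, nominal, is_lloq=False):
    """Check one QC level against the 15% (20% at LLOQ) accuracy/precision limits."""
    limit = 20.0 if is_lloq else 15.0
    mean = statistics.mean(replicates)
    bias = abs(mean - nominal) / nominal * 100.0
    cv = statistics.stdev(replicates) / mean * 100.0
    return bias <= limit and cv <= limit

# Five determinations per concentration level, as recommended
levels = {1.0: [0.85, 1.1, 0.95, 1.15, 0.9],     # LLOQ level
          50.0: [48.2, 51.0, 49.5, 52.3, 50.1]}  # mid-range QC level
for nominal, reps in levels.items():
    ok = passes_validation(reps, nominal, is_lloq=(nominal == 1.0))
    print(f"QC {nominal} ng/mL: {'pass' if ok else 'fail'}")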
Application of validated method to routine drug analysis
In general, biological samples can be analyzed with a single determination without duplicate or replicate analysis if the assay method has acceptable variability as defined by validation data. This is true for procedures where the precision and accuracy variability routinely falls within acceptable tolerance limits.
The following recommendations should be noted in applying a bioanalytical method to routine drug analysis [15]:
1. A matrix-based standard curve should consist of a minimum of six standard points, excluding blanks (either single or replicate), covering the entire range.
2. Response Function: Typically, the same curve fitting, weighting, and goodness of fit determined during pre-study validation should be used for the standard curve within the study. Response function is determined by appropriate statistical tests based on the actual standard points during each run in the validation. Changes in the response function relationship between pre-study validation and routine run validation indicate potential problems.
3. The QC samples should be used to accept or reject the run. These QC samples are matrix spiked with analyte.
4. System suitability: Based on the analyte and technique, a specific SOP (or sample) should be identified to ensure optimum operation of the system used.
5. Any required sample dilutions should use like matrix (e.g., human to human) obviating the need to incorporate actual within-study dilution matrix QC samples [50].
6. Repeat Analysis: It is important to establish an SOP or guideline for repeat analysis and acceptance criteria. This SOP or guideline should explain the reasons for repeating sample analysis. Reasons for repeat analyses could include repeat analysis of clinical or preclinical samples for regulatory purposes, inconsistent replicate analysis, samples outside of the assay range, sample processing errors, equipment failure, poor chromatography, and inconsistent pharmacokinetic data [29,46,51].
7. Sample Data Reintegration: An SOP or guideline for sample data reintegration should be established. This SOP or guideline should explain the reasons for reintegration and how the reintegration is to be performed [13,15,52].
Conclusion
Bioanalysis, and the production of pharmacokinetic, toxicokinetic and metabolic data, plays a fundamental role in pharmaceutical research and in the drug discovery and development process. The data must therefore be produced to the acceptable scientific standards and specifications laid down by the different regulatory agencies across the globe, and bioanalytical methods must be validated to objectively demonstrate their fitness for the intended use. This article highlights the specific recommendations for, and applications of, bioanalytical methods in routine drug analysis for drug discovery and development. It could be used as a guideline in developing a bioanalytical method for routine analysis in different biological matrices, and it provides information relevant to bioavailability, bioequivalence and therapeutic drug monitoring studies.