11.6 Calibration Curve, Linearity, and Sensitivity

Chapter 11


forcing data to a linear function may result in large errors in the measured results. The calibration curve should consist of five to eight points that cover the entire range of expected analyte concentrations in the test samples, that is, from 0 to 200% of the theoretical content. The lowest concentration should be the LLOQ, and the highest concentration should be the upper limit of quantitation (ULOQ). If sample analyte results fall outside the LLOQ–ULOQ range, the sample should be diluted in matrix and a new standard curve prepared in the diluted matrix.
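The calibration fit and back-calculation described above can be sketched in a few lines; the concentrations, responses, and sample value below are hypothetical illustrations, not prescribed values.

```python
import numpy as np

# Hypothetical 6-point calibration covering the expected range
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])          # ng/mL; 0.5 = LLOQ
resp = np.array([0.021, 0.043, 0.210, 0.415, 2.080, 4.150])  # peak area

# Least-squares line: response = slope * conc + intercept
slope, intercept = np.polyfit(conc, resp, 1)

# Coefficient of determination as a first linearity check
pred = slope * conc + intercept
r2 = 1 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)

def back_calculate(response):
    """Convert a measured response into a concentration via the fitted line."""
    return (response - intercept) / slope

sample_conc = back_calculate(1.05)
print(f"slope={slope:.4f}, r2={r2:.5f}, sample={sample_conc:.2f} ng/mL")
```

A sample whose back-calculated concentration falls outside the fitted range would, per the text, be diluted in matrix and re-run against a curve prepared in the diluted matrix.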

The LLOQ is the lowest concentration that can be determined with acceptable accuracy and precision. To define the LLOQ, at least five samples independent of the calibration standards should be used and the CV or confidence interval determined. Conditions for defining the LLOQ include a response at least five times the blank response, an accuracy of 80–120%, and a precision (CV) within 20%. The LLOQ is not the limit of detection, which is the lowest concentration that the method can reliably differentiate from background noise.
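The LLOQ acceptance criteria above translate directly into a simple check; the replicate values and signals below are hypothetical.

```python
import statistics

# Hypothetical LLOQ determination: five independent replicates at the
# candidate LLOQ, plus blank and LLOQ detector responses
nominal = 0.5                                  # ng/mL, candidate LLOQ
measured = [0.52, 0.47, 0.55, 0.44, 0.51]      # back-calculated concentrations
blank_response, lloq_response = 0.004, 0.021   # raw signal (e.g., peak area)

mean = statistics.mean(measured)
cv = 100 * statistics.stdev(measured) / mean   # precision as %CV
accuracy = 100 * mean / nominal                # % of nominal concentration

criteria = {
    "n >= 5 replicates": len(measured) >= 5,
    "response >= 5x blank": lloq_response >= 5 * blank_response,
    "accuracy 80-120%": 80 <= accuracy <= 120,
    "precision CV <= 20%": cv <= 20,
}
print(criteria)
```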

11.7 Selectivity and Specificity

When evaluating a method, a key criterion is the ability of the method to differentiate the analyte from other sample components (contaminants, matrix components, degradation products, etc.). To determine selectivity, the quantitation of analytes in test matrices containing all potential components is compared to the quantitation of analytes in solution alone. The specificity of the assay establishes that the obtained signal is due to the analyte of interest and that there is no interference from other matrix components, impurities, or degradation products. Peak shape, when used in conjunction with diode array, MS, or tandem MS detection, can be used to determine the purity of a peak.6

11.8 Stability

The stability of the analyte in the biological matrix under a variety of conditions pertinent to collection, storage, and analysis should be determined, including stability in stock solutions. First, stability of the analyte over three freeze/thaw cycles at two concentrations is recommended. Second, the stability of three aliquots of sample at room temperature for up to 24 hours, that is, for the period of time the samples would remain at room temperature during the study, should be determined. Third, the stability of samples under the expected storage conditions for a period exceeding the projected length of the study should be determined for three aliquots at two concentrations. Fourth, stock solution and internal standard stability should be determined at room temperature over a period of 24 hours and at the expected storage conditions for the period of the study. Fifth, once the samples have been processed for analysis, their stability during the period of analysis should be determined; this includes stability of the analyte and internal standard under conditions that replicate those of the autosampler during analysis. Stability tests are performed against freshly prepared analyte standards analyzed in the same run. Changes in stability of up to 10% are generally acceptable. If instability of the samples or standards is observed, use of buffers, antioxidants, enzyme inhibitors, and so on may be necessary to preserve the integrity of the analytes.
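The comparison against freshly prepared standards can be sketched as follows; the QC levels and concentrations are hypothetical, and the 10% criterion is the one stated above.

```python
# Hypothetical stability assessment: stressed samples (e.g., after three
# freeze/thaw cycles) compared against freshly prepared standards run in
# the same batch; a change of up to 10% is taken as acceptable.
fresh = {"low QC": 1.02, "high QC": 48.7}      # ng/mL
stressed = {"low QC": 0.96, "high QC": 46.1}   # ng/mL

changes = {
    level: 100 * abs(stressed[level] - value) / value
    for level, value in fresh.items()
}
for level, change in changes.items():
    status = "acceptable" if change <= 10 else "unstable"
    print(f"{level}: {change:.1f}% change -> {status}")
```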

11.9 Aberrant Results and Errors in Analyses

Before beginning an analytical method, the suitability of the system to deliver reliable and repeatable results should be determined. Parameters that can be evaluated and compared to expected values include plate count, tailing, peak resolution, and repeatability (retention time and peak area).

When results are obtained that are outside the acceptable range for the method, the cause of the aberration should be investigated. The investigation should determine systematically whether the aberrant result is due to malfunctioning equipment, an error in sample preparation or analysis, or an error in sample collection. Quality control (QC) standards should be interspersed with samples during a test run; an erroneous result for QC samples might suggest a malfunction in the HPLC system or detector. If the equipment is functioning within previously set specifications, then an investigation of the preparation and analysis of the sample is warranted. A first check should confirm that the calculations used to convert raw data into the final result were correct. In addition, it is recommended to verify that the proper standards, solvents, reagents, and other solutions were used. To determine whether the samples were prepared properly or whether the aberrant result might be due to an equipment malfunction, the samples can be reinjected. Reanalysis of the original sample will determine whether the sample itself is different or was processed incorrectly, that is, improper dilution, incomplete extraction, inadequate resuspension of dried samples, etc. To determine whether an extraction was carried out to completion, a sample can be reextracted. However, if it is found that the sample was not fully extracted, a reevaluation and revalidation of the method should be performed using the modified extraction protocol.
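The system-suitability parameters mentioned at the start of this section can be computed from simple peak measurements. Below is a minimal sketch using the USP half-height plate-count formula and the USP tailing factor at 5% peak height; all peak values are hypothetical.

```python
def plate_count(t_r, w_half):
    """USP plate count from retention time and peak width at half height."""
    return 5.54 * (t_r / w_half) ** 2

def tailing_factor(w_5pct, front_5pct):
    """USP tailing factor: width at 5% height over twice the front half-width."""
    return w_5pct / (2 * front_5pct)

# Hypothetical peak: retention 6.2 min, 0.12 min wide at half height,
# 0.30 min wide at 5% height with 0.13 min of that ahead of the apex
n = plate_count(6.2, 0.12)
t = tailing_factor(0.30, 0.13)
print(f"N = {n:.0f} plates, tailing factor = {t:.2f}")
```

Values computed this way would be compared against the expected ranges established when the method was validated.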

Once new results are obtained, how should the information be reconciled with the initial aberrant result? Two approaches recommended by FDA guidelines are averaging and outlier tests.7 First, averaging can be an appropriate approach, but its use depends on the purpose of the sample, the type of assay being performed, and whether the sample is homogeneous. For HPLC results, peak responses can be averaged from consecutive injections of the same sample, and the average of the peak responses would be considered the response for that sample. Analysis of different portions of the original sample would be done to determine the variability/homogeneity of the original sample; the cause of unusual variations in replicate sampling should be investigated. Averaging can, however, conceal variations in individual test results that might indicate nonhomogeneity of the original sample. Thus, it is inappropriate to use averaged results if the purpose of the analytical test is to determine sample variability.
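Averaging consecutive injections of the same sample, with a guard against unusual variation, might look like the sketch below; the peak areas and the 2% repeatability limit are hypothetical assumptions, not values from the text.

```python
import statistics

# Hypothetical peak areas from consecutive injections of the same sample
areas = [10450, 10520, 10390]

mean_area = statistics.mean(areas)
cv = 100 * statistics.stdev(areas) / mean_area

# Average only when the replicates agree; unusual variation between
# injections should be investigated rather than averaged away.
if cv <= 2.0:   # assumed repeatability limit for replicate injections
    response = mean_area
    print(f"reported response: {response:.0f} (injection CV {cv:.2f}%)")
else:
    raise ValueError(f"injection CV {cv:.2f}% too high; investigate first")
```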



Second, values that are significantly different from others in a series of replicate measurements may be statistical outliers. A deviation in response may be due to an error in the analytical method or to inherent variability in the tested sample. To determine the relevance of extreme results, a statistical procedure for identifying outlier values may be used. If a result is determined to be a statistical outlier, the cause of the aberrant response should be investigated. As with averaging, if the purpose of the analysis is to determine the homogeneity of a sample, an outlier test should not be used.
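The text does not name a specific outlier procedure; one common choice is Grubbs' test, sketched below with standard tabulated two-sided critical values for α = 0.05 and hypothetical replicate data.

```python
import math

# Two-sided Grubbs' critical values at alpha = 0.05 (standard tables)
GRUBBS_CRIT_05 = {3: 1.155, 4: 1.481, 5: 1.715, 6: 1.887, 7: 2.020}

def grubbs_test(values):
    """Return (suspect value, G statistic, is_outlier) for one extreme value."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))
    suspect = max(values, key=lambda x: abs(x - mean))
    g = abs(suspect - mean) / sd
    return suspect, g, g > GRUBBS_CRIT_05[n]

# Hypothetical replicate measurements with one extreme result
replicates = [9.8, 10.1, 10.0, 9.9, 13.5]
value, g, outlier = grubbs_test(replicates)
print(f"suspect {value}, G = {g:.3f}, statistical outlier: {outlier}")
```

As the text notes, flagging a value statistically is only the first step; the cause of the aberrant response still has to be investigated before the value is excluded.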

11.10 Quantitative Western Blot Analysis and ELISA

An objective of proteomics is to quantitate relative changes in the abundance of a protein in response to changes in the biological system. Currently, the majority of proteomic studies utilize mass spectrometry-based quantitation of peptides, usually generated by a tryptic digest; however, the final output is expressed as a change in the abundance of intact proteins. This is the opposite of intact protein profiling, such as two-dimensional electrophoresis (2DE) with a difference electrophoresis feature, in which the abundance of intact proteins is measured and the proteins are identified in subsequent, postquantitation steps. Because of the inherent properties of these two approaches, orthogonal validation becomes very important. The most commonly used method for such orthogonal validation is quantitative Western blot analysis and, much less commonly, ELISA. It should be pointed out that these two assays, after considering all limitations, can validate the quantity of proteins but not the change in quantity or function. The ability to measure function is limited by the small number of functional assays for proteins that lack enzymatic properties producing a readout, such as a color change, that can be measured easily and precisely.

Quantitative Western blot analysis is based on the specificity and sensitivity of antigen–antibody interactions, which is usually not problematic when good polyclonal antibodies are available. One of the major limitations of this method is the narrow dynamic range of concentrations and rapid signal saturation. Figure 11.1 shows an example of a quantitative analysis of gelsolin in human plasma. The lowest amount of plasma protein used in this experiment was 15 ng/lane, and the signal for only the full-length protein is shown. When more than 63 ng/lane was loaded, other forms of this protein circulating in the plasma were detected: aggregates (higher molecular weight) and processed forms (lower molecular weight); however, the signal for the full-length form saturated quickly. Overall, a lack of linearity was observed regardless of whether one form or multiple forms were considered for quantitation. This example shows that validation using quantitative Western blot analysis should, in some instances, sum the measurements for each individual form

Figure 11.1 Western blot quantitation of plasma gelsolin (pGSN). pGSN was loaded in twofold dilutions on a 4–12% SDS-PAGE gel, transferred onto a polyvinylidene difluoride membrane (left-hand side), and probed with an antihuman GSN (rabbit polyclonal) antibody followed by chemiluminescent detection. X-ray films were scanned, and density was measured using ImageJ (right-hand side) and expressed in arbitrary density units.




using various conditions and amounts of loaded preparation. If the amount of protein is limited and only one blot can be performed, validation may skew the results and increase the discrepancy between the relative change observed in proteomics and the orthogonal validation.

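The saturation behavior described for Figure 11.1 can be screened numerically: in the linear range, each twofold increase in load should roughly double the densitometry signal. The values and the ratio threshold below are hypothetical, not the published data.

```python
import numpy as np

# Hypothetical densitometry of a twofold dilution series (cf. Figure 11.1)
load_ng = np.array([15, 31, 63, 125, 250])          # protein loaded per lane
density = np.array([1200, 2300, 4100, 5600, 6100])  # arbitrary density units

# In the linear range a twofold increase in load should roughly double the
# signal; a step ratio well below 2 suggests the blot is saturating.
ratios = density[1:] / density[:-1]
for load, ratio in zip(load_ng[1:], ratios):
    flag = "linear" if ratio > 1.7 else "saturating"  # assumed threshold
    print(f"{load} ng/lane: step ratio {ratio:.2f} -> {flag}")
```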
11.11 Further Development of Methods Validation

The purpose of method validation is to demonstrate the acceptability of a method for a particular analysis. With the continued development of higher resolution HPLC instrumentation and detection systems, such as higher sensitivity MS and tandem MS systems, and improved software for analysis, there is a need to determine the robustness and reproducibility of data obtained from these improvements.8 By taking a stepwise, logical approach to method validation, it can be demonstrated to scientific peers, regulatory agencies, and potential business partners that the method will produce reliable, believable results.


References

1. Rozet E, Ceccato A, Hubert C, Ziemons E, Oprean R, Rudaz S, et al. Analysis of recent pharmaceutical regulatory documents on analytical method validation. J Chromatogr A.
2. Shabir GA. Validation of high-performance liquid chromatography methods for pharmaceutical analysis: Understanding the differences and similarities between validation requirements of the US Food and Drug Administration, the US Pharmacopeia and the International Conference on Harmonization. J Chromatogr A. 2003;987:
3. Shah VP, Midha KK, Findlay JW, Hill HM, Hulse JD, McGilveray IJ, et al. Bioanalytical method validation: A revisit with a decade of progress. Pharm Res. 2000;17:1551-1557.
4. Guidance for Industry. Bioanalytical Method Validation. Available at www.fda.gov/downloads/Drugs/UCM070107.pdf; May 2001. Accessed November 2011.
5. Carr GP, Wahlich JC. A practical approach to method validation in pharmaceutical analysis. J Pharm Biomed Anal.
6. Iterson v. A Guide to Validation in HPLC. Available at www.
7. Guidance for Industry. Investigating Out-of-Specification (OOS) Test Results for Pharmaceutical Production. Available at http://www.fda.gov/OHRMS/DOCKETS/98fr/98d-0777gdl0002.pdf; May 2006. Accessed November 2011.
8. Gorog S. The changing face of pharmaceutical analysis. Trends Anal Chem. 2007;26:12-17.






Jerzy Silberring*,† and Pawel Ciborowski‡

*AGH University of Science and Technology, Krakow, Poland
†Centre of Polymer and Carbon Materials, Polish Academy of Sciences, Zabrze, Poland
‡University of Nebraska Medical Center, Omaha, Nebraska

12.1 The "Uphill Battle" of Validation
12.2 Accuracy and Precision
12.3 Experimental Design and Validation
12.4 Validation of the Method
12.5 Validation of Detection Levels
12.6 Validation of Reproducibility and Sample Loss
12.7 Validation of Performance of Instruments
12.8 Bioinformatics: Validation of an Output of Proteomic Data
12.9 Proteomics and Regulatory Affairs
References

12.1 The “Uphill Battle” of Validation

For any experiment that has one or more variables, whether inherent or introduced by the investigator, experimental design principles for achieving validity and efficiency are required.1,2 Traditionally, such principles were recognized for low-throughput experiments, but they have become accepted for high-throughput procedures, such as microarray experiments.3 This chapter reviews the validation principles and their applications in experiments employing

Proteomic Profiling and Analytical Chemistry. http://dx.doi.org/10.1016/B978-0-444-59378-8.00012-8
© 2013 Elsevier B.V. All rights reserved.



Chapter 12







There is no "one size fits all" in validation, just as there is no "one size fits all" in a multistep proteomic profiling experiment, particularly when trying to increase the sensitivity of every step of the entire proteomic study.4 Furthermore, even if every step can be validated separately, it does not necessarily follow that the final outcome can be validated by an orthogonal method(s). There are three major reasons for this: (1) each step is governed by specific analytical parameters that differ from those of the entire process in question; (2) biological processes change dynamically over time (often quickly) and at multiple levels; and (3) in many, if not most, instances we are not able to define the relationship between the rate of change and the biological effect. A plot of fold change in the biological activity of a protein versus the overall change in function of the studied system would be very helpful in validation; however, this is usually the very question we ask and try to answer using profiling experiments, which deprives us of the points of reference critical for validation.5,6 Studying changes in the proteomes of humans is even more complicated, not only because of the complexity of the human organism, but also because ethical boundaries limit how far this system can be manipulated. Animal models that are very valuable in reductionistic studies are less informative about the functions of a human body in its entirety in holistic studies.

The validation procedure is time-consuming and not as spectacular as thousands of identified compounds. Therefore, validation and internal laboratory quality control, which are mandatory routine in analytical chemistry, need to be transferred and adapted to proteomic experiments, which, as stated earlier, are much more complicated. Although we are usually interested in validation of the final output, any given methodology in the multistep procedure is subject to validation. Common terms, such as accuracy, precision, specificity, and linearity, can be found in any book on analytical chemistry or medicinal chemistry. Similarly, detailed guidelines for testing those parameters and valuable advice can be found
