Best Practices – Internal Standards

In an ideal scenario, a given solution injected multiple times should produce the same response signal. In practice, however, the signal will likely differ because of variability in the analytical conditions, such as differences in injection volume or sample preparation. Even when these factors are tightly controlled, other variables remain at play, such as matrix effects and the consistency of the mass spectrometer response.

Hence, it is necessary to include an internal standard in the analysis. Internal standards (IS) are known compounds added to each sample at a fixed concentration. If the sample preparation is consistent, the final amount of IS will be close to constant. Thus, the detector response for the IS should be essentially the same across samples, and the ratio of analyte response to IS response should reflect the analyte concentration independently of the analytical conditions.
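To make the ratio idea concrete, here is a minimal sketch of response-ratio quantitation in Python. The function names, peak areas, and calibration parameters are hypothetical and chosen only to illustrate the arithmetic; they do not come from any particular method or run.

```python
# Minimal sketch of internal-standard (IS) quantitation.
# All peak areas and calibration parameters below are hypothetical.

def response_ratio(analyte_area: float, is_area: float) -> float:
    """Analyte-to-IS peak area ratio, the quantity used for calibration."""
    return analyte_area / is_area

def back_calculate(ratio: float, slope: float, intercept: float) -> float:
    """Concentration from a linear calibration of ratio vs. concentration:
    ratio = slope * conc + intercept  ->  conc = (ratio - intercept) / slope."""
    return (ratio - intercept) / slope

# Hypothetical sample: analyte area 150,000; IS area 300,000;
# calibration fit: ratio = 0.01 * conc + 0.002 (conc in ng/mL).
ratio = response_ratio(150_000, 300_000)
conc = back_calculate(ratio, slope=0.01, intercept=0.002)
print(f"response ratio = {ratio:.3f}, back-calculated conc = {conc:.1f} ng/mL")
```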

One challenge for all bioanalytical laboratories is that the response of the IS to the analytical conditions can vary differently from that of the analyte. In that case, quantifying the analyte against the IS as a benchmark would skew the results. Hence, the question is: which internal standard is best for a given study? Ideally, the IS should behave as closely as possible to the analyte under the same conditions (retention time, ionization conditions, mass spectrometer response). One of the better choices is a stable isotope-labeled version of the compound, which behaves almost identically during chromatography and ionization but can be resolved by the mass spectrometer on the basis of its mass. Only then can we be confident that the calculated concentration is accurate.

For example, in Figure 1, several samples show low signal for both the analyte and the IS. Without the IS, those samples would be reported as low concentrations. With the IS, the ratio of analyte to IS remains unaffected, so the calculated concentrations do not change. In Figure 2, the IS drifts at the end of the run, which could be due to the instrument losing sensitivity. The run still passed because the analyte signal dropped proportionally, leaving the analyte-to-IS response ratio correct.
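The correction seen in Figure 2 can be illustrated with a small hypothetical calculation: if the instrument loses sensitivity late in the run and both the analyte and the IS signals drop by the same fraction, the response ratio, and therefore the back-calculated concentration, is unchanged. The numbers below are made up for illustration only.

```python
# Hypothetical illustration: a proportional loss of signal leaves the
# analyte-to-IS response ratio (and the reported concentration) unchanged.

analyte_area, is_area = 150_000.0, 300_000.0
ratio_nominal = analyte_area / is_area                      # 0.500

sensitivity_loss = 0.40                                     # 40% signal loss late in the run
ratio_drifted = (analyte_area * (1 - sensitivity_loss)) / (
    is_area * (1 - sensitivity_loss)
)                                                           # still 0.500

print(ratio_nominal == ratio_drifted)                       # True: the ratio is unaffected
```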

When a client is willing to accept bioanalytical data generated without a stable-labeled IS, they are making a “fit-for-purpose” trade-off. When a deuterated standard is not readily available, a good alternative is to use a mixture of three different internal standards with different masses, retention times, and structures. After the run is complete on the mass spectrometer, check the IS responses for consistency or drift, along with the accuracy of the standards and QCs, then choose the candidate with the best response as the IS and process the batch accordingly. In our experience, the same IS is not guaranteed to behave the same way every time, so having an alternate ready is a good idea. It can take additional work and method development, but this approach saves valuable time and resources for our clients in the long run.
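As a rough sketch of how this post-run check could be automated, the function below summarizes each candidate IS by the percent CV of its peak areas across the batch and by an end-of-run versus start-of-run drift ratio. The function name and the peak areas are hypothetical; this is an illustration of the kind of screen described above, not a set of acceptance criteria.

```python
from statistics import mean, stdev

def summarize_is_response(areas: list[float], drift_window: int = 5) -> dict:
    """Hypothetical screen of an IS candidate across a batch: %CV of the
    peak areas plus an end-of-run vs. start-of-run drift ratio."""
    cv_percent = 100 * stdev(areas) / mean(areas)
    drift = mean(areas[-drift_window:]) / mean(areas[:drift_window])
    return {"cv_percent": round(cv_percent, 1), "end_to_start_ratio": round(drift, 2)}

# Made-up IS peak areas for three candidates over the same injection sequence.
candidates = {
    "IS-A": [3.1e5, 3.0e5, 3.2e5, 3.1e5, 3.0e5, 2.9e5, 3.1e5, 3.0e5, 3.1e5, 3.2e5],
    "IS-B": [2.8e5, 2.9e5, 2.7e5, 2.6e5, 2.4e5, 2.2e5, 2.0e5, 1.9e5, 1.8e5, 1.7e5],
    "IS-C": [1.5e5, 2.5e5, 1.2e5, 2.8e5, 1.9e5, 2.2e5, 1.4e5, 2.6e5, 1.6e5, 2.4e5],
}
for name, areas in candidates.items():
    print(name, summarize_is_response(areas))  # IS-A is the most consistent candidate
```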

Figure 1. Internal standard response for a sample run. The standards and QCs are consistent, and the difference in IS response between the standards and the samples is due to either the analytical conditions or sample processing. However, because a deuterated internal standard is used, the analyte-to-IS ratio still allows the concentrations to be calculated accurately.

Figure 2. This sample run shows the importance of the internal standard in correcting for differences in detector response during the run. Despite the low IS response, the standard curves agree closely (top). Without the IS, the curves would be very different (bottom), and the data would not be reliable.