In an ideal scenario, a solution that is injected and tested multiple times under identical conditions should produce the same signal response. In practice, however, the signal will vary with analytical conditions, including inconsistencies in sample preparation, injection volume, mass spectrometer performance, and matrix effects. Hence, it is necessary to include an internal standard (IS) in the analysis. An internal standard is a known compound added at a fixed concentration to each analytical sample. If sample preparation is consistent, the final amount and detector response of the IS will be the same across samples. Thus, the ratio of analyte response to IS response reflects the analyte concentration independent of these other conditions.
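To make the normalization concrete, here is a minimal sketch with entirely hypothetical signal values: even when overall signal recovery differs between injections, the analyte-to-IS ratio stays constant because both compounds are affected proportionally.

```python
# Illustrative sketch (made-up numbers): the analyte/IS response ratio
# stays constant even when overall signal varies between injections.

def response_ratio(analyte_signal, is_signal):
    """Ratio of analyte response to internal standard response."""
    return analyte_signal / is_signal

# Three injections of the same solution; suppose injection 2 recovers
# only 50% of the signal and injection 3 shows 20% matrix suppression.
injections = [
    (100000.0, 50000.0),  # full signal
    (50000.0, 25000.0),   # 50% recovery, hitting analyte and IS equally
    (80000.0, 40000.0),   # 20% suppression, again proportional
]

ratios = [response_ratio(a, i) for a, i in injections]
print(ratios)  # all 2.0 despite very different raw signals
```

The raw analyte signals span nearly a factor of two, yet every injection reports the same ratio, which is the quantity the calibration curve is built on.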
One challenge for every bioanalytical laboratory is that the response of the IS to analytical conditions can vary differently from that of the analyte. In that case, measurements of the analyte that use the IS as a benchmark will be skewed. Hence, the question is: which internal standard is best for a given study?
Under the same conditions, the IS should behave as closely as possible to the analyte, where behavior includes retention time, ionization effects, and mass spectrometer response. Only then can we be confident that the calculated concentration is accurate. A stable-isotope-labeled version of the analyte, which the mass spectrometer can resolve by mass, is therefore typically a solid choice of IS. In Figure 1, several samples show low signal for both the analyte and the IS. Without the IS, these samples would appear to have low concentrations. With the IS, the ratio of analyte to IS remains unaffected, so the calculated concentrations do not change. In Figure 2, the IS drifts at the end of the run, possibly because the instrument is losing sensitivity. Still, the run passed because the analyte signal dropped proportionally, leaving the analyte-to-IS response ratio correct.
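The drift scenario can be sketched numerically. Assuming a linear calibration with zero intercept and hypothetical peak areas, a sensitivity loss late in the run that hits the analyte and IS proportionally cancels out of the back-calculated concentration.

```python
# Sketch (hypothetical values) of why a run with proportional drift still
# passes: concentration is computed from the analyte/IS ratio, so a
# sensitivity loss affecting both compounds equally cancels out.

def conc_from_ratio(ratio, slope):
    """Back-calculate concentration from the response ratio
    (linear calibration, zero intercept, assumed for illustration)."""
    return ratio / slope

slope = 0.02  # ratio per ng/mL, from the calibration curve (assumed)

# A 100 ng/mL QC measured early and late in the run; late injections
# suffer a 40% sensitivity loss affecting analyte and IS alike.
early = (200000.0, 100000.0)   # (analyte, IS) peak areas
late = (120000.0, 60000.0)     # both signals down 40%

for analyte, istd in (early, late):
    conc = conc_from_ratio(analyte / istd, slope)
    print(round(conc, 1))  # 100.0 both times
```

Had the analyte been quantified from its raw peak area alone, the late injection would have under-reported the QC by 40%.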
When the client is willing to accept bioanalytical data generated without a stable-isotope-labeled IS, they are making a "fit-for-purpose" trade-off. An alternative approach, when a deuterated standard is not readily available, is to use a mixture of three different internal standards with different masses, retention times, and structures. After the sample run is complete on the mass spectrometer, check each IS response for consistency or drift, along with the accuracy of the standards and QCs. Then choose the IS with the best response and process the batch with it. The chosen IS can occasionally behave inconsistently, so having an alternate ready is a good idea. Even with the additional method development work, this approach saves valuable time and client resources in the long run.
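One way to make the "best response" selection objective is to compare the consistency of each candidate IS across the batch. This hypothetical sketch (made-up peak areas, %CV as the consistency metric) picks the candidate with the lowest coefficient of variation:

```python
# Hypothetical sketch of the triage step: after the run, compare the
# response consistency of three candidate internal standards and keep
# the one with the lowest coefficient of variation (%CV).

from statistics import mean, stdev

def percent_cv(responses):
    """Coefficient of variation of IS responses across a batch, in percent."""
    return 100.0 * stdev(responses) / mean(responses)

# Made-up IS peak areas across the same injections of one batch.
candidates = {
    "IS-A": [52000, 51000, 50500, 49800, 51200],   # stable
    "IS-B": [52000, 44000, 38000, 31000, 26000],   # drifting downward
    "IS-C": [52000, 70000, 41000, 63000, 35000],   # erratic
}

best = min(candidates, key=lambda name: percent_cv(candidates[name]))
print(best)  # IS-A, the most consistent responder
```

In practice this numeric check supplements, rather than replaces, reviewing the IS response plots and the accuracy of the standards and QCs.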
Figure 1. Internal standard response for a sample run. The standards and QCs are consistent, but the difference in IS response between the standards and the samples is due to either analytical conditions or sample processing. The ratio of analyte to IS can still be calculated accurately by using a deuterated internal standard.
Figure 2. This sample run shows the importance of the internal standard in correcting for differences in detector response during the run. Despite the low IS response, the standard curves are close (top). Without the IS, the curves would be very different (bottom), and the data would not be reliable.