Weird linearity-of-dilution ELISA results - (May/19/2011)
I'm attempting to optimize an in-house sandwich ELISA for myeloperoxidase. I've already run a grid (checkerboard) experiment and found the optimal concentrations of capture (mouse mAb), detection (rabbit pAb), and HRP secondary. I then did a linearity-of-dilution experiment and got odd results which I think are due to a matrix effect. If I dilute the sample 10, 20, 30, 50, 80, 100 and 150 times, the OD decreases, but not linearly! In fact, if I multiply the values interpolated from the standard curve by the dilution factor, I get increasing analyte values. I also found poor recoveries in the spike-and-recovery experiment.
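As an illustration of the back-calculation described above, here is a minimal Python sketch; the interpolated concentrations are hypothetical, made up to show the pattern, not the poster's actual data:

```python
# Linearity-of-dilution check: back-calculate the neat-sample concentration
# at each dilution. The interpolated values below are HYPOTHETICAL.
dilution_factors = [10, 20, 30, 50, 80, 100, 150]
interpolated_ng_ml = [40.0, 22.0, 16.0, 11.0, 7.5, 6.3, 4.4]  # hypothetical

# Back-calculated neat concentration = interpolated value x dilution factor.
# If the assay dilutes linearly, these should all be roughly equal;
# a steady upward drift is the symptom described in the post.
back_calculated = [c * d for c, d in zip(interpolated_ng_ml, dilution_factors)]
mean = sum(back_calculated) / len(back_calculated)
for d, bc in zip(dilution_factors, back_calculated):
    print(f"1:{d:<4} -> {bc:7.1f} ng/ml ({100 * bc / mean:5.1f}% of mean)")
```

With these made-up numbers the back-calculated concentration climbs monotonically with the dilution factor, which is exactly the non-linearity being reported.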
Finally, when I diluted the samples in two different buffers, 1% BSA in PBS with 0.05% Tween-20 or PBS alone, I got completely different results. Here are the results, not corrected for dilution:
BSA 1% PBST: dilution factor | ng/ml interpolated
PBS: dilution factor | ng/ml interpolated
[table values not preserved in this copy of the post]
What I noticed is that the BSA seems to cause some kind of interference, or at least that is what I think.
My standard is diluted in BSA 1% in PBST.
Is there anyone who can help me?
I am not sure I understand what you are saying. If you are wondering why the curve is not linear, that is normal: it curves away from linearity as the concentration of the analyte increases. I myself use non-linear regression to fit the standard curve and calculate my sample values from that. If I completely missed what you were asking, please excuse me.
Hi! These results are not for the standard curve! I already know that the relationship between analyte concentration and signal is not linear (in fact I usually use 4-parameter or 5-parameter logistic regression). The results I posted are the values obtained for the "unknown" sample at every dilution I tested (the first number in each column is the dilution factor I used). I know the layout is probably a bit confusing...
I would repeat the dilution series using the high calibrator/standard as a control. They should dilute in parallel with one another. Use the same matrix for both. The calibrators/standards should be in the same matrix as the samples you are testing... serum/plasma etc., not necessarily buffer. Companies sell stripped or artificial matrices to match sample matrix types.
Next, you observed poor results with spike and recovery. This could be several things: the matrix used for spiking, the concentration, differences between native and purified antigen, or the specificity of the antibodies.
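For reference, the spike-and-recovery calculation itself is simple; this sketch uses hypothetical numbers to show how a poor recovery is flagged:

```python
# Hypothetical spike-and-recovery calculation:
# recovery (%) = (spiked - unspiked) / spike_added * 100
unspiked = 12.0     # ng/ml measured in the neat sample (hypothetical)
spike_added = 50.0  # ng/ml of purified antigen spiked in (hypothetical)
spiked = 48.0       # ng/ml measured in the spiked sample (hypothetical)

recovery_pct = (spiked - unspiked) / spike_added * 100
# Recoveries well outside roughly 80-120% commonly point to a matrix effect.
print(f"Recovery: {recovery_pct:.0f}%")  # prints "Recovery: 72%"
```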
Hi, thanks for the advice! I've got a question: what do you mean by "they should dilute in parallel with one another"?
I repeated the dilution experiment testing other dilution buffers as well, and I think I was observing the so-called "hook effect". In fact, if I dilute the sample at least 150 times I get linear results. By the way, I also tried other diluents with or without Tween-20 (diluting both standards and samples in the same diluent), and I got higher readings for the samples in the presence of Tween. I think this is due to a matrix effect, but I'm not sure at all...
Parallelism means that the signal decreases uniformly with the dilution factor for all your samples (and standards). This is an often overlooked assumption: if you cannot achieve parallelism, then the concentration you calculate from your experiment depends on the dilution of the sample. There are, however, certain mathematical manipulations that take this into account. The hook effect is a different phenomenon; it is also called the prozone effect. In a sandwich assay it means that samples with very high antigen concentration give a lower signal than more dilute samples, because excess antigen saturates the capture and detection antibodies separately and prevents sandwich formation. The observed curve is somewhat bell-shaped: the signal rises for the first few dilutions (depending on how far into antigen excess the neat sample is) and then starts to fall again as the sample is diluted out.
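A simple numerical way to look at the parallelism point above is to compare dilution-response slopes of the standard and a sample on a log-log scale; similar slopes mean parallel dilution. This sketch uses hypothetical ODs, with the sample deliberately flattened at low dilutions to mimic the hook/matrix pattern:

```python
import math

def slope(dilutions, signals):
    """Least-squares slope of log(signal) vs log(dilution factor)."""
    xs = [math.log(d) for d in dilutions]
    ys = [math.log(s) for s in signals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

dilutions = [10, 20, 40, 80]
standard_od = [1.60, 0.82, 0.41, 0.21]  # hypothetical: halves with each 2x step
sample_od = [1.20, 0.95, 0.60, 0.33]    # hypothetical: flattened at low dilution

s_std = slope(dilutions, standard_od)
s_smp = slope(dilutions, sample_od)
print(f"standard slope {s_std:.2f}, sample slope {s_smp:.2f}")
# A sample slope much shallower than the standard's indicates
# non-parallel dilution, as described in this reply.
```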
I see. So I have a problem of parallelism, but only for the serum samples! In fact, I saw that the signal from the standard decreases uniformly with the dilution factor, whereas the signal from the diluted samples does not. How can I solve this? Could it be due to some sort of matrix effect?
There is one other possibility... heterophile antibodies in the sample. Heterophile antibodies will bind to the antibodies in your system, either increasing or decreasing the observed values.
Suggestions: 1. Use the zero calibrator, serum, etc. for diluting your samples, so you have a uniform matrix between sample and standard/calibrator regardless of the dilution factor.
2. Your assay should include a step where the samples/calibrators are diluted at least 1:5-1:20 with buffer during the first binding reaction to minimize matrix effects. Include some rabbit/mouse serum or non-specific rabbit/mouse IgG in the assay buffer and conjugate diluent; this addition will help block any heterophile antibodies in the samples. You can test for this by diluting samples (and calibrators as controls) and pre-incubating them for 15 min at 37 °C to remove any heterophile activity. You should see no change with the calibrator, but may see a significant change with the sample.
3. Ensure that your samples are within the linear range of your dose-response curve... test them by a reference method before analysis in your system.
There are companies that sell stripped serum/plasma for calibrators/controls and also heterophile blocking agents.
Thanks a lot for the suggestions! Now I'm diluting the samples and the standards in the same matrix, from 5-fold to 150-fold, so I think that should solve the problem of the different matrices. About including non-specific rabbit/mouse IgG: do you mean IgG against human IgG?
The inclusion of NON-specific IgG from the same species used in your test is to block any antibodies in the SAMPLES that may be directed against the SPECIFIC antibodies you use in your assay. The antibodies in the samples are referred to as heterophile antibodies. Incorporating a dilution in the first (binding) step of the reaction helps dilute out any matrix effects (i.e., samples and calibrators are diluted 1:2-1:10), provided of course that your assay is sensitive enough; usually a 1:2 dilution is OK. Including the non-specific IgGs helps eliminate any heterophile reactions. The heterophile antibodies bridge (or block) the specific antibodies in your test, creating artificially high or artificially low results. A large portion of patient samples can contain heterophile antibodies, creating these major assay problems.
An alternative step is to pre-mix samples with the assay buffer (containing the non-specific IgG blocking reagents) and pre-incubate for 10-15 minutes BEFORE placing the samples in your reaction.
More information on heterophile antibody reactions can be found online. Scantibodies, Omega Biologicals, and several other companies make proprietary blocking reagents.
In summary you need to:
1. Have at least a 1:2 dilution step for the sample, using assay buffer
2. Include heterophile blocking agents (non-specific IgGs, maybe also some animal serum proteins) in your assay buffer
3. Include the same blocking agent(s) in your conjugate diluent
4. (If needed) pre-incubate samples with the blocking agents
5. Use your zero calibrator as the matrix for serial dilutions (the calibrator matrix should be the SAME as, or EQUIVALENT to, the samples you analyze)