Can anybody walk me through the concept of % Input as it pertains to qPCR analysis after a ChIP experiment (or point me in the right direction)?
I understand how to evaluate fold change with respect to signal over noise, but I can't quite grasp why % Input is a better way to present the data.
% Input basically means the percentage of the starting DNA that is precipitated by your antibody.
It would be easier to use an example to illustrate:
Say you start with a sample of 100ug/ml,
and you set aside 100ul of sample as Input and use 1ml to do the ChIP.
After the ChIP, you decrosslink and precipitate both your Input and your ChIPped sample, and run real-time PCR.
If, for simplicity, the Ct for both your sample and your Input is cycle 20, then the % Input will be 10%.
Because your Input contains 100ul of the starting material:
100ul x (100ug/ml) = 10ug
And your ChIP sample starts with 1ml of the starting material:
1ml x (100ug/ml) = 100ug
So if, after ChIP, your sample and Input have the same Ct (which means the same quantity of DNA), then out of the 100ug you started with, 10ug, i.e. 10%, was precipitated by your antibody.
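The calculation above can be sketched in a few lines of Python. This is just an illustration, not part of any protocol; the function name and the Ct-difference term (each qPCR cycle is assumed to double the product, so a Ct difference of d corresponds to a 2**d fold difference in DNA) are my own:

```python
def percent_input(ct_ip, ct_input, input_fraction):
    """Percent input from qPCR Ct values.

    input_fraction: fraction of the ChIP starting material that the
    Input aliquot represents (here 100ul Input vs 1ml ChIP = 0.1).
    A Ct difference of d corresponds to a 2**d fold difference in DNA.
    """
    return 100 * input_fraction * 2 ** (ct_input - ct_ip)

# Same Ct (20) for Input and ChIP sample, Input = 10% of the material:
print(percent_input(ct_ip=20, ct_input=20, input_fraction=0.1))  # 10.0
```

Note that if the ChIP sample came up one cycle earlier than the Input, it would contain twice as much DNA, and the same function would return 20%.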
To explain why % Input is better than Fold signal to noise, compare this:
After absolute quantification, suppose the amounts of DNA in the different samples are as follows:
CTL Input: 10ug
CTL sample: 1ug
CTL IgG (noise): 0.05ug
Treatment Input: 8ug
Treatment sample: 0.2ug
Treatment IgG (noise): 0.01ug
If you use % Input:
CTL % Input = 1ug/10ug = 10%
Treatment % Input = 0.2ug/8ug = 2.5%
Conclusion: treatment decreases the binding
If you use signal to noise fold change:
CTL = 1ug/0.05ug = 20fold
Treatment = 0.2ug/0.01ug = 20fold
Conclusion: no change
% Input is useful for normalizing to the starting material, so that you don't see a false positive just because one sample started with more DNA.
The problem with signal-to-noise fold change is that the background varies so much that it heavily affects your results. More importantly, a 0.05ug vs 0.01ug background really has no biological meaning (they are both low), and it becomes especially inaccurate as the cycle number in real-time PCR gets higher.
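The comparison above can be checked numerically. A minimal sketch using the amounts from the example (the dictionary layout is just for illustration):

```python
# Absolute DNA amounts (ug) from the example above
samples = {
    "CTL":       {"input": 10.0, "ip": 1.0, "igg": 0.05},
    "Treatment": {"input": 8.0,  "ip": 0.2, "igg": 0.01},
}

for name, s in samples.items():
    pct_input = 100 * s["ip"] / s["input"]  # % Input normalization
    fold = s["ip"] / s["igg"]               # signal-to-noise fold change
    print(f"{name}: {pct_input:.1f}% input, {fold:.0f}-fold over IgG")

# CTL: 10.0% input, 20-fold over IgG
# Treatment: 2.5% input, 20-fold over IgG
```

The % Input numbers differ between conditions while the fold-over-IgG numbers are identical, which is exactly the discrepancy described above.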
I have a perhaps naive follow-up question about % input.
In the protocol our lab follows, we set aside 10% input (we use 200ug of sample, 20ug of input). On top of this, when we do PCR purification, we dilute the input (but not the sample) 1:10. Because of that extra dilution step at the end, does that mean the % input is now effectively 1% rather than the original 10%?
If I instead isolated 1% input at the beginning of the protocol (2ug) and did not dilute 1:10 at the end, could I expect to get similar Ct values from this 1% input to those I get when I isolate 10% input and then dilute the DNA 1:10?
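The dilution arithmetic behind this question can be sketched as follows. This is purely illustrative (the function names are my own, not from any protocol), and it rests on the standard assumption that each qPCR cycle doubles the product, so a 1:n template dilution raises Ct by about log2(n):

```python
import math

def effective_input_fraction(aliquot_fraction, dilution):
    """Fraction of starting material represented per unit volume
    after diluting the Input aliquot 1:dilution."""
    return aliquot_fraction / dilution

def ct_shift(dilution):
    """Expected Ct increase caused by a 1:dilution of the template."""
    return math.log2(dilution)

# 10% input diluted 1:10 vs 1% input undiluted:
print(effective_input_fraction(0.10, 10))  # 0.01
print(effective_input_fraction(0.01, 1))   # 0.01

# A 1:10 dilution by itself shifts Ct by about log2(10) cycles:
print(round(ct_shift(10), 2))  # 3.32
```

Under that assumption, both setups represent the same effective fraction of starting material per unit volume, which is the comparison the question is asking about.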
Thanks a lot. I've asked people around the lab and no one can give me a straight answer, and I am now really confused.