
developing real time PCR - (Jul/17/2008 )

Hi

I am developing an assay to detect Chlamydial DNA from swabs.
I have developed primers using primer3 and Mfold which appear to work well. I have sequenced the product.
I have amplified and purified the product, then calculated the amount of DNA using a spectrophotometer.

Now I am trying to run a standard curve.

I have diluted the cleaned-up DNA to make a standard curve running from 10^6 copies per reaction (16.22 pg per 20 ul reaction) down to 1 copy.
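
For anyone checking the dilution math: copy number follows from DNA mass, amplicon length and Avogadro's number. A minimal Python sketch; note the 150 bp amplicon length is only a placeholder, since the actual target length isn't stated in this thread.

```python
AVOGADRO = 6.022e23   # molecules per mole
BP_MW = 650.0         # average g/mol per base pair of dsDNA

def mass_pg_for_copies(copies: float, amplicon_bp: int) -> float:
    """Mass of dsDNA (in pg) that contains the given number of copies."""
    grams = copies * amplicon_bp * BP_MW / AVOGADRO
    return grams * 1e12

# Example with a hypothetical 150 bp amplicon
for copies in (1e6, 1e5, 1e4):
    print(f"{copies:.0e} copies of a 150 bp amplicon = "
          f"{mass_pg_for_copies(copies, 150):.3f} pg")
```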

I only got product at 10^6, 10^5 and 10^4 copies per reaction, with Cts of roughly 24, 27 and 30.

The efficiency of the reaction looks like it will be OK (1.1, i.e. 110%), as does the slope (-3.1), particularly once I get my pipetting up to scratch.
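
For reference, efficiency and slope are tied together by the standard relation E = 10^(-1/slope) - 1, which is also why the Cts above sit about three cycles apart per 10-fold dilution. A quick Python check:

```python
def efficiency_from_slope(slope: float) -> float:
    """Amplification efficiency from the slope of Ct vs log10(copies)."""
    return 10 ** (-1.0 / slope) - 1.0

print(f"slope -3.10 -> efficiency {efficiency_from_slope(-3.10):.2f} (110%)")
print(f"slope -3.32 -> efficiency {efficiency_from_slope(-3.32):.2f} (ideal, 100%)")
```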

I have done a checkerboard titration of the primer concentrations at 300 nM, 500 nM and 900 nM without seeing any real difference in the Cts.
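
For readers unfamiliar with the layout, a checkerboard titration simply tests every forward/reverse concentration pair. A small sketch enumerating the nine combinations:

```python
from itertools import product

# Checkerboard: test every forward/reverse primer concentration pair (nM)
concs_nM = (300, 500, 900)
for fwd_nM, rev_nM in product(concs_nM, concs_nM):
    print(f"well: F {fwd_nM} nM / R {rev_nM} nM")
```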

What I am a bit confused about is that the papers I read talk about being able to detect 10 copies of DNA in a reaction, but I am nowhere near this. Are there any tips on increasing the sensitivity of the assay?

Thank you for your help

-KoalaJo-

Before I forget, I am using a commercial SYBR Green master mix. I do have some probes designed for use with these primers, but I am trying to get the SYBR Green assay going first due to cost considerations.

And also, I diluted my DNA in TE. Could it be as simple as the EDTA causing PCR inhibition in the more dilute samples (i.e. more EDTA relative to DNA)?
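
A rough back-of-envelope check of this, under assumptions not stated in the thread: standard 1x TE contains 1 mM EDTA, and if the same volume of TE-diluted template goes into every reaction, each standard carries the same amount of EDTA regardless of copy number, so it should not selectively hit the low-copy points. In Python:

```python
# Assumptions (not from the thread): standard 1x TE with 1 mM EDTA,
# 2 ul of template per 20 ul reaction, and ~3 mM Mg2+ in the master mix.
TE_EDTA_mM = 1.0
TEMPLATE_UL = 2.0
REACTION_UL = 20.0
MIX_MG_MM = 3.0

final_edta_mM = TE_EDTA_mM * TEMPLATE_UL / REACTION_UL
print(f"EDTA carried into the reaction: {final_edta_mM:.2f} mM "
      f"({final_edta_mM / MIX_MG_MM:.0%} of the assumed Mg2+)")
```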

-KoalaJo-

Going to answer my own question here for anyone searching the forums who comes across it.
I think my basic problem has been that I didn't have an accurate initial quantification, so what I thought was 10^7 copies was probably fewer. I'm yet to confirm this completely, but anyone having the same problem should check that the DNA in their standard really is the amount they think it is. Use a NanoDrop, and/or compare to a HyperLadder on a gel, to make sure your quantification is correct.

-KoalaJo-

I have the impression that spectrophotometry often overestimates the DNA content of a sample by several fold (e.g. compared to a mass ruler on a gel). I don't know if a NanoDrop will be the ultimate solution, because it is also a kind of spectrophotometer.
Maybe PicoGreen (Invitrogen; RiboGreen is the RNA version) is worth a try? Or an Agilent Bioanalyzer or Bio-Rad Experion, if available.

-Ned Land-

Try adding some carrier DNA (another plasmid that won't amplify, or some yeast tRNA) to your standards. Also, Ficoll is meant to work wonders.

-maset-

Would it be possible for you to expand on alternative methods of determining the initial concentration of standards besides spectrophotometry (I also use a NanoDrop)? My PI is investigating bDNA (branched DNA) assays, but this sounds very cost-prohibitive. I will search the methods listed here as well; trying not to thread-jack.

Also, I have spoken to tech support at Invitrogen, and they claim that the amount of EDTA in TE is insufficient to impair PCR.

A neighboring lab of mine claims to be able to detect a retrovirus down to 1 copy reliably in their RT-PCR, but I believe they also quantify their initial standard by spectrophotometry, so I am wary of putting faith in that estimate. I would imagine that other labs suffer the same problems. However, IMO, a difference of, say, 100-fold in PCR isn't that much; if the standards are consistent, a trend would still be reliable for seeing whether there was an increase or decrease, and by approximately how much. That's just my opinion, however, and I don't believe that answers your question...
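
To put that last point in numbers: a constant miscalibration of the standard shifts every Ct equally, so fold-change comparisons between samples, computed with the usual (1 + E)^dCt relation, still hold. A small Python sketch:

```python
import math

def fold_change(delta_ct: float, efficiency: float = 1.0) -> float:
    """Fold difference in starting template implied by a Ct shift."""
    return (1.0 + efficiency) ** delta_ct

# A constant miscalibration of the standard shifts every Ct equally,
# so relative (fold-change) comparisons between samples still hold.
print(f"a 100-fold offset at 100% efficiency = {math.log2(100):.1f} cycles")
print(f"dCt of 3.0 between samples = {fold_change(3.0):.1f}-fold difference")
```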

-Randoramma-