I have a few problems with an ELISA I'm trying to optimize. The assay was originally developed for cell culture. The manufacturer's R&D group suggested increasing the blocking buffer to 10% BSA for serum samples, which is what I did. The primary antibody is mouse anti-human protein; the secondary is goat anti-human protein. The biggest problem I had was enormously high background (1.2 absorbance units).

After some reading and re-experimentation, I made some changes: I switched to a 5% horse serum block (in PBS) with 10% horse serum in the standards, decreased the primary antibody coating concentration, and prepared the secondary in 2% goat serum. The assay uses TMB substrate.

I repeated the experiment and got consistent readings above 500 pg/mL, but below that, absorbance values were inconsistent. It turns out the protein is present at less than 500 pg/mL in human serum, whereas it's at 2-9 ng/mL in BSA (which might explain the high background in the first runs).

In my last run, I spiked the samples with 500 pg of the protein directly before plating (100 uL sample plus 3.5 uL protein solution). Background was still high, but I saw the trend I wanted: it was a bleeding time course, and we expected levels to decrease with time. However, the later samples had values below 500 pg, which makes no sense given the spike!

I'm not sure exactly what's going on. I don't know whether it's the blocking, the secondary diluent, or a spiking issue. I've read a bit about heterophile antibodies, and it seems like that could be the cause.

Another thing, which I've never questioned: why do I dilute the secondary (goat anti-human protein) in goat serum? My understanding is that the antibodies in the serum won't bind the goat antibody. Is that right? Should I be blocking with goat serum? Mouse serum? Human serum?

For spiking, should I dilute the protein in BSA or something similar so that it won't stick to the sides of the tubes before I plate? I would really appreciate any help!
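To sanity-check my own spike, here is a quick back-of-the-envelope calculation (a sketch, assuming the spike is 500 pg total delivered in 3.5 uL into 100 uL of serum, as described above; the variable names are my own):

```python
# Sketch: what does a 3.5 uL spike into 100 uL of serum actually deliver?
# Assumptions from the post: 500 pg total spike, 100 uL sample, 3.5 uL spike volume.

spike_pg = 500.0        # total protein added per sample (pg)
spike_vol_ul = 3.5      # volume of spiking solution (uL)
sample_vol_ul = 100.0   # serum volume before spiking (uL)

# Concentration the spiking stock must have to deliver 500 pg in 3.5 uL:
stock_pg_per_ul = spike_pg / spike_vol_ul            # pg/uL (numerically = ng/mL)

# Concentration the spike adds to the final (post-spike) volume:
final_vol_ul = sample_vol_ul + spike_vol_ul
added_pg_per_ml = spike_pg / final_vol_ul * 1000.0   # pg/mL

# How much the spike volume dilutes the endogenous analyte:
dilution_factor = sample_vol_ul / final_vol_ul

print(round(stock_pg_per_ul, 1))   # ~142.9 pg/uL stock needed
print(round(added_pg_per_ml))      # ~4831 pg/mL added by the spike alone
print(round(dilution_factor, 3))   # ~0.966 (only a ~3.4% dilution of the sample)
```

If these assumptions match what I did, the spike alone should add roughly 4.8 ng/mL, which is well above the <500 pg/mL endogenous range, so samples reading below 500 pg/mL after spiking would point to a recovery problem (matrix interference or something like heterophile antibodies) rather than a pipetting error.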