Hi PAO_ahac, thanks for your reply.
Actually, I just followed the manufacturer's instructions. Briefly, a Corning plate was coated with 5 μg/ml of unlabeled goat anti-mouse capture antibody (SouthernBiotech) at 4°C overnight. The plate was then blocked with 1% BSA in PBS at RT for 1 h. After three washes with PBST (PBS + 0.05% Tween 20), the serum sample was incubated on the plate at a 1:1000 dilution at RT for 1 h with shaking (the online protocol says 37°C; is that necessary?). The plate was washed with PBST, and bound autoantibodies were detected with HRP-conjugated goat anti-mouse IgM antibody (SouthernBiotech) at RT for 1 h. The colorimetric reaction was developed with ABTS substrate, and OD was measured at 405 nm after 10 or 20 min. The IgM standard curve was run as described above but using purified IgM instead of a serum sample.
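In case it helps to see my setup, here is a minimal Python sketch of the standard dilution series; it assumes a 2-fold serial dilution from a 100 ng/ml top standard (the starting point the online protocol mentioned at the end of this post suggests) and an 8-point curve, both of which are assumptions rather than anything fixed by the kit:

```python
# Minimal sketch of a 2-fold serial dilution series for the IgM standard.
# The 100 ng/ml top standard comes from the online protocol mentioned below;
# the dilution factor and number of points are assumptions.
top_conc_ng_ml = 100.0   # assumed top standard concentration
dilution_factor = 2.0    # assumed 2-fold serial dilution
n_points = 8             # assumed 8-point curve

standards = [top_conc_ng_ml / dilution_factor**i for i in range(n_points)]
print([round(c, 2) for c in standards])
# [100.0, 50.0, 25.0, 12.5, 6.25, 3.12, 1.56, 0.78]
```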
Because there is no control, I have no idea whether my result is correct or not. Different kits normally give different values. I searched for IgM standard curves online and found that people usually read the absorbance at 450 nm, and their OD values are much lower than mine at the same IgM concentration (http://www.allerresp...gmImage002.gif; http://www.mediomics.com/image/IgM.gif).
I did not duplicate anyone else's experiment; I just looked at their published data. But I also found that different groups reported different values for WT control mice (not only for IgM but also for other isotypes). Could this be due to different mouse strains or different conditions? However, the IgM concentration of WT mice in several papers was ~200-500 μg/ml, while I got only 50-70 μg/ml. That's why I keep asking myself whether I did everything right.
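Just to rule out the arithmetic, here is a minimal Python sketch of the back-calculation from the plate reading to the serum concentration; the interpolated value is hypothetical, picked only to match the 50-70 μg/ml range above:

```python
# Back-calculation from plate reading to serum concentration.
# interp_ng_ml is whatever the standard curve gives for the diluted sample;
# the value here is hypothetical, chosen to match the 50-70 ug/ml range
# reported above.
interp_ng_ml = 60.0      # hypothetical interpolated value from the curve
dilution_factor = 1000   # serum was diluted 1:1000 before the plate

serum_ng_ml = interp_ng_ml * dilution_factor   # 60,000 ng/ml
serum_ug_ml = serum_ng_ml / 1000.0             # 60 ug/ml
print(f"{serum_ug_ml:.0f} ug/ml")              # -> 60 ug/ml
```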
An online protocol suggested the IgM standard curve should start at 100 ng/ml. But from my data, the curve is no longer linear above 100 ng/ml. This really confuses me. Is there something wrong with my standard curve?
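From what I've read, an ELISA standard curve is sigmoidal rather than linear, so flattening above some concentration is expected and not by itself a sign of a bad curve; the usual approach seems to be fitting a four-parameter logistic (4PL) model and interpolating samples only within the quasi-linear region. Here is a minimal SciPy sketch of that; the OD values are made-up placeholders, not my data:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = lower asymptote, d = upper asymptote,
    c = inflection point, b = Hill slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# 2-fold standard series from 100 ng/ml down (see the sketch earlier);
# the OD405 readings are placeholder values for illustration only.
conc = np.array([100, 50, 25, 12.5, 6.25, 3.125, 1.5625, 0.78125])
od   = np.array([2.10, 1.95, 1.60, 1.10, 0.65, 0.35, 0.20, 0.12])

params, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.0, 10.0, 2.2],
                      maxfev=10000)

def interpolate(od_sample, a, b, c, d):
    """Invert the 4PL to get concentration from a sample OD."""
    return c * ((a - d) / (od_sample - d) - 1.0) ** (1.0 / b)

print(interpolate(1.0, *params))  # concentration (ng/ml) at OD = 1.0
```

One thing this makes me wonder: if my 1:1000 serum samples read near the plateau, the interpolated concentration would be compressed downward, which could partly explain values below the published 200-500 μg/ml range; titrating the serum further so it falls on the linear part of the curve should rule that out.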