Alp has already given you a hint about the observed "anomaly" with your method.
I agree that matrix effects arising from the different sample matrices are probably giving you different calibration slopes for your analytes. You developed and validated the method initially for determination of your analytes in urine, right? And now you're transferring it to a different matrix, i.e. blood. As you're aware, the different composition of the two matrices can have a substantial influence on ionization.
A calibration slope 20 times higher in blood could suggest ionization enhancement of the analyte, provided the internal standard (IS) peak area is unaffected by the matrix (i.e. only the analyte peak area changes). Alternatively, it could be a case of ionization suppression of the internal standard, with the analyte peak area staying consistent between standards and samples.
But you also see lower slopes even for the urine samples, the matrix for which the method was originally developed. This too could be a matrix-effect issue (in this case probably ionization suppression of the analyte, if the IS peak area is consistent between standards and samples).
To diagnose the problem, compare the IS peak areas between standards and samples in both matrices (blood and urine), as Alp suggested.
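If it helps, here is a minimal sketch of how that comparison can be put into numbers, using the usual matrix factor (peak area in post-extraction spiked matrix divided by peak area in neat solvent at the same concentration). The variable names and peak-area values below are purely hypothetical; the point is just to show how IS suppression alone can inflate the analyte/IS response ratio and hence the slope.

```python
def matrix_factor(area_in_matrix, area_in_neat):
    """Matrix factor (MF): peak area in post-extraction spiked matrix
    divided by peak area in neat solvent at the same concentration.
    MF > 1 suggests ionization enhancement, MF < 1 suggests suppression."""
    return area_in_matrix / area_in_neat

# Hypothetical peak areas at the same nominal concentration
analyte_neat, analyte_blood = 1.0e5, 1.1e5
is_neat, is_blood = 2.0e5, 0.6e5   # IS strongly suppressed in blood?

mf_analyte = matrix_factor(analyte_blood, analyte_neat)
mf_is = matrix_factor(is_blood, is_neat)
is_normalized_mf = mf_analyte / mf_is

print(f"Analyte MF: {mf_analyte:.2f}")              # ~1.10 -> mild enhancement
print(f"IS MF: {mf_is:.2f}")                        # ~0.30 -> strong suppression
print(f"IS-normalized MF: {is_normalized_mf:.2f}")  # ~3.67 -> inflated response ratio / slope
```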
Also, it is standard practice in bioanalytical/toxicological LC-MS method development and validation to include at least six different sources (lots) of urine or blood to test the robustness of the method against lot-to-lot variability, so that you don't face this kind of problem later on. If you find ionization suppression or enhancement with one or more of these lots at that stage, you should tweak the chromatographic conditions or choose another ionization mode (from ESI to APCI, for instance) to avoid matrix effects during routine application of the method.
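As a rough sketch of how that multi-lot check can be evaluated: compute the IS-normalized matrix factor for each lot and look at the variability across lots. The <=15% CV acceptance criterion used below follows common EMA-style guidance, and all peak-area values are hypothetical.

```python
import statistics

def is_normalized_mf(analyte_matrix, is_matrix, analyte_neat, is_neat):
    """IS-normalized matrix factor for one lot: (analyte MF) / (IS MF)."""
    return (analyte_matrix / analyte_neat) / (is_matrix / is_neat)

# Hypothetical peak areas (analyte, IS) in post-extraction spiked samples
# from six different blood lots, plus a neat-solvent reference standard.
neat = (1.00e5, 2.00e5)
lots = [(0.95e5, 1.90e5), (1.05e5, 2.10e5), (0.90e5, 1.85e5),
        (1.10e5, 2.20e5), (0.60e5, 2.00e5), (1.00e5, 1.95e5)]

mfs = [is_normalized_mf(a, i, *neat) for a, i in lots]
cv = 100 * statistics.stdev(mfs) / statistics.mean(mfs)

print("IS-normalized MFs per lot:", [round(m, 2) for m in mfs])
print(f"CV across lots: {cv:.1f}%")   # common acceptance criterion: CV <= 15%
if cv > 15:
    print("Matrix effect varies between lots -> revisit chromatography or ionization mode.")
```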