If you have calibrated in terms of mg/L, your result will be in mg/L.  If we assume a consistent injection volume, 100% transfer of material to the GC column, and no differences in detection - you have good numbers.  And if you need weight/weight results, you just use the density of the analyzed solution to convert.
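As a quick sketch of that density conversion (the function name and the example numbers are mine, not from any particular method): 1 L of solution weighs its density in kg, so dividing mg/L by density in g/mL gives mg/kg.

```python
def mg_per_l_to_mg_per_kg(conc_mg_per_l, density_g_per_ml):
    """Convert a volume-based result (mg/L) to weight/weight (mg/kg).

    1 L of solution weighs density_g_per_ml kg, so dividing mg/L by
    density expressed in kg/L (numerically equal to g/mL) gives mg/kg.
    """
    return conc_mg_per_l / density_g_per_ml

# Hypothetical example: a 50 mg/L result in a solvent of density 0.8 g/mL
print(mg_per_l_to_mg_per_kg(50.0, 0.8))  # 62.5 mg/kg
```

Note that the same mg/L result maps to noticeably different w/w values at 0.8 vs 0.9 g/mL, which is why the density difference below matters.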
However...  
With a density of 0.8 for one and 0.9 for the other, there are a few questions to be asked.  This looks like a significantly different solvent/matrix combination between standards and samples.  That suggests differences in expansion volume in the inlet, and possibly differences in how much analyte sticks to active sites in the inlet (because of differing solvent activities and the area they cover).
Without knowing your analyte or expected concentration - keep in mind that there are effects where matrix in samples masks active sites, so responses are higher for analytes in samples than in standards.  (Thus people who do analysis for things like pesticides do calibrations in matrix.)  The other possibility to consider is that other compounds in the extract may create activity in the inlet that causes losses of your analyte.
Differences in density suggest differences in viscosity - which can result in differences in material drawn up into the syringe and delivered to the inlet.  Perhaps not much of a difference, but if we are using the liquid volume as part of the quantification, it adds to the sources of bias and variability.
With potential differences in expansion volume, the inlet purge can result in different discrimination of the analyte, depending on the volume of expansion and the purge off time.
Thus, it is important to make a standard additions curve with at least one sample to be sure that you can read back the amount of analyte added plus background.
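A minimal sketch of how that read-back works (the spike levels and responses here are made-up numbers, just to illustrate): fit a line to detector response vs. amount added, and the magnitude of the x-intercept, b/m, is the background concentration already in the sample.

```python
def standard_additions(added, response):
    """Least-squares fit of response = m * added + b.

    The original sample concentration is the magnitude of the
    x-intercept, b/m, in the same units as 'added'.
    """
    n = len(added)
    mean_x = sum(added) / n
    mean_y = sum(response) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(added, response))
         / sum((x - mean_x) ** 2 for x in added))
    b = mean_y - m * mean_x
    return b / m

# Hypothetical spikes (mg/L added) and responses; background works out to 2.0 mg/L
added = [0.0, 1.0, 2.0, 3.0]
resp = [4.0, 6.0, 8.0, 10.0]
print(standard_additions(added, resp))  # 2.0
```

If the curve reads back the spiked amounts plus a consistent background, your calibration transfers to the sample matrix; if not, the inlet and matrix effects above are biting.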