For LOD, the approved alternative to S/N is based on the standard error of the y-intercept of the calibration regression:
LOD = 3.3 x SE / slope
In any case you must run a standard at approximately the LOD to verify that the analyte can be detected (regardless of how LOD was estimated).
For LOQ, the equivalent is
LOQ = 10 x SE / slope
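Just to make the arithmetic concrete, here's a minimal Python sketch (assuming an unweighted linear calibration, with SE taken as the standard error of the y-intercept from the usual OLS formula); the calibration numbers are made up purely for illustration:

```python
import numpy as np

# Hypothetical calibration data: amount (ng) vs. instrument response (peak area).
# The numbers are invented for illustration only.
amount = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
response = np.array([52.0, 101.0, 205.0, 398.0, 810.0])

# Unweighted least-squares fit: response = slope * amount + intercept
n = len(amount)
slope, intercept = np.polyfit(amount, response, 1)
residuals = response - (slope * amount + intercept)

# Residual standard deviation, then the standard error of the y-intercept (standard OLS formula)
s_yx = np.sqrt(np.sum(residuals**2) / (n - 2))
se_intercept = s_yx * np.sqrt(np.sum(amount**2) /
                              (n * np.sum((amount - amount.mean())**2)))

lod = 3.3 * se_intercept / slope
loq = 10.0 * se_intercept / slope
print(f"LOD ~ {lod:.2f} ng   LOQ ~ {loq:.2f} ng")
```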
This is better than S/N (if for no other reason than it avoids the question of where, and for how long, you measure the noise), but it is still inadequate in the sense that, like S/N, it gives a single value and doesn't account for the "with the required level of accuracy and precision" part of the LOQ concept. Let's face it: if you must quantitate with a CV of 1%, you will not be able to do that at S/N = 10.
The best way is to generate a CV vs. A plot (CV is the %RSD; A is the amount of analyte): run 3-5 replicates at each of multiple levels, starting below what you expect the LOQ to be, and then plot the %RSD as a function of the amount of analyte. It's usually done on logarithmic axes, and looks something like this (I've used arbitrary units for the X-axis; imagine that they represent nanograms):

[log-log plot of CV (%RSD) versus amount of analyte]
Now you can look at the plot and relate the LOQ to the required CV. In this case, the LOQ for 2% RSD is about 10 ng; the LOQ for 10% RSD is about 4 ng, and so on.
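For anyone who wants to script it, a rough Python/matplotlib sketch of the same idea is below; the replicate values and the `loq_at_cv` helper are invented for illustration, so don't expect them to reproduce the 10 ng / 4 ng figures from my plot:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical replicate results (e.g., back-calculated amounts in ng) at each level,
# 3-5 injections per level, starting below the expected LOQ. Values are invented.
replicates = {
    1:  [0.7, 1.4, 1.1, 0.6],
    2:  [1.7, 2.4, 2.1, 1.8],
    5:  [4.6, 5.3, 5.1, 4.8],
    10: [9.8, 10.3, 10.1, 9.7],
    20: [19.8, 20.3, 20.1, 19.6],
    50: [49.6, 50.4, 50.2, 49.8],
}

amounts = np.array(sorted(replicates), dtype=float)
cv = np.array([100 * np.std(replicates[a], ddof=1) / np.mean(replicates[a])
               for a in sorted(replicates)])

# CV vs. A on log-log axes
plt.loglog(amounts, cv, "o-")
plt.xlabel("Amount (ng)")
plt.ylabel("CV (%RSD)")
plt.show()

def loq_at_cv(amounts, cv, required_cv):
    """Interpolate (in log-log space) the amount at which the %RSD drops to required_cv."""
    order = np.argsort(cv)  # np.interp needs its x-values (here the CVs) in ascending order
    return np.exp(np.interp(np.log(required_cv),
                            np.log(cv[order]), np.log(amounts[order])))

print(loq_at_cv(amounts, cv, 2.0))   # LOQ if you need 2% RSD
print(loq_at_cv(amounts, cv, 10.0))  # LOQ if you need 10% RSD
```

The interpolation there is just straight segments between points in log-log space; with real data you may prefer to fit a smooth curve through the CV values before reading off the LOQ.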
There's lots more you can do with the information, of course: looking at the distribution of residuals from the regression, testing for goodness of fit, and so on.
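As a trivial example of the sort of thing I mean, a quick look at the calibration residuals (again with made-up numbers) might go like this; with only a handful of points a normality test won't tell you much, but the pattern of the residuals is worth eyeballing for curvature or heteroscedasticity:

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data, illustrative values only.
amount = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
response = np.array([52.0, 101.0, 205.0, 398.0, 810.0])

fit = stats.linregress(amount, response)
residuals = response - (fit.slope * amount + fit.intercept)

print("R^2:", round(fit.rvalue**2, 5))                     # overall goodness of fit
print("residuals:", np.round(residuals, 2))                # look for trends / curvature
print("Shapiro-Wilk p:", stats.shapiro(residuals).pvalue)  # rough check on the residual distribution
```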