This is trivial, and probably not worth pursuing, but my understanding of the LOQ comes from Section 7 of ICH Q2...
" The quantitation limit of an individual analytical procedure is the lowest amount of analyte in a sample which can be quantitatively determined with suitable precision and accuracy. The quantitation limit is a parameter of quantitative assays for low levels of compounds in sample matrices, and is used particularly for the determination of impurities and/or degradation products. "
" The quantitation limit and the method used for determining the quantitation limit should be presented. The limit should be subsequently validated by the analysis of a suitable number of samples known to be near or prepared at the quantitation limit. " ( 7.4 )
As I noted earlier, if you don't determine the LOQ as they recommend, presumably using one of the protocols shown in ICH Q2, then an auditor could question why you are calling a standard "the LOQ" when you haven't actually determined the "Limit".
If you redefine the LOQ signal-to-noise ratio to a very large value, then you have also defined "suitable precision and accuracy". Redefining the LOQ later, without significant changes to the method, could lead to grief with an auditor.
If your SOP says to determine the LOQ, it presumably also expects you to use one of the typical protocols to arrive at that value.
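For reference, the two protocols ICH Q2 describes are the signal-to-noise approach (typically S/N of around 10:1) and the one based on the standard deviation of the response and the slope, QL = 10σ/S. Below is a minimal Python sketch of the latter, using the residual standard deviation of a low-level calibration line as σ; the concentrations and peak areas are purely illustrative, not from any real method.

```python
# Minimal sketch of the ICH Q2 "standard deviation of the response and
# the slope" protocol: LOQ = 10 * sigma / S, where S is the slope of the
# low-level calibration curve and sigma is (here) the residual standard
# deviation of the regression line. All data values are made up.

import numpy as np

# Hypothetical low-level calibration data: concentration vs. peak area.
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])      # e.g. ug/mL
area = np.array([1020., 2105., 4080., 8240., 16310.])

# Least-squares fit: area = S * conc + b
S, b = np.polyfit(conc, area, 1)

# Residual standard deviation of the regression (n - 2 degrees of freedom).
residuals = area - (S * conc + b)
sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

loq = 10.0 * sigma / S   # ICH Q2: QL = 10 * sigma / S
lod = 3.3 * sigma / S    # detection limit uses 3.3, shown for comparison

print(f"slope S = {S:.1f}, sigma = {sigma:.2f}")
print(f"estimated LOQ = {loq:.3f} ug/mL (LOD = {lod:.3f} ug/mL)")
```

Note that Q2 also allows σ to be taken from blank responses or from the standard deviation of y-intercepts, and whichever route you take, the estimate still has to be confirmed by analysing samples prepared at or near the LOQ, per the 7.4 text quoted above.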
My suggestion remains: do not define your lowest standard as an LOQ, but just as a standard whose precision and accuracy greatly exceed the requirements of an LOQ. However, you still have to get prior agreement from your Quality Assurance people.
Please keep having fun,
Bruce Hamilton