OK... I have decided to use a satellite mapping system for my map; I must calibrate the satellite so as to correlate its distance unit to the real one on the Earth. I take two big objects, measure the distance between them, and tell the satellite that between these two houses, say, the distance is exactly 22 meters.
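Just to make that concrete, the calibration is nothing more than a scale factor; a toy Python sketch (only the 22 m comes from my example above, the other numbers and names are invented by me, not any real satellite interface):

    # Hypothetical numbers: only the 22 m ground distance comes from the example above.
    known_distance_m = 22.0       # real distance between the two houses, measured on the ground
    measured_map_units = 55.0     # the same distance as the mapping system reports it, in its own units

    scale = known_distance_m / measured_map_units   # metres per map unit

    def to_metres(map_units):
        """Convert any distance the map reports into real metres."""
        return map_units * scale

    print(to_metres(100.0))   # with the numbers above: 100 map units -> 40.0 metres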
Now, for the rainfall level, I use some special bottles. They have an emitter, two valves and a weight detector (a scale), and I assume that the density of water is always the same:
1. a valve for letting water in but not allowing it to evaporate
2. a valve for releasing all the water when the bottle is full
3. an emitter providing the following information:
a. an ID of the bottle corresponding to its position on the map
b. the time elapsed since the last remote-controlled water evacuation
c. the weight of the accumulated water
I go to the Standards Institute to ensure that the bottles all have the same volume and that they are made of a material with a small thermal expansion coefficient, since the measurements must be as correct in winter as in summer.
And finally, I sit comfortably on a sofa, connect all the bottles to my computer, and observe how much it rains as time goes by...
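In case it clarifies the picture, this is roughly how I imagine turning each bottle's report into a rainfall figure (a rough Python sketch; the opening area, the density value and the field layout are my own assumptions, not part of any real bottle):

    # Convert the reported weight of accumulated water into a rainfall depth.
    # Assumes constant water density and an identical, known opening area for every bottle.
    WATER_DENSITY_KG_M3 = 1000.0   # treated as constant, as stated above
    OPENING_AREA_M2 = 0.01         # hypothetical collecting area of each bottle

    def rainfall_mm(water_weight_kg):
        """Rainfall depth in mm corresponding to the accumulated water weight."""
        volume_m3 = water_weight_kg / WATER_DENSITY_KG_M3
        depth_m = volume_m3 / OPENING_AREA_M2
        return depth_m * 1000.0

    # Example report from one bottle: (ID, seconds since last evacuation, weight in kg)
    bottle_id, elapsed_s, weight_kg = ("bottle-17", 3600, 0.25)
    print(bottle_id, rainfall_mm(weight_kg), "mm in", elapsed_s, "s")   # -> 25.0 mm in 3600 s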
But how does the PDA-UV detector calibrate??? I only press a button and it is claimed that it calibrates itself! Wavelength? Absorption? (I suppose absorption) But how?
Where does it have a standard?
I was just curious...
Is it that, for the absorption, it closes the detector and this is taken to mean that absorption is zero?
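(By "absorption" I mean the usual absorbance; in Python terms, something like the following, where I0 and I are the reference and measured light intensities and the names are mine, not the instrument's:

    import math

    def absorbance(I0, I):
        """Absorbance A = log10(I0 / I); it is zero when the measured intensity equals the reference."""
        return math.log10(I0 / I)

    print(absorbance(100.0, 100.0))   # 0.0 -> no absorption
    print(absorbance(100.0, 10.0))    # 1.0 -> 90 % of the light absorbed

So my question is really where the instrument gets its reference from.)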
