It seems to me that this always comes down to the question, "How much risk of being wrong are you willing to take?" We use a single-point calibration standard to verify the quality of a gas we use, mostly because the gas is acceptable for use as long as the analyte concentration is below a certain threshold. First, I think you have to answer that question for your own situation: what risk would be minimized or addressed by adding more calibration standards? Also, how high is the risk of nonlinearity in your detector? Flame detectors are quite linear over a very wide range.
Dilution of gases can be pretty easy. I use the canisters by Restek:
http://www.restek.com/catalog/view/43681
Fill the canister with your standard, vent it to atmospheric pressure, then repressurize it with analyte-free gas to a known pressure (you need a canister with a gauge on it) to dilute it. Use the ideal gas law to calculate your dilution factors, remembering to work in absolute pressure rather than gauge pressure. I've had good success with this approach.
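The gas-law arithmetic behind this is simple enough to sketch. This is a minimal illustration, not the only way to do it; the function names are my own, and it assumes ideal-gas behavior and that all pressures are absolute (add local atmospheric pressure to whatever the canister gauge reads):

```python
def dilution_factor(p_atm_abs, p_final_abs):
    """Fraction of analyte remaining after one vent-and-repressurize cycle.

    p_atm_abs   -- absolute atmospheric pressure the canister is vented to
    p_final_abs -- absolute pressure after repressurizing with analyte-free gas
    (any pressure unit works, as long as both values use the same one)
    """
    return p_atm_abs / p_final_abs


def diluted_concentration(c0, p_atm_abs, p_final_abs, cycles=1):
    """Concentration after repeating the vent/repressurize cycle `cycles` times."""
    return c0 * dilution_factor(p_atm_abs, p_final_abs) ** cycles


# Example: a 100 ppm standard vented to 14.7 psia, then repressurized
# to 58.8 psia, is diluted 4x per cycle.
print(diluted_concentration(100.0, 14.7, 58.8))           # one cycle  -> 25.0 ppm
print(diluted_concentration(100.0, 14.7, 58.8, cycles=2)) # two cycles -> 6.25 ppm
```

Serial cycles let you reach large dilution factors with a modest repressurization pressure, at the cost of compounding any pressure-reading error with each cycle.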