At the risk of putting words into Uwe's post, there really are only two things you can do:
1. Change the sample workup procedure to eliminate the interference. This is most easily accomplished if the interference is quite different chemically from the analyte (a protein, for example).
2. Adjust the chromatography to improve separation. Assuming you are doing reversed-phase, this usually means changing the selectivity, which can be accomplished in one (or more) of six ways.
- change the mobile phase strength (or gradient steepness)
- change the temperature
- change the organic solvent
- change the column chemistry
- change the pH
- use mobile-phase additives
I've listed these roughly in order of ease of use. Without knowing a lot more about your sample and separation chemistry, it's hard to be more specific.
As far as the internal standard question goes: ideally, the internal standard should be as similar as possible to the analyte in terms of chromatographic behavior.
Internal standards add a certain amount of error to the measurement (you have to measure two peaks instead of one). If the precision is good to begin with, that additional error just makes things worse. That is the situation in most LC analyses (certainly with UV detection).
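To put a rough number on it (a standard propagation-of-error estimate on my part, not something from the original post): if the two peak-area errors are independent, the relative standard deviations add in quadrature:

RSD(ratio) = sqrt( RSD(analyte)^2 + RSD(IS)^2 )

so two peaks each measured at 0.5% RSD give about 0.7% RSD on the ratio, which is worse than using the analyte area alone.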
When there are sources of error that affect the IS and the analyte in the same way, however, those errors cancel when the area ratios are calculated. That's the situation in some analyses where sample workup or injection errors are involved. It's also the situation with MS detection, where the ionization efficiency can vary considerably from sample to sample because of matrix effects.
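If you want to see the cancellation numerically, here is a minimal Monte Carlo sketch (my own illustration; the 5% injection RSD and 0.5% integration RSD are assumed numbers, not taken from any real method):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# "True" peak areas; arbitrary values for illustration
true_analyte, true_is = 100.0, 50.0

# Shared error: injection volume (or matrix effect) scales BOTH peaks identically
injection = rng.normal(1.0, 0.05, n)    # 5% RSD, assumed

# Independent errors: integration noise on each peak separately
noise_a = rng.normal(1.0, 0.005, n)     # 0.5% RSD, assumed
noise_i = rng.normal(1.0, 0.005, n)

area_analyte = true_analyte * injection * noise_a
area_is      = true_is      * injection * noise_i

rsd = lambda x: 100 * x.std() / x.mean()

print(f"RSD of analyte area alone: {rsd(area_analyte):.2f}%")            # ~5%: injection error dominates
print(f"RSD of area ratio (A/IS):  {rsd(area_analyte / area_is):.2f}%")  # ~0.7%: shared error cancels
```

Drop the shared injection error to zero and the ratio comes out slightly worse than the raw analyte area, which is the UV situation described above.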
Bottom line: an internal standard should be used only when it improves quantitative precision.