Omitted variables are terrible. If you are beset by them (& unless you are lucky and they are orthogonal to what's included) you are condemned to regression hell: your coefficients are biased and inconsistent, you cannot draw policy-relevant conclusions, and your girlfriend won't love you any more.
The conclusion, then, is to get better data. Say you do, and you now have a previously omitted variable in your dataset. You should include it, right?
Wrong, actually, if the paper below is correct, which it looks to be. The problem is that the standard results in this area assume there is only one omitted variable. If you have two omitted variables, the bias on what's included depends in a messy way on all the correlations between the X's.
Say the model is:
Y=b1*X1 + b2*X2 + b3*X3 [ignoring the constant & disturbance term]
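To see why the correlations matter, it helps to spell out the textbook omitted-variable result (a standard derivation, not something specific to the paper): regressing Y on X1 alone, the probability limit of the estimate picks up one bias term per omitted variable, so terms of opposite sign can cancel:

```latex
% Regress Y on X1 alone, omitting both X2 and X3:
\operatorname{plim}\hat{b}_1
  = b_1
  + b_2\,\frac{\operatorname{Cov}(X_1,X_2)}{\operatorname{Var}(X_1)}
  + b_3\,\frac{\operatorname{Cov}(X_1,X_3)}{\operatorname{Var}(X_1)}
```

If the second and third terms happen to offset each other, the "worst" regression is actually fine, and including X2 alone removes only one of the offsetting terms.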
You don't observe X2 and X3 initially, so your estimate of "b1" is biased. It may seem counter-intuitive, but adding X2 does not necessarily get you a better estimate of "b1". Actually, it's quite intuitive: say omitting X2 was biasing b1 upwards and omitting X3 was having the reverse effect. Then it's quite possible you started with a small [or even zero] bias and adding in one of them makes things worse. I don't think you actually need these opposing biases for the result to hold, because there is also the X2,X3 correlation to worry about.
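The opposing-biases story is easy to check with a quick simulation (a sketch with made-up numbers, not from the paper): X2 is built to bias b1 upwards, X3 to bias it downwards by the same amount, so omitting both leaves b1 roughly unbiased while adding X2 alone wrecks it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# True coefficients (hypothetical values chosen for illustration).
b1, b2, b3 = 1.0, 1.0, 1.0

# Correlations chosen so the two omitted-variable biases cancel:
# corr(X1, X2) = +0.5 pushes b1 up, corr(X1, X3) = -0.5 pushes it down.
cov = np.array([[ 1.0, 0.5, -0.5],
                [ 0.5, 1.0,  0.0],
                [-0.5, 0.0,  1.0]])
X1, X2, X3 = rng.multivariate_normal(np.zeros(3), cov, size=n).T
Y = b1 * X1 + b2 * X2 + b3 * X3 + rng.normal(size=n)

def ols(y, *cols):
    """Least-squares fit (no constant; everything is mean zero)."""
    Z = np.column_stack(cols)
    return np.linalg.lstsq(Z, y, rcond=None)[0]

short = ols(Y, X1)       # omit X2 AND X3: the biases cancel
long_ = ols(Y, X1, X2)   # add X2 only: the cancellation is destroyed

print(f"b1 omitting both X2 and X3: {short[0]:.3f}")  # close to 1.0
print(f"b1 after adding X2 only:    {long_[0]:.3f}")  # well below 1.0
```

With these numbers the "worse" regression recovers b1 almost exactly, while the regression that dutifully adds the newly observed X2 is badly biased downwards.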
It's rather analogous to the Second Best Theorem in welfare economics, due to Lipsey & Lancaster.
The practical problem is that there may always be an "X3"; that is, typically you cannot be sure that you have all the relevant variables. It's all rather disturbing.
The Phantom Menace: Omitted Variable Bias in Econometric Research, Kevin Clarke