The energy devoted to establishing the truth or falsity of conjecture X should grow with the distance between (optimal action conditional on X) and (optimal action conditional on not X).

If you are going to do the same thing whether X holds or not, who gives a F about X?
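
To make the intuition concrete, here is a minimal value-of-information sketch in a toy two-action decision problem (the utilities and probabilities are made up for illustration, not taken from the post). When the same action is optimal under X and under not-X, learning X is worth exactly zero; the value only shows up when the optimal actions diverge.

```python
# Toy value-of-information calculation (illustrative numbers only).
# Actions map to a pair of payoffs: (payoff if X, payoff if not X).

def value_of_information(p_x, utilities):
    """Expected gain from learning whether X holds before acting.

    p_x       : prior probability that X is true
    utilities : dict mapping action -> (payoff if X, payoff if not X)
    """
    # Best we can do if we must commit to one action without learning X.
    ev_uninformed = max(
        p_x * u_x + (1 - p_x) * u_not_x
        for u_x, u_not_x in utilities.values()
    )
    # Best we can do in each state if we learn X first, then act.
    best_if_x = max(u_x for u_x, _ in utilities.values())
    best_if_not_x = max(u_not_x for _, u_not_x in utilities.values())
    ev_informed = p_x * best_if_x + (1 - p_x) * best_if_not_x
    return ev_informed - ev_uninformed


# Optimal action differs across X / not-X: information is worth something.
print(value_of_information(0.5, {"a": (10, 0), "b": (0, 10)}))  # 5.0

# Same action optimal either way: learning X is worth nothing.
print(value_of_information(0.5, {"a": (10, 8), "b": (3, 2)}))   # 0.0
```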