Note, November 9, 2015: Commenter jonshea figured out my proposed reason.
Note: A hint to the proposed answer for this puzzle is offered by another, more recent post here.
Large amounts of electrical power are generally transmitted via high-voltage lines. The reason for this is pretty basic:
Transmitting power \(P\) over an electric line involves a voltage \(V\) and a current \(I\) related by \(P = V \times I\). The resistive loss \(L\) on a line with resistance \(R\) is \(L = R \times I^2\). So, for a given amount of power, the higher the voltage, the lower the current, and the lower the loss: doubling the voltage halves the current and quarters the loss. Lower losses are better for two reasons: more of the power you put in at one end of the line arrives at the other end, and resistive loss heats the line, which causes problems up to and including melting the wire.
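To make the scaling concrete, here is a small sketch that applies \(I = P / V\) and \(L = R \times I^2\) to an illustrative case. The specific numbers (100 MW transmitted, 10 Ω of total line resistance) are assumptions chosen for illustration, not data about any real line:

```python
# Illustrative only: compare resistive loss for the same power at two voltages.
# P, R, and the voltages below are assumed example values, not real line data.
P = 100e6   # power to transmit, in watts (100 MW)
R = 10.0    # total line resistance, in ohms (assumed)

for V in (130e3, 750e3):       # line voltage, in volts
    I = P / V                  # current required, from P = V * I
    loss = R * I**2            # resistive loss, from L = R * I^2
    print(f"{V/1e3:.0f} kV: I = {I:.0f} A, "
          f"loss = {loss/1e6:.2f} MW ({100 * loss / P:.2f}% of P)")
```

With these assumed numbers, the 130 kV line loses roughly 5.9 MW (about 6% of the power sent) while the 750 kV line loses under 0.2 MW, which is the whole argument for high-voltage transmission in two lines of arithmetic.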
As a consequence, the electrical transmission system runs at high voltages: ~130 kV, ~250 kV, ~500 kV, and ~750 kV are not uncommon. But the author has never seen a power line with a higher voltage, even for the biggest transfers over the longest distances. Why do the designers not go to higher voltages?
The author has a rather neat idea—never confirmed by actually talking to a power-system engineer—about why that should be so.
If you can guess the reason, please comment. Points will be awarded for guessing both my idea and the actual reason (if they differ).