What is a "value handshake"?

A value handshake is a hypothesized form of trade between superintelligences: when two AIs with incompatible utility functions meet, instead of going to war, they might agree to split the universe into regions whose sizes are proportional to their respective military strength or chance of victory. After all, with superhuman prediction abilities, they would likely know the outcome of a conflict before any attack happens. If their utility functions are compatible, they might even merge into a single AI whose utility function is a weighted average of the two originals.
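The "weighted average" idea can be made concrete with a toy sketch. Everything here is illustrative: the function names, the use of a single weight standing in for relative power or estimated chance of victory, and the toy outcome values are all assumptions, not anything specified by the concept itself.

```python
# Toy sketch of merging two agents' utility functions into a weighted
# average. The weight p_a is a stand-in for agent A's relative power or
# estimated probability of winning a conflict; all names are hypothetical.

def merge_utilities(u_a, u_b, p_a):
    """Return a utility function that is a weighted average of u_a and u_b.

    u_a, u_b: functions mapping an outcome to a real-valued utility.
    p_a: weight given to agent A, between 0 and 1; agent B gets the rest.
    """
    def merged(outcome):
        return p_a * u_a(outcome) + (1 - p_a) * u_b(outcome)
    return merged

# Two agents that value different features of the world.
paperclip_utility = lambda world: world["paperclips"]
staple_utility = lambda world: world["staples"]

# Agent A is judged 70% likely to win, so its values get weight 0.7.
combined = merge_utilities(paperclip_utility, staple_utility, p_a=0.7)
world = {"paperclips": 10, "staples": 20}
print(combined(world))  # weighted average of the two utilities
```

The merged agent then pursues a blend of both goals rather than either original goal exclusively, which is what makes the merge a form of trade.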

This could happen if multiple AIs are active on Earth at the same time. If at least one of them is aligned with humans, the resulting value handshake could leave humanity in a pretty okay situation.

On the other hand, the possibility of value handshakes poses a challenge to any strategy that relies on multiple non-aligned AIs keeping each other from getting too powerful: a handshake would let them cooperate instead, leaving humanity in a weaker position.

See "The Hour I First Believed" by Scott Alexander for some further thoughts and an introduction to related topics.