Discussion about this post

The Water Line:

Worth noting that existential risks are not limited to risks that would end humanity. They also include the lock-in of bad trajectories (e.g. authoritarian dictatorships, indifference to animal torture, etc.). So there may still be a case for longtermism if we think humans are more likely to achieve decent values than aliens.

There may also be value in humanity's survival for the sake of diversity in the universe, if only to guard against a disvaluable lock-in later on.

Will:

Objections:

1. Do we have strong reasons to think that morally valuable aliens will also be morally upright aliens? Maybe they'll be a moral catastrophe. Maybe they practice ritualistic torture or factory farming that makes our version look like paradise.

2. You're arbitrarily limiting existential risk to the destruction of humanity, but human-caused disasters could affect aliens as well. Probably not biotech, but AGI certainly could.

Both of those arguments seem plausible to me and very substantially weaken your argument for higher volatility.

Having a higher prior on alien civilizations existing should perhaps make us somewhat more willing to speed up tech development instead of prioritizing safety, since other alien civilizations may presumably prioritize safety less or have worse values, but I don't think it transforms longtermism nearly as much as you think.

