Discussion about this post

Lavander

>the AGI design should be widely separated in the design space from any design that would constitute a hyperexistential risk

https://www.lesswrong.com/w/separation-from-hyperexistential-risk

Joe Edelman

I've thought about this too. But you left out some of the most important drivers:

- Having any goal makes you want to grab power, which makes you more vulnerable to existing incentives (often the very incentives you didn't like!)

- Having any goal makes you want to grab power, and then your movement fills with sociopaths who just want power.

- Having any goal makes you want to grab power, and it's hard to know when to stop: when to switch from grabbing power to spending it.

-> The more scared you are, the more likely you are to keep grabbing power rather than stop at the right time.

- Fear-based motivations lower your bit-rate & make you see the world in black-and-white terms related to your fear.

-> This means they reduce the range of policies you'll consider or invent (Pause AI, Defund the police, etc)

-> This creates a bitter policy environment where either nothing can get done, or your opponents look more reasonable.
