Comments on “Fear power, not intelligence”


The concerning superpowers have, in fact, been specified

Robert 2023-02-22

The AI risks literature generally takes for granted that superintelligence will produce superpowers, but which powers and how this would work is rarely examined, and never in detail.

I don’t know what level of detail you expect, but the go-to example of a concerning superpower is nanotechnology, which has the twin properties of:
1. having an existence proof demonstrating its feasibility
2. being obviously sufficient to cause human extinction

Maybe this is rarely described in papers posted to arXiv, but Eliezer is hardly shy about explaining it, and Nate also describes it in detail here.
