I think alignment is necessary, but not sufficient, for AI to go well. By alignment, I mean being able to get an AI to do what we want it to do, without it trying to do things that basically nobody would want, such as amassing power to prevent its creators from turning it off.
How could things go wrong?
Moving fast and breaking things