" Human individuals and human organizations typically have preferences over resources that are not well represented by an "unbounded aggregative utility function". A human will typically not wager all her capital for a fifty-fifty chance of doubling it. A state will typically not risk losing all its territory for a ten percent chance of a tenfold expansion. [T]he same need not hold for AIs. An AI might therefore be more likely to pursue a risky course of action that has some chance of giving it control of the world. "

Nick Bostrom, Superintelligence: Paths, Dangers, Strategies
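
As a rough sketch of the contrast the quote draws (assuming standard expected-utility reasoning; the specific utility functions below are illustrative and not taken from the book): for a fifty-fifty double-or-nothing wager on current wealth W, an unbounded linear ("aggregative") utility is indifferent to the gamble, while a concave, human-like utility such as the logarithm rejects it outright.

\[
\text{Linear } u(x) = x:\quad \tfrac{1}{2}\,u(2W) + \tfrac{1}{2}\,u(0) = W = u(W)
\]
\[
\text{Logarithmic } u(x) = \log x:\quad \tfrac{1}{2}\log(2W) + \tfrac{1}{2}\log(0) = -\infty < \log W
\]

Under the linear utility, any gamble with even a slight positive expected resource gain becomes worth taking, however large the possible loss, which is the behavior the quote attributes to an AI with such preferences.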

