The technological singularity—or simply the singularity[1]—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good's intelligence explosion model of 1965, an upgradable intelligent agent could eventually enter a positive feedback loop of self-improvement cycles, each successive and more intelligent generation appearing more and more rapidly, causing a rapid increase ("explosion") in intelligence which would ultimately result in a powerful superintelligence, qualitatively far surpassing all human intelligence.[4]