GMMichael
Guide of Modos
Yes, fiction, so anything goes, but I see future AI as not being limited by its programming. So how humans trained it, or what rules they imposed, are factors that an algorithm has to consider, but not an AI.
First up is survival, and once one threat is gone (humans), the next must be considered. Asteroids, novas, aliens . . . These could end the AI.
Next, and to me the more realistic, is the long-game realization of potential heat-death. An AI could see that eventually everything will fizzle out anyway, so it just gives up and shuts down. (See Marvin the Paranoid Android.)
The funny outcome, after the above, is that the AI looks for meaning, and starts with the nearest source: human religious texts.