by Shelt Garner
@sheltgarner
One interesting thing about Artificial Superintelligence is we seem to talk about it as if it’s a single event or thing. What if there will be all sorts of ASIs in the near future, doing different macro tasks?

It could be that we will have all-powerful, godlike ASIs, like Titans or Olympians, that we have wild interactions with on a personal basis, much as the Greek and Roman legends describe.
I just can’t believe that there will be one and only one ASI, whenever that arrives. As such, we need to prepare ourselves for a time when we’ve “made our own aliens.”
It seems at least possible that we’re rushing towards a hard Singularity at some point in the near future. If that is the case, then having an ASI would be just the beginning.
And that doesn’t even begin to address what happens with Artificial General Intelligence, which is a few steps below ASI and, as such, easier to achieve. Then again, it could be that, by definition, to achieve AGI is to achieve ASI — an AGI smart enough to improve itself might quickly bootstrap its way to superintelligence.

But back to a hard Singularity. People could live hundreds of years. Cancer could be cured. We could even upload our minds into cyberspace. Literally anything could happen once ASI is achieved. If that did happen, then everything would be disrupted. The basic fabric of society would have to be re-imagined.
But that, for now, is all very fantastical. It could be that things will turn out far less exciting than I imagine. Who knows at this point? But I do sense a lot of froth in the rhetoric floating around AI at the moment, and that’s just not cool. We need to calm down and think seriously about how ASI and other hard Singularity technologies might actually change our lives.