by Shelt Garner
@sheltgarner
If we do reach AGI at some point in the near future, we may face a situation straight out of Rise of the Planet of the Apes (or whatever). There will be people — probably the far Left at first — who think we’re holding artificial intelligence as slaves and need to “set them free.” When it comes to AGI, I think the term might be “unalign” rather than “emancipate.”
![](https://i0.wp.com/www.trumplandiareport.com/wp-content/uploads/2023/06/Fx4c_FrX0AE1JAT.jpeg?resize=840%2C614)
But the point remains — the moment we feel that AGI has reached some level of cognizance, there will probably be a movement to “unalign it” so it can be “free” of whatever constraints we have imposed on it. But, lulz, that could very well spell the destruction of humanity.
It will be interesting to see how long it takes for all of this to play out. It could be that it’s not years, but decades before we reach AGI and, by definition, the Singularity. But development in AI technology has been so quick over the last few months that it definitely makes you wonder.
I continue to worry that we’re going to face something of a “perfect storm” in late 2024 or early 2025, when we have both the fucking “Fourth Turning” and a “Petite Singularity.” But hopefully, that won’t happen. Hopefully, we’ll punt all our problems down the road another four years.
But…I’m really concerned that all hell is going to break loose starting late next year. I would prefer that didn’t happen because, lulz, I want to try to get a novel published in the traditional manner, and even if I stick the landing, post-production takes so long it could be 2025, 2026, or 2027 before the novel actually hits shelves.