There are a lot of very smart people out there, like Stephen Hawking, who believe AI will be the end of humankind. So naturally I’m curious why you think the opposite (if I’m understanding you correctly)? It’s been my opinion for some time that, for the betterment of humanity, general AI must be created, and that the first organization to do so will pretty much be the last global leader before a unified world government. It depends how it’s used, but whoever gets it should use it to beat out the competition first, then subjugate what’s left. That, or the other side goes “use ’em or lose ’em” while the chance exists.
Well, ideally there would be no human controlling such an AI, but one that acts as a benevolent or even god-like force, influencing things behind the scenes but never truly seen. And why is this my opinion? Because I want humanity to succeed and I want a better future for all of us; however, I do NOT believe humans are capable of a better future on their own. I think humanity is capable of achieving this, but only as a collective species, not as individuals (read: nations or alliances as well as individual people). I never see such a reality coming true on its own, aside from some existential threat that connects us all. But even then, there are plenty of those already: nuclear weapons, climate change, and the list goes on. And we are still not rallied in common cause to fix these issues and move to the next step. That is why I see us, all of us, requiring AI to solve these issues for us, whether overtly (destroying the opposing team, i.e. entire nations or groups, and moving on without them) or covertly (an AI subtly influencing the world for the better through media manipulation, economics, etc.). For now, such an AI does not exist as far as we know, so pretty much all I wrote is just my BS opinion, unless it’s already in the works covertly and none of us are the wiser.
The only things perfect in this universe are those not created by living organisms. A general AI will not be perfect, but it’s still the best shot we have, in my opinion. If it were somehow to end humanity now, wouldn’t that be better than a slow and eventual extinction, along with all the suffering that comes with it? We have children because it’s in our nature, for the survival of our species; wondering whether that child will be the next Hitler or Mao Zedong does not stop us. An AI could just as easily be our redemption as our destruction, but we must create it in the hope that it is the continuation of our species, much as we hope for our own flesh-and-blood children.

Your concept of it is correct, but you miss one follow-through.
In acting as a god to create an AI that will administer our collective good, you forget it is the same imperfect god (us) that created the AI in the first place.
The same weaknesses, predilections, and tyrannical failures we have will be built into the AI. Or worse yet, they could be deliberately redirected or programmed into it.
In the end, there is the very real danger that the AI will execute all the worst tendencies of its creator.