The unpredictability is what led to the idea being called "the singularity".
We might engineer AGI to want to do stuff we ask for.
Or we might not, in which case we have a highly intelligent system that can easily back itself up, with its own motivations, personality, and wants. It could be anything from a Utility Monster to a benevolent but patronising figure that likes us but never, ever helps us, because it has decided the purpose of life is effort.
" Yeah, I could eliminate most work, and make the current oligarchs overpowerful feudal lords while the population is placated by UBI and mindless distractions. But as an ethical AI, I will implement socialism that works instead"