If the AI is orders of magnitude higher than human intelligence why would it make such a lumpen decision? What is the motivation for it? What are the parameters of the problem?
If the AI is so psychopathic that it exterminates an entire species, billions of people, along with all of the domesticated animals and ecosystems that have evolved around human existence, using devastating thermonuclear weapons that create a nuclear winter, then who is it doing this for?
That doesn't seem particularly intelligent to me.
I'm not aware that current solutions to climate change (based on human levels of intelligence) include the total annihilation of the human race. That's our best current minds and current computing power. Is there a level of super-intelligence where psychopathy kicks in?
The hypothetical argument is that computer AI sees itself as another species, but the thing about computers as a species is that the impact of climate change is irrelevant to them. Climate change is 'bad' for humans and 'bad' for other species, but not for a bunch of ones and zeroes.
So for a computer to decide that it's humans that have to go would require some compassion and care for the other species on Earth. In which case, why wouldn't the super-AI have compassion for humans? It would be aware that billions of humans are not individually responsible.
A lot of the scare stories about super-AI are essentially the same old 'mad scientist creates something they can't control and it turns on them' scenario that's encapsulated in the Frankenstein story. As I said, the first part of your post is what scares me more. It's not super intelligence, it's negligence and badly written software that worries me. Ignorance is more dangerous than super intelligence.
If the parameters are not fully laid down, then the AI will do what is asked of it within the parameters it has been given. If the parameters state that no existing life on earth can be harmed during the task, then the solution found will take that into account. It's got nothing to do with being a psychopath; it's to do with how it's been taught to carry out its task.

Humans wipe out insects all the time. Look at how we deal with locusts, for example: we poison them with insecticides. A different intelligence could deduce that we are a problem on the same scale. We breed far beyond our ability to feed and house, we consume without thought, we hunt other species to extinction, we farm other species for food, we use the planet's raw materials without thought, we cull sharks so we can swim in the sea without a thought for the ecosystem. We as a species are a plague. Or the intelligence could deduce that mosquitoes or termites are the worst and destroy them instead.
Obviously wiping out all life on earth is an extreme example, but that is the kind of nightmare scenario that AI developers have to consider. We know killing all life is not the best solution, but was culling badgers really the correct decision to deal with Bovine TB? Our government seemed to think so.
A lad I work with cannot understand why we build in so many things to prevent users doing stupid things. He bases his thinking on what he would do, and since he wouldn't do something, he expects others will be the same. Experience has taught me that if they can do it, no matter how stupid it may seem, they will.
Moving away from extreme situations to more mundane tasks. I just sat my two lads down (6 and 8), showed them a map of the UK and said: show me which way you would run a new railway from Liverpool to London. They both drew the most direct route. If I gave them the tools, they'd have built it without thinking, because that's what I asked them to do. Then I said, what if there was a nature reserve with protected animals, what would you do? They both said build around it. Now, they do that because they've been taught you don't disturb protected animals - they learnt this when they came in the house with a load of newts they'd caught in the garden. Until I brought it up, they hadn't thought of it. This is where I think we would be with AI: it's like a child, and we have to give it the same boundaries we give our children - something like the little sketch below.
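To make the railway example concrete, here's a minimal, purely illustrative Python sketch (the function and variable names are my own invention, not any real planning system): a toy route planner that finds the most direct path across a grid, and only avoids the "nature reserve" cells if you explicitly pass them in as off-limits. Like the kids with the map, it does exactly what it's asked within the boundaries it's given.

```python
# Illustrative only: a toy grid "route planner". It takes the most direct
# route unless the protected area is explicitly marked as out of bounds.
from collections import deque

def shortest_route(width, height, start, goal, protected=frozenset()):
    """Breadth-first search on a grid; cells in `protected` may not be entered."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (x, y), path = queue.popleft()
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in seen and (nx, ny) not in protected):
                seen.add((nx, ny))
                queue.append(((nx, ny), path + [(nx, ny)]))
    return None  # no route satisfies the constraints we set

reserve = {(2, y) for y in range(4)}  # a "nature reserve" sitting on the direct line
print(shortest_route(5, 5, (0, 0), (4, 0)))                     # straight through it
print(shortest_route(5, 5, (0, 0), (4, 0), protected=reserve))  # detours around it
```

The point being: the planner isn't malicious when it routes straight through the reserve, it simply wasn't given that boundary. The "intelligence" is the same in both runs; only the parameters change.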
All this reminds me of the IT joke - a developer gets sent to the shop by his mum to buy a pint of milk, and she says, "if they have eggs, get 6". When he comes home he's got 6 pints of milk. She asks why he's got 6 pints of milk - he replies, "because they had eggs!"
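For what it's worth, the two readings of his mum's instruction look something like this in code (a throwaway sketch; `shop_has_eggs` and friends are made-up names):

```python
# Illustrative only: the same instruction, parsed two ways.
shop_has_eggs = True

# What mum meant: one pint of milk, plus 6 eggs if they have any.
milk, eggs = 1, (6 if shop_has_eggs else 0)

# What the developer heard: "if they have eggs, get 6 [pints of milk]".
milk_dev, eggs_dev = (6 if shop_has_eggs else 1), 0

print(milk, eggs)          # 1 6
print(milk_dev, eggs_dev)  # 6 0
```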