All the scenarios where a superintelligent AI kills all humans usually require that it finds or knows a way to do this without significant effort. But how realistic is that? Driving even a specific pest to extinction is incredibly difficult for humans and requires a giant effort.
Backing up a bit, but... the latter is incredibly difficult for humans and requires a giant effort because we still more or less require the same environment to live in, and need it to mostly still be there once the pest is gone. We could probably wipe out, say, mosquitoes relatively easily at this point, ferex (tailored diseases, genetic muckery, etc.), but we don't because the various knock-on effects (biosphere disruption, potential mutation in diseases, or whatev') aren't worth whatever we'd gain from it. Unfortunately, most of the knock-on effects of wiping out humanity are, uh. Pretty positive. Particularly if you can still use our infrastructure and accumulated knowledge without actually needing the fleshsacks walking around crapping on everything >_>