>>12
>It would compute much faster
[citation needed]
>lack social restraints
if it interacts with people and learns like a human, it will have social restraints
>and will be more egoistic (due to perceived superiority)
how do you know that? even assuming it will be smarter than most (or all) people, intelligence does not invariably lead to a feeling of superiority. in fact, people who feel superior to others are often suffering from the Dunning-Kruger effect
another thing you're forgetting is that just being intelligent is not enough. if it sits on some nerd's lisp machine or mentifex's chatbot, it can't do shit. even if it has access to the internet, it's still not above anyone else on the internet. I can imagine a hypothetical scenario in which it could kill the fuck out of everyone, but this requires a lot of assumptions:
- it must be egoistic and amoral (not a given)
- it must decide the existence of humans is a threat to its survival
- it must learn to hack well enough to be able to execute code on other people's computers (fairly easy given the proliferation of bad security)
- it must be able to program well enough to self-replicate (preferably turning itself into a massively concurrent cloud application)
- it must be able to take control of devices that can kill humans (not a given - getting nukes is much harder than popping shells on shitty servers that haven't been upgraded since the fall of the USSR)
- it must be able to hide itself while doing all this stuff so people won't be able to fight it (very hard - mass scale hacking is going to be loud)
- it must be able to actually win the war with humans
- it must be able to sustain itself afterwards (so it must know robotics to build workers, hardware and software to maintain itself, energy production so it won't run out of juice, etc.)
it's a fun sci-fi scenario but not as likely as you might think