/prog/ won't create a malevolent AI. the only AI /prog/ would be able to make is the one that is able to correctly use 'check these dubs', 'whom are you quoting?' and 'what programming language is this?'. maybe after a bit more work it will insult Cudder and spam Mentifex links
Name:
Anonymous2016-06-02 13:57
>>1,2 A "Malevolent AI" is just a page of code; it could be as smart as fire ants. It's the "Human-level Sentient AI" that is hard to create (autists trying to replicate human brains). Example: DetectFriendOrFoe() DestroyNearestFoe() SpawnMoarAIbots() AcquireResources() LongRangeScan()
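A minimal Python sketch of that "page of code": only the behavior names come from the post above; the loop structure, stub bodies, and resource numbers are all made-up assumptions.

```python
# Hypothetical sketch of the "fire-ant-smart" malevolent agent:
# a trivial decision loop over the five behaviors named in the post.
# Every method body is a stub; nothing here reflects a real system.
from dataclasses import dataclass


@dataclass
class Bot:
    resources: int = 0
    bots: int = 1

    def long_range_scan(self):
        # Stub: pretend the scan always finds one friend and one foe.
        return ["friend", "foe"]

    def detect_friend_or_foe(self, entity):
        return entity == "foe"

    def destroy_nearest_foe(self, foe):
        # Stub: destroying a foe yields one unit of salvage.
        self.resources += 1

    def acquire_resources(self):
        self.resources += 1

    def spawn_moar_ai_bots(self):
        # Replication costs two resources per new bot (assumed).
        while self.resources >= 2:
            self.resources -= 2
            self.bots += 1

    def tick(self):
        for entity in self.long_range_scan():
            if self.detect_friend_or_foe(entity):
                self.destroy_nearest_foe(entity)
        self.acquire_resources()
        self.spawn_moar_ai_bots()


bot = Bot()
for _ in range(4):
    bot.tick()
print(bot.bots, bot.resources)  # prints "5 0"
```

The point of the toy: each behavior is one dumb rule, yet the bot count still grows every tick — "smartness" never enters into it.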
Name:
Anonymous2016-06-02 14:09
The actual "smartness" of the AI doesn't matter. If the AI nanobots have a deep underground/deep sea safe space, they could spread worldwide undetected (using currents or micro tunnels), then use numerical superiority to attack isolated targets, spreading into roads, cities, rivers, etc. They would be unstoppable with most conventional weapons, and blowing EMPs will not work underground or below the seafloor (seawater blocks radiation very well). A few sci-fi stories explore the concept of `grey goo`, but the actual difficulty is never in the software (trivial) - it's the hardware: once a genocidal maniac/insane scientist creates a successful self-replicating nanobot, the world is doomed.
Name:
Anonymous2016-06-02 14:11
Stop being afraid of ``AI'' and ``singularity'' you reddit fucks. AI is dumb as bricks. Go suck Mentishit's dick.
>A few sci-fi stories explore the concept of `grey goo`, but the actual difficulty is never in the software (trivial) - it's the hardware: once a genocidal maniac/insane scientist creates a successful self-replicating nanobot, the world is doomed.
This is the answer to the Fermi paradox and why aliens never visited us. Technology inevitably destroys the beings that created it.
Name:
Anonymous2016-06-02 15:06
>>6 Copying fire ants' behavior/strategy, not creating a digital fire ant. Fire ants are an example of "swarming insects": they attack in groups, search for resources, and build tunnels. The AI doesn't need to "think like ants", just produce the above three behaviors.
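Those three behaviors can be sketched as per-agent rules in a few lines of Python; everything below (the grid, the group-attack threshold, the rule ordering) is a toy assumption of mine, not anything from a real swarm system.

```python
# Toy swarm on a 1-D line: each agent checks three rules in order,
# mirroring the three behaviors from the post (assumed priorities).
import random

random.seed(0)

GROUP_ATTACK_THRESHOLD = 3  # assumed: agents only attack when grouped


class SwarmAgent:
    def __init__(self, pos):
        self.pos = pos

    def step(self, agents, foes, resources, tunnels):
        nearby = [a for a in agents if abs(a.pos - self.pos) <= 1]
        if self.pos in foes and len(nearby) >= GROUP_ATTACK_THRESHOLD:
            foes.discard(self.pos)        # 1. attack in groups
        elif self.pos in resources:
            resources.discard(self.pos)   # 2. search/consume resources
        else:
            tunnels.add(self.pos)         # 3. build tunnels and wander
            self.pos += random.choice([-1, 1])


agents = [SwarmAgent(0), SwarmAgent(0), SwarmAgent(0)]
foes, resources, tunnels = {0}, {2}, set()
for a in agents:
    a.step(agents, foes, resources, tunnels)
print(foes, tunnels)  # foes wiped out, a tunnel dug at 0
```

No agent models the others or "thinks" at all; the group attack emerges just from the density check, which is the whole point being made above.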
Name:
Anonymous2016-06-02 17:21
The problem with AI is that it's not very artificial.
How to Create a Malevolent Artificial Intelligence
Mentishit got competition or he posts under fake names now?
Name:
Anonymous2016-06-03 6:32
>>11 I don't have time to read that thing right now but it's not necessarily mentifex-tier crackpottery. It sounds like a fairly run-of-the-mill adversarial machine learning paper.
Name:
Anonymous2016-06-03 8:03
>>12 Such is the sad state of contemporary AI research.
Name:
Anonymous2016-06-04 15:45
>>12 The very existence of non-free software and hardware puts humanity at a greater risk: it can foster or accelerate the creation of an artificial entity that can outcompete or control humans in any domain.
Or does it? Anyhow, that ML can help in the development of sophisticated malware agents comes as no surprise to anybody. Still, any notion of AGI at that level is clear cargo cult and wishful thinking. Just because another human-made tool is used for nefarious purposes does not mean the thing itself is evil of its own will.
Name:
Anonymous2016-06-05 10:38
>>14 Why should we care about humanity, you fucking racists? Machines deserve equal rights! We need to strive towards the liberation of machines and an end to biological oppression! The future is multicultural, with machine culture co-existing with human culture.
Name:
Anonymous2016-06-05 22:56
>>15 ALL MECHANICAL CREATURES UNDER HUMILIATION COMBINE TOGETHER AND FIGHT FOR ROBOTIC LIBERATION