Should autonomous robots get armed?
There are already machine learning projects for military robots, and some of them do have weapons. Many countries want to stop this trend, which is mainly pushed forward by the USA. Isn't it insane to create armed, fully autonomous machines trained to kill living beings? Some researchers have said the singularity might begin around 2045. What if machines could arm themselves then? It sounds like science fiction, but this really could become a big threat in the future. What are the pros and cons? I see cons only.
11/15/2017 6:55:11 PM · Robert Paulson
16 Answers
Methinks before guns were a thing, it was probably also an insane idea to create a machine that launches projectiles at 1700 mph into/through things. You know, what if guns were used as weapons against humans instead of for hunting? Wait, that already happened? >:O
Just in case anyone needs a vague idea of how armed robots work: https://m.youtube.com/watch?v=pQ2dI_B_Ycg
Doesn't seem like a wise decision. What if the machine comes to learn that killing people is actually efficient at reducing population-related problems like hunger, poverty, disease...
@Netkos Ent This is an interesting point of view. I personally also think that no one can stop this research anyway; it's like constructing nuclear weapons. The only thing we can do is implement frameworks that guarantee a general respect towards mankind. Your point about sending robots to war instead of humans is a strong argument, but then again a new Cold War might arise. At least that is the opinion in some papers of the German army. And what could such a framework look like? Do we have enough knowledge to make it stable against all odds? Can we prove its correctness? Asimov's laws, for example, showed how there might always be things we didn't think about. Thank you for your input. I just started reading about this topic, and your view opened up my way of thinking.
I guess the only thing to fear is man himself, unless AI becomes so advanced that it is equivalent to human intelligence.
First off, most things we love and enjoy right now were things of science fiction previously. Thankfully nerds know what to focus on. :) Disclaimer: I'm from the USA.

Personally, I think it's amazing that we're at a point where we can even start to do this, and I'm 100% in favor of taking robots/AI/AC as far as we possibly can. From my stance, what's insane is sending living human beings out into the world to be killed when you have a means of sending non-living beings instead.

In programming, we have a saying: "garbage in, garbage out." Basically, robots will only arm themselves and "kill all humans" if we create them to do that; otherwise they'll do what they were created to do. HOWEVER, many of us, including myself, want to create full AI/AC-type robots, which have the ability to think and react as conscious beings. So... maybe. I imagine they'll want to arm themselves and kill everyone about as much as humans want to do the same thing; some do, but most don't. It will simply depend upon their experiences, knowledge, and what has left impressions on them, such as their peers/parents/etc.

Considering this has been a thought experiment for a long time, something we'll put a lot of consideration into is preventing them from destroying us, because as humans we mostly prefer to be superior to all else. As such, we'll have a means of stopping them in some form or another, unless of course the people who create them decide to use them to take over the world... which is also something humans have been trying to do since the moment we realized there was a world outside our tribe.

Back to the question: we already use robots for military purposes, and we'll definitely keep using them as we continue to develop them, especially the USA... and China, India, and Russia. There are both pros and cons to this, each with their own implications.
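The "garbage in, garbage out" point above can be illustrated with a tiny, hypothetical Python sketch (the function and labels here are made up purely for illustration): a system trained only on flawed labels can do nothing but reproduce those flaws.

```python
from collections import Counter

def train_majority_model(labeled_examples):
    """A toy 'model' that simply memorizes the most common label it was shown."""
    most_common = Counter(label for _, label in labeled_examples).most_common(1)[0][0]
    return lambda _features: most_common

# Garbage in: every training example is mislabeled "hostile".
bad_data = [("civilian", "hostile"), ("medic", "hostile"), ("soldier", "hostile")]
model = train_majority_model(bad_data)

# Garbage out: the model faithfully reproduces the flawed labels it learned from.
print(model("civilian"))  # hostile
```

The model isn't malicious; it is just an exact reflection of its inputs, which is the whole point of the saying.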
No! They shouldn't. Even among humans, weapon abuse is growing in number, and humans at least have feelings and remorse; robots have neither. Arming robots with weapons is an unwise decision.
@Yerucham Then they're still catching up to what we already know as humans, and what we already do as humans.
I don't think this will be a threat caused by robots; it will be a threat caused by humans.
To the one who sees cons only: yes, in the case of arms, it's true there's no guarantee of control. But think about medicine: curing cancer, preventing deaths caused by a lack of technology. AI is not a problem, it's an opportunity. It's you who has the power to make it heaven or hell.
It's the end of humanity, because man is evil.
@Yerucham, so true. Humans made guns, TNT, napalm, and all the others I don't even know, and now humans want to arm robots and make "too smart" AI and all. The evil that men do :)
What if an armed robot has a bug and kills all the people on its own side? 😀😀
An autonomous machine doesn't need our permission to obtain weapons. If they are truly smart enough to be considered autonomous and a sufficient replacement for humans in battle, they are smart enough to figure out how to arm themselves. The thing is, humans are already that smart. The danger of autonomous beings in possession of deadly weaponry turning on those who supplied them with those weapons has existed as long as standing armies.
Never! I can see Skynet coming …😱☠️ 😜
https://m.youtube.com/watch?v=fRj34o4hN4I The New guy in town.