Should the speed of developing self-driving cars be controlled or not? | Sololearn
+ 2

Should the speed of developing self-driving cars be controlled or not?

Too much use of AI can be harmful.

6th Jul 2018, 3:05 PM
mad0001
6 Answers
+ 3
"too much use of AI can be harmfull" Good method of starting a debate is to present your opinion on the matter and provide us with examples or specific information as to why you believe in your opinion. So, too much AI can be harmful, why do you believe that and in what ways would it be harmful? Also, if you're on the side of it being controlled, who is controlling it? Who controls them? Why would slowing down the speed at which we create those cars be beneficial? How would it be harmful? In my opinion, I'm a huge believer in the Open AI projects taking place and believe that everyone should have open access to all of the new technology that's coming up. I believe that we should advance our species/technology as quickly as possible to ensure the survival of our species in the long term. The longer we wait to prepare for mass extinction events (whether by nature or our own doing), the better off we'll be. In regards to self-driving cars specifically, that's also a wonderful set of technologies and I think will be a great thing to progress sooner than later. Have you seen most people drive? Just like not everyone has surgeon hands, not everyone was meant to be drivers either. Self-driving cars will save a lot of lives and allow us to play on our phones while we're "driving" somewhere. lol
6th Jul 2018, 3:26 PM
Fata1 Err0r
Fata1 Err0r - avatar
+ 3
No worries. Impossible to offend me. lol I love a good debate, especially on subjects I find interesting. Machines break and malfunction, so I'd never assume them to be 100% safe at any given point. However, how many crashes are caused by human error or irresponsibility? I'd imagine the crash statistics of self-driving cars are nothing in comparison to what humans do when allowed to operate machinery themselves. In the USA, in 2015, we had well over 6 million car-related accidents that resulted in deaths, injuries, or damage to property. I wouldn't blindly rely upon machines as being flawless, but they're certainly statistically less flawed at driving than we are.

Also, if you think about it, who are you a slave to right now? Greedy, emotional humans? Or cold, calculating machines? lol I'd imagine we're slaves to both already. I depend upon my machines for all sorts of things, and I can't help but check my phone constantly. Likewise, I'm only as free as the humans that control me say I am. Realistically, machines would probably use better logic to control and run things than humans are capable of. However, as we all know, it simply depends upon how they're programmed; garbage in, garbage out.

The more immediate threat depends upon which humans obtain and advance these technologies first or above all else. An even bigger threat is creating enough humanoid robots to replace the working classes of most nations; then the rich will no longer require those without money to work hard for them, and we become the obsolete machines of the rich.
6th Jul 2018, 3:55 PM
Fata1 Err0r
Fata1 Err0r - avatar
+ 2
Fata1 Err0r No offense please, but if you have heard the news of the Uber crash in the US, then you gotta know that it can't be 100% safe for us to rely on machines.
6th Jul 2018, 3:38 PM
mad0001
+ 2
Though I also support AI, it must stay within limits, so that we don't become slaves with machines ordering us around... if you think about it a little.
6th Jul 2018, 3:40 PM
mad0001
+ 2
Fata1 Err0r Let's end this debate by simply moving on to the conclusion that all the work (or at least most of it) is being done by machines... then what do humans do after a specific period of time? :)
6th Jul 2018, 4:00 PM
mad0001
+ 2
Well, if history is any indicator of how humans behave, then I'd say that once the working lower class is made obsolete by machines that can do their jobs, a couple of things will happen. There will be a HUGE spike in homeless and unemployed people; hungry people. From there, people will either shift their studies and learn something that's still relevant (such as what we're all doing here by learning more about technologies and how to create/program them) or simply become victims of the advancements. However, this will cause unrest among the vast number of people in that situation, which will cause them to either fight among each other, create a "criminal" class out of survival needs, or rise up to rebel against what they believe caused the problem, be it the robots or the governments. How that's dealt with, one can only imagine. Throughout history, the upper classes required the lower classes to be their peasants/slaves/workers in order to obtain what they wanted, so if those workers are no longer required because robots replaced them, I'm sure you can imagine what humans would do with that type of power. Genocide has happened over lesser beliefs.

Now let's assume a brighter progression than what I just mentioned. Let's say we progress robots to the point that they're superior to us in all ways and capable of all the things we can do. The first thing we would do with that is create a "super brain" that doesn't have legs/arms, and then we would use it to solve the answers to many things we wonder about. Once it progresses past the point of human intelligence, it begins to improve itself at an exponential rate; we would be like an ant standing in front of a god and probably wouldn't even be able to understand it any longer. If we limited its mobility, that wouldn't have such a drastic end result. We would use them to further explore space and advance our cybernetic capabilities in order to keep up with our creations. I'm sure we'll use them for war as well.

We're just animals bound to our instincts, just like the other animals on Earth.
6th Jul 2018, 4:15 PM
Fata1 Err0r
Fata1 Err0r - avatar