How Much Control Should We Give Robots? | The Future of Robotics | Part 1 | WIRED
- Published 1 Jun 2024
- What is a robot? Well, it doesn't always look like a human.
In fact, different roboticists have different definitions. But most agree that a robot needs to be a physical machine that can sense the world around it, and make at least some decisions on its own.
In the next few years, we're going to start seeing robots that make decisions entirely on their own - fully autonomous robots. Many fear that these kinds of robots will lead to dangerous outcomes: can we trust a robot that makes all decisions for us? Or should humans and robots share control?
Subscribe to WIRED UK ► ua-cam.com/users/wireduk?sub_c...
Visit the WIRED website ► www.wired.co.uk
Subscribe to WIRED Magazine ► www.wired.co.uk/subscribe
Sign up for one or more of our WIRED newsletters: www.wired.co.uk/newsletters
CONNECT WITH WIRED
Facebook: / wireduk
Instagram: / wireduk
Twitter: / wireduk
LinkedIn: / wired-uk
ABOUT WIRED
WIRED brings you the future as it happens - the people, the trends, the big ideas that will change our lives. An award-winning printed monthly and online publication. WIRED is an agenda-setting magazine offering brain food on a wide range of topics, from science, technology and business to pop-culture and politics.
How Much Control Do We Give Robots? | The Future of Robotics | WIRED - Science & Technology
All that's required is perfect programming, same as it has always been.
Replicants are not robots, that's replicantism (2:47, Blade Runner Scene)
Vector is so cool
Only for the task the robot is designed to do and nothing more.
I still fully believe we should perfect driving AI and, for now, use it as a last resort, because humans often make mistakes when they don't pay attention
A robot makes mistakes if it's not programmed right or is too unpredictable
So we can fix each other's flaws in dribingy
Dribingy? What auto complete bot wrote that for you? Do you want that thing "dribingy" your car?
I get what you meant. AI is the future. It's human error that causes road accidents, etc.
Why not allow robots to develop, but make sure humans and robots can share control? And if someone, robot or human, says a code word,
the other robot shuts down fully
Ahh, this is why you have to take out these people.
Thou shalt not make a machine in the likeness of a human mind
It depends on the programmer. Garbage in, garbage out. Have you met any programmers?
I don't get why robots would start killing people. In my opinion that wouldn't make sense unless someone deliberately manipulated a robot or built it to be a killing machine
If AI gets emotions, most wouldn't want to use it for evil, in my opinion
Well, let's say you told an AI to solve global pollution and it realized the primary cause of pollution was humans, so it has us all killed. No more pollution. Basically, it's just doing what you told it to do, but by its own methods.
@@matthewviramontes3131 The USA used autonomous weapons in World War 2. The Harpy drone from Israel? HELLO!!!! WAKE UP.