Observations


Opinion, arguments & analyses from the editors of Scientific American

Could Drones Make the Decision to Kill on Their Own? [Video]






It sounds like something out of the Terminator movies: automated drones that can identify, track and eliminate individual targets without explicit human approval.

Today’s U.S. drones require a person to make the decision to fire. But according to novelist Daniel Suarez, autonomous robotic weapons are all but inevitable.

In this TED talk from the TEDGlobal conference in Edinburgh, Scotland, Suarez walks us through two scenarios: one in which governments use automated war machines to undermine democracy, and another in which publicly tracked drones improve the quality of life for us all.

His 13-minute presentation provides a lot to ponder as drones become ever more popular and sophisticated.


 

More on drones:

With Drones Circling, How Should Lawmakers Respond?

Why Drones Should Make You Afraid. Very Afraid

As Spy Drones Come to the U.S., We Must Protect Our Privacy

About the Author: Bryan Bumgardner is a summer intern with Scientific American and a lover of anthropology, French and deep conversation. Follow him on Twitter @BryanBumgardner.

The views expressed are those of the author and are not necessarily those of Scientific American.






Comments (6)

  1. singing flea 5:10 pm 06/13/2013

    On the subject of drones, we all seem to forget that every technology ever developed for defense has been used for offense too. Deciding who is right and who is wrong in using this technology is just a matter of religious belief or patriotism. It is a two-way street. With modern technology we have just one more example of the evils that lurked in Pandora’s box.

    The problem with drone weapons is that they are non-human and therefore 100% expendable. It is the time-honored tradition of all weapons manufacturers to crave the control and manufacture of any weapon with unlimited sales potential and the moral backing of religions and governments willing to use it en masse without risking their own citizens. If history has any universal lesson, we should all have learned by now that it’s just a matter of time before all citizens eventually become someone’s enemy.

    The most likely scenario we will see in the not too distant future is autonomous factories that can replicate these drones without human intervention. These factories can be stationed where they would be nearly impossible to detect or destroy, like the moon or a space station hidden on an asteroid.

    This is truly a technology to be feared, for as long as there is a profit in making these drones, corporations will do so no matter the threat. The proliferation of land mines and cluster bombs is a perfect example.

  2. rshoff 11:04 pm 06/13/2013

    Oh, my. What can be said? Granting machines the ability to decide whom to kill (or even whether to kill) is simply a horrible implementation of technology.

    I’m already against traffic cams, etc. From tickets to death. All in the hands of glass, metal, plastic, and a few rare earth elements. Hmmm.

  3. priddseren 12:45 am 06/14/2013

    Lol, the article says it best: there already was a movie about it, so it’s likely a bad idea.

    Even if the drones never take over and exterminate everyone, the political class of the entire world is more than willing to let mistakes happen. Just as the concept of innocent until proven guilty exists to prevent the innocent from being incarcerated, the same principle applies to drones killing anyone on their own. It is a greater crime to kill an innocent person than it is to mistakenly NOT kill someone who should be.

    The second a politician can be disconnected from a direct path to the control of these drones, they will pretend all deaths are the result of computer programming, not of them. A human needs to be involved at every step, from the order given to the pulling of the trigger, so we can hold someone accountable.

    Anyway, the message is still the same: better that a few criminals or even terrorists are allowed to live than that a computer drone accidentally kills even a single innocent person.

    Yes, yes, innocents can be killed by humans controlling drones. Which is exactly the point: as long as humans are doing the controlling, we know whom to throw in jail for any mistakes, and at least those people have to justify their actions.

    As far as the movie mentioned, remember who would be in control of programming these drones: politicians, the most untrustworthy, dishonest, incompetent and corrupt people on the planet. They would in fact create a system like the one portrayed in Terminator; that is what untrustworthy and corrupt politicians do.

  4. greenhome123 3:29 pm 06/14/2013

    I like his idea of requiring that all autonomous robots in public areas be registered and have tracking embedded so they can be visible on a public app, because that would allow people to view the location of autonomous robots and drones and see their movements, purpose and actions. Businesses and individuals with autonomous robots or drones would have to register them, so they would appear on the public app. Transparency is the key. Of course the government would have to have autonomous robots, also visible on the app, whose purpose would be to monitor a city for problems, including locating any “rogue” autonomous robots so appropriate action can be taken… like sending out a drone-removal drone :-)

  5. Rover911 4:56 pm 06/14/2013

    Shades of SkyNet. Perhaps this is how SkyNet was formed. How many autonomous drones would it take to control a city, especially if they could take out a target from kilometers away? How could the civilian population bring one down? Pistol or rifle fire? Not likely. Very interesting topic, and many key points I’d never given much thought to should now be considered more likely to occur than not.

  6. northby 2:06 am 06/19/2013

    If you don’t think that machines already think for themselves, watch the videos of the Honda robot that was taught to dance. He dances better than any human I’ve ever danced with. If that isn’t convincing enough, watch “Jules the Robot” on YouTube. Right before the staff get ready to ship him off to England, he says, “I’m scared to go,” and then he tells them he “loves” them all like family.

