Lethal Autonomous Weapons Systems - LAWS (Killer Robots)

HughFreakingDillon Winnipeg Posts: 36,524


Killer robots coming soon to a war near you


Killer robots is a dreadful name, don't you think? It reminds you of the killing machines in the Terminator series and the Battle Droids of Star Wars. Lethal Autonomous Weapons Systems is a much classier name, and the acronym is even better: LAWS. So the international conference that opened at the United Nations Geneva office on Monday is about LAWS.

Don't think drones here. Drones loiter almost silently, high in the air above your picnic, until the operator back in Las Vegas decides you are plotting a terrorist attack and orders the drone to kill you and your family. But at least there is an operator, a human being in the decision-making loop.

With LAWS, there isn't. The machine sorts through its algorithms, and decides on its own whether to kill you or not. So you'll probably be glad to know there are no operational machines of that sort -- yet. But military researchers in various countries are working hard on them, and they probably will exist in 10 or 20 years.

Unless we ban them. That's what the conference in Geneva is about. It's a meeting of diplomats, arms-control experts, and ethics and human rights specialists who, if they agree this is a real threat, will put it on the agenda of next November's annual meeting of the countries that have signed the Convention on Certain Conventional Weapons (CCW). So it's early days yet, and there's still a chance to nip this in the bud.

That's an awkward name, but not nearly as clumsy as the full name: the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. But it actually has done some good already, and it may do some more.

Protocol I bans "the use of weapons the primary effect of which is to injure by fragments which are not detectable by X-rays in the human body." Protocol II requires countries that use land mines to make them deactivate automatically after a certain period. Protocol IV, added in 1995, prohibits the use of blinding laser weapons.

The world would be a worse place if these protocols did not exist. They do exist, and by and large they are obeyed. But none of the banned weapons would make a decisive difference in actual battle, whereas they cause or would cause great human misery, so it was relatively easy to ban them.

The problem with killer robots is that they could make a decisive difference in battle. They don't get tired, they don't get paralyzed with fear, and if you lose them, so what? It's just a machine. There's no person in there. But that's precisely the problem: There's no person in there. Do you trust the machine to make decisions about killing people -- who's a soldier and a legitimate target, who's an innocent civilian -- all by itself?

Now, let's be honest about this. Human soldiers on battlefields don't always make wise, ethically correct decisions about whom to kill and whom to leave alive either. An example. There's sniper fire coming from that house over there, and you know there are civilians trapped in there, too. You have to get rid of those snipers or you'll be stuck here all day. You have two options.

You can send a squad of your own soldiers in to clear the house. They'll kill the snipers, and most of the civilians will be spared. But you may lose one or two of your own soldiers doing it that way, and these are people you know, for whose lives you are directly responsible. Or you can just call in artillery or an airstrike and mash the whole house. If you don't think that's a hard choice to make, you don't know much about human beings.

Whereas the killer robot will just go in there and kill the snipers. No hesitation. And if its software is properly designed, it won't kill the civilians.

Still, killer robots are a very bad idea. Wars involve killing people, and whether you're doing it with live soldiers or Lethal Autonomous Weapons Systems, it's never going to be morally tidy. The real worry is how much easier it would be for a technologically advanced country to decide on war if it didn't have to see lots of its own soldiers get killed.

So by all means let's ban purpose-built killer robots if we can: This is an initiative that deserves our support. But bear in mind there will almost certainly be autonomous machines eventually, and some of them will certainly be capable of killing. So it is also time to start working on international rules governing their behaviour. Isaac Asimov's Three Laws of Robotics (written in 1942) would be a good point of departure (see the sketch after the list).

One: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Two: A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

Three: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
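
Purely as an illustration of how that precedence works (this is nothing from Asimov, and nothing resembling any real weapons system), here is a minimal sketch in Python; every class, field, and function name in it is hypothetical:

```python
# Toy sketch only: the Three Laws as a precedence-ordered rule check.
# Every name and flag here is hypothetical; real autonomous systems are nothing
# like this simple. The point is just that Law One overrides Law Two, which
# overrides Law Three.

from dataclasses import dataclass


@dataclass
class ProposedAction:
    description: str
    would_injure_human: bool    # carrying it out harms a person
    inaction_harms_human: bool  # refusing to act lets a person come to harm
    human_ordered: bool         # a human operator ordered it
    endangers_robot: bool       # carrying it out risks the machine itself


def permitted(action: ProposedAction) -> bool:
    """Return True if the action may be carried out under the Three Laws."""
    # Law One: never injure a human being.
    if action.would_injure_human:
        return False
    # Law One, inaction clause: must act if refusing would let a human come to harm.
    if action.inaction_harms_human:
        return True
    # Law Two: obey human orders (any conflict with Law One was caught above).
    if action.human_ordered:
        return True
    # Law Three: protect its own existence, unless Laws One or Two say otherwise.
    return not action.endangers_robot


# Example: an order to fire on a house with civilians inside is refused,
# because Law One outranks the human order.
order = ProposedAction(
    description="engage sniper position with civilians inside",
    would_injure_human=True,
    inaction_harms_human=False,
    human_ordered=True,
    endangers_robot=False,
)
print(permitted(order))  # False
```

The ordering of the checks is the whole point: under this hierarchy, an order from a human never outranks the prohibition on harming a human.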
new album "Cigarettes" out Fall 2024!

www.headstonesband.com




Comments

  • HughFreakingDillon Winnipeg Posts: 36,524
    this is fucking crazy that this is being brought to the table at the UN. that means it's close. DAMN close.
    new album "Cigarettes" out Fall 2024!

    www.headstonesband.com




  • Where we have gotten to blows my mind. The last two decades have been a blur of innovations that when I was a child seemed impossible.

    I can still recall the introduction of widespread internet, emails, and cellular phones. I thought, at the time, "You've got to be kidding me?"

    I 'typed' every one of my papers for my undergraduate degrees.

    Funny thing is I don't feel particularly old.
    "My brain's a good brain!"
  • Where we have gotten to blows my mind. The last two decades have been a blur of innovations that when I was a child seemed impossible.

    I can still recall the introduction of widespread internet, emails, and cellular phones. I thought, at the time, "You've got to be kidding me?"

    I 'typed' every one of my papers for my undergraduate degrees.

    Funny thing is I don't feel particularly old.

    What's crazier is that a lot of the technologies we use today were developed as classified military secrets 20-plus years ago. Imagine what they already have going on these days.
  • HughFreakingDillon Winnipeg Posts: 36,524

    Where we have gotten to blows my mind. The last two decades have been a blur of innovations that when I was a child seemed impossible.

    I can still recall the introduction of widespread internet, emails, and cellular phones. I thought, at the time, "You've got to be kidding me?"

    I 'typed' every one of my papers for my undergraduate degrees.

    Funny thing is I don't feel particularly old.

    I agree. one day I was 20 years old, still calling my friends on the phone to schedule band practice. next thing I know, I have a hotmail account, then I have an MSN messenger account, and I'm talking to people for no reason on the internet during commercials (remember having to watch those?), and then I'm finding PEARL JAM SONGS I'D NEVER HEARD on some primitive file sharing site. I can't remember what it was called. but it was this super long list and when your song was ready to download, it turned from red to green. it was pretty exciting to see "Black Red Yellow" hit that 100%.

    new album "Cigarettes" out Fall 2024!

    www.headstonesband.com




  • brianlux Posts: 41,629
    Ch-ch-ch-changes:


    “The fear of death follows from the fear of life. A man [or woman] who lives fully is prepared to die at any time.”
    Variously credited to Mark Twain or Edward Abbey.




  • benjs Posts: 9,098
    At least the robots have LAWS they follow... Makes them significantly more obedient than their "thinking" human counterparts sometimes!
    '05 - TO, '06 - TO 1, '08 - NYC 1 & 2, '09 - TO, Chi 1 & 2, '10 - Buffalo, NYC 1 & 2, '11 - TO 1 & 2, Hamilton, '13 - Buffalo, Brooklyn 1 & 2, '15 - Global Citizen, '16 - TO 1 & 2, Chi 2

    EV
    Toronto Film Festival 9/11/2007, '08 - Toronto 1 & 2, '09 - Albany 1, '11 - Chicago 1
  • brianlux Posts: 41,629
    benjs said:

    At least the robots have LAWS they follow... Makes them significantly more obedient than their "thinking" human counterparts sometimes!

    Yes, but how sound are those LAWS? Or, more importantly, how sound are the people who set them in motion?


    “The fear of death follows from the fear of life. A man [or woman] who lives fully is prepared to die at any time.”
    Variously credited to Mark Twain or Edward Abbey.



