Activists on Parliament Hill today are hoping to ensure deadly fighting machines that pick their own targets are never allowed to roam future battlefields.
Deciding who is friendly and who is the enemy is hard for people. For machines, I can't see them ever making that distinction unless the enemy is limited to other machines.
"DrCaleb" said Deciding on who is friendly and who is enemy is hard for people. For machines, I can't see them ever making that distinction, unless the enemy is limited to other machines.
Actually the protocol for this kind of thing would be simple.
1. Withdraw your human troops from the battlefield.
"BartSimpson" said Deciding on who is friendly and who is enemy is hard for people. For machines, I can't see them ever making that distinction, unless the enemy is limited to other machines.
Actually the protocol for this kind of thing would be simple.
1. Withdraw your human troops from the battlefield.
2. Order your drones to kill everyone remaining.
Bartman, how long has it been since two armies met against each other on a battlefield?
In a modern context, the drones would have been ordered to kill a civilian population. Not the intended target at all. Bad PR to boot.
"DrCaleb" said .....how long has it been since two armies met against each other on a battlefield?........
Back when Douglas Haig heroically managed to get one out of every three soldiers that served under him machine-gunned to death or blown into pieces by mortar rounds?
In a modern context, the drones would have been ordered to kill a civilian population. Not the intended target at all. Bad PR to boot.
Well, it's bad optics when a wedding attended by Taliban and Al Qaeda gets blown up but compared to what the above-mentioned Douglas Haig got up to the casualty rate by the new methods is really kind of low. Just because the old ways were traditional it doesn't mean they were good.
"Thanos" said .....how long has it been since two armies met against each other on a battlefield?........
Back when Douglas Haig heroically managed to get one out of every three soldiers that served under him machine-gunned to death or blown into pieces by mortar rounds?
In a modern context, the drones would have been ordered to kill a civilian population. Not the intended target at all. Bad PR to boot.
Well, it's bad optics when a wedding attended by Taliban and Al Qaeda gets blown up but compared to what the above-mentioned Douglas Haig got up to the casualty rate by the new methods is really kind of low. Just because the old ways were traditional it doesn't mean they were good.
And that's what the article is saying. That wedding wasn't targeted by an automated system. There was a person who gave the order, and a person who carried it out. The drone didn't decide to fire the missile by itself.
How would an automated drone know about the guy who is a sheep herder by day protecting his flock with an AK, and the same guy who goes on Taliban raids by night with the same AK? Machines can't make that distinction, and we shouldn't let them.
You can't turn back the clock. In less than 20 years computers will be smarter than people. Then Skynet, fall of humanity, Terminators, time travel, yadda yadda yadda. Unless it turns out that we are all simulations (highly likely, actually), in which case it will be Agents, the Red Pill, The One, et cetera.
"DrCaleb" said And that's what the article is saying. That wedding wasn't targeted by an automated system. There was a person who gave the order, and a person who carried it out. The drone didn't decide to fire the missile by itself.
How would an automated drone know about the guy who is a sheep herder by day protecting his flock with an AK, and the same guy who goes on Taliban raids by night with the same AK? Machines can't make that distinction, and we shouldn't let them.
There'll never be any perfection or justice in automated war any more than there's ever been any perfection or justice in human-waged war. Humans have been notoriously lousy at differentiating the goat-herder with a hundred year-old Lee-Enfield on his back from the Talib with the AK-47 on his so over-emphasizing any mistakes the machines would make really does seem kind of silly. The overall point is that whatever reduces human casualties is something that should be encouraged.
"Thanos" said And that's what the article is saying. That wedding wasn't targeted by an automated system. There was a person who gave the order, and a person who carried it out. The drone didn't decide to fire the missile by itself.
How would an automated drone know about the guy who is a sheep herder by day protecting his flock with an AK, and the same guy who goes on Taliban raids by night with the same AK? Machines can't make that distinction, and we shouldn't let them.
There'll never be any perfection or justice in automated war any more than there's ever been any perfection or justice in human-waged war. Humans have been notoriously lousy at differentiating the goat-herder with a hundred year-old Lee-Enfield on his back from the Talib with the AK-47 on his so over-emphasizing any mistakes the machines would make really does seem kind of silly. The overall point is that whatever reduces human casualties is something that should be encouraged.
But machines making 'kill' decisions doesn't change soldiers' risk on the battlefield, because many of them weren't on the battlefield to begin with. It does, however, increase the probability of friendly fire and civilian casualties. That's the point of the article.
"DrCaleb" said .....It does, however, increase the probability of friendly fire and civilian casualties. That's the point of the article.....
I really don't see the difference between machines killing everyone in a predetermined area and napalming, carpet bombing, nuking, or MOAB'ing an area with the same net result.
In that scenario there wouldn't be any. I'm thinking more of the future, when a machine armed with a rifle is allowed to determine its own targets and is let loose on a village to weed out insurgents, somewhere that carpet bombing isn't an option because of the 'friendlies'. Or perceived friendlies, anyhow.
"DrCaleb" said .....how long has it been since two armies met against each other on a battlefield?.....
Sometime next week when Russia invades Ukraine.
Russia vs. Georgia 2008.
Skynet will take care of everything.
How well did that work out?
There is no hope to stop the rise of Skynet.