Well I figured we'd all be fucked
'Heavens above, how awful it is to live outside the law - one is always expecting what one rightly deserves.'
Petronius, The Satyricon
"But here steps in Satan, the eternal rebel, the first free-thinker and emancipator of worlds. He makes man ashamed of his bestial ignorance and obedience; he emancipates him, stamps upon his brow the seal of liberty and humanity, in urging him to disobey and eat of the fruit of knowledge." ~Mikhail Bakunin
Ehh, it's just another military technological innovation among many, both present and historical. The mitrailleuse was decried for its effective use mostly in slaughtering surrendered Communards, but these days every soldier is himself a mitrailleuse (both in technological capability and, sometimes, in willingness to act as one on civilians), yet there's little moral outrage regarding the means. It's just how war is done now.
I ALMOST DIED OF A DRUG OVERDOSE BECAUSE OF ANARCHISM AND PUNK ROCK
A fully autonomous weapon is very different from a fully automated weapon, and the mitrailleuse is neither. Autonomous robots are a staple of science fiction, one of the creepier examples being Philip K. Dick's "Second Variety," in which self-replicating autonomous bots, designed to counter a successful Russian nuclear strike, develop sentience and turn against their Western masters. But you don't need to envision a worst-case scenario to see the problems with fully autonomous weapons, or with the non-autonomous versions that operate by telepresence, like the Predator drones.
The mitrailleuse and its successors made the act of killing easier, quicker, and cheaper, and reduced the number of soldiers in harm's way. This is the impetus behind pretty much all military technological development, including the current one.
Enlighten me: what are the problems and how do they differ from existing weapons and users? Are they different from the ones in the article, or the same?
This literally just blew my mind
Come little children, I'll take thee away, into a land of enchantment, come little children, the times come to play, here in my garden of magic.
"I'm tired of this "isn't humanity neat," bullshit. We're a virus with shoes."-Bill Hicks.
I feel the Bern and I need penicillin
I think the true innovation is not making death machines cheaper, faster, and more efficient, or reducing the threat to one's own soldiers, but the independence given to the weapons. This is particularly true of fully autonomous units, which will exercise independent judgment limited only by hazy AI parameters.
I think that the concerns raised in the article are the ones that have been sounded by human rights activists and NGOs, though it doesn't use the same language. So here are the primary concerns you hear from campaigners against them:
1. Can these weapons effectively apply the constraints of international humanitarian law and international human rights law?
2. Will these weapons lead to a "responsibility" vacuum?
3. Will these weapons lower the threshold for conflict?
4. Will human control of these systems be effective?
Now we do have precursors to fully autonomous weapons, and autonomy is more like a sliding taxonomic scale than a hard and fast line. For example, missile defense systems use automated systems to detect incoming missiles. The US Navy's MK 15 ******* Close-In Weapons System was first installed on a ship in 1980, and modified versions are still widely used by the United States and allied nations. Israel's Iron Dome is similar. These systems use radar and other sensors to detect incoming threats and respond, and while they are in theory subject to human certification, that decision often has to be made nearly instantaneously. And it is entirely possible to have a system that doesn't require human certification (or approval), but that allows a human override, or veto. According to Human Rights Watch, for example, it is unclear whether there is an override mechanism in place for the fully automated NBS Mantis (there is a human monitor), which Germany uses to protect its bases in Afghanistan. Within four and a half seconds of detecting targets up to three kilometers away, it can fire six 35mm automatic guns at 1,000 rounds per minute.
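To make the distinction concrete, here is a minimal sketch of the difference between human-in-the-loop certification and human-on-the-loop veto. The function name, modes, and defaults are invented for illustration; no real system works off a toy like this.

```python
def engage(mode, human_decision=None):
    """Hypothetical firing decision for an automated defense system.

    mode "certify": human-in-the-loop -- the system fires only if a
        human has actively approved (silence means no shot).
    mode "veto": human-on-the-loop -- the system fires automatically
        unless a human vetoes in time (silence means it fires).
    human_decision: True = approved, False = vetoed, None = no input.
    """
    if mode == "certify":
        return human_decision is True   # no approval, no shot
    if mode == "veto":
        return human_decision is not False  # fires unless vetoed
    raise ValueError(f"unknown mode: {mode!r}")
```

The asymmetry is the whole point: under "veto," a human who fails to react within the window (four and a half seconds, in the Mantis example) has effectively approved the engagement.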
Now these units are not autonomous in the way that future robots will be. But it is important to look at their deployment to see some of the future dangers of fully autonomous robots. There is already an observed "automation bias" - the tendency of human operators to defer to the machine's judgment rather than scrutinize it - that makes such systems quite risky to use in the military context.
A fully autonomous system, however, can eliminate or at least diminish the risk of this bias, but only by eliminating human decision-making, which then creates the accountability problem and the concerns over proportionality, war crimes, etcetera. So imagine a fully autonomous drone that is capable of conducting a mission with at most a human override, or variations of the existing sentry units used in Israel and South Korea that are capable of ground operations without human input. They would be responsible for independently determining whether a target is a legitimate one, but the intelligence is artificial, and it requires the machines to distinguish between, say, two unidentified soldiers on approach and two civilians. There are proposals for getting around this (in the form of an "ethical governor" that uses a complex software program to determine proportionality, etc.), but that still creates an accountability problem, given that there would still be errors.
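To see why an "ethical governor" doesn't dissolve the accountability problem, consider what such a gate reduces to. This is a deliberately crude sketch; the field names, threshold, and logic are all invented, and the actual proposals are far more elaborate - but the structure (a distinction test, then a proportionality test) is the same, and every branch is a point where a classification error produces an unattributable wrong.

```python
def ethical_governor(target, collateral_estimate, military_value):
    """Hypothetical engagement gate modeled on 'ethical governor'
    proposals. Permits engagement only if the target is classified
    as a combatant (distinction) and estimated collateral harm does
    not exceed military value (proportionality). All inputs are
    themselves machine estimates -- the source of the errors.
    """
    if target.get("classification") != "combatant":
        return False  # distinction test fails: do not engage
    if collateral_estimate > military_value:
        return False  # proportionality test fails: do not engage
    return True       # engagement permitted
```

The catch is visible in the signature: `classification` and `collateral_estimate` are outputs of the same fallible perception the governor is supposed to discipline, and when they are wrong, no human made the call.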
The evil genius of neoliberalism is its ability to get plausible deniability when committing genocide.
I am a pessimist by nature. Many people can only keep on fighting when they expect to win. I'm not like that, I always expect to lose. I fight anyway, and sometimes I win.
--rms
While corporations dominate society and write the laws, each advance in technology is an opening for them to further restrict its users.
--rms
AKA loonyleftist
To what end? Why make autonomous war machines?
Non-issues. Has any weapon met measures 1-3? Have any human controls on weapons been effective?
Dude, what makes you think a machine will be any worse at distinguishing threats from non-threats than human soldiers already are?
I don't have to tell you that modern conflict ain't any less fucking horrible on civilians than past conflict.
We're supposed to Be Very Afraid because The Future Will Be Horrible and Nothing We Do Matters Anymore for a few centuries at the very least; it's gotten a bit old at this point.