Dehumanizing War

I have been racking my brain trying to find something different relating to science but still relevant to today's world. Then I ran across this post in the Daily Galaxy:

Increasingly, the military wants to hand over the responsibility of killing to conscienceless machines. Some say it's a great way to protect our troops; others call it a cold-hearted cop-out.

In either case, the US military hopes to dehumanize military operations as quickly as possible. The US National Research Council advises “aggressively exploiting the considerable war fighting benefits offered by autonomous vehicles”. They are cheap to manufacture, require fewer personnel and, according to the Navy, perform better in complex missions. One battlefield soldier could launch a large-scale robot attack in the air and on the ground.

The US military already has unmanned aerial vehicles armed with Hellfire missiles. “At present they require a human to give, by remote, permission to fire,” says Owen Holland, professor of computer science at the University of Essex, “but it will not be long before they can take the human out of the loop.”

The U.S. government has made no secret of wanting to make increasing use of robots and other “unmanned” equipment in warfare. This is one of the mandates of the Pentagon's 20-year strategy: to “dehumanize” war.

Hmmph, like it doesn’t do that already?

3 responses

  1. I wonder how long the Pentagon will keep the robots “nonsentient”?

    If they’re smart, they will.

    Terminator IV anyone?

  2. Remember what happened to humanity when it created the Cylons to do its bidding in Battlestar Galactica?

    Some reports say true artificial intelligence is just a decade off.

    Imagine if the two, robots and AI, were joined?

  3. That is the premise behind the Singularity: strong artificial intelligence being created, whether in computers or robots. Once created, the AI would improve itself exponentially until a “Singularity” (as with a black hole) occurs, beyond which anything could happen, since nobody can predict the outcome.

    And yes, something like the Cylons could come of it. The “terminator” is a possible outcome too. That’s why some scientists aren’t too sure whether we should be striving for AI; we could be causing our own extinction.

    Like we aren’t doing that already, right?

    At least if the AIs succeed us on the evolutionary tree, we’ll be leaving behind intelligent descendants of a sort.

    The way we’re going now, only cockroaches will succeed us.
