WaPo following my lead

There are just too many great quotes from this article, so just read the whole thing:

Bots on the ground

The wars in Afghanistan and Iraq have become an unprecedented field study in human relationships with intelligent machines. These conflicts are the first in history to see widespread deployment of thousands of battle bots. Flying bots range in size from Learjets to eagles. Some ground bots are like small tanks. Others are the size of two-pound dumbbells, designed to be thrown through a window to scope out the inside of a room. Bots search caves for bad guys, clear roads of improvised explosive devices, scoot under cars to look for bombs, spy on the enemy and, sometimes, kill humans.

Even more startling than these machines’ capabilities, however, are the effects they have on their friendly keepers who, for example, award their bots “battlefield promotions” and “purple hearts.” “Ours was called Sgt. Talon,” says Sgt. Michael Maxson of the 737th Ordnance Company (EOD). “We always wanted him as our main robot. Every time he was working, nothing bad ever happened. He always got the job done. He took a couple of detonations in front of his face and didn’t stop working. One time, he actually did break down in a mission, and we sent another robot in and it got blown to pieces. It’s like he shut down because he knew something bad would happen.” The troops promoted the robot to staff sergeant — a high honor, since that usually means a squad leader. They also awarded it three “purple hearts.”

Humans have long displayed an uncanny ability to make emotional connections with their manufactured helpmates. Car owners for generations have named their vehicles. In “Cast Away,” Tom Hanks risks his life to save a volleyball named Wilson, who has become his best friend and confidant. Now that our creations display elements of intelligence, however, the bonds humans forge with their machines are even more impressive. Especially when humans credit their bots with saving their lives.

Ted Bogosh recalls one day in Camp Victory, near Baghdad, when he was a Marine master sergeant running the robot repair shop.

That day, an explosive ordnance disposal technician walked through his door. The EODs, as they are known, are the people who — with their robots — are charged with disabling Iraq’s most virulent scourge, the roadside improvised explosive device. In this fellow’s hands was a small box. It contained the remains of his robot. He had named it Scooby-Doo.

“There wasn’t a whole lot left of Scooby,” Bogosh says. The biggest piece was its 3-by-3-by-4-inch head, containing its video camera. On the side had been painted “its battle list, its track record. This had been a really great robot.”

The veteran explosives technician looming over Bogosh was visibly upset. He insisted he did not want a new robot. He wanted Scooby-Doo back.

“Sometimes they get a little emotional over it,” Bogosh says. “Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission.”

The bots even show elements of “personality,” Bogosh says. “Every robot has its own little quirks. You sort of get used to them. Sometimes you get a robot that comes in and it does a little dance, or a karate chop, instead of doing what it’s supposed to do.” The operators “talk about them a lot, about the robot doing its mission and getting everything accomplished.” He remembers the time “one of the robots happened to get its tracks destroyed while doing a mission.” The operators “duct-taped them back on, finished the mission and then brought the robot back” to a hero’s welcome.

Near the Tigris River, operators even have been known to take their bot fishing. They put a fishing rod in its claw and retire back to the shade, leaving the robot in the sun.

Of the fish, Bogosh says, “Not sure if we ever caught one or not.”

The article goes on to ask a good question I didn’t ask last time:

If a robot performs magnificently in battle, should its operator get a medal? Even if that human is hundreds, if not thousands, of miles from the action? The question goes to the heart of a novel issue: Where does the bot end and the human begin?


  1. It’s really just an extension of what the fictional Gen. Patton was talking about at the end of that movie. Button pushing and advanced weapons have been around for a while. I suppose at a certain point it’s about strategy and leadership. You promote a general who might not be on the front lines if his leadership brings victory. I would think the same thing applies here. The key is that the human is the “operator,” so he is actually doing something; the bot is not entirely autonomous. If it were, then it should be getting the promotions and medals, just like Data in TNG.

  2. If the operator is just the guy putting the bot out there, then I am not sure he deserves a medal more than the machine that actually accomplished the mission.

    Furthermore, these bots often are autonomous, so your hypothetical isn’t very helpful for your answer. But let me post a few comments from the D&D thread to make my point more clear.

  3. Aliquid posted:

    This is why you don’t name animals you’re going to eat.

    I think this misses the whole point. If you are raising an animal to eat, then don’t name it. The military aren’t building these robots to get destroyed; they are building them to complete missions. When they are successful, the team honors the bot as a symbolic gesture that appears to raise the morale of the team as a whole, which is one of the main functions of honors and medals. When a bot does get destroyed, it can lower morale not just because of some psychological trick (HAAD) on the part of the soldiers, but for the very real fact that the absence of the bot impairs the mission.

    So the bonding with machines described in the article seems perfectly appropriate to me.

  4. Powercrazy posted:
    This is ridiculous. Emotion at play here, nothing more.

    I thought soldiers in general are supposed to be more disciplined than that. Until we have real thinking AI machines (and maybe not even then), we shouldn’t give it a second thought. Sure there will be emotional bonding, but I think that is a sign of psychological insecurity more than anything.

    Machines are tools, made by man. You can enjoy them, you can give them cute names, but at the end of the day, if it’s You or It, I’d assume everyone would choose themselves.

    You are severely underestimating human psychology.

    The human mind comes equipped with what has been called a ‘Hyperactive Agent Detection’ system. But such detection is not merely ‘emotional’ or a sign of insecurity. In fact, it is probably significantly responsible for our intelligent behavior in the first place.

    Let me explain with a long-winded post.

    Human intelligence is primarily found in our ability to be team players in a complicated environment of tools, resources, and other agents that can work together to solve some shared task. Part of this intelligence rests on our ability to identify other key players in the team, and to offload and distribute responsibilities among the various agents and tools to best accomplish the task. This is why our intelligence is primarily social intelligence: we know how to work together.

    When you start putting AI systems in key roles within that team, the basic structure of our social intelligence doesn’t change. More importantly, it shouldn’t. These systems are properly identified as autonomous members of the team, and are treated accordingly. That doesn’t mean that the AI is a person, or deserves the same treatment we would give to a human. The fact that the soldiers treat these bots like pets seems to show that we aren’t calling the nature of humanity into question. So the ‘us or them’ mentality that you seem to be supporting isn’t really the issue here. (Note: I still think there is a problem with the ‘us vs them’ mentality, but that’s neither here nor there.)

    Rather, the issue is: how do we treat the machines on the battlefield? Once you get rid of the idea that the machines are human, then an answer to this question stops being so controversial: you treat the machines like valuable members of the team, with their own unique abilities and responsibilities, and who are working alongside you to accomplish the same tasks. When the machine is successful, then the team is successful also, and praise and accolades go around. When the machine is unsuccessful, then the team as a whole is unsuccessful.

    The point is that these responses aren’t just anthropocentric delusions brought about by emotional insecurity. They are responses entirely appropriate to the situation, and which reinforce the mutually supportive relationship between the humans and the machines.

    The HAAD is not a sign that humans are emotionally weak or psychologically confused. The HAAD might get a few false positives in certain situations, but for the most part it is a very successful device that provides largely accurate and extremely important information about the other players in the environment. And it is a matter of brute fact that machines have become additional players in the environment: not just tools like the soldier’s gun, but semi-autonomous agents with their own objectives and abilities in their own right.

  5. Your first comment claims they are “autonomous,” but in your most recent comment you call them “semi-autonomous.” What’s up? Until that part is worked out, I’ll stick with my comment, which, hypothetical though it may be, addresses either circumstance.

  6. what is the matter? we all feel for our extensions and attribute qualities to them. a good sportsman, for instance, recognises his equipment as individual and attributes luck and success to some. the same happens to machine operators; even people who operate computers have that kind of family feeling for some machines. it is natural that man wishes to reinforce his successes and the possible agents that made his success possible, whether they be man or machine. in high-risk environments like war, the bonding just becomes closer. ask any army general or police officer about a favorite gun and you will know.

    the robot is closer to us because it acts almost like us; it is programmed that way. and in war, we will always remember who helped us and who did not. that emotional bond is common and is bound to stay. so what is new?
