The military wants humans and robots to work together


It’s hard to yell at a robot and make it obey the same way you might yell at Alexa, but the military is working on it.

Currently, humans pilot military robots in action. This is true for unmanned drones and for ground vehicles alike, which are piloted from portable tablets. Autonomous features, from navigation to human-defined GPS waypoints, help streamline this process, although issuing these commands still requires a soldier to enter them into a tablet or computer in a form the robot can understand.

To more easily integrate robots into the routines of war, the military is testing a range of robot-human communication tools. At the end of July, on a campus north of Baltimore, the Robotics Research Center (RRC) at West Point, in collaboration with another laboratory and an Army command, tested methods that could make it easier for soldiers to communicate with robots, via tablets and beyond.

At present, remotely controlling a robot, even by tablet, is tedious. The soldier must actively guide it in real time, steering around hazards and paying close attention. There are two ways to reduce this burden. First, the military could make robots follow tablet-issued instructions more autonomously, demanding less of a soldier's time in combat. Second, through research improving robots' ability to understand and act on human language, soldiers could do away with tablets altogether, reducing the burden of commanding robots to something like that of leading human troops.
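To make that first contrast concrete, here is a minimal, hypothetical Python sketch of waypoint-following autonomy: the operator enters a handful of waypoints once, and the robot handles the moment-to-moment driving that teleoperation would otherwise demand. The Robot class, coordinate format, and tolerances are illustrative assumptions, not the Army's actual control software.

```python
# Hypothetical sketch: autonomous waypoint following versus continuous
# teleoperation. The Robot class and waypoint format are illustrative
# assumptions, not the Army's actual software.
import math

class Robot:
    def __init__(self, x=0.0, y=0.0, speed=1.0):
        self.x, self.y, self.speed = x, y, speed

    def step_toward(self, gx, gy, dt=0.1):
        """Move one time step toward a goal point; return True when close."""
        dx, dy = gx - self.x, gy - self.y
        dist = math.hypot(dx, dy)
        step = min(self.speed * dt, dist)
        if dist > 0:
            self.x += step * dx / dist
            self.y += step * dy / dist
        return dist - step < 0.1  # within tolerance of the goal

def follow_waypoints(robot, waypoints):
    """Soldier enters waypoints once; the robot handles the driving."""
    for gx, gy in waypoints:
        while not robot.step_toward(gx, gy):
            pass  # a real system would also sense and avoid hazards here
        print(f"reached waypoint ({gx:.0f}, {gy:.0f})")

# Operator workload drops from continuous joystick input to one command:
follow_waypoints(Robot(), [(10, 0), (10, 10), (0, 10)])
```

The point of the sketch is that the soldier's workload shrinks from continuous steering to a single command, which is the first of the two burden-reducing paths described above.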

It is not an easy task, in part because combat makes communication difficult. Soldiers speak softly to one another when they need to conceal their position, and shout over the din of battle when they absolutely must be heard. Orders shouted by officers are, hopefully, understood by those within earshot, who follow them as best they can.

[Related: These augmented-reality goggles let soldiers see through vehicle walls]

“The ultimate goal of RRC research is for a squad or platoon to command teams of aerial, wheeled, and legged robots in the same way they would their fellow soldiers – no Linux command line required,” Daniel Gonzalez, a postdoctoral fellow at the RRC in the Department of Electrical Engineering and Computer Science, said in a press release.

Tablet control is a bridge to that future. In the exercise, cadets trained with a quadcopter and a wheeled robot, which connected to phones, tablets, and a mobile server.

The robots moved around autonomously and tagged objects in place using programming from the Army Research Laboratory. They then shared the information they gathered with the soldiers' tablets. This work builds on previous DARPA research into urban swarming and mapping. In 2019, the OFFSET (OFFensive Swarm-Enabled Tactics) program tested swarms at a mock training facility in Fort Benning, Ga. The swarms would fly out to map an area, tracing the perimeter of a given building.

For this exercise, humans and robots were tasked with locating an injured person in an urban setting. It is a task with applications far beyond the military, and a vital one on the kinds of battlefields the military anticipates over the next several decades.

“This is a critical capability as we seek to better understand how soldiers will interact with these systems and how they will aid missions,” ARL researcher Stephen Nogar said in a statement. “While the mission was successful, there is work to be done to improve the reliability and coordination of behaviors.”

Some of this work will likely be done by the Robotics Research Center, which will continue to refine the autonomy code. Other work could be done in collaboration with the Army Combat Capabilities Development Command, or DEVCOM.

[Related: How do you make AI trustworthy? Here’s the Pentagon’s plan.]

In fact, DEVCOM researcher Claire Bonial has been working on natural language understanding, or how machines make sense of the way humans speak, for a decade. In a research paper released this year, Bonial and her co-authors worked on a way for robots to first parse human language, and then to interpret it in the context of the surrounding sentences and conversation.

This stage of processing is vital if robots are to understand commands issued as words rather than as direct mechanical inputs into a computer. When a person says “wait, then attack,” the phrasing emphasizes watching the environment before proceeding; a robot given a wait command, but lacking the ability to reason about when that wait should turn into an attack, becomes more of a handicap than an asset in action.
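As a rough illustration of that reasoning gap, here is a minimal, hypothetical Python sketch of a condition-triggered transition from waiting to attacking. The states, observations, and trigger predicate are invented for illustration and are not drawn from Bonial's work or any Army system.

```python
# Hypothetical sketch: grounding "wait, then attack" in the environment.
# The states, observations, and trigger predicate are illustrative
# assumptions, not taken from any Army or ARL system.
from enum import Enum, auto

class State(Enum):
    WAITING = auto()
    ATTACKING = auto()

def run_command(observations, trigger):
    """Interpret 'wait, then attack': hold in WAITING until the
    environment satisfies the trigger, then switch to ATTACKING."""
    state = State.WAITING
    for obs in observations:
        if trigger(obs):
            state = State.ATTACKING  # the robot reasons the wait is over
            print(f"trigger observed ({obs}); attacking")
            break
        print(f"observed {obs}; still waiting")
    return state

# A robot that cannot evaluate the trigger waits forever; one that can
# reason about the condition makes the transition on its own.
final = run_command(
    ["quiet", "quiet", "enemy_in_open"],
    trigger=lambda obs: obs == "enemy_in_open",
)
assert final is State.ATTACKING
```

The point of the sketch is that the transition lives in the robot's own perception loop, not in a second command from the soldier.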

“We are optimistic that the deeper semantic representation will provide the structure needed for better grounding of language in the conversational and physical environment, so that robots can communicate and act more like teammates to soldiers, as opposed to tools,” Bonial said in a press release.

This research, from training cadets in search and rescue to Bonial's efforts at language comprehension, will pay off in the future, with robots listening to and acting on human commands expressed in language. In the meantime, the military will continue to adopt robots, fold their controls into tablet functions, and adapt to tablet-driven machines on the battlefield.

The ultimate vision, that of human-robot cooperation, will have to wait until the machines can listen to briefings, or at least to orders. And when an order arrives, however bluntly it may be delivered under fire, robots and humans together can move seamlessly into action, working toward the same goal and communicating through words instead of simple tablet controls.
