Two American Air Force fighter airplanes recently flew into the sky above California.
The military planes were taking part in training, practicing a wartime maneuver called a dogfight.
One of the planes had a human pilot inside. The other did not. It was operated by artificial intelligence, or AI, software.
A human was in the airplane’s other seat.
The Air Force was demonstrating how far its AI piloting technology has come.
The United States is competing to stay ahead of China on AI and its use in weapon systems. The focus on AI has led to public concern. People worry that future wars will be fought by machines that choose and strike targets without direct human intervention. The U.S. has said it would never send a pilotless fighter plane into war because of concerns that civilians could be harmed mistakenly. But it is unclear if other countries have the same policy.
So the U.S. is working on AI piloting technology, just in case.
Admiral Christopher Grady is vice chairman of the Joint Chiefs of Staff, the top U.S. military leadership group. When speaking about the use of AI in war, he said: “China’s working on it as hard as we are.”
Here is a look at the U.S. history of machine learning and AI in the military.
The military started putting computers in control of some systems in the 1960s and 1970s. The Aegis missile defense system was programmed by humans. They gave the system a set of “if/then” commands because the computer could detect incoming missiles faster than humans. The system could not learn; it could only carry out the rules. So it was not AI as we understand it today, said Christopher Berardi, an Air Force lieutenant colonel who is working with the Massachusetts Institute of Technology (MIT) on AI development.
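The difference between that kind of programming and modern AI can be shown with a small, invented example. Here is a sketch in the Python programming language that follows fixed “if/then” rules a human wrote in advance. The numbers and names are made up for illustration and are not how Aegis really works. A program like this can only follow its rules; it cannot learn new ones.

# Toy example of a rules-based system: every rule was written by a
# human ahead of time, and the program can only follow them.
def classify_track(speed_kmh, distance_km, heading_toward_ship):
    # The thresholds below are invented for this illustration.
    if heading_toward_ship and speed_kmh > 2000 and distance_km < 100:
        return "ENGAGE"          # rule 1: fast, close and inbound
    if heading_toward_ship and distance_km < 300:
        return "TRACK CLOSELY"   # rule 2: inbound but farther away
    return "MONITOR"             # default rule

print(classify_track(speed_kmh=2500, distance_km=80, heading_toward_ship=True))
# prints: ENGAGE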
AI took a step forward in 2012, when faster computer processors made it possible for computers to study large amounts of information and write their own rules. Computer experts call that period “the big bang” for AI. The rules a computer creates on its own from data are considered artificial intelligence.
Air Force Secretary Frank Kendall was the human on board the F-16 fighter jet called Vista. The Vista jet is the most visible AI project, but the U.S. Department of Defense has hundreds of others in place.
One project going on at MIT includes recordings of Air Force pilots speaking with people in air operations centers on the ground. Service members are training the AI systems to detect the most important conversations and separate them from mundane chatter. The plan is to use AI to alert pilots to critical messages.
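A very small example can show the general idea of teaching a program to separate important messages from routine chatter. The Python sketch below assumes the scikit-learn library is available; the phrases and labels are invented for illustration only and have nothing to do with the actual MIT recordings.

# Toy text classifier: learn from labeled examples which messages are
# "critical" and which are routine "chatter". All phrases are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = [
    "low fuel request immediate divert",      # critical
    "warning unknown aircraft closing fast",  # critical
    "weather at the field looks fine today",  # chatter
    "confirm routine radio check",            # chatter
]
labels = ["critical", "critical", "chatter", "chatter"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(messages)

model = MultinomialNB()
model.fit(features, labels)

new_message = ["request immediate divert low fuel"]
print(model.predict(vectorizer.transform(new_message)))
# prints: ['critical']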
The Air Force is also working on an AI-powered navigation system that does not depend on satellites. The concern is that satellites could be targeted in war.
So instead of using satellite navigation, the Air Force is working on a navigation system that uses the magnetic fields of the Earth. The idea is not new, but before AI, experts worried that such a system would not work because of the electromagnetic signal created by the aircraft itself. Now programmers are training an AI system to ignore the signal created by the airplane and instead pay attention to the Earth’s magnetic emissions.
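Stripped down to a toy example, the idea is to learn how much of the magnetometer reading comes from the aircraft itself and subtract it, leaving the Earth’s signal behind. The Python sketch below uses made-up simulated data and a simple least-squares fit; the real Air Force-MIT system is far more advanced.

# Toy illustration: remove an aircraft's own magnetic noise so the
# Earth's signal can be used for navigation. All data is simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Pretend measurements of the aircraft's electrical activity
# (for example, currents feeding lights and avionics).
aircraft_state = rng.normal(size=(n, 3))

# The Earth's magnetic signal the navigator actually wants.
earth_signal = np.sin(np.linspace(0, 10, n))

# What the magnetometer records: the Earth's signal plus noise that
# depends on the aircraft's own electrical activity.
true_noise_weights = np.array([2.0, -1.5, 0.7])
magnetometer = earth_signal + aircraft_state @ true_noise_weights

# "Training": estimate the aircraft-generated noise from the aircraft's
# own measurements, then subtract that estimate from the reading.
weights, *_ = np.linalg.lstsq(aircraft_state, magnetometer, rcond=None)
cleaned = magnetometer - aircraft_state @ weights

print("average error before cleaning:", np.abs(magnetometer - earth_signal).mean())
print("average error after cleaning: ", np.abs(cleaned - earth_signal).mean())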
Col. Garry Floyd is head of the Air Force-MIT Artificial Intelligence Accelerator Program. He said magnetometers are so sensitive they would pick up the signal from the flashing lights on a C-17 airplane. Floyd called the work so far “very, very impressive.”
“We think we may have added an arrow to the quiver in the things we can do, should we end up operating in a GPS-denied environment. Which we will,” Floyd said.
The system has only been tested on the C-17 so far.
There are some limits to what AI can do. So there is always a human safety pilot on board. For example, AI cannot learn in mid-flight. It can only learn from the data it collects each time the plane is in the air. When the plane is back on the ground, the new information is loaded into a simulator and the AI system learns and creates more flight rules.
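That fly-first, learn-later cycle can be outlined in a few lines of Python. Everything below is a made-up placeholder meant only to show the order of steps: the AI records data in the air, and the learning happens afterward on the ground.

# Outline of the cycle described above. The AI does not learn in the
# air; it records data, and learning happens later in a ground simulator.
flight_rules = ["baseline maneuvers"]   # what the AI can do right now

def fly_training_sortie(rules):
    # Pretend flight: follow the current rules and record sensor data.
    return {"rules_used": list(rules), "sensor_log": "data recorded in flight"}

def learn_in_simulator(rules, flight_data):
    # Pretend ground step: replay the data and produce new candidate rules.
    return rules + ["new maneuver learned from " + flight_data["sensor_log"]]

for sortie in range(3):
    data = fly_training_sortie(flight_rules)               # in the air: collect only
    flight_rules = learn_in_simulator(flight_rules, data)  # on the ground: learn

print(flight_rules)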
But unlike in the past, AI now learns fast. Today’s computers have more power than ever before and sometimes the AI system learns new ways to fly the planes. Sometimes it can beat human pilots in tests.
But there are still “guardrails.” AI may learn a new way to fly a plane, but a human observer reviews what it learned and will stop the computer from transferring an unsafe maneuver to the airplane for use in the air. The Air Force said it hopes the system it is working on today can be used to create a fleet of 1,000 unpiloted airplanes.
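The “guardrail” step can be sketched in the same spirit: before newly learned maneuvers are transferred to the airplane, a human reviewer removes anything judged unsafe. Again, this Python sketch is an invented illustration, not the Air Force’s actual process.

# Toy "guardrail": only maneuvers a human observer has approved are
# transferred to the airplane. All names here are invented.
candidate_maneuvers = [
    {"name": "tighter defensive turn", "human_approved": True},
    {"name": "extreme low-altitude pass", "human_approved": False},
]

def transfer_to_aircraft(maneuvers):
    # A human decides; the unsafe maneuver is never loaded.
    return [m["name"] for m in maneuvers if m["human_approved"]]

print(transfer_to_aircraft(candidate_maneuvers))
# prints: ['tighter defensive turn']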
I’m Dan Friedell.
Dan Friedell adapted this story for Learning English based on a report by The Associated Press.
____________________________________________________
Words in This Story
focus –n. attention or concern
detect –v. to discover or notice something, especially something that is not easy to see or hear
big bang –n. a time of fast discovery or change
mundane –adj. usual, routine or unimportant
chatter –n. words spoken quickly
alert –v. to direct someone’s attention to something important
impressive –adj. causing great appreciation or interest
arrow to the quiver –expression. another kind of weapon added to a group of weapons
simulator –n. a machine or computer system that permits a user to experience an activity such as flying without getting into an airplane
transfer –v. to move information from one place to another
We want to hear from you. What would you think if a war was conducted without humans flying airplanes?