Paul Hildreth looked at images from security cameras set up at schools in Fulton County, Georgia. He began watching a video of a woman walking inside one of the school buildings. The top of her clothing was bright yellow.
Hildreth used his computer’s artificial intelligence, or AI, system to find other images of the woman. The system put the pictures together in a video that showed where she currently was, where she had been and what she was doing.
There was no security threat. But the short demonstration for The Associated Press showed what is possible with AI-powered cameras.
Hildreth is emergency operations coordinator for the Fulton County School District. If a gunman were to enter one of his schools, he says, the cameras could identify the shooter’s position and movements, helping police to end the threat quickly.
AI is helping to change security cameras from passive to active observers. They can identify people, suspicious behavior and guns and gather large amounts of information. The images show what people are wearing, how they walk and other physical mannerisms.
If the cameras capture an image of someone who is banned from a building, the system can immediately inform school officials if the person returns.
At a time when the threat of a mass shooting is ever-present, school officials are among the biggest supporters of the technology. Police, businesses and big companies are also using what is being called intelligent video. Yet many civil liberties groups express concerns about a threat to privacy.
In early 2018, an expelled student killed 17 people at Marjory Stoneman Douglas High School in Parkland, Florida. A year later, Broward County, which includes Parkland, set up cameras from the Canada-based company Avigilon throughout its school district.
Hildreth’s school district in Georgia will spend $16.5 million to put the cameras in about 100 buildings in coming years.
The Greeley, Colorado, school district has used Avigilon cameras for about five years and the technology is improving, said John Tait, the district’s security manager.
Upcoming improvements include the ability to identify guns and read people’s expressions, something not currently part of Avigilon’s systems.
“It’s almost kind of scary,” Tait said. “It will look at the expressions on people’s faces and their mannerisms and be able to tell if they look violent.”
It is hard to say how many schools have AI-equipped cameras because no one is collecting those numbers. Michael Dorn is head of Safe Havens International, a nonprofit that advises schools on security. He notes that “quite a few” schools use Avigilon and Sweden-based Axis Communications equipment “and the feedback has been very good.”
Schools are the largest market for video-surveillance systems in the United States. The U.S. market was worth an estimated $450 million in 2018, notes IHS Markit, a London-based data and information services company.
AI-powered cameras in stores
AI cameras have also been tested by companies to study customers’ facial expressions to identify whether they are having a good or bad shopping experience. That information comes from the Center for Democracy and Technology, a Washington nonprofit group that fights for privacy protections.
Businesses can see individuals stealing products in real-time and inform security, or warn of a potential theft. One company, Athena-Security, has cameras that recognize when someone has a weapon. And in an effort to help businesses, the company recently expanded the system to help identify big spenders when they visit a store.
Police in New York City, New Orleans and Atlanta all use cameras with AI. In Connecticut, the Hartford police force has deployed about 500 cameras throughout the community. They include some AI-equipped models that can search hours of video to find people wearing a certain kind of clothing. They also can search for places where a suspicious vehicle was seen.
Privacy concerns
The power of the systems has fueled privacy concerns, however.
“People haven’t really caught up to how broad and deep the technology can go,” says Jay Stanley, a policy expert at the American Civil Liberties Union.
Stanley published a research paper in June about how the cameras are being used. “When I explain it,” he said, “people are pretty amazed and spooked.”
When weighed against the possibility of reducing violence, however, privacy may be less of an issue.
Shannon Flounnory, executive director for safety and security for the Fulton County School District, said no privacy concerns have been heard there.
“The events of Parkland kind of changed the game,” he said. “We have not had any arguments or pushback right now.”
Facial recognition is not perfect. A study from Wake Forest University found that black faces appeared angrier than white faces in some facial-recognition software programs.
But the seemingly endless cycle of mass shootings is forcing Americans to see technology — untested though it may be — as a possible solution to an unending problem.
I’m Dorothy Gundy.
And I'm Bryan Lynn.
Ivan Moreno reported this story for The Associated Press. George Grow adapted his report for VOA Learning English. Hai Do was the editor.
We want to hear from you. Write to us in the Comments section, and visit our Facebook page.
Words in This Story
artificial intelligence – n. the development of computer systems with the ability to perform work that normally requires human intelligence
scary – adj. causing fear; frightening
feedback – n. information about reactions to a product or event
customer – n. a person or group that buys products or services from a business
spook – v. to frighten; to unnerve
cycle – n. a series of events, often repeated in the same order