Recent school shootings have spurred students across the country to mobilize. However, instead of hitting the streets in protest, one group of students has turned to object recognition technology to detect and warn of possible threats to schools.
Students at the University of Texas at Dallas have developed a system called iNotify that’s designed to stop mass shootings before they start, according to a report from NBCDFW.com. iNotify serves as an early-warning system, instantly analyzing feeds from video surveillance cameras to identify weapons and other signs of threatening activity on school grounds.
“It, in real time, recognizes these weapons and then can actually determine the meaning of what’s going on rather than just saying, you know, this is what I think it is and then send out, based on that, still in real time, a notification to law enforcement, emergency responders and anyone who may be in harm’s way as a result of the weapon being detected,” UT Dallas computer science major Ashlesha Nesarikar told NBCDFW.com.
The use of object recognition technology helps iNotify distinguish real threats from false alarms.
“We can add this extra level of context to, in real-time, recognize whether someone is wielding a weapon and posing a threat to people in the community,” Nesarikar said.
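iNotify’s internals haven’t been published, but the idea of layering context rules on top of a detector’s raw output can be sketched in a few lines. The function below is purely illustrative — the labels, confidence threshold, and `assess_frame` helper are assumptions, not iNotify’s actual code — and stands in for logic that would consume the output of a trained object-recognition model.

```python
# Hypothetical sketch (not iNotify's actual code): combine raw
# object-recognition output with a simple context rule before
# alerting, to reduce false alarms from low-confidence detections.

WEAPON_LABELS = {"handgun", "rifle", "knife"}  # assumed label set
THRESHOLD = 0.85  # assumed confidence cutoff

def assess_frame(detections):
    """detections: list of (label, confidence) pairs produced by a
    hypothetical object-recognition model for one video frame.
    Returns "alert" only for a confident weapon detection."""
    for label, confidence in detections:
        if label in WEAPON_LABELS and confidence >= THRESHOLD:
            return "alert"
    return "ok"

print(assess_frame([("backpack", 0.93), ("handgun", 0.91)]))  # alert
print(assess_frame([("umbrella", 0.88)]))                     # ok
```

In a real deployment, the "alert" branch is where notifications to law enforcement and emergency responders would be dispatched, as Nesarikar describes.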
Nesarikar was inspired to create iNotify by an incident at the University of Texas at Austin in 2017, when an assailant stabbed four people, killing one.
An iNotify app that will be able to receive alerts from the object recognition system is currently under development.
In the wake of the Stoneman Douglas High School mass shooting in February, experts and advocates have been calling for the deployment of AI systems to enhance security.
In addition to object recognition technology, schools could employ security cameras with face-recognition capabilities to capture photos of everyone in a school and compare them with a database of staff and students. This would allow the school to control access and prevent strangers from entering school facilities.
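The comparison step described above — matching a captured face against a database of enrolled staff and students — typically works by comparing numeric face embeddings. The sketch below is a minimal, hypothetical illustration: the embeddings, database names, and similarity threshold are all invented for the example, and real systems would use a learned face-embedding model rather than hand-written vectors.

```python
# Hypothetical sketch: check whether a face captured at a school
# entrance matches anyone in a database of enrolled staff/students,
# using cosine similarity between face embeddings.

import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_enrolled(face_embedding, database, threshold=0.9):
    """Return True if the captured face matches any enrolled person.
    threshold is an assumed similarity cutoff for a "match"."""
    return any(
        cosine_similarity(face_embedding, enrolled) >= threshold
        for enrolled in database.values()
    )

# Toy 3-dimensional embeddings; real models produce much longer vectors.
enrolled_db = {
    "staff_001": [0.1, 0.9, 0.2],
    "student_042": [0.8, 0.1, 0.3],
}
visitor = [0.11, 0.89, 0.21]  # very close to staff_001's embedding
print(is_enrolled(visitor, enrolled_db))          # True
print(is_enrolled([0.0, 0.0, 1.0], enrolled_db))  # False
```

A school access-control system would grant entry on a match and flag unrecognized faces for staff review.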
Other types of artificial intelligence technologies, such as emotion- and action-detection algorithms, could provide a more sophisticated level of security. These algorithms could note unusual activity among those who are allowed access to the school, detecting and warning if anyone is engaging in dangerous acts.
“(AI) can monitor thousands and thousands of people at any one time and look at their behavior to see if they’re changing, and are they going to be a threat to other people in the building or the facility,” said Ron Jones, a cybersecurity professor at Harrisburg University, according to an article from FOX43.