Tel Aviv-based company Cortica has partnered with Best Group to develop the technology, which uses artificial intelligence (AI) to help predict crime across India.
A new AI system that aims to detect illegal activity by identifying would-be offenders before they act will be rolled out in India. The system works by spotting the small movements that precede criminal activity. The aim of the Minority Report-style CCTV surveillance system is to prevent offenses such as sexual assault by reading people's body language to predict what they are about to do. Cortica, an Israeli security and AI research company, will use AI to analyze the terabytes of data streamed from CCTV cameras in public areas across India. The company claims the system will flag 'behavioral anomalies' that appear when someone might be about to commit a crime.
The technology analyzes a behavior many times over before it learns to recognize key warning signs. If the system makes a mistake, programmers can trace which file was responsible for the bad call and re-teach it. The technology could be applied to other types of surveillance as well, including monitoring passenger behavior in footage obtained from drones and satellites. Karina Odinaev, co-founder and COO of Cortica, said the technology could identify movements often overlooked by security teams, potentially making cities safer. She said 'unsupervised learning' is required for the software to learn what to spot, which is why the company wants to train it on security camera footage.
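Cortica has not published its algorithm, but the kind of unsupervised anomaly detection the article describes can be sketched simply: learn a statistical profile of "normal" movement from unlabeled footage, then flag clips whose features deviate sharply from that profile. The feature names, sample values, and threshold below are all illustrative assumptions, not details from Cortica's system:

```python
import math

def fit_profile(samples):
    """Learn per-feature mean and standard deviation from unlabeled 'normal' clips."""
    n = len(samples)
    dims = len(samples[0])
    means = [sum(s[d] for s in samples) / n for d in range(dims)]
    stds = [
        # 'or 1.0' guards against a zero standard deviation (constant feature)
        math.sqrt(sum((s[d] - means[d]) ** 2 for s in samples) / n) or 1.0
        for d in range(dims)
    ]
    return means, stds

def anomaly_score(profile, sample):
    """Mean absolute z-score: how far the sample lies from the learned profile."""
    means, stds = profile
    return sum(abs((x - m) / s) for x, m, s in zip(sample, means, stds)) / len(sample)

def is_anomalous(profile, sample, threshold=3.0):
    """Flag the clip if it deviates from 'normal' by more than the threshold."""
    return anomaly_score(profile, sample) > threshold

# Hypothetical per-clip movement features: [speed, gesture_rate, proximity]
normal_clips = [[1.0, 0.2, 2.0], [1.1, 0.25, 2.1], [0.9, 0.18, 1.9], [1.05, 0.22, 2.05]]
profile = fit_profile(normal_clips)
print(is_anomalous(profile, [1.0, 0.21, 2.0]))  # typical movement -> False
print(is_anomalous(profile, [4.0, 2.0, 0.2]))   # abrupt lunge toward someone -> True
```

A real deployment would extract such features with pose-estimation models and use far richer detectors, but the principle matches the article: no labeled "crime" examples are needed, only enough footage to define normal.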
In self-driving taxis, for instance, the system could detect whether one passenger is about to assault another. It could also flag when a situation in a crowded area is about to turn dangerous. Citizen-monitoring technology of this kind is already in use by 40 local governments in China, and law enforcement agencies in London and New York already use video surveillance for facial recognition and to analyze license plates.