Researchers have developed an algorithm that identifies attacks in real time using a drone-mounted camera, raising ethical concerns about misuse and misidentification, The Register reports. The system works in several steps: first it identifies the people in a shot, then it estimates their body poses, reducing each person to a stick figure. It compares those poses with one another and, drawing on its training, recognizes instances of strangling, punching, kicking, shooting, and stabbing. On test videos, the previous state of the art (created by the same team) correctly identified violent actions 78% of the time, whereas the new system’s accuracy was 89%, according to a paper to be presented this month at the Conference on Computer Vision and Pattern Recognition in Salt Lake City. However, the test videos were staged; behaviors and camera views may be less clear in real footage, and with crowds of more than 10 people the system cannot run in real time, the first author says.
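The three-stage pipeline described above (detect people, estimate their poses as stick figures, then classify pairwise interactions) can be sketched roughly as follows. This is an illustrative toy, not the authors' model: the pose data is hard-coded, and a simple distance rule stands in for the paper's learned classifier. All names (`Pose`, `classify_pair`, `analyze_frame`) are invented for this sketch.

```python
# Toy sketch of a drone-surveillance pipeline: people -> stick-figure
# poses -> pairwise action classification. Detection and pose estimation
# are stubbed with hand-written keypoints; the rule below is a stand-in
# for a trained model.
from dataclasses import dataclass
from itertools import combinations
from math import dist

@dataclass
class Pose:
    """Skeletal keypoints for one detected person, in pixel coordinates."""
    neck: tuple[float, float]
    right_wrist: tuple[float, float]

def classify_pair(a: Pose, b: Pose, reach_px: float = 30.0) -> str:
    """Toy stand-in classifier: flag possible 'strangling' when either
    person's wrist keypoint is within reach of the other's neck."""
    if dist(a.right_wrist, b.neck) < reach_px or dist(b.right_wrist, a.neck) < reach_px:
        return "strangling"
    return "benign"

def analyze_frame(poses: list[Pose]) -> list[str]:
    """Classify every pair of detected people in one frame."""
    return [classify_pair(a, b) for a, b in combinations(poses, 2)]

# Three stick figures: one has a wrist at another's neck; one stands apart.
attacker = Pose(neck=(100, 50), right_wrist=(205, 48))
victim = Pose(neck=(200, 50), right_wrist=(180, 120))
bystander = Pose(neck=(500, 60), right_wrist=(520, 130))

print(analyze_frame([attacker, victim, bystander]))
# -> ['strangling', 'benign', 'benign']
```

Note that the pairwise comparison scales quadratically with the number of people in the frame, which is consistent with the article's point that real-time performance breaks down in crowds of more than 10.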