Brawl-Detecting Algorithms, the Government’s A.I. Confusion, and Other A.I. News of the Week
The most important A.I. developments of the week
News about the U.S. federal government and artificial intelligence is becoming increasingly common.
This week, two Democratic senators released a bill that would put a moratorium on federal agencies’ use of facial recognition until a commission creates guidelines for using it safely and fairly. That would severely hamper organizations like the FBI, and potentially the U.S. military.
President Trump’s federal budget proposal called for doubling investment in A.I. research through organizations like the National Science Foundation, the Department of Energy, and the Department of Defense.
All of which adds up to a sense that the U.S. federal government still doesn’t have a cohesive plan for how to address A.I., except for a vague notion that it’s important in some form and should be funded.
Meanwhile, surveillance tech isn’t going anywhere. Two of the papers in this week’s roundup focus on how algorithms can help automate surveillance analysis from CCTV cameras. Below are some other interesting A.I. research papers from this week:
Pathologists working with the help of an algorithm graded prostate biopsies more reliably than when grading alone, according to a study with 14 doctors and 160 biopsies.
The theory behind gait detection is that the way a person walks is as unique as their fingerprint or face. That means an algorithm can pick a person out of a crowd by their strut, even at a distance.
The problem is, gait detection works best with a clear view of the person walking, and that’s rare in real-life surveillance situations. This algorithm finds frames where a person is obscured and generates the missing information on its own to improve recognition.
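The general shape of the idea can be sketched in a few lines. The paper’s actual model is a learned generator; the version below is a deliberately crude stand-in (linear interpolation over occluded frames, hypothetical feature vectors and names) just to show how filling in obscured frames can rescue a gait match.

```python
import numpy as np

def fill_occluded(seq, mask):
    """Fill occluded frames (mask == False) by linearly interpolating
    between the nearest visible frames, per feature dimension.
    A crude stand-in for the paper's learned frame generator."""
    seq = seq.copy()
    visible = np.where(mask)[0]
    for dim in range(seq.shape[1]):
        seq[:, dim] = np.interp(np.arange(len(seq)), visible, seq[visible, dim])
    return seq

def match(probe, gallery):
    """Return the gallery identity whose time-averaged gait features
    are closest (Euclidean distance) to the probe's."""
    p = probe.mean(axis=0)
    dists = {name: np.linalg.norm(p - g.mean(axis=0)) for name, g in gallery.items()}
    return min(dists, key=dists.get)

# Toy example: two made-up identities with distinct gait signatures,
# each a sequence of 30 frames of 8 gait features.
rng = np.random.default_rng(0)
alice = rng.normal(0.0, 0.1, (30, 8))
bob = rng.normal(1.0, 0.1, (30, 8))

probe = alice.copy()
mask = np.ones(30, dtype=bool)
mask[10:20] = False    # ten frames occluded
probe[~mask] = 0.0     # occlusion destroys those frames' features

filled = fill_occluded(probe, mask)
print(match(filled, {"alice": alice, "bob": bob}))  # → alice
```

Real gait-recognition pipelines extract silhouette or skeleton features per frame; the point here is only the occlusion-filling step, not the features themselves.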
Using data from hockey fights, movies, and surveillance cameras, Turkish researchers built an algorithm to detect when fists fly. The system works pretty well on hockey and movie fights, but real-world surveillance footage turns out to be trickier: the researchers attained only about 70% accuracy there. WorldStar will have to wait to automate its staff.
American Express has a lot of financial data from its customers, but it obviously can’t share that data for privacy reasons. Now, AmEx researchers have a way to generate synthetic financial datasets that can be shared and studied: statistical reconstructions of the data that can’t be traced back to individual customers.
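The core idea behind synthetic data can be illustrated simply. The sketch below is not AmEx’s method (which isn’t described here); it just fits summary statistics to a made-up “real” dataset and samples fresh records from them, so correlations survive while no synthetic row is a copy of a real one.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "real" data: 1,000 customers, 3 correlated features
# (think spend, balance, payment rate). Stands in for private data.
cov_true = np.array([[1.0, 0.8, 0.3],
                     [0.8, 1.0, 0.2],
                     [0.3, 0.2, 1.0]])
real = rng.multivariate_normal([0.0, 0.0, 0.0], cov_true, size=1000)

# Fit mean and covariance, then sample synthetic records from them.
mu = real.mean(axis=0)
cov = np.cov(real, rowvar=False)
synthetic = rng.multivariate_normal(mu, cov, size=1000)

# Correlations carry over from real to synthetic data.
print(np.corrcoef(real[:, 0], real[:, 1])[0, 1])       # ≈ 0.8
print(np.corrcoef(synthetic[:, 0], synthetic[:, 1])[0, 1])  # ≈ 0.8
```

Production-grade synthetic data generators (GAN-based or differentially private) are far more careful about leakage than this two-moment sketch, but the goal is the same: preserve the statistics, drop the individuals.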
The folks at Google Brain are tackling the issue of classes and subclasses in image recognition.
If an algorithm can detect dogs, that’s a class. If it can detect breeds, that’s a subclass. This research uses a secondary algorithm for each class, called a “student,” that specializes in subclass probabilities and improves overall accuracy.
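The class/subclass factoring can be shown with a toy calculation. The Google Brain architecture isn’t spelled out here, so the probabilities and labels below are made up; the sketch only demonstrates the chain rule the setup relies on: P(subclass) = P(class) × P(subclass | class).

```python
# Coarse classifier's output over top-level classes (made-up numbers).
class_probs = {"dog": 0.7, "cat": 0.3}

# One "student" per class, specializing in that class's subclasses.
# Each student outputs P(subclass | class).
student_probs = {
    "dog": {"beagle": 0.6, "husky": 0.4},
    "cat": {"tabby": 0.5, "siamese": 0.5},
}

# Combine: P(subclass) = P(class) * P(subclass | class).
subclass_probs = {
    sub: class_probs[cls] * p
    for cls, subs in student_probs.items()
    for sub, p in subs.items()
}

best = max(subclass_probs, key=subclass_probs.get)
print(best, subclass_probs[best])  # → beagle 0.42
```

Because each student only ever sees one class’s subclasses, it can specialize, while the coarse classifier keeps the overall distribution calibrated.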