
Here's how AI could help catch shoplifters in the act

The tech identifies suspicious activity based on shoppers’ behavior, such as body language, gait and facial expressions.

Shoplifting continues to be a huge problem around the world, costing retailers billions of dollars a year in the United States alone. But a Japanese startup has developed artificial intelligence software that it says can catch shoplifters in the act — and alert staff members so they can swoop in to prevent pilferage.

The system isn’t yet available in the U.S., but the Tokyo-based company behind it, Vaak, says tests in local convenience stores showed the system slashed shoplifting losses by 77 percent.

Dubbed Vaakeye, the system works with a store’s surveillance cameras to catch thievery that busy staffers might miss. Its developers trained the system by showing it more than 100 hours of closed-circuit television footage that depicted honest shoppers as well as shoplifters.
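Vaak hasn't published its model, but the general recipe it describes (label clips of honest shoppers and shoplifters, reduce each clip to behavioral features, train a classifier) can be sketched in a few lines. The sketch below is illustrative only: the feature vectors are synthetic, and the use of scikit-learn's logistic regression is an assumption for the example, not Vaak's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for features extracted from labeled CCTV clips:
# 2,000 clips, 64 behavioral features each; label 1 = clip annotated as shoplifting.
X = rng.normal(size=(2000, 64))
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)  # fabricated labels, for illustration only

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train a simple binary classifier and check it on held-out clips.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", round(clf.score(X_test, y_test), 3))
```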

Vaak CEO Ryo Tanaka said the system identifies suspicious activity on the basis of more than 100 aspects of shoppers’ behavior, including gait, hand movements, facial expressions and even clothing choices. Promotional videos show Vaakeye spotting a range of suspicious activities, from “restless” behavior and “sneaking” to putting items into bags or pockets.

If the system spots behavior it deems suspicious, it alerts store personnel via an app. Then it’s up to staffers to take action — typically by approaching the potential shoplifters and asking if they need help. The system doesn’t actually label people as shoplifters; rather, Tanaka said, it tells staffers to “please check these people — they might steal things.”
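In outline, that loop is simple: score the behaviors the cameras pick up, roll them into a single suspicion value, and ping the staff app when the value crosses a threshold. The sketch below is an assumed simplification; the behavior names, the threshold and the notify_staff function are hypothetical stand-ins, not Vaak's software.

```python
from dataclasses import dataclass
from typing import Dict

SUSPICION_THRESHOLD = 0.8  # assumed cutoff; the real threshold is not public

@dataclass
class Observation:
    shopper_id: str
    behavior_scores: Dict[str, float]  # e.g. {"restless": 0.6, "item_to_pocket": 0.9}

def suspicion(obs: Observation) -> float:
    # Simplest possible aggregation: take the strongest single behavioral signal.
    return max(obs.behavior_scores.values(), default=0.0)

def notify_staff(obs: Observation) -> None:
    # Stand-in for a push notification to the staff app; as Tanaka describes it,
    # the message asks staff to check on the shopper, not to accuse them.
    print(f"Please check shopper {obs.shopper_id} (suspicion={suspicion(obs):.2f})")

obs = Observation("cam3-track17", {"restless": 0.62, "item_to_pocket": 0.91})
if suspicion(obs) > SUSPICION_THRESHOLD:
    notify_staff(obs)
```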

This isn’t the first time artificial intelligence has been used to combat retail shrinkage. Retailers have used AI to detect refund fraud and employee theft. And Japanese communications giant NTT East made headlines last summer with AI Guardsman, a camera that uses technology similar to Vaakeye’s to analyze shoppers’ body language for signs of possible theft. AI Guardsman’s developers said the camera cut shoplifting losses by 40 percent.

Chelsea Binns, an assistant professor at John Jay College of Criminal Justice in New York City, said the Vaakeye system “appears to show great promise for loss prevention.” But, she added, retailers must weigh the costs and benefits of surveillance. “If regular customers are afraid to enter stores because they don’t like the idea of being tracked, this could potentially hurt retail sales,” she said.

Sven Dietrich, another John Jay professor, said deep learning algorithms tend to be only as good as the data used to train them. He offered a hypothetical scenario in which shoplifters in a training video wear blue jackets — something that might lead to the conclusion that anyone in a blue jacket is suspicious.

“You have to be sure you have enough information,” he said. “Otherwise, the algorithm might be extracting a certain bias.”
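Dietrich's blue-jacket scenario is easy to reproduce with synthetic data. In the sketch below (illustrative only, with made-up numbers), every "shoplifter" in the training set wears a blue jacket, and the resulting classifier duly treats the jacket itself as evidence.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000

# Synthetic training set: one genuine behavioral signal plus a confound.
behavior = rng.normal(size=n)
label = (behavior + rng.normal(size=n) > 1.5).astype(int)  # clips annotated as theft
blue_jacket = label.astype(float)  # in this footage, every shoplifter wears blue

X = np.column_stack([behavior, blue_jacket])
clf = LogisticRegression(max_iter=1000).fit(X, label)

# Two shoppers with identical, unremarkable behavior; only the jacket differs.
plain, in_blue = np.array([[0.0, 0.0]]), np.array([[0.0, 1.0]])
print("suspicion, no jacket:  ", round(clf.predict_proba(plain)[0, 1], 3))
print("suspicion, blue jacket:", round(clf.predict_proba(in_blue)[0, 1], 3))
# The jacket alone inflates the score: the model has picked up the sampling
# artifact in its training data, not just the behavior.
```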

And such systems might be biased against more than a shopper’s clothing.

Jerome Williams, a professor and senior administrator at Rutgers University’s Newark campus, has written extensively on race and retail environments. He said that unless training data is carefully controlled, a theft-detection algorithm might wind up unfairly targeting people of color, who are routinely stopped on suspicion of shoplifting more often than white shoppers.

“The people who get caught for shoplifting is not an indication of who’s shoplifting,” Williams said. “It’s a function of who’s being watched and who’s being caught, and that’s based on discriminatory practices.”

Williams said black shoppers who felt they had been unfairly scrutinized in stores previously might be more likely to appear nervous in subsequent shopping experiences — a potentially risky proposition if a system misidentifies anxiety as suspicious behavior. Still, he praised Vaakeye’s focus on body language. “I think it’s a good approach,” he said. “You shouldn’t racially profile. You should behaviorally profile.”

Nell Watson, a Belgium-based engineer who speaks widely about machine learning, agreed that behavior is an essential part of the equation. “It may be argued that you could engineer a system which is perhaps even less biased than a given human being, because human beings have their own impressions about people,” she said. But, she added, “it really depends on how well audited the algorithms are by independent experts in this kind of area.”

Audited or not, Vaakeye is already out in the world. Tanaka said the system had been installed in about 50 stores in the Tokyo area and would soon be available more widely in Japan.
