This AI Detects Shoplifters Before They Steal, But There's Something Even Creepier About It
VAAK, a Japanese startup, has developed artificial intelligence software that detects suspicious body language in potential shoplifters. The software, called VAAKEYE, interfaces with security cameras to analyze the movements and behaviors of people within the camera's field of view, then conveys its suspicions to staff via an app. The goal, VAAK claims, is not arrest but prevention: once alerted, staff can approach the individual and ask if they need any help, a well-known deterrent for most potential shoplifters. VAAK has already demonstrated its product's efficacy: according to Bloomberg Quint, the company made headlines last year when the system detected a shoplifter in a Yokohama convenience store where VAAKEYE was being tested. The suspect was arrested a few days later.
A NEW SHERIFF IN TOWN
The arrest bolstered VAAK's credibility, as well as its confidence. "I thought then, 'Ah, at last!'" said VAAK founder and CEO Ryo Tanaka, Bloomberg Quint reports. "We took an important step closer to a society where crime can be prevented with AI." In an even bolder statement, Tanaka, sounding like a badge-happy, newly minted sheriff, told TechCrunch Japan, "In the future we will further strengthen crime detection and prediction (and) aim to realize a safer society. I will increase the deterrent effect (shoplifting) by increasing the arrest by video evidence."
SUSPICIOUS BEHAVIOR?
While we could not locate a patent for the technology, VAAK's website (automatically translated by Google) offers some clues as to how it works. VAAKEYE can "analyze more than 100 person feature quantities such as face, clothes, movement direction, attribute and estimate behavior and purpose," the site's tech section claims, while also tracking other human traits and behaviors "such as taking / returning products," "poor physical condition," and "nuisance behaviors." In other words, the system is designed to detect "suspicious behaviors" – including concealment, fidgeting, and restlessness – that law enforcement and loss prevention personnel commonly observe in shoplifters. Before you raise a legal objection, bear in mind that in most American states – we didn't research Japan's laws – concealment alone is enough to establish the element of intent in the crime of shoplifting or theft. According to Palmer Recovery Attorneys, "in the majority of states, a person has committed the act of shoplifting and may be detained as soon as unpurchased merchandise is concealed." The decision to act or make an arrest would, of course, remain at the discretion of the store and law enforcement, with the liability for a false positive resting with the store, just as it would if a loss prevention agent or employee had observed the suspicious behavior. Unlike a human, however, this AI cannot – at least as far as we know – articulate what qualifies as suspicious behavior: it simply attempts to match patterns against one or more existing datasets.
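VAAK has not published how VAAKEYE actually scores behavior, so the following is a purely illustrative sketch of the general pattern-matching idea: weighted behavior signals combined into a suspicion score that trips an alert above a threshold. Every name, weight, and threshold here is our invention, not VAAK's.

```python
# Illustrative only: a toy suspicion-scoring scheme. The behavior
# names, weights, and threshold are invented for this sketch and do
# not come from VAAK or VAAKEYE.
SUSPICION_WEIGHTS = {
    "concealment": 0.6,          # hiding unpurchased merchandise
    "fidgeting": 0.2,
    "restlessness": 0.15,
    "repeated_take_return": 0.25,  # taking and returning products
}

ALERT_THRESHOLD = 0.5

def suspicion_score(observed: dict) -> float:
    """Weighted sum of detected behavior signals, each in [0, 1]."""
    return sum(SUSPICION_WEIGHTS[name] * strength
               for name, strength in observed.items()
               if name in SUSPICION_WEIGHTS)

def should_alert(observed: dict) -> bool:
    """True if the combined score crosses the alert threshold."""
    return suspicion_score(observed) >= ALERT_THRESHOLD

# Concealment alone crosses the threshold, mirroring its legal weight:
print(should_alert({"concealment": 1.0}))  # True
print(should_alert({"fidgeting": 0.5}))    # False
```

Note how this toy version mirrors the legal point above: concealment alone is weighted heavily enough to trigger an alert, while weaker signals must accumulate.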
FACIAL RECOGNITION TECHNOLOGY
What is of greater concern to privacy advocates is VAAK's bigger goal, plainly stated on its homepage, of "solving social problems with the eye of artificial intelligence" – and another facet of the technology, demonstrated in the video, that hasn't been mentioned in other reports we've examined. VAAK appears to deploy facial recognition technology that saves information on consumers to build a profile on them, much the way shopping rewards or discount cards do, but without the ability to opt out. In the videos produced by VAAK, each face observed by VAAKEYE returns gender and age, physical information in the form of height and weight, as well as average spend amount, number of visits, time in store, and frequently purchased ("liked") items. That's just the information documented onscreen. Assuming the video demonstrates the actual technology, there's no reason to believe additional layers of data are not captured and stored, much as Facebook mines and captures information based on data points (likes, friends, check-ins, etc.), ostensibly to create a custom curated feed for users.
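To make concrete how much a single profile of this kind holds, here is a minimal model of the per-shopper record shown onscreen in VAAK's demo video. The fields mirror what the video displays; the class itself, its field names, and the sample values are our own assumptions, since VAAK has published no schema.

```python
from dataclasses import dataclass, field

@dataclass
class ShopperProfile:
    """Hypothetical model of the onscreen data in VAAK's demo video.

    Field names and units are guesses; only the categories themselves
    (gender, age, height, weight, spend, visits, time, liked items)
    appear in the video.
    """
    gender: str
    age: int
    height_cm: float
    weight_kg: float
    average_spend: float        # per-visit average; currency unspecified
    visit_count: int
    time_in_store_min: float
    liked_items: list = field(default_factory=list)  # frequently purchased

# A single sighting is enough to populate most of these fields:
profile = ShopperProfile(
    gender="female", age=34, height_cm=162.0, weight_kg=55.0,
    average_spend=2400.0, visit_count=12, time_in_store_min=18.5,
    liked_items=["coffee", "onigiri"],
)
print(profile.visit_count)  # 12
```

Even this stripped-down record, accumulated across visits and stores with no opt-out, is what the breach scenarios below put at risk.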
POTENTIAL THREATS
When incursions are made into personal privacy – regardless of relative consent – the rationale offered invariably follows one of two lines of reasoning: safety/security or benefits to the consumer. The mechanisms behind selling or otherwise providing the data to third parties are intentionally opaque: it's no accident that those who benefit most from stripping the public of its personal privacy insist that their own business is entirely private.
VAAK's valorous talk of using its technology to analyze "suspicious activity, dangerous behavior, annoying behavior, etc. to protect the safety of stations, airports, shopping streets, houses and school roads" ignores the threats created by giant databases of highly personal information, which have proven – as in the cases of Equifax, Marriott, and Yahoo, among others – to be poorly secured soft targets for dedicated hackers. On a much smaller scale, such databases put personal information directly at the fingertips of low-level personnel, which ought to raise concerns about stalking and personal safety. Unfortunately, you cannot even choose where to shop according to whether a retailer deploys this technology. Bloomberg Quint writes that "because it involves security, retailers have asked AI-software suppliers such as Vaak and London-based Third Eye not to disclose their use of the anti-shoplifting systems."
So what, if anything, is there to do? Outside of living "off the grid," the answer is an unpleasant one to consider: be prepared for more of it and recognize it wherever you encounter it.
You can rest assured that it recognizes you.
https://www.outerplaces.com/science/item/19294-vaak-ai