Amazon upgrades AI model for 'Just Walk Out'
"A single, generalized AI model can produce results equally as good as a model that would be overtrained or overfit on a subset of the data," Jon Jenkins, VP of Just Walk Out technology at Amazon, said during an invitation-only press tour attended by Chain Store Age at Amazon offices in Seattle. "We have been working on the underlying transformer technology for years. The model allows us to generate receipts faster and more accurately and efficiently."
"Previously, there was a model to detect if someone's hand went into a product space," Jenkins explained during the presentation. "Then there was a model to determine if the image of the item that came out of that product space looked like the item that we thought was there. Then there might be a model for counting the number of items that came out of that space. And what Amazon has learned is like what has been learned in the large language model (LLM) AI space elsewhere, which is you can actually combine all of these data inputs into a single model."
According to Amazon, Just Walk Out is currently available in over 170 third-party locations such as airports, stadiums, universities, and hospitals in the U.S., the U.K., Australia, and Canada. Amazon plans to launch more Just Walk Out stores in 2024 than in any prior year, more than doubling the number of third-party stores with the technology this year.
"The new, Just Walk Out multi-modal foundation model for physical stores is a significant advancement in the evolution of checkout-free shopping," Jenkins said in a corporate blog post. "It increases the accuracy of Just Walk Out technology even in complex shopping scenarios with variables such as camera obstructions, lighting conditions, and the behavior of other shoppers, while allowing us to simplify the system."