4:20 PM - 4:40 PM
[3I5-GS-11-03] In-store Customer Trajectory Generation with the Mamba2 Architecture
Keywords: in-store customer journey, trajectory generation, Mamba2, Transformer, deep learning
Research on human trajectory generation using the Transformer architecture has been advancing. This technology enables the generation of highly "human-like" movement trajectories, such as commuting from home to work and back or navigating a retail store from entry to checkout. However, the computational complexity of Transformers grows with the square of the context length, making them poorly suited to generating movement trajectories that require long-term memory, such as those of customers who spend extended periods in a store. To address this issue, this study applies the Mamba2 architecture to the same human trajectory generation task, aiming to generate in-store movement trajectories for long-stay customers. Mamba2 is based on state space models and has a computational complexity that scales linearly with context length. In this presentation, we compare the generation results of both methods, with particular attention to the elapsed time since store entry, and discuss the appropriate use cases for each model.
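To illustrate the complexity contrast the abstract refers to, the sketch below (not the authors' implementation; all names, shapes, and parameters are illustrative assumptions) compares single-head causal self-attention, whose cost grows quadratically with trajectory length, against a diagonal linear state-space recurrence of the kind Mamba2 builds on, whose per-step cost is independent of how long the customer has been in the store.

```python
# Minimal sketch, assuming toy per-step embeddings of an in-store trajectory.
# It contrasts the scaling of causal attention (~O(T^2)) with a diagonal
# state-space recurrence (~O(T)); it is not the presented model.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_state = 16, 8     # embedding and SSM state sizes (assumed)
T = 2000                     # number of trajectory steps (assumed)

# Toy trajectory embeddings: one d_model vector per time step
# (e.g. an encoding of grid cell and dwell time).
x = rng.standard_normal((T, d_model))

def causal_attention(x):
    """Single-head causal attention: every step attends to all previous steps,
    so total work and memory grow with T**2."""
    q = k = v = x                                # untrained projections, illustration only
    scores = q @ k.T / np.sqrt(x.shape[1])       # (T, T) score matrix -> quadratic cost
    mask = np.tril(np.ones((len(x), len(x)), dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v

def ssm_scan(x, A_diag, B, C):
    """Diagonal linear SSM: h_t = A*h_{t-1} + B x_t, y_t = C h_t.
    The state h has fixed size, so total work grows linearly with T."""
    h = np.zeros(d_state)
    ys = np.empty((len(x), d_model))
    for t, x_t in enumerate(x):
        h = A_diag * h + B @ x_t                 # O(d_state * d_model) per step, independent of t
        ys[t] = C @ h
    return ys

A_diag = np.exp(-rng.uniform(0.01, 0.1, d_state))  # stable decay, retains long-range memory
B = rng.standard_normal((d_state, d_model)) * 0.1
C = rng.standard_normal((d_model, d_state)) * 0.1

y_attn = causal_attention(x)        # cost ~ O(T^2 * d_model)
y_ssm = ssm_scan(x, A_diag, B, C)   # cost ~ O(T * d_state * d_model)
print(y_attn.shape, y_ssm.shape)    # both (T, d_model)
```

For a long-stay customer, T keeps growing with time in store; the attention score matrix grows as T^2 while the SSM carries only a fixed-size state forward, which is the motivation for trying Mamba2 on this task.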