Abstract
Drinking is an inherently multisensory activity, yet the potential of immersive technology to dynamically shape flavor experiences remains underexplored in Human-Food Interaction (HFI) research. We introduce “XTea”, an adaptive cup-based beverage system that uses large language models to translate natural language input into modifications of a parameterized immersive environment, experienced through a headset while drinking bubble tea. Through a study with 12 bubble tea enthusiasts, we derived themes demonstrating how “XTea” can enrich sensory engagement, support personalized and agentic experiences, and foster the social qualities of drinking, pointing toward new directions for multisensory HFI design. We also present four design strategies for multisensory beverage experiences. Ultimately, we aim to advance HFI research on how multisensory interaction design can enrich flavor perception and engagement.