Zhuo Wang

HCI Researcher



Hello, I'm Zhuo


I am a senior student at Xi'an Jiaotong–Liverpool University. My research in Human-Computer Interaction (HCI) sits at the intersection of AI and XR (wearable/sensing technologies), with a focus on two directions:
1) Designing AI-powered XR systems that enable more intelligent and adaptive spatial interactions based on users' needs, contexts, and environments.
This typically involves a three-step closed loop (see the code sketch after the note below): first, the system senses and tracks the current environment (including scenes and objects) and the user's multimodal inputs, converting them into digital information; second, AI comprehends and reasons about this information; third, the system adaptively determines when to intervene (timing), how to present information (multisensory output), what content to deliver, and where to place the assistance in the user's environment. This closed loop provides users with on-demand, timely, and context-aware support throughout their interaction, improving their perception and cognition.
2) Exploring and designing human-AI collaboration (with both physical and virtual agents) in XR environments to support individual or group needs, particularly in learning, creative work, and team collaboration.
I am especially interested in XR as a mediating layer for human-AI interaction: multimodal interaction enables precise intent expression and configurable AI boundaries (an agent's role, degree of involvement, etc.), and XR's spatial nature allows AI to manifest in diverse forms (as an avatar or a tool) and make its reasoning visible.
Note: I view XR as an increasingly mature ecosystem of wearable and sensing devices. When a research problem doesn't require a full XR system, or when a subset of devices offers a more elegant solution, I'm open to using individual wearables or sensors instead. So when I refer to "XR" throughout this statement, it can be understood broadly as wearable and sensing technologies, for example, "designing AI-powered wearable or sensing systems".
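To make the first direction concrete, here is a minimal Python sketch of the sense-reason-adapt loop described above. It is an illustration under stated assumptions, not an actual system: all names (SceneSnapshot, Intervention, sense, reason, adapt) are hypothetical, and the sensing and reasoning steps are stubbed with fixed examples.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SceneSnapshot:
    """Step 1 output: sensed environment and user state as digital information."""
    objects: list[str]       # tracked objects in the scene
    user_gaze_target: str    # what the user is currently looking at
    user_utterance: str      # latest multimodal input (speech, in this example)

@dataclass
class Intervention:
    """Step 3 output: the system's adaptive assistance decision."""
    timing: str     # when to intervene, e.g. "now" or "defer"
    modality: str   # how to present, e.g. "audio" or "visual overlay"
    content: str    # what content to deliver
    placement: str  # where to anchor the assistance in the environment

def sense() -> SceneSnapshot:
    # Step 1: a real system would fuse camera, gaze, and speech tracking;
    # this stub returns a fixed example snapshot.
    return SceneSnapshot(
        objects=["espresso machine", "mug"],
        user_gaze_target="espresso machine",
        user_utterance="how do I start this?",
    )

def reason(snapshot: SceneSnapshot) -> Optional[str]:
    # Step 2: an AI model would interpret the snapshot; this stub simply
    # pairs a detected help request with the gazed-at object.
    if "how" in snapshot.user_utterance:
        return f"User needs guidance for the {snapshot.user_gaze_target}."
    return None

def adapt(need: str, snapshot: SceneSnapshot) -> Intervention:
    # Step 3: decide timing, modality, content, and placement.
    return Intervention(
        timing="now",
        modality="visual overlay",
        content=need,
        placement=f"next to the {snapshot.user_gaze_target}",
    )

if __name__ == "__main__":
    # One pass of the closed loop; a deployed system would run it continuously.
    snapshot = sense()
    need = reason(snapshot)
    if need is not None:
        print(adapt(need, snapshot))
```

The point of structuring the loop this way is that the four adaptation decisions (timing, modality, content, placement) remain explicit and separately tunable; in a deployed system, sense() would consume live tracking streams and reason() would call a multimodal model.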
I have been fortunate to work with the following mentors: Xiaojuan Ma at HKUST and Qian Zhu at Renmin University of China; Karan Ahuja at Northwestern University; Michael Nebeling and Janet Johnson at the University of Michigan; and Brennan Jones at XJTLU.

 
If you have a research project related to XR or AI + XR and are looking for a collaborator in these directions, please feel free to contact me!

Projects




VR Data Story


This project aims to explore the potential of using a VR data story to raise people’s situation awareness of health risks




VR Museum


This project aims to understand the promises and challenges of experiencing and curating exhibitions in VR




Interaction Technology for Situated Visualization in Public Environments


This project explores socially acceptable spatial interaction design for situated visualization




Navigation Techniques in Multi-Scale Environments


This project focuses on the design and evaluation of a unified multi-scale navigation user interface to help users quickly understand spatial and hierarchical information in multi-scale virtual environments




A Streaming Gesture Recognition Framework


This project aims to develop a gesture recognition framework for resource-constrained scenarios




Design of Mixed Reality Systems to Enrich the Beverage Experience


This project focuses on an adaptive multisensory drinking system




Investigating Embodied Conversational Agents to Reduce LLM Hallucination in Virtual Reality Education


This project explores different hallucination-aware cue designs for embodied conversational agents




GrabXR


This project aims to design and develop an XR hand-grab system for occluded environments




Active Teaching Agent


This project aims to provide a personalized, interactive learning and responsive teaching experience




Group-AI Interaction in MR


This project explores how to use MR as a medium to regulate AI agents in collaborative work.

Publications


Duo Streamers: A Streaming Gesture Recognition Framework


Boxuan Zhu, Sicheng Yang, Zhuo Wang, Hai-Ning Liang, Junxiao Shen

2025


Make Interaction Situated: Designing User Acceptable Interaction for Situated Visualization in Public Environments


Qian Zhu, Zhuo Wang, Wei Zeng, Wai Tong, Weiyue Lin, Xiaojuan Ma

CHI '24, Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 2024


From reader to experiencer: Design and evaluation of a VR data story for promoting the situation awareness of public health threats


Qian Zhu, Linping Yuan, Zian Xu, Leni Yang, Meng Xia, Zhuo Wang, Hai-Ning Liang, Xiaojuan Ma

International Journal of Human-Computer Studies, vol. 181, 2024, p. 103137


DreamVR: Curating an Interactive Exhibition in Social VR Through an Autobiographical Design Study


Jiaxun Cao, Qingyang He, Zhuo Wang, RAY LC, Xin Tong

CHI '23, Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 2023


