Welcome to my HCI world!

I am Jindu Wang (王锦都), currently an M.Phil. student at the HKUST VisLab, Department of Computer Science and Engineering, the Hong Kong University of Science and Technology (HKUST), supervised by Prof. Huamin Qu. Before joining HKUST, I obtained my B.S. degree in Software Engineering from Xi'an Jiaotong University.

I am actively seeking PhD opportunities for Fall 2026 admission. Feel free to contact me if you have openings or would like to discuss potential collaborations!

My primary research focus is Human-Computer Interaction, particularly at the intersection of Wearable Computing, Cognitive Science, and AI. I aim to understand user intentions and needs by using LLMs to interpret multimodal data, with a specific focus on the unique spatial and environmental context captured by AR/VR technologies, complemented by an ecosystem of other wearable sensors. By integrating theories from cognitive science, I seek to design and develop context-aware, seamless multimodal interactions and applications that enhance users' quality of life and well-being. My work follows two connected threads:

1) Understanding: How can we utilize multimodal contextual information and user profiles to understand users' states, intentions, and needs at specific moments and in particular contexts?

2) Execution: How can we design natural and seamless multimodal interactions, interfaces, and applications based on this in-context understanding to provide proactive and/or user-initiated assistance?

I am fortunate to collaborate extensively with outstanding international researchers. I work closely with Xiang Li and Prof. Per Ola Kristensson from the University of Cambridge, as well as Runze Cai and Prof. Shengdong Zhao from NUS/CityU Hong Kong.

I was a research intern at the KAIST HCI Lab, advised by Prof. Geehyuk Lee. Previously, I interned at the Institute of Software, Chinese Academy of Sciences (ISCAS) and at Lenovo Research, under the supervision of Prof. Teng Han and Dr. Nianlong Li, respectively. I was also a research assistant at Xi'an Jiaotong University, advised by Zhongmin Cai.

News

  • 2025.7: One first-author paper conditionally accepted by ISMAR 2025 (TVCG track) and one co-author paper accepted to ACM UIST 2025!!! Korea, see you again~
  • 2025.4: Will attend CHI 2025, hiiii Yokohama! O(∩_∩)O
  • 2025.1: Started a collaboration on a new project with Prof. Shengdong Zhao and Runze Cai; let's do something interesting!
  • 2024.9: Submitted one ACM CHI 2025 paper; thanks to my collaborators, and hoping for good luck~ (No luck this time, as it turned out.)
  • 2024.8: Joined HKUST VisLab as an M.Phil. student!

Publications

  • ISMAR 2025
    Handows: A Palm-Based Interactive Multi-Window Management System in Virtual Reality
    Jin-Du Wang, Ke Zhou, Haoyu Ren, Per Ola Kristensson, Xiang Li
    IEEE Transactions on Visualization and Computer Graphics (Special Issue of IEEE ISMAR 2025, To Appear)
  • UIST 2025
    NeuroSync: Intent-Aware Code-Based Problem Solving via Direct LLM Understanding Modification
    Wenshuo Zhang, Leixian Shen, Shuchang Xu, Jin-Du Wang, Jian Zhao, Huamin Qu, Linping Yuan
    ACM Symposium on User Interface Software and Technology (UIST) 2025
  • SUI 2023
    Xiang Li, Jin-Du Wang, John J. Dudley, Per Ola Kristensson
    ACM Symposium on Spatial User Interaction (SUI) 2023
  • C&G 2024
    Xiang Li, Jin-Du Wang, John J. Dudley, Per Ola Kristensson
    Computers & Graphics 2024

Services

Reviewer: CHI 2025 (Late-Breaking Work), IMX 2025 (Technical Papers), CHI PLAY 2024 (Work-in-Progress, Special Recognition), ISS 2024 (Full Papers)

Projects (Selected)

Capillary Network: Exploring Interspecies Kinship through Real Data and a Biologically Inspired Algorithm


Capillary Network is a data-driven interactive experience that speculatively visualizes the invisible diffusion of poisonous substances across species. Tracing how pesticides permeate various organisms, including wildlife, beneficial invertebrates, companion animals, and humans, it reveals a haunting yet beautiful kinship among them. Viewers are invited to be part of the system, where toxins circulate like blood, prompting reflection on life, death, and human–nonhuman entanglement.

Dual-Stick: Dual Sticks Controller for Enhancing Raycasting Interactions with Virtual Objects


This work presents Dual-Stick, a novel controller made of two sticks connected at one end, which provides extra input dimensions, such as dual rays, to enrich raycasting input in VR. Based on a design space analysis of everyday stick-shaped tools, we designed a Dual-Stick prototype consisting of 3D-printed dual sticks, springs, and reflective balls for tracking. Two user studies evaluated the performance of Dual-Stick in target selection and manipulation tasks; the device also opens up novel interaction opportunities for VR applications. Results showed that it can effectively improve target selection efficiency, especially for small targets, and enable flexible, coordinated manipulation through its mode-switch mechanism. This work highlights the potential of new controller structures in VR that exploit users' ability to manipulate everyday tools.

Swarm Manipulation in Virtual Reality


This project explores efficient techniques for manipulating multiple objects simultaneously in virtual reality environments. The work focuses on developing intuitive interaction methods that allow users to control swarms of virtual objects with natural gestures and movements.