Xiaoying Wei

I am a Ph.D. student (2022–present) in Computational Media and Arts (CMA) at HKUST, supervised by Prof. Mingming Fan. My research interest is Human-Computer Interaction (HCI). I received my Master's degree from the Department of Computer Science and Technology at Tsinghua University in 2020 and my Bachelor's degree from Xiamen University in 2017.

I focus on the fields of HCI, Accessibility, and Virtual Reality. Specifically, I apply user-centered design, VR/AR, sensing, and qualitative research methods to empower people by 1) Supporting Better Interpersonal Communication Using Novel VR/AR Techniques, and 2) Understanding and Mitigating Aging and Accessibility Challenges by Strengthening Social Connections.

I am always glad to collaborate with researchers in the HCI field.


Publications


CHI 2023 (TH-CPL A)
Bridging the Generational Gap: Exploring How Virtual Reality Supports Remote Communication Between Grandparents and Grandchildren
Xiaoying Wei, Yizheng Gu, Emily Kuang, Beiyan Cao, Xian Wang, Xiaofu Jin, Mingming Fan*

When living apart, grandparents and grandchildren often use audio-visual communication approaches to stay connected. However, these approaches seldom provide sufficient companionship and intimacy due to a lack of co-presence and spatial interaction, which immersive virtual reality can fulfill. To understand how grandparents and grandchildren might leverage VR to facilitate remote communication, and to better inform future design, we conducted this work...
[PDF]

ChineseCHI 2022 Honorable Mention Award
Communication in Immersive Social Virtual Reality: A Systematic Review of 10 Years' Studies
Xiaoying Wei, Xiaofu Jin, Mingming Fan*

As virtual reality technologies have improved in the past decade, more research has investigated how they could support more effective communication in various contexts to improve collaboration and social connectedness. We conducted a systematic review of the studies investigating communication in social VR in the past ten years by following the PRISMA guidelines.
[PDF]

UbiComp 2022 (TH-CPL A)
Synapse: Interactive Guidance by Demonstration with Trial-and-Error Support for Older Adults to Use Smartphone Apps
Xiaofu Jin, Xiaozhu Hu, Xiaoying Wei, Mingming Fan*

As smartphones are widely adopted, mobile applications are emerging to provide critical services; however, this trend may create barriers for older adults. We designed an app-independent mobile service that allows help-givers to create multimodal interactive tutorials on a smartphone and help-receivers (e.g., older adults) to receive interactive guidance with trial-and-error support as they work on the same task...
[PDF]

CHI 2021 (TH-CPL A)
Auth+Track: Enabling Authentication Free Interaction on Smartphone by Continuous User Tracking
Chen Liang, Chun Yu, Xiaoying Wei, Xuhai Xu, Yongquan Hu, Yuntao Wang, Yuanchun Shi

We propose Auth+Track, a novel authentication model that aims to reduce redundant authentication in everyday smartphone usage. By sparse authentication and continuous tracking of the user's status, Auth+Track eliminates the "gap" authentication between fragmented sessions and enables "Authentication Free when User is Around"...
[PDF]

UbiComp 2021 (TH-CPL A)
QwertyRing: Text Entry on Physical Surfaces Using a Ring
Yizheng Gu, Chun Yu, Zhipeng Li, Zhaoheng Li, Xiaoying Wei, Yuanchun Shi

The software keyboard is widely used on digital devices such as smartphones, computers, and tablets. It operates via touch, which is efficient, convenient, and familiar to users. However, some emerging devices such as AR/VR headsets and smart TVs do not support touch-based text entry. In this paper, we present QwertyRing, a technique that supports text entry on physical surfaces using an IMU ring...
[PDF]

CHI 2019 (TH-CPL A)
HandSee: Enabling Full Hand Interaction on Smartphones with Front Camera-based Stereo Vision
Chun Yu, Xiaoying Wei, Shubh Vachher, Yue Qin, Chen Liang, Yueting Weng, Yizheng Gu, Yuanchun Shi

We present HandSee, a novel sensing technique that can capture the state of the user's hands touching or gripping a smartphone. We place a prism mirror on the front camera to achieve stereo vision of the scene above the touchscreen surface. HandSee enables a variety of novel interaction techniques and expands the design space for full hand interaction on smartphones...
[PDF]

UIST 2019 (TH-CPL A)
Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor
Yizheng Gu, Chun Yu, Zhipeng Li, Weiqi Li, Shuchang Xu, Xiaoying Wei, Yuanchun Shi

Head-mounted MR systems enable touch interaction on any physical surface. However, optical methods (i.e., cameras on the headset) have difficulty determining touch contact accurately. We show that a finger-worn IMU ring can substantially improve the accuracy of contact sensing from 84.74% to 98.61% (F1 score), with a low latency of 10 ms...
[PDF]



Education


HKUST 02/2022 - Present
Ph.D. student in Computational Media and Arts

Tsinghua University 09/2017 - 06/2020
M.S. in Computer Science and Technology

Xiamen University 09/2013 - 06/2017
B.S., Software Institute


Internship & Work Experience


Product Manager, Meituan 06/2020 - 11/2021
UX Designer, Huawei 06/2018 - 09/2018


Honors & Awards


Outstanding Award of the Tsinghua University Challenge Cup          2018
Second Prize of the Service Outsourcing Entrepreneurship Competition for Chinese College Students          2016
Silver & Best Popularity Award of the Creative Public Service Advertising Competition in Fujian          2016


Skills


Using Maya, Flash, After Effects, Premiere, Photoshop, and Illustrator for prototyping, 3D modeling, and visual effects.
Using Unity3D, Android Studio, and Processing for demo development.



Last updated: Jan. 2023