dexterous interaction lab
靈巧互動實驗室
dexterous: done with mental or physical skill, quickness, or grace (Merriam-Webster)
From using chopsticks to manufacturing, humans skillfully perform a variety of tasks; yet we mostly only tap on screens when interacting with modern computers like our smartphones. Our lab envisions future Human-Computer Interaction (HCI) that can leverage and even augment our dexterity.
One key to human dexterity is haptic feedback. Humans perform tasks by combining muscular control with a rich set of senses, including the precise perception of touch and force. While most advanced haptic devices developed for teleoperation require bulky infrastructure, our lab brings the rich senses of haptics to wearables, Augmented Reality (AR), and assistive technology. Our lab has investigated novel haptic feedback that goes beyond simple vibration for notifications, such as providing the sense of touch through miniature fingernail-worn devices to assist in repair tasks, guiding hand poses for playing musical instruments with a robotic exoskeleton, and offering a new tactile sensory substitution that helps Blind users grasp everyday objects. Our lab believes that enabling dexterous interactions in computing can enhance daily experiences and empower humans both physically and cognitively.
Dexterous Interaction Lab is a newly established research lab led by Prof. Shan-Yuan Teng (PhD, University of Chicago) of the Department of Computer Science & Information Engineering at National Taiwan University (NTU, Taipei, Taiwan).
Research themes
Our lab is interested in exploring what interactive devices will look like, feel like, and be like to live with, much further into the future (e.g., 10 years from now). Specifically, our lab explores these themes by developing/inventing novel software-hardware systems (e.g., wearables) and conducting in-depth user studies (e.g., perception, user experience):
- How might we design interactive systems that increase users' dexterity? While AI systems have advanced in text and video domains, there is little assistance for physical tasks (repair, craft, cooking, sports, etc.). How might we design systems that allow users to perform better, and even learn better?
- How might we design multisensory systems in the wild? Most multisensory systems are confined to specific venues (theme parks, simulators); how might we enable richer experiences that integrate into our daily mobile lives?
- How might we design devices for more people? Our bodies are all different; how can we design systems that support individual needs, such as assistive tools for people with disabilities?
Join/Collaborate with the lab
Interested? The lab has just launched and is recruiting students and open to collaboration. Prospective students, please email us about your interests and skills (programming and hardware prototyping experience is recommended).
Principal Investigator
Shan-Yuan Teng [CV]
Assistant Professor
鄧善元 助理教授
tengshanyuan@csie.ntu.edu.tw
*Shan-Yuan is the first name; emails in English or Traditional Chinese are welcome.
News
- The lab officially starts in August 2025! Lab space is under construction; stay tuned.