Pre-recorded Sessions: From 4 December 2020 | Live Sessions: 10 – 13 December 2020

4 – 13 December 2020

#SIGGRAPHAsia | #SIGGRAPHAsia2020

Emerging Technologies

  • Ultimate Supporter
  • Ultimate Attendee
  • Basic Attendee

Date: Thursday, December 10th
Time: 11:00am - 12:00pm
Venue: Zoom Room 9


Note: All live sessions will be screened in Singapore Time (GMT+8).


Q&A for Wearable Sanitizer: Design and Implementation of an Open-source, On-body Sanitizer

Abstract: We present an open-source, wearable sanitizer that provides just-in-time, automatic dispensing of alcohol to the wearer's hand or to nearby objects, using sensors and programmable cues. We systematically explore the design space for wearable sanitizers, aiming to create a device that seamlessly integrates with the user's body.

Author(s)/Presenter(s):
Pat Pataranutaporn, MIT, United States of America
Ali Shtarbanov, MIT, United States of America
Glenn Fernandes, MIT, United States of America
Jingwen Li, MIT, United States of America
Parinya Punpongsanon, Osaka University, Japan
Joe Paradiso, MIT, United States of America
Pattie Maes, MIT, United States of America


Q&A for MAScreen: Augmenting Speech with Visual Cues of Lip Motions, Facial Expressions, and Text Using a Wearable Display

Abstract: MAScreen is a wearable LED display in the shape of a mask that senses lip motion and speech, providing real-time visual feedback on the vocal expression and emotion behind the mask. It can transform vocal data into text, emoji, and other languages.

Author(s)/Presenter(s):
Hyein Lee, Korea Advanced Institute of Science and Technology (KAIST), South Korea
Yoonji Kim, Korea Advanced Institute of Science and Technology (KAIST), South Korea
Andrea Bianchi, Korea Advanced Institute of Science and Technology (KAIST), South Korea


Q&A for ZoomTouch: Multi-User Remote Robot Control in Zoom by DNN-based Gesture Recognition

Abstract: ZoomTouch is a robot-human interaction system that allows an operator or a group of people to control a robotic arm from anywhere in the world through a video-call application, using DNN-based hand tracking. No glove or special hand-tracking device is required; a camera is all you need.

Author(s)/Presenter(s):
Ilya Zakharkin, Skolkovo Institute of Science and Technology, Moscow Institute of Physics and Technology (State University), Russia
Arman Tsaturyan, Skolkovo Institute of Science and Technology, Moscow Institute of Physics and Technology (State University), Russia
Miguel Altamirano Cabrera, Skolkovo Institute of Science and Technology, Russia
Jonathan Tirado, Skolkovo Institute of Science and Technology, Russia
Dzmitry Tsetserukou, Skolkovo Institute of Science and Technology, Russia


Q&A for Dual Body: Method of Tele-Cooperative Avatar Robot with Passive Sensation Feedback to Reduce Latency Perception

Abstract: Dual Body was developed as a telexistence system in which the user does not need to continuously operate an avatar robot but can still passively perceive feedback sensations when the robot performs actions. This method greatly reduces perceived latency and fatigue.

Author(s)/Presenter(s):
Vibol Yem, Tokyo Metropolitan University, Japan
Kentaro Yamaoka, Tokyo Metropolitan University, Japan
Gaku Sueta, Tokyo Metropolitan University, Japan
Yasushi Ikei, University of Tokyo, Japan

