
Designing Realities 2019

Following up on our Blended Reality Summer School in Singapore, I was invited to present at a two-day seminar, “Designing for the Future: Virtual, Augmented and Blended Realities”, at the National University of Singapore on December 16 and 17, 2019.

“Virtual Reality (VR) generates graphics and sounds to place you in a spectacular imaginary world, while Augmented Reality (AR) overlays virtual elements to augment your real-world environment. Mixed Reality (MR) maps real-world environments to overlay and interact with virtual objects, and we believe this enables novel interactions, such as using natural wall borders or table surfaces to provide passive haptic feedback for game physics. This form of MR, which we like to call “Blended Reality (BR)”, interconnects the digital and the physical by harnessing knowledge from augmented and virtual reality, tangible user interfaces, radar sensing, computer vision, wearable computing, discreet computing and ubiquitous computing. This blend of technology allows us to explore and design discreet interactions which weave computing into the literal or figurative fabric of day-to-day life.”

OBJECT INTERACTION IN AR

2019 Seminar: Object recognition in HCI with Radar, Vision and Touch.

In April 2019, I will deliver a new lecture on Object Recognition in HCI with Radar, Vision and Touch at the School of Computing, National University of Singapore.

To access the papers noted here, click a paper’s name below for the PDF directly, or click the ACM logo beside the paper’s name for the BibTeX and the official ACM copy.

Abstract

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals and so on. In this talk I will delve into three forms of sensing for object detection and interaction: with radar, with blurred images and with touch.

RadarCat (UIST 2016, Interactions 2018, IMWUT 2018) is a small, versatile system for material and object classification which enables new forms of everyday proximate interaction with digital devices. RadarCat exploits the raw radar signals that are unique when different materials and objects are placed on the sensor. Using machine learning techniques, these objects can be accurately recognized; an object’s thickness, its state (a filled or empty mug) and different body parts can also be recognized. This gives rise to research and applications in context-aware computing, tangible interaction (with tokens and objects), industrial automation (e.g., recycling) and laboratory process control (e.g., traceability). AquaCat (MobileHCI 2017 workshop), meanwhile, is a low-cost radar-based system capable of discriminating between a range of liquids and powders. Further, in Solinteraction we explore two research questions with radar as a platform for sensing tangible interaction: counting, ordering and identifying objects, and tracking their orientation, movement and distance. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns that go beyond static interaction and are continuous and dynamic with radar.
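The core idea behind RadarCat — that different materials return distinguishable raw radar signatures which a trained classifier can separate — can be illustrated with a toy sketch. This is not the RadarCat implementation (which uses machine learning over Soli radar signals); the signature lengths, class names and nearest-template classifier here are all hypothetical stand-ins:

```python
# Toy sketch of signature-based object classification, standing in for
# RadarCat's ML pipeline. All values and class names are illustrative.
import random

random.seed(0)
SIG_LEN = 64  # hypothetical number of samples per radar signature

# Fake one characteristic signature per material/object class.
templates = {
    "empty mug":  [random.gauss(0.0, 0.1) for _ in range(SIG_LEN)],
    "filled mug": [random.gauss(2.0, 0.1) for _ in range(SIG_LEN)],
    "aluminium":  [random.gauss(4.0, 0.1) for _ in range(SIG_LEN)],
}

def classify(reading):
    """Return the class whose template is closest in squared Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda name: dist(reading, templates[name]))

# A noisy reading of a filled mug still classifies correctly.
noisy = [s + random.gauss(0.0, 0.3) for s in templates["filled mug"]]
print(classify(noisy))  # -> filled mug
```

In the real system the classifier is trained on many readings per object rather than matching a single stored template, but the principle — map a raw signature to the nearest learned class — is the same.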

Beyond radar, SpeCam (MobileHCI ’17) is a lightweight surface color and material sensing approach for mobile devices which uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support “discreet computing” with micro-interactions, helping users avoid the numerous distractions they face daily with today’s mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space 10 degrees apart near the 3 dominant colors and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for using the phone when it is placed face down, working from blurred images.
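The color-recognition part of the study boils down to matching a sensed hue against reference hues on the circular HSB hue wheel. The sketch below is not SpeCam’s pipeline — the 10-degree reference spacing merely echoes the separability figure quoted above, and the palette is hypothetical — but it shows the circular-distance matching involved:

```python
# Illustrative sketch of nearest-hue matching in HSB space; not SpeCam's
# actual pipeline. The 10-degree reference palette is a hypothetical choice.
def hue_distance(h1, h2):
    """Shortest angular distance between two hues, in degrees."""
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)

# Reference hues spaced 10 degrees apart around the hue wheel.
reference_hues = list(range(0, 360, 10))

def classify_hue(sensed_hue):
    """Snap a sensed hue to the nearest reference hue."""
    return min(reference_hues, key=lambda h: hue_distance(h, sensed_hue))

print(classify_hue(13))   # -> 10
print(classify_hue(357))  # -> 0 (distance wraps around the hue circle)
```

The wrap-around in `hue_distance` matters: hues 357° and 0° are only 3° apart, which a plain absolute difference would miss.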

Finally, with touch, we show a sensing technique for detecting finger movements on the nose using EOG sensors embedded in the frame of a pair of eyeglasses (ISWC 2017). Eyeglass wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle gestures, in the spirit of “discreet computing”, can be used to control a wearable computer without calling attention to the user in public. We present two user studies in which we test recognition accuracy for these movements. I will conclude this talk with some speculation on how touch, radar and vision processing might be used to realise “blended reality” interactions in AR and beyond.
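To make the gesture-recognition idea concrete, here is a deliberately simplified sketch: distinguishing a brief flick from a sustained rub by how long an EOG-like signal stays above a noise threshold. The real ISWC 2017 classifier is far more capable; the threshold and duration cut-off below are hypothetical:

```python
# Toy gesture discriminator for a 1-D EOG-like signal; not the ISWC 2017
# classifier. Threshold and duration cut-off are illustrative assumptions.
def classify_nose_gesture(samples, threshold=0.5, flick_max_samples=5):
    """Label a window of sensor samples by its active duration."""
    active = sum(1 for s in samples if abs(s) > threshold)
    if active == 0:
        return "none"
    # A flick is a short burst; a rub keeps the signal elevated.
    return "flick" if active < flick_max_samples else "rub"

flick = [0.0, 0.9, 0.8, 0.0, 0.0, 0.0, 0.0, 0.0]
rub   = [0.0, 0.7, 0.8, 0.7, 0.9, 0.8, 0.7, 0.6]
print(classify_nose_gesture(flick))  # -> flick
print(classify_nose_gesture(rub))    # -> rub
```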

I will also use this talk to answer questions on the upcoming Blended Reality Summer School, May 13, 2019 to May 17, 2019 at the Keio-NUS CUTE Center, National University of Singapore. Applications for this will open soon. 


2019 Seminar: CUTE Center Singapore

I am currently in Singapore on sabbatical with the CUTE Center. My welcome seminar was on the topic of Discreet Computing and showcased a number of SACHI projects.

The Keio-NUS CUTE (Connective Ubiquitous Technology for Embodiments) Center is a joint collaboration between the National University of Singapore (NUS) and Keio University, Japan, partially funded by a grant from the National Research Foundation (NRF) administered through the Media Development Authority (MDA) of Singapore. The center’s main objective is collaborative fundamental research in the general area of Interactive Digital Media, targeted at addressing the future of interactive, social and communication media.