
October – SxSW – Australia’s Next Giant Leap

I joined a panel to discuss music, technology and the arts at SxSW Sydney this week.

Australians have a talent for ingenuity and creative thinking. From ancient times, Australians have used the materials around them to develop unique and practical tools to help them live and prosper. Among the huge variety of Australian inventions are the boomerang, cochlear implants, polymer banknotes, wi-fi technology and, of course, the mighty Hills Hoist.

The lucky country: a land of economic opportunity and bountiful natural resources. A young, multicultural and open country with an ever-growing economy, Australia has avoided recessions like no other, become a destination for tertiary education and built some of the world’s newest technology unicorns.

With the pandemic in its rear-view mirror, it’s a good time to assess where Australia is going. National and global challenges have anything but diminished, and what got us here won’t get us there.

Has the mineral boom ended? Will our universities lead the way and continue to attract students? Is our growing service economy going to change anything? Will the Australian market ever be big enough for companies to grow here? What opportunities will the metaverse, blockchain and NFTs provide Australian businesses? Is Australia really the best place to live?

Where is Australia’s next giant leap?

Oct 16 – ISMAR 2023

ISMAR 2023, the premier conference for Augmented Reality (AR), Mixed Reality (MR), and Virtual Reality (VR), collectively referred to as eXtended Reality (XR), was held in Sydney from October 16-20, 2023, where I served as one of the honorary general chairs. CSIRO was one of the conference sponsors this year.

IEEE ISMAR attracts the world’s leading AR, MR and VR researchers from both academia and industry, and explores advances in commercial and research activities in these areas, having steadily expanded its scope over the past several years. The conference included two days of workshops, tutorials, and contests, and three days of presentations of scientific papers, posters, and demonstrations. In addition, exhibitions were arranged to promote interaction and exchange between industry and academia.

My deepest thanks and congratulations to Barrett Ens, Denis Kalkofen, Gelareh Mohammadi and Frank Guan, the General Chairs of IEEE ISMAR 2023.

Oct 10 – The Underground @ K17 UNSW

In my role as Head of School at UNSW, I was pleased to bring the redevelopment of the lower ground floor into our plans for a more entrepreneurial school and campus. With an Underground, Overground and High Line for different forms of industry engagement, spin-ins and spin-outs, I look forward to seeing CSE develop and extend its impact in the years to come.

Oct 9 – ACM VRST 2023

User engagement in Virtual Reality (VR) games is crucial for creating immersive and captivating gaming experiences that meet the expectations of players. However, understanding and measuring these levels in VR games presents a challenge for game designers, as current methods, such as self-reports, may be limited in capturing the full extent of user engagement. Additionally, approaches based on biological signals to measure engagement in VR games present complications and challenges, including signal complexity, interpretation difficulties, and ethical concerns. This study explores body movements as a novel approach to measuring user engagement in VR gaming. We employ E4, emteqPRO, and off-the-shelf IMUs to measure body movements from diverse participants engaged in multiple VR games. Further, we examine the simultaneous occurrence of player motivation and physiological responses to explore potential associations with body movements. Our findings suggest that body movements hold promise as a reliable and objective indicator of user engagement, offering game designers valuable insights for creating more engaging and immersive experiences.

This year I was pleased to join my co-author Dr Rukshani Somarathna at VRST 2023, where she presented our paper “Exploring User Engagement in Immersive Virtual Reality Games through Multimodal Body Movements”. Full citation: Rukshani Somarathna, Don Samitha Elvitigala, Yijun Yan, Aaron J Quigley, and Gelareh Mohammadi. 2023. Exploring User Engagement in Immersive Virtual Reality Games through Multimodal Body Movements. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (VRST ’23). Association for Computing Machinery, New York, NY, USA, Article 3, 1–8. https://doi.org/10.1145/3611659.3615687
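For readers curious what this kind of analysis might look like in practice, here is a minimal sketch, with made-up data shapes, feature choices and a generic regressor (my illustration, not the pipeline from the paper): extract simple movement-intensity features from an accelerometer trace and probe their association with self-reported engagement.

```python
# Minimal sketch (not the authors' pipeline): derive simple movement
# features from an IMU stream and relate them to engagement ratings.
# Data shapes, features and the regressor are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

def movement_features(accel: np.ndarray, fs: float = 100.0) -> np.ndarray:
    """accel: (n_samples, 3) accelerometer trace for one play session."""
    magnitude = np.linalg.norm(accel, axis=1)      # overall motion intensity
    jerk = np.diff(magnitude) * fs                 # rate of change of motion
    return np.array([magnitude.mean(), magnitude.std(), np.abs(jerk).mean()])

# One feature vector per session, paired with a self-reported engagement score.
rng = np.random.default_rng(0)
sessions = [rng.normal(0, 1 + i * 0.2, size=(6000, 3)) for i in range(10)]
X = np.stack([movement_features(s) for s in sessions])
y = rng.uniform(1, 7, size=10)                     # placeholder Likert ratings

model = LinearRegression().fit(X, y)               # probe the association
print("R^2 on training sessions:", model.score(X, y))
```

A real study would of course use validated engagement instruments and held-out sessions rather than training-set fit.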

Oct 4 – Stories of Connecting Research & Development at Scale

Bhautik Joshi of Canva and I submitted a Birds of a Feather session on stories of connecting R&D at scale.

Stories from industry and applied research at scale. How is applied R&D undertaken to achieve impact at scale, for use by millions or billions? Following an introduction and scene-setting, we will discuss what it takes to carry industry, applied research and startups along the R&D journey to achieve impact at scale, with stories and lessons from leading industry figures. Join us to discuss how to avoid the valley of death and the tar-pits of tech transfer on the way to a successful journey of scaling impact.

We were joined by Maggie Oh and Stella Solar as panelists on the day to discuss and answer questions.

Sept 29 – SydCHI for ACM VRST, ACM UIST and ACM UbiComp (IMWUT) paper presentations

On this day we welcomed SydCHI, the ACM SIGCHI Chapter, here to Data61 for a series of practice talks from Rukshani, Owen, Jieshan and Don. The details of these talks follow.

Rukshani Somarathna

Exploring User Engagement in Immersive Virtual Reality Games through Multimodal Body Movements

User engagement in Virtual Reality (VR) games is crucial for creating immersive and captivating gaming experiences that meet the expectations of players. However, understanding and measuring these levels in VR games presents a challenge for game designers, as current methods, such as self-reports, may be limited in capturing the full extent of user engagement. Additionally, approaches based on biological signals to measure engagement in VR games present complications and challenges, including signal complexity, interpretation difficulties, and ethical concerns. This study explores body movements as a novel approach to measuring user engagement in VR gaming. We employ E4, emteqPRO, and off-the-shelf IMUs to measure body movements from diverse participants engaged in multiple VR games. Further, we examine the simultaneous occurrence of player motivation and physiological responses to explore potential associations with body movements. Our findings suggest that body movements hold promise as a reliable and objective indicator of user engagement, offering game designers valuable insights for creating more engaging and immersive experiences.

29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023), Christchurch, New Zealand.

Rukshani Somarathna, Don Samitha Elvitigala, Yijun Yan, Aaron J Quigley, and Gelareh Mohammadi. 2023. Exploring User Engagement in Immersive Virtual Reality Games through Multimodal Body Movements. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (VRST ’23). Association for Computing Machinery, New York, NY, USA, Article 3, 1–8. https://doi.org/10.1145/3611659.3615687

Yongquan Hu (Owen)

MicroCam: Leveraging Smartphone Microscope Camera for Context-Aware Contact Surface Sensing

The primary focus of this research is the discreet and subtle everyday contact interactions between mobile phones and their surrounding surfaces. Such interactions are anticipated to facilitate mobile context awareness, encompassing aspects such as dispensing medication updates, intelligently switching modes (e.g., silent mode), or initiating commands (e.g., deactivating an alarm). We introduce MicroCam, a contact-based sensing system that employs smartphone IMU data to detect the routine state of phone placement and utilizes a built-in microscope camera to capture intricate surface details. In particular, a natural dataset is collected to acquire authentic surface textures in situ for training and testing. Moreover, we optimize the deep neural network component of the algorithm, based on continual learning, to accurately discriminate between object categories (e.g., tables) and material constituents (e.g., wood). Experimental results highlight the superior accuracy, robustness and generalization of the proposed method.

UbiComp/ISWC 2023, Oct 8-12, Cancún, Mexico

Yongquan Hu, Hui-Shyong Yeo, Mingyue Yuan, Haoran Fan, Don Samitha Elvitigala, Wen Hu, and Aaron Quigley. 2023. MicroCam: Leveraging Smartphone Microscope Camera for Context-Aware Contact Surface Sensing. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 7, 3, Article 98 (September 2023), 28 pages. https://doi.org/10.1145/3610921
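As a toy illustration of the two-stage idea in MicroCam (this is my sketch, with a made-up stillness threshold and a trivial placeholder in place of the paper’s trained network): first decide from accelerometer variance that the phone is resting on a surface, then hand the microscope frame to a texture classifier.

```python
# Hedged sketch, not the MicroCam implementation: IMU-gated surface sensing.
import numpy as np

STILLNESS_THRESHOLD = 0.02   # illustrative variance threshold, (m/s^2)^2

def is_resting(accel_window: np.ndarray) -> bool:
    """accel_window: (n, 3) recent accelerometer samples; low variance
    in the magnitude suggests the phone is lying still on a surface."""
    return np.linalg.norm(accel_window, axis=1).var() < STILLNESS_THRESHOLD

def classify_surface(frame: np.ndarray, classes=("wood", "fabric", "metal")) -> str:
    """Placeholder for a trained texture CNN; here a toy brightness heuristic."""
    return classes[int(frame.mean() * len(classes)) % len(classes)]

accel = np.random.default_rng(1).normal(0, 0.01, size=(200, 3))   # near-still
frame = np.random.default_rng(2).uniform(0, 1, size=(224, 224))   # fake image
if is_resting(accel):
    print("surface:", classify_surface(frame))
```

The gating step matters for battery life: the expensive camera path only runs once the cheap IMU check says the phone has been placed down.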

Jieshan Chen

Unveiling the Tricks: Automated Detection of Dark Patterns in Mobile Applications

Mobile apps bring us many conveniences, such as online shopping and communication, but some use malicious designs called dark patterns to trick users into doing things that are not in their best interest. Many works have summarized the taxonomy of these patterns, and some have tried to mitigate the problems through various techniques. However, these techniques are either time-consuming, not generalisable or limited to specific patterns. To address these issues, we propose a knowledge-driven system that utilizes computer vision and natural language pattern matching to automatically detect a wide range of dark patterns in mobile UIs. Our system relieves the need for manually creating rules for each new UI/app and covers more types with superior performance. In detail, we integrated existing taxonomies into a single, consistent one, conducted a characteristic analysis, and distilled knowledge from real-world examples and the integrated taxonomy. Based on this, our system consists of two components: UI element detection and a knowledge-driven dark pattern checker. For evaluation, we utilise the Rico dataset and its semantic labelling to train and test each DL module in our system. We also contribute a new dark pattern dataset, which contains 4,999 benign UIs and 1,353 malicious UIs of 1,660 instances spanning 1,023 mobile apps. Our system achieves superior performance in detecting dark patterns, with an overall precision of 0.83, recall of 0.82, and F1 score of 0.82. Our ablation experiments demonstrate the validity and necessity of each module. A user study involving 58 participants further showed that the system significantly increases users’ knowledge of dark patterns, raising the recall rate of detecting dark patterns from 18.5% to 57.8%. Our work is beneficial to end-users, app providers, and regulators, and can serve as a training tool for raising awareness of dark patterns.
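To make the natural language pattern matching component concrete, here is a toy sketch with my own illustrative rules (not the paper’s ruleset or coverage): flagging “confirmshaming” phrasing in dialog button labels with regular expressions.

```python
# Toy sketch of text-based dark pattern matching (illustrative rules only).
import re

CONFIRMSHAMING = [
    r"no thanks,? i (don'?t|do not) (want|like)",   # guilt-laden decline
    r"i (prefer|'?d rather) (to )?pay full price",
]

def flag_dark_text(button_label: str) -> bool:
    """Return True if the label matches a known confirmshaming phrasing."""
    label = button_label.lower()
    return any(re.search(pattern, label) for pattern in CONFIRMSHAMING)

for label in ["No thanks, I don't want savings", "Cancel", "Maybe later"]:
    print(label, "->", flag_dark_text(label))
```

The full system pairs rules like these with UI element detection, so the checker knows a flagged string sits on a decline button rather than in ordinary body text.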

Dr. Don Samitha Elvitigala

RadarFoot: Fine-grain Ground Surface Context Awareness for Smart Shoes

Every day, billions of people use footwear for walking, running, or exercise. Of emerging interest is “smart footwear”, which helps users track gait, count steps or even analyse performance. However, such nascent footwear lacks fine-grain ground surface context awareness, which could allow it to adapt to the conditions and create usable functions and experiences. Hence, this research aims to recognize the walking surface using a radar sensor embedded in a shoe, enabling ground context-awareness. Using data collected from 23 participants in an in-the-wild setting, we developed several classification models. We show that our model can detect five common terrain types with an accuracy of 80.0% and a further ten terrain types with an accuracy of 66.3%, while moving. It can also detect gait motion types such as ‘walking’, ‘stepping up’, ‘stepping down’ and ‘still’, with an accuracy of 90%. Finally, we present potential use cases and insights for future work based on such ground-aware smart shoes.

The ACM Symposium on User Interface Software and Technology, UIST 2023, San Francisco, USA

Don Samitha Elvitigala, Yunfan Wang, Yongquan Hu, and Aaron J Quigley. 2023. RadarFoot: Fine-grain Ground Surface Context Awareness for Smart Shoes. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology (UIST ’23). Association for Computing Machinery, New York, NY, USA, Article 87, 1–13. https://doi.org/10.1145/3586183.3606738
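As a rough sketch of how terrain might be classified from radar returns (illustrative spectral features and a generic classifier on synthetic signals; not the RadarFoot model or its dataset):

```python
# Hedged sketch: terrain classification from per-step radar windows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def radar_features(window: np.ndarray) -> np.ndarray:
    """window: 1-D radar amplitude samples for one step; spectral summary."""
    spectrum = np.abs(np.fft.rfft(window))
    return np.array([spectrum.mean(), spectrum.std(), spectrum.argmax()])

rng = np.random.default_rng(3)
terrains = ["concrete", "grass", "carpet", "gravel", "tile"]
# Synthetic stand-in data: one window per step, labelled with a terrain.
X = np.stack([radar_features(rng.normal(0, 1 + i % 5, 256)) for i in range(100)])
y = [terrains[i % 5] for i in range(100)]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```

The paper’s in-the-wild setting is far harder than this toy: windows must be segmented from continuous gait, and models evaluated across unseen participants.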