
Screen AR | One-Step Setup
I contributed to an ambitious cross-device exploration at Huawei Canada’s HMI (Human–Machine Interaction) Lab, aimed at creating more natural, seamless interactions across devices and shaping product experiences for the next 3–5 years.
Huawei’s HMI Lab brings together global researchers, engineers, and designers to explore the future of human-technology interaction. Focused on innovation and user-centered design, the lab works across disciplines—from sensing and wearable tech to multimodal interfaces—to create impactful experiences and cutting-edge solutions.
*Due to confidentiality agreements, sensitive details have been either omitted or intentionally blurred in this case study. All views expressed here are my own and do not necessarily reflect those of the Huawei Human-Machine Interaction Lab.
Design by accretion
“Tap to Transfer” redefines how people share content at close range—transforming multi-step, time-consuming processes into a seamless, one-step interaction. This ultra-near-field experience brings a new level of fluidity and satisfaction to device-to-device sharing.
This cross-device interaction allows users to share content simply by bringing two smartphones close together—photos, videos, favorite items, or games can be shared with friends in an instant.
Background
Cross-device interaction has long been one of Huawei’s core strengths, enabled by its broad product ecosystem spanning smartphones, tablets, wearables, PCs, TVs, and vehicles.
My role
From January to October 2024, I led the design exploration for device-to-screen interactions, successfully proposing and driving two features—NFC Tapping and Screen AR Entry—into product development. Both features are now in the final stage before launch.
In this project, I worked closely with 2 PhD researchers, 5 engineers, and 2 project managers to turn advanced technical research into practical and intuitive user interactions.
Interactions between devices, and between people and devices, should dynamically adapt to changes in distance.
Edward T. Hall’s proxemics theory suggests that physical proximity between individuals reflects and influences their social relationships—the closer the distance, the greater the anticipated level of interaction.
This principle can also be applied to human-device and device-to-device interactions, where proximity should inform the type and intensity of interaction.
We explored how interaction patterns can adapt based on varying proximity between people and devices, or between devices themselves. Here are the key interaction concepts mapped to distance ranges:
User Scenarios:
Ultra-Close Proximity Interaction
① User-to-Device Proximity – e.g. facial recognition, gestures (touch, tap, swipe), voice commands (whispers), or biometrics (fingerprint, iris scan)
② Device-to-Device Proximity – approaching from different angles and directions
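To make the proximity-adaptive idea concrete, here is a minimal illustrative sketch (not Huawei’s implementation—the distance thresholds and mode names are my own assumptions for illustration) of how sensed distance might select an interaction mode, loosely following Hall’s proxemic zones:

```python
def interaction_mode(distance_cm: float) -> str:
    """Map a sensed device-to-device distance to an interaction mode.

    Thresholds are illustrative assumptions, not product values.
    """
    if distance_cm < 5:
        # Ultra-close: contact-level interactions such as tap-to-transfer
        return "ultra-close: tap-to-transfer / place-on-screen"
    elif distance_cm < 50:
        # Near: direct manipulation—touch, gestures, handoff prompts
        return "near: gesture and touch handoff"
    elif distance_cm < 300:
        # Mid-range: voice commands and continuity suggestions
        return "mid-range: voice and continuity prompts"
    else:
        # Far: devices remain passively aware of each other
        return "far: ambient awareness only"
```

The key design principle this captures is that the interaction becomes richer and more direct as distance shrinks, mirroring how social intimacy increases with physical closeness.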
Related projects
I created these concept videos using Adobe After Effects.
Screen AR
(Filed for patent protection)
As smart accessories like digital pens and earbuds gain more functionality, users often have to navigate multiple menus and apps to manage their settings—a complicated and frustrating experience. To solve this, we applied the concept of tangible interaction: users place an accessory directly onto the screen to access and configure its settings in a single step.
NFC Tapping
(Filed for patent protection)