
KAIST BREAKTHROUGHS

Research Webzine of the KAIST College of Engineering since 2014

Fall 2025 Vol. 25
Engineering

Wearable Haptics for Orthotropic Actuation: 3D Spatial Perception in Low-Visibility Environments

February 19, 2026

Professor Il-Kwon Oh’s research team at KAIST has developed Wearable Haptics for Orthotropic Actuation, a wearable interface that converts 3D spatial information into tactile cues along independent axes. By encoding spatial information as touch, it enables hands-free 3D navigation and teleoperation in vision-compromised environments.


Visual limitations in extreme environments, such as smoke-filled disaster zones and fire sites, pose a significant impediment to search and rescue operations. To address this challenge, Professor Il-Kwon Oh’s research team at KAIST has introduced Wearable Haptics for Orthotropic Actuation (WHOA). The core novelty of the technology lies in leveraging the unique architecture of fabric actuators to convey three-dimensional spatial information through tactile signals, thereby expanding the wearer's perceptible domain.


Haptic interfaces mediate information directly through cutaneous perception. To deliver high-fidelity tactile feedback, the proposed actuator employs thin wires of shape memory alloy (SMA), a smart material that reverts to a pre-trained configuration upon thermal activation. By constructing these wires into lightweight fabric layers and stacking the layers orthogonally, the wearable device induces independent skin deformation along both the horizontal and vertical axes. Furthermore, the integration of an engineered auxetic geometry (a structure that contracts transversely under compression) enables this compact mechanism to generate precise, directional sensations.
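Conceptually, the orthogonal stacking means each axis of skin deformation can be driven as its own Joule-heating channel. The short Python sketch below illustrates that idea only, under stated assumptions: the PwmDriver class, the channel indices, and the duty-cycle cap are hypothetical placeholders for illustration, not the actual WHOA drive electronics or firmware.

# Hypothetical control sketch: treat the two orthogonal SMA stripe layers as
# independent Joule-heating channels. Names and constants are illustrative
# assumptions, not the published WHOA hardware.

class PwmDriver:
    """Stand-in for a two-channel PWM current driver."""
    def set_duty(self, channel: int, duty: float) -> None:
        duty = max(0.0, min(1.0, duty))           # clamp to a valid duty cycle
        print(f"channel {channel}: duty {duty:.2f}")

HORIZONTAL_CH = 0    # stripes that deform the skin horizontally
VERTICAL_CH = 1      # stripes that deform the skin vertically
MAX_SAFE_DUTY = 0.6  # conservative cap to avoid overheating the SMA wires

def actuate(driver: PwmDriver, horizontal: float, vertical: float) -> None:
    """Map normalized per-axis commands (0..1) to heating duty cycles."""
    driver.set_duty(HORIZONTAL_CH, horizontal * MAX_SAFE_DUTY)
    driver.set_duty(VERTICAL_CH, vertical * MAX_SAFE_DUTY)

driver = PwmDriver()
actuate(driver, horizontal=1.0, vertical=0.0)    # purely horizontal cue
actuate(driver, horizontal=0.7, vertical=0.7)    # combined (oblique) cue

Because each layer is addressed independently in this sketch, an oblique sensation is simply the superposition of the horizontal and vertical commands.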


Figure 1 Schematic of WHOA and its actuation behavior. a) WHOA provides directional haptic feedback that enhances control in virtual and real environments, improving spatial perception in 3D space. b) Schematic illustration of the orthotropic SMA fabric when Joule heating is applied to the vertical stripes, the horizontal stripes, or both. c) Multi-segment, multimodal haptic device electrically insulated with a parylene coating.


This dual-axis independence allows the system to encode spatial information with significantly greater complexity than conventional vibrotactile methods. The wearable device can transmit multi-directional cues (lateral, vertical, and longitudinal) and switch to distinct actuation patterns to alert the operator to obstacles or hazardous zones detected by the drone. In visibility-compromised environments, this capability allows operators to intuitively "read" the spatial context through touch while keeping their visual and manual focus on safe control operations (Figure 1).
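A rough way to picture such an encoding is a mapping from a guidance target (and an obstacle flag) to dual-axis cue strengths plus a distinct alert pattern. The Python sketch below is illustrative only; the HapticCue fields, the encode_cue signature, the 10 m intensity fade, and the "steady"/"pulse" labels are assumptions made for this example, not the cue scheme reported in the paper.

# Illustrative encoding sketch; not the published WHOA cue mapping.
from dataclasses import dataclass

@dataclass
class HapticCue:
    horizontal: float  # lateral cue, -1 (left) .. +1 (right)
    vertical: float    # vertical cue, -1 (down) .. +1 (up)
    intensity: float   # overall strength, 0 .. 1 (closer target -> stronger)
    pattern: str       # "steady" for guidance, "pulse" for hazard alerts

def encode_cue(dx: float, dy: float, distance: float, obstacle: bool) -> HapticCue:
    """dx, dy: normalized lateral/vertical offsets of the target; distance in meters."""
    intensity = max(0.0, min(1.0, 1.0 - distance / 10.0))  # fade beyond an assumed 10 m range
    return HapticCue(
        horizontal=max(-1.0, min(1.0, dx)),
        vertical=max(-1.0, min(1.0, dy)),
        intensity=intensity,
        pattern="pulse" if obstacle else "steady",
    )

print(encode_cue(dx=0.4, dy=0.2, distance=3.0, obstacle=False))  # target right and above
print(encode_cue(dx=0.0, dy=0.0, distance=1.0, obstacle=True))   # hazard straight ahead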


Figure 2 Schematic of haptic-feedback-based spatial information transmission and drone teleoperation within a virtual fire environment. a) Mapping between tactile modes and drone motions. b) Images of a rescuer controlling a VR drone in a burning-building scenario, with WHOA worn on the foot to deliver haptic sensations when the view is blocked by fire smoke. c) VR environment in which 3D spatial mapping of the drone's movement is performed. d) VR-based simulation images showing the drone moving forward, left, and up while vision is blocked by fire smoke.


The efficacy of this approach was demonstrated in a simulated fire scenario in virtual reality. Participants equipped with the device piloted a drone through smoke-obscured interiors, evading danger zones to execute simulated rescue protocols. Even when visual feedback on the screen was severely degraded, tactile cues significantly augmented the users' spatial awareness, enabling confident and accurate steering (Figure 2).

Designed for practical deployment in prolonged missions, the wearable prioritizes adaptability and ergonomics. While primarily intended for the arm, the system is engineered to fit highly constrained spaces, such as the interior of a shoe, thereby enabling hands-free guidance while mitigating user fatigue.


Video 1 Demonstration of WHOA-based haptic feedback of 3D spatial information and drone navigation in a virtual simulation


Video 1 demonstrates the system's capacity to deliver tactile spatial information and support effective mission execution under extreme conditions. Beyond firefighting and disaster response, this tactile navigation paradigm holds significant potential for assistive technologies for the visually impaired, as well as for the teleoperation of robots in other hazardous settings. The haptic interface is also applicable to industrial inspection and defense sectors that demand seamless coordination between human and unmanned systems. By serving as a tactile bridge for highly immersive teleoperation, this technology is envisioned to open new horizons in the field of Manned-Unmanned Teaming (MUM-T).


This research was published in Advanced Materials (Vol. 37, Issue 1) in January 2025, with Dr. Saewoong Oh and Mannan Khan as co-first authors under the supervision of Professor Il-Kwon Oh at KAIST. The work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT). The paper, titled "Wearable Haptics for Orthotropic Actuation Based on Perpendicularly Nested Auxetic SMA Knotting," is available at https://advanced.onlinelibrary.wiley.com/doi/full/10.1002/adma.202411353.