Haptic AR Report from New Lucyd Advisor

BEYOND VISUAL AR by Pedro Lopes

 

Augmented Reality (AR) interfaces superimpose information onto the human senses while users explore the real world. One example that demonstrates the potential of such interactive technologies is augmented maps for smartphones or AR glasses, in which users can not only find their way around their surroundings but also benefit from annotated virtual content (e.g., tourism information displayed directly on the building they are looking at, and so forth).

Hence, when we refer to Augmented Reality we typically imagine visually augmented content. However, AR is not limited to our visual apparatus. In fact, only when we design AR headsets that extend far beyond the visual sense will we be able to say “this AR feels just right”. This is what is happening in Virtual Reality (the closest sibling of AR), where the current trend is not only to represent data visually but also to relay it to our sense of touch (also known as haptics). This allows new headsets to engage users in more realistic simulations of the intended virtual world. Several recent examples further confirm the market’s shift towards haptics in VR/AR: Ultrahaptics (a company that uses ultrasound to create tactile sensations in VR/AR), UnlimitedHand (a wearable bracelet that stimulates the user’s muscles while in VR [3]), TeslaSuit (a full-body suit with vibration, thermal stimulation, etc.) and Dexmo (a mechanically actuated glove for VR/AR that provides forces to the user’s fingers [2]).

 

An illustrated example of wearable haptics for headsets

Figure 1 depicts Impacto [1], a VR system that not only depicts virtual content visually but also emulates the sensations that arise when a user collides with a virtual object. In Figure 1 we see a user wearing the Impacto bracelet to experience a sports simulator, e.g., boxing. The unique and novel aspect of this technology is that users actually feel the impact of the punches while boxing in VR.

Figure 1: The Impacto bracelet in action in a sports simulator (boxing).

The technology behind Impacto¹ is designed for miniaturization. Instead of using heavy mechanical actuators such as those used in mechanically actuated gloves and exoskeletons, Impacto uses Electrical Muscle Stimulation (EMS): a technique rooted in medical rehabilitation that uses small electrical impulses to actuate the user’s muscles involuntarily²; the effect of EMS is shown in Figure 2. As such, by combining EMS with a small tactile unit (a solenoid), Impacto is able to generate the sensation of being hit in a very small form factor that harmonizes well with wearable headsets such as those used in AR and VR.

Figure 2: Applying 300ms of EMS via two electrodes attached to the biceps.
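To make the interplay of the two actuators concrete, below is a minimal sketch of how a collision-triggered haptic event could be sequenced in software. This is an illustrative assumption rather than Impacto’s actual firmware: the Solenoid and EmsChannel classes, the pin-level details, and the intensity mapping are hypothetical stand-ins; only the 300 ms EMS duration mirrors Figure 2.

```python
import time

# Hypothetical hardware stubs: a real system would drive the solenoid and a
# medically compliant EMS unit over GPIO or a serial link.
class Solenoid:
    def tap(self, duration_s: float) -> None:
        # Fire the plunger briefly to render the sharp onset of contact.
        print(f"solenoid: tap for {duration_s * 1000:.0f} ms")
        time.sleep(duration_s)

class EmsChannel:
    def pulse(self, intensity: float, duration_s: float) -> None:
        # Contract the muscle involuntarily to render the counterforce.
        print(f"EMS: {intensity:.0%} intensity for {duration_s * 1000:.0f} ms")
        time.sleep(duration_s)

def render_impact(solenoid: Solenoid, ems: EmsChannel, impact_speed: float) -> None:
    """Sequence both haptic components when the engine reports a collision."""
    solenoid.tap(duration_s=0.02)              # crisp "hit" sensation first
    intensity = min(1.0, impact_speed / 5.0)   # assumed mapping: faster punch, stronger pulse
    ems.pulse(intensity, duration_s=0.3)       # 300 ms EMS pulse, as in Figure 2

# Example: the physics engine detects a punch landing at 3 m/s.
render_impact(Solenoid(), EmsChannel(), impact_speed=3.0)
```

In this sketch the solenoid fires first to render the crisp onset of contact, while the longer EMS pulse supplies the counterforce that follows; the split between a small tactile unit and muscle stimulation is what lets the whole assembly stay wearable.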

Why previous haptic technologies won’t work in AR

A body of work in the field of haptics has dealt with generating strong counterforces to simulate contact with virtual objects (so-called force feedback). These haptic technologies revolve around devices that make users hold on to a handle, which is then actuated, for example, by means of (1) a robotic arm (e.g., the Phantom robotic arm), (2) mechanically actuated pulley systems (e.g., the SPIDAR pulley system for AR/VR [4]) or (3) an exoskeleton, a mechanically actuated device that provides forces by pulling the user’s limbs against a mounting point on the user’s body (e.g., FlexTorque, an exoskeleton for the upper arm, targeted at VR/AR [5]). While these solutions excel in many technical aspects (e.g., the precision and accuracy of the motorized actuators), they fail to deliver a haptic device that is smaller than the AR headset itself. In fact, these mechanically actuated solutions often result in bulky devices that require large battery packs for operation, making them infeasible companions for a modern AR headset.

The rising need for wearable haptics in AR

For AR applications to blend into our daily lives they must meet the requirements of all our senses; this is what is known as realism or immersion. This includes our expectations of how virtual objects should feel, which are rooted in daily experience: objects around us have weight, boundaries, textures, and so forth. So it is only natural to expect that AR content would feel the same way, especially as it blends visually into our surrounding context. These advancements in AR haptics will allow users to feel the virtual world they interact with. They will also make AR more practical and usable, because right now all AR manufacturers have users interacting in mid-air without any assistance from their sense of touch, which is remarkably harder than being able to feel where the objects are.

While, as we discussed, achieving realism in AR is essential, one cannot allow the haptics hardware to interfere with the current trend in AR/VR, which is mobility. As such, mechanically actuated devices, such as exoskeletons, do not provide the miniaturization required for wearable haptics. So the search for AR haptics is also the search for haptic technologies that are intrinsically wearable, such as the aforementioned EMS technology.

Footnotes & References

1. The technology for Impacto was developed at the Hasso Plattner Institute, Germany by Pedro Lopes, Alexandra Ion and Patrick Baudisch (see [1] for technical details).

2. EMS devices deliver impulses by means of a pair of electrodes attached to the user’s skin. When the electric current travels from one electrode to the other, it stimulates motor neurons; these, in turn, cause the muscle fibers to contract automatically.

1. Pedro Lopes, Alexandra Ion, and Patrick Baudisch. 2015. Impacto: Simulating Physical Impact by Combining Tactile Stimulation with Electrical Muscle Stimulation. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST ’15). ACM, New York, NY, USA, 11-19. DOI: https://doi.org/10.1145/2807442.2807443

2. Xiaochi Gu, Yifei Zhang, Weize Sun, Yuanzhe Bian, Dao Zhou, and Per Ola Kristensson. 2016. Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). ACM, New York, NY, USA, 1991-1995. DOI: https://doi.org/10.1145/2858036.2858487

3. Emi Tamaki, Terence Chan, and Ken Iwasaki. 2016. UnlimitedHand: Input and Output Hand Gestures with Less Calibration Time. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16 Adjunct). ACM, New York, NY, USA, 163-165. DOI: https://doi.org/10.1145/2984751.2985743

4. Jun Murayama, Laroussi Bouguila, YanLin Luo, Katsuhito Akahane, Shoichi Hasegawa, Béat Hirsbrunner, and Makoto Sato. 2004. SPIDAR G&G: a two-handed haptic interface for bimanual VR interaction. In Proceedings of EuroHaptics, vol. 2004, 138-146.

5. Dzmitry Tsetserukou. 2011. FlexTorque, FlexTensor, and HapticEye: exoskeleton haptic interfaces for augmented interaction. In Proceedings of the 2nd Augmented Human International Conference (AH ’11). ACM, New York, NY, USA, Article 33, 2 pages. DOI: http://dx.doi.org/10.1145/1959826.1959859

Author

Pedro Lopes is a Human-Computer Interaction researcher at the Hasso Plattner Institute, Germany, and serves as science advisor to Lucyd.

 
