Augmented Reality & Spatial Interaction


Introduction

Fusing robotics’ physical reality with the digital, virtual world

Before Augmented Reality, robotics knew only two separate worlds: the physical reality of the factory and shop floor on the one hand, and desktop-PC-based (offline) simulation, typically used in office environments, on the other. Augmented Reality (AR) is a game changer that fuses both worlds, bringing spatial visualization and simulation to the shop floor. The technology has matured in recent years with the advent of novel display technologies, tracking algorithms and software frameworks for mobile devices.

In robotics, the applications of AR range from cell layout planning, process simulation and trajectory visualization to interactive spatial interfaces that make robot programming easier and more efficient. These interfaces have the potential to support robot usage and programming for everyone, without requiring knowledge of programming languages or extensive robotics skills.

We provide tailored software development for your AR interfaces in robot-based production, human-robot collaboration and the control of service robots. We also develop interactive AR-based trade-fair exhibits.

Real objects can be tracked using markers to extract their positions in real space. Augmented objects can be enhanced with information or altered completely. AR is perceivable via display devices such as smartphones and tablets, or via wearables such as dedicated AR glasses.
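The pose extraction behind marker tracking boils down to chaining coordinate transforms: the tracker reports the marker pose in the camera frame, and composing it with the known camera pose yields the object's position in real space. A minimal Python sketch (all names and numeric values are illustrative, not part of our products):

```python
# Illustrative sketch: locating a marker-tracked object in world
# coordinates by composing homogeneous 4x4 transforms.

def matmul4(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous 4x4 matrix for a pure translation."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Pose of the camera in the world frame (assumed known from tracking).
world_T_camera = translation(0.0, 0.0, 1.5)
# Pose of the marker as detected in the camera frame.
camera_T_marker = translation(0.2, -0.1, 0.5)

# Chaining both transforms yields the marker pose in world coordinates.
world_T_marker = matmul4(world_T_camera, camera_T_marker)
position = [world_T_marker[i][3] for i in range(3)]
print(position)  # [0.2, -0.1, 2.0]
```

In practice the transforms also carry rotations, but the composition rule stays the same.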

At GESTALT, internationally renowned experts and pioneers work on various applications of AR-based user interfaces for robotics. We hold numerous publications and intellectual property dealing with AR and spatial interaction. Feel free to contact us and discuss your use case. Why not send us an email at info@gestalt-robotics.com or give us a call at +49 30 616 515 60?


Tech Outline

Augmented Reality and Spatial Interaction

Unlike Virtual Reality (VR), the objective of AR is not a perfect, complete replica of real environments, but a meaningful addition to reality that supports the user with locally placed information, e.g. virtual objects. Conversely, real objects can also be hidden by replacing them with virtual ones.

Principle of a video-see-through Augmented Reality application for industrial robotics

Hence, AR connects real environments and digital information. However, AR applications do not just allow the display of virtual objects in reality, but also form the basis for a free spatial interaction with real and virtual objects.
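At its core, the video-see-through principle means projecting virtual 3D content into the live camera image so that it appears anchored in the real scene. A minimal pinhole-camera sketch in Python (the intrinsics `fx`, `fy`, `cx`, `cy` are illustrative values, not those of any specific device):

```python
# Illustrative sketch: pinhole projection of a virtual 3D point
# (given in the camera frame, in metres) to pixel coordinates.

def project(point_cam, fx, fy, cx, cy):
    """Project a 3D camera-frame point onto the image plane."""
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# A virtual object 10 cm to the right, 1 m in front of the camera,
# rendered with assumed intrinsics of the device camera.
u, v = project((0.1, 0.0, 1.0), fx=800.0, fy=800.0, cx=640.0, cy=360.0)
print(u, v)  # 720.0 360.0
```

Rendering the virtual object at this pixel position, on top of the camera frame, produces the overlay that the user perceives as augmentation.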

Virtual robot and same trajectory in VR (left) and AR (right)

For spatial interaction, neither conventional interaction paradigms nor classic input devices such as mouse and keyboard are suitable for manipulation in six degrees of freedom. We therefore use gesture-based interaction for the manipulation of real and virtual objects: users manipulate objects via gestures carried out freely and bare-handed in space, which is the most natural way to interact. To detect gestures within AR interfaces, we rely on different technologies, e.g. inertial sensors, data gloves and various camera and tracking principles.

Spatial translation of a pose via a gripping gesture and visual feedback: near target pose (1), hand within the interaction radius (2), starting the translational movement (3), translation of the pose (4), release of the pose (5) and hand removal (6)
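The gripping interaction depicted in steps (1) to (6) can be thought of as a small state machine: attach the pose to the hand when the gesture closes inside the interaction radius, translate it while gripped, detach on release. A simplified Python sketch (class name and values are illustrative, not our actual implementation):

```python
class GripInteraction:
    """Illustrative state machine for translating a pose via a gripping gesture."""

    def __init__(self, pose, radius=0.5):
        self.pose = list(pose)   # current position of the target pose
        self.radius = radius     # interaction radius around the pose
        self.offset = None       # hand-to-pose offset while gripped, else None

    def update(self, hand_pos, gripping):
        dist = sum((h - p) ** 2 for h, p in zip(hand_pos, self.pose)) ** 0.5
        if gripping and self.offset is None and dist <= self.radius:
            # Gesture closed inside the radius: attach the pose to the hand.
            self.offset = [p - h for p, h in zip(self.pose, hand_pos)]
        elif gripping and self.offset is not None:
            # Hand moves while gripped: translate the pose along with it.
            self.pose = [h + o for h, o in zip(hand_pos, self.offset)]
        elif not gripping:
            # Release gesture: detach; the pose stays at its new position.
            self.offset = None
        return self.pose

g = GripInteraction(pose=(0.0, 0.0, 0.0))
g.update((1.0, 0.0, 0.0), gripping=False)   # (1) hand still far from the pose
g.update((0.25, 0.0, 0.0), gripping=True)   # (2)-(3) grip inside the radius
g.update((1.25, 0.0, 0.0), gripping=True)   # (4) translation of the pose
g.update((1.25, 0.0, 0.0), gripping=False)  # (5)-(6) release and removal
print(g.pose)  # [1.0, 0.0, 0.0]
```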

Applications & Use Cases

Spatial programming of Industrial Robots

Against the background of increasing demands on manufacturing companies in global competition, industrial robotics plays a key role in keeping production processes in western industrialized countries flexible and cost-efficient. For small and medium-sized enterprises, manual robot programming often becomes a bottleneck, which motivates making the programming process easier and more time-efficient.

AR Android industrial robot simulation for the shop-floor

This approach introduces an interaction concept for industrial robot programming that takes natural communication into account through marker-less gesture recognition. The programmer defines the robot program by gestures, following the principle of “programming by demonstration”: poses and trajectories are drawn into space through natural, three-dimensional bare-hand interaction.

To individually support user groups with varying levels of qualification, both a task-oriented and a motion-oriented programming level are provided.

Sequence for the gestural definition of poses: (1) positioning the hand in the workspace of the robot, (2) definition of the position, (3) feedback on the potential working direction of the Tool Center Point (TCP), (4) definition of the working direction of the TCP, (5) feedback on the potential orientation of the second coordinate axis of the TCP, (6) definition of the second coordinate axis and (7) approach of the pose by the virtual robot.
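Steps (2), (4) and (6) of this sequence fully determine a TCP pose: a position, a working direction and a second coordinate axis; the third axis follows from the right-hand rule. A minimal Python sketch of this construction (function names are illustrative):

```python
# Illustrative sketch: building an orthonormal TCP frame from the two
# gestured directions (working direction and second coordinate axis).

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return [c / n for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def tcp_frame(approach, second):
    """Construct a right-handed TCP frame from two gestured directions."""
    z = normalize(approach)                      # working direction (step 4)
    d = sum(s * zc for s, zc in zip(second, z))  # component of second along z
    x = normalize([s - d * zc for s, zc in zip(second, z)])  # second axis (step 6)
    y = cross(z, x)                              # completes the frame
    return x, y, z

# Working direction straight down, second axis pointing forward.
x, y, z = tcp_frame(approach=[0.0, 0.0, -1.0], second=[1.0, 0.0, 0.0])
print(x, y, z)
```

Projecting the second gestured axis orthogonally to the working direction makes the frame robust against imprecise hand input.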

The gestural definition of the robot program is combined with an Augmented Reality simulation on handheld devices or AR glasses. From this synergetic combination of gestures and Augmented Reality arises a novel kind of intuitive interaction that allows the robot program to be manipulated interactively in space. The AR device becomes the central programming environment, equipped with a unified interface for transmitting programs to arbitrary robot controllers and executing them there. Wireless communication with enterprise information systems is included as well.
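One way to realize such a unified, controller-agnostic interface is a neutral serialization of the motion commands (PTP, LIN, CIRC) that a controller-specific backend then translates into its native language. The JSON layout below is purely illustrative, not the actual format our system uses:

```python
import json

def encode_program(commands):
    """Serialize a list of motion commands to a controller-neutral JSON string."""
    return json.dumps({"version": 1, "commands": commands})

# Hypothetical program: point-to-point, linear and circular motions,
# with targets given as TCP positions in metres.
program = [
    {"type": "PTP", "target": [0.4, 0.0, 0.3]},
    {"type": "LIN", "target": [0.4, 0.2, 0.3], "speed": 0.25},
    {"type": "CIRC", "via": [0.5, 0.3, 0.3], "target": [0.6, 0.2, 0.3]},
]

payload = encode_program(program)
decoded = json.loads(payload)
print(decoded["commands"][1]["type"])  # LIN
```

A backend per controller vendor then maps each neutral command onto the corresponding native instruction.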

Motion sequence for AR-visualized motion trajectory, ready for transmission and execution on real robot controller: PTP movement (yellow), linear motion (orange) and circular motion (purple)

In an academic user study, we evaluated programming duration, programming errors and subjective assessment, and compared the results to Teach-In and Offline Programming. The analysis shows a significant reduction of programming duration as well as fewer programming errors compared to Teach-In. Furthermore, most participants favored the spatial programming system.

Results of academic user study: spatial interaction beats other programming methods in terms of duration

Conclusion

Augmented Reality: an emerging technology with great potential to shape future work scenarios

Augmented Reality is just beginning to enter the mass markets of (industrial) robotics applications. We are following this road, shaping the work and interaction scenarios of the future. GESTALT is an innovator and early adopter in this technology field, currently evaluating and building prototypes for close human-robot collaboration as well as for the control and integration of mobile service robots.

We at GESTALT believe that AR-based user interfaces are an important cornerstone of future control and programming scenarios in robotics, giving non-experts the opportunity to control complex machines and processes while boosting efficiency and productivity. Get in touch with us to discuss your use case and how it can benefit from AR support and interactive spatial user interfaces. You can contact us at info@gestalt-robotics.com or give us a call at +49 30 616 515 60 – we would love to hear from you.