Multi-modal Approaches for Post-Editing Machine Translation
Current advances in machine translation increase the need for translators to switch from traditional translation to post-editing (PE) of machine-translated text, a process that saves time and improves quality. This shift affects the design of translation interfaces, as the task changes from mainly generating text to correcting errors within otherwise helpful translation proposals. The results of our elicitation study with professional translators indicate that a combination of pen, touch, and speech could well support common PE tasks and received high subjective ratings from our participants. We therefore argue that future translation environment research should focus more strongly on these modalities in addition to mouse- and keyboard-based approaches. Eye tracking and gesture modalities, in contrast, seem less important. An additional interview on interface design revealed that most translators would also see value in automatically receiving additional resources when a high cognitive load is detected during PE.
Germ Destroyer - A Gamified System to Increase the Hand Washing Duration in Shared Bathrooms
Washing hands is important for public health, as it prevents the spread of germs to other people. One of the most important factors in cleaning hands is the hand washing duration. However, most people do not wash their hands long enough, leading to infections and diseases for themselves and others. To counter this, we present “Germ Destroyer”, a system consisting of a sensing device that can be mounted on the water tap and a mobile application providing gameful feedback to encourage users to meet the recommended duration. In the mobile application, users kill germs and collect points by washing their hands. Through a laboratory study (N=14) and a 10-day in-the-wild study (363 hand washing sessions), we found that Germ Destroyer enhances the enjoyment of hand washing, reduces the perceived hand washing duration, almost doubles the actual hand washing duration, and has the potential to reduce the risk of infection.
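The gameful feedback loop described above can be illustrated with a minimal sketch. This is not the authors' implementation; the constants and function names are assumptions, and the 20-second goal reflects the commonly recommended minimum washing duration.

```python
# Illustrative sketch of duration-based gameful feedback (hypothetical;
# not the actual Germ Destroyer scoring logic).

RECOMMENDED_SECONDS = 20   # commonly recommended minimum duration (assumed goal)
GERMS_PER_SESSION = 100    # illustrative number of germs shown per session

def germs_killed(duration_seconds: float) -> int:
    """Kill germs in proportion to washing time, capped at the goal."""
    progress = min(duration_seconds / RECOMMENDED_SECONDS, 1.0)
    return round(GERMS_PER_SESSION * progress)
```

A user who washes for 10 seconds would thus see half the germs destroyed, while anyone meeting or exceeding the recommended duration clears them all.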
Overgrown: Supporting Plant Growth with an Endoskeleton for Ambient Notifications
Ambient notifications are an essential element to support users in their daily activities. Designing effective and aesthetic notifications that balance alert level with an unobtrusive dialog requires them to be seamlessly integrated into the user’s environment. In an attempt to employ the living environment around us, we designed Overgrown, an actuated robotic structure capable of supporting a plant that grows over it. As a plant endoskeleton, Overgrown aims to engage human empathy towards living creatures to increase the effectiveness of ambient notifications while ensuring better integration with the environment. In a focus group, Overgrown was perceived as having a personality, showed potential as a user’s ambient avatar, and was deemed well suited for social experiments.
Drag:on - A Virtual Reality Controller Based on Drag and Weight Shift
Standard controllers for virtual reality (VR) lack sophisticated means to convey a realistic, kinesthetic impression of size, resistance, or inertia. We present the concept and implementation of Drag:on, an ungrounded shape-changing VR controller that provides dynamic passive haptic feedback based on drag, i.e., air resistance, and weight shift.
Immersive Process Models
In many domains, real-world processes are traditionally communicated to users through abstract graph-based models such as event-driven process chains (EPCs), i.e., 2D representations on paper or desktop monitors. We propose an alternative interface for exploring EPCs, called immersive process models, which aims to transform the exploration of EPCs into a multisensory virtual reality journey.
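An EPC is essentially a directed graph whose nodes alternate between events (states) and functions (activities), optionally joined by logical connectors. The following minimal sketch shows one way such a model might be represented; the node names and the exact structure are illustrative assumptions, not part of the proposed system.

```python
# Hypothetical minimal representation of an event-driven process chain (EPC)
# as a directed graph of typed nodes (illustrative only).

from dataclasses import dataclass, field

@dataclass
class EPCNode:
    name: str
    kind: str  # "event", "function", or a connector such as "XOR"
    successors: list = field(default_factory=list)

def connect(a: EPCNode, b: EPCNode) -> None:
    """Add a directed control-flow edge from node a to node b."""
    a.successors.append(b)

# A tiny order-handling chain: event -> function -> event
order_received = EPCNode("Order received", "event")
check_stock = EPCNode("Check stock", "function")
stock_checked = EPCNode("Stock checked", "event")
connect(order_received, check_stock)
connect(check_stock, stock_checked)
```

A VR interface like the one proposed could then traverse such a graph and render each node as a station along the user's journey.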
Detection Thresholds for Hand Redirection in Virtual Reality
We present the results of an experiment on interaction in virtual reality (VR). The experiment investigated an interaction technique known as hand redirection, in which the user's virtual hand is displayed slightly offset from the real hand position, thereby "redirecting" the user's movement in real space. From the results, we derived how much redirection can go unnoticed by the user, even in "worst case" scenarios. This is particularly important for the development of VR applications that aim to redirect users undetectably, e.g., for haptic retargeting.
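The offset technique described above can be sketched in a few lines. This is a simple translational warp that blends the offset in as the hand moves toward a target, a common approach in haptic retargeting; it is an assumption for illustration, not the exact warp used in the experiment.

```python
# Minimal sketch of translational hand redirection (illustrative; not the
# exact warping function from the study). The virtual hand is shown offset
# from the real hand, with the offset growing as the hand nears the target
# so the redirection stays below the detection threshold at every step.

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def redirect_hand(real_pos, start_pos, target_pos, max_offset):
    """Return the virtual hand position for a given real hand position.

    The offset (a 3-vector) is blended in linearly with progress from
    start_pos toward target_pos: zero at the start, max_offset on arrival.
    """
    total = distance(start_pos, target_pos)
    traveled = distance(start_pos, real_pos)
    progress = min(traveled / total, 1.0) if total > 0 else 1.0
    return tuple(r + progress * o for r, o in zip(real_pos, max_offset))
```

Detection thresholds such as those derived in the experiment would then bound how large `max_offset` may be (relative to the reach distance) before users notice the mismatch.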