Open Thesis Topics

The following open topics are currently available:

Topic

In recent years, an innovative technique known as hand redirection has revolutionized interaction in virtual reality (VR).
The technique leverages offsets between the user's real and virtual hand to take control of their hand movement trajectory, for example by redirecting the real hand to a location that differs from the one in which the user sees their virtual hand. This can be used, for instance, to interact with objects out of the user's reach. Previous research has shown that such techniques can go unnoticed by users as long as the offset stays below the so-called perceptual detection threshold.
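
To make the concept more concrete, below is a minimal sketch of how a body-warping style redirection offset could be computed. It is written in Python purely for illustration (a Unity3D implementation would look different), and the function name, the linear interpolation scheme, and the example coordinates are assumptions rather than the actual implementation used at the chair.

    # Minimal illustrative sketch of body-warping hand redirection.
    # The virtual hand is rendered at the real hand position plus an offset
    # that grows with the user's progress towards the physical target, so the
    # virtual hand reaches the virtual target exactly when the real hand
    # reaches the physical one.
    import numpy as np

    def redirected_hand_position(real_hand, start, real_target, virtual_target):
        """Return the position at which to render the virtual hand."""
        path = real_target - start
        total_sq = float(np.dot(path, path))
        if total_sq == 0.0:
            return real_hand  # degenerate case: nothing to redirect towards

        # Progress of the real hand along the path from start to the real
        # target, clamped to [0, 1].
        progress = float(np.clip(np.dot(real_hand - start, path) / total_sq, 0.0, 1.0))

        # The full offset between virtual and physical target is injected
        # proportionally to that progress.
        return real_hand + progress * (virtual_target - real_target)

    # Example: the virtual target sits 10 cm to the right of the physical one.
    start = np.array([0.0, 1.0, 0.3])
    real_target = np.array([0.0, 1.0, 0.8])
    virtual_target = real_target + np.array([0.1, 0.0, 0.0])
    halfway = np.array([0.0, 1.0, 0.55])
    print(redirected_hand_position(halfway, start, real_target, virtual_target))
    # -> approximately [0.05, 1.0, 0.55]: half of the offset has been applied.

The same interpolation idea could, in principle, be driven by a foot or ankle tracker instead of the hand, which is the direction this thesis explores.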

While existing investigations have focused only on redirecting the user's hand, this thesis aims to apply the concept to the user's legs and feet. Specifically, in the context of this thesis, you will:

  • identify relevant degrees of freedom in human leg and foot movement that are suitable for redirection
  • implement a prototype that applies redirection to legs and feet
  • study the perception of leg and foot redirection in a detection threshold experiment (a sketch of one possible estimation procedure follows this list)
  • implement 2-3 sample scenes that showcase this novel type of redirection
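
For the detection threshold experiment mentioned above, one common psychophysical approach is an adaptive staircase procedure. The following is only a rough sketch of a simple 1-up/1-down staircase with hypothetical parameter values; the actual experimental design (e.g., two-alternative forced-choice trials or psychometric function fitting) is part of the thesis work.

    # Rough sketch of a 1-up/1-down adaptive staircase for estimating a
    # detection threshold. All parameter values are placeholders.
    def run_staircase(get_response, start_offset=0.20, step=0.02,
                      min_offset=0.0, max_offset=0.40,
                      max_reversals=8, max_trials=100):
        """get_response(offset) -> True if the participant detected the offset."""
        offset = start_offset
        last_direction = None      # +1 = offset was increased, -1 = decreased
        reversals = []

        for _ in range(max_trials):
            if len(reversals) >= max_reversals:
                break
            detected = get_response(offset)
            direction = -1 if detected else +1   # decrease after a detection

            # A reversal occurs whenever the direction of adjustment flips.
            if last_direction is not None and direction != last_direction:
                reversals.append(offset)
            last_direction = direction

            offset = min(max(offset + direction * step, min_offset), max_offset)

        # Conventional estimate: mean offset over the last few reversals.
        tail = reversals[-6:] or [offset]
        return sum(tail) / len(tail)

In a study, get_response would present a redirected movement with the given offset (e.g., a redirected foot placement) and record whether the participant reports noticing a manipulation.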

References

The following papers can serve as references to learn more about the topic (see also the videos on the linked websites and on my profile):

Prerequisites

  • Read the provided papers above (might require VPN to access)
  • Be familiar with, or get familiar with, the concept of hand redirection (see also further publications on my profile website)
  • Background or interest in Virtual Reality
  • Ideally: experience with Unity3D
  • Completed HCI lecture and ideally already attended at least one seminar at our chair

How to apply

Please send me an email with the following pieces of information (if you do not answer every point, your application will not be considered):

  • When you plan to start the thesis
  • When you plan to finish the thesis
  • A short motivational statement (max. 0.5 pages) explaining why this topic is interesting to you
  • Your transcript of records and your CV

Advisors


See personal profile of Dr. André Zenner

Topic

This work focuses on human-robot collaboration, more specifically on how a robotic arm and a human can work together at an assembly cell such that the robot proactively supports the worker in assembling a workpiece. A prototypical setup including the robot and the components of the workpiece is already available, as well as a mixed-reality duplicate of the setup, which can be used to conduct Wizard-of-Oz (WoZ) style user studies using AR glasses.

Building on the existing work, your task is to conceptualize, plan, conduct, and assess a user study to determine the most appropriate work dynamic (the optimal division of tasks) between the robot and the worker. This also includes determining the most suitable modalities for human-robot communication during the process. This thesis is a collaboration between ZeMA (Zentrum für Mechatronik und Automatisierungstechnik gGmbH) and DFKI. The practical work will be done at the Power4Production Hall at Eschbergerweg 46, Saarbrücken.

Focus

The focus is on the user study itself. You will need to create a storyline, identify the relevant questions and the data that must be recorded, recruit participants, plan the execution of the study, and analyze the results.

Prerequisites

  • Background in planning and conducting user studies (e.g. from the HCI lecture)
  • Interest in Mixed Reality (e.g. Meta Quest AR headsets)
  • Interest in Robotics
  • Enrolled in a Bachelor's programme in computer science, media informatics, or a related field

How to apply

Please send us an email with the following pieces of information (if you do not answer every point, your application will not be considered):

  • When you plan to start the thesis
  • When you plan to finish the thesis
  • A short motivational statement explaining why this topic is interesting to you
  • A summary of why you would be a good fit for this topic
  • Your transcript of records and CV

See personal profile of Dr. Tim Schwartz