Automotive User Interfaces Seminar

General Information

An HMI (Human-Machine Interface) is the part of a system responsible for the communication between the human and the machine. Through the HMI, the human controls the system (by means of input devices) to complete a given task and receives all the information relevant and necessary for that task (through output devices). In the automotive context, the HMI supports the driver not only in the primary task (driving), but also in secondary and tertiary tasks (such as using the infotainment system).

In this seminar, we will focus on the particular challenges of user-centered design of Human-Machine Interfaces in the automotive domain for drivers and passengers, taking into account recent progress in the sensor and presentation technologies available in a vehicle. You will read about these technologies and develop an in-car information or assistance system by applying one or more of the following Artificial Intelligence (AI) techniques:

- Multimodal Interaction Design
- Dialogue Systems
- User Adaptation
- Machine Learning

As a participant, you will give a 30-minute presentation on a paper related to these techniques and work in a team on one of the practical projects listed below.

Announcements

Kickoff - tbd

Requirements

The course is intended for students in Computer Science and Media Informatics who like the challenges that come with working on practical projects using frameworks and devices they have never used before. Very good programming skills and proactivity are essential.

Projects

  1. Transfer of Control with Distracted Drivers:
    Autonomous vehicles are not perfect. There are situations in which the car will not be able to drive autonomously and will need to hand control back to the driver. However, in many cases the driver will be distracted and not ready to assume control. In this project, you will work on the transfer of control from an autonomous vehicle to the driver and vice versa. This will involve some programming with a dialogue platform, use of a driving simulator, and machine learning. Part of the challenge will be to detect distracted drivers either with an eye tracker or with an RGB camera and computer vision (a simple sketch of such a distraction check is shown after the project list).
  2. In-Car Referencing and Control with Multimodal Fusion:
    In this project, you will work on hands-free control of car features (e.g. infotainment, wipers, doors) using a combination of eye gaze and speech. You will also analyze users' behaviour and detect users' routines during first-time use of a new car. The project requires user-centered design knowledge as well as adequate programming skills (preferably in Java); a simple gaze-and-speech fusion sketch is shown after the project list.
  3. Multimodal Interactive Car Windows:
    The technologies in future cars are changing rapidly. One recent development is augmented-reality side windows, through which passengers interact with the side windows themselves. Possible interactions include providing information about the environment outside the car, interactive applications such as a game or a weather forecast, and many other use cases. In this project, you will work with the Microsoft HoloLens to simulate holographic side windows of a car. Completing this project requires some prior knowledge of Augmented Reality and programming skills in Unity3D (C#) for the HoloLens. With the HoloLens SDK, you will be able to use speech or gesture recognition in your application.
  4. tbd.
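
For Project 1, the following is a minimal, illustrative sketch of one possible distraction heuristic in Java, assuming the eye tracker (or a computer-vision model) already labels each gaze sample as on-road or off-road; the class names and the two-second threshold are placeholder assumptions, not part of the project specification.

```java
// Minimal sketch: flag the driver as distracted once the gaze has been off the
// road for longer than a fixed threshold. All values here are placeholders.
public class DistractionDetector {

    /** One gaze sample: timestamp in milliseconds and whether the gaze is on the road. */
    public record GazeSample(long timestampMs, boolean onRoad) {}

    private static final long OFF_ROAD_LIMIT_MS = 2_000; // eyes-off-road threshold (placeholder)

    private long offRoadSinceMs = -1; // -1 means the gaze is currently on the road

    /** Feeds one sample and returns true if the driver currently counts as distracted. */
    public boolean update(GazeSample sample) {
        if (sample.onRoad()) {
            offRoadSinceMs = -1;                   // gaze returned to the road, reset the timer
            return false;
        }
        if (offRoadSinceMs < 0) {
            offRoadSinceMs = sample.timestampMs(); // gaze just left the road, start the timer
        }
        return sample.timestampMs() - offRoadSinceMs >= OFF_ROAD_LIMIT_MS;
    }
}
```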
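
For Project 2, the sketch below illustrates one simple late-fusion strategy in Java (the language the project suggests): an underspecified spoken command such as "open that" is paired with the most recently gazed-at car feature. It assumes gaze targets and recognized speech commands arrive as already-classified events; all names and the fusion window are illustrative assumptions, not a prescribed design.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Optional;

// Minimal sketch: keep a short history of gaze targets and resolve spoken
// commands against the most recent target inside a small time window.
public class GazeSpeechFusion {

    /** A gazed-at car feature, e.g. "LEFT_WINDOW" or "RADIO", with its timestamp. */
    public record GazeTarget(String feature, long timestampMs) {}

    private static final long FUSION_WINDOW_MS = 1_500; // how far back a gaze may lie (placeholder)

    private final Deque<GazeTarget> recentGazes = new ArrayDeque<>();

    /** Records a new gaze target reported by the eye tracker. */
    public void onGaze(GazeTarget target) {
        recentGazes.addFirst(target);
        if (recentGazes.size() > 50) {
            recentGazes.removeLast(); // keep the history small
        }
    }

    /**
     * Resolves a spoken command such as "open that" by pairing it with the most
     * recent gaze target inside the fusion window, if any.
     */
    public Optional<String> onSpeech(String command, long timestampMs) {
        return recentGazes.stream()
                .filter(g -> timestampMs - g.timestampMs() <= FUSION_WINDOW_MS)
                .findFirst()
                .map(g -> command + " -> " + g.feature());
    }
}
```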

Registration

To register for the seminar, please use the university's seminar assignment system.