I participated in the 2011 Grand Cooperative Driving Challenge (GCDC) as part of the Automotive Technology Team (ATeam) from the Eindhoven University of Technology. For this challenge, organized by TNO, teams from all over the world were invited to develop vehicles capable of cooperative driving behaviour on a closed section of highway, according to a provided communication protocol; the implementation and hardware could differ between teams. For the 2011 challenge, vehicles needed to demonstrate Cooperative Adaptive Cruise Control (CACC): driving in a platoon with the system controlling longitudinal behaviour based on sensor input and communication with other participants and with roadside infrastructure.
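A CACC controller of the kind described here is commonly realized as a constant time-gap spacing policy, extended with a feedforward term for the predecessor's acceleration received over the wireless link. The sketch below is illustrative only: the gains, time gap and controller structure are assumptions for the example, not the ATeam's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class CaccInput:
    ego_speed: float    # m/s, own vehicle speed
    gap: float          # m, measured distance to predecessor (e.g. radar)
    pred_speed: float   # m/s, predecessor speed (radar or V2V)
    pred_accel: float   # m/s^2, predecessor acceleration (via V2V)

def cacc_accel(s: CaccInput,
               h: float = 0.7,           # time gap [s] (illustrative)
               standstill: float = 5.0,  # standstill distance [m]
               kp: float = 0.2,          # spacing-error gain
               kv: float = 0.7,          # speed-error gain
               ka: float = 0.5) -> float:  # feedforward gain
    """Desired ego acceleration: feedback on spacing and speed errors,
    plus feedforward of the predecessor's V2V-communicated acceleration."""
    desired_gap = standstill + h * s.ego_speed
    spacing_err = s.gap - desired_gap
    speed_err = s.pred_speed - s.ego_speed
    return kp * spacing_err + kv * speed_err + ka * s.pred_accel
```

At equilibrium (gap equal to the desired gap, speeds matched, predecessor not accelerating) the commanded acceleration is zero; if the gap shrinks below the desired value, the controller brakes. The feedforward term is what distinguishes CACC from plain ACC: it lets the platoon react to the lead vehicle before the gap error becomes visible to the radar.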
Our team used a DAF XF105 truck equipped with radar, Xsens motion sensors and wireless communication hardware. I was responsible for the Human Machine Interface (HMI).

The HMI is an area that will be crucial in real-life (semi-)autonomous cooperative driving scenarios: automated cooperative driving can only be implemented if users accept it as a safer and more comfortable way of travelling. As soon as systems start to act autonomously, reliability and trust become issues. It is therefore important to provide users with correct and useful information, so that they can construct an accurate mental model of the system. This mental model is needed for users to build confidence in the system and to learn how it will react in particular conditions.

The implemented HMI uses a touch screen, extended with buttons on the steering wheel, to provide the driver with visual and auditory information regarding system state, headway, speed and notifications (e.g. what is expected from the driver). A graph shows how the system reacts to particular situations, helping the user form a correct mental model of the system.
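An HMI of this kind typically boils down to a mapping from system states to the message shown on screen and whether an auditory cue is played. The sketch below is a hypothetical illustration of that pattern: the state names, messages and the ACC-fallback behaviour are my assumptions, not the content of the actual GCDC HMI.

```python
from enum import Enum, auto

class SystemState(Enum):
    MANUAL = auto()            # driver in full control
    CACC_ACTIVE = auto()       # platooning, longitudinal control automated
    DEGRADED = auto()          # assumed fallback, e.g. V2V link lost
    TAKEOVER_REQUEST = auto()  # driver must resume control

# Map each state to (visual message, play auditory cue?).
# Safety-critical transitions get an auditory cue in addition to the
# on-screen notification, so the driver does not need to watch the display.
NOTIFICATIONS = {
    SystemState.MANUAL: ("Manual driving", False),
    SystemState.CACC_ACTIVE: ("CACC active: following platoon", False),
    SystemState.DEGRADED: ("Communication lost: fallback mode", True),
    SystemState.TAKEOVER_REQUEST: ("Take over control now", True),
}

def notify(state: SystemState) -> tuple[str, bool]:
    """Return the message and auditory-cue flag for a system state."""
    return NOTIFICATIONS[state]
```

Keeping this mapping explicit and total (every state has a defined notification) supports the mental-model argument above: the driver always knows what the system is doing and what is expected of them.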