Human-Machine Interaction for Automated Vehicles: Driver Status Monitoring and the Takeover Process explains how to design an intelligent human-machine interface by characterizing driver behaviour before and during the takeover process. Multiple solutions are presented to accommodate different sensing technologies, driving environments and driving styles. Depending on the availability and location of the camera, driving and non-driving tasks can be recognized from eye gaze, head movement, hand gesture, or a combination of these. Technical solutions for recognizing drivers' various behaviours in adaptive automated driving are described, along with their implications for driving quality.
Finally, cutting-edge insights for improving human-machine interface design for safety and driving efficiency are provided, based on using this sensing capability to measure drivers' cognitive capacity.
Please Note: This is an On Demand product; delivery may take up to 11 working days after payment has been received.
Table of Contents
1. Introduction
2. Driver Behaviour Recognition Based on Eye-gaze
3. Driver Behaviour Recognition Based on Hand-gesture
4. Driver Behaviour Recognition Based on Head Movement
5. Driver Behaviour Recognition Based on the Fusion of Head Movement and Hand Movement
6. Real-time Driver Behaviour Recognition
7. The Implication of Non-driving Tasks on the Take-over Process
8. Driver Workload Estimation
9. Neuromuscular Dynamics Characterization for Human-Machine Interface
10. Driver Steering Intention Prediction using Neuromuscular Dynamics
11. Intelligent Haptic Interface Design for Human-Machine Interaction in Automated Vehicles
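The fusion of head movement and hand movement covered in chapter 5 can be sketched as a simple late fusion of per-modality confidence scores. The class labels, weights, and scores below are illustrative assumptions, not taken from the book:

```python
# Hypothetical late-fusion sketch: combine head-movement and hand-gesture
# classifier confidences over a shared set of driver behaviour classes.
# Labels and weights are illustrative assumptions.

BEHAVIOURS = ["driving", "phoning", "texting", "eating"]

def fuse(head_scores, hand_scores, head_weight=0.4, hand_weight=0.6):
    """Weighted average of per-class confidences from two modalities,
    renormalized so the fused scores form a probability distribution."""
    fused = [head_weight * h + hand_weight * g
             for h, g in zip(head_scores, hand_scores)]
    total = sum(fused)
    return [s / total for s in fused]

def classify(head_scores, hand_scores):
    """Return the behaviour label with the highest fused confidence."""
    fused = fuse(head_scores, hand_scores)
    return BEHAVIOURS[max(range(len(fused)), key=fused.__getitem__)]

# Example: head pose alone is ambiguous, but the hand-gesture channel
# strongly suggests a phone-holding gesture.
label = classify([0.4, 0.3, 0.2, 0.1], [0.1, 0.7, 0.1, 0.1])
```

In practice a fusion scheme of this kind lets one modality compensate when the other is occluded or ambiguous, which is the motivation the book gives for combining sensing channels.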