This dataset contains data collected in our human-in-the-loop experiments on passenger comfort in autonomous vehicles.
The data include electroencephalogram (EEG) data from the EMOTIV EPOC X and EMOTIV EPOC+, ocular data from the Gazepoint GP3 HD and Seeing Machines FOVIO, heart rate, skin temperature, and skin conductance data from the Empatica E4, and subjective comfort level data recorded during the simulated autonomous vehicle journeys.
In this study, each participant experienced a series of virtual autonomous vehicle journeys on our high-fidelity driving simulator.
The driving simulator is displayed in Figure 1. The Stewart-platform-based simulator can generate 6-degree-of-freedom motions. A three-monitor display setup provides a wide field of view for participants and, combined with the platform motions, creates a high-fidelity experience.
We created 27 video-based journeys with synchronized motions as the stimuli in the experiment. We chose three typical road types, city, highway, and mountain/rural, for our journeys. On each road type, three different routes were defined for the simulated vehicle to complete. We also defined three driving styles for the vehicle: gentle, normal, and aggressive. A 3×3×3 factorial design was employed to generate a total of 27 journeys. Table I shows the setup for each journey we created, and Figure 2 shows several sample video frames from journeys on the different road types.
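The full-factorial design can be sketched as follows. The road types and driving styles are taken from the text; the generic route labels ("route 1" to "route 3") are placeholders, since the text does not name the individual routes.

```python
from itertools import product

# 3x3x3 factorial design: road type x route x driving style.
# Road types and driving styles come from the dataset description;
# the route labels below are illustrative placeholders.
road_types = ["city", "highway", "mountain/rural"]
routes = ["route 1", "route 2", "route 3"]
styles = ["gentle", "normal", "aggressive"]

journeys = list(product(road_types, routes, styles))
print(len(journeys))  # 27 journeys in total
```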
Participants reported their subjective comfort levels using a pressing device during the experiment. Meanwhile, a series of devices recorded physiological signals as objective measurements. Figure 3 shows the entire data acquisition system.
The scope of passenger comfort was limited to the behavioral factors of the vehicle, e.g., how the vehicle changes lanes or cruises at a specific speed. Under this definition, influences from other factors, e.g., ambient temperature and seating comfort, were excluded.
Besides the subjective comfort level data and physiological signals, additional contextual information can be extracted by labeling the video stimuli. We have created three spreadsheets, "City Maneuver Sequence.xlsx", "Highway Maneuver Sequence.xlsx", and "Mountain Maneuver Sequence.xlsx", containing the maneuvers performed by the vehicle and related contextual information extracted from the videos. Data in these spreadsheets can be used to analyze factors influencing passenger comfort, or as an example of how to extract useful information from the video stimuli.
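A minimal sketch of working with such a maneuver sequence is shown below. The column names used here ("maneuver", "start_s", "end_s") are assumptions for illustration only; the actual schema is given by the spreadsheet headers, so load the real files with `pandas.read_excel` and adapt the column names accordingly.

```python
import pandas as pd

# Illustrative rows mimicking an assumed maneuver-sequence layout.
# In practice, replace this with:
#   df = pd.read_excel("City Maneuver Sequence.xlsx")
# and check the actual column names in the spreadsheet.
rows = [
    {"maneuver": "lane change", "start_s": 12.0, "end_s": 18.5},
    {"maneuver": "left turn", "start_s": 30.0, "end_s": 36.5},
]
df = pd.DataFrame(rows)

# Example analysis step: compute each maneuver's duration, e.g. for
# aligning maneuvers with the synchronized physiological recordings.
df["duration_s"] = df["end_s"] - df["start_s"]
```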
For each participant, the experiment consisted of three separate sessions. In each session, the participant experienced only journeys from one specific road type. This design was intended to reduce participant fatigue. Each experimental session lasted approximately 45 minutes.
There are 21 folders under the root directory, each corresponding to one participant. Each participant folder contains three sub-folders holding the data recorded in each session. A readme document in each sub-folder provides detailed guidelines on parsing the data.
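The directory layout described above can be traversed as in the sketch below. The two-level structure (participant folder, then session sub-folder) comes from the text; the actual folder names are not specified here, so consult the per-session readme documents before relying on any naming convention.

```python
from pathlib import Path

def iter_sessions(root):
    """Yield (participant, session) folder-name pairs for the
    <root>/<participant>/<session>/ layout described in the text."""
    root = Path(root)
    for participant in sorted(p for p in root.iterdir() if p.is_dir()):
        for session in sorted(s for s in participant.iterdir() if s.is_dir()):
            yield participant.name, session.name
```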
You can download the dataset with the link below:
Clemson Comfort Dataset Project Page
Clemson University, Collaborative Robotics and Automation (CRA) Lab
Contact: yunyij@clemson.edu