Commit 0aea0aa9 authored by BROCHETON Gabriel

add instructions

parent ada0fd47
!! This project is only compatible with the HP Omnicept Reverb G2 VR Headset.
!! If you are using a different device, the data from the sensors will not be recorded and the controllers may not be recognized.
!! You need to place the config.txt file at %appdata%\..\LocalLow\CERV\vr-ppe\config.txt
<<< bindings >>>
To be sure that you are using the right keys, please refer to the image "Know your controllers".
The primary buttons are X and A.
The secondary buttons are Y and B.
Good luck
;to place at %appdata%\..\LocalLow\CERV\VR-PPE\
;create the path if it does not exist
[userid]
name = test
age = 23
health = good
[experiment]
;chosen scene: -calcul -doubleTask -tripleTaskNoInteruption -tripleTaskNASAtlx
type = doubleTask
;if true, begin with the hard version (true = 1, false = 0)
reverse = 0
;phase time in seconds
experienceTime = 20
;pause time in seconds
pauseTime = 30
;if true, propose a survey to evaluate cognitive load (true = 1, false = 0)
survey = 1
;allows skipping the tutorial; may cause saving issues with the cognitive load data (true = 1, false = 0)
skipTuto = 1
# Experiments
## progressive task
In this experiment you have to answer simple maths questions for a set period of time. <br>
A pause will occur, and you can answer a survey if you chose to (config file). <br>
After the pause it is time for some calculations again, but they will be a bit more complex. <br>
Throughout, you have to respond as quickly as possible to the sound so that your response time can be measured. <br>
To choose an answer, aim at the wanted key with your right controller and press **trigger** to select it. <br>
When you are satisfied, select the **=** key to validate. <br>
Use the left **joystick click** to respond to the audio. <br>
![ProgressiveTask](./Images/Screen_ProgressiveTask1.png)
## double task
In this experiment you have to stabilize the artificial horizon with the joystick and validate. <br>
A pause will occur, and you can answer a survey if you chose to (config file). <br>
After the pause the artificial horizon will come back, but there will also be some calculations to do. <br>
Throughout, you have to respond as quickly as possible to the sound so that your response time can be measured. <br>
To stabilize the artificial horizon, use the right **joystick up** and **joystick down**. <br>
To validate, use the **A** button (right primary button). <br>
To choose an answer, aim at the wanted key with your right controller and press **trigger** to select it. <br>
When you are satisfied, select the **=** key to validate. <br>
Use the left **joystick click** to respond to the audio. <br>
![DoubleTask](./Images/Screen_DoubleTask1.png)
## triple task
In this experiment you have to select the correct answer to the calculation among 3 buttons, for 1 minute [STEP 1]. <br>
Then you will have to guess the origin of the sound played while still doing the calculations [STEP 2]. <br>
Finally, in addition to the two other tasks, you will have to keep the artificial horizon as flat as possible [STEP 3]. <br>
After each step you can answer a survey if you chose to (config file). <br>
<span style="color:yellow">**WARNING**</span> You have a limited time to complete the audio and calculation tasks. <br>
To select the answer to the calculation, press the button with either controller (in the game). <br>
To select the origin of the sound, use the left joystick: **joystick up**, **joystick down**, **joystick left** or **joystick right**. <br>
To keep the artificial horizon flat, you will have to grab the levers above your head:
* To turn the artificial horizon counter-clockwise (trigonometric direction), pull the right lever
* To turn the artificial horizon clockwise, pull the left lever
* If the lever is green, there is no counter-rotation
* If it is yellow, there is half-speed counter-rotation
* If it is red, there is full-speed counter-rotation
![TripleTask](./Images/Screen_TripleTask1.png)
# Instructions
## game tasks
* calcul is the progressive task: **[SimpleCalcul](#simplecalcul)** then **[ComplexCalcul](#complexcalcul)** maths, with **[Audio](#audio)**
* doubleTask is **[SimpleCalcul](#simplecalcul)** and **[VerticalArtificialHorizon](#verticalartificialhorizon)**, with **[Audio](#audio)**
* tripleTaskNoInteruption is **[ButtonCalcul](#buttoncalcul)**, **[AudioOrigin](#audioorigin)** and **[RotationArtificialHorizon](#rotationartificialhorizon)**
* tripleTaskNASAtlx is the same as tripleTaskNoInteruption, with a pause between tasks to answer a survey
In the triple task experiment, errors are displayed and the game stops at the fifth error.
The first four errors are not eliminatory.<br>
Each time you give a wrong answer, an error is counted.<br>
Each time the counter drops to 0, an error is counted.<br>
Each time the artificial horizon exceeds 60 degrees, an error is counted.<br>
![Controllers inputs](./Images/Know_your_controllers.png)
## SimpleCalcul
Answer single-digit calculations on a calculator-type input, using the ray on the controllers<br>
**[0-9]** to write numbers<br>
**[-]** for negative numbers<br>
**[=]** to validate<br>
**[current entry]** to delete one character<br>
**trigger** to select a button<br>
## ComplexCalcul
Answer double-digit calculations on a calculator-type input, using the ray on the controllers<br>
**[0-9]** to write numbers<br>
**[-]** for negative numbers<br>
**[=]** to validate<br>
**[current entry]** to delete one character<br>
**trigger** to select a button<br>
## Audio
Press **joystick click** when you hear a sound.<br>
It serves as a reference event to measure your response time.
## VerticalArtificialHorizon
**joystick up** and **joystick down** to place the background on the line, and **A** to validate
## ButtonCalcul
Press the correct answer using the controller. <br>There is at least one correct answer among the three shown.
## AudioOrigin
Use **joystick up**, **joystick down**, **joystick left** and **joystick right** to guess the origin of the sound.
## RotationArtificialHorizon
Use the levers to offset the rotation speed of the background.<br>
The left lever will offset the rotation clockwise.<br>
The right lever will offset the rotation anticlockwise.<br>
[Return to ReadMe](./../ReadMe.md)
[Go to setup](./setup.md)
general presentation
|_ sensors
|_ scenes
|_ instructions
|_ controls
|_ installation
|_ runtime
|_ config
|_ protocol
# Sensors
This headset allows us to access multiple sensors.
Sensor data is saved in the location "ApplicationDataPath\SensorsCSVData\".
## Raw sensors
The recorded sensors are:

| Sensor | Eyes | Vector |
|--------|------|--------|
| HeartRate | | |
| EyeGaze | [L+R] | [x,y,z] |
| EyeGazeConfidence | [L+R] | |
| EyeOpenness | [L+R] | |
| EyeOpennessConfidence | [L+R] | |
| EyePupilDilation | [L+R] | |
| EyePupilDilationConfidence | [L+R] | |
| EyePupilPosition | [L+R] | [x,y] |
| CombinedGaze | | [x,y,z] |
| CombinedGazeConfidence | | |
| CognitiveLoadValue | | |
| CognitiveLoadStandardDeviation | | |

[L+R]: a sensor is present for each eye<br>
[x,y,z]: the number of dimensions contained in the vector.
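As an illustration, such a per-sensor CSV can be loaded with Python's standard `csv` module, for example to compute the average cognitive load of a session. This is only a sketch: the exact file and column names (here `CognitiveLoadValue` and a leading timestamp column) are assumptions to be checked against a recorded file.

```python
import csv
from statistics import mean

def average_column(csv_path, column):
    """Average one numeric column of a per-sensor CSV file.

    The column name is an assumption; inspect a recorded file
    under ApplicationDataPath\\SensorsCSVData\\ to confirm it.
    """
    with open(csv_path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return mean(values)
```

For instance, `average_column(path, "CognitiveLoadValue")` would give the mean cognitive load over the whole recording.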
## Completion information
From the game we also save several more pieces of information:
* **Stage** : the current stage of the game.
* **TaskAudio** : when an audio task starts.
* **TaskMath** : when a calculation task starts.
* **Stabilization** : when the player is interacting with the artificial horizon.
* **AnswerAudio** : errors and correct answers in the audio task.
* **AnswerMath** : errors and correct answers in the calculation task.
## Additional information
More information is saved into a text file next to the CSV.
After each part of the experiment, some NASA-TLX questions are asked:
* **Mental Demand** : How much mental and perceptual activity was required? Was the task easy or demanding, simple or complex?
* **Performance** : How successful were you in performing the task? How satisfied were you with your performance?
* **Effort** : How hard did you have to work (mentally and physically) to accomplish your level of performance?
* **Frustration** : How irritated, stressed, and annoyed versus content, relaxed, and complacent did you feel during the task?
The file also contains the average response time for the simpler task and for the more complex one.<br>
Finally, we save the average cognitive load per part (calculated by the headset).
[Return to ReadMe](./../ReadMe.md)
# Set Up
## Windows Mixed Reality
The first step to get started with the Omnicept Headset is to set it up within Windows Mixed Reality. All the steps to do this are described in the [pdf](./Images/Headset_Set-Up.pdf).
## Runtime
The Runtime is a system designed to connect the Headset sensors to applications running on a single PC. The Runtime can be downloaded at https://developers.hp.com/omnicept/downloads: under the "Omnicept Users" option, click the "Download latest HP Omnicept Runtime" button, as indicated in red in the following image:
![](/Instructions/Images/ButtonDownloadRuntime.png)
A new web page will load and the download will start automatically. If the download doesn't begin, check whether the browser blocked a pop-up and click to allow it:
![](/Instructions/Images/PopUpAllow.png)
then click "Try downloading again":
![](/Instructions/Images/TryDownloadAgain.png)
After the download is complete, open the executable and follow the installation steps. On the final screen, check the box "Would you like to enable all your sensors?", and then click "Finish":
![](/Instructions/Images/EnableSensors.png)
A request to restart the system will appear to apply the Runtime settings; click "Yes". After the system restarts, the Runtime is ready.
## Eye Tracking Calibration
The first time you use the Headset, you need to calibrate the Eye Tracking sensor. To do this, run the application "HP Omnicept Eye Tracking Calibration", which was installed along with the Runtime and can be found by searching for its name in the Windows Menu.
## Config File
### Initialisation
<span style="color:yellow">**WARNING**</span> You need to place the config file at the right location, otherwise you will not be able to change the parameters of the experiment.<br>
Open the Start menu by pressing the Win key on your keyboard.
Type the following directory and press Enter.
> %appdata%\\..\LocalLow\
Create the remainder of the path
> CERV\VR-PPE\
Copy config.txt into the VR-PPE folder.<br>
Before starting the experiment, fill in the config file with the subject's information.<br>
At the end of the experiment you will get a folder named after the subject, containing the raw sensor data and the results of the surveys.
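The steps above can also be scripted. The following is a minimal Python sketch, assuming config.txt sits next to the script; it resolves `%appdata%` via the `APPDATA` environment variable and builds the sibling `LocalLow` folder explicitly:

```python
import os
import shutil
from pathlib import Path

def install_config(source=Path("config.txt")):
    """Create %appdata%\\..\\LocalLow\\CERV\\VR-PPE and copy config.txt into it."""
    # %appdata% resolves to ...\AppData\Roaming; LocalLow is its sibling folder.
    appdata = Path(os.environ["APPDATA"])
    target_dir = appdata.parent / "LocalLow" / "CERV" / "VR-PPE"
    target_dir.mkdir(parents=True, exist_ok=True)  # create the path if it does not exist
    target = target_dir / "config.txt"
    shutil.copyfile(source, target)
    return target
```

Running `install_config()` once per machine is enough; afterwards you only edit the copied file.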
### Parameters
**[userid]** is the section for the subject's personal information; it is here to help you find the data afterwards
* **name** : you can write the full name of the subject; the output folder will be named after it
* **age** : just for statistics
* **health** : you can report here any issue that might influence the experiment
**[experiment]** is the section to modify the parameters of the experiment
* **type** : this is where you choose the type of experiment
* **calcul** : [Progressive Task](./instructions.md#progressive-task)
* **doubleTask** : [Double Task](./instructions.md#double-task)
* **tripleTaskNoInteruption** : [Triple Task](./instructions.md#triple-task)
* **tripleTaskNASAtlx** : [Triple Task](./instructions.md#triple-task)
* **reverse** : set it to **1** if you want to begin with the complex tasks; only works with [Double Task](./instructions.md#double-task) and [Triple Task](./instructions.md#triple-task)
* **experienceTime** : set the duration of each part of the experiment, in seconds (default value = 180, i.e. 3 minutes for each part, 6 minutes in total)
* **pauseTime** : set the duration of the pause between parts, in seconds<br>
<span style="color:yellow">**WARNING**</span> only works if **survey** is disabled<br>
(default value = 30)
* **survey** : displays a quiz derived from the NASA-TLX at the end of each part<br>answers are logged in a text file with the average response time and cognitive load per part
* **skipTuto** : allows you to skip the tutorial and begin the experiment right away.<br><span style="color:yellow">**WARNING**</span> cognitive load logging only starts after one minute of play, so if you skip the tutorial make sure not to start right away.
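The game's own parser is not part of this excerpt, but as a sketch, a file in the format above happens to be readable with Python's standard `configparser` (section and key names are taken from the config.txt shown earlier; the fallback values are assumptions):

```python
import configparser

def load_experiment_config(path):
    """Read config.txt: ';' comment lines and 'key = value' pairs."""
    parser = configparser.ConfigParser()  # ';' comments are handled by default
    parser.read(path)
    exp = parser["experiment"]
    return {
        "type": exp.get("type", "doubleTask"),
        "reverse": exp.getboolean("reverse", fallback=False),   # "0"/"1" parse as booleans
        "experienceTime": exp.getint("experienceTime", fallback=180),
        "pauseTime": exp.getint("pauseTime", fallback=30),
        "survey": exp.getboolean("survey", fallback=True),
        "skipTuto": exp.getboolean("skipTuto", fallback=False),
    }
```

This can be handy for sanity-checking a config file before a session, e.g. confirming that **type** is one of the four valid scene names.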
## Omnicept Sensor Diagnostic
To verify that the Headset sensors work, there is an application for this: the Omnicept Sensor Diagnostic. You can download it at https://developers.hp.com/omnicept/downloads; scroll the page down to the Tools section, where you will find the download option. An HP account is needed to download the application.
[Return to ReadMe](./../ReadMe.md)
[Go to instructions](./instructions.md)
# vr-ppe-build
<span style="color:yellow">**WARNING**</span> This project is only compatible with the HP Omnicept Reverb G2 VR Headset.
<span style="color:yellow">**WARNING**</span> If you are using a different device, the data from the sensors will not be recorded and the controllers may not be recognized.
This project aims to develop a virtual reality application to register, verify and validate the sensors' data of the HP Omnicept Reverb G2 VR Headset.
The Headset has four sensors:
- Eye Tracking
- Pupillometry
- Heart rate
- Cognitive Load
The Cognitive Load sensor is meant to estimate the mental effort of the user through an artificial intelligence model fed with the user's physiological data, acquired by the other sensors while executing tasks. Therefore, the project is mainly focused on analyzing the data from this sensor.
The virtual reality application was developed using Unity. Three scenes were developed, each one with different tasks to stimulate the Cognitive Load sensor.
For each played session, the sensors' data are saved in a CSV file. A Python application was developed to read the CSV file, manipulate the data and plot charts, facilitating data analysis.
<span style="color:red">**IMPORTANT**</span>
To configure the environment to use the headset, go check: [Setup](./Instructions/setup.md)
For more information on how to configure and play the scenes, go check:
[Instructions](./Instructions/instructions.md)
In order to understand the sensors' data and the Python application, go check:
[Sensors](./Instructions/sensors.md)