Jia, Jingyi 3 years ago
parent
commit
366e44edfa
47 changed files with 168 additions and 31 deletions
  1. .DS_Store (BIN)
  2. Hector_v2/.DS_Store (BIN)
  3. Hector_v2/Assets/.DS_Store (BIN)
  4. Hector_v2/Library/.DS_Store (BIN)
  5. Hector_v2/obj/.DS_Store (BIN)
  6. Thesis.md (+35 -4)
  7. User Study/.DS_Store (BIN)
  8. User Study/Google Form/.DS_Store (BIN)
  9. User Study/Google Form/Hector V2 Nutzerstudie.csv (+10 -3)
  10. User Study/Google Form/I found it easy to concentrate on controlling the robot.jpg (BIN)
  11. User Study/Google Form/I found it easy to move robot in desired position.jpg (BIN)
  12. User Study/Google Form/I found it easy to perceive the details of the environment.jpg (BIN)
  13. User Study/Google Form/statistic.py (+1 -0)
  14. User Study/TLX/HectorVR-10.csv (+5 -0)
  15. User Study/TLX/HectorVR-11.csv (+5 -0)
  16. User Study/TLX/HectorVR-12.csv (+5 -0)
  17. User Study/TLX/HectorVR-5.csv (+5 -0)
  18. User Study/TLX/HectorVR-6.csv (+6 -0)
  19. User Study/TLX/HectorVR-7.csv (+5 -0)
  20. User Study/TLX/HectorVR-8.csv (+4 -0)
  21. User Study/TLX/HectorVR-9.csv (+5 -0)
  22. User Study/TLX/Mean.csv (+5 -0)
  23. User Study/TLX/Merged.csv (+33 -0)
  24. User Study/TLX/effort.png (BIN)
  25. User Study/TLX/frustration.png (BIN)
  26. User Study/TLX/mental-demand.png (BIN)
  27. User Study/TLX/performance.png (BIN)
  28. User Study/TLX/physical-demand.png (BIN)
  29. User Study/TLX/standard_deviation.csv (+5 -0)
  30. User Study/TLX/summary.jpg (BIN)
  31. User Study/TLX/temporal-demand.png (BIN)
  32. User Study/TLX/total.png (BIN)
  33. User Study/TestResult/.DS_Store (BIN)
  34. User Study/TestResult/1.csv (+0 -5)
  35. User Study/TestResult/10.csv (+5 -0)
  36. User Study/TestResult/11.csv (+5 -0)
  37. User Study/TestResult/12.csv (+5 -0)
  38. User Study/TestResult/2.csv (+0 -5)
  39. User Study/TestResult/3.csv (+0 -5)
  40. User Study/TestResult/4.csv (+0 -5)
  41. User Study/TestResult/6.csv (+5 -0)
  42. User Study/TestResult/7.csv (+5 -0)
  43. User Study/TestResult/8.csv (+5 -0)
  44. User Study/TestResult/9.csv (+5 -0)
  45. User Study/TestResult/Rescue situation.png (BIN)
  46. User Study/TestResult/Robot Performance.png (BIN)
  47. User Study/TestResult/statistic.py (+4 -4)

BIN
Hector_v2/.DS_Store


BIN
Hector_v2/Assets/.DS_Store


BIN
Hector_v2/Library/.DS_Store


BIN
Hector_v2/obj/.DS_Store


+ 35 - 4
Thesis.md

@@ -310,7 +310,7 @@ After all the tests were completed, participants were asked to compare the four
 >
 > experience vr
 
-A total of 8 volunteers participated in the user study (2 females and 6 males between 22 and 30 years, mean age 29.25 years). Five of them were computer science students at the university. Only two participants had previous experience with VR,  and both had played it only a few times.
+A total of 8 volunteers participated in the user study (3 females and 5 males between 22 and 32 years, mean age 25.75 years). Five of them were computer science students at the university. Four participants had previous experience with VR, but all of them had used it only a few times.
 
 ##### Quantitative Results
 
@@ -330,7 +330,7 @@ During the test run, the following data were recorded.
 
 % `collision`
 
-The first is the number of collisions between the robot and objects in the scene, which reflects the probability of the robot being destroyed in different operation modes. The assumption before the experiment started was that the number of collisions should be highest in lab mode as well as in UI mode. This is because the two modes involve frequent switching back and forth between the scene and the console or control panel, which may result in the user not being able to concentrate on observing the obstacles in the scene.  【结果】
+The first is the number of collisions between the robot and objects in the scene, which reflects the probability of the robot being destroyed in each operation mode. The assumption before the experiment was that the number of collisions would be highest in Lab Mode and Handle Mode, because these two modes involve frequently switching back and forth between the scene and the console or screen, which may prevent the user from concentrating on the obstacles in the scene. [Results]
 
 
 
@@ -366,8 +366,9 @@ A questionnaire was used to get their feedback:
 
 ```latex
 \begin{enumerate}
-\item
-
+\item I found it easy to move robot in desired position.
+\item I found it easy to concentrate on controlling the robot.
+\item I found it easy to perceive the details of the environment.
 \end{enumerate}
 ```
 
@@ -379,13 +380,43 @@ A questionnaire was used to get their feedback:
 >
 > other feedbacks
 
+This section will discuss the feedback from the participants. Overall, every participant gave positive comments about operating the robot on a \gls{vr} platform. They thought the proposed system was exciting and allowed them to perceive more details of the post-disaster environment than traditional video-based operation. The feedback obtained for each mode will be listed next.
+
+Five of the eight participants (62.5%) ranked Lab Mode as the least preferred mode. Some participants were very unaccustomed to using \gls{vr} handles to grasp objects, which made it difficult for them to operate the robot smoothly with the virtual joysticks. Those with \gls{vr} experience, even without any hints or training, subconsciously understood what each button and joystick represented and were able to operate the robot directly. Nevertheless, for the actual rescue task in the test, both groups of participants reported that operating the robot was more complex and difficult than in the other modes. Participants attributed the difficulty to obstacles in the environment. One participant said: "There is no physical access to the joystick. So it is slightly tough for me to control the robot." In some cases, when the robot was stuck in a corner, it took considerable effort to get it out of that situation. Also, since Lab Mode uses simulated screens, it is not as good as the other three modes for observing details of the scene. Participants felt that the simulated screens were blurry, and the frequent switching between multiple screens made them very tired.
+
+%Handle
+Handle Mode uses the motion controllers directly to move the robot, and the user can open and close the two monitoring screens with a button. The evaluation of this operation mode depends in large part on the design of the motion controllers. More than half of the users thought that the \gls{htc} motion controllers made them less flexible when steering the robot. Participants were often unable to touch the correct position on the touchpad, and accidental touches were very likely. At the end of the experiment, these participants were additionally invited to operate the robot again using a \gls{vr} controller with joysticks, and they said that joysticks made it easier to control the direction. Some participants said that they did not like the two monitoring screens provided by this mode: the additional screens subconsciously distracted them and prevented them from concentrating on the rescue mission. Others, however, thought the monitors were particularly helpful. Since it was very difficult to control the robot while teleporting themselves, they first relied on the monitor screen to drive the robot to a place and then teleported themselves to the robot's location. The experiment also found that participants tended to forget that the two monitor screens could be closed; they usually dragged the screens to places where they did not block their view and dragged them back when they wanted to use them.
+
+Remote Mode and UI Mode, which use the intelligent obstacle-avoidance navigation algorithm, were the best received. Participants felt that in both modes they did not need to worry about controlling the robot's steering and forward speed; the computer was responsible for everything, allowing them to focus on exploring the virtual world.
 
+For UI Mode, one of the participants remarked: "I can just let the robot follow me. I don't need to think about how to operate the robot. This way I can concentrate on the rescue." In the experiment, it was observed that none of the participants used the direction buttons or the monitoring screens in the virtual menu. At the beginning of the test, they all turned on the Follow Me function directly and adjusted the robot's driving speed to the maximum. After that, the robot was more like a movable \gls{lidar} sensor, so these participants could completely disregard the robot's location and simply explore the \gls{vr} world on their own. One participant teleported so fast that when he reached a location and had been waiting for a while, the robot was still on its way. In fact, the problem of not being able to find the robot occurs in Handle Mode as well.
+
+In contrast, Remote Mode solves the problem of the robot not being in view. One participant stated: “The robot is always in sight, so I don't have to waste extra time looking for the robot. Control of the robot is also very easy.” Another participant reflected that after setting the robot's destination, he would subconsciously watch the robot's movement so that it always stayed in his field of view. Participants found it very easy to operate the robot in this mode. Many alternated between the right- and left-hand rays, first setting the robot's target point with the right-hand ray and then teleporting themselves there with the left-hand ray. The safety measure provided (the remote control) was almost never used in the actual test. When the robot was unable to navigate automatically to the destination, participants preferred to reset the destination point or move themselves.
+
+In addition, participants got lost in every operation mode: they would forget whether they had already visited a place.
 
 ##### Discussion
 
+In this section, some possible modifications will be given based on the data obtained from the tests and the feedback from the participants. At the end of the section, some ideas for developing an ideal \gls{vr}-based interaction approach for operating robots will be summarized.
+
+In general, an ideal \gls{vr}-based robot operation method should take as much complexity away from the user as possible. For Lab Mode, the participants' least favorite mode and the one they found very complicated and difficult to operate, it can be concluded that unless the \gls{vr} system is developed for training operators to control the robot in a real environment, a lab-like operation mode is not desirable. If one wants to develop an interaction approach like Handle Mode, the choice of \gls{vr} controller should be taken into account. Motion controllers similar to those of the Oculus Quest and Valve Index are recommended because joysticks are easier to operate than touchpads.
+
+
+
+###### Obstacle avoidance algorithm
+
+
+Remote Mode and UI Mode, which use the intelligent obstacle-avoidance algorithm, were well received. In Remote Mode, users set the driving destination with a ray. In UI Mode, the robot can directly follow the user's position in the virtual world. Participants felt that in both modes they did not need to worry about controlling the robot's steering and forward speed; the computer was responsible for everything, allowing them to focus on exploring the virtual world. However, both control modes require improvement. First of all, the safety measures provided (the remote control in Remote Mode, the orientation buttons in UI Mode) were not used in the actual test. When the robot was unable to navigate automatically to the destination, participants preferred to reset the destination point or move themselves. UI Mode was rated better than Remote Mode by the participants, but as a bystander, I observed some points that could be dangerous for the robot. When participants turned on the Follow function, the robot became more like a movable \gls{lidar} sensor, and they no longer paid attention to whether the robot behind them might be damaged. In an actual disaster relief situation, in case of an unexpected event such as a secondary collapse, the user might not be able to detect it and take action in time. In addition, both modes are highly dependent on the intelligent obstacle-avoidance algorithm. The proposed system currently uses the \gls{nav} component and simulates a post-disaster scene instead of reconstructing one by scanning the real site with \gls{lidar}. Therefore, an intelligent obstacle-avoidance algorithm is still needed when using a real robot, and it should be accurate enough that the user can rely entirely on the computer to control the robot.
+
+In general, an ideal \gls{vr}-based robot operation method should take as much complexity away from the user as possible. Moreover, the choice of \gls{vr} controller should be taken into account during development. Motion controllers similar to those of the Oculus Quest and Valve Index are recommended because joysticks are easier to operate than touchpads. Some limitations are also worth noting. Because there were only eight participants, and most of them had no experience with \gls{vr}, the collected data and the conclusions drawn might not be entirely reliable. After professional training, when users can operate \gls{vr} equipment flexibly, some features that are now considered redundant may instead be helpful in practical use.
+
+
+
+###### Reducing the number of screens
 
 
 
+###### Map to locate the robot and show the user's traveled path
 
 # Conclusion
 

BIN
User Study/.DS_Store


BIN
User Study/Google Form/.DS_Store


+ 10 - 3
User Study/Google Form/Hector V2 Nutzerstudie.csv

@@ -1,3 +1,10 @@
-时间戳记,How old are you?,What is your gender?,What is or was your major (e.g. Computer Science)?,How much VR experience do you have?, I found it easy to move robot in desired position, I found it easy to concentrate on controlling the robot, I found it easy to perceive the details of the environment, I found it easy to move robot in desired position, I found it easy to concentrate on controlling the robot, I found it easy to perceive the details of the environment, I found it easy to move robot in desired position, I found it easy to concentrate on controlling the robot, I found it easy to perceive the details of the environment, I found it easy to move robot in desired position, I found it easy to concentrate on controlling the robot, I found it easy to perceive the details of the environment,Which operation mode do you like best? ,Why do you like it best? ,Which operation mode do you dislike the most?,Why do you dislike it the most? 
-2021/07/06 10:43:12 上午 GMT+2,22,Female,,5,2,3,4,2,4,1,2,4,4,1,4,4,Handle,,Lab,
-2021/07/06 10:44:10 上午 GMT+2,25,Male,,2,4,1,3,1,3,4,2,3,3,2,3,5,Handle,(Handle) because xxxxxx,Lab,(Lab) xxxxxxx
+"时间戳记","How old are you?","What is your gender?","What is or was your major (e.g. Computer Science)?","How much VR experience do you have?"," I found it easy to move robot in desired position"," I found it easy to concentrate on controlling the robot"," I found it easy to perceive the details of the environment"," I found it easy to move robot in desired position"," I found it easy to concentrate on controlling the robot"," I found it easy to perceive the details of the environment"," I found it easy to move robot in desired position"," I found it easy to concentrate on controlling the robot"," I found it easy to perceive the details of the environment"," I found it easy to move robot in desired position"," I found it easy to concentrate on controlling the robot"," I found it easy to perceive the details of the environment","Which operation mode do you like best? ","Why do you like it best? ","Which operation mode do you dislike the most?","Why do you dislike it the most? "
+"2021/07/16 4:51:55 下午 GMT+2","32","Male","Computer Science","2","5","5","3","5","5","4","5","5","5","5","5","5","UI","(UI) Most easiest to control (less dimensions)","Handle","(Handle) Too many things to control at the same time"
+"2021/07/16 4:53:22 下午 GMT+2","27","Male","Computer Science","1","3","2","2","2","5","1","5","5","5","5","5","5","UI","(UI) I can rescue people without concentrating on controlling car","Lab","(Lab) too many screens, so that I need always shift my head to find the target"
+"2021/07/17 2:52:00 下午 GMT+2","24","Female","Industrial Engineering","1","4","3","3","2","2","2","5","4","4","5","5","5","Remote","(Remote) I can very simply command the robot to detect the designated target area, while I can manually search for targets in the explored area. Dual-threaded work is more efficient.","Lab","The physiological workload of operating the robot through the two-handed paddle is relatively high and requires a long time to maintain a posture."
+"2021/07/17 3:57:35 下午 GMT+2","27","Female","Wirtschaftsinformaitk","1","4","3","4","5","5","3","5","5","5","2","3","4","Remote","(Remote) The robot is always in sight, so I don't have to waste extra time looking for the robot. Control of the robot is also very easy.","Handle","(Handle) The operation of the rescue and the operation of controlling the robot are different, and it takes time to adapt to switch between these two operations. After saving people, sometimes I can't find the location of the robot, and it also takes time to find the robot."
+"2021/07/17 4:36:09 下午 GMT+2","28","Female","Wirschaftsinformatik","2","4","5","4","3","4","2","5","5","5","5","5","5","UI","(UI)In this mode, I can just let the robot follow me. I don't need to think about how to operate the robot. This way I can concentrate on the rescue. In this mode can save more people in a shorter time.","Lab","(Lab) In lab mode, I need to control the robot while performing rescue. Because it is VR, so there is no physical access to the joystick, which is slightly tough to control the robot. I need to be distracted while looking for people while manipulating the robot, it feels a little laborious and not very time-saving."
+"2021/07/17 6:47:01 下午 GMT+2","23","Male","information and communication engieering","2","4","4","5","4","4","4","5","5","4","5","5","3","Remote","it is not difficult to use the Car to explore the unknown area, and not much phyical and mental efforts are required to finish the task ","Lab","1. Blurred Images 
+2. Operation accuracy cant be guaranteed, error operation possible"
+"2021/07/18 3:26:59 下午 GMT+2","23","Male"," Maschinenbau","1","3","2","4","3","3","1","5","5","5","5","4","5","UI","(UI) Don't have to concentrate on controlling the robot.","Lab","(Lab) Burden on arms. Image on monitor unclear."
+"2021/07/18 4:07:00 下午 GMT+2","22","Male","Chemie","2","4","3","4","2","3","1","5","4","4","4","5","3","Remote","(Remote)auto driving makes the control easy and I can focus myself on the rescue task","Handle","(Handle)hard to play with the controller"

BIN
User Study/Google Form/I found it easy to concentrate on controlling the robot.jpg


BIN
User Study/Google Form/I found it easy to move robot in desired position.jpg


BIN
User Study/Google Form/I found it easy to perceive the details of the environment.jpg


+ 1 - 0
User Study/Google Form/statistic.py

@@ -42,6 +42,7 @@ def draw2(filename,start):
         box.set_alpha(a)
         box.set( facecolor = c )
     plt.title(filename, fontsize=15)
+    plt.ylim(0.5,5.5)
     plt.savefig(filename+".jpg",dpi=300)
     #plt.show()
 

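For reference, a minimal self-contained sketch of the kind of plot draw2 produces after this change: one Likert question compared across the four modes, with the y-axis clamped to the full 1-5 scale so the saved images are comparable. The column indices and mode labels are illustrative assumptions, not taken from the committed script:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Sketch of a Likert boxplot with a fixed y-axis, mirroring the
# plt.ylim(0.5, 5.5) change above. Column indices and mode labels are
# assumptions for illustration only.
df = pd.read_csv("User Study/Google Form/Hector V2 Nutzerstudie.csv")
# Columns 5-16 hold the 4 x 3 Likert answers; pick one question per mode.
data = [df.iloc[:, c] for c in (5, 8, 11, 14)]

plt.boxplot(data, labels=["Mode 1", "Mode 2", "Mode 3", "Mode 4"],
            patch_artist=True)
plt.ylim(0.5, 5.5)  # clamp to the 1-5 Likert range
plt.savefig("likert-example.jpg", dpi=300)
```
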
+ 5 - 0
User Study/TLX/HectorVR-10.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-10,Lab,65,25,5,35,80,5,35.83333,2021-07-17 16:07:20
+2,HectorVR-10,Remote,25,20,5,80,35,5,28.33333,2021-07-17 16:08:30
+3,HectorVR-10,UI,25,15,35,100,65,5,40.83333,2021-07-17 16:16:49
+4,HectorVR-10,Handle,70,50,15,85,55,15,48.33333,2021-07-17 16:28:57
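
In these TLX exports, the total column is consistent with the unweighted (raw) NASA-TLX score, i.e. the plain mean of the six subscales. A quick sketch to verify, assuming pandas:

```python
import pandas as pd

# Check that "total" is the unweighted mean of the six TLX subscales,
# e.g. the Lab row above: (65 + 25 + 5 + 35 + 80 + 5) / 6 = 35.83333.
subscales = ["mental-demand", "physical-demand", "temporal-demand",
             "performance", "effort", "frustration"]

df = pd.read_csv("User Study/TLX/HectorVR-10.csv")
print((df[subscales].mean(axis=1) - df["total"]).abs().max())  # ~1e-5
```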

+ 5 - 0
User Study/TLX/HectorVR-11.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-11,Remote,40,55,55,70,75,60,59.16667,2021-07-17 17:07:51
+2,HectorVR-11,UI,25,40,25,75,70,30,44.16667,2021-07-17 17:21:37
+3,HectorVR-11,Handle,60,60,65,45,70,60,60,2021-07-17 18:20:52
+4,HectorVR-11,Lab,80,90,85,15,90,70,71.66667,2021-07-17 18:38:16

+ 5 - 0
User Study/TLX/HectorVR-12.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-12,UI,5,55,70,5,35,15,30.83333,2021-07-18 14:15:16
+2,HectorVR-12,Handle,80,80,85,40,80,35,66.66667,2021-07-18 14:44:48
+3,HectorVR-12,Lab,90,100,90,80,80,80,86.66667,2021-07-18 14:58:27
+4,HectorVR-12,Remote,15,25,50,15,25,25,25.83333,2021-07-18 15:27:48

+ 5 - 0
User Study/TLX/HectorVR-5.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-5,Handle,70,35,75,55,40,75,58.33333,2021-07-11 15:56:08
+2,HectorVR-5,Lab,25,10,40,25,25,25,25,2021-07-11 16:10:33
+3,HectorVR-5,Remote,10,15,15,15,20,20,15.83333,2021-07-11 16:26:22
+4,HectorVR-5,UI,10,10,10,5,10,10,9.16667,2021-07-11 16:45:25

+ 6 - 0
User Study/TLX/HectorVR-6.csv

@@ -0,0 +1,6 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+2,HectorVR-13,Lab,70,90,100,75,75,75,80.83333,2021/7/18 14:35
+3,HectorVR-13,Remote,30,60,20,5,20,10,24.16667,2021/7/18 15:51
+4,HectorVR-13,UI,15,5,80,15,15,15,24.16667,2021/7/18 16:01
+1,HectorVR-13,Handle,80,85,80,55,40,65,67.5,2021/7/18 15:37
+,,,,,,,,,,

+ 5 - 0
User Study/TLX/HectorVR-7.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-7,Remote,15,45,35,40,15,5,25.83333,2021-07-13 15:14:02
+2,HectorVR-7,UI,5,15,15,20,10,20,14.16667,2021-07-13 15:27:23
+3,HectorVR-7,Handle,80,65,75,65,80,75,73.33333,2021-07-13 15:53:35
+4,HectorVR-7,Lab,85,85,70,55,85,85,77.5,2021-07-13 15:55:27

+ 4 - 0
User Study/TLX/HectorVR-8.csv

@@ -0,0 +1,4 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-8,UI,5,10,10,10,25,15,12.5,2021-07-17 14:17:06
+2,HectorVR-8,Handle,60,30,65,30,50,30,44.16667,2021-07-17 14:18:10
+3,HectorVR-8,Lab,50,80,60,35,90,60,62.5,2021-07-17 14:31:46

+ 5 - 0
User Study/TLX/HectorVR-9.csv

@@ -0,0 +1,5 @@
+id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+1,HectorVR-9,Handle,85,15,15,25,75,25,40,2021-07-17 15:08:11
+2,HectorVR-9,Lab,25,45,65,75,75,35,53.33333,2021-07-17 15:23:08
+3,HectorVR-9,Remote,15,15,35,90,25,15,32.5,2021-07-17 15:34:13
+4,HectorVR-9,UI,35,45,35,65,55,65,50,2021-07-17 15:45:15

+ 5 - 0
User Study/TLX/Mean.csv

@@ -0,0 +1,5 @@
+condition,id,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total
+Handle,2.125,73.125,52.5,59.375,50.0,61.25,47.5,57.29166625
+Lab,2.625,61.25,65.625,64.375,49.375,75.0,54.375,61.66666625
+Remote,2.4285714285714284,21.428571428571427,33.57142857142857,30.714285714285715,45.0,30.714285714285715,20.0,30.238094285714286
+UI,2.625,15.625,24.375,35.0,36.875,35.625,21.875,28.2291675

+ 33 - 0
User Study/TLX/Merged.csv

@@ -0,0 +1,33 @@
+,id,participant,condition,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total,time
+0,1.0,HectorVR-5,Handle,70.0,35.0,75.0,55.0,40.0,75.0,58.33333,2021-07-11 15:56:08
+1,2.0,HectorVR-5,Lab,25.0,10.0,40.0,25.0,25.0,25.0,25.0,2021-07-11 16:10:33
+2,3.0,HectorVR-5,Remote,10.0,15.0,15.0,15.0,20.0,20.0,15.83333,2021-07-11 16:26:22
+3,4.0,HectorVR-5,UI,10.0,10.0,10.0,5.0,10.0,10.0,9.16667,2021-07-11 16:45:25
+4,1.0,HectorVR-7,Remote,15.0,45.0,35.0,40.0,15.0,5.0,25.83333,2021-07-13 15:14:02
+5,2.0,HectorVR-7,UI,5.0,15.0,15.0,20.0,10.0,20.0,14.16667,2021-07-13 15:27:23
+6,3.0,HectorVR-7,Handle,80.0,65.0,75.0,65.0,80.0,75.0,73.33333,2021-07-13 15:53:35
+7,4.0,HectorVR-7,Lab,85.0,85.0,70.0,55.0,85.0,85.0,77.5,2021-07-13 15:55:27
+8,2.0,HectorVR-13,Lab,70.0,90.0,100.0,75.0,75.0,75.0,80.83333,2021/7/18 14:35
+9,3.0,HectorVR-13,Remote,30.0,60.0,20.0,5.0,20.0,10.0,24.16667,2021/7/18 15:51
+10,4.0,HectorVR-13,UI,15.0,5.0,80.0,15.0,15.0,15.0,24.16667,2021/7/18 16:01
+11,1.0,HectorVR-13,Handle,80.0,85.0,80.0,55.0,40.0,65.0,67.5,2021/7/18 15:37
+12,,,,,,,,,,,
+13,1.0,HectorVR-11,Remote,40.0,55.0,55.0,70.0,75.0,60.0,59.16667,2021-07-17 17:07:51
+14,2.0,HectorVR-11,UI,25.0,40.0,25.0,75.0,70.0,30.0,44.16667,2021-07-17 17:21:37
+15,3.0,HectorVR-11,Handle,60.0,60.0,65.0,45.0,70.0,60.0,60.0,2021-07-17 18:20:52
+16,4.0,HectorVR-11,Lab,80.0,90.0,85.0,15.0,90.0,70.0,71.66667,2021-07-17 18:38:16
+17,1.0,HectorVR-10,Lab,65.0,25.0,5.0,35.0,80.0,5.0,35.83333,2021-07-17 16:07:20
+18,2.0,HectorVR-10,Remote,25.0,20.0,5.0,80.0,35.0,5.0,28.33333,2021-07-17 16:08:30
+19,3.0,HectorVR-10,UI,25.0,15.0,35.0,100.0,65.0,5.0,40.83333,2021-07-17 16:16:49
+20,4.0,HectorVR-10,Handle,70.0,50.0,15.0,85.0,55.0,15.0,48.33333,2021-07-17 16:28:57
+21,1.0,HectorVR-12,UI,5.0,55.0,70.0,5.0,35.0,15.0,30.83333,2021-07-18 14:15:16
+22,2.0,HectorVR-12,Handle,80.0,80.0,85.0,40.0,80.0,35.0,66.66667,2021-07-18 14:44:48
+23,3.0,HectorVR-12,Lab,90.0,100.0,90.0,80.0,80.0,80.0,86.66667,2021-07-18 14:58:27
+24,4.0,HectorVR-12,Remote,15.0,25.0,50.0,15.0,25.0,25.0,25.83333,2021-07-18 15:27:48
+25,1.0,HectorVR-8,UI,5.0,10.0,10.0,10.0,25.0,15.0,12.5,2021-07-17 14:17:06
+26,2.0,HectorVR-8,Handle,60.0,30.0,65.0,30.0,50.0,30.0,44.16667,2021-07-17 14:18:10
+27,3.0,HectorVR-8,Lab,50.0,80.0,60.0,35.0,90.0,60.0,62.5,2021-07-17 14:31:46
+28,1.0,HectorVR-9,Handle,85.0,15.0,15.0,25.0,75.0,25.0,40.0,2021-07-17 15:08:11
+29,2.0,HectorVR-9,Lab,25.0,45.0,65.0,75.0,75.0,35.0,53.33333,2021-07-17 15:23:08
+30,3.0,HectorVR-9,Remote,15.0,15.0,35.0,90.0,25.0,15.0,32.5,2021-07-17 15:34:13
+31,4.0,HectorVR-9,UI,35.0,45.0,35.0,65.0,55.0,65.0,50.0,2021-07-17 15:45:15
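
Mean.csv and standard_deviation.csv above are consistent with a per-condition aggregation over this merged table: means with pandas defaults, standard deviations with ddof=0 (population std, as numpy computes it). Note the Remote means average 7 values, since HectorVR-8 has no Remote trial and the blank row 12 is dropped as NaN. A minimal sketch of that aggregation (output formatting differs slightly from the committed files):

```python
import pandas as pd

# Reproduce the numbers in Mean.csv and standard_deviation.csv from the
# merged TLX table. The all-NaN row above is ignored by groupby, which is
# why the Remote means divide by 7 (HectorVR-8 has no Remote trial).
merged = pd.read_csv("User Study/TLX/Merged.csv", index_col=0)
numeric = merged.drop(columns=["participant", "time"])

means = numeric.groupby("condition").mean()
stds = numeric.groupby("condition").std(ddof=0)  # matches committed values

print(means.loc["Handle", "mental-demand"])  # 73.125
print(stds.loc["Handle", "mental-demand"])   # 8.992184...
```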

BIN
User Study/TLX/effort.png


BIN
User Study/TLX/frustration.png


BIN
User Study/TLX/mental-demand.png


BIN
User Study/TLX/performance.png


BIN
User Study/TLX/physical-demand.png


+ 5 - 0
User Study/TLX/standard_deviation.csv

@@ -0,0 +1,5 @@
+,conditon,mental-demand,physical-demand,temporal-demand,performance,effort,frustration,total
+0,Handle,8.992184106211349,23.04886114323222,26.390990413396764,18.200274723201296,15.958931668504631,22.360679774997898,11.23262503176232
+1,Lab,23.946555075835022,31.862742741327214,28.55230945125105,23.642850399222173,19.685019685029527,27.321866242992993,20.691182703428854
+2,Remote,9.89743318610787,17.87142286170972,16.993396076243314,32.293298729878046,18.979043222266316,17.728105208558365,12.698538504833493
+3,UI,10.734727523323542,17.929985359726317,25.0,34.905005013608005,23.10810193417019,17.66661753137821,14.65717806822373

BIN
User Study/TLX/summary.jpg


BIN
User Study/TLX/temporal-demand.png


BIN
User Study/TLX/total.png


BIN
User Study/TestResult/.DS_Store


+ 0 - 5
User Study/TestResult/1.csv

@@ -1,5 +0,0 @@
-participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time,
-1,HandleTest,0,2,414.837,66.08962,6.276886,9,1,0,2021/07/10 14:41,
-1,LabTest,0,23,1045.603,172.5321,6.060342,6,3,1,2021/07/10 15:02,
-1,RemoteTest,0,8,456.9017,93.45023,4.889252,5,3,2,2021/07/10 15:13,
-1,UITest,0,7,407.7353,157.2579,2.592781,6,0,4,2021/07/10 15:22,

+ 5 - 0
User Study/TestResult/10.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
+10,LabTest,0,51,1407.924,279.7697,5.03244,5,5,0,2021/7/17 15:56
+10,RemoteTest,0,5,787.1339,193.2495,4.073148,8,1,1,2021/7/17 16:04
+10,UITest,13,4,816.5622,202.1584,4.039219,10,0,0,2021/7/17 16:15
+10,HandleTest,0,28,735.5511,162.4034,4.52916,9,0,1,2021/7/17 16:26
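
A consistency note on these TestResult files: the Adverage speed column (sic, as committed) equals Drive Distance divided by Total driving time. A quick sketch to check, assuming pandas:

```python
import pandas as pd

# Verify "Adverage speed" == Drive Distance / Total driving time, e.g. for
# participant 10's LabTest run: 1407.924 / 279.7697 = 5.03244.
df = pd.read_csv("User Study/TestResult/10.csv")
derived = df["Drive Distance"] / df["Total driving time"]
print((derived - df["Adverage speed"]).abs().max())  # ~1e-6
```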

+ 5 - 0
User Study/TestResult/11.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
+11,RemoteTest,0,2,424.6442,107.2922,3.957829,7,2,1,2021/7/17 16:59
+11,UITest,39,1,571.3905,147.8818,3.863832,10,0,0,2021/7/17 17:17
+11,HandleTest,0,30,638.7712,156.522,4.081032,6,4,0,2021/7/17 18:18
+11,LabTest,0,35,711.0004,278.0309,2.557271,3,6,1,2021/7/17 18:36

+ 5 - 0
User Study/TestResult/12.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time,
+12,UITest,34,7,576.2582,147.1039,3.917355,10,0,0,2021/07/18 14:08,
+12,HandleTest,0,2,420.2331,137.5351,3.055461,9,1,0,2021/07/18 14:42,
+12,LabTest,0,1,473.2173,205.2661,2.305385,6,3,1,2021/07/18 14:56,
+12,RemoteTest,0,1,334.7256,85.25519,3.926162,9,1,0,2021/07/18 15:09,

+ 0 - 5
User Study/TestResult/2.csv

@@ -1,5 +0,0 @@
-participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
-2,LabTest,0,13,847.2929,207.5591,4.082153,5,5,0,2021/7/10 16:09
-2,RemoteTest,0,4,470.4358,158.8244,2.961986,6,2,2,2021/7/10 16:22
-2,UITest,0,1,647.9003,235.7928,2.747753,7,2,1,2021/7/10 16:58
-2,HandleTest,0,13,807.7278,104.3061,7.743824,8,2,0,2021/7/10 17:08

+ 0 - 5
User Study/TestResult/3.csv

@@ -1,5 +0,0 @@
-participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
-3,RemoteTest,0,3,730.2812,251.9804,2.898167,7,3,0,2021/7/10 17:39
-3,UITest,0,4,625.7359,225.2351,2.778146,6,3,1,2021/7/10 17:49
-3,HandleTest,0,29,735.5511,162.4034,4.52916,7,3,0,2021/7/10 17:59
-3,LabTest,0,78,836.9151,251.1339,3.332545,3,5,2,2021/7/10 18:28

+ 0 - 5
User Study/TestResult/4.csv

@@ -1,5 +0,0 @@
-participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
-4,UITest,0,12,521.4961,197.3396,2.642632,6,1,3,2021/7/10 19:06
-4,HandleTest,0,29,554.6651,118.1866,4.693131,3,5,2,2021/7/10 19:37
-4,LabTest,0,24,619.068,195.1462,3.172281,2,7,1,2021/7/10 19:48
-4,RemoteTest,0,3,420.6864,143.7477,2.926556,7,2,1,2021/7/10 19:58

+ 5 - 0
User Study/TestResult/6.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time
+6,LabTest,0,23,594.4617,245.8493,2.417992,6,4,0,2021/7/18 14:28
+6,RemoteTest,72,5,531.0661,127.4891,4.165582,10,0,0,2021/7/18 15:49
+6,UITest,31,5,904.6816,219.5801,4.120053,10,0,0,2021/7/18 15:59
+6,HandleTest,0,13,539.8613,150.9477,3.57648,6,4,0,2021/7/18 15:35

+ 5 - 0
User Study/TestResult/7.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time,
+7,RemoteTest,0,1,606.0823,157.652,3.844432,8,1,1,2021/07/13 15:10,
+7,UITest,0,6,722.0792,185.3295,3.896192,8,2,0,2021/07/13 15:25,
+7,HandleTest,0,12,594.5508,143.6405,4.13916,8,1,1,2021/07/13 15:39,
+7,LabTest,0,14,865.9753,230.5612,3.755945,5,4,1,2021/07/13 15:48,

+ 5 - 0
User Study/TestResult/8.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time,
+8,UITest,0,2,851.3873,235.9863,3.607783,8,2,0,2021/07/17 14:05,
+8,HandleTest,0,39,849.792,120.6477,7.04358,7,3,0,2021/07/17 14:14,
+8,LabTest,0,49,749.8241,245.2149,3.057825,6,4,0,2021/07/17 14:29,
+8,RemoteTest,58,1,544.5705,131.775,4.13257,10,0,0,2021/07/17 14:43,

+ 5 - 0
User Study/TestResult/9.csv

@@ -0,0 +1,5 @@
+participant,condition,Remained Time,Collision,Drive Distance,Total driving time,Adverage speed,Rescued Target,Remained Visible Target,Remained Unvisible Target,time,
+9,HandleTest,0,18,442.3194,80.0824,5.523304,5,4,1,2021/07/17 15:01,
+9,LabTest,0,4,492.4148,221.2128,2.225978,4,5,1,2021/07/17 15:20,
+9,RemoteTest,7,0,500.6959,123.5738,4.051796,10,0,0,2021/07/17 15:32,
+9,UITest,0,2,600.5369,235.7786,2.547038,7,2,1,2021/07/17 15:42,

BIN
User Study/TestResult/Rescue situation.png


BIN
User Study/TestResult/Robot Performance.png


+ 4 - 4
User Study/TestResult/statistic.py

@@ -113,7 +113,7 @@ def drawRobotPerformance():
     plt.savefig("Robot Performance",dpi=300)
 
 def drawRescue():
-    plt.figure(figsize =(15,7))
+    plt.figure(figsize =(7,7))
     x = np.arange(len(scales))
     total_width, n = 0.8, 4
     width = total_width / n
@@ -121,7 +121,7 @@ def drawRescue():
     error_params=dict(elinewidth=1,ecolor='black',capsize=5)
 
     # set range
-    plt.ylim(0, 10)
+    plt.ylim(0, 10.5)
 
     for i in range(0,4):
         result = []
@@ -170,8 +170,8 @@ a = 0.6
 # for scale in scales:
 #     draw(scale)
 
-
-scales = ["Rescued Target", "Remained Visible Target","Remained Unvisible Target"]
+#scales = ["Rescued Target", "Remained Visible Target","Remained Unvisible Target"]
+scales = ["Rescued Target", "Remained Visible Target"]
 drawRescue()
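
For reference, a minimal self-contained sketch of the grouped-bar pattern drawRescue implements: one group of bars per scale, one bar per condition, with standard-deviation error bars, matching the figsize, ylim, and error_params in the diff above. The condition names follow the study; the numeric values are placeholders, not the measured results:

```python
import matplotlib.pyplot as plt
import numpy as np

# Sketch of drawRescue's grouped bar chart with error bars. The means and
# stds below are placeholder values, not the study's measured results.
scales = ["Rescued Target", "Remained Visible Target"]
conditions = ["Lab", "Handle", "Remote", "UI"]
means = np.array([[4.0, 5.0], [7.0, 2.0], [8.0, 1.0], [8.0, 1.0]])
stds = np.array([[1.5, 1.5], [2.0, 1.5], [1.5, 1.0], [1.5, 1.0]])

plt.figure(figsize=(7, 7))
x = np.arange(len(scales))
total_width, n = 0.8, 4
width = total_width / n
error_params = dict(elinewidth=1, ecolor="black", capsize=5)
plt.ylim(0, 10.5)

for i, cond in enumerate(conditions):
    plt.bar(x + i * width, means[i], width=width, yerr=stds[i],
            error_kw=error_params, label=cond)

plt.xticks(x + (total_width - width) / 2, scales)
plt.legend()
plt.savefig("Rescue situation example.png", dpi=300)
```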