update thesis

Jia, Jingyi, 2 years ago
commit f724e9e7ed

BIN  .DS_Store
BIN  Concepts for operating ground based rescue robots using virtual reality/.DS_Store
BIN  Concepts for operating ground based rescue robots using virtual reality/Concepts_for_operating_ground_based_rescue_robots_using_virtual_reality.pdf

+ 1 - 1  Concepts for operating ground based rescue robots using virtual reality/chapters/result.tex

@@ -127,7 +127,7 @@ This section will discuss the feedback from participants. Overall, every partici
 %Handle
 Handle Mode drives the robot directly with the motion controllers, so the evaluation of this mode depends largely on the controllers' design. The proposed system uses the touchpad on the \gls{htc} motion controllers to steer the robot. More than half of the users felt that the touchpad limited their flexibility when steering: they often could not hit the intended position on the touchpad accurately, and accidental touches were frequent. At the end of the experiment, these participants were additionally invited to operate the robot again with \gls{vr} controllers that have thumbsticks, and they reported that thumbsticks made it easier to control the direction. Some participants said that they did not like the two monitoring screens provided by this mode: the additional screens subconsciously drew their attention and kept them from concentrating on the rescue mission. Others, however, found the monitors particularly helpful. Because it was very difficult to control the robot while teleporting, they first used the monitor screen to drive the robot to a location and then teleported themselves to the robot's position. The experiment also showed that participants tended to forget that the two monitor screens could be closed; they usually dragged the screens to places where they did not block the view and dragged them back when they wanted to use them.
 
-Remote Mode and UI Mode, which use the AI intelligent obstacle-avoidance driving algorithm, were the best received. Participants felt that in both modes they did not have to worry about controlling the robot's steering and forward speed; the computer took care of everything, allowing them to focus on exploring the virtual world.
+Remote Mode and UI Mode, which use the intelligent obstacle-avoidance driving algorithm, were the best received. Participants felt that in both modes they did not have to worry about controlling the robot's steering and forward speed; the computer took care of everything, allowing them to focus on exploring the virtual world.
 
 For UI Mode, one of the participants remarked: "\textit{I can just let the robot follow me. I don't need to think about how to operate the robot. This way I can concentrate on the rescue.}" In the user study, all participants learned to operate the UI menu easily, presumably because the menu interface was already very familiar to them. Notably, none of the participants used the direction buttons or the monitoring screens in the virtual menu. At the beginning of the test, they all turned on the follow function immediately and set the robot's driving speed to the maximum; after that, the robot behaved more like a movable \gls{lidar} sensor. As a result, these participants could disregard the robot's location entirely and explore the \gls{vr} world on their own. One participant teleported so quickly that after he had reached a location and waited for a while, the robot was still on its way. In fact, the problem of not being able to find the robot occurred in Handle Mode as well.
 

BIN  Hector_v2/.DS_Store
BIN  Hector_v2/Assets/.DS_Store
BIN  Hector_v2/Assets/Asterix/.DS_Store
BIN  Hector_v2/Assets/Asterix/hector_components_description/.DS_Store
BIN  Hector_v2/Library/.DS_Store
BIN  User Study/.DS_Store