
update thesis

Jia, Jingyi 2 years ago
parent
commit
0aab30f9af

BIN
.DS_Store


BIN
Hector_v2/Assets/.DS_Store


+ 45 - 25
Thesis.md

@@ -62,12 +62,14 @@ However, there remains a need to explore human-computer interaction patterns and
 >
 > - General content of the survey
 
-For this purpose, this paper presents a preliminary VR-based system for the simulation of ground rescue robots with four different modes of operation and corresponding test scenes imitating a post-disaster city. The test scene simulates a robot collaborating with Unity to construct a virtual 3D scene. The robot has a simulated radar, which makes the display of the scene dependent on the robot's movement. In order to find an interaction approach that is as intuitive and low mental fatigue as possible, a user study was executed after the development was completed.
+For this purpose, this paper presents a preliminary VR-based system for simulating ground rescue robots, with four different operation modes and corresponding test scenes that imitate a post-disaster city. The test scene simulates a robot that collaborates with Unity to construct a virtual 3D scene: the robot carries a simulated LiDAR remote sensor, so the displayed portion of the scene depends on the robot's movement. In order to find an interaction approach that is as intuitive as possible and causes little mental fatigue, a user study was conducted after the development was completed.
 
 
 
 > ##### Paper Architecture
 
+Chapter \ref{related} discusses research on the integration of VR and human-computer interaction.
+
 Chapter \ref{implementation} provides details of the proposed system, including the techniques used for the different interaction modes and the structure of the test scenes.
 Chapter \ref{evaluate} describes the design and procedure of the user study.
 
@@ -77,9 +79,14 @@ Finally, in Chapter \ref{conclusion}, conclusions and future work are summarized
 
 
 
-# (Related Work)
+# Related Work
+
+In this chapter, research on the integration of VR and human-computer interaction is discussed, and the relevant literature and its contributions are briefly presented. The topic of VR and human-computer integration is an open research field with many different focus perspectives.
+Robotic manipulation platforms combined with virtual worlds have several application scenarios. They can be used, for example, to train operators or to collaborate directly with real robots. Matsas et al. \cite{Matsas:2017aa} provided a VR-based training system using hand recognition: Kinect cameras capture the user's positions and motions, and a virtual user model is constructed in the VR environment from the collected data to operate robots as well as virtual objects, such as buttons. Users learn how to operate the robot in the VR environment. The VR framework proposed by Pérez et al. \cite{Perez:2019ub} is likewise applied to train operators to control a robot. Since the environment does not need to change in real time, but rather must realistically recreate the factory scene, the VR scene is not captured and rendered in real time; instead, a highly accurate 3D environment was reconstructed in advance with Blender after being captured with a 3D scanner.
 
+Building 3D scenes in virtual worlds from information collected by robots is also a research highlight. Wang et al. \cite{Wang:2017uy} were concerned with the visualization of a rescue robot and its surroundings in a virtual environment. Their human-robot interaction system uses an incremental 3D-NDT map to render the robot's surroundings in real time. The user can view the robot's surroundings in a first-person view through the HTC Vive and send control commands with the controller's arrow keys. The VR-based system provided by Stotko et al. \cite{Stotko:2019ud}, a robot-based live telepresence and teleoperation system, uses a distributed architecture to reconstruct the 3D scene. The data collected by the robot is first transmitted to a client responsible for reconstructing the scene. After this client has constructed the 3D scene, the set of actively reconstructed visible voxel blocks is sent to a server responsible for communication, which broadcasts the data to the client used by the operator, thus enabling an immersive visualization of the robot within the scene.
 
+Others are more concerned with the manipulation of a robotic arm mounted on the robot. Moniri et al. \cite{Moniri:2016ud} provided a VR-based operating model for a robotic arm. A user wearing a headset can see a simulated 3D scene at the robot's end and send pickup commands to the remote robot by clicking on the target object with the mouse. The system proposed by Ostanin et al. \cite{Ostanin:2020uo} is also worth mentioning: although their system for operating a robotic arm is based on mixed reality (MR), the article is highly relevant to this paper, given the closeness of MR to VR and the detail in which their system describes the combination of ROS and robotics. In their system, ROS Kinetic was used as middleware and was responsible for communicating with the robot and the Unity side. The user can control the movement of the robot arm by selecting predefined options in a menu. In addition, the trajectory and target points of the robot arm can be set by clicking on a hologram with a series of control points.
 
 # Implementation
 
@@ -94,7 +101,7 @@ In this chapter, the tools and techniques used in building this human-computer c
 > - the purpose of the unity project
 > - Components of the project: 4 operation modes & test Scene
 
-The main goal of this work is to design and implement a VR-based human-robot collaboration system with different methods of operating the robot in order to find out which method of operation is more suitable to be used to control the rescue robot. Further, it is to provide some basic insights for future development directions and to provide a general direction for finding an intuitive, easy-to-use and efficient operation method. Therefore, the proposed system was developed using Unity, including four modes of operation and a corresponding test environment for simulating post-disaster scenarios. In each operation mode, the user has a different method to control the robot. In addition, in order to better simulate the process by which the robot scans its surroundings and the computer side cumulatively gets a reconstructed 3D virtual scene, the test environment was implemented in such a way that the picture seen by the user depends on the direction of the robot's movement and the trajectory it travels through.
+The main goal of this work is to design and implement a VR-based human-robot collaboration system with different methods of operating the robot, in order to find out which operation method is most suitable for controlling a rescue robot. Further, it is intended to provide some basic insights for future development and a general direction for finding an intuitive, easy-to-use and efficient operation method. The proposed system was therefore developed in Unity and includes four operation modes and a corresponding test environment for simulating post-disaster scenarios. In each operation mode, the user has a different method of controlling the robot. In addition, to better simulate the process by which the robot scans its surroundings while the computer side cumulatively obtains a reconstructed 3D virtual scene, the test environment was implemented in such a way that what the user sees depends on the robot's movement and the trajectory it travels.
 
 
 
@@ -126,7 +133,7 @@ Unity was chosen as the platform to develop the system. Unity is a widely used g
 >
 > information
 
-To simulate the process of a robot using a probe camera to detect the real environment and synchronise it to Unity, a conical collision body was set up on the robot. The robot will transform the Layers of the objects in the scene into visible Layers by collision detection and a trigger event (onTriggerEnter function). The robot's driving performance, such as the number of collisions, average speed, total distance, etc., will be recorded in each test. The detailed recorded information can be seen in Fig.\ref{fig:uml}. The movement of the robot depends on the value of the signal that is updated in each mode. In addition, the robot's Gameobject has the NavMeshAgent \footnote{https://docs.unity3d.com/ScriptReference/AI.NavMeshAgent.html} component, which supports the robot's navigation to the specified destination with automatic obstacle avoidance in the test scene. The virtual robot has three cameras. Each camera is set up in such a way that it can only see the area detected by the robot's radar and change the image bound to it in real time. The four operations described later all use the camera viewport as a monitoring screen by rendering the camera viewport on UI canvas.
+To simulate the process of a robot using a LiDAR remote sensor to detect the real environment and synchronise it to Unity, a spherical collision body was set up on the robot. Through collision detection and a trigger event (the OnTriggerEnter function), the robot moves the objects in the scene onto visible Layers. The robot's driving performance, such as the number of collisions, average speed and total distance, is recorded in each test; the detailed recorded information can be seen in Fig.\ref{fig:uml}. The movement of the robot depends on the value of the signal that is updated in each mode. In addition, the robot's GameObject has a NavMeshAgent \footnote{https://docs.unity3d.com/ScriptReference/AI.NavMeshAgent.html} component, which supports navigation to a specified destination with automatic obstacle avoidance in the test scene. The virtual robot has three cameras. One camera simulates a surveillance camera mounted on the robot and can see all the items in the scene, including distant items that have not yet been detected by the LiDAR. The other two cameras are set up so that they can only see the area already detected by the robot's LiDAR remote sensor. Each camera captures what it sees and updates the image bound to it in real time. The four operation modes described later all use the camera viewports as monitoring screens by rendering them on UI canvases.
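A minimal Unity C# sketch of this trigger-based visibility switch is shown below; the class and layer names are illustrative and not taken from the actual project:

```csharp
using UnityEngine;

// Sketch: spherical trigger on the robot that simulates the LiDAR range.
// Objects entering the trigger are moved onto a "Visible" layer so that the
// two restricted cameras start rendering them. Names are illustrative.
public class LidarVisibilityTrigger : MonoBehaviour
{
    private int visibleLayer;

    private void Awake()
    {
        visibleLayer = LayerMask.NameToLayer("Visible");
    }

    // Called by Unity when an object enters the robot's sphere trigger.
    private void OnTriggerEnter(Collider other)
    {
        SetLayerRecursively(other.gameObject, visibleLayer);
    }

    // Layers apply per GameObject, so children must be updated as well.
    private static void SetLayerRecursively(GameObject obj, int layer)
    {
        obj.layer = layer;
        foreach (Transform child in obj.transform)
            SetLayerRecursively(child.gameObject, layer);
    }
}
```

The restricted cameras would then exclude the default layer from their culling mask and render only the "Visible" layer.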
 
 
 
@@ -162,14 +169,14 @@ In order to improve the reusability of the code and to facilitate the management
 > - functions: how to move robot, camera, map...
 > - photo
 
-In this mode, the user is controlling the robot's movement directly through the motion controller in the right hand. The touch pad of the motion controller determines the direction of rotation of the robot. The user can control the robot's driving speed by pulling the Trigger button. With the right-hand menu button, the surveillance screen around the robot can be turned on or off. The monitor window can be adjusted to a suitable position by dragging and rotating it. In the literature dealing with VR and human-computer collaboration, many researchers have used a similar operational approach. Therefore, as a widely used, and in a sense default operation approach, this mode was designed and became one of the proposed operation modes.
+In this mode, the user controls the robot's movement directly with the motion controller in the right hand. The touchpad of the motion controller determines the direction of rotation of the robot, and the user controls the robot's driving speed by pulling the trigger button. Fig.\ref{fig:htc} shows how the values are obtained from the HTC motion controller: the rotation direction is read from the touchpad's X-axis, whose value lies in $[-1,1]$, and the forward speed is read from the trigger button, passed in as a variable of type SteamVR_Action_Single with a range of $[0,1]$. With the right-hand menu button, the surveillance screens around the robot can be turned on or off, and the monitor window can be adjusted to a suitable position by dragging and rotating it. In the literature dealing with VR and human-robot collaboration, many researchers have used a similar operational approach. As a widely used and, in a sense, default approach, this mode was therefore designed as one of the proposed operation modes.
 
 ```latex
 \begin{figure}[h]
     \centering
-    \includegraphics[height=12cm]{graphics/htc controller.png}
-    \caption{HTC handle illustration. The robot rotation direction will read the value of the touchpad X-axis. The range of values is $[-1,1]$. Forward speed reads the Trigger button passed in as a variable of type SteamVR_Action_Single, and the range of the variable is $[0,1]$.}
-    \label{fig:uml}
+    \includegraphics[height=12cm]{graphics/htc.png}
+    \caption{HTC motion controller illustration.}
+    \label{fig:htc}
 \end{figure}
 ```
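A sketch of how these values could be read with the SteamVR 2.x input system follows; the action fields are assumptions that would normally be bound in the Unity inspector:

```csharp
using UnityEngine;
using Valve.VR;

// Sketch of the Handle Mode input mapping (SteamVR 2.x input system).
// The action references are assumptions, normally assigned in the inspector.
public class HandleModeController : MonoBehaviour
{
    public SteamVR_Action_Vector2 touchpad; // touchpad position, x in [-1, 1]
    public SteamVR_Action_Single trigger;   // trigger pull, value in [0, 1]

    public float maxSpeed = 2f;   // m/s, illustrative value
    public float turnSpeed = 60f; // deg/s, illustrative value

    private void Update()
    {
        // Rotation direction from the touchpad X-axis of the right hand.
        float turn = touchpad.GetAxis(SteamVR_Input_Sources.RightHand).x;
        // Forward speed from how far the trigger is pulled.
        float speed = trigger.GetAxis(SteamVR_Input_Sources.RightHand) * maxSpeed;

        transform.Rotate(0f, turn * turnSpeed * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```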
 
@@ -181,7 +188,7 @@ In this mode, the user is controlling the robot's movement directly through the
 > - functions: how to move robot, button, speed editor, auto drive 3 monitor....
 > - photo
 
-The original intention of designing this mode is that there is a part of the literature where the immersive human-robot collaborative framework are used to train operators how to operate the robot, avoiding risks and saving learning costs or directly as a platform for operating the robot \cite{Perez:2019ub}\cite{Matsas:2017vz}. Therefore, in this mode, a virtual laboratory environment is constructed, in which simulated buttons, controllers, and monitoring equipment are placed. The laboratory consists of two parts. The first part is the monitoring equipment: the monitoring screen is enlarged and placed at the front of the lab as a huge display. The second part is the operating console in the center of the laboratory, which can be moved by the user as desired. The user can use the buttons on the right side to lock the robot or let it walk forward automatically. In the middle of the console are two operating joysticks that determine the robot's forward motion and rotation respectively. The part that involves virtual joystick movement and button effects uses an open source github project VRtwix\footnote{https://github.com/rav3dev/vrtwix}. With the sliding stick on the left, the user can edit the speed of the robot's forward movement and rotation.
+This mode was designed because part of the literature uses immersive human-robot collaboration frameworks to train operators to operate a robot, avoiding risk and reducing learning costs, or directly as a platform for operating the robot \cite{Perez:2019ub}\cite{Matsas:2017aa}. In this mode, a virtual laboratory environment is therefore constructed, in which simulated buttons, controllers and monitoring equipment are placed. The laboratory consists of two parts. The first part is the monitoring equipment: the monitoring screen is enlarged and placed at the front of the lab as a huge display. The second part is the operating console in the center of the laboratory, which the user can move as desired. The buttons on the right side let the user lock the robot or let it drive forward automatically. In the middle of the console are two joysticks that determine the robot's forward motion and rotation, respectively. The virtual joystick movement and button effects use the open-source GitHub project VRtwix\footnote{https://github.com/rav3dev/vrtwix}. With the sliding stick on the left, the user can edit the speed of the robot's forward movement and rotation.
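How a joystick lever could be translated into the robot's movement signal is sketched below; VRtwix ships its own grabbable lever components, so the names and the assumed tilt range here are illustrative only:

```csharp
using UnityEngine;

// Sketch of mapping a console joystick lever to the robot's forward signal.
// Assumes the lever tilts around its local X-axis by at most maxTiltDegrees.
public class ConsoleJoystickMapper : MonoBehaviour
{
    public Transform lever;            // the grabbable joystick lever
    public Transform robot;            // the robot to drive
    public float maxTiltDegrees = 30f; // assumed mechanical range of the lever
    public float maxSpeed = 2f;        // editable via the sliding speed stick

    private void Update()
    {
        // Read the lever tilt and normalize it to [-1, 1].
        float angle = lever.localEulerAngles.x;
        if (angle > 180f) angle -= 360f; // map [0, 360) to [-180, 180)
        float deflection = Mathf.Clamp(angle / maxTiltDegrees, -1f, 1f);

        // Forward speed proportional to deflection; the second joystick
        // would drive the rotation signal in the same way.
        robot.Translate(Vector3.forward * deflection * maxSpeed * Time.deltaTime);
    }
}
```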
 
 ##### 3. Remote Mode
 
@@ -189,6 +196,9 @@ The original intention of designing this mode is that there is a part of the lit
 > - functions: how to move robot: target(Pseudocode?) or virtural joystick. ItemPackage in Steam
 > - photo
 
+In this mode, the user can either set a driving target point directly or control the robot by picking up the remote control placed on the toolbar. The target point is set with the ray emitted by the right motion controller, a process similar to setting a teleportation point. After the target point is set, a square representing the destination is shown in the scene, and the robot automatically travels to it. The entire driving process uses the NavMeshAgent component and is therefore capable of automatic obstacle avoidance.
+By clicking on the menu button, a movable toolbar is opened that holds a remote control and a monitoring device. The remote control is a safety precaution in case the automatic navigation fails to reach the target point properly; the user can adjust the direction of the robot's travel with it. The pickup and auto-release parts use the ItemPackage component available in the SteamVR plugin.
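The target-point part of this mode could look as follows; the field names are illustrative, while NavMeshAgent.SetDestination is the standard Unity call for this kind of navigation:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of Remote Mode target setting: a ray from the right controller picks
// a point, a destination marker is shown, and the NavMeshAgent drives there
// with automatic obstacle avoidance. Field names are illustrative.
public class RemoteTargetSetter : MonoBehaviour
{
    public Transform rightController;  // pose of the right motion controller
    public NavMeshAgent robotAgent;    // NavMeshAgent on the robot
    public GameObject destinationMarker;

    // Called when the user confirms a target, e.g. on trigger release.
    public void SetTarget()
    {
        if (Physics.Raycast(rightController.position, rightController.forward,
                            out RaycastHit hit, 100f))
        {
            destinationMarker.transform.position = hit.point;
            destinationMarker.SetActive(true);
            robotAgent.SetDestination(hit.point); // plans an avoiding path
        }
    }
}
```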
+
 
 
 ##### 4. UI Mode
@@ -197,7 +207,7 @@ The original intention of designing this mode is that there is a part of the lit
 > - functions: introcude here compositions of the UI menu
 > - photo
 
-
+The virtual menu is also an interaction method often used in VR, which is why this mode was proposed. In this mode, the user interacts with the virtual menu using the ray emitted by the right motion controller. The virtual menu contains buttons for the direction of movement, a speed controller, and buttons to open and close the monitor screens. In addition, a follow function was added to the menu, which lets the robot follow the user's position in the virtual world. This is intended to let the user concentrate on observing the rendered VR environment; moreover, having a real robot follow the user's location in the virtual world would be a novel mode of human-robot interaction in VR. The robot's automatic navigation again uses the NavMeshAgent.
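The follow function essentially amounts to re-planning a path to the user's head position whenever the robot falls behind; a minimal sketch with illustrative names:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of the UI Mode follow function: the robot keeps re-planning a path
// to the user's position in the virtual world. Names are illustrative.
public class FollowUser : MonoBehaviour
{
    public NavMeshAgent robotAgent;
    public Transform vrCamera;        // the user's head position
    public float followDistance = 2f; // stop short of the user, illustrative

    private void Update()
    {
        Vector3 toUser = vrCamera.position - robotAgent.transform.position;
        if (toUser.magnitude > followDistance)
            robotAgent.SetDestination(vrCamera.position);
        else
            robotAgent.ResetPath(); // close enough, stop moving
    }
}
```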
 
 
 
@@ -205,7 +215,7 @@ The original intention of designing this mode is that there is a part of the lit
 
 > - goal of the project: rescue robots  => destroyed city,
 > - environment:  destroyed city & Collider for test  [photo] 
-> - radar layer
+> - LiDAR layer
 
 In order to simulate the use of rescue robots in disaster scenarios, the test scenes were built to mimic a post-disaster urban environment as closely as possible. The POLYGON Apocalypse\footnote{https://assetstore.unity.com/packages/3d/environments/urban/polygon-apocalypse-low-poly-3d-art-by-synty-154193}, available on the Unity Asset Store, is a low-poly asset pack with a large number of models of buildings, streets, vehicles, etc. Using this resource pack as a base, additional collision bodies of appropriate size were manually added to each building and obstacle after the pack was imported; these are needed to track the robot's collisions in the subsequent tests.
 
@@ -221,13 +231,13 @@ The entire scene is initially invisible, and the visibility of each objects in t
 >
 > - Evaluate user experience and robot performance in different operating modes
 
-This chapter describes the design and detailed process of the user evaluation. The purpose of this user study is to measure the impact of four different modes of operation on rescue efficiency, robot driving performance, and psychological and physiological stress and fatigue, etc. For this purpose, participants are asked to find victims in a test scene using different modes of operation and to answer a questionnaire after the test corresponding to each mode of operation.
+This chapter describes the design and detailed procedure of the user evaluation. The purpose of this user study is to measure the impact of the four operation modes on rescue efficiency, robot driving performance, and psychological and physiological stress and fatigue. For this purpose, participants are asked to find victims in a test scene using the different operation modes and to answer a questionnaire after the test of each mode.
 
 
 
 ## Study Design
 
-The evaluation for each mode of operation consists of two main parts. The first part is the data recorded during the process of the participant driving the robot in the VR environment to find the victims. The recorded data includes information about the robot's collision and the speed of driving etc. The rescue of the victims was also considered as part of the evaluation. Besides the number of victims rescued, the number of victims who were already visible but neglected is also important data. The Official NASA Task Load Index (TLX) was used to measure the participants subjective workload asessments. Additionally, participants were asked specific questions for each mode of operation and were asked to select their favorite and least favorite modes of operation.
+The evaluation of each operation mode consists of two main parts. The first part is the data recorded while the participant drives the robot in the VR environment to find victims; the recorded data includes information about the robot's collisions, driving speed, etc. The rescue of victims was also considered part of the evaluation. The official NASA Task Load Index (TLX) was used to measure the participants' subjective workload assessments. Additionally, participants were asked specific questions about each mode and were asked to select their favorite and least favorite operation mode. In order to reduce the influence of order effects on the test results, a Balanced Latin Square was used to arrange the test order of the four operation modes.
 
 
 
@@ -246,30 +256,39 @@ Before the beginning of the actual testing process, participants were informed o
 ##### Entering the world of VR
 
 > 1. wear the headset
-> 2. familiar with the menu : switch & select mode(practice) & close
-> 3. change position : teleport & raise or lower
-> 4. rescue 1 victim
+>
+> 2. familiar with the basic VR actions:
+>
+> 	- open & close Menu
+>
+> 	- change position : teleport & raise or lower
+>
+> 3. rescue 1 victim
 
-After the basic introduction part, participants would directly put on the VR headset and enter the VR environment to complete the rest of the tutorial. Considering that participants might not have experience with VR and that it would take time to learn how to operate the four different modes, the proposed system additionally sets up a practice pattern and places some models of victims in the practice scene. After entering the VR world, participants first needed to familiarize themselves with the opening and selecting options of the menu, as this involves switching between different modes and entering the test scenes. Then participants would use the motion controllers to try to teleport themselves, or raise themselves into mid-air. Finally participants were asked to interact with the victim model through virtual hands. After this series of general tutorials, participants were already generally familiar with the use of VR and how to move around in the VR world.
+After the basic introduction, participants put on the VR headset and entered the VR environment to complete the rest of the tutorial. Considering that participants might not have experience with VR and that it takes time to learn how to operate the four different modes, the proposed system additionally provides a practice mode and places some victim models in the practice scene. After entering the VR world, participants first familiarized themselves with opening and closing the menu, as well as with using the motion controllers to teleport or to raise themselves into mid-air. Finally, participants were asked to interact with a victim model through the virtual hands. After this series of general tutorials, participants were generally familiar with the use of VR and with moving around in the VR world.
 
 
 
 ##### Practice and evaluation of modes
 
 > 1. `foreach Mode`:
-> 	1. enter mode(practice)
-> 	2. try to move the robot
-> 	3. try to rescue 1-2 victims
-> 	4. enter test scene
-> 	5. -testing- 
-> 	6. Fill out the questionnaire: google form + TLX
+> 	1. practice
+> 		- try to move the robot
+> 		- try to rescue 1-2 victims
+> 	2. enter test scene
+> 	3. -testing- 
+> 	4. Fill out the questionnaire: google form + TLX
 >
 > 2. summary part of google form: 
 > 	- like/dislike most
 > 	- reason 
 > 	- feedback
 
-Given the different manipulation approaches for each mode, in order to avoid confusion between the different modes, participants would then take turns practicing and directly evaluating each mode immediately afterwards. The participant first switched to the mode of operation to be tested and manipulated the robot to move in that mode. After attempting to rescue 1-2 victim models and the participant indicated that he or she was familiar enough with this operation mode, the participant would enter the test scene. In the test scene, participants had to save as many victims as possible in a given time limit. Participants were required to move the robot around the test scene to explore the post-disaster city and to find and rescue victims. In this process, if the robot crashes with buildings, obstacles, etc., besides the collision information being recorded as test data, participants would also receive sound and vibration feedback. The test will automatically end when time runs out or when all the victims in the scene have been rescued. Participants were required to complete the evaluation questionnaire and the NASA evaluation form at the end of each test. This process was repeated in each mode of operation. 
+Given the different manipulation approaches of the modes, and in order to avoid confusion between them, participants practiced each mode and evaluated it directly afterwards, one mode at a time.
+
+The sequence of modes to be tested is predetermined. The order effect is an important factor affecting the test results: if the order of the tested operation modes were the same for every participant, the psychological and physical exhaustion caused by the last mode would inevitably be greater. In order to minimize the influence of the order effect on the results, a Balanced Latin Square of size four was used to arrange the test order of the four operation modes.
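For four conditions, the standard balanced-Latin-square construction assigns participant r (counting from zero) row r mod 4 of the square; a sketch in C#, matching the language of the Unity code base:

```csharp
// Sketch of the standard balanced-Latin-square construction for an even
// number of conditions n: even positions count up, odd positions count down.
public static class BalancedLatinSquare
{
    // Returns the condition order for participant r (0-based), n conditions.
    public static int[] Row(int r, int n)
    {
        var row = new int[n];
        for (int j = 0; j < n; j++)
        {
            row[j] = j % 2 == 0
                ? (r + j / 2) % n            // even positions count up
                : (r + n - (j + 1) / 2) % n; // odd positions count down
        }
        return row;
    }
}
// For n = 4 the rows are [0,3,1,2], [1,0,2,3], [2,1,3,0], [3,2,0,1]:
// each mode appears once per position, and each ordered pair of modes
// occurs exactly once across the four rows.
```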
+
+Participants entered the practice scene of the relevant operation mode automatically, in the predefined order. After attempting to rescue one or two victim models, and once the participant indicated that he or she was sufficiently familiar with the operation mode, the participant entered the test scene. There, participants had to save as many victims as possible within a given time limit. They were required to move the robot around the test scene to explore the post-disaster city and to find and rescue victims. If the robot crashed into buildings, obstacles, etc. during this process, the collision information was recorded as test data and participants also received sound and vibration feedback. The test ended automatically when time ran out or when all the victims in the scene had been rescued. Participants were required to complete the evaluation questionnaire and the NASA-TLX form at the end of each test. This process was repeated for each operation mode.
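The collision feedback could be sketched as follows; the haptic action reference is an assumption (it would be bound in the inspector), while OnCollisionEnter and SteamVR_Action_Vibration.Execute are standard Unity and SteamVR plugin calls:

```csharp
using UnityEngine;
using Valve.VR;

// Sketch of the collision feedback during testing, attached to the robot
// (which carries a Rigidbody): each crash is counted for the test log and a
// short haptic pulse plus a sound is played. Names are illustrative.
public class CollisionFeedback : MonoBehaviour
{
    public SteamVR_Action_Vibration haptics;
    public AudioSource crashSound;
    public int collisionCount; // recorded as part of the test data

    private void OnCollisionEnter(Collision collision)
    {
        collisionCount++;
        crashSound.Play();
        // 0.1 s pulse on the right hand:
        // Execute(secondsFromNow, duration, frequency, amplitude, source).
        haptics.Execute(0f, 0.1f, 80f, 0.8f, SteamVR_Input_Sources.RightHand);
    }
}
```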
 
 After all the tests were completed, participants were asked to compare the four operation modes and select the one they liked the most and the one they liked the least. In addition, participants could give their reasons for the choice and express their opinions as much as they wanted, such as suggestions for improvement or problems found during operation.
 
@@ -311,4 +330,5 @@ After all the tests were completed, participants were asked to compare the four
 >
 > - communication with ROS
 > - Real Robots
-> - Use real scene: reconstructed 3D model based on the daten from robot sensors
+> - Use real scene: reconstructed 3D model based on the data from robot sensors
+

BIN
Thesis_LaTeX/.DS_Store


+ 20 - 14
Thesis_LaTeX/BibTexDatei.bib

@@ -1,13 +1,32 @@
 %% This BibTeX bibliography file was created using BibDesk.
 %% https://bibdesk.sourceforge.io/
 
-%% Created for 夜未央 at 2021-07-01 15:35:33 +0200 
+%% Created for 夜未央 at 2021-07-04 13:54:41 +0200 
 
 
 %% Saved with string encoding Unicode (UTF-8) 
 
 
 
+@article{Matsas:2017aa,
+	abstract = {This paper presents a highly interactive and immersive Virtual Reality Training System (VRTS) (``beWare of the Robot'') in terms of a serious game that simulates in real-time the cooperation between industrial robotic manipulators and humans, executing simple manufacturing tasks. The scenario presented refers to collaborative handling in tape-laying for building aerospace composite parts. The tools, models and techniques developed and used to build the ``beWare of the Robot''application are described. System setup and configuration are presented in detail, as well as user tracking and navigation issues. Special emphasis is given to the interaction techniques used to facilitate implementation of virtual human--robot (HR) collaboration. Safety issues, such as contacts and collisions are mainly tackled through ``emergencies'', i.e. warning signals in terms of visual stimuli and sound alarms. Mental safety is of utmost priority and the user is provided augmented situational awareness and enhanced perception of the robot's motion due to immersion and real-time interaction offered by the VRTS as well as by special warning stimuli. The short-term goal of the research was to investigate users'enhanced experience and behaviour inside the virtual world while cooperating with the robot and positive pertinent preliminary findings are presented and briefly discussed. In the longer term, the system can be used to investigate acceptability of H--R collaboration and, ultimately, serve as a platform for programming collaborative H--R manufacturing cells.},
+	author = {Matsas, Elias and Vosniakos, George-Christopher},
+	da = {2017/05/01},
+	date-added = {2021-07-04 13:54:19 +0200},
+	date-modified = {2021-07-04 13:54:19 +0200},
+	doi = {10.1007/s12008-015-0259-2},
+	id = {Matsas2017},
+	isbn = {1955-2505},
+	journal = {International Journal on Interactive Design and Manufacturing (IJIDeM)},
+	number = {2},
+	pages = {139--153},
+	title = {Design of a virtual reality training system for human--robot collaboration in manufacturing tasks},
+	ty = {JOUR},
+	url = {https://doi.org/10.1007/s12008-015-0259-2},
+	volume = {11},
+	year = {2017},
+	Bdsk-Url-1 = {https://doi.org/10.1007/s12008-015-0259-2}}
+
 @book{Stotko:2019ud,
 	author = {Stotko, Patrick and Krumpen, Stefan and Schwarz, Max and Lenz, Christian and Behnke, Sven and Klein, Reinhard and Weinmann, Michael},
 	date = {2019/11/01},
@@ -396,19 +415,6 @@
 	year1 = {2020},
 	Bdsk-Url-1 = {https://doi.org/10.1080/0951192X.2019.1690685}}
 
-@article{Matsas:2017vz,
-	author = {Matsas, Elias and Vosniakos, George-Christopher},
-	date = {2017},
-	date-added = {2021-06-02 14:34:55 +0200},
-	date-modified = {2021-06-02 14:34:55 +0200},
-	journal = {International Journal on Interactive Design and Manufacturing (IJIDeM)},
-	number = {2},
-	pages = {139-153 %@ 1955-2513},
-	publisher = {Springer},
-	title = {Design of a virtual reality training system for human--robot collaboration in manufacturing tasks},
-	volume = {11},
-	year = {2017}}
-
 @article{SOARES20151656,
 	abstract = {The remote handling (RH) plays an important role in nuclear test facilities, such as in ITER, for in-vessel and ex-vessel maintenance operations. Unexpected situations may occur when RH devices fail. Since no human being is allowed during the RH operations, a Multi-purpose Rescue Vehicle (MPRV) must be required for providing support in site. This paper proposes a design of a MPRV, i.e., a mobile platform equipped with different sensors and two manipulators with different sets of end-effectors. A human--machine interface is also proposed to remotely operate the MPRV and to carry out rescue and recovery operations.},
 	author = {Jo{\~a}o Soares and Alberto Vale and Rodrigo Ventura},

+ 72 - 44
Thesis_LaTeX/Thesis_Jingyi.aux

@@ -28,11 +28,11 @@
 \babel@aux{english}{}
 \babel@aux{ngerman}{}
 \babel@aux{english}{}
-\BKM@entry{id=1,dest={636861707465722E31},srcline={1},srcfile={2E2F63686170746572732F61627374726163742E746578}}{5C3337365C3337375C303030415C303030625C303030735C303030745C303030725C303030615C303030635C30303074}
+\BKM@entry{id=1,dest={636861707465722E31},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F61627374726163742E746578}}{5C3337365C3337375C303030415C303030625C303030735C303030745C303030725C303030615C303030635C30303074}
 \@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {1}Abstract}{4}{chapter.1}\protected@file@percent }
 \@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
-\BKM@entry{id=2,dest={636861707465722E32},srcline={1},srcfile={2E2F63686170746572732F696E74726F64756374696F6E2E746578}}{5C3337365C3337375C303030495C3030306E5C303030745C303030725C3030306F5C303030645C303030755C303030635C303030745C303030695C3030306F5C3030306E}
+\BKM@entry{id=2,dest={636861707465722E32},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696E74726F64756374696F6E2E746578}}{5C3337365C3337375C303030495C3030306E5C303030745C303030725C3030306F5C303030645C303030755C303030635C303030745C303030695C3030306F5C3030306E}
 \abx@aux@cite{Casper:2003tk}
 \abx@aux@segm{0}{0}{Casper:2003tk}
 \abx@aux@cite{Murphy:2012th}
@@ -68,57 +68,80 @@
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
 \abx@aux@cite{Wang:2017uy}
 \abx@aux@segm{0}{0}{Wang:2017uy}
-\BKM@entry{id=3,dest={636861707465722E33},srcline={1},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030495C3030306D5C303030705C3030306C5C303030655C3030306D5C303030655C3030306E5C303030745C303030615C303030745C303030695C3030306F5C3030306E}
-\BKM@entry{id=4,dest={73656374696F6E2E332E31},srcline={8},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C3030304F5C303030765C303030655C303030725C303030765C303030695C303030655C30303077}
-\BKM@entry{id=5,dest={73656374696F6E2E332E32},srcline={12},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030535C303030795C303030735C303030745C303030655C3030306D5C3030305C3034305C303030415C303030725C303030635C303030685C303030695C303030745C303030655C303030635C303030745C303030755C303030725C30303065}
+\BKM@entry{id=3,dest={636861707465722E33},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F72656C617465645F776F726B2E746578}}{5C3337365C3337375C303030525C303030655C3030306C5C303030615C303030745C303030655C303030645C3030305C3034305C303030575C3030306F5C303030725C3030306B}
+\abx@aux@cite{Matsas:2017aa}
+\abx@aux@segm{0}{0}{Matsas:2017aa}
+\abx@aux@cite{Perez:2019ub}
+\abx@aux@segm{0}{0}{Perez:2019ub}
+\abx@aux@cite{Wang:2017uy}
+\abx@aux@segm{0}{0}{Wang:2017uy}
+\abx@aux@cite{Stotko:2019ud}
+\abx@aux@segm{0}{0}{Stotko:2019ud}
+\abx@aux@cite{Moniri:2016ud}
+\abx@aux@segm{0}{0}{Moniri:2016ud}
+\abx@aux@cite{Ostanin:2020uo}
+\abx@aux@segm{0}{0}{Ostanin:2020uo}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {3}Related Work}{7}{chapter.3}\protected@file@percent }
+\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
+\@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
+\newlabel{related}{{3}{7}{Related Work}{chapter.3}{}}
+\BKM@entry{id=4,dest={636861707465722E34},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030495C3030306D5C303030705C3030306C5C303030655C3030306D5C303030655C3030306E5C303030745C303030615C303030745C303030695C3030306F5C3030306E}
+\BKM@entry{id=5,dest={73656374696F6E2E342E31},srcline={8},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C3030304F5C303030765C303030655C303030725C303030765C303030695C303030655C30303077}
+\BKM@entry{id=6,dest={73656374696F6E2E342E32},srcline={11},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030535C303030795C303030735C303030745C303030655C3030306D5C3030305C3034305C303030415C303030725C303030635C303030685C303030695C303030745C303030655C303030635C303030745C303030755C303030725C30303065}
 \abx@aux@cite{Whitney:2018wk}
 \abx@aux@segm{0}{0}{Whitney:2018wk}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {3}Implementation}{7}{chapter.3}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {4}Implementation}{9}{chapter.4}\protected@file@percent }
 \@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
-\newlabel{implementation}{{3}{7}{Implementation}{chapter.3}{}}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {3.1}Overview}{7}{section.3.1}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {3.2}System Architecture}{7}{section.3.2}\protected@file@percent }
-\BKM@entry{id=6,dest={73656374696F6E2E332E33},srcline={17},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030525C3030306F5C303030625C3030306F5C30303074}
-\BKM@entry{id=7,dest={73656374696F6E2E332E34},srcline={21},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030495C3030306E5C303030745C303030655C303030725C303030615C303030635C303030745C303030695C3030306F5C3030306E5C3030305C3034305C303030745C303030655C303030635C303030685C3030306E5C303030695C303030715C303030755C303030655C30303073}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {3.3}Robot}{8}{section.3.3}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {3.4}Interaction techniques}{8}{section.3.4}\protected@file@percent }
-\BKM@entry{id=8,dest={73756273656374696F6E2E332E342E31},srcline={40},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030485C303030615C3030306E5C303030645C3030306C5C303030655C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
-\BKM@entry{id=9,dest={73756273656374696F6E2E332E342E32},srcline={41},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C3030304C5C303030615C303030625C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
-\BKM@entry{id=10,dest={73756273656374696F6E2E332E342E33},srcline={42},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030525C303030655C3030306D5C3030306F5C303030745C303030655C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
-\BKM@entry{id=11,dest={73756273656374696F6E2E332E342E34},srcline={43},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030555C303030495C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
-\BKM@entry{id=12,dest={73656374696F6E2E332E35},srcline={46},srcfile={2E2F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030545C303030655C303030735C303030745C3030305C3034305C303030535C303030635C303030655C3030306E5C30303065}
-\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\contentsline {figure}{\numberline {3.1}{\ignorespaces UML Class diagram for the main structure of the system}}{9}{figure.3.1}\protected@file@percent }
-\newlabel{fig:uml}{{3.1}{9}{UML Class diagram for the main structure of the system}{figure.3.1}{}}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {3.4.1}Handle Mode}{10}{subsection.3.4.1}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {3.4.2}Lab Mode}{10}{subsection.3.4.2}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {3.4.3}Remote Mode}{10}{subsection.3.4.3}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {3.4.4}UI Mode}{10}{subsection.3.4.4}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {3.5}Test Scene}{10}{section.3.5}\protected@file@percent }
-\BKM@entry{id=13,dest={636861707465722E34},srcline={1},srcfile={2E2F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030455C303030765C303030615C3030306C5C303030755C303030615C303030745C303030695C3030306F5C3030306E5C3030305C3034305C3030306F5C303030665C3030305C3034305C303030555C303030735C303030655C303030725C3030305C3034305C303030455C303030785C303030705C303030655C303030725C303030695C303030655C3030306E5C303030635C30303065}
-\BKM@entry{id=14,dest={73656374696F6E2E342E31},srcline={9},srcfile={2E2F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030535C303030745C303030755C303030645C303030795C3030305C3034305C303030445C303030655C303030735C303030695C303030675C3030306E}
-\BKM@entry{id=15,dest={73656374696F6E2E342E32},srcline={15},srcfile={2E2F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030505C303030725C3030306F5C303030635C303030655C303030645C303030755C303030725C30303065}
-\BKM@entry{id=16,dest={73656374696F6E2E342E33},srcline={20},srcfile={2E2F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030455C3030306E5C303030745C303030655C303030725C303030695C3030306E5C303030675C3030305C3034305C303030745C303030685C303030655C3030305C3034305C303030775C3030306F5C303030725C3030306C5C303030645C3030305C3034305C3030306F5C303030665C3030305C3034305C303030565C30303052}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {4}Evaluation of User Experience}{11}{chapter.4}\protected@file@percent }
+\newlabel{implementation}{{4}{9}{Implementation}{chapter.4}{}}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.1}Overview}{9}{section.4.1}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.2}System Architecture}{9}{section.4.2}\protected@file@percent }
+\BKM@entry{id=7,dest={73656374696F6E2E342E33},srcline={16},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030525C3030306F5C303030625C3030306F5C30303074}
+\BKM@entry{id=8,dest={73656374696F6E2E342E34},srcline={20},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030495C3030306E5C303030745C303030655C303030725C303030615C303030635C303030745C303030695C3030306F5C3030306E5C3030305C3034305C303030745C303030655C303030635C303030685C3030306E5C303030695C303030715C303030755C303030655C30303073}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.3}Robot}{10}{section.4.3}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.4}Interaction techniques}{10}{section.4.4}\protected@file@percent }
+\BKM@entry{id=9,dest={73756273656374696F6E2E342E342E31},srcline={39},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030485C303030615C3030306E5C303030645C3030306C5C303030655C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
+\BKM@entry{id=10,dest={73756273656374696F6E2E342E342E32},srcline={49},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C3030304C5C303030615C303030625C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
+\abx@aux@cite{Perez:2019ub}
+\abx@aux@segm{0}{0}{Perez:2019ub}
+\abx@aux@cite{Matsas:2017aa}
+\abx@aux@segm{0}{0}{Matsas:2017aa}
+\BKM@entry{id=11,dest={73756273656374696F6E2E342E342E33},srcline={52},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030525C303030655C3030306D5C3030306F5C303030745C303030655C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {4.4.1}Handle Mode}{11}{subsection.4.4.1}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {4.4.2}Lab Mode}{11}{subsection.4.4.2}\protected@file@percent }
+\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\contentsline {figure}{\numberline {4.1}{\ignorespaces UML Class diagram for the main structure of the system}}{12}{figure.4.1}\protected@file@percent }
+\newlabel{fig:uml}{{4.1}{12}{UML Class diagram for the main structure of the system}{figure.4.1}{}}
+\BKM@entry{id=12,dest={73756273656374696F6E2E342E342E34},srcline={58},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030555C303030495C3030305C3034305C3030304D5C3030306F5C303030645C30303065}
+\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\contentsline {figure}{\numberline {4.2}{\ignorespaces HTC handle illustration.}}{13}{figure.4.2}\protected@file@percent }
+\newlabel{fig:htc}{{4.2}{13}{HTC handle illustration}{figure.4.2}{}}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {4.4.3}Remote Mode}{13}{subsection.4.4.3}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {subsection}{\numberline {4.4.4}UI Mode}{13}{subsection.4.4.4}\protected@file@percent }
+\BKM@entry{id=13,dest={73656374696F6E2E342E35},srcline={62},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F696D706C656D656E746174696F6E2E746578}}{5C3337365C3337375C303030545C303030655C303030735C303030745C3030305C3034305C303030535C303030635C303030655C3030306E5C30303065}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.5}Test Scene}{14}{section.4.5}\protected@file@percent }
+\BKM@entry{id=14,dest={636861707465722E35},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030455C303030765C303030615C3030306C5C303030755C303030615C303030745C303030695C3030306F5C3030306E5C3030305C3034305C3030306F5C303030665C3030305C3034305C303030555C303030735C303030655C303030725C3030305C3034305C303030455C303030785C303030705C303030655C303030725C303030695C303030655C3030306E5C303030635C30303065}
+\BKM@entry{id=15,dest={73656374696F6E2E352E31},srcline={8},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030535C303030745C303030755C303030645C303030795C3030305C3034305C303030445C303030655C303030735C303030695C303030675C3030306E}
+\BKM@entry{id=16,dest={73656374696F6E2E352E32},srcline={14},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030505C303030725C3030306F5C303030635C303030655C303030645C303030755C303030725C30303065}
+\BKM@entry{id=17,dest={73656374696F6E2E352E33},srcline={19},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030455C3030306E5C303030745C303030655C303030725C303030695C3030306E5C303030675C3030305C3034305C303030745C303030685C303030655C3030305C3034305C303030775C3030306F5C303030725C3030306C5C303030645C3030305C3034305C3030306F5C303030665C3030305C3034305C303030565C30303052}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {5}Evaluation of User Experience}{15}{chapter.5}\protected@file@percent }
 \@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
-\newlabel{evaluate}{{4}{11}{Evaluation of User Experience}{chapter.4}{}}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.1}Study Design}{11}{section.4.1}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.2}Procedure}{11}{section.4.2}\protected@file@percent }
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.3}Entering the world of VR}{11}{section.4.3}\protected@file@percent }
-\BKM@entry{id=17,dest={73656374696F6E2E342E34},srcline={25},srcfile={2E2F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030505C303030725C303030615C303030635C303030745C303030695C303030635C303030655C3030305C3034305C303030615C3030306E5C303030645C3030305C3034305C303030655C303030765C303030615C3030306C5C303030755C303030615C303030745C303030695C3030306F5C3030306E5C3030305C3034305C3030306F5C303030665C3030305C3034305C3030306D5C3030306F5C303030645C303030655C30303073}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {4.4}Practice and evaluation of modes}{12}{section.4.4}\protected@file@percent }
-\BKM@entry{id=18,dest={636861707465722E35},srcline={1},srcfile={2E2F63686170746572732F726573756C742E746578}}{5C3337365C3337375C303030525C303030655C303030735C303030755C3030306C5C303030745C30303073}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {5}Results}{13}{chapter.5}\protected@file@percent }
+\newlabel{evaluate}{{5}{15}{Evaluation of User Experience}{chapter.5}{}}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {5.1}Study Design}{15}{section.5.1}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {5.2}Procedure}{15}{section.5.2}\protected@file@percent }
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {5.3}Entering the world of VR}{15}{section.5.3}\protected@file@percent }
+\BKM@entry{id=18,dest={73656374696F6E2E352E34},srcline={24},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F6576616C756174652E746578}}{5C3337365C3337375C303030505C303030725C303030615C303030635C303030745C303030695C303030635C303030655C3030305C3034305C303030615C3030306E5C303030645C3030305C3034305C303030655C303030765C303030615C3030306C5C303030755C303030615C303030745C303030695C3030306F5C3030306E5C3030305C3034305C3030306F5C303030665C3030305C3034305C3030306D5C3030306F5C303030645C303030655C30303073}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {section}{\numberline {5.4}Practice and evaluation of modes}{16}{section.5.4}\protected@file@percent }
+\BKM@entry{id=19,dest={636861707465722E36},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F726573756C742E746578}}{5C3337365C3337375C303030525C303030655C303030735C303030755C3030306C5C303030745C30303073}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {6}Results}{17}{chapter.6}\protected@file@percent }
 \@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
-\newlabel{result}{{5}{13}{Results}{chapter.5}{}}
-\BKM@entry{id=19,dest={636861707465722E36},srcline={1},srcfile={2E2F63686170746572732F636F6E636C7573696F6E2E746578}}{5C3337365C3337375C303030435C3030306F5C3030306E5C303030635C3030306C5C303030755C303030735C303030695C3030306F5C3030306E5C3030305C3034305C303030615C3030306E5C303030645C3030305C3034305C303030665C303030755C303030745C303030755C303030725C303030655C3030305C3034305C303030775C3030306F5C303030725C3030306B}
-\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {6}Conclusion and future work}{14}{chapter.6}\protected@file@percent }
+\newlabel{result}{{6}{17}{Results}{chapter.6}{}}
+\BKM@entry{id=20,dest={636861707465722E37},srcline={1},srcfile={2F55736572732F656C61696E652F4465736B746F702F5468657369732D486563746F722D56522F5468657369735F4C615465582F63686170746572732F636F6E636C7573696F6E2E746578}}{5C3337365C3337375C303030435C3030306F5C3030306E5C303030635C3030306C5C303030755C303030735C303030695C3030306F5C3030306E5C3030305C3034305C303030615C3030306E5C303030645C3030305C3034305C303030665C303030755C303030745C303030755C303030725C303030655C3030305C3034305C303030775C3030306F5C303030725C3030306B}
+\@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\contentsline {chapter}{\numberline {7}Conclusion and future work}{18}{chapter.7}\protected@file@percent }
 \@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\defcounter {refsection}{0}\relax }\@writefile{lot}{\addvspace {10\p@ }}
-\newlabel{conclusion}{{6}{14}{Conclusion and future work}{chapter.6}{}}
-\abx@aux@read@bbl@mdfivesum{C19EC8D5009345DD3009530555541C21}
+\newlabel{conclusion}{{7}{18}{Conclusion and future work}{chapter.7}{}}
+\abx@aux@read@bbl@mdfivesum{23885E836F2050D1EFE57D3A6A443718}
 \abx@aux@refcontextdefaultsdone
 \abx@aux@defaultrefcontext{0}{Casper:2003tk}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{Murphy:2012th}{none/global//global/global}
@@ -136,9 +159,14 @@
 \abx@aux@defaultrefcontext{0}{Villani:2018ub}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{Liu:2017tw}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{Wang:2017uy}{none/global//global/global}
+\abx@aux@defaultrefcontext{0}{Matsas:2017aa}{none/global//global/global}
+\abx@aux@defaultrefcontext{0}{Perez:2019ub}{none/global//global/global}
+\abx@aux@defaultrefcontext{0}{Stotko:2019ud}{none/global//global/global}
+\abx@aux@defaultrefcontext{0}{Moniri:2016ud}{none/global//global/global}
+\abx@aux@defaultrefcontext{0}{Ostanin:2020uo}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{Whitney:2018wk}{none/global//global/global}
 \global\@namedef{scr@dte@chapter@lastmaxnumwidth}{11.67593pt}
 \global\@namedef{scr@dte@section@lastmaxnumwidth}{21.47992pt}
 \global\@namedef{scr@dte@subsection@lastmaxnumwidth}{31.48792pt}
 \@writefile{toc}{\defcounter {refsection}{0}\relax }\@writefile{toc}{\providecommand\tocbasic@end@toc@file{}\tocbasic@end@toc@file}
-\gdef \@abspage@last{16}
+\gdef \@abspage@last{21}

+ 245 - 2
Thesis_LaTeX/Thesis_Jingyi.bbl

@@ -766,6 +766,249 @@
       \verb 10.1109/CAC.2017.8243836
       \endverb
     \endentry
+    \entry{Matsas:2017aa}{article}{}
+      \name{author}{2}{}{%
+        {{hash=aa1ca691cd4a3c881b245f9e8af506a3}{%
+           family={Matsas},
+           familyi={M\bibinitperiod},
+           given={Elias},
+           giveni={E\bibinitperiod}}}%
+        {{hash=964c3c81b9720a9f9b35ab5d6770f3a2}{%
+           family={Vosniakos},
+           familyi={V\bibinitperiod},
+           given={George-Christopher},
+           giveni={G\bibinithyphendelim C\bibinitperiod}}}%
+      }
+      \strng{namehash}{c7a15625feb988f401822db815aed1d9}
+      \strng{fullhash}{c7a15625feb988f401822db815aed1d9}
+      \strng{bibnamehash}{c7a15625feb988f401822db815aed1d9}
+      \strng{authorbibnamehash}{c7a15625feb988f401822db815aed1d9}
+      \strng{authornamehash}{c7a15625feb988f401822db815aed1d9}
+      \strng{authorfullhash}{c7a15625feb988f401822db815aed1d9}
+      \field{sortinit}{1}
+      \field{sortinithash}{4f6aaa89bab872aa0999fec09ff8e98a}
+      \field{labelnamesource}{author}
+      \field{labeltitlesource}{title}
+      \field{abstract}{This paper presents a highly interactive and immersive Virtual Reality Training System (VRTS) (``beWare of the Robot'') in terms of a serious game that simulates in real-time the cooperation between industrial robotic manipulators and humans, executing simple manufacturing tasks. The scenario presented refers to collaborative handling in tape-laying for building aerospace composite parts. The tools, models and techniques developed and used to build the ``beWare of the Robot'' application are described. System setup and configuration are presented in detail, as well as user tracking and navigation issues. Special emphasis is given to the interaction techniques used to facilitate implementation of virtual human--robot (HR) collaboration. Safety issues, such as contacts and collisions are mainly tackled through ``emergencies'', i.e. warning signals in terms of visual stimuli and sound alarms. Mental safety is of utmost priority and the user is provided augmented situational awareness and enhanced perception of the robot's motion due to immersion and real-time interaction offered by the VRTS as well as by special warning stimuli. The short-term goal of the research was to investigate users' enhanced experience and behaviour inside the virtual world while cooperating with the robot and positive pertinent preliminary findings are presented and briefly discussed. In the longer term, the system can be used to investigate acceptability of H--R collaboration and, ultimately, serve as a platform for programming collaborative H--R manufacturing cells.}
+      \field{isbn}{1955-2505}
+      \field{journaltitle}{International Journal on Interactive Design and Manufacturing (IJIDeM)}
+      \field{number}{2}
+      \field{title}{Design of a virtual reality training system for human--robot collaboration in manufacturing tasks}
+      \field{volume}{11}
+      \field{year}{2017}
+      \field{pages}{139\bibrangedash 153}
+      \range{pages}{15}
+      \verb{doi}
+      \verb 10.1007/s12008-015-0259-2
+      \endverb
+      \verb{urlraw}
+      \verb https://doi.org/10.1007/s12008-015-0259-2
+      \endverb
+      \verb{url}
+      \verb https://doi.org/10.1007/s12008-015-0259-2
+      \endverb
+    \endentry
+    \entry{Perez:2019ub}{article}{}
+      \name{author}{4}{}{%
+        {{hash=758632de20f8819be2d06cd9fb54ce3f}{%
+           family={Pérez},
+           familyi={P\bibinitperiod},
+           given={Luis},
+           giveni={L\bibinitperiod}}}%
+        {{hash=d1cf8f2b0d80080b7fd4cb8b878703d4}{%
+           family={Diez},
+           familyi={D\bibinitperiod},
+           given={Eduardo},
+           giveni={E\bibinitperiod}}}%
+        {{hash=2d4a1692d5dec8545ffb0f671bc76b34}{%
+           family={Usamentiaga},
+           familyi={U\bibinitperiod},
+           given={Rubén},
+           giveni={R\bibinitperiod}}}%
+        {{hash=4ac126b84d6f38bd8d62979d308c378b}{%
+           family={García},
+           familyi={G\bibinitperiod},
+           given={Daniel\bibnamedelima F.},
+           giveni={D\bibinitperiod\bibinitdelim F\bibinitperiod}}}%
+      }
+      \strng{namehash}{afd1f7a72016ac4257bd84049d59ed11}
+      \strng{fullhash}{381f97873c358835a8d186af720f6d3e}
+      \strng{bibnamehash}{afd1f7a72016ac4257bd84049d59ed11}
+      \strng{authorbibnamehash}{afd1f7a72016ac4257bd84049d59ed11}
+      \strng{authornamehash}{afd1f7a72016ac4257bd84049d59ed11}
+      \strng{authorfullhash}{381f97873c358835a8d186af720f6d3e}
+      \field{sortinit}{1}
+      \field{sortinithash}{4f6aaa89bab872aa0999fec09ff8e98a}
+      \field{labelnamesource}{author}
+      \field{labeltitlesource}{title}
+      \field{abstract}{Nowadays, we are involved in the fourth industrial revolution, commonly referred to as ``Industry 4.0,'' where cyber-physical systems and intelligent automation, including robotics, are the keys. Traditionally, the use of robots has been limited by safety and, in addition, some manufacturing tasks are too complex to be fully automated. Thus, human-robot collaborative applications, where robots are not isolated, are necessary in order to increase the productivity ensuring the safety of the operators with new perception systems for the robot and new interaction interfaces for the human. Moreover, virtual reality has been extended to the industry in the last years, but most of its applications are not related to robots. In this context, this paper works on the synergies between virtual reality and robotics, presenting the use of commercial gaming technologies to create a totally immersive environment based on virtual reality. This environment includes an interface connected to the robot controller, where the necessary mathematical models have been implemented for the control of the virtual robot. The proposed system can be used for training, simulation, and what is more innovative, for robot controlling in an integrated, non-expensive and unique application. Results show that the immersive experience increments the efficiency of the training and simulation processes, offering a cost-effective solution.}
+      \field{isbn}{0166-3615}
+      \field{journaltitle}{Computers in Industry}
+      \field{title}{Industrial robot control and operator training using virtual reality interfaces}
+      \field{volume}{109}
+      \field{year}{2019}
+      \field{pages}{114\bibrangedash 120}
+      \range{pages}{7}
+      \verb{doi}
+      \verb https://doi.org/10.1016/j.compind.2019.05.001
+      \endverb
+      \verb{urlraw}
+      \verb https://www.sciencedirect.com/science/article/pii/S0166361518308546
+      \endverb
+      \verb{url}
+      \verb https://www.sciencedirect.com/science/article/pii/S0166361518308546
+      \endverb
+      \keyw{Robots; Virtual reality; Human-machine interface; Virtual manufacturing; Industry 4.0}
+    \endentry
+    \entry{Stotko:2019ud}{book}{}
+      \name{author}{7}{}{%
+        {{hash=a5adb02efccd0a8f6facf03ad6526673}{%
+           family={Stotko},
+           familyi={S\bibinitperiod},
+           given={Patrick},
+           giveni={P\bibinitperiod}}}%
+        {{hash=bb48cd8acc8c7c1f9ee6ba11711163c3}{%
+           family={Krumpen},
+           familyi={K\bibinitperiod},
+           given={Stefan},
+           giveni={S\bibinitperiod}}}%
+        {{hash=caccf52a9bcef49fdf5dd218bc6699ac}{%
+           family={Schwarz},
+           familyi={S\bibinitperiod},
+           given={Max},
+           giveni={M\bibinitperiod}}}%
+        {{hash=e7a63ec25b9465f6374f6f7c63779d56}{%
+           family={Lenz},
+           familyi={L\bibinitperiod},
+           given={Christian},
+           giveni={C\bibinitperiod}}}%
+        {{hash=3110caa22b682a3c7f48017d49dccc6b}{%
+           family={Behnke},
+           familyi={B\bibinitperiod},
+           given={Sven},
+           giveni={S\bibinitperiod}}}%
+        {{hash=83ee01374e4d644d739e4382e249dcf8}{%
+           family={Klein},
+           familyi={K\bibinitperiod},
+           given={Reinhard},
+           giveni={R\bibinitperiod}}}%
+        {{hash=3fac57dfc6c43c66bebe18d4ff8873ad}{%
+           family={Weinmann},
+           familyi={W\bibinitperiod},
+           given={Michael},
+           giveni={M\bibinitperiod}}}%
+      }
+      \strng{namehash}{2181d290cc02b5140bf79ff0aabf85a4}
+      \strng{fullhash}{813e44226b2a71da22f44e910b806b8f}
+      \strng{bibnamehash}{2181d290cc02b5140bf79ff0aabf85a4}
+      \strng{authorbibnamehash}{2181d290cc02b5140bf79ff0aabf85a4}
+      \strng{authornamehash}{2181d290cc02b5140bf79ff0aabf85a4}
+      \strng{authorfullhash}{813e44226b2a71da22f44e910b806b8f}
+      \field{sortinit}{2}
+      \field{sortinithash}{8b555b3791beccb63322c22f3320aa9a}
+      \field{labelnamesource}{author}
+      \field{labeltitlesource}{title}
+      \field{month}{11}
+      \field{title}{A VR System for Immersive Teleoperation and Live Exploration with a Mobile Robot}
+      \field{year}{2019}
+      \verb{doi}
+      \verb 10.1109/IROS40897.2019.8968598
+      \endverb
+      \warn{\item Entry 'Stotko:2019ud' (BibTexDatei.bib): Invalid format '2019/11/01' of date field 'date' - ignoring}
+    \endentry
+    \entry{Moniri:2016ud}{inproceedings}{}
+      \name{author}{4}{}{%
+        {{hash=b249a8ff2846d2fce417998d16be7d09}{%
+           family={Moniri},
+           familyi={M\bibinitperiod},
+           given={M.\bibnamedelimi M.},
+           giveni={M\bibinitperiod\bibinitdelim M\bibinitperiod}}}%
+        {{hash=1b19da6558091d4a5e850530e9246fd6}{%
+           family={Valcarcel},
+           familyi={V\bibinitperiod},
+           given={F.\bibnamedelimi A.\bibnamedelimi E.},
+           giveni={F\bibinitperiod\bibinitdelim A\bibinitperiod\bibinitdelim E\bibinitperiod}}}%
+        {{hash=90eade11523d9865cc41c8ff27c760f9}{%
+           family={Merkel},
+           familyi={M\bibinitperiod},
+           given={D.},
+           giveni={D\bibinitperiod}}}%
+        {{hash=5c66f73772e871d9ebda68e20577fddc}{%
+           family={Sonntag},
+           familyi={S\bibinitperiod},
+           given={D.},
+           giveni={D\bibinitperiod}}}%
+      }
+      \strng{namehash}{23a55dcc813e53ff96b34db2df809206}
+      \strng{fullhash}{473ca9b76bcbc60848f42a8e1b023420}
+      \strng{bibnamehash}{23a55dcc813e53ff96b34db2df809206}
+      \strng{authorbibnamehash}{23a55dcc813e53ff96b34db2df809206}
+      \strng{authornamehash}{23a55dcc813e53ff96b34db2df809206}
+      \strng{authorfullhash}{473ca9b76bcbc60848f42a8e1b023420}
+      \field{sortinit}{2}
+      \field{sortinithash}{8b555b3791beccb63322c22f3320aa9a}
+      \field{labelnamesource}{author}
+      \field{labeltitlesource}{title}
+      \field{booktitle}{2016 12th International Conference on Intelligent Environments (IE)}
+      \field{isbn}{2472-7571}
+      \field{journaltitle}{2016 12th International Conference on Intelligent Environments (IE)}
+      \field{title}{Human Gaze and Focus-of-Attention in Dual Reality Human-Robot Collaboration}
+      \field{year}{2016}
+      \field{pages}{238\bibrangedash 241}
+      \range{pages}{4}
+      \verb{doi}
+      \verb 10.1109/IE.2016.54
+      \endverb
+    \endentry
+    \entry{Ostanin:2020uo}{inproceedings}{}
+      \name{author}{5}{}{%
+        {{hash=9d931db6ce27f9253971798ed2eb8d75}{%
+           family={Ostanin},
+           familyi={O\bibinitperiod},
+           given={M.},
+           giveni={M\bibinitperiod}}}%
+        {{hash=8c1f6930f79d5acddc796e8a4c475219}{%
+           family={Mikhel},
+           familyi={M\bibinitperiod},
+           given={S.},
+           giveni={S\bibinitperiod}}}%
+        {{hash=ff90ecc0f392ba558ccd07e543e46ab6}{%
+           family={Evlampiev},
+           familyi={E\bibinitperiod},
+           given={A.},
+           giveni={A\bibinitperiod}}}%
+        {{hash=7eb2896a3920936fee870b3f4cdebf51}{%
+           family={Skvortsova},
+           familyi={S\bibinitperiod},
+           given={V.},
+           giveni={V\bibinitperiod}}}%
+        {{hash=081886ea19bedbcebefd3c0792cee4d6}{%
+           family={Klimchik},
+           familyi={K\bibinitperiod},
+           given={A.},
+           giveni={A\bibinitperiod}}}%
+      }
+      \strng{namehash}{a65c3da217bdd6ad276dc78303f4a7ab}
+      \strng{fullhash}{78fee1a093d530ec9a580dc3adc249cc}
+      \strng{bibnamehash}{a65c3da217bdd6ad276dc78303f4a7ab}
+      \strng{authorbibnamehash}{a65c3da217bdd6ad276dc78303f4a7ab}
+      \strng{authornamehash}{a65c3da217bdd6ad276dc78303f4a7ab}
+      \strng{authorfullhash}{78fee1a093d530ec9a580dc3adc249cc}
+      \field{sortinit}{2}
+      \field{sortinithash}{8b555b3791beccb63322c22f3320aa9a}
+      \field{labelnamesource}{author}
+      \field{labeltitlesource}{title}
+      \field{booktitle}{2020 IEEE International Conference on Robotics and Automation (ICRA)}
+      \field{isbn}{2577-087X}
+      \field{journaltitle}{2020 IEEE International Conference on Robotics and Automation (ICRA)}
+      \field{title}{Human-robot interaction for robotic manipulator programming in Mixed Reality}
+      \field{year}{2020}
+      \field{pages}{2805\bibrangedash 2811}
+      \range{pages}{7}
+      \verb{doi}
+      \verb 10.1109/ICRA40945.2020.9196965
+      \endverb
+    \endentry
     \entry{Whitney:2018wk}{inproceedings}{}
       \name{author}{5}{}{%
         {{hash=f9acc463c0cf522bcb3d8fef1242c47a}{%
@@ -800,8 +1043,8 @@
       \strng{authorbibnamehash}{f0ec3ff18257407ae62f7b8d22d8a3f6}
       \strng{authornamehash}{f0ec3ff18257407ae62f7b8d22d8a3f6}
       \strng{authorfullhash}{b6fa5d84bcb8b2825308c7e3ae4951ec}
-      \field{sortinit}{1}
-      \field{sortinithash}{4f6aaa89bab872aa0999fec09ff8e98a}
+      \field{sortinit}{2}
+      \field{sortinithash}{8b555b3791beccb63322c22f3320aa9a}
       \field{labelnamesource}{author}
       \field{labeltitlesource}{title}
       \field{booktitle}{2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)}

+ 9 - 1
Thesis_LaTeX/Thesis_Jingyi.bcf

@@ -2360,7 +2360,15 @@
     <bcf:citekey order="14">Villani:2018ub</bcf:citekey>
     <bcf:citekey order="15">Liu:2017tw</bcf:citekey>
     <bcf:citekey order="16">Wang:2017uy</bcf:citekey>
-    <bcf:citekey order="17">Whitney:2018wk</bcf:citekey>
+    <bcf:citekey order="17">Matsas:2017aa</bcf:citekey>
+    <bcf:citekey order="18">Perez:2019ub</bcf:citekey>
+    <bcf:citekey order="19">Wang:2017uy</bcf:citekey>
+    <bcf:citekey order="20">Stotko:2019ud</bcf:citekey>
+    <bcf:citekey order="21">Moniri:2016ud</bcf:citekey>
+    <bcf:citekey order="22">Ostanin:2020uo</bcf:citekey>
+    <bcf:citekey order="23">Whitney:2018wk</bcf:citekey>
+    <bcf:citekey order="24">Perez:2019ub</bcf:citekey>
+    <bcf:citekey order="25">Matsas:2017aa</bcf:citekey>
   </bcf:section>
   <!-- SORTING TEMPLATES -->
   <bcf:sortingtemplate name="none">

+ 30 - 25
Thesis_LaTeX/Thesis_Jingyi.blg

@@ -1,27 +1,32 @@
 [0] Config.pm:311> INFO - This is Biber 2.16
 [0] Config.pm:314> INFO - Logfile is 'Thesis_Jingyi.blg'
-[62] biber-darwin:340> INFO - === Thu Jul  1, 2021, 12:33:05
-[74] Biber.pm:415> INFO - Reading 'Thesis_Jingyi.bcf'
-[152] Biber.pm:952> INFO - Found 17 citekeys in bib section 0
-[164] Biber.pm:4340> INFO - Processing section 0
-[172] Biber.pm:4531> INFO - Looking for bibtex format file 'BibTexDatei.bib' for section 0
-[176] bibtex.pm:1689> INFO - LaTeX decoding ...
-[214] bibtex.pm:1494> INFO - Found BibTeX data source 'BibTexDatei.bib'
-[233] Utils.pm:395> WARN - ISBN '1941-0492' in entry 'Casper:2003tk' is invalid - run biber with '--validate_datamodel' for details.
-[254] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Murphy:2012th' is invalid - run biber with '--validate_datamodel' for details.
-[258] Utils.pm:395> WARN - ISBN '0197-7385' in entry 'Huang:2011wq' is invalid - run biber with '--validate_datamodel' for details.
-[260] Utils.pm:395> WARN - ISBN '1558-2442' in entry 'Murphy:2004wl' is invalid - run biber with '--validate_datamodel' for details.
-[280] Utils.pm:395> WARN - Entry 'Sousa:2017tn' (BibTexDatei.bib): Invalid format '2017/10/18' of date field 'date' - ignoring
-[288] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Nagi:2014vu' is invalid - run biber with '--validate_datamodel' for details.
-[292] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Pourmehr:2013ta' is invalid - run biber with '--validate_datamodel' for details.
-[296] Utils.pm:395> WARN - ISBN '1558-2531' in entry 'Ma:2015wu' is invalid - run biber with '--validate_datamodel' for details.
-[303] Utils.pm:395> WARN - ISBN '1944-9437' in entry 'Villani:2018ub' is invalid - run biber with '--validate_datamodel' for details.
-[307] Utils.pm:395> WARN - ISBN '1944-9437' in entry 'Liu:2017tw' is invalid - run biber with '--validate_datamodel' for details.
-[317] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Whitney:2018wk' is invalid - run biber with '--validate_datamodel' for details.
-[332] UCollate.pm:68> INFO - Overriding locale 'en-US' defaults 'variable = shifted' with 'variable = non-ignorable'
-[332] UCollate.pm:68> INFO - Overriding locale 'en-US' defaults 'normalization = NFD' with 'normalization = prenormalized'
-[332] Biber.pm:4168> INFO - Sorting list 'none/global//global/global' of type 'entry' with template 'none' and locale 'en-US'
-[332] Biber.pm:4174> INFO - No sort tailoring available for locale 'en-US'
-[343] bbl.pm:654> INFO - Writing 'Thesis_Jingyi.bbl' with encoding 'UTF-8'
-[354] bbl.pm:757> INFO - Output to Thesis_Jingyi.bbl
-[354] Biber.pm:128> INFO - WARNINGS: 11
+[87] biber-darwin:340> INFO - === Sun Jul  4, 2021, 16:59:40
+[104] Biber.pm:415> INFO - Reading 'Thesis_Jingyi.bcf'
+[197] Biber.pm:952> INFO - Found 22 citekeys in bib section 0
+[211] Biber.pm:4340> INFO - Processing section 0
+[221] Biber.pm:4531> INFO - Looking for bibtex format file 'BibTexDatei.bib' for section 0
+[225] bibtex.pm:1689> INFO - LaTeX decoding ...
+[274] bibtex.pm:1494> INFO - Found BibTeX data source 'BibTexDatei.bib'
+[303] Utils.pm:395> WARN - ISBN '1941-0492' in entry 'Casper:2003tk' is invalid - run biber with '--validate_datamodel' for details.
+[327] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Murphy:2012th' is invalid - run biber with '--validate_datamodel' for details.
+[333] Utils.pm:395> WARN - ISBN '0197-7385' in entry 'Huang:2011wq' is invalid - run biber with '--validate_datamodel' for details.
+[336] Utils.pm:395> WARN - ISBN '1558-2442' in entry 'Murphy:2004wl' is invalid - run biber with '--validate_datamodel' for details.
+[369] Utils.pm:395> WARN - Entry 'Sousa:2017tn' (BibTexDatei.bib): Invalid format '2017/10/18' of date field 'date' - ignoring
+[382] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Nagi:2014vu' is invalid - run biber with '--validate_datamodel' for details.
+[387] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Pourmehr:2013ta' is invalid - run biber with '--validate_datamodel' for details.
+[392] Utils.pm:395> WARN - ISBN '1558-2531' in entry 'Ma:2015wu' is invalid - run biber with '--validate_datamodel' for details.
+[402] Utils.pm:395> WARN - ISBN '1944-9437' in entry 'Villani:2018ub' is invalid - run biber with '--validate_datamodel' for details.
+[407] Utils.pm:395> WARN - ISBN '1944-9437' in entry 'Liu:2017tw' is invalid - run biber with '--validate_datamodel' for details.
+[421] Utils.pm:395> WARN - ISBN '1955-2505' in entry 'Matsas:2017aa' is invalid - run biber with '--validate_datamodel' for details.
+[426] Utils.pm:395> WARN - ISBN '0166-3615' in entry 'Perez:2019ub' is invalid - run biber with '--validate_datamodel' for details.
+[436] Utils.pm:395> WARN - Entry 'Stotko:2019ud' (BibTexDatei.bib): Invalid format '2019/11/01' of date field 'date' - ignoring
+[444] Utils.pm:395> WARN - ISBN '2472-7571' in entry 'Moniri:2016ud' is invalid - run biber with '--validate_datamodel' for details.
+[451] Utils.pm:395> WARN - ISBN '2577-087X' in entry 'Ostanin:2020uo' is invalid - run biber with '--validate_datamodel' for details.
+[457] Utils.pm:395> WARN - ISBN '2153-0866' in entry 'Whitney:2018wk' is invalid - run biber with '--validate_datamodel' for details.
+[482] UCollate.pm:68> INFO - Overriding locale 'en-US' defaults 'normalization = NFD' with 'normalization = prenormalized'
+[482] UCollate.pm:68> INFO - Overriding locale 'en-US' defaults 'variable = shifted' with 'variable = non-ignorable'
+[482] Biber.pm:4168> INFO - Sorting list 'none/global//global/global' of type 'entry' with template 'none' and locale 'en-US'
+[482] Biber.pm:4174> INFO - No sort tailoring available for locale 'en-US'
+[503] bbl.pm:654> INFO - Writing 'Thesis_Jingyi.bbl' with encoding 'UTF-8'
+[522] bbl.pm:757> INFO - Output to Thesis_Jingyi.bbl
+[522] Biber.pm:128> INFO - WARNINGS: 16

+ 34 - 26
Thesis_LaTeX/Thesis_Jingyi.fdb_latexmk

@@ -1,13 +1,13 @@
 # Fdb version 3
-["biber Thesis_Jingyi"] 1625135582 "Thesis_Jingyi.bcf" "Thesis_Jingyi.bbl" "Thesis_Jingyi" 1625135600
-  "BibTexDatei.bib" 1623954830 18242 f91e0f97a7082496f8d971bccd740a81 ""
-  "Thesis_Jingyi.bcf" 1625135599 106912 702bf6176148abbc2b1924486424ccbf "lualatex"
+["biber Thesis_Jingyi"] 1625410778 "Thesis_Jingyi.bcf" "Thesis_Jingyi.bbl" "Thesis_Jingyi" 1625410802
+  "BibTexDatei.bib" 1625399681 23523 46c59dd3f19db39c137bfdfd0fb31b72 ""
+  "Thesis_Jingyi.bcf" 1625410801 107357 0b8a06157e3a60fdee0766d0fe5d1a87 "lualatex"
   (generated)
-  "Thesis_Jingyi.bbl"
   "Thesis_Jingyi.blg"
-["lualatex"] 1625135595 "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.tex" "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.pdf" "Thesis_Jingyi" 1625135600
-  "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.aux" 1625135599 16512 c36e27cc918bb87c8061f2e5afb42ea8 ""
-  "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.tex" 1625069396 3410 e307d17b3cc34441d48ac04a84c1f17a ""
+  "Thesis_Jingyi.bbl"
+["lualatex"] 1625410794 "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.tex" "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.pdf" "Thesis_Jingyi" 1625410802
+  "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux" 1625410801 18337 5a511f02aa52aaff87194fabd24d7cba ""
+  "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.tex" 1625408960 3486 c0ba66ba4168b3c41a43c055548f892d ""
   "/Users/elaine/Library/texlive/2021/texmf-var/luatex-cache/generic/fonts/otl/lmroman12-bold.luc" 1624377961 124128 826eace173a4c277f00a3961e5bd55a0 ""
   "/Users/elaine/Library/texlive/2021/texmf-var/luatex-cache/generic/fonts/otl/lmroman12-regular.luc" 1624377960 123499 243bfe951f41de9e33b640d9f8cc3644 ""
   "/Users/elaine/Library/texlive/2021/texmf-var/luatex-cache/generic/fonts/otl/roboto-black.luc" 1617870450 252497 421d48f342cf271e92ded36c8499153e ""
@@ -71,6 +71,9 @@
   "/usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy6.tfm" 1136768653 1116 933a60c408fc0a863a92debe84b2d294 ""
   "/usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy7.tfm" 1136768653 1120 2b3f9b25605010c69bc328bea6ac000f ""
   "/usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy8.tfm" 1136768653 1120 8b7d695260f3cff42e636090a8002094 ""
+  "/usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb" 1248133631 36741 fa121aac0049305630cf160b86157ee4 ""
+  "/usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmr12.pfb" 1248133631 32722 d7379af29a190c3f453aba36302ff5a9 ""
+  "/usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb" 1248133631 32569 5e5ddc8df908dea60932f3c484a54c0d ""
   "/usr/local/texlive/2021/texmf-dist/tex/context/base/mkii/supp-pdf.mkii" 1461363279 71627 94eb9990bed73c364d7f53f960cc8c5b ""
   "/usr/local/texlive/2021/texmf-dist/tex/generic/babel-english/english.ldf" 1496785618 7008 9ff5fdcc865b01beca2b0fe4a46231d4 ""
   "/usr/local/texlive/2021/texmf-dist/tex/generic/babel-german/ngerman.ldf" 1614462139 2289 b1d356015c8fd065cb4437d567449559 ""
@@ -146,6 +149,7 @@
   "/usr/local/texlive/2021/texmf-dist/tex/latex/csquotes/csquotes.sty" 1614030765 62518 6e0d74482f5cb16b3b0755031e72faf1 ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty" 1579991033 13886 d1306dcf79a944f6988e688c1785f9ce ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/etoolbox/etoolbox.sty" 1601931149 46845 3b58f70c6e861a13d927bff09d35ecbc ""
+  "/usr/local/texlive/2021/texmf-dist/tex/latex/float/float.sty" 1137110151 6749 16d2656a1984957e674b149555f1ea1d ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/fontspec/fontspec-luatex.sty" 1582325645 151809 9b29a813bdecad31db825cd830187121 ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/fontspec/fontspec.cfg" 1532898934 549 c4adac819276241fea8eb79c5ab7b99e ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/fontspec/fontspec.lua" 1582325645 3021 d32bd2298bedcca160fe737883f3615e ""
@@ -209,6 +213,8 @@
   "/usr/local/texlive/2021/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty" 1575674187 9715 b051d5b493d9fe5f4bc251462d039e5f ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/roboto/roboto-mono.sty" 1576101225 4737 e3536d0db9fc3d461954c39370c5432d ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/roboto/roboto.sty" 1576101225 14885 1a5132fb349578b9a7edd65f7ff1d141 ""
+  "/usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.cfg" 1167176009 2062 a0e7d66e09e508f51289a656aec06ed2 ""
+  "/usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.sty" 1167176009 15188 91281c7ddbccfa54a8e0c3b56ab5aa72 ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/tools/array.sty" 1601675358 12675 9a7bbb9e485cd81cdcc1d56212b088ff ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/tools/tabularx.sty" 1580683321 7148 4ba718200276533b3a293c311a9349e0 ""
   "/usr/local/texlive/2021/texmf-dist/tex/latex/tuda-ci/tuda-a4paper.clo" 1616189291 1686 c2b0b87c4165e6a0ccfb501d0694498e ""
@@ -261,32 +267,34 @@
   "/usr/local/texlive/2021/texmf-var/fonts/map/pdftex/updmap/pdftex.map" 1616695603 5169178 a2ce6e2d73f603b690db6777d06dbccb ""
   "/usr/local/texlive/2021/texmf-var/tex/generic/config/language.dat" 1616695599 6652 ca80e6e2fc9736cbccadc9431e849ff0 ""
   "/usr/local/texlive/2021/texmf-var/web2c/luahbtex/lualatex.fmt" 1617806998 3313487 c362c5bf493095d177e958243f69bfa7 ""
-  "BibTexDatei.bib" 1623954830 18242 f91e0f97a7082496f8d971bccd740a81 ""
-  "Thesis_Jingyi.aux" 1625135599 16512 c36e27cc918bb87c8061f2e5afb42ea8 "lualatex"
-  "Thesis_Jingyi.bbl" 1625135585 35373 c19ec8d5009345dd3009530555541c21 "biber Thesis_Jingyi"
-  "Thesis_Jingyi.out" 1625135598 0 d41d8cd98f00b204e9800998ecf8427e "lualatex"
-  "Thesis_Jingyi.run.xml" 1625135599 2465 3451d595289a2dfd4f36a3855c6a77d8 "lualatex"
-  "Thesis_Jingyi.tex" 1625069396 3410 e307d17b3cc34441d48ac04a84c1f17a ""
-  "Thesis_Jingyi.toc" 1625135599 2383 2b0499b21e489a765630c1a634f07a07 "lualatex"
-  "Thesis_Jingyi.xmpdata" 1625135598 155 d646172d23b1cbec6b5499b8324738e3 "lualatex"
+  "BibTexDatei.bib" 1625399681 23523 46c59dd3f19db39c137bfdfd0fb31b72 ""
+  "Thesis_Jingyi.aux" 1625410801 18337 5a511f02aa52aaff87194fabd24d7cba "lualatex"
+  "Thesis_Jingyi.bbl" 1625410780 48369 23885e836f2050d1efe57d3a6a443718 "biber Thesis_Jingyi"
+  "Thesis_Jingyi.out" 1625410799 0 d41d8cd98f00b204e9800998ecf8427e "lualatex"
+  "Thesis_Jingyi.run.xml" 1625410801 2465 3451d595289a2dfd4f36a3855c6a77d8 "lualatex"
+  "Thesis_Jingyi.tex" 1625408960 3486 c0ba66ba4168b3c41a43c055548f892d ""
+  "Thesis_Jingyi.toc" 1625410801 2485 62456ca387e9a2d11413c824c6273fe0 "lualatex"
+  "Thesis_Jingyi.xmpdata" 1625410799 155 d646172d23b1cbec6b5499b8324738e3 "lualatex"
   "chapters/abstract.tex" 1624382427 20 247a882ae4b7334fc7325d03790db55f ""
   "chapters/conclusion.tex" 1624959033 56 06b20b208584b88731f86e6573a5434e ""
-  "chapters/evaluate.tex" 1625135562 4412 ca7f13a08db4ce60890392bd003642cf ""
-  "chapters/implementation.tex" 1625067055 6823 aff7546dbc14faf17e8bd5c0aaba8f65 ""
-  "chapters/introduction.tex" 1625135358 5399 20c44b695f399b354aa48f471b91f5dc ""
+  "chapters/evaluate.tex" 1625398045 4842 1c5b126dfd1ce3e9fb213c2a32ff7e86 ""
+  "chapters/implementation.tex" 1625407236 11636 8a2d9a48237effd00ac85820cf46e9d1 ""
+  "chapters/introduction.tex" 1625399334 5527 792f2574bb38a9cf54696beae28333e9 ""
+  "chapters/related_work.tex" 1625399907 3561 a7beaad53e0b8f722f7093c7ccaf02fa ""
   "chapters/result.tex" 1624911790 33 367b67bb5c51a3381413668eb4c009da ""
+  "graphics/htc.png" 1625139359 138079 f2286e15c09310165fded74368c13910 ""
   "graphics/uml.png" 1625047151 217947 ca27c2dd1a6486144d2f5b2bd1a96c3d ""
-  "pdfa.xmpi" 1625135597 5351 7dc41d133cdc233169c6efb9063444d0 "lualatex"
+  "pdfa.xmpi" 1625410798 5351 d11d23dd8d809dda842e5b62d54455c0 "lualatex"
   "tuda_logo.pdf" 1616080498 535101 e91be11cb03c1d5496698a80d026c7ef ""
   (generated)
+  "Thesis_Jingyi.bcf"
+  "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.pdf"
+  "Thesis_Jingyi.log"
   "pdfa.xmpi"
-  "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.pdf"
-  "Thesis_Jingyi.xmpdata"
-  "Thesis_Jingyi.pdf"
-  "Thesis_Jingyi.run.xml"
-  "/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.log"
   "Thesis_Jingyi.aux"
-  "Thesis_Jingyi.log"
   "Thesis_Jingyi.out"
-  "Thesis_Jingyi.bcf"
+  "Thesis_Jingyi.xmpdata"
+  "/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.log"
+  "Thesis_Jingyi.run.xml"
+  "Thesis_Jingyi.pdf"
   "Thesis_Jingyi.toc"

+ 64 - 48
Thesis_LaTeX/Thesis_Jingyi.fls

@@ -1,7 +1,7 @@
-PWD /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX
+PWD /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX
 INPUT /usr/local/texlive/2021/texmf-var/web2c/luahbtex/lualatex.fmt
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.tex
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.log
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.tex
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.log
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/latexconfig/lualatexquotejobname.lua
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/base/ltluatex.lua
 INPUT /usr/local/texlive/2021/texmf-dist/tex/luatex/luaotfload/luaotfload-main.lua
@@ -326,10 +326,10 @@ INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/stringenc/se-ascii-print.de
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/stringenc/se-ascii-print.def
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/AdobeColorProfiles.tex
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/AdobeColorProfiles.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.xmpdata
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.xmpdata
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.xmpdata
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.xmpdata
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/colorprofiles/sRGB.icc
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.pdf
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.pdf
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/colorprofiles/sRGB.icc
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/l8u-penc.def
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/pdftex/glyphtounicode.tex
@@ -343,8 +343,8 @@ INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/xmpincl/xmpincl.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/xmpincl/xmpincl.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/pdfa.xmp
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/pdfa.xmp
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/pdfa.xmpi
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/pdfa.xmpi
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/pdfa.xmpi
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/pdfa.xmpi
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/tuda-ci/tudathesis.cfg
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/tuda-ci/tudathesis.cfg
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/babel/babel.sty
@@ -414,6 +414,14 @@ INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/tools/array.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/booktabs/booktabs.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/booktabs/booktabs.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/booktabs/booktabs.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/float/float.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/float/float.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/float/float.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.sty
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.cfg
+INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.cfg
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/pifont.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/pifont.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/pifont.sty
@@ -423,9 +431,9 @@ INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/adobe/zapfding/pzdr.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/upsy.fd
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/upsy.fd
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/adobe/symbol/psyr.tfm
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.aux
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.aux
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.aux
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/base/ts1cmr.fd
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/base/ts1cmr.fd
 INPUT /usr/local/texlive/2021/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
@@ -444,12 +452,12 @@ INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/refcount/refcount.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/bookmark/bookmark.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/bookmark/bookmark.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/bookmark/bookmark.sty
@@ -466,15 +474,15 @@ INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/biblatex/lbx/german.lbx
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/biblatex/blx-case-expl3.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/biblatex/blx-case-expl3.sty
 INPUT /usr/local/texlive/2021/texmf-dist/tex/latex/biblatex/blx-case-expl3.sty
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.bcf
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.bbl
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.bbl
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.bbl
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.bbl
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.xmpdata
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/tuda_logo.pdf
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/tuda_logo.pdf
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/tuda_logo.pdf
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bcf
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bbl
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bbl
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bbl
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bbl
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.xmpdata
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/tuda_logo.pdf
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/tuda_logo.pdf
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/tuda_logo.pdf
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi12.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi8.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi6.tfm
@@ -496,31 +504,39 @@ INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmr10.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi10.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmr10.tfm
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.toc
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.toc
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.toc
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/abstract.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/abstract.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/introduction.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/introduction.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/implementation.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/implementation.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.toc
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.toc
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.toc
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/abstract.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/abstract.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/introduction.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/introduction.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/related_work.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/related_work.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/implementation.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/implementation.tex
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi7.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmmi5.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy7.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmsy5.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmr7.tfm
 INPUT /usr/local/texlive/2021/texmf-dist/fonts/tfm/public/cm/cmr5.tfm
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/graphics/uml.png
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/graphics/uml.png
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/graphics/uml.png
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/evaluate.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/evaluate.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/result.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/result.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/conclusion.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/chapters/conclusion.tex
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.aux
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.out
-INPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.run.xml
-OUTPUT /Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.run.xml
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/uml.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/uml.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/uml.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/htc.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/htc.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/htc.png
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/evaluate.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/evaluate.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/result.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/result.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/conclusion.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/conclusion.tex
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out
+INPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.run.xml
+OUTPUT /Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.run.xml
+INPUT /usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb
+INPUT /usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmr12.pfb
+INPUT /usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb

+ 226 - 166
Thesis_LaTeX/Thesis_Jingyi.log

@@ -1,8 +1,8 @@
-This is LuaHBTeX, Version 1.13.0 (TeX Live 2021)  (format=lualatex 2021.4.7)  1 JUL 2021 14:25
+This is LuaHBTeX, Version 1.13.0 (TeX Live 2021)  (format=lualatex 2021.4.7)  4 JUL 2021 16:59
  restricted system commands enabled.
  file:line:error style messages enabled.
-**/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi
-(/Users/elaine/Desktop/Hector_Unity_v2.0/Thesis_LaTeX/Thesis_Jingyi.tex
+**/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.tex
+(/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.tex
 LaTeX2e <2020-10-01> patch level 4
 Lua module: luaotfload 2021-01-08 3.17 Lua based OpenType font support
 Lua module: lualibs 2020-12-30 2.73 ConTeXt Lua standard libraries.
@@ -990,7 +990,7 @@ Package: colorprofiles 2018/11/01 v1.0.1 color profiles for PDF/X and PDF/A supp
 File: se-ascii-print.def 2019/11/29 v1.12 stringenc: Printable ASCII characters
 ) (/usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/AdobeColorProfiles.tex)
 ** pdfx: Metadata file Thesis_Jingyi.xmpdata read successfully.
-(./Thesis_Jingyi.xmpdata)<<sRGB.icc>>
+(/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.xmpdata)<<sRGB.icc>>
 Package hyperref Info: Option `unicode' set `true' on input line 2411.
 Package hyperref Info: Option `unicode' set `true' on input line 2412.
  (/usr/local/texlive/2021/texmf-dist/tex/generic/pdftex/glyphtounicode.tex) (/usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/glyphtounicode-cmr.tex) (/usr/local/texlive/2021/texmf-dist/tex/latex/pdfx/glyphtounicode-ntx.tex) (/usr/local/texlive/2021/texmf-dist/tex/latex/base/ifthen.sty
@@ -1415,6 +1415,32 @@ Package: booktabs 2020/01/12 v1.61803398 Publication quality tables
 \@thisruleclass=\count419
 \@lastruleclass=\count420
 \@thisrulewidth=\dimen191
+) (/usr/local/texlive/2021/texmf-dist/tex/latex/float/float.sty
+Package: float 2001/11/08 v1.3d Float enhancements (AL)
+\c@float@type=\count421
+\float@exts=\toks26
+\float@box=\box58
+\@float@everytoks=\toks27
+\@floatcapt=\box59
+) (/usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.sty
+Package: subfigure 2002/03/15 v2.1.5 subfigure package
+\subfigtopskip=\skip91
+\subfigcapskip=\skip92
+\subfigcaptopadj=\dimen192
+\subfigbottomskip=\skip93
+\subfigcapmargin=\dimen193
+\subfiglabelskip=\skip94
+\c@subfigure=\count422
+\c@lofdepth=\count423
+\c@subtable=\count424
+\c@lotdepth=\count425
+
+****************************************
+* Local config file subfigure.cfg used *
+****************************************
+(/usr/local/texlive/2021/texmf-dist/tex/latex/subfigure/subfigure.cfg)
+\subfig@top=\skip95
+\subfig@bottom=\skip96
 ) (/usr/local/texlive/2021/texmf-dist/tex/latex/psnfss/pifont.sty
 Package: pifont 2020/03/25 PSNFSS-v9.3 Pi font support (SPQR) 
 LaTeX Font Info:    Trying to load font information for U+pzd on input line 63.
@@ -1427,49 +1453,49 @@ File: upsy.fd 2001/06/04 font definitions for U/psy.
 ))
 Package csquotes Info: Checking for multilingual support...
 Package csquotes Info: ... found 'babel' package.
- (./Thesis_Jingyi.aux)
+ (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux)
 \openout1 = Thesis_Jingyi.aux
 
-LaTeX Font Info:    Checking defaults for OML/cmm/m/it on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for OMS/cmsy/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for OT1/cmr/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for T1/cmr/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for TS1/cmr/m/n on input line 65.
-LaTeX Font Info:    Trying to load font information for TS1+cmr on input line 65.
+LaTeX Font Info:    Checking defaults for OML/cmm/m/it on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for OMS/cmsy/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for OT1/cmr/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for T1/cmr/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for TS1/cmr/m/n on input line 73.
+LaTeX Font Info:    Trying to load font information for TS1+cmr on input line 73.
  (/usr/local/texlive/2021/texmf-dist/tex/latex/base/ts1cmr.fd
 File: ts1cmr.fd 2019/12/16 v2.5j Standard LaTeX font definitions
 )
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for TU/lmr/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for OMX/cmex/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for U/cmr/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for PD1/pdf/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-LaTeX Font Info:    Checking defaults for PU/pdf/m/n on input line 65.
-LaTeX Font Info:    ... okay on input line 65.
-Package scrbase Info: activating english \contentsname on input line 65.
-Package scrbase Info: activating english \listfigurename on input line 65.
-Package scrbase Info: activating english \listtablename on input line 65.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for TU/lmr/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for OMX/cmex/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for U/cmr/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for PD1/pdf/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+LaTeX Font Info:    Checking defaults for PU/pdf/m/n on input line 73.
+LaTeX Font Info:    ... okay on input line 73.
+Package scrbase Info: activating english \contentsname on input line 73.
+Package scrbase Info: activating english \listfigurename on input line 73.
+Package scrbase Info: activating english \listtablename on input line 73.
  (/usr/local/texlive/2021/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
 [Loading MPS to PDF converter (version 2006.09.02).]
-\scratchcounter=\count421
-\scratchdimen=\dimen192
-\scratchbox=\box58
-\nofMPsegments=\count422
-\nofMParguments=\count423
-\everyMPshowfont=\toks26
-\MPscratchCnt=\count424
-\MPscratchDim=\dimen193
-\MPnumerator=\count425
-\makeMPintoPDFobject=\count426
-\everyMPtoPDFconversion=\toks27
+\scratchcounter=\count426
+\scratchdimen=\dimen194
+\scratchbox=\box60
+\nofMPsegments=\count427
+\nofMParguments=\count428
+\everyMPshowfont=\toks28
+\MPscratchCnt=\count429
+\MPscratchDim=\dimen195
+\MPnumerator=\count430
+\makeMPintoPDFobject=\count431
+\everyMPtoPDFconversion=\toks29
 ) (/usr/local/texlive/2021/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
 Package: epstopdf-base 2020-01-24 v2.11 Base part for package epstopdf
 Package epstopdf-base Info: Redefining graphics rule for `.eps' on input line 485.
@@ -1482,57 +1508,57 @@ Package fontspec Info: Adjusting the maths setup (use [no-math] to avoid
 
 \symlegacymaths=\mathgroup4
 LaTeX Font Info:    Overwriting symbol font `legacymaths' in version `bold'
-(Font)                  OT1/cmr/m/n --> OT1/cmr/bx/n on input line 65.
-LaTeX Font Info:    Redeclaring math accent \acute on input line 65.
-LaTeX Font Info:    Redeclaring math accent \grave on input line 65.
-LaTeX Font Info:    Redeclaring math accent \ddot on input line 65.
-LaTeX Font Info:    Redeclaring math accent \tilde on input line 65.
-LaTeX Font Info:    Redeclaring math accent \bar on input line 65.
-LaTeX Font Info:    Redeclaring math accent \breve on input line 65.
-LaTeX Font Info:    Redeclaring math accent \check on input line 65.
-LaTeX Font Info:    Redeclaring math accent \hat on input line 65.
-LaTeX Font Info:    Redeclaring math accent \dot on input line 65.
-LaTeX Font Info:    Redeclaring math accent \mathring on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \colon on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Gamma on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Delta on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Theta on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Lambda on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Xi on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Pi on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Sigma on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Upsilon on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Phi on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Psi on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \Omega on input line 65.
-LaTeX Font Info:    Redeclaring math symbol \mathdollar on input line 65.
-LaTeX Font Info:    Redeclaring symbol font `operators' on input line 65.
+(Font)                  OT1/cmr/m/n --> OT1/cmr/bx/n on input line 73.
+LaTeX Font Info:    Redeclaring math accent \acute on input line 73.
+LaTeX Font Info:    Redeclaring math accent \grave on input line 73.
+LaTeX Font Info:    Redeclaring math accent \ddot on input line 73.
+LaTeX Font Info:    Redeclaring math accent \tilde on input line 73.
+LaTeX Font Info:    Redeclaring math accent \bar on input line 73.
+LaTeX Font Info:    Redeclaring math accent \breve on input line 73.
+LaTeX Font Info:    Redeclaring math accent \check on input line 73.
+LaTeX Font Info:    Redeclaring math accent \hat on input line 73.
+LaTeX Font Info:    Redeclaring math accent \dot on input line 73.
+LaTeX Font Info:    Redeclaring math accent \mathring on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \colon on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Gamma on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Delta on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Theta on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Lambda on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Xi on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Pi on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Sigma on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Upsilon on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Phi on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Psi on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \Omega on input line 73.
+LaTeX Font Info:    Redeclaring math symbol \mathdollar on input line 73.
+LaTeX Font Info:    Redeclaring symbol font `operators' on input line 73.
 LaTeX Font Info:    Encoding `OT1' has changed to `TU' for symbol font
-(Font)              `operators' in the math version `normal' on input line 65.
+(Font)              `operators' in the math version `normal' on input line 73.
 LaTeX Font Info:    Overwriting symbol font `operators' in version `normal'
-(Font)                  OT1/cmr/m/n --> TU/XCharter(0)/m/n on input line 65.
+(Font)                  OT1/cmr/m/n --> TU/XCharter(0)/m/n on input line 73.
 LaTeX Font Info:    Encoding `OT1' has changed to `TU' for symbol font
-(Font)              `operators' in the math version `bold' on input line 65.
+(Font)              `operators' in the math version `bold' on input line 73.
 LaTeX Font Info:    Overwriting symbol font `operators' in version `bold'
-(Font)                  OT1/cmr/bx/n --> TU/XCharter(0)/m/n on input line 65.
+(Font)                  OT1/cmr/bx/n --> TU/XCharter(0)/m/n on input line 73.
 LaTeX Font Info:    Overwriting symbol font `operators' in version `normal'
-(Font)                  TU/XCharter(0)/m/n --> TU/XCharter(0)/m/n on input line 65.
+(Font)                  TU/XCharter(0)/m/n --> TU/XCharter(0)/m/n on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathit' in version `normal'
-(Font)                  OT1/cmr/m/it --> TU/XCharter(0)/m/it on input line 65.
+(Font)                  OT1/cmr/m/it --> TU/XCharter(0)/m/it on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathbf' in version `normal'
-(Font)                  OT1/cmr/bx/n --> TU/XCharter(0)/b/n on input line 65.
+(Font)                  OT1/cmr/bx/n --> TU/XCharter(0)/b/n on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathsf' in version `normal'
-(Font)                  OT1/cmss/m/n --> TU/Roboto(0)/m/n on input line 65.
+(Font)                  OT1/cmss/m/n --> TU/Roboto(0)/m/n on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathtt' in version `normal'
-(Font)                  OT1/cmtt/m/n --> TU/RobotoMono(0)/m/n on input line 65.
+(Font)                  OT1/cmtt/m/n --> TU/RobotoMono(0)/m/n on input line 73.
 LaTeX Font Info:    Overwriting symbol font `operators' in version `bold'
-(Font)                  TU/XCharter(0)/m/n --> TU/XCharter(0)/b/n on input line 65.
+(Font)                  TU/XCharter(0)/m/n --> TU/XCharter(0)/b/n on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathit' in version `bold'
-(Font)                  OT1/cmr/bx/it --> TU/XCharter(0)/b/it on input line 65.
+(Font)                  OT1/cmr/bx/it --> TU/XCharter(0)/b/it on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathsf' in version `bold'
-(Font)                  OT1/cmss/bx/n --> TU/Roboto(0)/b/n on input line 65.
+(Font)                  OT1/cmss/bx/n --> TU/Roboto(0)/b/n on input line 73.
 LaTeX Font Info:    Overwriting math alphabet `\mathtt' in version `bold'
-(Font)                  OT1/cmtt/m/n --> TU/RobotoMono(0)/b/n on input line 65.
+(Font)                  OT1/cmtt/m/n --> TU/RobotoMono(0)/b/n on input line 73.
 
 *geometry* driver: auto-detecting
 *geometry* detected driver: luatex
@@ -1568,26 +1594,26 @@ LaTeX Font Info:    Overwriting math alphabet `\mathtt' in version `bold'
 * \@reversemarginfalse
 * (1in=72.27pt=25.4mm, 1cm=28.453pt)
 
-\ptxcd_headrule_box=\box59
-Package scrbase Info: activating english \ptxcd_byname on input line 65.
-Package scrbase Info: activating english \ptxcd_fromname on input line 65.
-Package scrbase Info: activating english \ptxcd_departmentprefix on input line 65.
-Package scrbase Info: activating english \ptxcd_reviewname on input line 65.
-Package scrbase Info: activating english \ptxcd_examdatename on input line 65.
-Package scrbase Info: activating english \ptxcd_submissiondatename on input line 65.
-Package scrbase Info: activating english \ptxcd_studentIDname on input line 65.
-Package scrbase Info: activating english \ptxcd_thesisType on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_attr_by: on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_attr_nc: on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_attr_sa: on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_attr_nd: on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_intro: on input line 65.
-Package scrbase Info: activating english \g__ptxcd_cc_sep: on input line 65.
-Package scrbase Info: activating english \researchgroupname on input line 65.
-Package scrbase Info: activating english \institutename on input line 65.
-Package scrbase Info: activating english \authorandname on input line 65.
-Package scrbase Info: activating english \ptxcd_datename on input line 65.
-Package hyperref Info: Link coloring OFF on input line 65.
+\ptxcd_headrule_box=\box61
+Package scrbase Info: activating english \ptxcd_byname on input line 73.
+Package scrbase Info: activating english \ptxcd_fromname on input line 73.
+Package scrbase Info: activating english \ptxcd_departmentprefix on input line 73.
+Package scrbase Info: activating english \ptxcd_reviewname on input line 73.
+Package scrbase Info: activating english \ptxcd_examdatename on input line 73.
+Package scrbase Info: activating english \ptxcd_submissiondatename on input line 73.
+Package scrbase Info: activating english \ptxcd_studentIDname on input line 73.
+Package scrbase Info: activating english \ptxcd_thesisType on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_attr_by: on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_attr_nc: on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_attr_sa: on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_attr_nd: on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_intro: on input line 73.
+Package scrbase Info: activating english \g__ptxcd_cc_sep: on input line 73.
+Package scrbase Info: activating english \researchgroupname on input line 73.
+Package scrbase Info: activating english \institutename on input line 73.
+Package scrbase Info: activating english \authorandname on input line 73.
+Package scrbase Info: activating english \ptxcd_datename on input line 73.
+Package hyperref Info: Link coloring OFF on input line 73.
 (/usr/local/texlive/2021/texmf-dist/tex/latex/hyperref/nameref.sty
 Package: nameref 2021-04-02 v2.47 Cross-referencing by name of section
  (/usr/local/texlive/2021/texmf-dist/tex/latex/refcount/refcount.sty
@@ -1595,12 +1621,12 @@ Package: refcount 2019/12/15 v3.6 Data extraction from label references (HO)
 ) (/usr/local/texlive/2021/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
 Package: gettitlestring 2019/12/15 v1.6 Cleanup title references (HO)
 )
-\c@section@level=\count427
+\c@section@level=\count432
 )
-LaTeX Info: Redefining \ref on input line 65.
-LaTeX Info: Redefining \pageref on input line 65.
-LaTeX Info: Redefining \nameref on input line 65.
- (./Thesis_Jingyi.out) (./Thesis_Jingyi.out)
+LaTeX Info: Redefining \ref on input line 73.
+LaTeX Info: Redefining \pageref on input line 73.
+LaTeX Info: Redefining \nameref on input line 73.
+ (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.out)
 \@outlinefile=\write5
 
 \openout5 = Thesis_Jingyi.out
@@ -1611,21 +1637,21 @@ Class scrreprt Info: loading recommended package `bookmark'.
 (scrreprt)           `bookmarkpackage=false' before \begin{document} and
 (scrreprt)           you can avoid this message adding:
 (scrreprt)             \usepackage{bookmark}
-(scrreprt)           before \begin{document} on input line 65.
+(scrreprt)           before \begin{document} on input line 73.
  (/usr/local/texlive/2021/texmf-dist/tex/latex/bookmark/bookmark.sty
 Package: bookmark 2020-11-06 v1.29 PDF bookmarks (HO)
  (/usr/local/texlive/2021/texmf-dist/tex/latex/bookmark/bkm-pdftex.def
 File: bkm-pdftex.def 2020-11-06 v1.29 bookmark driver for pdfTeX (HO)
-\BKM@id=\count428
+\BKM@id=\count433
 ))
-LaTeX Info: Redefining \microtypecontext on input line 65.
+LaTeX Info: Redefining \microtypecontext on input line 73.
 Package microtype Info: Generating PDF output.
 Package microtype Info: Character protrusion enabled (level 2).
 Package microtype Info: Using default protrusion set `alltext'.
 Package microtype Info: Automatic font expansion enabled (level 2),
 (microtype)             stretch: 20, shrink: 20, step: 1, non-selected.
 Package microtype Info: Using default expansion set `alltext-nott'.
-LaTeX Info: Redefining \showhyphens on input line 65.
+LaTeX Info: Redefining \showhyphens on input line 73.
 Package microtype Info: No adjustment of tracking.
 Package microtype Info: No adjustment of spacing.
 Package microtype Info: No adjustment of kerning.
@@ -1668,15 +1694,20 @@ Package: blx-case-expl3 2020/12/31 v3.16 expl3 case changing code for biblatex
 
 Package biblatex Info: Trying to load bibliographic data...
 Package biblatex Info: ... file 'Thesis_Jingyi.bbl' found.
- (./Thesis_Jingyi.bbl
+ (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.bbl
 
 Package biblatex Warning: Biber reported the following issues
 (biblatex)                with 'Sousa:2017tn':
 (biblatex)                - Entry 'Sousa:2017tn' (BibTexDatei.bib): Invalid format '2017/10/18' of date field 'date' - ignoring.
 
+
+Package biblatex Warning: Biber reported the following issues
+(biblatex)                with 'Stotko:2019ud':
+(biblatex)                - Entry 'Stotko:2019ud' (BibTexDatei.bib): Invalid format '2019/11/01' of date field 'date' - ignoring.
+
 )
-Package biblatex Info: Reference section=0 on input line 65.
-Package biblatex Info: Reference segment=0 on input line 65.
+Package biblatex Info: Reference section=0 on input line 73.
+Package biblatex Info: Reference segment=0 on input line 73.
 
 Package tudapub/thesis Info: You submitted birthplace data for title
 (tudapub/thesis)             information.
@@ -1689,41 +1720,41 @@ Package tudapub/thesis Info: You submitted examdate data for title
 (tudapub/thesis)             This field is only used for type=dr/drfinal.
 (tudapub/thesis)             It will be ignored.
 
-Package scrbase Info: activating english \ptxcd_department on input line 88.
-Package scrbase Info: activating english \departmentname on input line 88.
-Package scrbase Info: activating english \ptxcd_departmentprefix on input line 88.
-Package scrbase Info: activating english \departmentfullname on input line 88.
+Package scrbase Info: activating english \ptxcd_department on input line 96.
+Package scrbase Info: activating english \departmentname on input line 96.
+Package scrbase Info: activating english \ptxcd_departmentprefix on input line 96.
+Package scrbase Info: activating english \departmentfullname on input line 96.
 \ptxcd_xmpdata_stream=\write6
 
 \openout6 = Thesis_Jingyi.xmpdata
 LaTeX Font Info:    Font shape `TU/Roboto(0)/m/n' will be
-(Font)              scaled to size 12.0pt on input line 88.
+(Font)              scaled to size 12.0pt on input line 96.
 Package microtype Info: Loading generic protrusion settings for font family
 (microtype)             `Roboto' (encoding: TU).
 (microtype)             For optimal results, create family-specific settings.
 (microtype)             See the microtype manual for details.
 LaTeX Font Info:    Font shape `TU/Roboto(0)/b/n' will be
-(Font)              scaled to size 12.0pt on input line 88.
+(Font)              scaled to size 12.0pt on input line 96.
 LaTeX Font Info:    Font shape `TU/Roboto(0)/b/n' will be
-(Font)              scaled to size 36.0pt on input line 88.
+(Font)              scaled to size 36.0pt on input line 96.
 
 warning  (file tuda_logo.pdf) (pdf inclusion): PDF inclusion: found PDF version '1.5', but at most version '1.4' allowed
 <tuda_logo.pdf, id=6, 156.48964pt x 62.59586pt>
 File: tuda_logo.pdf Graphic file (type pdf)
 <use tuda_logo.pdf>
-Package luatex.def Info: tuda_logo.pdf  used on input line 88.
+Package luatex.def Info: tuda_logo.pdf  used on input line 96.
 (luatex.def)             Requested size: 156.49402pt x 62.59605pt.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <12> on input line 88.
+(Font)              <12> on input line 96.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <8> on input line 88.
+(Font)              <8> on input line 96.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <6> on input line 88.
+(Font)              <6> on input line 96.
 (/usr/local/texlive/2021/texmf-dist/tex/latex/microtype/mt-cmr.cfg
 File: mt-cmr.cfg 2013/05/19 v2.2 microtype config. file: Computer Modern Roman (RS)
 )
 LaTeX Font Info:    Font shape `TU/Roboto(0)/m/n' will be
-(Font)              scaled to size 10.0pt on input line 88.
+(Font)              scaled to size 10.0pt on input line 96.
  [1
 
 
@@ -1733,15 +1764,15 @@ LaTeX Font Info:    Font shape `TU/Roboto(0)/m/n' will be
 
 
 
-<./tuda_logo.pdf>]
+</Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/tuda_logo.pdf>]
 LaTeX Font Info:    Font shape `TU/Roboto(0)/b/n' will be
-(Font)              scaled to size 17.28pt on input line 89.
+(Font)              scaled to size 17.28pt on input line 97.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <17.28> on input line 89.
+(Font)              <17.28> on input line 97.
 LaTeX Font Info:    Font shape `TU/Roboto(0)/m/n' will be
-(Font)              scaled to size 10.95pt on input line 89.
+(Font)              scaled to size 10.95pt on input line 97.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <10.95> on input line 89.
+(Font)              <10.95> on input line 97.
  [2
 
 
@@ -1751,66 +1782,95 @@ LaTeX Font Info:    External font `cmex10' loaded for size
 
 ]
 LaTeX Font Info:    Font shape `TU/Roboto(0)/b/n' will be
-(Font)              scaled to size 24.88pt on input line 89.
-Package tocbasic Info: character protrusion at toc deactivated on input line 89.
- (./Thesis_Jingyi.toc)
+(Font)              scaled to size 24.88pt on input line 97.
+Package tocbasic Info: character protrusion at toc deactivated on input line 97.
+ (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.toc)
 \tf@toc=\write7
 
 \openout7 = Thesis_Jingyi.toc
- (./chapters/abstract.tex [3
+ (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/abstract.tex [3
 
 
 ]
 chapter 1.
-) (./chapters/introduction.tex [4
+
+Class scrreprt Warning: \float@addtolists detected!
+(scrreprt)              Implementation of \float@addtolist became
+(scrreprt)              deprecated in KOMA-Script v3.01 2008/11/14 and
+(scrreprt)              has been replaced by several more flexible
+(scrreprt)              features of package `tocbasic`.
+(scrreprt)              Since Version 3.12 support for deprecated
+(scrreprt)              \float@addtolist interface has been
+(scrreprt)              restricted to only some of the KOMA-Script
+(scrreprt)              features and been removed from others.
+(scrreprt)              Loading of package `scrhack' may help to
+(scrreprt)              avoid this warning, if you are using a
+(scrreprt)              a package that still implements the
+(scrreprt)              deprecated \float@addtolist interface.
+
+) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/introduction.tex [4
 
 ]
 chapter 2.
 [5
 
-]) (./chapters/implementation.tex [6]
+]) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/related_work.tex [6]
 chapter 3.
+) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/implementation.tex [7
+
+Missing character: There is no , (U+FF0C) in font [XCharter-Roman.otf]:mode=node;script=latn;language=dflt;+tlig;!
+] [8]
+chapter 4.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <7> on input line 15.
+(Font)              <7> on input line 14.
 LaTeX Font Info:    External font `cmex10' loaded for size
-(Font)              <5> on input line 15.
-[7
+(Font)              <5> on input line 14.
+[9
 
-] [8]
-<graphics/uml.png, id=141, 1237.62375pt x 1149.29375pt>
+] [10]
+<graphics/uml.png, id=162, 1237.62375pt x 1149.29375pt>
 File: graphics/uml.png Graphic file (type png)
 <use graphics/uml.png>
-Package luatex.def Info: graphics/uml.png  used on input line 35.
-(luatex.def)             Requested size: 367.66415pt x 341.43306pt.
+Package luatex.def Info: graphics/uml.png  used on input line 34.
+(luatex.def)             Requested size: 512.14963pt x 475.61522pt.
 LaTeX Font Info:    Font shape `TU/Roboto(0)/b/n' will be
-(Font)              scaled to size 14.4pt on input line 40.
-
-Underfull \hbox (badness 10000) in paragraph at lines 47--47
+(Font)              scaled to size 14.4pt on input line 39.
+<graphics/htc.png, id=164, 952.55875pt x 847.165pt>
+File: graphics/htc.png Graphic file (type png)
+<use graphics/htc.png>
+Package luatex.def Info: graphics/htc.png  used on input line 44.
+(luatex.def)             Requested size: 319.92674pt x 284.52756pt.
+ [11] [12
+warning  (pdf backend): ignoring duplicate destination with the name 'figure.4.1'
+</Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/uml.png>] [13
+warning  (pdf backend): ignoring duplicate destination with the name 'figure.4.2'
+</Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/graphics/htc.png>]
+Underfull \hbox (badness 10000) in paragraph at lines 63--63
   [][][]\TU/XCharter(0)/m/n/10 https://as-set-store.unity.com/pack-ages/3d/en-vi-ron-ments/ur-ban/polygon-apocalypse-low-poly-3d-art-by-synty-
  []
 
-[9<./graphics/uml.png>]) (./chapters/evaluate.tex [10]
-chapter 4.
-[11
-
-]) (./chapters/result.tex [12]
+) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/evaluate.tex [14]
 chapter 5.
-) (./chapters/conclusion.tex [13
+[15
 
-]
+]) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/result.tex [16]
 chapter 6.
-) [14
+) (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/chapters/conclusion.tex [17
+
+]
+chapter 7.
+) [18
 
 ]
 LaTeX Font Info:    Font shape `TU/RobotoMono(0)/m/n' will be
-(Font)              scaled to size 12.0pt on input line 101.
+(Font)              scaled to size 12.0pt on input line 109.
 Package microtype Info: Loading generic protrusion settings for font family
 (microtype)             `RobotoMono' (encoding: TU).
 (microtype)             For optimal results, create family-specific settings.
 (microtype)             See the microtype manual for details.
- [15
+ [19
 
-]
+] [20]
 
 Package tudapub Warning: You did not provide a Label for key introduction.
 (tudapub)                Either you need to recompile your document or add a
@@ -1831,7 +1891,7 @@ Package tudapub Warning: You did not provide a Label for key discussion.
 (tudapub)                Either you need to recompile your document or add a
 (tudapub)                label using \IMRADlabel.
 
-[16] (./Thesis_Jingyi.aux)
+[21] (/Users/elaine/Desktop/Thesis-Hector-VR/Thesis_LaTeX/Thesis_Jingyi.aux)
 Package rerunfilecheck Info: File `Thesis_Jingyi.out' has not changed.
 (rerunfilecheck)             Checksum: D41D8CD98F00B204E9800998ECF8427E;0.
 Package logreq Info: Writing requests to 'Thesis_Jingyi.run.xml'.
@@ -1840,18 +1900,18 @@ Package logreq Info: Writing requests to 'Thesis_Jingyi.run.xml'.
 )
 
 Here is how much of LuaTeX's memory you used:
- 33615 strings out of 478531
+ 33890 strings out of 478531
  100000,1977958 words of node,token memory allocated
- 5596 words of node memory still in use:
-   77 hlist, 8 vlist, 13 rule, 25 disc, 19 local_par, 6 math, 142 glue, 27 kern, 31 penalty, 4 margin_kern, 269 glyph, 187 attribute, 70 glue_spec, 103 attribute_list, 1 write, 7 pdf_action, 28 pdf_colorstack, 1 pdf_setmatrix, 1 pdf_save, 1 pdf_restore nodes
-   avail lists: 1:2,2:1395,3:425,4:108,5:982,6:124,7:7509,8:76,9:564,10:25,11:606
- 52308 multiletter control sequences out of 65536+600000
+ 5665 words of node memory still in use:
+   77 hlist, 8 vlist, 13 rule, 25 disc, 19 local_par, 6 math, 143 glue, 27 kern, 31 penalty, 4 margin_kern, 269 glyph, 195 attribute, 74 glue_spec, 107 attribute_list, 1 write, 10 pdf_action, 28 pdf_colorstack, 1 pdf_setmatrix, 1 pdf_save, 1 pdf_restore nodes
+   avail lists: 1:2,2:1548,3:525,4:113,5:1040,6:130,7:8206,8:88,9:574,10:22,11:606
+ 52529 multiletter control sequences out of 65536+600000
  128 fonts using 19470527 bytes
- 125i,19n,148p,10708b,2657s stack positions out of 5000i,500n,10000p,200000b,80000s
-</usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/RobotoMono-Regular.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/public/xcharter/XCharter-Italic.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/public/xcharter/XCharter-Roman.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/Roboto-Bold.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/Roboto-Regular.otf>
-Output written on Thesis_Jingyi.pdf (16 pages, 765626 bytes).
+ 125i,19n,148p,10711b,2657s stack positions out of 5000i,500n,10000p,200000b,80000s
+</usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/RobotoMono-Regular.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/public/xcharter/XCharter-Italic.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/public/xcharter/XCharter-Roman.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/Roboto-Bold.otf></usr/local/texlive/2021/texmf-dist/fonts/opentype/google/roboto/Roboto-Regular.otf></usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb></usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmr12.pfb></usr/local/texlive/2021/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb>
+Output written on Thesis_Jingyi.pdf (21 pages, 951132 bytes).
 
-PDF statistics: 307 PDF objects out of 1000 (max. 8388607)
- 65 named destinations out of 1000 (max. 131072)
- 168 words of extra memory for PDF output out of 10000 (max. 100000000)
+PDF statistics: 379 PDF objects out of 1000 (max. 8388607)
+ 78 named destinations out of 1000 (max. 131072)
+ 176 words of extra memory for PDF output out of 10000 (max. 100000000)
 

BIN
Thesis_LaTeX/Thesis_Jingyi.pdf


BIN
Thesis_LaTeX/Thesis_Jingyi.synctex.gz


+ 10 - 2
Thesis_LaTeX/Thesis_Jingyi.tex

@@ -16,6 +16,8 @@
 
 
 
+
+
 % Der folgende Block ist nur bei pdfTeX auf Versionen vor April 2018 notwendig
 \usepackage{iftex}
 \ifPDFTeX
@@ -32,7 +34,6 @@
 %%%%%%%%%%%%%%%%%%%
 %Literaturverzeichnis
 %%%%%%%%%%%%%%%%%%%
-%\usepackage{biblatex}   % Literaturverzeichnis
 \usepackage[backend=biber,style=numeric,sorting=none]{biblatex}
 
 \addbibresource{BibTexDatei.bib}
@@ -53,6 +54,13 @@
 %\usepackage{amssymb}   % erweiterter Zeichensatz
 %\usepackage{siunitx}   % Einheiten
 
+%%%%%%%%%%%%%%%%%%%
+%Paket Graphic
+%%%%%%%%%%%%%%%%%%%
+\usepackage{graphicx} 
+\usepackage{float} 
+\usepackage{subfigure} 
+
 %Formatierungen für Beispiele in diesem Dokument. Im Allgemeinen nicht notwendig!
 \let\file\texttt
 \let\code\texttt
@@ -90,7 +98,7 @@
 
 \input{chapters/abstract.tex}
 \input{chapters/introduction.tex}
-%\input{chapters/related_work.tex}
+\input{chapters/related_work.tex}
 \input{chapters/implementation.tex}
 \input{chapters/evaluate.tex}
 \input{chapters/result.tex}

+ 19 - 17
Thesis_LaTeX/Thesis_Jingyi.toc

@@ -7,38 +7,40 @@
 \defcounter {refsection}{0}\relax 
 \contentsline {chapter}{\numberline {2}Introduction}{5}{chapter.2}%
 \defcounter {refsection}{0}\relax 
-\contentsline {chapter}{\numberline {3}Implementation}{7}{chapter.3}%
+\contentsline {chapter}{\numberline {3}Related Work}{7}{chapter.3}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {3.1}Overview}{7}{section.3.1}%
+\contentsline {chapter}{\numberline {4}Implementation}{9}{chapter.4}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {3.2}System Architecture}{7}{section.3.2}%
+\contentsline {section}{\numberline {4.1}Overview}{9}{section.4.1}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {3.3}Robot}{8}{section.3.3}%
+\contentsline {section}{\numberline {4.2}System Architecture}{9}{section.4.2}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {3.4}Interaction techniques}{8}{section.3.4}%
+\contentsline {section}{\numberline {4.3}Robot}{10}{section.4.3}%
 \defcounter {refsection}{0}\relax 
-\contentsline {subsection}{\numberline {3.4.1}Handle Mode}{10}{subsection.3.4.1}%
+\contentsline {section}{\numberline {4.4}Interaction techniques}{10}{section.4.4}%
 \defcounter {refsection}{0}\relax 
-\contentsline {subsection}{\numberline {3.4.2}Lab Mode}{10}{subsection.3.4.2}%
+\contentsline {subsection}{\numberline {4.4.1}Handle Mode}{11}{subsection.4.4.1}%
 \defcounter {refsection}{0}\relax 
-\contentsline {subsection}{\numberline {3.4.3}Remote Mode}{10}{subsection.3.4.3}%
+\contentsline {subsection}{\numberline {4.4.2}Lab Mode}{11}{subsection.4.4.2}%
 \defcounter {refsection}{0}\relax 
-\contentsline {subsection}{\numberline {3.4.4}UI Mode}{10}{subsection.3.4.4}%
+\contentsline {subsection}{\numberline {4.4.3}Remote Mode}{13}{subsection.4.4.3}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {3.5}Test Scene}{10}{section.3.5}%
+\contentsline {subsection}{\numberline {4.4.4}UI Mode}{13}{subsection.4.4.4}%
 \defcounter {refsection}{0}\relax 
-\contentsline {chapter}{\numberline {4}Evaluation of User Experience}{11}{chapter.4}%
+\contentsline {section}{\numberline {4.5}Test Scene}{14}{section.4.5}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {4.1}Study Design}{11}{section.4.1}%
+\contentsline {chapter}{\numberline {5}Evaluation of User Experience}{15}{chapter.5}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {4.2}Procedure}{11}{section.4.2}%
+\contentsline {section}{\numberline {5.1}Study Design}{15}{section.5.1}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {4.3}Entering the world of VR}{11}{section.4.3}%
+\contentsline {section}{\numberline {5.2}Procedure}{15}{section.5.2}%
 \defcounter {refsection}{0}\relax 
-\contentsline {section}{\numberline {4.4}Practice and evaluation of modes}{12}{section.4.4}%
+\contentsline {section}{\numberline {5.3}Entering the world of VR}{15}{section.5.3}%
 \defcounter {refsection}{0}\relax 
-\contentsline {chapter}{\numberline {5}Results}{13}{chapter.5}%
+\contentsline {section}{\numberline {5.4}Practice and evaluation of modes}{16}{section.5.4}%
 \defcounter {refsection}{0}\relax 
-\contentsline {chapter}{\numberline {6}Conclusion and future work}{14}{chapter.6}%
+\contentsline {chapter}{\numberline {6}Results}{17}{chapter.6}%
+\defcounter {refsection}{0}\relax 
+\contentsline {chapter}{\numberline {7}Conclusion and future work}{18}{chapter.7}%
 \defcounter {refsection}{0}\relax 
 \providecommand \tocbasic@end@toc@file {}\tocbasic@end@toc@file 

+ 9 - 5
Thesis_LaTeX/chapters/evaluate.tex

@@ -2,13 +2,12 @@
 \label{evaluate}
 
 
-This chapter describes the design and detailed process of the user evaluation. The purpose of this user study is to measure the impact of four different modes of operation on rescue efficiency, robot driving performance, and psychological and physiological stress and fatigue, etc. For this purpose, participants are asked to find victims in a test scene using different modes of operation and to answer a questionnaire after the test corresponding to each mode of operation.
-
+This chapter describes the design and detailed procedure of the user evaluation. The purpose of this user study is to measure the impact of the four operation modes on rescue efficiency, robot driving performance, and psychological and physiological stress and fatigue. To this end, participants are asked to find victims in a test scene with each operation mode and to answer a questionnaire after the test of each mode.
 
 
 \section{Study Design}
 
-The evaluation for each mode of operation consists of two main parts. The first part is the data recorded during the process of the participant driving the robot in the VR environment to find the victims. The recorded data includes information about the robot's collision and the speed of driving etc. The rescue of the victims was also considered as part of the evaluation. Besides the number of victims rescued, the number of victims who were already visible but neglected is also important data. The Official NASA Task Load Index (TLX) was used to measure the participants subjective workload asessments. Additionally, participants were asked specific questions for each mode of operation and were asked to select their favorite and least favorite modes of operation.
+The evaluation for each mode of operation consists of two main parts. The first part is the data recorded while the participant drives the robot through the VR environment to find the victims, including the robot's collisions, driving speed, and similar measures; the rescue of the victims was also considered part of the evaluation. The second part is the questionnaires: the official NASA Task Load Index (TLX) was used to measure the participants' subjective workload assessments, and participants were additionally asked specific questions for each mode and asked to select their favorite and least favorite operation mode. To reduce the influence of order effects on the test results, a balanced Latin square was used to arrange the test order of the four operation modes.
 
 
 
@@ -18,11 +17,16 @@ Before the beginning of the actual testing process, participants were informed o
 
 
 \section{Entering the world of VR}
-After the basic introduction part, participants would directly put on the VR headset and enter the VR environment to complete the rest of the tutorial. Considering that participants might not have experience with VR and that it would take time to learn how to operate the four different modes, the proposed system additionally sets up a practice pattern and places some models of victims in the practice scene. After entering the VR world, participants first needed to familiarize themselves with the opening and selecting options of the menu, as this involves switching between different modes and entering the test scenes. Then participants would use the motion controllers to try to teleport themselves, or raise themselves into mid-air. Finally participants were asked to interact with the victim model through virtual hands. After this series of general tutorials, participants were already generally familiar with the use of VR and how to move around in the VR world.
+After the basic introduction, participants would directly put on the VR headset and enter the VR environment to complete the rest of the tutorial. Considering that participants might not have experience with VR and that it would take time to learn the four different modes, the proposed system additionally provides a practice mode and places some victim models in the practice scene. After entering the VR world, participants first needed to familiarize themselves with opening and closing the menu, as well as with using the motion controllers to teleport themselves or raise themselves into mid-air. Finally, participants were asked to interact with a victim model through the virtual hands. After this series of general tutorials, participants were generally familiar with the use of VR and with moving around in the VR world.
 
 
 
 \section{Practice and evaluation of modes}
-Given the different manipulation approaches for each mode, in order to avoid confusion between the different modes, participants would then take turns practicing and directly evaluating each mode immediately afterwards. The participant first switched to the mode of operation to be tested and manipulated the robot to move in that mode. After attempting to rescue 1-2 victim models and the participant indicated that he or she was familiar enough with this operation mode, the participant would enter the test scene. In the test scene, participants had to save as many victims as possible in a given time limit. Participants were required to move the robot around the test scene to explore the post-disaster city and to find and rescue victims. In this process, if the robot crashes with buildings, obstacles, etc., besides the collision information being recorded as test data, participants would also receive sound and vibration feedback. The test will automatically end when time runs out or when all the victims in the scene have been rescued. Participants were required to complete the evaluation questionnaire and the NASA evaluation form at the end of each test. This process was repeated in each mode of operation. 
+Given the different manipulation approach of each mode, and in order to avoid confusion between the modes, participants took turns practicing each mode and evaluating it immediately afterwards.
+
+The sequence of modes to be tested is predetermined. The order effect is an important factor affecting the test results: if the order of the operation modes were the same for each participant, the psychological and physical exhaustion caused by the last mode would inevitably be greatest. To minimize the influence of the order effect on the results, a balanced Latin square of size four was used to arrange the test order of the four operation modes.
+
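For four conditions, such a square can be written down directly. The following is a minimal sketch in LaTeX, with the letters A to D standing for the four operation modes (the assignment of letters to modes is an assumption):

```latex
% Balanced Latin square of size four: each mode appears once in every
% position, and each mode immediately precedes every other mode exactly
% once across the four participant groups.
\begin{table}[htbp]
  \centering
  \begin{tabular}{l|cccc}
    Group & 1st & 2nd & 3rd & 4th \\ \hline
    1 & A & B & D & C \\
    2 & B & C & A & D \\
    3 & C & D & B & A \\
    4 & D & A & C & B \\
  \end{tabular}
  \caption{Balanced Latin square used to counterbalance the mode order.}
\end{table}
```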
+Participants automatically entered the practice scene of the relevant operation mode in the predefined order. After attempting to rescue one or two victim models, and once the participant indicated that he or she was sufficiently familiar with the operation mode, the participant entered the test scene. There, participants had to save as many victims as possible within a given time limit. They were required to move the robot around the test scene to explore the post-disaster city and to find and rescue victims. During this process, if the robot crashed into buildings, obstacles, etc., the collision information was recorded as test data and the participant also received sound and vibration feedback. The test ended automatically when time ran out or when all the victims in the scene had been rescued. Participants were required to complete the evaluation questionnaire and the NASA-TLX form at the end of each test. This process was repeated for each operation mode.
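A minimal sketch of how this collision feedback could be wired up in Unity with the SteamVR plugin follows; the haptic action, pulse parameters, and log format are assumptions rather than details taken from the thesis:

```csharp
using UnityEngine;
using Valve.VR;

// Attached to the robot body: on impact, log the event for the test data
// and give the participant sound and vibration feedback.
public class CollisionFeedback : MonoBehaviour
{
    public AudioSource crashSound;            // assumed audio clip on the robot
    public SteamVR_Action_Vibration haptics;  // assumed SteamVR haptic action

    private void OnCollisionEnter(Collision collision)
    {
        // Recorded collision information (object hit and time) for the test data.
        Debug.Log($"Collision with {collision.gameObject.name} at t={Time.time:F2}s");
        crashSound.Play();
        // 0.1 s pulse at 75 Hz, full amplitude, on the right-hand controller.
        haptics.Execute(0f, 0.1f, 75f, 1f, SteamVR_Input_Sources.RightHand);
    }
}
```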
 
 After all the tests were completed, participants were asked to compare the four operation modes and to select the one they liked most and the one they liked least. In addition, participants could give reasons for their choice and were free to express further opinions, such as suggestions for improvement or problems encountered during operation.
+

+ 23 - 7
Thesis_LaTeX/chapters/implementation.tex

@@ -6,16 +6,15 @@ In this chapter, the tools and techniques used in building this human-computer c
 
 
 \section{Overview}
-The main goal of this work is to design and implement a VR-based human-robot collaboration system with different methods of operating the robot in order to find out which method of operation is more suitable to be used to control the rescue robot. Further, it is to provide some basic insights for future development directions and to provide a general direction for finding an intuitive, easy-to-use and efficient operation method. Therefore, the proposed system was developed using Unity, including four modes of operation and a corresponding test environment for simulating post-disaster scenarios. In each operation mode, the user has a different method to control the robot. In addition, in order to better simulate the process by which the robot scans its surroundings and the computer side cumulatively gets a reconstructed 3D virtual scene, the test environment was implemented in such a way that the picture seen by the user depends on the direction of the robot's movement and the trajectory it travels through.
-
+The main goal of this work is to design and implement a VR-based human-robot collaboration system with different methods of operating the robot, in order to find out which method is more suitable for controlling a rescue robot. A further aim is to provide basic insights for future development and a general direction for finding an intuitive, easy-to-use, and efficient operation method. The proposed system was therefore developed in Unity and includes four operation modes and a corresponding test environment simulating a post-disaster scenario. In each operation mode, the user has a different method of controlling the robot. In addition, to better simulate the process by which the robot scans its surroundings while the computer side cumulatively builds a reconstructed 3D virtual scene, the test environment was implemented so that the picture seen by the user depends on the robot's movement and the trajectory it has traveled.
 
 \section{System Architecture}
 The proposed system runs on a computer with the Windows 10 operating system. This computer has been equipped with an Intel Core i7-8700K CPU, 32 GB RAM as well as a NVIDIA GTX 1080 GPU with 8 GB VRAM. HTC Vive is used as a VR device. It has a resolution of 1080 × 1200 per eye, resulting in a total resolution of 2160 × 1200 pixels, a refresh rate of 90 Hz, and a field of view of 110 degrees. It includes two motion controllers and uses two Lighthouses to track the position of the headset as well as the motion controllers.
 
-Unity was chosen as the platform to develop the system. Unity is a widely used game engine with a Steam VR plugin \footnote{https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647}, which allows developers to focus on the VR environment and interactive behaviors in programming, rather than specific controller buttons and headset positioning, making VR development much simpler. Another reason why Unity was chosen as a development platform was the potential for collaboration with the Robot Operating System (ROS), a frequently used operating system for robot simulation and manipulation, which is flexible, low-coupling, distributed, open source, and has a powerful and rich third-party feature set. In terms of collaboration between Unity and ROS, Siemens provides open source software libraries and tools in C\# for communicating with ROS from .NET applications \footnote{https://github.com/siemens/ros-sharp}. Combining ROS and Unity to develop a collaborative human-robot interaction platform proved to be feasible\cite{Whitney:2018wk}. Since the focus of this paper is on human-robot interaction, collaboration and synchronization of ROS will not be explored in detail here.
+Unity was chosen as the platform to develop the system. Unity is a widely used game engine with a SteamVR plugin \footnote{https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647}, which allows developers to focus on the VR environment and interactive behaviors rather than on specific controller buttons and headset positioning, making VR development much simpler. Another reason for choosing Unity as the development platform was the potential for collaboration with the Robot Operating System (ROS), a framework frequently used for robot simulation and manipulation, which is flexible, loosely coupled, distributed, open source, and has a powerful and rich third-party feature set. For collaboration between Unity and ROS, Siemens provides open-source software libraries and tools in C\# for communicating with ROS from .NET applications \footnote{https://github.com/siemens/ros-sharp}. Combining ROS and Unity to develop a collaborative human-robot interaction platform has proved to be feasible \cite{Whitney:2018wk}. Since the focus of this paper is on human-robot interaction, collaboration and synchronization with ROS will not be explored in detail here.
 
 \section{Robot}
-To simulate the process of a robot using a probe camera to detect the real environment and synchronise it to Unity, a conical collision body was set up on the robot. The robot will transform the Layers of the objects in the scene into visible Layers by collision detection and a trigger event (onTriggerEnter function). The robot's driving performance, such as the number of collisions, average speed, total distance, etc., will be recorded in each test. The detailed recorded information can be seen in Fig.\ref{fig:uml}. The movement of the robot depends on the value of the signal that is updated in each mode. In addition, the robot's Gameobject has the NavMeshAgent \footnote{https://docs.unity3d.com/ScriptReference/AI.NavMeshAgent.html} component, which supports the robot's navigation to the specified destination with automatic obstacle avoidance in the test scene. The virtual robot has three cameras. Each camera is set up in such a way that it can only see the area detected by the robot's radar and change the image bound to it in real time. The four operations described later all use the camera viewport as a monitoring screen by rendering the camera viewport on UI canvas.
+To simulate the process of a robot using a LiDAR remote sensor to detect the real environment and synchronize it to Unity, a spherical collision body was set up on the robot. Via collision detection and a trigger event (the OnTriggerEnter function), the robot switches the layers of the objects in the scene to a visible layer. The robot's driving performance, such as the number of collisions, average speed, and total distance, is recorded in each test; the detailed recorded information can be seen in Fig.\ref{fig:uml}. The movement of the robot depends on the value of the signal that is updated in each mode. In addition, the robot's GameObject has a NavMeshAgent \footnote{https://docs.unity3d.com/ScriptReference/AI.NavMeshAgent.html} component, which supports navigating the robot to a specified destination with automatic obstacle avoidance in the test scene. The virtual robot has three cameras. One camera simulates a surveillance camera mounted on the robot and can see all objects in the scene, even distant ones not yet detected by the LiDAR. The other two cameras are set up so that they can only see the area already detected by the robot's LiDAR remote sensor. Each camera renders what it sees to its bound image in real time. The four operation modes described later all use these camera viewports as monitoring screens by rendering them on UI canvases.
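The layer-switching mechanism could look roughly like the sketch below; the layer name and the choice to relayer all child objects are assumptions:

```csharp
using UnityEngine;

// Attached to the spherical trigger collider that stands in for the LiDAR
// range: objects start on a hidden layer and are switched to a visible
// layer the first time the trigger overlaps them.
public class LidarReveal : MonoBehaviour
{
    private int visibleLayer;

    private void Awake()
    {
        // "Visible" is an assumed layer name; only this layer is rendered
        // by the robot's LiDAR-limited cameras.
        visibleLayer = LayerMask.NameToLayer("Visible");
    }

    private void OnTriggerEnter(Collider other)
    {
        // Relayer the touched object and its children so that the cameras
        // culling the hidden layer start rendering them.
        foreach (Transform t in other.GetComponentsInChildren<Transform>())
        {
            t.gameObject.layer = visibleLayer;
        }
    }
}
```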
 
 
 \section{Interaction techniques}
@@ -30,21 +29,38 @@ This system has 4 different approaches to control the robot. Each mode has its o
 
 In order to improve the reusability of the code and to facilitate the management of subsequent development, the classes that manage the interaction actions of each mode implement the same interface. A graphical representation of the system structure is given in the UML activity diagram in Fig.\ref{fig:uml}.
 
-\begin{figure}[h]
+\begin{figure}[htbp]
     \centering
-    \includegraphics[height=12cm]{graphics/uml.png}
+    \includegraphics[width=\textwidth]{graphics/uml.png}
     \caption{UML Class diagram for the main structure of the system}
     \label{fig:uml}
 \end{figure}
 
 \subsection{Handle Mode}
+In this mode, the user controls the robot's movement directly through the motion controller in the right hand. The touchpad of the motion controller determines the direction of rotation of the robot, and the user controls the robot's driving speed by pulling the trigger button. Fig.\ref{fig:htc} shows how the values are obtained from the HTC motion controller. The rotation direction reads the value of the touchpad's X-axis, with a range of $[-1,1]$. The forward speed reads the trigger button as a variable of type SteamVR\_Action\_Single, with a range of $[0,1]$. With the right-hand menu button, the surveillance screens around the robot can be turned on or off, and the monitor window can be adjusted to a suitable position by dragging and rotating it. In the literature on VR and human-robot collaboration, many researchers have used a similar operational approach; as a widely used and, in a sense, default approach, this mode was therefore included as one of the proposed operation modes.
+
+\begin{figure}[htbp]
+    \centering
+	\includegraphics[height=10cm]{graphics/htc.png}
+	\caption{HTC handle illustration.}
+	\label{fig:htc}
+\end{figure}
+
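A minimal sketch of this input mapping, assuming SteamVR actions bound to the touchpad and trigger; the speed and turn-rate constants are assumptions, and in the actual system the two values would feed the robot's shared movement signal rather than the transform directly:

```csharp
using UnityEngine;
using Valve.VR;

// Handle Mode: touchpad X in [-1, 1] steers, trigger in [0, 1] drives.
public class HandleModeInput : MonoBehaviour
{
    public SteamVR_Action_Vector2 touchpad;  // bound to the touchpad (assumed action)
    public SteamVR_Action_Single trigger;    // bound to the trigger (assumed action)
    public float maxSpeed = 2f;              // m/s, assumed
    public float turnRate = 90f;             // deg/s, assumed

    private void Update()
    {
        float turn = touchpad.GetAxis(SteamVR_Input_Sources.RightHand).x;
        float speed = trigger.GetAxis(SteamVR_Input_Sources.RightHand) * maxSpeed;

        transform.Rotate(0f, turn * turnRate * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```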
 \subsection{Lab Mode}
+This mode was motivated by the part of the literature in which immersive human-robot collaboration frameworks are used to train operators to control robots, avoiding risk and reducing learning costs, or directly as a platform for operating the robot \cite{Perez:2019ub}\cite{Matsas:2017aa}. Therefore, in this mode a virtual laboratory environment is constructed, in which simulated buttons, controllers, and monitoring equipment are placed. The laboratory consists of two parts. The first part is the monitoring equipment: the monitoring screen is enlarged and placed at the front of the lab as a huge display. The second part is the operating console in the center of the laboratory, which can be moved by the user as desired. The user can use the buttons on its right side to lock the robot or let it drive forward automatically. In the middle of the console are two joysticks that determine the robot's forward motion and rotation, respectively. The virtual joystick movement and button effects are based on the open-source GitHub project VRtwix\footnote{https://github.com/rav3dev/vrtwix}. With the sliding stick on the left, the user can adjust the speed of the robot's forward movement and rotation.
+
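Since the lever physics come from VRtwix, the following only sketches how a lever's deflection could be normalized into a movement signal in [-1, 1]; the tilt range and axis convention are assumptions:

```csharp
using UnityEngine;

// Reads a virtual console lever and exposes its deflection as a signal.
public class ConsoleJoystick : MonoBehaviour
{
    public Transform lever;          // the grabbable stick (driven by VRtwix)
    public float maxTiltDeg = 30f;   // assumed mechanical range of the lever

    // Normalized forward/backward deflection in [-1, 1].
    public float GetSignal()
    {
        float tilt = Vector3.SignedAngle(Vector3.up, lever.up, lever.right);
        return Mathf.Clamp(tilt / maxTiltDeg, -1f, 1f);
    }
}
```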
 \subsection{Remote Mode}
+In this mode, the user can either set a driving target point directly or control the robot by picking up the remote control placed on the toolbar. The target point is set with a ray emitted by the right motion controller, similar to setting a teleportation point. After the target point is set, a square representing the destination is shown in the scene, and the robot automatically travels to it. The entire driving process uses the NavMeshAgent component and is therefore capable of automatic obstacle avoidance.
+Clicking the menu button opens a movable toolbar with a remote control and a monitoring device. The remote control is a safety precaution in case the automatic navigation fails to reach the target point properly; with it, the user can adjust the direction of the robot's travel manually. The pickup and auto-release parts use the ItemPackage component available in the SteamVR plugin.
+
+
+
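A sketch of the target-point flow: a ray from the right controller picks a point, the destination marker is placed there, and the point is handed to the NavMeshAgent; field names and the ground layer mask are assumptions:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Remote Mode: set a navigation target with the right controller's ray.
public class RemoteTargetSetter : MonoBehaviour
{
    public Transform rightController;     // pose of the right motion controller
    public NavMeshAgent robotAgent;       // the robot's navigation agent
    public GameObject destinationMarker;  // the square shown at the destination
    public LayerMask groundMask;          // assumed layer mask for walkable ground

    // Called when the user confirms the target, e.g. on a trigger press.
    public void TrySetTarget()
    {
        if (Physics.Raycast(rightController.position, rightController.forward,
                            out RaycastHit hit, 100f, groundMask))
        {
            destinationMarker.transform.position = hit.point;
            robotAgent.SetDestination(hit.point);  // NavMesh handles obstacle avoidance
        }
    }
}
```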
 \subsection{UI Mode}
+Virtual menus are an interaction method frequently used in VR, which motivated this mode. Here, the user interacts with a virtual menu using the ray emitted by the right motion controller. The virtual menu provides buttons for the direction of movement, a speed controller, and buttons to open and close the monitoring screen. In addition, a follow function is included in the menu, allowing the robot to follow the user's position in the virtual world; this is intended to let the user concentrate on observing the rendered VR environment. Moreover, having a real robot follow the user's location in the virtual world would be a novel human-robot integration mode in VR. The robot's automatic navigation again uses the NavMeshAgent.
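The follow function could be realized by periodically renewing the agent's destination with the user's current position; a sketch, with the stop distance as an assumption:

```csharp
using UnityEngine;
using UnityEngine.AI;

// UI Mode follow function: the robot trails the user's position in the
// virtual world while keeping a small distance.
public class FollowUser : MonoBehaviour
{
    public NavMeshAgent robotAgent;  // the robot's navigation agent
    public Transform userHead;       // the user's VR camera (headset) transform
    public float stopDistance = 2f;  // assumed follow distance in meters

    private void Update()
    {
        if (Vector3.Distance(robotAgent.transform.position, userHead.position) > stopDistance)
        {
            robotAgent.SetDestination(userHead.position);
        }
    }
}
```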
 
 
 \section{Test Scene}
-In order to simulate the use of rescue robots in disaster scenarios, the test scenes were built to mimic the post-disaster urban environment as much as possible. The POLYGON Apocalypse\footnote{https://assetstore.unity.com/packages/3d/environments/urban/polygon-apocalypse-low-poly-3d-art-by-synty-154193}, available on the Unity Asset Store, is a low poly asset pack with a large number of models of buildings, streets, vehicles, etc. Using this resource pack as a base, additional collision bodies of the appropriate size were manually added to each building and obstacle after the pack was imported, which was needed to help track the robot's driving crash in subsequent tests.
+In order to simulate the use of rescue robots in disaster scenarios, the test scenes were built to mimic the post-disaster urban environment as much as possible. The POLYGON Apocalypse \footnote{https://assetstore.unity.com/packages/3d/environments/urban/polygon-apocalypse-low-poly-3d-art-by-synty-154193}, available on the Unity Asset Store, is a low poly asset pack with a large number of models of buildings, streets, vehicles, etc. Using this resource pack as a base, additional collision bodies of the appropriate size were manually added to each building and obstacle after the pack was imported, which was needed to help track the robot's driving crash in subsequent tests.
 
 Considering that there are four modes of operation to be tested, four scenes with similar complexity and a similar composition of buildings, but different road conditions and building placement, were constructed. The similar complexity of the scenes ensures that the difficulty of the four tests is essentially identical, while the different layouts ensure that scene knowledge gained in one test does not familiarize the participant with the next test scene and thus distort the test data.
 

+ 2 - 3
Thesis_LaTeX/chapters/introduction.tex

@@ -17,12 +17,11 @@ The use of VR in human-computer collaboration also has the potential. In terms o
 However, there remains a need to explore human-computer interaction patterns and improve the level of human-computer integration\cite{Wang:2017uy}. Intuitive and easy-to-use interaction patterns can enable the user to explore the environment as intentionally as possible and improve the efficiency of search and rescue. The appropriate interaction method should cause less mental and physical exhaustion, which also extends the length of an operation, making it less necessary for the user to frequently exit the VR environment for rest.
 
 % What I have done (overview)
-For this purpose, this paper presents a preliminary VR-based system for the simulation of ground rescue robots with four different modes of operation and corresponding test scenes imitating a post-disaster city. The test scene simulates a robot collaborating with Unity to construct a virtual 3D scene. The robot has a simulated radar, which makes the display of the scene dependent on the robot's movement.
-In order to find an interaction approach that is as intuitive and low mental fatigue as possible, a user study was executed after the development was completed.
-
+For this purpose, this paper presents a preliminary VR-based system for the simulation of ground rescue robots with four different modes of operation and corresponding test scenes imitating a post-disaster city. The test scene simulates a robot collaborating with Unity to construct a virtual 3D scene. The robot has a simulated LiDAR remote sensor, which makes the display of the scene dependent on the robot's movement. In order to find an interaction approach that is as intuitive and low mental fatigue as possible, a user study was executed after the development was completed.
 
 
 % Paper Architecture
+Chapter \ref{related} talks about some of the research involving the integration of VR and human-computer interaction.
 Chapter \ref{implementation} provides details of the proposed system, including the techniques used for the different interaction modes and the structure of the test scenes.
 Chapter \ref{evaluate} describes the design and process of the user study.
 Chapter \ref{result} presents the results of the user study and analyzes the advantages and disadvantages of the different modes of operation and the directions for improvement.

+ 9 - 0
Thesis_LaTeX/chapters/related_work.tex

@@ -1 +1,10 @@
 \chapter{Related Work}
+\label{related}
+
+In this chapter, some research on the integration of VR and human-computer interaction is discussed, and the relevant literature and its contributions are briefly presented. The topic of VR and human-computer integration is an open research field with many different focus perspectives.
+
+Robotic manipulation platforms combined with virtual worlds have several application scenarios. They can be used, for example, to train operators or to collaborate directly with real robots. Elias Matsas et al. \cite{Matsas:2017aa} provided a VR-based training system using hand recognition. Kinect cameras are used to capture the user's positions and motions, and virtual user models are constructed in the VR environment based on the collected data to operate robots as well as virtual objects, such as buttons. Users thus learn how to operate the robot in a VR environment. The VR framework proposed by Luis Pérez et al. \cite{Perez:2019ub} is applied to train operators to control the robot. Since the environment does not need to change in real time, but rather needs to realistically recreate the factory scene, the VR scene is not captured and rendered in real time; instead, a highly accurate 3D environment was reconstructed in advance with Blender after being captured by a 3D scanner.
+
+Building 3D scenes in virtual worlds based on information collected by robots is also a research highlight. Wang et al. \cite{Wang:2017uy} were concerned with the visualization of the rescue robot and its surroundings in a virtual environment. Their human-robot interaction system uses an incremental 3D-NDT map to render the robot's surroundings in real time. The user can view the robot's surroundings in a first-person view through the HTC Vive and send control commands through the controller's arrow keys. The novel VR-based system provided by Patrick Stotko et al. \cite{Stotko:2019ud} uses a distributed architecture to reconstruct the 3D scene for robot-based live telepresence and teleoperation. The data collected by the robot is first transmitted to the client responsible for reconstructing the scene. After the client has constructed the 3D scene, the set of actively reconstructed visible voxel blocks is sent to the server responsible for communication, which then broadcasts the data to the client used by the operator, thus enabling an immersive visualization of the robot within the scene.
+
+Others are more concerned with the manipulation of a robotic arm mounted on the robot. Moniri et al. \cite{Moniri:2016ud} provided a VR-based operating model for the robotic arm. The user wearing a headset can see a simulated 3D scene at the robot's end and send pickup commands to the remote robot by clicking on the target object with the mouse. The system proposed by Ostanin et al. \cite{Ostanin:2020uo} is also worth mentioning. Although their system for operating a robotic arm is based on mixed reality (MR), the article is highly relevant to this paper, considering the close relationship between MR and VR and the fact that their system details the combination of ROS and robotics. In their system, ROS Kinetic was used as middleware and was responsible for communicating with the robot and the Unity side. The user can control the movement of the robot arm by selecting predefined options in a menu. In addition, the orbit and target points of the robot arm can be set by clicking on a hologram with a series of control points.

BIN
Thesis_LaTeX/graphics/.DS_Store


+ 0 - 0
Thesis_LaTeX/graphics/htc controller.png → Thesis_LaTeX/graphics/htc.png


+ 4 - 4
Thesis_LaTeX/pdfa.xmpi

@@ -76,15 +76,15 @@
   </rdf:Description> 
   <rdf:Description rdf:about="" xmlns:xmp="http://ns.adobe.com/xap/1.0/"> 
    <xmp:CreatorTool>LaTeX using TUDa-CI</xmp:CreatorTool> 
-   <xmp:ModifyDate>2021-07-01T14:25:09+02:00</xmp:ModifyDate> 
-   <xmp:CreateDate>2021-07-01T14:25:09+02:00</xmp:CreateDate> 
-   <xmp:MetadataDate>2021-07-01T14:25:09+02:00</xmp:MetadataDate> 
+   <xmp:ModifyDate>2021-07-04T16:59:55+02:00</xmp:ModifyDate> 
+   <xmp:CreateDate>2021-07-04T16:59:55+02:00</xmp:CreateDate> 
+   <xmp:MetadataDate>2021-07-04T16:59:55+02:00</xmp:MetadataDate> 
   </rdf:Description> 
   <rdf:Description rdf:about="" xmlns:xmpRights = "http://ns.adobe.com/xap/1.0/rights/"> 
   </rdf:Description> 
   <rdf:Description rdf:about="" xmlns:xmpMM="http://ns.adobe.com/xap/1.0/mm/"> 
    <xmpMM:DocumentID>uuid:664AE29E-00CF-749D-ECCF-A13289ED18C3</xmpMM:DocumentID> 
-   <xmpMM:InstanceID>uuid:F08B926F-9DBE-4576-207A-0A70E2940501</xmpMM:InstanceID> 
+   <xmpMM:InstanceID>uuid:92A9D82E-694F-8521-1203-F481A2A594BF</xmpMM:InstanceID> 
   </rdf:Description> 
  </rdf:RDF> 
 </x:xmpmeta>