
change readme

Jia, Jingyi · 2 years ago · commit 50895df2a3
1 changed file with 51 additions and 16 deletions: README.md

@@ -1,48 +1,79 @@
 # Hector VR v2.0
 
-
-
 ## Introduction
 
-
+A ground-based rescue robot simulation with four operation modes and test scenarios.
 
 
 
 ## Setup
 
-##### 0.
+##### 0. Install
 
 ![image-20210628133300487](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628133300.png?token=APASYCSPQNQGUAURUL3L3I3A3GZ22)
 
+
+
 ##### 1. SteamVR Input
 
-##### 2. NavAgent
+Set the SteamVR controller bindings under `Window > SteamVR Input > Open binding UI`.
+
+
+
+##### 2. Navmesh Agent
+
+The project uses a NavMesh so that the robot can navigate the Scene, so please set the walkable area under `Window > AI > Navigation`.
+
+The NavMesh must be set up for the following scenes: Simulation, RemoteTest and UITest.
 
 ![image-20210628133108405](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628142145.png)
 
 
 
-##### Operation Mode
+## Operation Modes
+
+##### 1. Handle Mode
+
+The user controls the robot directly by operating the handle (the VR controller).
 
-1.
+##### 2. Lab Mode
 
-2.Lab
+The environment is designed to resemble a real laboratory as closely as possible; the robot is operated by gripping the joystick or pressing buttons with the virtual hand.
 
 ![image-20210628133617313](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628133617.png?token=APASYCURCHKGAIPQKLLAMDDA3G2HE)
 
-https://github.com/rav3dev/vrtwix
+The virtual joystick movement and button interactions are based on the open-source GitHub project [VRtwix](https://github.com/rav3dev/vrtwix).
 
-3.
+##### 3. Remote Mode
 
-4.UI
+In this mode the user can manually pick up operating tools, for example the remote control. Alternatively, a target point can be set directly with the right hand; the robot then automatically navigates to the set target point.
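+
+The target-point behaviour can be sketched with Unity's `NavMeshAgent` API. This is a minimal illustration only; the class and method names are assumptions, not the project's actual code:
+
+```csharp
+using UnityEngine;
+using UnityEngine.AI;
+
+// Illustrative sketch: drives the robot to a target point set by the right hand.
+public class RemoteTargetMover : MonoBehaviour
+{
+    [SerializeField] private NavMeshAgent agent; // NavMeshAgent on the robot
+
+    // Called when the right hand places a new target point in the scene.
+    public void SetTarget(Vector3 targetPoint)
+    {
+        agent.SetDestination(targetPoint); // the agent then navigates there automatically
+    }
+}
+```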
+
+##### 4. UI Mode
+
+In this mode the user manipulates the robot with the rays emitted by the controller. By clicking the direction buttons, the user controls the robot's direction of movement. In addition, the user can turn on the follow function, in which case the robot always follows the user's position in the virtual world.
 
 ![image-20210628133832114](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628133832.png?token=APASYCXENQ4AIWCX5U3COVDA3G2PQ)
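+
+The follow function can be sketched in a similar way (illustrative only; it assumes the robot has a `NavMeshAgent` and a reference to the user's head transform):
+
+```csharp
+using UnityEngine;
+using UnityEngine.AI;
+
+// Illustrative sketch: while following is enabled, keep steering towards the user.
+public class FollowUser : MonoBehaviour
+{
+    [SerializeField] private NavMeshAgent agent;  // NavMeshAgent on the robot
+    [SerializeField] private Transform userHead;  // e.g. the VR camera rig
+    public bool followEnabled;                    // toggled via the UI ray button
+
+    private void Update()
+    {
+        if (followEnabled)
+            agent.SetDestination(userHead.position); // continuously track the user
+    }
+}
+```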
 
-##### Test Scene
 
-![image-20210628133946878](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628133946.png?token=APASYCSJGIXOJI7FT7DPDB3A3G2UI)
 
-![image-20210628134138882](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134139.png?token=APASYCQ3PSF5ZIFYCUTCPVTA3G23K)
+## Test Scene
+
+In the test scene, the user is asked to find as many victims as possible within a limited period of time. The test ends when the time runs out or when all victims have been found.
+Throughout the test, the robot's performance is recorded, including the total distance travelled, the average speed and the number of collisions. The handling of victims is also recorded, for example when a victim is already in view but is ignored by the user.
+
+![image-20210628134138882](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628155701.png)
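+
+Recording these metrics could look roughly like the following (a simplified assumption of such a recorder, not the project's actual implementation):
+
+```csharp
+using UnityEngine;
+
+// Illustrative sketch: accumulates distance, average speed and collision count.
+public class PerformanceRecorder : MonoBehaviour
+{
+    public float TotalDistance { get; private set; }
+    public int CollisionCount { get; private set; }
+    public float AverageSpeed => elapsed > 0f ? TotalDistance / elapsed : 0f;
+
+    private Vector3 lastPosition;
+    private float elapsed;
+
+    private void Start() => lastPosition = transform.position;
+
+    private void Update()
+    {
+        // Sum up the distance moved since the last frame.
+        TotalDistance += Vector3.Distance(transform.position, lastPosition);
+        lastPosition = transform.position;
+        elapsed += Time.deltaTime;
+    }
+
+    private void OnCollisionEnter(Collision collision) => CollisionCount++;
+}
+```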
+
+###### Collision Detection
+
+Each object in the scene that takes part in collision detection has a suitable collider attached.
+
+![image-20210628134930493](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134930.png?token=APASYCU266JS7J3J7GACYUDA3G3YY)
+
+
+
+###### Scene overview
+
+![image-20210628133946878](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628133946.png?token=APASYCSJGIXOJI7FT7DPDB3A3G2UI)
 
 ![image-20210628134229529](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134229.png?token=APASYCUJPRZNCGUHHI62CGDA3G26M)
 
@@ -50,11 +81,15 @@ https://github.com/rav3dev/vrtwix
 
 ![image-20210628134325451](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134325.png?token=APASYCVJWFMPXJFHCOCIFT3A3G3B4)
 
-![image-20210628134930493](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134930.png?token=APASYCU266JS7J3J7GACYUDA3G3YY)
 
 
 
-#### Robot
+
+
+
+## Robot
+
+To simulate the process of the robot using a probe camera to detect the real environment and synchronise it to Unity, a cone-shaped collider is attached to the robot.
 
 ![image-20210628134809076](https://raw.githubusercontent.com/elaineJJY/Storage/main/Picture/20210628134809.png?token=APASYCWWK6KXFTZNBVZFZH3A3G3TS)
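+
+One way this reveal mechanic might be sketched (an assumption for illustration only, not the project's actual code): the cone collider acts as a trigger, and objects entering it are shown in the synchronised Unity scene.
+
+```csharp
+using UnityEngine;
+
+// Illustrative sketch: attached to the cone-shaped trigger collider on the robot.
+public class ProbeCone : MonoBehaviour
+{
+    private void OnTriggerEnter(Collider other)
+    {
+        // Reveal any renderer that enters the simulated probe camera's field of view.
+        var rend = other.GetComponent<Renderer>();
+        if (rend != null)
+            rend.enabled = true;
+    }
+}
+```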