Hector VR v2.0

Introduction

Ground-based rescue robot simulation with 4 operating modes and test scenarios

Setup

In theory, you can run the project directly after downloading it and setting up SteamVR Input.

0. Install

If similar pop-up alerts appear when opening the project, confirm them.


1. SteamVR Input

Set the SteamVR bindings under Window > SteamVR Input > Open binding UI.

2. Navmesh Agent

The project uses Unity's NavMesh system so the robot can navigate the Scene. Make sure there is a corresponding generated NavMesh folder in the Scene folder; if autowalk does not work after running, re-bake the scene under Window > AI > Navigation.

The NavMesh must be baked for the following scenes: Simulation, RemoteTest, and UITest.

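As a minimal sketch of why the bake matters (class and field names here are hypothetical, not taken from the project), the autowalk ultimately comes down to a NavMeshAgent receiving a destination, which only works on a baked NavMesh:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: drives the robot's NavMeshAgent toward a target.
// Requires a baked NavMesh in the scene (Window > AI > Navigation).
public class AutoWalk : MonoBehaviour
{
    [SerializeField] private Transform target;   // point the robot should reach
    private NavMeshAgent agent;

    private void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    private void Update()
    {
        // SetDestination cannot find a path if the scene has no baked
        // NavMesh, which is why the scenes listed above must be re-baked.
        if (target != null)
            agent.SetDestination(target.position);
    }
}
```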

3. Set the path to save the .csv file

See the TestManagement section below.

Operating Modes

1. Handle

The user controls the robot directly by operating the handle (the VR controller).
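A minimal sketch of such direct control, assuming the SteamVR Unity plugin (the class, field names, and speed values are illustrative, not taken from the project): a Vector2 action bound in the SteamVR binding UI is read each frame and translated into robot motion.

```csharp
using UnityEngine;
using Valve.VR;

// Hypothetical sketch: reads a SteamVR Vector2 action (e.g. the
// trackpad/joystick) and drives the robot directly.
public class HandleDrive : MonoBehaviour
{
    public SteamVR_Action_Vector2 moveAction;   // bound via the binding UI
    public float speed = 1.5f;                  // m/s, illustrative value
    public float turnSpeed = 90f;               // deg/s, illustrative value

    private void Update()
    {
        Vector2 input = moveAction.GetAxis(SteamVR_Input_Sources.RightHand);
        transform.Translate(Vector3.forward * input.y * speed * Time.deltaTime);
        transform.Rotate(Vector3.up, input.x * turnSpeed * Time.deltaTime);
    }
}
```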

2. Lab

The environment is designed to resemble a real laboratory as closely as possible; the robot is operated by grabbing the virtual joystick or pressing buttons with the virtual hand.


The virtual joystick movement and button effects are based on the open-source GitHub project VRtwix.

3. Remote Mode

In this mode the user can pick up operating tools manually, for example the remote control. Alternatively, a target point can be set directly with the right hand; the robot then drives to the set target point automatically.
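A sketch of how setting the target from the right hand could work (all names are hypothetical): a ray is cast from the right controller, and the hit point becomes the NavMeshAgent's destination.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: casts a ray from the right hand and uses the hit
// point as the robot's navigation target.
public class RemoteTargetSetter : MonoBehaviour
{
    public Transform rightHand;       // the right controller's transform
    public NavMeshAgent robotAgent;   // the robot's NavMeshAgent

    public void SetTarget()           // call when the user confirms a point
    {
        if (Physics.Raycast(rightHand.position, rightHand.forward, out RaycastHit hit))
            robotAgent.SetDestination(hit.point);   // robot drives there on its own
    }
}
```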

4. UI Mode

In this mode the user manipulates the robot with rays emitted from the handle. By clicking the direction buttons, the user controls the robot's direction of movement. In addition, the user can enable the follow function, in which case the robot always follows the user's position in the virtual world.

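The follow function can be sketched like this (class name, field names, and the re-path interval are assumptions, not project values): the robot periodically re-targets the user's head position via its NavMeshAgent.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch of the follow function: the robot periodically
// re-targets the user's position in the virtual world.
public class FollowUser : MonoBehaviour
{
    public Transform userHead;          // e.g. the VR camera transform
    public float repathInterval = 0.5f; // illustrative value
    private NavMeshAgent agent;
    private float timer;

    private void Awake() => agent = GetComponent<NavMeshAgent>();

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer >= repathInterval)
        {
            timer = 0f;
            agent.SetDestination(userHead.position);
        }
    }
}
```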

Test Scene

In the test scenario, the user is asked to find as many victims as possible within a limited period of time. The test ends when the time runs out or when all victims have been found. Throughout the test, the robot's performance is recorded, including the total distance travelled, the average speed, and the number of collisions. In addition, events around the rescue of victims are recorded, for example when a victim is already in view but is overlooked by the user.

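A minimal sketch of how the distance and average-speed metrics could be accumulated (the class and members are illustrative, not the project's actual recorder):

```csharp
using UnityEngine;

// Hypothetical sketch of the performance logging: accumulates the robot's
// travelled distance and derives the average speed over the test duration.
public class RobotMetrics : MonoBehaviour
{
    private Vector3 lastPosition;
    private float totalDistance;
    private float elapsed;

    private void Start() => lastPosition = transform.position;

    private void Update()
    {
        totalDistance += Vector3.Distance(transform.position, lastPosition);
        lastPosition = transform.position;
        elapsed += Time.deltaTime;
    }

    public float TotalDistance => totalDistance;
    public float AverageSpeed => elapsed > 0f ? totalDistance / elapsed : 0f;
}
```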

Collision Detection

Each object in the scene that is used for collision detection has a suitable collider attached.

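With those colliders in place, counting collisions for the test statistics can be sketched as follows (hypothetical class, not the project's actual script):

```csharp
using UnityEngine;

// Hypothetical sketch: counts collisions between the robot and scene
// objects. Unity only reports OnCollisionEnter if at least one of the
// two objects has a (non-kinematic) Rigidbody.
public class CollisionCounter : MonoBehaviour
{
    public int Collisions { get; private set; }

    private void OnCollisionEnter(Collision collision)
    {
        Collisions++;
        Debug.Log($"Collision #{Collisions} with {collision.gameObject.name}");
    }
}
```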

Scene overview

(Screenshots of the simulation scenes.)

Robot

To simulate a robot using LiDAR to detect the real environment and synchronise it to Unity, a sphere collider was set up on the robot.

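One way such a detection sphere could reveal the environment (a sketch under the assumption that detected objects are shown by enabling their renderers; the class name is hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch of the LiDAR simulation: a sphere trigger on the
// robot "detects" surrounding geometry and reveals it in the VR scene.
[RequireComponent(typeof(SphereCollider))]
public class LidarReveal : MonoBehaviour
{
    private void Awake()
    {
        GetComponent<SphereCollider>().isTrigger = true;   // detection radius
    }

    private void OnTriggerEnter(Collider other)
    {
        // Enable the renderer of everything entering the detection sphere.
        var rend = other.GetComponent<Renderer>();
        if (rend != null)
            rend.enabled = true;
    }
}
```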

TestManagement

The order of the tests is determined by a Latin square of size 4.
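For illustration, here is one common way to derive a participant's condition order from a balanced Latin square of size 4 (this is a general technique, not necessarily the project's exact implementation):

```csharp
public static class LatinSquare
{
    // Hypothetical sketch: returns the condition order for one participant
    // from a balanced Latin square of size n. The base row is
    // 0, 1, n-1, 2, ... and each participant shifts it cyclically.
    public static int[] ConditionOrder(int testId, int n = 4)
    {
        var seq = new int[n];            // seq[0] = 0
        int lo = 1, hi = n - 1;
        for (int i = 1; i < n; i++)
            seq[i] = (i % 2 == 1) ? lo++ : hi--;   // 0, 1, n-1, 2, ...

        var row = new int[n];
        for (int i = 0; i < n; i++)
            row[i] = (seq[i] + testId) % n;        // shift per TestID
        return row;
    }
}
```

For TestIDs 0 to 3 this yields the four rows of the square, e.g. TestID 0 gives the order 0, 1, 3, 2.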

TestID & .csv file

The TestID (the participant's sequential number) must be set before testing, as well as the path where the .csv test data is stored.

The .csv file must remain closed during the test; otherwise the test data cannot be written.

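A sketch of the .csv export (the class, file layout, and column names are assumptions for illustration, not the project's actual format):

```csharp
using System.IO;

// Hypothetical sketch of the .csv export: appends one line of test data
// per measurement. Appending fails if the file is open in another
// program, which is why it must stay closed during the test.
public class CsvLogger
{
    private readonly string path;

    public CsvLogger(string path)
    {
        this.path = path;
        if (!File.Exists(path))
            File.WriteAllText(path,
                "TestID,Time,Distance,AvgSpeed,Collisions,VictimsFound\n");
    }

    public void WriteRow(int testId, float time, float distance,
                         float avgSpeed, int collisions, int victims)
    {
        File.AppendAllText(path,
            $"{testId},{time},{distance},{avgSpeed},{collisions},{victims}\n");
    }
}
```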

How to stop the test immediately

If you want to stop the current test immediately, set the value of Total Time to 0. When Total Time is 0, the test ends automatically and the test data is saved.
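The underlying timer logic could look like this (a sketch with hypothetical names; the save call is a placeholder for the .csv export described above):

```csharp
using UnityEngine;

// Hypothetical sketch of the timer logic: the test ends as soon as
// Total Time reaches 0, either by counting down naturally or by being
// set to 0 in the Inspector to abort the run.
public class TestTimer : MonoBehaviour
{
    public float totalTime = 300f;   // remaining test time in seconds

    private void Update()
    {
        totalTime = Mathf.Max(0f, totalTime - Time.deltaTime);
        if (totalTime == 0f)
        {
            SaveTestData();   // placeholder for the .csv export
            enabled = false;  // stop the test loop
        }
    }

    private void SaveTestData() { /* write the results to the .csv file */ }
}
```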