UnityEngine.VRModule

The VR module implements support for virtual reality devices in Unity.

Enumeration of available modes for XR rendering in the Game view or in the main window on a host PC. XR rendering only occurs when the Unity Editor is in Play Mode. Renders both eyes of the XR device side-by-side in the Game view or in the main window on a host PC. Renders the left eye of the XR device in the Game view or in the main window on a host PC. Renders both eyes of the XR device, and the occlusion mesh, side-by-side in the Game view or in the main window on a host PC. Renders the right eye of the XR device in the Game view or in the main window on a host PC.

Represents the size of the physical space available for XR. Represents a space large enough for free movement. Represents a small space where movement may be constrained or positional tracking is unavailable.

Represents the current user presence state detected by the device. The device does not detect that the user is present. The device detects that the user is present. The device is currently in a state where it cannot determine user presence. The device does not support detecting user presence.

The Holographic Remoting interface allows you to connect an application to a remote holographic device and stream data between the application and that device. Whether the app is displaying protected content.

The Holographic Settings contain functions which affect the performance and presentation of holograms on Windows Holographic platforms. Option to allow developers to achieve a higher frame rate at the cost of higher latency. By default this option is off. True to enable or false to disable Low Latent Frame Presentation.

Represents the kind of reprojection an app is requesting to stabilize its holographic rendering relative to the user's head motion. The image should not be stabilized for the user's head motion, instead remaining fixed in the display. This is generally discouraged, as it is only comfortable for users when used sparingly, such as when the only visible content is a small cursor. The image should be stabilized only for changes to the user's head orientation, ignoring positional changes. This is best for body-locked content that should tag along with the user as they walk around, such as 360-degree video. The image should be stabilized for changes to both the user's head position and orientation. This is best for world-locked content that should remain physically stationary as the user walks around.

Whether the app is displaying protected content. This method returns whether or not the display associated with the main camera reports as opaque. Returns true if holographic rendering is currently running with Latent Frame Presentation; the default value is false. The kind of reprojection the app is requesting to stabilize its holographic rendering relative to the user's head motion.

Sets a point in 3D space that is the focal point of the scene for the user for this frame. This helps improve the visual fidelity of content around this point. This must be set every frame. The position of the focal point in the scene, relative to the camera. Surface normal of the plane being viewed at the focal point. A vector that describes how the focus point is moving in the scene at this point in time. This allows the HoloLens to compensate for both your head movement and the movement of the object in the scene.
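As a concrete illustration of the focus point described above, the following is a minimal sketch of a MonoBehaviour that sets the focus point once per frame; the class name FocusPointExample and the focusTarget field are hypothetical, and the velocity is assumed to be zero for a stationary hologram.

using UnityEngine;
using UnityEngine.XR.WSA;

public class FocusPointExample : MonoBehaviour
{
    // Hypothetical reference to the hologram the user is currently focused on.
    public Transform focusTarget;

    void Update()
    {
        if (focusTarget == null)
            return;

        // Focal point of the scene, the normal of the viewed plane (facing the camera),
        // and how the point is moving, so the device can compensate for object motion.
        Vector3 position = focusTarget.position;
        Vector3 normal = -Camera.main.transform.forward;
        Vector3 velocity = Vector3.zero; // assumed stationary; use the object's real velocity if it moves

        // Must be called every frame while the content is visible.
        HolographicSettings.SetFocusPointForFrame(position, normal, velocity);
    }
}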
Enum indicating the reason why the connection to the remote device failed. Handshake failed while trying to establish a connection with the remote device. No failure. Protocol used by the app does not match the remoting app running on the remote device. Couldn't identify the reason why the connection failed. Remote device is not reachable.

Current state of the holographic streamer remote connection. Indicates the app is connected to a remote device. Indicates the app is trying to connect to a remote device. Indicates the app is currently disconnected from any remote device.

Contains all functionality related to an XR device. The name of the family of the loaded XR device. Zooms the XR projection. Successfully detected an XR device in working order. Specific model of the loaded XR device. Refresh rate of the display in Hertz. Indicates whether the user is present and interacting with the device.

Sets whether the camera passed in the first parameter is controlled implicitly by the XR device. The camera whose behavior should change. True if the camera's transform is set externally; false if the camera is to be driven implicitly by XRDevice. Returns nothing.

This method returns an IntPtr representing the native pointer to the XR device if one is available; otherwise the value will be IntPtr.Zero. The native pointer to the XR device.

Returns the device's current TrackingSpaceType. This value determines how the camera is positioned relative to its starting position. For more information, see the documentation section "Understanding the camera". The device's current TrackingSpaceType.

Sets the device's current TrackingSpaceType. Returns true on success; returns false if the given TrackingSpaceType is not supported or the device fails to switch. The TrackingSpaceType the device should switch to. True on success; false if the given TrackingSpaceType is not supported or the device fails to switch.

Global XR related settings. Globally enables or disables XR for the application. Fetch the eye texture RenderTextureDescriptor from the active stereo device. The current height of an eye texture for the loaded device. Controls the actual size of eye textures as a multiplier of the device's default resolution. The current width of an eye texture for the loaded device.

Sets the render mode for the XR device. The render mode controls how the view of the XR device renders in the Game view and in the main window on a host PC. Read-only value that can be used to determine if the XR device is active. Type of XR device that is currently loaded.
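To make the XRDevice and XRSettings members described above concrete, here is a minimal sketch that queries the loaded device, requests room-scale tracking with a stationary fallback, and picks a Game view render mode; the class name XRDeviceInfoExample and the logging are illustrative only.

using UnityEngine;
using UnityEngine.XR;

public class XRDeviceInfoExample : MonoBehaviour
{
    void Start()
    {
        if (!XRSettings.isDeviceActive)
        {
            Debug.Log("No XR device is currently active.");
            return;
        }

        // Identity and display properties of the loaded device.
        Debug.Log("Loaded device: " + XRSettings.loadedDeviceName +
                  ", model: " + XRDevice.model +
                  ", refresh rate: " + XRDevice.refreshRate + " Hz" +
                  ", user presence: " + XRDevice.userPresence);

        // Request room-scale tracking; fall back to stationary if the device refuses.
        if (!XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
        {
            XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);
        }
        Debug.Log("Tracking space: " + XRDevice.GetTrackingSpaceType());

        // Render both eyes plus the occlusion mesh in the Game view while in Play Mode.
        XRSettings.gameViewRenderMode = GameViewRenderMode.OcclusionMesh;
    }
}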
A scale applied to the standard occlusion mask for each platform. This field has been deprecated; use XRSettings.eyeTextureResolutionScale instead. Controls how much of the allocated eye texture should be used for rendering. This property has been deprecated; use XRSettings.gameViewRenderMode instead. Returns a list of supported XR devices that were included at build time. Specifies whether or not the occlusion mesh should be used when rendering. Enabled by default.

Loads the requested device at the beginning of the next frame. Name of the device from XRSettings.supportedDevices. Prioritized list of device names from XRSettings.supportedDevices. Loads the requested device at the beginning of the next frame. Name of the device from XRSettings.supportedDevices. Prioritized list of device names from XRSettings.supportedDevices.

Timing and other statistics from the XR subsystem. Total GPU time utilized last frame as measured by the XR subsystem. Retrieves the number of dropped frames reported by the XR SDK. Outputs the number of frames dropped since the last update. True if the dropped frame count is available, false otherwise. Retrieves the number of times the current frame has been drawn to the device as reported by the XR SDK. Outputs the number of times the current frame has been presented. True if the frame present count is available, false otherwise. Retrieves the time spent by the GPU last frame, in seconds, as reported by the XR SDK. Outputs the time spent by the GPU last frame. True if the GPU time spent last frame is available, false otherwise.
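A minimal sketch of the device-loading and statistics members listed above, assuming the legacy UnityEngine.XR API: the device switch requested by LoadDeviceByName only takes effect at the start of the next frame, so the coroutine waits one frame before enabling XR; the class name XRLoadAndStatsExample and the per-frame logging are illustrative only.

using System.Collections;
using UnityEngine;
using UnityEngine.XR;

public class XRLoadAndStatsExample : MonoBehaviour
{
    IEnumerator Start()
    {
        // Hand the prioritized list of device names to the runtime; the requested
        // device is loaded at the beginning of the next frame.
        XRSettings.LoadDeviceByName(XRSettings.supportedDevices);
        yield return null;
        XRSettings.enabled = true;
    }

    void Update()
    {
        int dropped;
        int presents;
        float gpuTime;

        // Each query reports via its return value whether the statistic is available.
        if (XRStats.TryGetDroppedFrameCount(out dropped) &&
            XRStats.TryGetFramePresentCount(out presents) &&
            XRStats.TryGetGPUTimeLastFrame(out gpuTime))
        {
            Debug.Log("Dropped frames: " + dropped +
                      ", presents of current frame: " + presents +
                      ", GPU time last frame: " + gpuTime + " s");
        }
    }
}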