FindSurface-RealityKit-visionOS-Real-Time

CurvSurf FindSurface™ Real-time demo app for visionOS (Swift)

Overview

This demo app demonstrates a sample project that continuously searches vertex points (provided by ARKit as mesh anchors) for geometric shapes, fully utilizing the features of the FindSurface-visionOS package. It is a minimized and optimized version of FindSurface-RealityKit-visionOS-Response-to-Request.

This app introduces Preview mode, which continuously invokes FindSurface without any delay, searching for geometric shapes among the mesh points around the center of your sight. When a shape is found, a 3D model of the detected geometry is displayed transiently as a preview. A spatial tap captures and freezes the previewed geometry.

As mentioned here, FindSurface-visionOS limits the number of input points to fewer than 50k.
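If a mesh snapshot exceeds that limit, it has to be reduced before being handed to FindSurface. The helper below is a minimal sketch of uniform subsampling, written only for illustration; the function name and the `maxPointCount` value are assumptions, not part of the package or this project.

```swift
import simd

// Illustrative sketch only: uniformly subsamples a point cloud so that it
// stays under the input-size limit of FindSurface-visionOS (< 50k points).
// The name `subsampled` and the default cap are assumptions for this example.
func subsampled(_ points: [simd_float3], maxPointCount: Int = 49_152) -> [simd_float3] {
    guard points.count > maxPointCount else { return points }
    let stride = Float(points.count) / Float(maxPointCount)
    return (0..<maxPointCount).map { points[Int(Float($0) * stride)] }
}
```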

Requirements

The app runs only on an Apple Vision Pro (visionOS 2.0+) device, and requires permission to track your hands and to scan your environment (world sensing) in order to operate as intended. Dialogs requesting these permissions will be prompted when the app launches.

How to use

After launching the app, the Startup window will open. Click the Enter button to open the immersive space.

The device will start scanning your physical environment and generating mesh anchors. A black sphere will float in front of you to indicate the center of your sight. Unlike our previous app, which uses the eye-tracking position, this app determines the seed point location in Preview mode by ray casting from the device position along the direction of the CoP (center of projection) of your sight. This indicator helps you aim at the place to search for geometries in Preview mode. The triangle hit by the ray cast is highlighted in green.

Startup window

startup-window.png

If you have never clicked Don't Allow on the permission dialog, you won't see the Permission Not Granted section; it only shows up when you have disallowed the necessary permissions. The Go to Settings button will take you to this app's section in the Settings app. After all the necessary permissions are granted, the Enter button becomes enabled, and clicking it restarts the app.

Controls Window

controls-window.png

The Controls window provides the app controls that you will access frequently while using the app. It is displayed on the right side of your sight and can be relocated by pinching with the middle finger and thumb of your right hand.

For detailed explanations of the parameters, please refer to FindSurface.

For the recommended preset of the parameters, please refer to Optimal Parameter Set for Apple Vision Pro.

Status View

status-view.png

This view can be relocated by pinching with the middle finger and thumb of your left hand.

Invoking the FindSurface package's APIs

This section explains how to detect geometry in source code using the FindSurface-visionOS package's APIs.

To detect geometry using FindSurface, at least two steps are necessary. The first is to specify parameters that correspond to the environment and the target geometry; these parameters can be adjusted in the Controls window, as explained above. The second is to define a seed point from which the search in the point cloud obtained from the environment will begin. This involves ray casting toward the point shown at the center of the screen: the ray picks a triangle, and the closest of its three vertices to the ray is used as the seed point.

https://github.com/CurvSurf/FindSurface-RealityKit-visionOS-Real-Time/blob/69da589566f36307120a64eeaeeea6597b64735f/FindSurfaceRT-visionOS/App/AppState.swift#L203-L209

The meshVertexManager.raycast function performs ray casting using the position and direction of the user's device (DeviceAnchor). The direction is adjusted about 10 degrees below the center of the screen to align with the user's natural line of sight (for more details, see the Adjust Device Anchor direction section in this document). The nearestTriangleVertices function returns the three vertices of the hit triangle as a tuple of simd_float3 values, sorted by their distance from the ray (points.0 is the nearest).
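Conceptually, this seed-point selection boils down to a ray-triangle intersection followed by picking the nearest vertex. The self-contained sketch below illustrates that idea with plain simd math; it is not the project's actual meshVertexManager.raycast / nearestTriangleVertices code, and the function name and signature here are assumptions made for this example.

```swift
import simd

/// Illustrative sketch: casts a ray against a set of mesh triangles,
/// picks the nearest hit triangle, and returns the vertex of that
/// triangle closest to the ray, which plays the role of the seed point.
/// Not the project's actual implementation (names are assumptions).
func seedPoint(origin: simd_float3,
               direction: simd_float3,
               triangles: [(simd_float3, simd_float3, simd_float3)]) -> simd_float3? {

    let dir = simd_normalize(direction)
    var closestHit: (triangle: (simd_float3, simd_float3, simd_float3), t: Float)?

    // Find the triangle first hit by the ray (Möller–Trumbore intersection).
    for triangle in triangles {
        let (a, b, c) = triangle
        let e1 = b - a, e2 = c - a
        let p = simd_cross(dir, e2)
        let det = simd_dot(e1, p)
        if abs(det) < 1e-8 { continue }              // ray parallel to triangle
        let invDet = 1 / det
        let s = origin - a
        let u = simd_dot(s, p) * invDet
        if u < 0 || u > 1 { continue }
        let q = simd_cross(s, e1)
        let v = simd_dot(dir, q) * invDet
        if v < 0 || u + v > 1 { continue }
        let t = simd_dot(e2, q) * invDet
        if t > 0, t < (closestHit?.t ?? .greatestFiniteMagnitude) {
            closestHit = (triangle: triangle, t: t)
        }
    }
    guard let hit = closestHit else { return nil }

    // Among the hit triangle's three vertices, return the one nearest to the ray.
    func distanceToRay(_ point: simd_float3) -> Float {
        let v = point - origin
        return simd_length(v - simd_dot(v, dir) * dir)
    }
    let (a, b, c) = hit.triangle
    return [a, b, c].min { distanceToRay($0) < distanceToRay($1) }
}
```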

Once these steps are completed, the API can be called as follows:

https://github.com/CurvSurf/FindSurface-RealityKit-visionOS-Real-Time/blob/69da589566f36307120a64eeaeeea6597b64735f/FindSurfaceRT-visionOS/App/AppState.swift#L220-L235

Due to a limitation of its current implementation, FindSurface does not support multithreading. Therefore, if multiple threads call FindSurface's detection function simultaneously, unexpected behavior may occur.

To prevent this, the perform method ensures that if FindSurface is already running, it neither executes the provided closure nor starts another detection, and immediately returns nil. This mechanism prevents other threads from invoking FindSurface while a detection is already in progress.

In this project, FindSurface is called in an infinite loop on a single thread, so this issue shouldn't arise. However, to emphasize the importance of avoiding such problems, a criticalSection wrapper function has been implemented using a semaphore. In practice, this semaphore never stalls in the wait method.
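As an illustration of that pattern, here is a minimal sketch of a semaphore-based critical section that returns nil instead of blocking when another caller is already inside. The type, names, and placeholder result below are invented for this example and are not the project's actual criticalSection implementation.

```swift
import Foundation

/// Minimal sketch of a semaphore-based critical section: if another thread
/// is already inside, the call does not wait and returns nil immediately.
/// This mirrors the behavior described above, but it is not the project's
/// actual implementation (the type and names here are assumptions).
final class DetectionGate {

    private let semaphore = DispatchSemaphore(value: 1)

    func criticalSection<T>(_ body: () throws -> T) rethrows -> T? {
        // `wait(timeout: .now())` returns .timedOut instead of blocking when
        // the semaphore is already taken, so a second caller bails out with nil.
        guard semaphore.wait(timeout: .now()) == .success else { return nil }
        defer { semaphore.signal() }
        return try body()
    }
}

// Usage sketch: since the detection runs in a single infinite loop, the gate
// never actually blocks; it only guards against accidental concurrent calls.
let gate = DetectionGate()
let result: String? = gate.criticalSection {
    // Invoke FindSurface's detection here (see the linked AppState.swift lines).
    return "detected geometry" // placeholder result for this sketch
}
```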

Once the perform function returns a non-nil result, you can use it for rendering or for verifying the measured values. For ways to query the measured values, refer to other examples, such as FindSurface-RealityKit-visionOS-Response-to-Request.