LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

Scene Reconstruction

Unfortunately, I am still unable to capture a model's texture in real time using the LiDAR scanning process. Apple announced no native API for that at either WWDC20 or WWDC22, so texture capturing is currently possible only with third-party APIs – don't ask me which ones :-).

However, there's good news: a new methodology has finally emerged that allows developers to create textured models from a series of shots.

Photogrammetry

The Object Capture API, announced at WWDC 2021, provides developers with the long-awaited photogrammetry tool. The output is a USDZ model with a UV-mapped, high-resolution texture. To implement the Object Capture API you need macOS 12 and Xcode 13.
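Keep in mind that reconstruction is GPU-intensive, so it's worth checking the Mac's graphics hardware before creating a session. Here's a rough sketch of such a check using Metal (the helper name and the 4 GB threshold are my own illustration, not an official requirement check):

import Metal

// Rough hardware check: look for a non-low-power GPU with
// barycentric-coordinate support and at least 4 GB of memory.
func supportsObjectReconstruction() -> Bool {
    for device in MTLCopyAllDevices() where
        !device.isLowPower &&
        device.areBarycentricCoordsSupported &&
        device.recommendedMaxWorkingSetSize >= UInt64(4e9) {
        return true
    }
    return false
}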


To create a USDZ model from a series of shots, submit all captured images to RealityKit's PhotogrammetrySession.

Here's a code snippet that sheds some light on this process:

import RealityKit

// Folder with the captured images
let pathToImages = URL(fileURLWithPath: "/path/to/my/images/")

// Destination for the reconstructed model
let url = URL(fileURLWithPath: "model.usdz")

// Ask for a medium-detail USDZ model
let request = PhotogrammetrySession.Request.modelFile(url: url, 
                                                   detail: .medium)

var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOrdering = .unordered
configuration.featureSensitivity = .normal
configuration.isObjectMaskingEnabled = false

guard let session = try? PhotogrammetrySession(input: pathToImages, 
                                       configuration: configuration)
else { return 
} 

// The session reports progress, results and errors as an async sequence
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fractionComplete):
            print("Progress:", fractionComplete)
        case .requestComplete(_, let result):
            print("Result:", result)
        case .requestError(_, let error):
            print("Error:", error)
        case .processingComplete:
            print("Processing is complete")
        default:
            break
        }
    }
}

do {
    try session.process(requests: [request])
} catch {
    print("Cannot process the request:", error)
}

You can reconstruct USD and OBJ models with their corresponding UV-mapped textures.
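
Since process(requests:) accepts an array, you can also ask for several detail levels in a single pass instead of the single medium request shown above. A minimal sketch, assuming the same session as before (the file names are illustrative):

let requests: [PhotogrammetrySession.Request] = [
    .modelFile(url: URL(fileURLWithPath: "model_reduced.usdz"), detail: .reduced),
    .modelFile(url: URL(fileURLWithPath: "model_full.usdz"), detail: .full)
]

do {
    // Each request produces its own output file when reconstruction finishes
    try session.process(requests: requests)
} catch {
    print("Cannot process the requests:", error)
}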


You can also check out the answer over here.

It describes the MetalWorldTextureScan project, which demonstrates how to scan your environment and create a textured mesh using ARKit and Metal.
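
Conceptually, texturing a scanned mesh this way boils down to projecting each mesh vertex into a captured camera frame and sampling that image. Here's a minimal sketch of the projection step with ARKit (the helper function is illustrative and not taken from the linked project):

import ARKit

// Projects a world-space mesh vertex into a captured camera frame and
// returns a normalized UV coordinate for sampling that frame as a texture.
func textureCoordinate(for worldPoint: simd_float3,
                       in frame: ARFrame,
                       viewportSize: CGSize) -> simd_float2 {
    // Project the 3D point into 2D viewport space for this frame's camera
    let projected = frame.camera.projectPoint(worldPoint,
                                              orientation: .landscapeRight,
                                              viewportSize: viewportSize)
    // Normalize to the 0...1 range expected by a texture sampler
    return simd_float2(Float(projected.x / viewportSize.width),
                       Float(projected.y / viewportSize.height))
}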


How it can be done in Unity

I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from LiDAR. At the moment – November 01, 2020 – there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this using the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or by the developers of a small creative startup (and subsequently bought), but the fact remains.

To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. As you can see in the picture, it exposes features such as Texture Coordinates, Color and Mesh Density.

[Screenshot: ARMeshManager component settings in the Unity Inspector]

If anyone has more detailed information on how this can be configured or scripted in Unity, please post about it in this thread.